Splitting dotnet.wasm when publishing #51772
Comments
@elepner thanks for contacting us.
The server should serve the compressed version by default in hosted scenarios, and the browser will automatically handle the decompression. I'm not sure if I am missing anything with regard to the statement above. The only case where you might need to manually decompress the file is when deploying to a host where you don't have control over the response headers. Even in that case, I would check what size you get with gzip and potentially use the CompressionStream API in JS directly instead of Brotli.
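For reference, a minimal sketch of that last suggestion: decompressing a gzip-compressed framework file in the browser with the native DecompressionStream API, so no extra decoder library is needed. The `loadGzipped` helper name and the `<uri>.gz` naming convention are illustrative assumptions, not anything Blazor ships today:

```ts
// Fetch a gzip-compressed resource and decompress it with the browser's
// built-in DecompressionStream. Assumes the host serves "<uri>.gz" as a
// plain file without a Content-Encoding header.
async function loadGzipped(uri: string): Promise<Response> {
  const compressed = await fetch(`${uri}.gz`, { cache: 'no-cache' });
  if (!compressed.ok || !compressed.body) {
    throw new Error(`Failed to fetch ${uri}.gz: ${compressed.status}`);
  }
  // Pipe the compressed body through the native gzip decompressor.
  const decompressed = compressed.body.pipeThrough(new DecompressionStream('gzip'));
  return new Response(decompressed, {
    headers: { 'content-type': 'application/wasm' },
  });
}
```

Such a helper could be wired into `Blazor.start({ loadBootResource })` for the resources that exceed the CDN's compression limit.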
Hi @elepner. We have added the "Needs: Author Feedback" label to this issue, which indicates that we have an open question for you before we can take further action. This issue will be closed automatically in 7 days if we do not hear back from you by then - please feel free to re-open it if you come back to this issue after that time.
We host the web app on Azure CDN, and I'm pretty sure that dotnet.wasm is not compressed by the CDN and is sent as-is. In order to fix this issue we follow the procedures explained here. We deploy the Brotli-compressed files produced by the Blazor publish output.
@elepner thanks for the additional details. @danroth27, is this something we could bring up with the Azure CDN folks? According to this, Azure CDN already supports compression, but not Brotli compression. That seems to be the missing gap.
Thanks for contacting us. We're moving this issue to the
I have clarified feature request information. |
One more thing, if somebody comes across this issue. After RTFM I have figured out that, at least on Azure, it's possible to serve large but manually compressed files, which Blazor publish kindly provides. One needs to upload the compressed files without the gz/br extension to the Table Storage, but with Content-Encoding: br or gzip, and Azure CDN picks them up and serves the compressed files. Also, I have tried splitting dotnet.wasm and merging it back inside the Blazor loader (tried this idea). It turned out that Azure uses a lower compression level than Blazor publish does; in my case it's 9.4 MB (publish) vs 17 MB (Azure) for a 44 MB dotnet.wasm file.
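For anyone wanting to script that upload, here is a minimal sketch assuming the pre-compressed publish output is pushed to an Azure Blob Storage static-website container fronted by the CDN, using the @azure/storage-blob package; the connection string, container name, and file paths are illustrative assumptions:

```ts
import { BlobServiceClient } from '@azure/storage-blob';

// Upload the Brotli-compressed file produced by `dotnet publish`, but store it
// under the original name (no .br extension) and set Content-Encoding: br so
// the CDN serves it as a transparently compressed response.
async function uploadPrecompressedWasm(connectionString: string): Promise<void> {
  const service = BlobServiceClient.fromConnectionString(connectionString);
  const container = service.getContainerClient('$web'); // static-website container (assumption)
  const blob = container.getBlockBlobClient('_framework/dotnet.wasm');

  await blob.uploadFile('./publish/wwwroot/_framework/dotnet.wasm.br', {
    blobHTTPHeaders: {
      blobContentEncoding: 'br',            // browser decompresses automatically
      blobContentType: 'application/wasm',  // keep the original media type
    },
  });
}
```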
Is there an existing issue for this?
Is your feature request related to a problem? Please describe the problem.
It's a known issue that dotnet.wasm is a large file that every user of a Blazor WebAssembly app has to download in the browser. It's 44 MB for the project I work on. Brotli compresses it to 9 MB, which is way better. However, this compression comes at the cost that the client app needs to handle the decompression itself: it needs the brotli-dec-wasm library, and each developer has to manually mess around with the published files and the Blazor boot process (roughly as sketched below).
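For context, this is roughly the kind of DIY loading the workaround requires, sketched along the lines of the documented `loadBootResource` hook. It assumes a Brotli decoder (e.g. google/brotli's decode.js, or a wrapper around brotli-dec-wasm) is already loaded on the page and exposes a `BrotliDecode` function; that name and the `.br` suffix convention are assumptions:

```ts
// Assumed to be provided by a Brotli decoder script loaded on the page.
declare function BrotliDecode(input: Uint8Array): Uint8Array;
declare const Blazor: {
  start(options: {
    loadBootResource: (
      type: string, name: string, defaultUri: string, integrity: string,
    ) => Promise<Response> | undefined;
  }): Promise<void>;
};

Blazor.start({
  loadBootResource: (type, _name, defaultUri, _integrity) => {
    // Skip the bootstrap script itself and local development, where the dev
    // server already serves uncompressed files.
    if (type !== 'dotnetjs' && location.hostname !== 'localhost') {
      return (async () => {
        // Fetch the pre-compressed .br file published next to the original.
        const response = await fetch(`${defaultUri}.br`, { cache: 'no-cache' });
        if (!response.ok) throw new Error(`Failed to fetch ${defaultUri}.br`);
        const compressed = new Uint8Array(await response.arrayBuffer());
        const decompressed = BrotliDecode(compressed);
        const contentType = type === 'dotnetwasm' ? 'application/wasm' : 'application/octet-stream';
        return new Response(decompressed, { headers: { 'content-type': contentType } });
      })();
    }
    // Fall through to the default loading behaviour for everything else.
    return undefined;
  },
});
```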
Describe the solution you'd like
It'd be better if this job were handled natively between the server and the browser. However, there's a problem: Azure CDN, for instance, has an 8 MB limit for a file to be eligible for compression. If it were possible to split dotnet.wasm into multiple chunks during publishing, this problem would be resolved automatically at the browser level.
Additional context
UPD 02.11.2023: Clarifying the feature request a little bit here.
The way Blazor publishes bundles does not play well with major CDN providers (Azure, AWS, and GCP) because they have a limit of roughly 10 MB for a file to be compressed. Blazor produces files within the bundle (dotnet.wasm specifically) that are larger than that, forcing developers to implement their own DIY loader if they want to both consume the compressed bundle and use those CDN providers.
One way to overcome this is to give developers control over the maximum size of an item in the bundle, e.g. via a property in the .csproj file. The metadata about the file chunks could then be reflected in the blazor.boot.json file and used by the loader to download and merge the chunks back (see the sketch below).