Closed benaadams closed 4 years ago
A middleware to handle the precompressed files could be something along the lines of:
public class PrecompressedBlazorMiddleware
{
    private readonly ILogger<PrecompressedBlazorMiddleware> _logger;
    private readonly RequestDelegate _next;

    public PrecompressedBlazorMiddleware(RequestDelegate next, ILogger<PrecompressedBlazorMiddleware> logger)
    {
        _logger = logger;
        _next = next;
    }

    public Task InvokeAsync(HttpContext context)
    {
        var request = context.Request;
        var path = request.Path.Value;
        var extraExtension = string.Empty;
        var responseHeaders = context.Response.Headers;
        if (path.StartsWith("/_framework/", StringComparison.Ordinal))
        {
            responseHeaders[HeaderNames.Vary] = HeaderNames.AcceptEncoding;
            var acceptEncoding = request.Headers[HeaderNames.AcceptEncoding].ToString();
            if (acceptEncoding.Length > 64)
            {
                // Not happy parsing this; it is far too long
                context.Response.StatusCode = StatusCodes.Status431RequestHeaderFieldsTooLarge;
                return Task.CompletedTask;
            }
            extraExtension = GetCompressionExtension(acceptEncoding);
        }
        if (extraExtension.Length > 0)
        {
            // A compression type is accepted; change the path so
            // StaticFiles picks up the right file
            request.Path = path + extraExtension;
            // Note: Content-Type still needs to be set
            switch (extraExtension)
            {
                case ".br":
                    responseHeaders[HeaderNames.ContentEncoding] = "br";
                    break;
                case ".gz":
                    responseHeaders[HeaderNames.ContentEncoding] = "gzip";
                    break;
            }
            _logger.CompressionApplied(extraExtension);
        }
        return _next(context);
    }
    private static string GetCompressionExtension(ReadOnlySpan<char> acceptEncoding)
    {
        var extraExtension = string.Empty;
        foreach (var range in acceptEncoding.Split(','))
        {
            var encoding = acceptEncoding[range];
            // Check if it has a quality value
            var qualityStart = encoding.IndexOf(';');
            if (qualityStart > 0)
            {
                // Remove the quality value
                encoding = encoding[..qualityStart];
            }
            // Remove any additional spaces
            encoding = encoding.Trim(' ');
            if (encoding.SequenceEqual("br"))
            {
                // Brotli accepted, set the additional file extension
                extraExtension = ".br";
                // This is our preferred compression so exit the loop
                break;
            }
            else if (encoding.SequenceEqual("gzip"))
            {
                // Gzip accepted, we'll set the extension, but keep looking
                extraExtension = ".gz";
            }
        }
        return extraExtension;
    }
}
internal static class PrecompressedBlazorLoggerExtensions
{
    private static readonly Action<ILogger, string, Exception> _compressionApplied =
        LoggerMessage.Define<string>(LogLevel.Debug, new EventId(1, nameof(CompressionApplied)), "Response Compression Applied: {CompressionExtension}");

    public static void CompressionApplied(this ILogger<PrecompressedBlazorMiddleware> logger, string compressionExtension)
        => _compressionApplied(logger, compressionExtension, null);
}
public static class PrecompressedBlazorMiddlewareExtensions
{
    public static IApplicationBuilder UsePrecompressedBlazor(this IApplicationBuilder builder)
    {
        return builder.UseMiddleware<PrecompressedBlazorMiddleware>();
    }
}
// To enumerate Accept-Encoding in a non-allocating way
internal static partial class MemoryExtensions
{
    public static SpanSplitEnumerator<char> Split(this ReadOnlySpan<char> span, char separator)
        => new SpanSplitEnumerator<char>(span, separator);

    public ref struct SpanSplitEnumerator<T>
#nullable disable // to enable use with both T and T? for reference types due to IEquatable<T> being invariant
        where T : IEquatable<T>
#nullable restore
    {
        private readonly ReadOnlySpan<T> _span;
        private readonly T _separator;
        private int _start;
        private bool _started;
        private bool _ended;
        private Range _current;

        public SpanSplitEnumerator<T> GetEnumerator() => this;

        public Range Current
        {
            get
            {
                if (!_started || _ended)
                {
                    Throw();
                }
                return _current;

                static void Throw()
                {
                    throw new InvalidOperationException();
                }
            }
        }

        internal SpanSplitEnumerator(ReadOnlySpan<T> span, T separator) : this()
        {
            _span = span;
            _separator = separator;
        }

        public bool MoveNext()
        {
            _started = true;
            if (_start > _span.Length)
            {
                _ended = true;
                return false;
            }

            ReadOnlySpan<T> slice = _start == 0
                ? _span
                : _span.Slice(_start);

            int end = _start;
            if (slice.Length > 0)
            {
                int index = slice.IndexOf(_separator);
                if (index == -1)
                {
                    index = slice.Length;
                }
                end += index;
            }

            _current = new Range(_start, end);
            _start = end + 1;
            return true;
        }
    }
}
Blazor files (wasm+dlls) are currently served uncompressed
Under what hosting model? AFAIK compression is on by default in the templates.
If you’re finding compression of these files is not on by default, can you provide repro steps?
I think it is a great idea to pre-compress all files in _framework directory.
I think I was looking at a misconfigured webserver, as it was also serving the dlls as application/x-msdownload rather than application/octet-stream.
Publishing to AppService, it looks to use web.config to get IIS to do dynamic compression, so it clocks in at 2.7MB rather than 5.4MB (a 50% saving).
However, the dynamic compression is still leaving something on the table. As it's dynamic it has to be less optimal to get the speed, so mono.wasm comes down gzipped at 986kB, when a more optimal gzip (as above) would be 703kB and the Brotli 560kB. The Brotli in total takes it down further from the dynamic compression to 1.7MB, a further saving of 37%.
Doing the pre-compression (as per the for loop in the summary), the middleware above would work when hosting in Kestrel or another ASP.NET Core webserver (which isn't currently shown as an option on Host and deploy ASP.NET Core Blazor WebAssembly 😢)
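The publish-time pre-compression step could be sketched as below. This is a minimal illustration, not the actual build-task code: the directory layout and extension filter are assumptions, and it produces only .gz to stay stdlib-only (a .br would be generated the same way with the third-party brotli package at its maximum quality, 11).

```python
import gzip
import shutil
from pathlib import Path

# Extensions worth pre-compressing in the publish output (assumed set)
COMPRESSIBLE = {".dll", ".wasm", ".js", ".json"}

def precompress(framework_dir: str) -> None:
    """Write a maximally-compressed .gz beside each publish asset, once.

    Level 9 is the slowest/smallest setting - fine for a one-off publish
    step, unlike dynamic compression which pays the cost per request.
    """
    for path in Path(framework_dir).rglob("*"):
        if path.is_file() and path.suffix in COMPRESSIBLE:
            gz_path = path.with_name(path.name + ".gz")
            with open(path, "rb") as src, \
                 gzip.open(gz_path, "wb", compresslevel=9) as dst:
                shutil.copyfileobj(src, dst)
```

A real MSBuild target would also skip files whose compressed copy is already up to date, so incremental publishes stay fast.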
And for the web.config the rewrite rules in Components/Blazor/Build/src/targets/Standalone.Web.config could change as follows:
<configuration>
<system.webServer>
<staticContent>
<remove fileExtension=".dll" />
<remove fileExtension=".json" />
<remove fileExtension=".wasm" />
<remove fileExtension=".woff" />
<remove fileExtension=".woff2" />
+ <remove fileExtension=".js.gz" />
+ <remove fileExtension=".dll.gz" />
+ <remove fileExtension=".json.gz" />
+ <remove fileExtension=".wasm.gz" />
+ <remove fileExtension=".js.br" />
+ <remove fileExtension=".dll.br" />
+ <remove fileExtension=".json.br" />
+ <remove fileExtension=".wasm.br" />
<mimeMap fileExtension=".dll" mimeType="application/octet-stream" />
<mimeMap fileExtension=".json" mimeType="application/json" />
<mimeMap fileExtension=".wasm" mimeType="application/wasm" />
<mimeMap fileExtension=".woff" mimeType="application/font-woff" />
<mimeMap fileExtension=".woff2" mimeType="application/font-woff" />
+ <mimeMap fileExtension=".js.gz" mimeType="application/javascript" />
+ <mimeMap fileExtension=".dll.gz" mimeType="application/octet-stream" />
+ <mimeMap fileExtension=".json.gz" mimeType="application/json" />
+ <mimeMap fileExtension=".wasm.gz" mimeType="application/wasm" />
+ <mimeMap fileExtension=".js.br" mimeType="application/javascript" />
+ <mimeMap fileExtension=".dll.br" mimeType="application/octet-stream" />
+ <mimeMap fileExtension=".json.br" mimeType="application/json" />
+ <mimeMap fileExtension=".wasm.br" mimeType="application/wasm" />
</staticContent>
<httpCompression>
<dynamicTypes>
- <add mimeType="application/octet-stream" enabled="true" />
- <add mimeType="application/wasm" enabled="true" />
+ <remove mimeType="application/javascript" />
+ <remove mimeType="application/json" />
+ <remove mimeType="application/octet-stream" />
+ <remove mimeType="application/wasm" />
</dynamicTypes>
+ <staticTypes>
+ <remove mimeType="application/javascript" />
+ <remove mimeType="application/json" />
+ <remove mimeType="application/octet-stream" />
+ <remove mimeType="application/wasm" />
+ </staticTypes>
</httpCompression>
<rewrite>
+ <outboundRules rewriteBeforeCache="true">
+ <rule name="Add Vary Accept-Encoding" preCondition="PreCompressedFile" enabled="true">
+ <match serverVariable="RESPONSE_Vary" pattern=".*" />
+ <action type="Rewrite" value="Accept-Encoding" />
+ </rule>
+ <rule name="Add Encoding Brotli" preCondition="PreCompressedBrotli" enabled="true" stopProcessing="true">
+ <match serverVariable="RESPONSE_Content_Encoding" pattern=".*" />
+ <action type="Rewrite" value="br" />
+ </rule>
+ <rule name="Add Encoding Gzip" preCondition="PreCompressedGzip" enabled="true" stopProcessing="true">
+ <match serverVariable="RESPONSE_Content_Encoding" pattern=".*" />
+ <action type="Rewrite" value="gzip" />
+ </rule>
+ <preConditions>
+ <preCondition name="PreCompressedFile">
+ <add input="{HTTP_URL}" pattern="\.(gz|br)$" />
+ </preCondition>
+ <preCondition name="PreCompressedGzip">
+ <add input="{HTTP_URL}" pattern="\.gz$" />
+ </preCondition>
+ <preCondition name="PreCompressedBrotli">
+ <add input="{HTTP_URL}" pattern="\.br$" />
+ </preCondition>
+ </preConditions>
+ </outboundRules>
<rules>
<rule name="Serve subdir">
<match url=".*" />
<action type="Rewrite" url="[ServeSubdirectory]{R:0}" />
</rule>
+ <rule name="Rewrite brotli file" stopProcessing="true">
+ <match url="(.*)"/>
+ <conditions>
+ <add input="{HTTP_ACCEPT_ENCODING}" pattern="br" />
+ <add input="{REQUEST_FILENAME}.br" matchType="IsFile" />
+ </conditions>
+ <action type="Rewrite" url="{R:1}.br" />
+ </rule>
+ <rule name="Rewrite gzip file" stopProcessing="true">
+ <match url="(.*)"/>
+ <conditions>
+ <add input="{HTTP_ACCEPT_ENCODING}" pattern="gzip" />
+ <add input="{REQUEST_FILENAME}.gz" matchType="IsFile" />
+ </conditions>
+ <action type="Rewrite" url="{R:1}.gz" />
+ </rule>
<rule name="SPA fallback routing" stopProcessing="true">
<match url=".*" />
<conditions logicalGrouping="MatchAll">
<add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
</conditions>
<action type="Rewrite" url="[ServeSubdirectory]" />
</rule>
</rules>
</rewrite>
</system.webServer>
</configuration>
This is then lighter on the server and gives a further saving of 37% on transfer bandwidth (before/after transfer-size screenshots omitted).
@benaadams I was investigating this same thing the other day. Thanks for your more detailed approach :)
I have one great idea for doing this, but I want to discuss it among the team first.
Your data definitely helps make my point :)
@benaadams Further thoughts here if you want to try things. I tried to ILMerge the output and wasn't able to do so, but one thing I thought is that maybe we can save a bit more extra space by bundling all assemblies together into one file and compressing that.
The idea would be that we save on the compression tables (likely doesn't matter a lot) and that by producing a bigger payload we have more opportunities to find longer/more common sequences across assembly boundaries.
@javiercn maybe with only small amount of additional work you will be able to implement minimization and pre-compressing for .css and .js. Probably all C# programmers would like to throw away all JavaScript tools and have everything integrated in Visual Studio.
@Andrzej-W To some degree yes, but in general no.
We don't plan to implement bundling and minification of js/css as it's a huge investment and there are existing tools out there that would do a much better job than us. (Webpack or Rollup for example).
With regards to compressing/serving pre-compressed assets, that's something we can do.
@javiercn @benaadams A few thoughts on merging.
Of course after compression one file will be smaller than the sum of individually compressed files, and that is better from the point of view of server bandwidth. The saving can be quite big because we have a few big files and a lot of small DLLs.
On the other hand, we can often achieve higher total download speed when we use a few concurrent downloads. Maybe we should merge only small files, say smaller than 100KB or 200KB. That way we get the biggest saving in size while still keeping a few concurrent downloads.
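The "merge only small files" heuristic above is simple to express; here is a sketch (the 100KB default threshold is just the figure suggested in the comment, not a measured optimum):

```python
def partition(assets: dict[str, int], threshold: int = 100 * 1024):
    """Split assets (name -> size in bytes) into a set to merge into one
    bundle (small files, where cross-file compression gains the most) and
    a set to serve individually (large files, preserving concurrent
    downloads for the biggest payloads)."""
    to_bundle = {n: s for n, s in assets.items() if s < threshold}
    to_serve = {n: s for n, s in assets.items() if s >= threshold}
    return to_bundle, to_serve
```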
one thing I thought is that maybe we can save a bit more extra space by bundling all assemblies together into one file and compressing that.
I'd agree; you'd probably want the file to be uncompressed as far as the Js is concerned (e.g. naive List<name+offset+size>+dlls appended or ilmerge); then do the precompression on it and let the browser/http do the heavy lifting of decompression rather than trying to decompress in Js.
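To make the naive "List<name+offset+size>+dlls appended" idea concrete, here is a hedged sketch (in Python rather than the build tooling's C#, and the length-prefixed JSON index is an invented framing for illustration, not an agreed format):

```python
import json
import struct

def bundle(files: dict[str, bytes]) -> bytes:
    """Pack assemblies into one uncompressed blob: a little-endian
    length-prefixed JSON index of name/offset/size, then the raw bytes.
    The bundle stays uncompressed; .br/.gz precompression plus HTTP
    Content-Encoding do the decompression work, not JS."""
    index, blobs, offset = [], [], 0
    for name, data in files.items():
        index.append({"name": name, "offset": offset, "size": len(data)})
        blobs.append(data)
        offset += len(data)
    header = json.dumps(index).encode()
    return struct.pack("<I", len(header)) + header + b"".join(blobs)

def unbundle(blob: bytes) -> dict[str, bytes]:
    """Read the index back and slice each assembly out of the blob."""
    (header_len,) = struct.unpack_from("<I", blob, 0)
    index = json.loads(blob[4 : 4 + header_len])
    base = 4 + header_len
    return {e["name"]: blob[base + e["offset"] : base + e["offset"] + e["size"]]
            for e in index}
```

On the client the same slicing could be done with a DataView over the fetched ArrayBuffer, so no per-assembly decompression is needed in JS.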
On the other hand we can often achieve higher total download speed when we use a few concurrent downloads.
This is true, however there are other files that are also part of the load, e.g. blazor.webassembly.js, blazor.boot.json, mono.wasm, etc.
Can also experiment with preload for these files that aren't bundled dlls, with a Link header on the root request so the browser knows it needs to start downloading them when it gets the root document (though it can backfire if you go too far).
A change to add blazor.webassembly.js and mono.wasm to the above rewrite rules, for example, would be:
<outboundRules rewriteBeforeCache="true">
<rule name="Add Vary Accept-Encoding" preCondition="PreCompressedFile" enabled="true">
<match serverVariable="RESPONSE_Vary" pattern=".*" />
<action type="Rewrite" value="Accept-Encoding" />
</rule>
<rule name="Add Encoding Brotli" preCondition="PreCompressedBrotli" enabled="true" stopProcessing="true">
<match serverVariable="RESPONSE_Content_Encoding" pattern=".*" />
<action type="Rewrite" value="br" />
</rule>
<rule name="Add Encoding Gzip" preCondition="PreCompressedGzip" enabled="true" stopProcessing="true">
<match serverVariable="RESPONSE_Content_Encoding" pattern=".*" />
<action type="Rewrite" value="gzip" />
</rule>
+ <rule name="Add wasm Preload" preCondition="IsRoot" enabled="true">
+ <match serverVariable="RESPONSE_Link" pattern="^$" />
+ <action type="Rewrite" value="</_framework/blazor.webassembly.js>; rel=preload; as=script;, </_framework/wasm/mono.wasm>; rel=preload; as=fetch; crossorigin" />
+ </rule>
<preConditions>
+ <preCondition name="IsRoot">
+ <add input="{HTTP_X_ORIGINAL_URL}" pattern="/+$" />
+ </preCondition>
<preCondition name="PreCompressedFile">
<add input="{HTTP_URL}" pattern="\.(gz|br)$" />
</preCondition>
<preCondition name="PreCompressedGzip">
<add input="{HTTP_URL}" pattern="\.gz$" />
</preCondition>
<preCondition name="PreCompressedBrotli">
<add input="{HTTP_URL}" pattern="\.br$" />
</preCondition>
</preConditions>
</outboundRules>
For example, if it was a single file for the dlls, then a Link header on mono.js for that single dll archive file would help fill the bandwidth lull between parsing and the start of the dlls download.
https://github.com/aspnet/AspNetCore/issues/16956#issuecomment-552474457 Yep, I did this too as part of my experiments.
You can essentially put all of these inside a link header with preload to speed things up.
I'd agree; you'd probably want the file to be uncompressed as far as the Js is concerned (e.g. naive List<name+offset+size>+dlls appended or ilmerge); then do the precompression on it and let the browser/http do the heavy lifting of decompression rather than trying to decompress in Js.
Yeah, my idea was to use multipart/mixed for that as it can be easily read on the client-side.
Great discussion :-)
Pre-compression sounds like a good approach. Did we already think about leveraging HTTP/2 with multiple files? Experience shows that loading multiple files via HTTP/2 may beat one or a few large files via HTTP/1.1. But of course, testing is needed here.
I'd love to see something along the lines of:
dotnet publish -r wasm -c Release --self-contained br,gzip
or
dotnet publish -r wasm -c Release --compressed br,gzip
@javiercn Should we have a separate issue tracking the follow-up Brotli piece?
@benaadams FYI:
With Nginx (Ubuntu 18.04 VPS) serving a static (without Kestrel) Blazor WASM website with compression enabled it's now down to 1.9 MB
Nginx config: https://github.com/h5bp/server-configs-nginx/blob/master/h5bp/web_performance/compression.conf
I've added application/octet-stream to gzip_types, and application/wasm wasm to mime.types (Nginx config files).
It's a default Blazor client-side template running at http://openwiki.com; you can test it yourself if you want to.
Blazor files (wasm+dlls) are currently not served with maximal compression. As part of dotnet publish, aspnet should be pre-compressing everything in the _framework directory, and then serving the appropriate file based on the Accept-Encoding header, in preference order: br, gzip, uncompressed.

They should be precompressed rather than using dynamic compression, as there is no need to keep performing the same compression again and again when it can be done once as part of publish, and it means a higher compression rate can be used.

This will reduce the data transfer by ~68% vs uncompressed and an additional ~37% vs dynamic compression, e.g.

For the request mono.wasm:
- if br is accepted, serve mono.wasm.br (572,786 bytes), a 68% saving
- if gzip is accepted, serve mono.wasm.gz (718,878 bytes), a 60% saving
- otherwise serve mono.wasm (1,778,165 bytes)

For the request mscorlib.dll:
- if br is accepted, serve mscorlib.dll.br (452,766 bytes), a 68% saving
- if gzip is accepted, serve mscorlib.dll.gz (547,109 bytes), a 62% saving
- otherwise serve mscorlib.dll (1,435,648 bytes)

For the request System.Text.Json.dll:
- if br is accepted, serve System.Text.Json.dll.br (84,464 bytes), a 70% saving
- if gzip is accepted, serve System.Text.Json.dll.gz (547,109 bytes), a 64% saving
- otherwise serve System.Text.Json.dll (282,624 bytes)

Brotli reduces download size for everything in the _framework directory from 5.37MB to 1.70MB, a 68% saving, etc.
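As a sanity check, the per-file percentages follow directly from the quoted byte counts (simple arithmetic, figures copied from above):

```python
def saving(compressed: int, original: int) -> int:
    """Bandwidth saving as a whole percentage."""
    return round(100 * (1 - compressed / original))

print(saving(572_786, 1_778_165))  # mono.wasm.br vs mono.wasm -> 68
print(saving(718_878, 1_778_165))  # mono.wasm.gz -> 60
print(saving(452_766, 1_435_648))  # mscorlib.dll.br -> 68
print(saving(84_464, 282_624))     # System.Text.Json.dll.br -> 70
```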
The numbers above are using the standard .NET Core compression e.g.