dotnet / aspnetcore

ASP.NET Core is a cross-platform .NET framework for building modern cloud-based web applications on Windows, Mac, or Linux.
https://asp.net
MIT License

Support serving of pre-GZIP encoded files #2458

Closed - aspnet-hello closed this 3 months ago

aspnet-hello commented 6 years ago

From @DamianEdwards on Friday, July 25, 2014 11:42:51 AM

Pre-GZIPping files is seemingly becoming more popular. This involves running a tool ahead of deployment that creates GZIPped copies of suitable files in the site, e.g. site.js => site.js.gzip. Then the file serving aspect of the web server will serve the GZIPped file when appropriate.
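
Not part of the thread, but to make the workflow concrete: a minimal sketch of such a pre-deployment step using System.IO.Compression (the "wwwroot" path and the .js/.css filter are assumptions for illustration).

using System.IO;
using System.IO.Compression;

class PreCompress
{
    static void Main()
    {
        // Gzip every .js/.css file under wwwroot, writing a sibling *.gz file.
        foreach (var path in Directory.EnumerateFiles("wwwroot", "*.*", SearchOption.AllDirectories))
        {
            if (!path.EndsWith(".js") && !path.EndsWith(".css"))
                continue;

            using var input = File.OpenRead(path);
            using var output = File.Create(path + ".gz");
            using var gzip = new GZipStream(output, CompressionLevel.Optimal);
            input.CopyTo(gzip);
        }
    }
}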

Copied from original issue: aspnet/StaticFiles#7

aspnet-hello commented 6 years ago

From @tugberkugurlu on Friday, July 25, 2014 1:24:16 PM

:thumbsup: for this one.

aspnet-hello commented 6 years ago

From @nikmd23 on Wednesday, September 10, 2014 1:47:47 PM

I'd also like to point out that this shouldn't be about "pre-gzipping", but rather pre-compressing in general, no matter what the encoding format is.

For example, Microsoft and Google have both played with improved encoding schemes (peerdist and sdch). At one point Chrome even included sdch in its Accept-Encoding header. Of course these two examples are a bit crummy since neither caught on, but new compression techniques come up all the time.

What this means is, in reality, I might want to pre-compile a file into multiple formats and have the proper one selected based on the Accept-Encoding. My content directory could have several "gzip compatible" files encoded with zopfli (*.gz), the same files encoded in sdch (*.sdch) as well as the original uncompressed files for really old browsers.

I think of this as being very analogous to keeping images in both *.jpg and *.webp formats, and serving the appropriate ones at the appropriate times.
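
Not part of the original comment: a tiny sketch of what that general (non-gzip-specific) negotiation could look like, walking a server-side preference list and picking the first encoding the client's Accept-Encoding names. The list and extensions are illustrative only; "sdch" never caught on, as noted above.

using System;

static class EncodingNegotiator
{
    // Server-side preference order: (Accept-Encoding token, file extension).
    static readonly (string Token, string Extension)[] Preferred =
    {
        ("sdch", ".sdch"),
        ("gzip", ".gz"),
    };

    // Returns the extension of the representation to serve, or null for the original.
    public static string PickExtension(string acceptEncoding)
    {
        foreach (var (token, extension) in Preferred)
        {
            if (acceptEncoding != null &&
                acceptEncoding.IndexOf(token, StringComparison.OrdinalIgnoreCase) >= 0)
                return extension;
        }
        return null;
    }
}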

aspnet-hello commented 6 years ago

From @resnyanskiy on Monday, January 18, 2016 6:37:35 PM

I found a simple workaround based on URL redirects supported in HTTP:

using System;
using System.IO;
using System.Net;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.StaticFiles;
using Microsoft.Net.Http.Headers;

class Startup
{
    // Assumed to be set from the hosting environment (e.g. env.IsDevelopment()).
    private bool IsDevelopment { get; set; }

    private StaticFileOptions StaticFileOptions
    {
        get
        {
            return new StaticFileOptions
            {
                OnPrepareResponse = OnPrepareResponse
            };
        }
    }

    private void OnPrepareResponse(StaticFileResponseContext context)
    {
        var file = context.File;
        var request = context.Context.Request;
        var response = context.Context.Response;

        if (file.Name.EndsWith(".gz"))
        {
            response.Headers[HeaderNames.ContentEncoding] = "gzip";
            return;
        }

        if (file.Name.IndexOf(".min.", StringComparison.OrdinalIgnoreCase) != -1)
        {
            var requestPath = request.Path.Value;
            var filePath = file.PhysicalPath;

            if (IsDevelopment)
            {
                // In development, redirect to the non-minified file if one exists.
                if (File.Exists(filePath.Replace(".min.", ".")))
                {
                    response.StatusCode = (int)HttpStatusCode.TemporaryRedirect;
                    response.Headers[HeaderNames.Location] = requestPath.Replace(".min.", ".");
                }
            }
            else
            {
                // In production, redirect to the pre-gzipped file if the client accepts gzip.
                var acceptEncoding = (string)request.Headers[HeaderNames.AcceptEncoding] ?? string.Empty;
                if (acceptEncoding.IndexOf("gzip", StringComparison.OrdinalIgnoreCase) != -1)
                {
                    if (File.Exists(filePath + ".gz"))
                    {
                        response.StatusCode = (int)HttpStatusCode.MovedPermanently;
                        response.Headers[HeaderNames.Location] = requestPath + ".gz";
                    }
                }
            }
        }
    }

    public void Configure(IApplicationBuilder application)
    {
        application
            .UseDefaultFiles()
            .UseStaticFiles(StaticFileOptions);
    }
}

I used Wikipedia as a reference. This approach also allows using non-minified files in the development environment with no need to change links on the client side.

I found that the current implementation of 'aspnet/StaticFiles' lacks two features:

  1. Declarative setup of the 'Content-Encoding' header. I suppose it could be implemented very easily - by adding a second dictionary to FileExtensionContentTypeProvider (see the sketch below).
  2. Redirect support. The current design doesn't provide a way to cancel the invocation of 'IHttpSendFileFeature.SendFileAsync'/'StreamCopyOperation.CopyToAsync'.
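
A hypothetical sketch of the "second dictionary" idea from point 1 - not an existing API, just the shape such a provider could take alongside FileExtensionContentTypeProvider:

using System;
using System.Collections.Generic;
using System.IO;

public class FileExtensionContentEncodingProvider
{
    // Maps file extensions to Content-Encoding values, declaratively.
    public IDictionary<string, string> Encodings { get; } =
        new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            { ".gz", "gzip" },
            { ".br", "br" },
        };

    public bool TryGetContentEncoding(string subpath, out string encoding)
    {
        var extension = Path.GetExtension(subpath);
        if (!string.IsNullOrEmpty(extension) && Encodings.TryGetValue(extension, out encoding))
            return true;

        encoding = null;
        return false;
    }
}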

@davidfowl fyi.

aspnet-hello commented 6 years ago

From @nikmd23 on Monday, January 18, 2016 7:30:14 PM

In support of my comment above, both Firefox and Chrome will be shipping support for Brotli compression soon.

aspnet-hello commented 6 years ago

From @jods4 on Monday, February 22, 2016 8:44:29 AM

:+1:

Actually, even a good story for serving gzipped static files (with a cache, of course, since it's static compression) is currently lacking. I'm saying this in the "conventional", "IIS-kind" of way, as opposed to the build-time gzip generation. Or did I miss something?

For those coming here from Google with dynamic content compression needs, you might try this gist: https://gist.github.com/PinpointTownes/ac7059733afcf91ec319

aspnet-hello commented 6 years ago

From @JohannesRudolph on Tuesday, July 5, 2016 1:33:54 AM

Would it be beyond the scope of this to suggest that StaticFiles should optionally allow cached "on-demand" compression of files? (I.e. gzip on first request and put result to file system cache, serve gzip requests from there)
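
A rough sketch of that idea (an illustration, not an existing feature; no locking or error handling): compress on first request, persist the result next to the original, and regenerate when the source is newer.

using System.IO;
using System.IO.Compression;

static class GzipFileCache
{
    // Returns the path of an up-to-date .gz sibling, creating it on first use.
    public static string GetOrCreateGzip(string physicalPath)
    {
        var gzPath = physicalPath + ".gz";
        if (!File.Exists(gzPath) ||
            File.GetLastWriteTimeUtc(gzPath) < File.GetLastWriteTimeUtc(physicalPath))
        {
            using var input = File.OpenRead(physicalPath);
            using var output = File.Create(gzPath);
            using var gzip = new GZipStream(output, CompressionLevel.Optimal);
            input.CopyTo(gzip);
        }
        return gzPath;
    }
}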

aspnet-hello commented 6 years ago

From @twilliamsgsnetx on Friday, July 22, 2016 4:39:03 AM

Really looking forward to being able to do this. I can go through all the trouble of bundling, minifying, tree shaking and gzipping to get things down to an absolute minimum size... but I can't serve those .js.gz files and it's problematic.

I'm also struggling to get IIS to gzip on its own as well, but that's for another area entirely.

aspnet-hello commented 6 years ago

From @neyromant on Friday, July 29, 2016 10:26:24 PM

https://github.com/aspnet/Home/issues/1584

https://github.com/AnderssonPeter/CompressedStaticFiles

aspnet-hello commented 6 years ago

From @dfaivre on Wednesday, February 15, 2017 4:34:00 AM

With Angular (2) creating *.gz files when doing --prod builds, it would be great to have this out of the box.

aspnet-hello commented 6 years ago

From @RehanSaeed on Wednesday, February 15, 2017 4:36:34 AM

Other related use cases are serving pre-brotli compressed (.br) files and also serving WebP (.webp) files instead of PNG/JPG/etc.

aspnet-hello commented 6 years ago

From @Tratcher on Friday, February 17, 2017 5:54:44 AM

@JohannesRudolph you should now be able to combine ResponseCaching, ResponseCompression, and StaticFiles to achieve dynamic compression and caching of static files. @JunTaoLuo this would be a good combo to test.
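
For reference, a minimal sketch of that combination (ordering is the point: caching wraps compression, which wraps the static file handler, so the compressed response is what gets cached):

using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddResponseCaching();
        services.AddResponseCompression();
    }

    public void Configure(IApplicationBuilder app)
    {
        // Order matters: ResponseCaching sees the compressed output of the
        // middleware below it, and StaticFiles produces the body.
        app.UseResponseCaching();
        app.UseResponseCompression();
        app.UseStaticFiles();
    }
}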

aspnet-hello commented 6 years ago

From @Tratcher on Friday, February 17, 2017 5:59:33 AM

Can folks clarify if they expect the original url to contain the compression extension (e.g. ".gz")? Some of the above samples do and some don't. I assume content negotiation based on accept headers would be the more general case.

aspnet-hello commented 6 years ago

From @JohannesRudolph on Friday, February 17, 2017 6:03:22 AM

Great! Content-negotiation is (from my experience) the far more often-used and preferred method.

aspnet-hello commented 6 years ago

From @dfaivre on Monday, February 20, 2017 5:31:20 AM

@Tratcher -- I would not expect the original URL to contain the extension.

Also, looks like the Angular CLI will no longer output the "*.gz" files, so @Tratcher's ResponseCaching and ResponseCompression solution may be sufficient in that case?

aspnet-hello commented 6 years ago

From @firecube on Thursday, March 23, 2017 1:27:59 PM

@Tratcher do you have an example of combining those three bits of middleware to suit this scenario?

Thank you.

aspnet-hello commented 6 years ago

From @Tratcher on Thursday, March 23, 2017 1:29:33 PM

@JunTaoLuo can you do a sample?

aspnet-hello commented 6 years ago

From @firecube on Sunday, April 2, 2017 5:22:57 AM

A small example would be really appreciated here if possible. Thanks in advance.

aspnet-hello commented 6 years ago

From @JunTaoLuo on Sunday, April 2, 2017 3:59:35 PM

You can take a look at the sample I have created at https://github.com/JunTaoLuo/MiddlewaresSample which uses ResponseCaching, ResponseCompression and StaticFiles to create, cache and serve different representations of the same resource.

Here's the sample output where I made 6 requests to http://localhost:5000/Foo.txt. The first was sent with no Accept-Encoding header, the second contained Accept-Encoding: gzip and the third had Accept-Encoding: custom. Then I made the first three requests again and checked that they were served from cache.

Hosting environment: Production
Content root path: C:\gh\tp\MiddlewaresSample\MiddlewaresSample\MiddlewaresSample
Now listening on: http://localhost:5000
Application started. Press Ctrl+C to shut down.
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[1]
      Request starting HTTP/1.1 GET http://localhost:5000/Foo.txt
info: Microsoft.AspNetCore.ResponseCaching.ResponseCachingMiddleware[24]
      No cached response available for this request.
info: Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware[2]
      Sending file. Request path: '/Foo.txt'. Physical path: 'C:\gh\tp\MiddlewaresSample\MiddlewaresSample\MiddlewaresSample\wwwroot\Foo.txt'
info: Microsoft.AspNetCore.ResponseCaching.ResponseCachingMiddleware[26]
      The response has been cached.
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[2]
      Request finished in 206.5195ms 200 text/plain
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[1]
      Request starting HTTP/1.1 GET http://localhost:5000/Foo.txt
info: Microsoft.AspNetCore.ResponseCaching.ResponseCachingMiddleware[24]
      No cached response available for this request.
info: Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware[2]
      Sending file. Request path: '/Foo.txt'. Physical path: 'C:\gh\tp\MiddlewaresSample\MiddlewaresSample\MiddlewaresSample\wwwroot\Foo.txt'
info: Microsoft.AspNetCore.ResponseCaching.ResponseCachingMiddleware[26]
      The response has been cached.
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[2]
      Request finished in 23.3728ms 200 text/plain
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[1]
      Request starting HTTP/1.1 GET http://localhost:5000/Foo.txt
info: Microsoft.AspNetCore.ResponseCaching.ResponseCachingMiddleware[24]
      No cached response available for this request.
info: Microsoft.AspNetCore.StaticFiles.StaticFileMiddleware[2]
      Sending file. Request path: '/Foo.txt'. Physical path: 'C:\gh\tp\MiddlewaresSample\MiddlewaresSample\MiddlewaresSample\wwwroot\Foo.txt'
info: Microsoft.AspNetCore.ResponseCaching.ResponseCachingMiddleware[26]
      The response has been cached.
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[2]
      Request finished in 3.5304ms 200 text/plain
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[1]
      Request starting HTTP/1.1 GET http://localhost:5000/Foo.txt
info: Microsoft.AspNetCore.ResponseCaching.ResponseCachingMiddleware[22]
      Serving response from cache.
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[2]
      Request finished in 7.703ms 200 text/plain
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[1]
      Request starting HTTP/1.1 GET http://localhost:5000/Foo.txt
info: Microsoft.AspNetCore.ResponseCaching.ResponseCachingMiddleware[22]
      Serving response from cache.
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[2]
      Request finished in 1.4551ms 200 text/plain
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[1]
      Request starting HTTP/1.1 GET http://localhost:5000/Foo.txt
info: Microsoft.AspNetCore.ResponseCaching.ResponseCachingMiddleware[22]
      Serving response from cache.
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[2]
      Request finished in 1.3005ms 200 text/plain

aspnet-hello commented 6 years ago

From @Tratcher on Sunday, April 2, 2017 8:24:55 PM

Hmm, no logs for the compression middleware...

aspnet-hello commented 6 years ago

From @RudeySH on Thursday, October 12, 2017 3:58:05 AM

@muratg why was this closed?

aspnet-hello commented 6 years ago

From @joeaudette on Sunday, October 22, 2017 1:55:08 PM

I came across this because I've started using webpack to pre-gzip my js and css. @resnyanskiy your code set me on the right path, but I wanted to avoid the redirect.

The solution I came up with was implementing a custom IFileProvider based on the code from CompositeFileProvider.

I set it up like this:

app.UseStaticFiles(new StaticFileOptions()
{
    OnPrepareResponse = GzipMappingFileProvider.OnPrepareResponse,
    FileProvider = new GzipMappingFileProvider(
        loggerFactory,
        true,
        Environment.WebRootFileProvider)
});

It uses a convention: if the requested file ends with .min.js or .min.css, it will look for the same file name with .gz appended, and if found it returns that. Later I got the idea to add logic that tries to create the .gz file if it does not exist, returning it on success and falling back to the original file otherwise.
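
A rough sketch of that convention (illustrative only - not the actual cloudscribe implementation linked below): for *.min.js / *.min.css requests, prefer an existing sibling *.gz file.

using System;
using Microsoft.Extensions.FileProviders;

public class MinFileGzipProvider
{
    private readonly IFileProvider _inner;

    public MinFileGzipProvider(IFileProvider inner) => _inner = inner;

    public IFileInfo GetFileInfo(string subpath)
    {
        if (subpath.EndsWith(".min.js", StringComparison.OrdinalIgnoreCase) ||
            subpath.EndsWith(".min.css", StringComparison.OrdinalIgnoreCase))
        {
            var gz = _inner.GetFileInfo(subpath + ".gz");
            if (gz.Exists)
                return gz; // serve the pre-compressed sibling under the original URL
        }
        return _inner.GetFileInfo(subpath);
    }
}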

It seems to be working well, would appreciate any feedback on the implementation, found here: https://github.com/joeaudette/cloudscribe/blob/master/src/cloudscribe.Web.Common/StaticFiles/GzipMappingFileProvider.cs

My solution uses standard .min.js and .min.css URLs, but the .gz file is what gets served; I'm not using .gz in my URLs.

One known issue is that this solution bypasses content negotiation and just gives you gzip whether you like it or not, but in my view that's not really a big issue for real browsers.

aspnet-hello commented 6 years ago

From @Tratcher on Sunday, October 22, 2017 6:59:05 PM

That approach sounds like it would mess up the content-length and etag headers.

aspnet-hello commented 6 years ago

From @joeaudette on Monday, October 23, 2017 4:35:28 AM

@Tratcher could you elaborate on how that would get messed up? The IFileProvider is passing up the IFileInfo about the gz file so it would have the correct content length of the gz file, isn't that what it should have?

We still have an opportunity to tweak the headers in OnPrepareResponse if there is something messed up, but I'm trying to understand what would be messed up and why.

aspnet-hello commented 6 years ago

From @Tratcher on Monday, October 23, 2017 6:48:18 AM

Content-length and gzip are rarely used together because the implications are really confusing. I'll need to verify, but I think the content-length is supposed to be the uncompressed length rather than the compressed length.

StaticFiles also uses the length to calculate the etag, so your pre-compressed file will have a different etag than the uncompressed version, even if the contents are the same.

StaticFiles also has built-in support for range headers, which refer to offsets in the uncompressed file. This won't work with compressed files.

Managing pre-compression in the file provider is inadequate; it needs to be built into StaticFiles to make the above scenarios work correctly (or at least be bypassed correctly).

aspnet-hello commented 6 years ago

From @joeaudette on Monday, October 23, 2017 6:58:57 AM

@Tratcher that makes sense for dynamic compression, but the issue here is pre-compressed static files. I think we would want content-length for any static file, wouldn't we? And it should be used to calculate the etag, I would think.

In my scenario the .gz file is created by the webpack build process; while my FileProvider can be configured to generate the .gz file, that is a secondary concern - mainly I am serving static files that are already gzipped.

In my scenario, without content negotiation, the browser is only going to get the already-compressed static file; the uncompressed file would not be returned unless the compressed file does not exist and could not be created. If the source file has been modified more recently than the compressed file, I regenerate the compressed file.

aspnet-hello commented 6 years ago

From @joeaudette on Monday, October 23, 2017 7:12:19 AM

@Tratcher so are you saying that for my pre-gzipped files I should remove the Content-Length and Accept-Ranges headers?

aspnet-hello commented 6 years ago

From @Tratcher on Tuesday, October 24, 2017 10:16:24 AM

Never mind on content-length, I re-checked the spec and verified that content-length and transfer-encoding don't mix, but content-encoding is OK.

I still think Range requests will be broken though; those offsets should be for the uncompressed representation. So yes, at least remove the Accept-Ranges header. I haven't found any spec references to confirm this yet.
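
One minimal way to apply that suggestion in OnPrepareResponse (a sketch inside Configure, with Microsoft.Net.Http.Headers in scope; it assumes StaticFiles has already applied its headers by the time the callback runs):

app.UseStaticFiles(new StaticFileOptions
{
    OnPrepareResponse = ctx =>
    {
        if (ctx.File.Name.EndsWith(".gz"))
        {
            ctx.Context.Response.Headers[HeaderNames.ContentEncoding] = "gzip";
            // Range offsets would refer to the compressed bytes, so advertise no range support.
            ctx.Context.Response.Headers.Remove(HeaderNames.AcceptRanges);
        }
    }
});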

aspnet-hello commented 6 years ago

From @joeaudette on Tuesday, October 24, 2017 1:24:20 PM

@Tratcher thanks! I am by no means an expert, but I'm still a little doubtful that I should remove the Accept-Ranges: bytes header. Other samples I have found for pre-gzipped files seem to include that header. I would think the offsets sent by the browser would mainly be to recover from partial downloads and get the rest of the file, and that would be true even if the file is pre-gzipped.

I found this, which seems to indicate that Chrome and Firefox, for example, store the file in its content-encoded format (i.e. .gz) and will send range requests for those files to complete a failed partial download.

https://lists.w3.org/Archives/Public/ietf-http-wg/2014AprJun/0112.html

aspnet-hello commented 6 years ago

From @Tratcher on Tuesday, October 24, 2017 1:38:21 PM

I expect a problem due to the differences between static and dynamic compression. With dynamic compression a range request would be processed against the uncompressed content and then the result would be compressed. With static compression the range request is processed against the compressed content. How is the client supposed to tell the difference? There's no indication in the response that you used static or dynamic compression, except that dynamic compression often uses chunked rather than content-length.

Maybe it doesn't matter so long as the behavior is consistent on a per resource basis.

aspnet-hello commented 6 years ago

From @joeaudette on Tuesday, October 24, 2017 1:45:36 PM

@Tratcher I found this on SO, which seems to indicate content-encoding is a property of the entity whereas transfer-encoding is a property of the message, so it would seem logical if dynamic compression used transfer-encoding and pre-compressed static files used content-encoding: https://stackoverflow.com/questions/11641923/transfer-encoding-gzip-vs-content-encoding-gzip

aspnet-hello commented 6 years ago

From @Tratcher on Tuesday, October 24, 2017 1:52:30 PM

The edit on that answer is really telling: "Edit: That is, unless you want to serve gzipped content to clients that only understand Content-Encoding. Which, unfortunately, seems to be most of them. But be aware that you leave the realms of the spec"

aspnet-hello commented 6 years ago

From @joeaudette on Tuesday, October 24, 2017 2:08:35 PM

Yes, I saw that - notice I said it would be logical, not that that's how it's done. It sounds like in practice dynamic compression does it wrong: implementations should use transfer-encoding, but they don't because of browsers :-D

Nevertheless, I think it will be OK to keep the Accept-Ranges header. From my previous link, I suspect it would work the same way with dynamic compression: the offsets would be based on the compressed file, not the decompressed one, because dynamic compression also uses the content-encoding header, so the client doesn't know it is dynamic. I don't think a browser would use offsets based on the decompressed file. The previous link mentions that Chrome and Firefox cache the file compressed and make range requests against that file; IE caches the file decompressed but only makes range requests for uncompressed files, if I understood it correctly. It would seem weird for a browser to use offsets that were not against the actual file it downloaded but against a version transformed after download. It has to use the content-length of what it downloaded and calculate offsets based on how much of that it already got, I would think.

aspnet-hello commented 6 years ago

From @herecydev on Monday, December 4, 2017 1:30:06 AM

Adding my implementation into the mix. Use case: the client requests /foo.js, the WebRootPath contains foo.js.gz or foo.js.br, and the compressed versions are preferred.

public class CompressionFileProvider : IFileProvider
{
    private readonly IFileProvider _fileProvider;
    private readonly IHttpContextAccessor _httpContextAccessor;
    private readonly string _root;

    public CompressionFileProvider(IHostingEnvironment hostingEnvironment, IHttpContextAccessor httpContextAccessor)
    {
        _fileProvider = hostingEnvironment.WebRootFileProvider;
        _httpContextAccessor = httpContextAccessor;
        _root = hostingEnvironment.WebRootPath;
    }

    public IDirectoryContents GetDirectoryContents(string subpath)
        => _fileProvider.GetDirectoryContents(subpath);

    public IFileInfo GetFileInfo(string subpath)
    {
        if (_httpContextAccessor.HttpContext.Request.Headers.TryGetValue("Accept-Encoding", out var encodings))
        {
            if (encodings.Any(encoding => encoding.Contains("br")))
            {
                var compressedEncoding = _fileProvider.GetFileInfo(subpath + ".br");
                if (compressedEncoding.Exists)
                    return compressedEncoding;
            }

            if (encodings.Any(encoding => encoding.Contains("gzip")))
            {
                var compressedEncoding = _fileProvider.GetFileInfo(subpath + ".gz");
                if (compressedEncoding.Exists)
                    return compressedEncoding;
            }
        }

        return _fileProvider.GetFileInfo(subpath);
    }

    public IChangeToken Watch(string filter)
        => _fileProvider.Watch(filter);
}

public static class ApplicationBuilderExtensions
{
    public static IApplicationBuilder UseCompressedStaticFiles(this IApplicationBuilder applicationBuilder, IHostingEnvironment hostingEnvironment, IHttpContextAccessor httpContextAccessor)
    {
        return applicationBuilder.UseStaticFiles(new StaticFileOptions
        {
            FileProvider = new CompressionFileProvider(hostingEnvironment, httpContextAccessor),
            OnPrepareResponse = ctx =>
            {
                var headers = ctx.Context.Response.Headers;

                if (ctx.File.Name.EndsWith(".br"))
                    headers.Add("Content-Encoding", "br");
                else if (ctx.File.Name.EndsWith(".gz"))
                    headers.Add("Content-Encoding", "gzip");
            }
        });
    }
}

RehanSaeed commented 6 years ago

I'm trying to use the ResponseCompression middleware after the StaticFiles middleware to serve Brotli (.br) or GZIP (.gz) encoded files if available.

However, ResponseCompression keeps GZIP-encoding files that are already Brotli encoded. It also ends up double-encoding when I serve a static GZIP file. It seems it's not checking for already-encoded files, and the two pieces of middleware are clashing.

Tratcher commented 6 years ago

StaticFiles doesn't set the content-encoding headers; you have to do it yourself in the OnPrepareResponse event. ResponseCompression can't tell it's compressed unless you set that header.

deedubb commented 6 years ago

As requested by @DamianEdwards, I am adding a note to remind him/others of the chain resulting from https://twitter.com/DamianEdwards/status/980814294756425728, which brings up the CRIME and BREACH exploits in HTTPS when using gzip compression on dynamic pages (any page containing user content). As per the last comment in the chain, this issue should be reviewed for 2.2.

In the meantime: because of these security issues, the official stance is to disable compression for HTTPS by default. In my opinion, if you know you only put user-modified variables in text/html pages, you could use dynamic compression and exclude HTML pages based on MIME type. "As-is" rules apply...

services.AddResponseCompression(options => {
    options.EnableForHttps = true;
    /* Included defaults: "text/plain", "text/css", "application/javascript",
            "text/html", "application/xml", "text/xml", "application/json", "text/json" */
    options.MimeTypes = ResponseCompressionDefaults.MimeTypes
        .Where(mime => !mime.Contains("text/html"))
        .Concat(new[] {
            "application/font-woff",
            "font/woff2"
        });
});

/cc @blowdart

blowdart commented 6 years ago

This could be enhanced by supporting compression in the static files middleware.

phatcher commented 6 years ago

@deedubb Are we not protected, as the CSRF token is randomized? (see https://blog.qualys.com/ssllabs/2013/08/07/defending-against-the-breach-attack)

DamianEdwards commented 6 years ago

I think the eventual fix here would be to introduce a new request feature that the static file middleware can set when serving files, indicating the response is from static content, thus allowing the compression middleware to safely compress responses only when the feature indicates it was static in nature. Of course, it's still possible for another middleware in between to change the content and introduce issues again. If we care enough about that, we could have the feature actually poison the response headers and body (e.g. via wrapping) such that they can't be modified.

Also, technically it isn't about what's static but what contains user-manipulable content, so the feature should likely be built around that premise instead, allowing any response (including dynamic responses from things like MVC actions) to declare that it's free from user manipulation and thus safe to compress over HTTPS.
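
A hypothetical shape for such a feature (nothing like this existed at the time of the comment; the name and members are invented for illustration):

// Set by middleware that knows its response body contains no
// user-manipulable content, e.g. the static file middleware.
public interface ICompressionSafetyFeature
{
    // True when compressing this response over HTTPS cannot enable
    // BREACH/CRIME-style attacks, because no user input is reflected.
    bool IsSafeToCompress { get; }
}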

arex388 commented 3 years ago

I want to toss in my two cents here about this. From about half of the discussion that I read, people are trying to generate and cache the compressed files at application runtime and then go from there. To me that is wasteful.

Using Gulp, I generate my pre-compressed static files at build time and publish those, then serve the correct one using my PreCompressedStaticFiles middleware (a sketch of the idea follows the files below). This way I don't have to deal with server-side generation, caching, etc. That is how I serve the CSS and JS files on my site.

gulpfile.js

///  <binding AfterBuild="global:compile" Clean="global:clean-all" ProjectOpened="global:watch" />

const rootNode = "./node_modules",
    rootResources = "./Resources",
    rootWww = "./wwwroot",
    sourcesCss = [
        `${rootNode}/normalize.css/normalize.css`,
        `${rootResources}/Styles/**/*.css`
    ],
    sourcesJs = [
        `${rootResources}/Scripts/**/*.js`
    ],
    sourcesLess = [
        `${rootResources}/Styles/**/*.less`
    ],
    targetCss = "styles.min.css",
    targetJs = "scripts.min.js";

const del = require("del"),
    gulp = require("gulp"),
    gulpBrotli = require("gulp-brotli"),
    gulpCleanCss = require("gulp-clean-css"),
    gulpConcat = require("gulp-concat"),
    gulpGzip = require("gulp-gzip"),
    gulpLess = require("gulp-less"),
    gulpRename = require("gulp-rename"),
    gulpTerser = require("gulp-terser-js"),
    rollup = require("rollup");

//  ========================================================================
//  LESS to CSS
//  ========================================================================

/**
 * Delete CSS files. Excludes all Node CSS files.
 */
gulp.task("css:clean", () => del([
    `${rootWww}/**/*.css`,
    `!${rootNode}/**/*.css`
]));

/**
 * Delete all CSS files. Excludes all Node CSS files.
 */
gulp.task("css:clean-all", () => del([
    `./**/*.css`,
    `!${rootNode}/**/*.css`
]));

/**
 * Compress minified CSS with Brotli.
 */
gulp.task("css:compress-brotli", () => gulp.src(`${rootWww}/${targetCss}`)
    .pipe(gulpBrotli())
    .pipe(gulp.dest(rootWww)));

/**
 * Compress minified CSS with GZIP.
 */
gulp.task("css:compress-gzip", () => gulp.src(`${rootWww}/${targetCss}`)
    .pipe(gulpGzip({
        gzipOptions: {
            level: 9
        }
    }))
    .pipe(gulp.dest(rootWww)));

/**
 * Concatenate all CSS files into a single CSS file.
 */
gulp.task("css:concatenate", () => gulp.src(sourcesCss)
    .pipe(gulpConcat("concatenated.css"))
    .pipe(gulp.dest(rootWww)));

/**
 * Minify the concatenated CSS file.
 */
gulp.task("css:minify", () => gulp.src(`${rootWww}/concatenated.css`)
    .pipe(gulpCleanCss({
        level: {
            1: {
                specialComments: 0
            }
        }
    }))
    .pipe(gulpRename(targetCss))
    .pipe(gulp.dest(rootWww)));

/**
 * Delete all CSS files after minification.
 */
gulp.task("css:minify-clean", () => del([
    `${rootWww}/**/*.css`,
    `!${rootWww}/${targetCss}`
]));

/**
 * Compile all CSS files.
 */
gulp.task("css:compile", gulp.series(
    "css:concatenate",
    "css:minify",
    "css:minify-clean",
    "css:compress-brotli",
    "css:compress-gzip"
));

//  ========================================================================
//  LESS
//  ========================================================================

/**
 * Compile all LESS files.
 */
gulp.task("less:compile", () => gulp.src(sourcesLess, { base: "./" })
    .pipe(gulpLess())
    .pipe(gulp.dest("./")));

//  ========================================================================
//  JavaScript
//  ========================================================================

/**
 * Delete JS files. Excludes this gulpfile.js and
 * all Node JS files.
 */
gulp.task("js:clean", () => del([
    `${rootWww}/**/*.js`,
    `!${rootNode}/**/*.js`
]));

/**
 * Delete all JS files. Excludes this gulpfile.js and
 * all Node JS files.
 */
gulp.task("js:clean-all", () => del([
    `./**/*.js`,
    "!./gulpfile.js",
    `!${rootNode}/**/*.js`
]));

/**
 * Compress minified JS with Brotli.
 */
gulp.task("js:compress-brotli", () => gulp.src(`${rootWww}/${targetJs}`)
    .pipe(gulpBrotli())
    .pipe(gulp.dest(rootWww)));

/**
 * Compress minified JS with GZIP.
 */
gulp.task("js:compress-gzip", () => gulp.src(`${rootWww}/${targetJs}`)
    .pipe(gulpGzip({
        gzipOptions: {
            level: 9
        }
    }))
    .pipe(gulp.dest(rootWww)));

/**
 * Delete all JS files after minification.
 */
gulp.task("js:minify-clean", () => del([
    `${rootWww}/**/*.js`,
    `!${rootWww}/${targetJs}`
]));

/**
 * Minify the rolledup JS file.
 */
gulp.task("js:minify", () => gulp.src(`${rootWww}/scripts.rolledup.js`)
    .pipe(gulpTerser({
        ecma: 9
    }))
    .pipe(gulpRename(targetJs))
    .pipe(gulp.dest(rootWww)));

/**
 * Rollup the JS files.
 */
gulp.task("js:rollup", () => rollup.rollup({
    input: `${rootResources}/Scripts/Default.js`
}).then(bundle => bundle.write({
    file: `${rootWww}/scripts.rolledup.js`,
    format: "iife",
    sourcemap: false
})));

/**
 * Compile all JS files.
 */
gulp.task("js:compile", gulp.series(
    "js:rollup",
    "js:minify",
    "js:minify-clean",
    "js:compress-brotli",
    "js:compress-gzip"
));

//  ========================================================================
//  Global
//  ========================================================================

/**
 * Delete all CSS and JS files, with exclusions.
 */
gulp.task("global:clean-all", gulp.parallel(
    "css:clean-all",
    "js:clean-all"
));

/**
 * Compile all CSS and JS files.
 */
gulp.task("global:compile", gulp.parallel(
    gulp.series(
        "less:compile",
        "css:clean",
        "css:compile"
    ),
    gulp.series(
        "js:clean",
        "js:compile"
    )
));

/**
 * Watch the file system for CSS, JS, or LESS file changes.
 */
gulp.task("global:watch", () => {
    gulp.watch(sourcesCss, gulp.series(
        "css:clean",
        "css:compile"
    ));

    gulp.watch(sourcesJs, gulp.series(
        "js:clean",
        "js:compile"
    ));

    gulp.watch(sourcesLess, gulp.series(
        "less:compile"
    ));
});

package.json

{
  "devDependencies": {
    "del": "6.0.0",
    "gulp": "4.0.2",
    "gulp-brotli": "3.0.0",
    "gulp-clean-css": "4.3.0",
    "gulp-concat": "2.6.1",
    "gulp-gzip": "1.4.2",
    "gulp-less": "5.0.0",
    "gulp-rename": "2.0.0",
    "gulp-terser-js": "5.2.2",
    "normalize.css": "8.0.1",
    "rollup": "2.53.2"
  },
  "name": "asp.net",
  "private": true,
  "version": "1.0.0"
}

Startup.cs

public sealed class Startup {
    public void Configure(
        IApplicationBuilder app) {
        app.UsePreCompressedStaticFiles()
           .UseStaticFiles();
    }
}

web.config

<?xml version="1.0" encoding="utf-8"?>
<configuration>
    <system.webServer>
        <httpCompression>
            <staticTypes>
                <add mimeType="text/css" enabled="false"/>
                <add mimeType="text/javascript" enabled="false"/>
            </staticTypes>
        </httpCompression>
    </system.webServer>
</configuration>
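
The PreCompressedStaticFiles middleware itself isn't shown above. A minimal sketch of the idea (names and details are assumptions, not arex388's actual code; env is the hosting environment, and Microsoft.Net.Http.Headers plus System.IO are in scope): rewrite the request path to an existing pre-compressed sibling before StaticFiles runs.

app.Use(async (context, next) =>
{
    var path = context.Request.Path.Value;
    var accept = context.Request.Headers[HeaderNames.AcceptEncoding].ToString();

    if (path != null && (path.EndsWith(".css") || path.EndsWith(".js")))
    {
        // Prefer Brotli, then gzip, when a pre-compressed sibling exists on disk.
        foreach (var (token, extension) in new[] { ("br", ".br"), ("gzip", ".gz") })
        {
            var candidate = Path.Combine(env.WebRootPath, path.TrimStart('/') + extension);
            if (accept.Contains(token) && File.Exists(candidate))
            {
                context.Request.Path = path + extension;
                context.Response.Headers[HeaderNames.ContentEncoding] = token;
                context.Response.Headers[HeaderNames.Vary] = "Accept-Encoding";
                break;
            }
        }
    }

    await next();
});

// Note: StaticFiles must also map .br/.gz to the original content types (or
// ServeUnknownFileTypes must be enabled), otherwise unknown extensions 404.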

Trolldemorted commented 3 years ago

@JunTaoLuo is a copy of https://github.com/JunTaoLuo/MiddlewaresSample still available somewhere?

And are there any plans to support this natively in kestrel? Having precompressed .br and .gz files and serving them instead of recompressing the original file when a request with a matching accept-encoding header comes in should not be an unusual use case.

davidfowl commented 3 years ago

Natively in Kestrel? No. Via existing middleware in ASP.NET Core? Maybe in the future.

AnderssonPeter commented 2 years ago

I have created a NuGet package, https://github.com/AnderssonPeter/CompressedStaticFiles, that solves both gzip/brotli serving and alternative image files. It's based on neyromant's middleware!

javiercn commented 3 months ago

Fixed as part of https://github.com/dotnet/aspnetcore/pull/55558

AnderssonPeter commented 3 months ago

Fixed as part of https://github.com/dotnet/aspnetcore/pull/55558

Cool! Any chance you could give us a few hints on how to use the new functionality to achieve serving pre-compressed content?
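
For readers landing here: assuming that PR is the .NET 9 static web assets work (an assumption, not confirmed in this thread), a minimal usage sketch would be the new MapStaticAssets registration, which serves representations pre-compressed at build/publish time with Accept-Encoding negotiation:

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// MapStaticAssets (new in .NET 9) serves assets pre-compressed at build time,
// negotiating gzip/Brotli via the Accept-Encoding header.
app.MapStaticAssets();

app.Run();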