alloc / vite-plugin-compress

Compress your bundle + assets from Vite
MIT License

GZip compression #1

Open mseele opened 3 years ago

mseele commented 3 years ago

Cool plugin. Any plans to add gzip compression as well?

aleclarson commented 3 years ago

Any reason you prefer gzip over brotli?

mseele commented 3 years ago

To support "old browsers". I'm working in a corporate environment where we need to support them :(

polarathene commented 3 years ago

Brotli isn't always an available option on the server end, even if a client supports it.

With nginx, for example, I think brotli requires a separate module (though that may only apply to on-demand compression, not pre-compressed assets). This has been a common issue for self-hosted deploys that use Docker images but don't extend them to include the module or configure them to serve brotli assets.

Traefik isn't a backend web server, only a reverse proxy, but it optionally offers on-demand gzip; it lacked brotli last I checked. The same goes for Caddy, which is a web server like nginx. Some SaaS offerings like Netlify didn't support brotli for some time, so I wouldn't be surprised if others still don't.

That said, I'm not sure there's much value in pre-compressing gzip assets, since on-demand support from the server/service is usually fine: the client requests the asset advertising gzip support, and the server compresses it into memory (if it hasn't already cached it from a previous request) and sends it to the client, ideally with Cache-Control headers to minimize bandwidth on both ends.


@mseele are you not able to provide gzip on demand? I understand there can still be benefit in avoiding any extra CPU to gzip static assets that you can pre-compress, but it's fairly common to have gzip on-demand support for things like dynamic content/responses (eg JSON API responses).

Click to see instructions for a simple web server setup with gzip

With Caddy, this is a simple `Caddyfile` config:

```
# adjust to your domain name
example.com {
    # adjust path to your web project root; `file_server` enables serving static content
    root * /srv/my-website
    file_server

    # files requested with these extensions that you want to cache
    @assets {
        path_regexp \.(jpg|png|webp)$
    }
    # Cache-Control header example for a 1-day cache
    header @assets Cache-Control "public, max-age=86400"

    # enables gzip; this will apply to every asset.
    # the upcoming v2.4 release of Caddy will have saner defaults for what content
    # to compress; you can specify your own like the `@assets` example above.
    encode gzip
}
```

Then, to try that out on a Linux server, you can use the CLI to download a release; it's a single binary file, no need to install any package:

- Config: modify the above config as needed and place it in a file named `Caddyfile` in your web project root.
- Download: `curl -OL "https://github.com/caddyserver/caddy/releases/download/v2.3.0/caddy_2.3.0_linux_amd64.tar.gz"` (latest stable release from [GitHub Releases](https://github.com/caddyserver/caddy/releases/tag/v2.3.0)).
- Extract: `tar -xvzf caddy_2.3.0_linux_amd64.tar.gz`
- Run: `./caddy start` (runs the process in the background; you can exit and it keeps running) or `./caddy run` (runs the process in the foreground, logging to stdout for testing).

The default config provides a lot of implicit setup out of the box that older server software like nginx normally needs to be configured for: automatic HTTP-to-HTTPS redirect, automatic LetsEncrypt / ZeroSSL provisioning of your HTTPS certs, and automatic renewal of those certs. More info at the [official docs](https://caddyserver.com/docs/). If you use Docker and prefer that, there's also a Docker image. But if you already use a popular web server such as nginx or Apache, you should have no problem enabling gzip on-demand support.
The v2.4 release of Caddy brings pre-compressed support, so soon you'll also be able to serve gzip or brotli pre-compressed assets (you can do this atm, but the config isn't as simple).

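For reference, the v2.4-style config for serving pre-compressed assets could look roughly like this (a sketch based on the `precompressed` sub-directive of `file_server`; check the Caddy docs for your version):

```
example.com {
    root * /srv/my-website
    # serve pre-built .br / .gz siblings when the client supports them
    file_server {
        precompressed br gzip
    }
    # fall back to on-demand gzip for anything without a pre-compressed sibling
    encode gzip
}
```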
mseele commented 3 years ago

@polarathene i switched to https://github.com/anncwb/vite-plugin-compression it provides gzip and brotli support 👍

dlqqq commented 3 years ago

@mseele I'm curious what your use-case is. ES6, which is the lowest transpilation setting available to Vite without plugins, has lower browser support than brotli. So if you're using Vite's default configs, adding brotli won't decrease your browser compatibility at all.
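For context, the transpilation setting in question is `build.target` in the Vite config. A minimal sketch (behavior here is per Vite 2's docs; double-check your version):

```javascript
// vite.config.js — sketch; Vite 2's default build.target is 'modules',
// i.e. browsers with native ES module support.
export default {
  build: {
    // 'es2015' is the lowest target Vite supports without plugins;
    // going lower requires @vitejs/plugin-legacy.
    target: 'es2015',
  },
}
```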

polarathene commented 3 years ago

I'm curious what your use-case is.

For me, I was using Vite for convenience, but serving a built static page that requires no JS at runtime. Thus it's good to have gzip support.

As mentioned in my previous comment, even when browser support is a non-issue, your web server can be. Caddy is only just about to release with support for pre-compressed brotli, plenty of other popular software can be found setup/configured without brotli support and the user may not have the option of resolving that.

Personally, I'm not fond of the alternative fork that was linked; the maintainer gave a poor response to the question I raised there. It doesn't support dual output either, so I just manually compressed the output in a Docker Alpine image by adding the brotli package plus zopfli (slightly better gzip compression for pre-compressed usage).

dlqqq commented 3 years ago

@polarathene I see; my apologies for not reading your comment more thoroughly. I still fail to understand the issue though; why can't the web server just be configured to serve responses with the

Content-Encoding: br, gzip

HTTP header configured? I'm sure that I'm oversimplifying a bit, but I'm just curious why that wouldn't work.

Back to the topic at hand, I am also dissatisfied with the current state of minification and compression plugins for Vite. It seems that most Vite plugins lack regular support and development, are highly specific to the author's use-case, and few have best practices like automated unit testing. I consider testing to be an essential requirement for production-ready code.

It's not well-documented, but Vite uses just 2 minification libraries: terser for JS, and clean-css for CSS. More minification and compression steps in the build process really should come built-in with Vite. I can hardly imagine a use-case where people want their websites to load slower and bundle sizes to be larger, so why do bundlers keep "forgetting" to implement this and delegate it to multiple separate plugins with varying code quality? Just seems silly. I'll make an issue in Vite and link it to you later.

aleclarson commented 3 years ago

highly specific to the author's use-case

That's typically how open source works in the absence of funding.

PR welcome for both GZIP compression and automated testing!

I can hardly imagine a use-case where people want their websites to load slower and bundle sizes to be larger, so why do bundlers keep "forgetting" to implement this and delegate it to multiple separate plugins with varying code quality?

I think the idea is that most servers run in front of caches, so on-demand compression is good enough. You only pay for the cost of compression once per asset, instead of once per request.

Generally speaking, we want to keep Vite's core as lightweight as possible, in order to reduce maintenance burden. In the case of JS and CSS, the minifiers you mentioned are not usually applied on the server-side, so it makes more sense for Vite to support them out-of-the-box. Other bundlers, like Parcel, also take this approach.

That said, Parcel is considering official support for various optimization plugins, so it's possible Vite will eventually follow their lead.

dlqqq commented 3 years ago

Generally speaking, we want to keep Vite's core as lightweight as possible, in order to reduce maintenance burden.

I can understand that. Thank you for explaining.

I think the idea is that most servers run in front of caches, so on-demand compression is good enough. You only pay for the cost of compression once per asset, instead of once per request.

You're right. However, I think that delegating as much responsibility as possible to the build step results in cleaner, smaller, and more maintainable code. But that's just a personal preference :stuck_out_tongue:

PR welcome for both GZIP compression and automated testing!

I'd be happy to contribute, since it seems you're still active on this project. I'll look into it :+1:

polarathene commented 3 years ago

I'm sure that I'm oversimplifying a bit, but I'm just curious why that wouldn't work.

You need to instruct the web server to add those headers and send the file with the relevant pre-compressed extension instead of the uncompressed file. It's a little simpler if you only ever intend to serve a single pre-compressed variant, but it still requires additional work from the system admin (whoever is configuring the server).

Caddy, for example, could support brotli prior to version 2.4, but it required much more manual config work, as someone blogged about. With the 2.4 release you don't need such a verbose config to support brotli; a single `encode zstd brotli gzip` config line does it, with no additional modules required, unlike nginx (a common issue with several popular Docker images that ship nginx with minimal config/modules).


It seems that most Vite plugins lack regular support and development, are highly specific to the author's use-case, and few have best practices like automated unit testing. I consider testing to be an essential requirement for production-ready code.

Vite is still fairly new, especially v2 AFAIK. It needs to gain more popularity and adoption, along with existing plugins gaining more users and contributors. Initially you may find multiple plugins doing roughly the same thing; through stats and word of mouth, one usually establishes itself within the community, with the final deciding factor being how well it remains maintained going forward.

Unfortunately, one user who seems to maintain many plugins decided to fork this one to add the features they wanted rather than contribute back here, and their README (and the related issue I raised) doesn't help instill confidence as it continues to diverge from this plugin.

They provided no response when asked about contributing back to upstream here, but I can understand that: it can consume a lot of time and effort that doesn't always result in an upstream merge (that's happened to me a few times).


More minification and compression steps in the build process really should come built-in with Vite.

It probably will come in time. Vite can focus its development efforts on plenty of other areas while the community establishes solutions via plugins; Vite can then later adopt one officially as a package, or internally, like I believe is happening atm for SVG optimization improvements.