epacke / BigIPReport

Overview of your loadbalancer configuration
https://loadbalancing.se

pre-compress json files #175

Closed timriker closed 3 years ago

timriker commented 4 years ago

It might be nice for the script to have an option to create files like virtualservers.json.gz and then have those loaded and decompressed on the client side instead of recompressing them each time they go over the wire. These files can get quite large, and compressing them reduces the initial load and render times significantly.
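Something along these lines in the script could create the .gz files (the path here is just an example, not what the script uses today; whether the client then decompresses them itself or the server sends Content-Encoding: gzip is a separate question):

# Rough sketch: write a pre-compressed copy next to the json file
$in  = [System.IO.File]::OpenRead("json/virtualservers.json")
$out = [System.IO.File]::Create("json/virtualservers.json.gz")
$gz  = [System.IO.Compression.GZipStream]::new($out, [System.IO.Compression.CompressionMode]::Compress)
$in.CopyTo($gz)
$gz.Dispose(); $out.Dispose(); $in.Dispose()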

epacke commented 4 years ago

Wouldn't a modern web server handle this with local caching? I run BigIPReport behind an F5 and the compression profile does wonders, but I have not verified whether it caches items.

timriker commented 4 years ago

I did some experimenting with this. Configuring the correct headers on the server side is non-trivial across different platforms.

On my current setup I have the F5 doing compression for application/json. This means the files transfer much faster, especially at home over VPN, but it also means that the Content-Length header is not filled in by the F5. It's possible to work around that by changing the F5 setup, but that means allocating more memory to compression, which I'm not eager to do.

Compressing the files as json/*.json.gz is easy. Getting the MIME type right so that the ajax code opens them correctly is not so easy. If they are already compressed, then the Content-Length header would be correct and the Pace progress bar accurate. When the F5 is doing compression, the Pace bar is a guess, since I think it does not know the size of the json files.

The images seem to be cached. I don't think we want the json files cached, though; we want to reload all of them so they stay in sync.

I guess if we close the page and re-open it, using cached json pages might be a Good Thing. My reporting runs every hour, so ideally the pages would get cached till about the time of the next update. I've not looked into this. I think I could set it up on the web server.
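If I did set it up on the web server, something like this in Apache (assuming mod_headers is enabled) might be close enough, although a fixed max-age only approximates "until the next report run":

<FilesMatch "\.json$">
    Header set Cache-Control "max-age=3600"
</FilesMatch>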

epacke commented 4 years ago

How about dockerizing the application now that it can run in Linux? Then we could pre-package it with a web server that does compression.

timriker commented 4 years ago

That's worth considering. The xml would need to be created as input for the docker build, right?

epacke commented 4 years ago

Suppose we use docker-compose; then we could expose the directory through the yml volume setting. Or use a start script. I can look at this.
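Just as a sketch of the idea, something like this docker-compose.yml (the image, port and directory names are placeholders, nothing is decided):

version: "3"
services:
  bigipreport-web:
    image: nginx:alpine                      # placeholder; any web server that can compress would do
    ports:
      - "8080:80"
    volumes:
      - ./report:/usr/share/nginx/html:ro   # directory where the script writes its output
      # a custom nginx.conf with gzip enabled could be mounted the same way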

timriker commented 3 years ago

If we can use Brotli, then we should be able to have .js.br and .json.br files on disk, with the server configured to send the correct headers so that the client can decompress them. This means we only compress the files once at build time, and not once for each client load. Article here: https://css-tricks.com/brotli-static-compression/

F5 has no timeline to include br (Brotli) compression internally: https://support.f5.com/csp/article/K99782219. But if the files are pre-compressed, then the actual web server just needs to set the correct headers, i.e.:

Content-type: application/json; charset=utf-8
Content-Encoding: br

And the files under json/ can all be json/*.json.br files. Similarly, the files under js/ would be js/*.js.br. .NET should have System.IO.Compression.Brotli; I'm not sure what PowerShell has for a stream. https://devblogs.microsoft.com/dotnet/introducing-support-for-brotli-compression/
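PowerShell 7 (on .NET Core) should be able to use the BrotliStream class directly. A rough, untested sketch, with an example path:

# Pre-compress one json file with Brotli (requires PowerShell 7 / .NET Core)
$in  = [System.IO.File]::OpenRead("json/virtualservers.json")
$out = [System.IO.File]::Create("json/virtualservers.json.br")
$br  = [System.IO.Compression.BrotliStream]::new($out, [System.IO.Compression.CompressionLevel]::Optimal)
$in.CopyTo($br)
$br.Dispose(); $out.Dispose(); $in.Dispose()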

timriker commented 3 years ago

In Apache this config works:

Options +MultiViews
AddEncoding br .brotli

Then compress all the json files with brotli and remove the uncompressed versions to trigger MultiViews:

brotli -f -Z -S .brotli *.json
rm *.json

Same thing for .html and .js files. This will break any client that does not support brotli files. Moving the originals to a double extension should work for both clients. ie: mv nat.json nat.json.json tedious, but it could be scripted in the powershell file. I have not found an IIS setting to add. This article might help: https://stackoverflow.com/questions/48889701/setup-iis10-to-serve-pre-compressed-files