Closed monfresh closed 9 years ago
Just to be clear: do you want to support uploading and downloading compressed files, or do you want files sent over the wire compressed?
Right now, we auto-compress HTML/CSS/JavaScript with a 1K threshold. You can see this with @kennethormandy's site: https://developers.google.com/speed/pagespeed/insights/?url=http%3A%2F%2Fkennethormandy.com%2F&tab=desktop
@monfresh surge gzips automatically for you. There is no need to activate anything. Here is my site hosted on surge...
http://checkgzipcompression.com/?url=http%3A%2F%2Fsintaxi.com
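If it's useful, this can also be checked from the command line; here's a rough sketch using curl (the URL in the comment is just an example): request gzip explicitly and look for the `Content-Encoding` response header.

```shell
# Print the Content-Encoding header (if any) for a URL.
# Servers that gzip on the fly should answer "content-encoding: gzip"
# when the client advertises support for it.
check_gzip() {
  curl -sI -H 'Accept-Encoding: gzip' "$1" | grep -i '^content-encoding'
}

# Example (any gzip-enabled site):
# check_gzip https://sintaxi.com/
```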
I knew there was a better link for checking compression. Thanks, @sintaxi!
@monfresh Please update your comment, or close this issue.
That's good to know, thanks! The issue I ran into was that I tried to push content that I had already compressed, and when I visited my surge site, it was not rendering properly, as if it didn't know that the content-encoding was gzip. I'll try it again and send you the URL so you can see.
Here is the site uploaded without compression, and it works fine: http://abaft-carriage.surge.sh/
Here is the site uploaded with compression, and it doesn't render properly: http://dead-pig.surge.sh/
Now that I know that Surge compresses files, I won't compress them locally, but I'm still curious to know why uploading already-compressed files doesn't work, and whether that's a bug.
Interesting. So it appears to be serving the compressed file... compressed :) I'm really not sure what the right thing to do here is. I don't think we know that the file is compressed ahead of time by inspecting it on disk, nor would we want to, as it would be slow. I also suspect that if the file were named index.html.gzip it would download properly (but not render) as it would bypass the text/plain MIME type.
So perhaps the answer here is documentation.
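To make the failure mode concrete, here's a small local sketch of what double compression does (file names are made up for illustration):

```shell
# Simulate uploading an already-gzipped file to a host that gzips again.
printf '<h1>hello</h1>' > index.html
gzip -c index.html > index.html.gz        # what the user uploaded
gzip -c index.html.gz > served.gz         # what a re-compressing server sends

# The browser decodes Content-Encoding: gzip once, which leaves gzip
# bytes rather than HTML: the first two bytes are still the gzip magic
# number 1f 8b, so the page doesn't render.
gunzip -c served.gz | od -An -tx1 -N2
```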
Better documentation sounds great to me. Should I leave this issue open until the documentation is added?
Ya, let's defer this to @kennethormandy and @sintaxi for final comments and to close it.
It looks like gzip is not enabled when I try the tool @sintaxi posted above. See: http://checkgzipcompression.com/?url=http%3A%2F%2Ftforssander2222.surge.sh%2F Am I missing something here? :)
@tforssander If you check your stylesheet, it’s gzipped: http://checkgzipcompression.com/?url=http%3A%2F%2Ftforssander2222.surge.sh%2Fstyle.css
My understanding is that some files are just too small already to justify gzipping; there is a certain number of bytes under which we don’t gzip things. If you publish something with just a `<h1>Hello, world</h1>`, it won’t be gzipped, but if you see a .html page with more content, it will be: http://checkgzipcompression.com/?url=http%3A%2F%2Fkennethormandy.com
Let me know if you have any other questions about it!
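As a small local illustration of why tiny files aren't worth gzipping (the specific threshold is Surge's, but the overhead is generic gzip behavior):

```shell
# gzip adds roughly 18 bytes of header/trailer overhead, so very small
# files can actually grow when compressed.
printf '<h1>Hello, world</h1>' > tiny.html
wc -c < tiny.html                 # size uncompressed
gzip -c tiny.html | wc -c         # size gzipped: larger than the original
```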
@kennethormandy Ahhh... Thanks for the clarification!
We have a page about this now, so I’m going to close the issue. Thanks for the feedback, everyone!
Some modules to help with this, potentially: modules that gzip `.whatever` files before uploading.

Hi, I'm working on a WebGL game with Unity3D. The uncompressed files for my game are around 50 MB, while the gzipped files are 10 MB. Since I have roughly 30 KB/s upload speed, I'd rather compress the files locally, then upload them as "game.gz.js" and "level.gz.data" (rather than "game.js" and "level.data"). There are two problems with this: first, Surge will compress the .js file a second time, making it unreadable by the browser. Second, Surge won't send a "content-encoding: gzip" header for the "level.gz.data" file, since ".data" isn't one of the file extensions listed here: https://surge.sh/help/using-gzip-automatically
While I have no idea how a CDN like Surge is built, from my outside (and uninformed) perspective it seems like it would be possible to:

1. Not compress files with "gz" or "gzip" between the file name and file extension (e.g. "game.gz.js" or "game.gzip.js").
2. Always send a "content-encoding: gzip" header for the aforementioned files.
Thanks. :) (And sorry if I'm misunderstanding how something works..)
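The local half of that workflow might look like the following sketch (the `*.gz.*` naming is the convention proposed above, not something Surge supports; the file contents here are stand-ins for the real build output):

```shell
# Stand-ins for the real Unity build artifacts:
printf 'console.log("game");' > game.js
printf 'binary level data'    > level.data

# Compress locally, naming files per the proposed convention so a host
# could recognize them as pre-gzipped and skip re-compressing them.
gzip -9 -c game.js    > game.gz.js
gzip -9 -c level.data > level.gz.data
```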
@Nolanes Thanks for the suggestion! While it’s not something we have planned in the near future, it seems like something we could accommodate. Would you mind opening up a new issue (the exact same comment is fine) just so it doesn’t get lost?
Thanks, @kennethormandy, I opened a new issue. I wasn't sure whether I should comment or create a new issue, but keeping track of it that way makes sense.
I just tried deploying my Octopress (Jekyll-based) static site, and it worked perfectly. This is a great product so far. One thing that would be great to add is support for compressed files. Currently, my site is hosted on S3, and I upload gzipped HTML and CSS, then set the `Content-Encoding` on S3 to `gzip`. Do you have any plans to support gzip content?
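For reference, the S3 workflow described above can be sketched with the AWS CLI (the bucket name is hypothetical, and the upload line is commented out since it needs real credentials):

```shell
# Pre-compress the page locally, then upload it with Content-Encoding
# metadata so S3 serves the bytes as gzip.
printf '<h1>hello</h1>' > index.html
gzip -9 -c index.html > index.html.gz

# Hypothetical bucket; uncomment to actually upload:
# aws s3 cp index.html.gz s3://my-bucket/index.html \
#   --content-encoding gzip --content-type text/html
```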