keybase / keybase-issues

A single repo for managing publicly recognized issues with the keybase client, installer, and website.

Website proof fails with Content-Encoding: gzip #1412

Open emk opened 9 years ago

emk commented 9 years ago

My website is served out of an S3 bucket with compression enabled. When I try to validate it, I get:

✖ admin of www.randomhacks.net via HTTP:  (failed with code 201)

The HTTP request and response headers appear as follows:

> GET /keybase.txt HTTP/1.1
> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
> Host: www.randomhacks.net
> Accept: */*
> 
< HTTP/1.1 200 OK
< x-amz-id-2: pdsy0zSPutcgmWzV/6ExWi5qOX8Gc4TDN77cBNvbAbjNDjkNSfiALvKrbQfNhezrIIjA63UyDSc=
< x-amz-request-id: B8B896385B9BD1CA
< Date: Wed, 25 Feb 2015 20:51:11 GMT
< Content-Encoding: gzip
< Cache-Control: max-age=300
< Last-Modified: Wed, 25 Feb 2015 20:16:59 GMT
< ETag: "04b3c69204443e22f578fa5db80f0b91"
< Content-Type: text/plain; charset=utf-8
< Content-Length: 1999
< Server: AmazonS3
< 

This may occur because this particular S3 setup always sends compressed content, whether or not the client actually asked for it. To make this work, there's a cURL option that may help:

curl --compressed
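For anyone hitting this: `--compressed` makes curl send an `Accept-Encoding` header and transparently gunzip the response body. The decoding step is the same as piping the raw gzipped body through `gzip -dc`, which a local round-trip illustrates (the payload string below is just a stand-in for the real keybase.txt):

```shell
# curl --compressed https://www.randomhacks.net/keybase.txt
# is roughly equivalent to fetching the raw gzipped body and then decoding it:
printf 'keybase proof payload' | gzip | gzip -dc
# prints: keybase proof payload
```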

Unfortunately, disabling compression for a single *.txt file is hard with my S3 uploading setup.

Thank you for a very cool project!

jdriscoll commented 9 years ago

I ran into this as well. I'm not sure how similar our setups are, but I also serve my blog from S3. My workaround was to put the keybase.txt file (and any other files I don't want gzipped) in a separate directory, and then upload that directory after I sync, using s3cmd without --delete-removed and without the gzip header. The file gets deleted and then re-uploaded on each publish, but it's easier than replacing it manually.

https://gist.github.com/jdriscoll/f6f56fd7f0f43409476c
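A rough sketch of the approach described above (the bucket name and directory layout here are made up for illustration; the actual script is in the gist): sync the gzipped site while excluding the uncompressed files, then upload those separately with no `Content-Encoding` header so Keybase can read them.

```shell
# Sync the compressed site, skipping the directory of files that must
# stay uncompressed (paths and bucket are hypothetical):
s3cmd sync --delete-removed --exclude 'nogzip/*' \
    --add-header='Content-Encoding: gzip' \
    public/ s3://www.example.com/

# Upload keybase.txt on its own, plain text, no Content-Encoding header:
s3cmd put --mime-type='text/plain' \
    public/nogzip/keybase.txt s3://www.example.com/keybase.txt
```

The second `put` overwrites the file on every publish, which matches the "deleted and then re-uploaded each time" behavior described above.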