Closed: adamfeuer closed this issue 8 years ago.
It looks like Go does have a method for detecting encoding: https://godoc.org/golang.org/x/net/html/charset#DetermineEncoding
I have a few questions though (not necessarily for you to answer): can we just always put utf-8 in and expect it to work instead? Can you run that method on your file, and let me know if it does indeed return utf-8.
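To illustrate the idea behind `charset.DetermineEncoding` without pulling in `golang.org/x/net`, here is a minimal stdlib-only sketch. It is a hypothetical stand-in, not the real implementation: it only checks whether the first 1024 bytes are valid UTF-8 and otherwise falls back, whereas the real function also looks at BOMs, `<meta>` tags, and the Content-Type header.

```go
package main

import (
	"fmt"
	"unicode/utf8"
)

// sniffCharset is a hypothetical, simplified stand-in for the kind of check
// charset.DetermineEncoding performs: inspect at most the first 1024 bytes
// and decide whether "utf-8" is a safe label. It does NOT parse <meta> tags
// or byte-order marks.
func sniffCharset(content []byte) (name string, certain bool) {
	if len(content) > 1024 {
		content = content[:1024]
	}
	if utf8.Valid(content) {
		return "utf-8", true
	}
	// windows-1252 is the conventional web fallback; assumption, not stout's code.
	return "windows-1252", false
}

func main() {
	name, certain := sniffCharset([]byte("<html>héllo</html>"))
	fmt.Println(name, certain)
}
```

The real `DetermineEncoding(content []byte, contentType string)` returns `(encoding.Encoding, name string, certain bool)`, so the same two-result shape applies when wiring it into an uploader.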
@zackbloom Thanks for pointing out the method for determining encoding - I'll try to use that. Regarding your questions: always putting utf-8 was our fall-back if we couldn't get a proper encoding determination to work. S3 definitely needs the encoding set, otherwise the static website server Amazon uses doesn't send back the right encoding headers. So we think it will help; the other software we're using (gulp-s3) does this and we get the right encoding headers back. But gulp-s3 doesn't have a zero-downtime-deploy option... :-)
I'll give this a shot today or tomorrow and let you know - if it works I'll send a pull request.
Maybe one way to solve this would be to allow custom headers to be set when deploying files.
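A custom-headers option like the one suggested above could be sketched with Go's standard `flag` package, using a repeatable `-header "Name: Value"` argument. The flag name and type here are hypothetical; stout has no such option as of this thread.

```go
package main

import (
	"flag"
	"fmt"
	"strings"
)

// headerFlag collects repeated -header "Name: Value" arguments into a map.
// Hypothetical sketch; not part of stout's actual CLI.
type headerFlag map[string]string

func (h headerFlag) String() string { return fmt.Sprint(map[string]string(h)) }

func (h headerFlag) Set(v string) error {
	parts := strings.SplitN(v, ":", 2)
	if len(parts) != 2 {
		return fmt.Errorf("expected \"Name: Value\", got %q", v)
	}
	h[strings.TrimSpace(parts[0])] = strings.TrimSpace(parts[1])
	return nil
}

func main() {
	headers := headerFlag{}
	flag.Var(headers, "header",
		`extra header to set on uploaded files, e.g. -header "Content-Type: text/html; charset=utf-8"`)
	flag.Parse()
	fmt.Println(headers)
}
```

Each parsed header could then be passed through to the S3 PUT request when deploying.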
What's the issue with the original solution of figuring out the encoding and setting that?
Stout is already setting charset=utf-8 on S3. See here.
Hi,
Stout is great! We are deploying an AngularJS app to S3 with it, but we are having issues because the charset needs to be set to utf-8 for all our files:
Content-Type: text/html; charset=utf-8
The Go mime library doesn't guess that correctly. Would you be interested in a pull request that adds a command-line option to explicitly set the charset?
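One hedged sketch of the fix being discussed: look up the content type with the standard `mime` package, then append `charset=utf-8` to any `text/*` type that lacks one. The helper name is made up for illustration; this is not stout's actual code.

```go
package main

import (
	"fmt"
	"mime"
	"path/filepath"
	"strings"
)

// forceUTF8 appends "; charset=utf-8" to text/* content types that do not
// already carry a charset parameter. Hypothetical helper, not stout's code.
func forceUTF8(contentType string) string {
	if strings.HasPrefix(contentType, "text/") && !strings.Contains(contentType, "charset") {
		return contentType + "; charset=utf-8"
	}
	return contentType
}

func main() {
	// mime.TypeByExtension consults a platform-dependent table, so the raw
	// result may or may not include a charset; forceUTF8 normalizes it.
	ct := mime.TypeByExtension(filepath.Ext("index.html"))
	if ct == "" {
		ct = "application/octet-stream" // conservative fallback
	}
	fmt.Println(forceUTF8(ct))
}
```

Binary types such as `image/png` pass through unchanged, so only text responses get the extra header.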