Open AlexSchuetz opened 9 years ago
:+1: any updates @AlexSchuetz
No updates. I'm waiting for a response, since I don't have the time to dig into this any deeper at the moment. Until then the workaround is simply not to use gzip on the server side in combination with Firefox.
I am running into the same issue. When gzip-encoded content is served by the proxied source (a Django web server, in my case), the encoding works. After making its way to my browser via grunt-connect-proxy, however, the content is corrupted.
Chrome throws an "ERR_CONTENT_DECODING_FAILED" error, and curl gives me "curl: (23) Error while processing content unencoding: invalid block type".
When I compare the working version of the content to the broken version, I notice that the broken version is longer.
When I look at the bytes in both files, I notice that the sequence 0xEF 0xBF 0xBD is very common in the broken file, and seems to replace many (but not all) of the high-bit bytes (i.e. 0x80-0xFF) in the working file. In UTF-8, this sequence encodes U+FFFD REPLACEMENT CHARACTER, the little question-mark icon you get when a UTF-8 file is corrupted.
My guess is that grunt-connect-proxy is reading the gzipped stream as if it were text (in ASCII or UTF-8, which results in the U+FFFD substitutions) and then encoding that back into UTF-8 (producing the 0xEF 0xBF 0xBD sequences).
Well, same here. It took me some time to trace the problem to the fact that connect-proxy seems to have a problem with gzipped content. Luckily it's just a Vagrant machine with an Ubuntu/Apache/PHP backend in my case, so a simple `a2dismod deflate` was a quick workaround for me. (However, that should be fixed here.)
Another workaround is to apply the fix mentioned in https://github.com/nodejitsu/node-http-proxy/issues/1007
```js
connect: {
  livereload: {
    options: {
      middleware: function (connect, options) {
        // Force binary-safe writes for gzipped responses so the proxy
        // does not mangle the compressed stream as UTF-8 text.
        function handleGzip(proxyRes, req, res) {
          var gzipped = /gzip/.test(proxyRes.headers["content-encoding"]);
          if (gzipped) {
            res.write = (function (override) {
              return function (chunk, encoding, callback) {
                override.call(res, chunk, "binary", callback);
              };
            })(res.write);
            res.end = (function (override) {
              return function (chunk, encoding, callback) {
                override.call(res, chunk, "binary", callback);
              };
            })(res.end);
          }
        }

        require("grunt-connect-proxy/lib/utils").proxies().forEach(function (proxy) {
          proxy.server.on("proxyRes", handleGzip);
        });

        // Set up the proxy
        var middlewares = [require("grunt-connect-proxy/lib/utils").proxyRequest];
        return middlewares;
      }
    }
  }
}
```
I have an Angular app which consumes a REST API served by Tomcat. The Tomcat servlet compresses responses with gzip if "Accept-Encoding" contains gzip.
If I use Chrome, IE or Opera, everything works fine. Only with Firefox do I get a "content encoding error", and only when I load data through the proxy, i.e. "http://localhost:9001/restapi/meta" -> content encoding error, "http://localhost:8080/restapi" -> valid JSON.
If I use the Poster Firefox add-on and request the compressed data without "content-encoding: gzip" (to get the raw data), I can see that the raw data differs when it goes through the proxy. Using Chrome's DHC, the data is the same.
My test server is using nginx without any issues.
Using "grunt-proxy" also works without this error (but it has some problems with the URLs and not much documentation).
Here is my grunt-contrib-connect configuration:
Best regards Alex