It seems to work for me; is this still claiming a content encoding error for you?
Original comment by sligocki@google.com
on 14 Mar 2013 at 6:27
Hi sligocki,
Thanks for your response. Yes, I still get the content encoding error; you can check my sandbox here: http://mps.lintas.me.
You can check my new sandbox URL here:
http://lmecdn.antituhan.com/style/newlintasme_style/A.public.prod.gz.css,,qv==1011+basic-jquery-slider.css+home.css,Mcc.JcmuxRxM4i.css.pagespeed.cf.AgPUZT9SVx.css
Original comment by dewangg...@xtremenitro.org
on 14 Mar 2013 at 11:52
Hello there,
I just want to give an update: the error occurs if I enable these filters:
ModPagespeedEnableFilters combine_css,rewrite_css
ModPagespeedEnableFilters inline_css
ModPagespeedEnableFilters outline_css
Now you can check mps.lintas.me and compare it with www.lintas.me; www.lintas.me doesn't use the filters above.
Original comment by dewangg...@xtremenitro.org
on 15 Mar 2013 at 3:41
Here is my sandbox environment; I get broken pages ...
Original comment by dewangg...@xtremenitro.org
on 15 Mar 2013 at 9:54
I can confirm the following:
1. Fetching the reported URL in wget works fine.
2. Fetching the reported URL in Chrome results in a 330 error.
3. Fetching the reported URL in Firefox works fine.
4. Fetching each of the 3 components of the URL in Chrome works fine.
Ergo it would appear that our combined CSS is faulty in some way that Chrome
doesn't like.
Original comment by matterb...@google.com
on 15 Mar 2013 at 1:53
Hmmm... I'm using Firefox, and I tried to access the URL directly and it was normal, but if I access it from my sandbox at http://mps.lintas.me, my pages are broken, and when I try to access the URL from the page source, the error occurs.
Any hints, guys?
Original comment by dewangg...@xtremenitro.org
on 15 Mar 2013 at 4:51
If you haven't already I suggest disabling combine_css for now until we work
out what's going on. From my tests I expect that to fix it.
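Something like this in your config should do it (a sketch only; ModPagespeedDisableFilters is the standard directive, and the rest of your config can stay as posted):
<pre>
# Turn the suspect filter off for now; the other filters stay enabled.
ModPagespeedDisableFilters combine_css
</pre>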
Original comment by matterb...@google.com
on 15 Mar 2013 at 5:14
Hi,
I've disabled combine_css and still get the error; the new rewritten URL looks like this:
http://lmecdn.antituhan.com/style/newlintasme_style/A.public.prod.gz.css,qv=1011.pagespeed.cf.77RTnFMsuL.css
I think the error occurs when I enable inline_css + outline_css; let me try enabling and/or disabling those filters one by one.
Original comment by dewangg...@xtremenitro.org
on 15 Mar 2013 at 5:25
Here are the new results from my sandbox, testing a few rules related to CSS rewriting.
The suspect is the CSS filter, because if I don't enable the CSS rewrite filters, the pages are fine and normal. I've disabled the CSS rewriting filters on my live webpages at www.lintas.me; mps.lintas.me is only for sandbox and testing.
Filter: rewrite_css
<pre>
Orig URL: http://i.brta.in/style/newlintasme_style/public.prod.gz.css?v=1011
Result: FAIL
URL:
http://lmecdn.antituhan.com/style/newlintasme_style/A.public.prod.gz.css,qv=1011.pagespeed.cf.KdY8IQ67pm.css
</pre>
Filter: rewrite_css, combine_css
<pre>
Orig URL:
1. http://i.brta.in/style/newlintasme_style/public.prod.gz.css?v=1011
2. http://i.brta.in/style/newlintasme_style/basic-jquery-slider.css
3. http://i.brta.in/style/newlintasme_style/home.css
Result: FAIL
URL:
http://lmecdn.antituhan.com/style/newlintasme_style/A.public.prod.gz.css,,qv==1011+basic-jquery-slider.css+home.css,Mcc.JcmuxRxM4i.css.pagespeed.cf.AgPUZT9SVx.css
</pre>
Filter: rewrite_css, combine_css, inline_css
The requested URL and the resulting URL are the same as with the rewrite_css + combine_css filters alone.
Filter: rewrite_css, combine_css, inline_css, outline_css
The requested URL and the resulting URL are the same as with the rewrite_css + combine_css filters alone.
Now mps.lintas.me is using the rewrite_css, combine_css, and inline_css filters. Thank you for your support, guys.
Original comment by dewangg...@xtremenitro.org
on 15 Mar 2013 at 5:44
For comparison, rewrite_css, combine_css, and inline_css are normal on my new version. You can switch the page version by appending /switch to the URL. By default, the pages show the old version; to change to the new site, click http://mps.lintas.me/switch.
On the new version, the filters are normal and OK. I think pagespeed can't parse something in my CSS.
On the new version, the rewritten CSS URL looks like this:
http://lmecdn.antituhan.com/style/newlintasme_style/A.global.css+facebox.css,Mcc.YH8VaviiPo.css.pagespeed.cf.Mz79xXIKCa.css
Original comment by dewangg...@xtremenitro.org
on 15 Mar 2013 at 5:50
This is very strange; this seems like some sort of Transfer-Encoding: chunked issue. I've attached a raw netcat pull of your site using "Accept-Encoding: gzip".
@Matt, I think the reason that this works in wget and not in Chrome or Firefox is that it only comes up for "Transfer-Encoding: chunked" or maybe "Content-Encoding: gzip", which the servers don't send unless you explicitly "Accept-Encoding: gzip".
When I send the attached file directly to the browser, it fails with the same 330 error. But if I manually de-chunk it (or fetch with curl, which de-chunks) and send the result to the browser, that seems to work fine. I'm very confused about what's going on here.
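Roughly, the two fetches look like this (sketch commands against the sandbox from this thread, not the exact ones I used):
<pre>
# Raw pull: netcat keeps the chunked framing exactly as the server sent it.
printf 'GET / HTTP/1.1\r\nHost: mps.lintas.me\r\nAccept-Encoding: gzip\r\nConnection: close\r\n\r\n' \
  | nc mps.lintas.me 80 > raw-response.txt

# curl strips the Transfer-Encoding framing itself, so this is the de-chunked case
# (the body stays gzipped because we asked for Accept-Encoding: gzip):
curl -s -H 'Accept-Encoding: gzip' -o dechunked-body.gz http://mps.lintas.me/
</pre>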
Original comment by sligocki@google.com
on 15 Mar 2013 at 7:39
I noticed in Chrome that "Transfer-Encoding: chunked" appears twice. I haven't
seen that before. Maybe Chrome dislikes it? I'm surprised we are putting that
in.
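A quick way to count the Transfer-Encoding headers coming back (a sketch; curl's -v prints the response headers to stderr):
<pre>
# Should print 1 for a normal response, 2 if the header really is duplicated:
curl -sv -o /dev/null -H 'Accept-Encoding: gzip' http://mps.lintas.me/ 2>&1 | grep -ci 'transfer-encoding'
</pre>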
Original comment by jmara...@google.com
on 15 Mar 2013 at 10:18
Haha, that's it, this document has been chunked twice! So when it's been
de-chunked the first time, browsers fail to ungzip it because it needs to be
de-chunked again.
dewanggaba, do you know of any reason this might be getting "Transfer-Encoding:
chunked" applied to it twice? What other modules are you running on Apache?
Original comment by sligocki@google.com
on 15 Mar 2013 at 11:13
Slig: Standard modules on Apache, and I removed some unneeded modules; here's my module list: http://fpaste.org/hwpO/. Anyway, about the transfer encoding, I don't know why the result gets chunked; it only happens on my old view.
I thought that pagespeed can't parse it because of the .gz file type, detected as "application/x-gzip gz tgz" by the MIME types. CMIIW. So, what should I do, slig, J, and Matt? :D
Original comment by dewangg...@xtremenitro.org
on 15 Mar 2013 at 11:53
Hello there,
I'll try to track down the problem now by disallowing the 3 files that cause the error:
ModPagespeedDisallow */public.prod.gz.css*
ModPagespeedDisallow */basic-jquery-slider.css
ModPagespeedDisallow */home.css
Hang on. I'll update this issue after allowing and/or disallowing the files above one by one. I think one of the files above causes the error (the content encoding error).
Original comment by dewangg...@xtremenitro.org
on 21 Mar 2013 at 12:37
Here are the results from enabling and/or disabling the suspects one by one.
Enable: public.prod.gz.css
Disable: basic-jquery-slider.css,home.css
Result: FAIL, pages were broken
Enable: basic-jquery-slider.css,home.css
Disable: public.prod.gz.css
Result: OK
Now I've disabled only public.prod.gz.css and enabled both basic-jquery-slider.css and home.css, and the pages are OK right now. What's going on with public.prod.gz.css?
You can download the CSS file here:
http://i.brta.in/style/newlintasme_style/public.prod.gz.css?v=1011
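For reference, the Disallow line that matches this working state is just the first of the three I posted earlier:
<pre>
# public.prod.gz.css is excluded from rewriting; the other two CSS files stay enabled.
ModPagespeedDisallow */public.prod.gz.css*
</pre>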
Original comment by dewangg...@xtremenitro.org
on 21 Mar 2013 at 12:53
Not sure if this is relevant but ...
If I fetch those 3 files with 'wget --header="Accept-Encoding: gzip"' I get:
* home.css: 822 bytes of plain text
* basic-jquery-slider.css: 733 bytes of gzip'd content
* public.prod.gz.css: 32,197 bytes of gzip'd content.
If I fetch public.prod.gz.css without the AE header I get 182,851 bytes of
plain text.
But I still don't know what's going on here sorry :(
Original comment by matterb...@google.com
on 21 Mar 2013 at 12:50
Yes, if you directly access the un-rewritten files, you'll get nothing, because the direct files from the i.brta.in domain are served by nginx. I don't know why only public.prod.gz.css could not be optimized; the other files are normal.
So, I worked around it by using the Disallow directive.
Original comment by dewangg...@xtremenitro.org
on 21 Mar 2013 at 3:54
I have got some serious problems too. I am trying to use mod-pagespeed to optimize JPEG images only, on a server which delivers static assets. So I disabled the core ruleset and enabled only the JPEG filters. All JS/CSS/SWF/PDF and maybe other files are no longer available due to a content/transfer encoding problem. I am using the latest beta version of mod-pagespeed. The reason for using this setup is to make use of the new in-place optimization for images, so we do not need to do this manually before upload.
As a workaround I could use the ModPagespeedDisallow directive, but I would have to exclude all file extensions other than .jp(e)g. It really seems to be a serious problem with double encoding. For pictures we use chunked encoding instead of gzip, as an optimized JPEG should not get smaller by zipping it.
I am also using CentOS 6.4 x64 with standard Apache 2.2, like the OP.
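Roughly, the setup looks like this (the directive names are the standard mod_pagespeed ones; the exact filter list below is illustrative, not my literal config):
<pre>
ModPagespeed on
ModPagespeedRewriteLevel PassThrough               # core ruleset disabled
ModPagespeedEnableFilters rewrite_images,recompress_jpeg
ModPagespeedInPlaceResourceOptimization on         # optimize static assets in place
</pre>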
Original comment by jor...@gmail.com
on 11 Apr 2013 at 7:46
joramk, can you provide a link to your site? I want to see if you are having
the same problem as dewanggaba or if it's a different issue.
Original comment by sligocki@google.com
on 11 Apr 2013 at 2:50
I just realized that this issue seems to be the same as ngx_pagespeed issue 482. Does the gzip compression on Apache break things?
I don't have any issues anymore after changing some of the design on my server(s): putting ngx_pagespeed in front of mod_pagespeed, using 2 pagespeed daemons on 1 server, and pointing them at the same cache_path.
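Roughly, the layout is like this (paths and ports below are placeholders, not my real values):
<pre>
# nginx side, with ngx_pagespeed in front:
pagespeed on;
pagespeed FileCachePath /var/cache/pagespeed;     # same path the Apache side points at

location / {
    proxy_pass http://127.0.0.1:8080;             # Apache + mod_pagespeed behind it
}

# Apache side (httpd.conf):
#   ModPagespeedFileCachePath "/var/cache/pagespeed"
</pre>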
Original comment by dewangg...@xtremenitro.org
on 10 Aug 2013 at 12:48
Sorry, I forgot the ngx_pagespeed issue link:
https://github.com/pagespeed/ngx_pagespeed/issues/482
Original comment by dewangg...@xtremenitro.org
on 10 Aug 2013 at 12:49
Hello,
We also have this "double-encoding" problem as soon as we activate In-Place Resource Optimization (ModPagespeedInPlaceResourceOptimization).
We are currently using 1.4.26.3-stable, but we also tried 1.6.29.4-beta with no luck.
Just after activation, the first request to any optimized .js or .css resource comes back double encoded. After that, it happens randomly.
We would really like In-Place Optimization to work, as we use a CDN to deliver resources gathered from a single mod_pagespeed-optimized server to several other domains.
Original comment by i...@rimontgo.es
on 12 Aug 2013 at 10:52
It's worth trying MPS 1.7, which has a fix that might resolve this issue. I think we were not able to reproduce this, so it'd be great if you could confirm it.
The fix is here:
https://code.google.com/p/modpagespeed/source/detail?r=3480
Original comment by jmara...@google.com
on 12 Nov 2013 at 5:22
Yes, the FetchWithGzip directive solves the problem. BUT! The CSS still has an encoding error; here is the PoC:
http://unik-aneh.lintas.me/assets/foundation/css/app.css+offcanvas.css.pagespeed.cc.hsDY3_UCma.css
combined from these CSS files:
http://unik-aneh.lintas.me/assets/foundation/css/app.css
http://unik-aneh.lintas.me/assets/foundation/css/offcanvas.css
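For anyone searching later, the directive referred to above, in Apache form (the nginx equivalent appears further down in this thread):
<pre>
ModPagespeedFetchWithGzip on
# nginx equivalent: pagespeed FetchWithGzip on;
</pre>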
Original comment by dewangg...@xtremenitro.org
on 12 Nov 2013 at 5:26
Edited:
The CSS still has the encoding error, but the pages are fine.
Original comment by dewangg...@xtremenitro.org
on 12 Nov 2013 at 5:27
Can you try flushing your cache after setting FetchWithGzip?
It looks like maybe we have captured some gzipped CSS content in the cache. I
have not seen the ".gz" syntax in CSS files before, but I guess that makes
sense in a way; if you put the precompressed files on disk then you don't have
to compress them when serving to gzip-accepting clients. The only thing I'm
unclear on is whether ngx_pagespeed/mod_pagespeed is seeing the
content-encoding:gzip header when it runs.
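Flushing the cache amounts to touching the cache.flush file under the configured file cache path; the path below is the packaged default, so adjust it to your ModPagespeedFileCachePath:
<pre>
touch /var/cache/mod_pagespeed/cache.flush
</pre>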
Original comment by jmara...@google.com
on 12 Nov 2013 at 9:17
Hi,
You can close this issue. I've updated to 1.7.30.3 and made these changes from ngx_pagespeed issue https://github.com/pagespeed/ngx_pagespeed/issues/614:
Nginx:
Removed: pagespeed CustomFetchHeader Accept-Encoding gzip;
Enabled: pagespeed FetchWithGzip on;
I use a reverse proxy and serve the static files from nginx; I don't know whether that affects the Apache versions :)
Original comment by dewangg...@xtremenitro.org
on 8 Feb 2014 at 4:05
Original comment by jmara...@google.com
on 18 Sep 2014 at 5:42
Original issue reported on code.google.com by
dewangg...@xtremenitro.org
on 14 Mar 2013 at 5:32