Closed: DromedaryKing closed this issue 6 days ago
This seems to be a Firefox regression (it does not happen in Firefox ESR 115, but it does in 128 and above).
Page rendering stops as soon as a synchronous XMLHttpRequest is sent, a hack NoScript needs in order to be configured and ready before any page script is able to run.
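For context, the blocking trick can be sketched roughly like this. This is a hypothetical illustration, not NoScript's actual code: the function name, the injected XHR class, and the URL are invented so the logic can be exercised outside a browser; in the real extension this would run at `document_start` with the real `XMLHttpRequest` and a URL that the extension itself intercepts and answers.

```javascript
// Hypothetical sketch of a blocking settings fetch in a content script.
// Passing false as the third argument to open() makes the XHR synchronous,
// so the main thread is blocked until the extension has answered with its
// configuration; no page script can run in the meantime.
function fetchSettingsSync(XHRClass, settingsUrl) {
  const request = new XHRClass();
  request.open("GET", settingsUrl, false); // false = synchronous request
  request.send(null);
  // By the time we get here, the settings are available and no page
  // script has had a chance to execute yet.
  return JSON.parse(request.responseText);
}
```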
Investigating, thanks.
Synchronous XHR, I presume?
I recall a relatively recent Firefox report, originally attributed to NoScript and another extension, at https://bugzilla.mozilla.org/show_bug.cgi?id=1899786, where XML document rendering broke when OMT (off-main-thread) decompression was enabled by default. The test case here (standardebooks.org) is an XHTML document served with `Content-Type: application/xhtml+xml; charset=utf-8`. I haven't confirmed whether it is affected by the same issue, but I wouldn't be surprised if it is.
> Synchronous XHR, I presume?
Correct, fixed my comment.
Thank you, I can confirm it's the same issue: it indeed goes away as soon as you flip the `network.decompression_off_mainthread2` preference to false.
Since it's marked as 129 wontfix, I presume I will need some kind of work-around in NoScript, and certainly a patch for Tor Browser 14.x (based on ESR 128).
Added a work-around (another hack, of course!) in NoScript 11.4.34rc2. Please verify, thank you.
What do you expect to happen in Firefox by using https://github.com/hackademix/nscl/commit/397a6f9db8f6f7f95b6bf35efc3030577b4d623 ?
Do you get the same effect if you call `filter.disconnect()` from `filter.onstart` (instead of `filter.close()` from `filter.onstop`)? Forwarding the full response body when you don't actually need its contents seems rather wasteful.
> What do you expect to happen in Firefox by using hackademix/nscl@397a6f9?
To force the (gzip) decoding to finish before the actual parsing / rendering / event dispatching starts.
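The full-passthrough work-around can be sketched like this. This is an assumed shape, not the actual commit: the helper name is invented, and the filter is modeled as a plain object so the buffering logic can be tested outside Firefox.

```javascript
// Sketch: buffer every chunk and only replay it in onstop. Because the
// extension consumes the stream, Firefox must finish the (off-main-thread)
// decompression and deliver the decoded body to the filter before the
// XML parser sees a single byte.
function createPassthroughFilter(filter) {
  const chunks = [];
  filter.ondata = event => {
    chunks.push(event.data); // hold the data back for now
  };
  filter.onstop = () => {
    for (const chunk of chunks) filter.write(chunk); // forward the whole body
    filter.close(); // signal end of stream to the parser
  };
  return filter;
}
```

In a real extension, `filter` would be the object returned by `browser.webRequest.filterResponseData(details.requestId)` inside a blocking `webRequest` listener.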
> Do you get the same effect if you call `filter.disconnect()` from `filter.onstart` (instead of `filter.close()` from `filter.onstop`)? Forwarding the full response body when you don't actually need its contents seems rather wasteful.
If I do as you suggest, it seems nothing gets to the page, and I consistently get the dreaded yellow/red XML parsing error page.

What does seem to work is calling `filter.disconnect()` from `ondata()` after writing the first chunk of data, but I'd like to check with a very big XML document, forcing the StreamFilter to split the data into multiple chunks, before calling it a win.
@Rob--W wrote:

> Do you get the same effect if you call `filter.disconnect()` from `filter.onstart` (instead of `filter.close()` from `filter.onstop`)?

I wrote:

> If I do as you suggest, it seems nothing gets to the page, and I consistently get the dreaded yellow/red XML parsing error page.
I'm not sure what I was looking at this morning, but in the meantime I managed to test with big XML files (~10 MB), double-checking that they were bigger than the `ondata` buffer and passing through only the first ~16 KB chunk. Since that worked, on second thought I retried `filter.disconnect()` in `filter.onstart` and, lo and behold, it worked as well!

So I'm likely going with your suggestion in rc3, thank you (hackademix/nscl@56a81d1ad8400c5bdb726aa251588af2ad9eb95c).
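That variant can be sketched as follows (assumed shape with an invented helper name, not the rc3 code itself): the idea is that merely attaching a filter changes the decoding timing, and disconnecting at `onstart` hands the untouched stream straight back to the browser.

```javascript
// Sketch: disconnect as soon as the response starts, so no response data
// ever has to be copied through the extension.
function createDisconnectFilter(filter) {
  filter.onstart = () => {
    filter.disconnect(); // resume normal, browser-internal delivery
  };
  return filter;
}
```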
I wrote:
> So I'm likely going with your suggestion in rc3, thank you (hackademix/nscl@56a81d1).
Unfortunately I had to revert to the original "wasteful" version, because calling `filter.disconnect()` either in `onstart` or in `ondata` after writing the first chunk actually caused problems (broken rendering and an unresponsive stop UI on long documents), even if they were not consistently reproducible.

Therefore I'm releasing 11.4.34 with the full-passthrough work-around, which seems much more reliable, and working on a minimal test case to figure out what's actually wrong with `disconnect()`.
I have "disable restrictions globally" enabled, but whenever NoScript is active, standardebooks.org breaks; sometimes a page will load the first time I visit it, but refreshing or clicking any link to navigate further yields either a blank page or nothing aside from the site header.
I'm using the latest version of Firefox.