Quicksaver opened this issue 8 years ago
> do you have any idea to better avoid those black canvases?
black indeed means there has not been any painting yet. under e10s this can happen too with apz + tiles enabled. you can get black tiles when scrolling really really fast.
but i have no idea what causes it in the context of thumbnail painting. are you sure we got a paint event from the currently loaded page (not the previous one) that triggers the thumbnail paint?
> Or perhaps to better restrict the calls to toDataURL() at least
instead of transforming to a url, doing canvas2dcontext.getImageData and checking whether the array contained in the ImageData is all-zero might be faster. you wouldn't have to cache any black canvas things either.
> are you sure we got a paint event from the currently loaded page (not the previous one) that triggers the thumbnail paint?
The tabs are already loaded when the thumbnail is fetched, so it's definitely from the current page; the page was unloaded before in my tests so there was no previous page to trigger it anyway. And even if there was, I can get black thumbs long after the page is loaded (as in close to a minute) because of the queued thumbnails system, so I don't think it's even possible it's trying to fetch the thumb from a previous page.
> checking whether the array contained in the ImageData is all-zero might be faster
Iterating through all the entries in the ImageData's data array seems to be a whole order of magnitude slower for me than checking dataURLs, especially for larger thumbs; it's not noticeably better for smaller thumbs either.
This does seem to be a problem with some kind of delayed/uninitialized component that the thumbnails service relies on. browser-ctrlTab.js also stores black thumbs after restoring a tab if TabView is visible. I'll try to create a simplified test case and file a bug report sometime.
> Iterating through all the entries in the ImageData's data array seems to be a whole order of magnitude slower for me than checking dataURLs, especially for larger thumbs; it's not noticeably better for smaller thumbs either.
you have to account for generating the data url + comparison. generating the data url will have to read all the pixels anyway, turn it into a png (compression) and then copy it to a string. just reading the pixel data without the compression step should be faster, at least in theory.
It may indeed be, but then it has to do the comparison in JavaScript, and that's probably the slower part. It's definitely faster to compare a 50-100 character string once than to compare 7000 array indexes one at a time.
BTW, 7400 indexes in the data array were for the smallest thumbnails I had in my profile. Larger thumbnails were a heck of a lot larger, since there are 4 entries of one byte each per pixel (red, green, blue, alpha).
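For concreteness, the arithmetic works out roughly like this (a sketch; the 43x43 size is just an assumption for a roughly square thumbnail):

```js
// The ImageData array holds width * height * 4 entries,
// one byte per channel (red, green, blue, alpha).
let entries = 43 * 43 * 4;  // 7396, about the ~7400 seen for the smallest thumbs
```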
> but then it has to do the comparison in JavaScript
which the JIT compiler should optimize very well, considering imagedata returns typed arrays. reading N bytes can't be any slower than reading and compressing N bytes and then comparing M bytes.
Of course JIT optimizations won't show in a one-off test.
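a rough way to let the JIT kick in before timing would be something like this (just a sketch; the helper and parameter names are made up, and the counts are arbitrary):

```js
// Warm the function up first, then measure: a one-off timing mostly measures
// interpreter/compilation overhead rather than the optimized loop.
function timeWarm(fn, canvas, warmup = 100, runs = 100) {
  for (let i = 0; i < warmup; i++) { fn(canvas); }  // warm-up iterations, untimed
  let start = performance.now();
  for (let i = 0; i < runs; i++) { fn(canvas); }
  return (performance.now() - start) / runs;        // mean ms per call after warm-up
}
```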
I don't see how any JIT optimizations could possibly help there. Even though they're typed and clamped arrays, they're still different arrays for every single canvas painted, so their entries will still need to be checked individually. What am I missing from that logic?
I tested something like:
```js
isCanvasBlack: function(canvas) {
  let ctx = canvas.getContext('2d');
  let data = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
  for (let i = 0; i < data.length; i += 4) {
    // i = red, i+1 = green, i+2 = blue, i+3 = alpha
    if (data[i] > 0 || data[i+1] > 0 || data[i+2] > 0 || data[i+3] < 255) { return false; }
  }
  return true;
}
```
The lowest time for any canvas processed through that method was higher than the highest time I've seen so far for comparing data URLs. Those are the only two methods I know of to check whether a canvas is black. Do you know of a more effective way of using/checking ImageData arrays?
Neither toDataURL() nor getImageData() runs in JS; they're both native (C++) functions. I can't read their implementations well enough to fully understand them, but the cost of calling either is negligible compared to processing their results in JS. The bottleneck lies fully in the JS checks, and between comparing a text string once and cycling through thousands of entries in an array, the string wins by far.
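One possibly faster variant of the ImageData check would be to compare whole pixels as 32-bit words instead of four bytes each; a minimal sketch, assuming a little-endian platform (where an opaque black RGBA pixel reads back as the word 0xFF000000):

```js
// View the RGBA byte buffer as one 32-bit word per pixel, so checking a pixel
// is a single comparison instead of four.
function isCanvasBlack32(canvas) {
  let ctx = canvas.getContext('2d');
  let data = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
  let pixels = new Uint32Array(data.buffer); // little-endian: 0xFF000000 == opaque black
  for (let i = 0; i < pixels.length; i++) {
    if (pixels[i] !== 0xFF000000) { return false; }
  }
  return true;
}
```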
This is another weird one. Apparently the very first canvas captured from a page can be fully black if:
In 815cae6aecc85984ae1327c9660e5191e76d0918 I implemented a cache of black canvas.toDataURL() strings keyed by canvas size, and compare possible tab thumbnails against those.
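The gist of the approach, as a minimal sketch (hypothetical names, not the actual commit code; it assumes the candidate canvas lives in a DOM document we can create a blank canvas from):

```js
// Cache the dataURL of an opaque-black canvas per size; a candidate thumbnail
// is considered black if its dataURL matches the cached one for its dimensions.
let _blackURLs = new Map();

function blackDataURL(doc, width, height) {
  let key = width + 'x' + height;
  if (!_blackURLs.has(key)) {
    let blank = doc.createElement('canvas');
    blank.width = width;
    blank.height = height;
    let ctx = blank.getContext('2d');
    ctx.fillStyle = 'black'; // a fresh canvas is transparent, so paint it opaque black
    ctx.fillRect(0, 0, width, height);
    _blackURLs.set(key, blank.toDataURL());
  }
  return _blackURLs.get(key);
}

function isThumbBlack(canvas) {
  let url = blackDataURL(canvas.ownerDocument, canvas.width, canvas.height);
  return canvas.toDataURL() === url;
}
```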
It's fairly easy to reproduce for me. Without that commit, just open the groups view, right-click any unloaded tab (I used Wikipedia pages) and choose to reload it. As soon as it stops loading, the thumb is black for a moment before it shows the actual thumbnail.
An even worse case: open a few dozen Wikipedia pages in a group and restart the browser so they're all unloaded. Then right-click one of those wiki pages and choose "Reload all tabs". All the thumbs turn black (in loading order!), and only then do they show the actual thumbnails (still in order!); that is, they all go black one at a time, and only afterwards do they fill in, again one at a time. I read this as meaning it isn't related to the timing of the thumbnail request...
This may be a bug in Firefox's PageThumbUtils: from my testing it seems like snapshotCanvas is already black when it's copied to ours, although I'm far from sure enough to open a bug about it (and I don't think the thumbnail service is something they'd currently want to invest a lot of time in anyway).
@the8472, as someone with knowledge in this area, do you have any idea how to better avoid those black canvases? Or perhaps how to better restrict the calls to toDataURL() at least; they're not too expensive, but the difference isn't negligible either (varying anywhere between 2-50ms in my testing). Right now that check is only called the very first time after a tab is unloaded, and only if it isn't a remote browser, so not at all in e10s. I'm fairly tempted to just keep that commit as-is and wait for e10s to arrive.