agersant opened 4 years ago
I can't seem to reproduce this on Arch Linux, as the pixel at (32, 24) reads as (0, 0, 0). Tried both with an Intel GPU with the Mesa 20.1.6 driver and an NVidia GPU with the proprietary 450.66 driver.
Here's my test code.
```lua
function love.draw()
    love.graphics.setColor({0 / 255, 234 / 255, 255 / 255, 0.6});
    love.graphics.circle("fill", 40, 40, 10, 16);
    love.graphics.setColor({0 / 255, 234 / 255, 255 / 255});
    love.graphics.circle("line", 40, 40, 10, 16);
end

function love.keyreleased(k)
    if k == "space" then
        love.graphics.captureScreenshot(function(imageData)
            local r, g, b = imageData:getPixel(32, 24)
            print(r * 255, g * 255, b * 255)
        end)
    end
end
```
Press spacebar to print the pixel at (32, 24).
Are you sure the pixel coordinates are correct? Did you alter the graphics transformation stack beforehand?
Is it possible the Linux/GitHub Actions machine you're testing on is using a graphics driver that uses a different internal backbuffer color format than other platforms? (For example, 16-bit or 30-bit colors instead of 24-bit.) Does the issue still happen if you render to a canvas and use Canvas:newImageData instead of rendering to the backbuffer?
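For concreteness, a canvas-based variant of the test might look like the sketch below. This assumes the LÖVE runtime (it is not runnable standalone), and the `canvas` variable and dimensions are illustrative:

```lua
-- Sketch: render the shapes to a Canvas and read pixels back via
-- Canvas:newImageData instead of love.graphics.captureScreenshot.
local canvas

function love.load()
    canvas = love.graphics.newCanvas(80, 80)
end

function love.draw()
    love.graphics.setCanvas(canvas)
    love.graphics.clear()
    love.graphics.setColor(0 / 255, 234 / 255, 255 / 255, 0.6)
    love.graphics.circle("fill", 40, 40, 10, 16)
    love.graphics.setCanvas()
    love.graphics.draw(canvas)
end

function love.keyreleased(k)
    if k == "space" then
        -- Read back the canvas contents on the CPU side.
        local imageData = canvas:newImageData()
        local r, g, b = imageData:getPixel(32, 24)
        print(r * 255, g * 255, b * 255)
    end
end
```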
Thanks a lot for taking a look!
> Is it possible the Linux/GitHub Actions machine you're testing on is using a graphics driver that uses a different internal backbuffer color format than other platforms?
I thought you hit the nail on the head with this one. I looked into it, and the xvfb virtual framebuffer defaults to only 8 bits per pixel! I updated the test setup to use 24 bits per pixel, but it did not seem to make a difference :(
I also tried rendering onto a canvas (see below) and the issue still reproduces.
> Are you sure the pixel coordinates are correct? Did you alter the graphics transformation stack beforehand?
The coordinates in my original report were referencing the attached images, not the simplified test code. You are correct that the test image had some translate transforms in the stack. Sorry about not making that clearer.
I created a new repository with a clean setup for reproducing the bug. I expanded the test cases to cover drawing a filled circle, an outlined circle, and a square, with various opacities, rendering to either a canvas or the backbuffer.
Findings:
You can view the test results here. Unfortunately, the Windows machines on GitHub Actions don't support OpenGL 2, so the Windows tests have to be run locally. They all pass on my machine.
I tried to run your test repository under Arch Linux, and here are the results:
Intel HD Graphics 620, Mesa 2.0.16 mesa_2.0.16.log
NVidia GT 930 MX, proprietary 450.66 proprietary_450.66.log
So this looks like an Intel driver problem? Probably. I have to boot into Windows to make sure.
Yikes, the failures in your test runs with the Intel drivers are also not exactly the same as the ones on GitHub Actions, although they are very similar. I can't find any info on what hardware/drivers GitHub Actions uses.
For reference, my testing on Windows is being done with an Nvidia GPU (RTX 2070 Super) and recent official drivers (451.48). The 'expected' images in the test repository were generated on this setup.
I just booted into Windows 10 2004 with the same iGPU but with driver version 27.20.100.8587, and here's the output: 27.20.100.8587.txt
The NVidia driver in Windows is 452.06, slightly newer than on my Arch Linux install, but this doesn't seem to affect the output (all outputs are identical when using the NVidia GPU).
We probably need someone to test with an AMD GPU.
(accidentally clicked the close button, sorry about that)
I tested this again on a laptop with the integrated GPU of a Ryzen 7 4700U; the outputs are identical to Intel's (except for the square-line test, was that added recently?).
So it's probably NVidia that introduces the inconsistency.
You are right, the square-line test was added a few hours after the others (its reference image was generated on the same NVidia GPU and driver). Interesting that it's the only one where AMD and Intel don't match :(
Are the filled shapes mismatching, or is it the lines? (Or both?)
If it's the lines, does it only happen when rendering smooth lines, or is it reproducible with rough lines (love.graphics.setLineStyle("rough")) too? Narrowing this down will help inform if/how it can be addressed.
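For anyone reproducing this, a minimal way to compare the two line styles side by side might look like the sketch below (assumes the LÖVE runtime; the positions and color are illustrative):

```lua
function love.draw()
    love.graphics.setColor(0 / 255, 234 / 255, 255 / 255)

    -- Antialiased line geometry (the default style).
    love.graphics.setLineStyle("smooth")
    love.graphics.circle("line", 40, 40, 10, 16)

    -- Hard-edged line geometry, no alpha falloff at the edges.
    love.graphics.setLineStyle("rough")
    love.graphics.circle("line", 100, 40, 10, 16)
end
```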
I just updated the test suite to run with both rough and smooth line styles (it was always smooth before).
Results: https://github.com/agersant/love-1618/runs/4812199379?check_suite_focus=true
In summary: ✅ Filled shapes ✅ Rough lines ❌ Smooth lines
Hello!
While working on a hobby game project, I set up some integration tests which compare game frames with pre-existing screenshots (both obtained via love.graphics.captureScreenshot). The goal is to mark tests as failed when the result of the capture differs from the known correct screenshot. To implement this, I load the reference image using love.image.newImageData(path) and use :getPixel() to compare each pixel against its equivalent in the ImageData obtained via love.graphics.captureScreenshot.

This is working great on my Windows development machine. However, when running the tests on Linux (in GitHub Actions), some colors come out slightly different from the reference screenshots (taken on Windows). This results in false positives in the tests.
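One possible way to absorb small driver-dependent differences like these is a per-channel tolerance in the comparison loop. A sketch follows; the `imagesMatch` name and the default tolerance are illustrative, and the two arguments are assumed to be LÖVE ImageData objects:

```lua
-- Compare two ImageData objects pixel by pixel, allowing each
-- channel to differ by up to `tolerance` (expressed in 0..255 units;
-- getPixel returns channels as 0..1 floats in LÖVE 11.x).
local function imagesMatch(expected, actual, tolerance)
    tolerance = tolerance or 2
    if expected:getWidth() ~= actual:getWidth()
        or expected:getHeight() ~= actual:getHeight() then
        return false
    end
    for y = 0, expected:getHeight() - 1 do
        for x = 0, expected:getWidth() - 1 do
            local er, eg, eb, ea = expected:getPixel(x, y)
            local ar, ag, ab, aa = actual:getPixel(x, y)
            if math.abs(er - ar) * 255 > tolerance
                or math.abs(eg - ag) * 255 > tolerance
                or math.abs(eb - ab) * 255 > tolerance
                or math.abs(ea - aa) * 255 > tolerance then
                return false
            end
        end
    end
    return true
end
```

Note that the differences reported in this ticket (e.g. green 42 vs 49) would need a larger tolerance than the default shown here, which weakens the test's ability to catch real regressions; it is a trade-off rather than a fix.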
I attached two images to this ticket illustrating the problem on a simple test case. Notice how the pixel at (32, 24) is a slightly different shade of blue (<Red 0, Green 42, Blue 45> vs <Red 0, Green 49, Blue 53>).

Both images were saved using captureScreenshot and calling :encode():toString() on the resulting ImageData. Specifically, I did not use OS print-screen functionality or any image editing program which could have altered the colors.

The code I used to draw this blue circle boils down to:
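(The drawing code below mirrors the test snippet posted earlier in this thread; it assumes the LÖVE runtime.)

```lua
function love.draw()
    love.graphics.setColor({0 / 255, 234 / 255, 255 / 255, 0.6});
    love.graphics.circle("fill", 40, 40, 10, 16);
    love.graphics.setColor({0 / 255, 234 / 255, 255 / 255});
    love.graphics.circle("line", 40, 40, 10, 16);
end
```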
I understand this is a very minor issue and is probably difficult to investigate, so no worries if not much can be done about it.
In case you are curious about the test harness code, you can look at the compareFrame function here, although I did my best to mention all the important details in this ticket.