dati91 opened 6 years ago
The latest working servo with our gfx implementation: https://github.com/zakorgy/servo/tree/gfx_servo
The Vulkan and D3D12 builds are behind feature flags and can be built with `mach build -d --features=vulkan` or `mach build -d --features=dx12`.
On Windows it builds and runs with both Vulkan and D3D12.
Some notes on what we experienced on Windows:
Original WebRender (reference image):
WebRender with Vulkan (notice the missing background image at the GitHub title, and the missing border corners in the top right: Watch, Star and Fork buttons):
WebRender with D3D12 (same issues as above, and some buttons don't have color):
On Linux this Servo builds, but fails with a segfault (`Servo exited with return value -11`) related to `jemalloc-sys` when creating `String`s. We first ran into this when creating a `winit` window with a title; there is a gdb backtrace for it (no Servo backtrace was produced). If we skip the `with_title` call, we hit the same failure when parsing the Servo URL, which is also `String`-related. For this we have a Servo backtrace.
This issue doesn't occur with upstream Servo, so we assume something is wrong on our side.
Update on the missing image mentioned above: RenderDoc shows that we issue the draw call, the image is in the Color0 texture, and the x,y vertex coordinates are correct, but somehow it is not presented on the screen. Perhaps we should check the z vertex coordinates in case it is a rounding error: the image should be placed on top of the gradient, and the gap between the z coordinates of the gradient (z = 1.0000) and the image (z = 0.9999) is very small.
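If depth precision is the suspect, one quick sanity check is whether a z gap of 0.0001 survives quantization into an integer depth buffer. Here is a standalone sketch; the bit widths are assumptions for illustration, not necessarily what our swapchain configures:

```rust
// Quantize a normalized depth value into an integer depth buffer with the
// given bit width, roughly the way a GPU depth attachment stores it.
fn quantize_depth(z: f32, bits: u32) -> u64 {
    let max = ((1u64 << bits) - 1) as f64;
    (z as f64 * max).round() as u64
}

fn main() {
    let gradient_z = 1.0000_f32;
    let image_z = 0.9999_f32;
    for bits in [16u32, 24] {
        let a = quantize_depth(gradient_z, bits);
        let b = quantize_depth(image_z, bits);
        println!(
            "{}-bit depth: gradient={} image={} distinct={}",
            bits, a, b, a != b
        );
    }
}
```

If the two values stay distinct after quantization (they do at 16 and 24 bits), plain depth-buffer precision is unlikely to explain the missing image, and the depth compare function or coordinate transform would be the next suspects.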
I managed to run some wpt tests on linux with vulkan (actually 3 times, just to be sure):
./mach test-wpt --debug-build tests/wpt/mozilla/tests/css/
Running 604 tests in web-platform-tests
▶ PASS [expected FAIL] /_mozilla/css/border_radius_elliptical_a.html
▶ TIMEOUT [expected PASS] /_mozilla/css/iframe/hide_after_load.html
▶ PASS [expected FAIL] /_mozilla/css/font_fallback_02.html
▶ FAIL [expected PASS] /_mozilla/css/iframe/hide_and_show.html
└ → /_mozilla/css/iframe/hide_and_show.html 59b0f09656f8f368911966a3ec6c9eb3e620a50a
/_mozilla/css/iframe/hide_and_show_ref.html a36327217559a2b53aab5192e536c19446ee75b3
Testing 59b0f09656f8f368911966a3ec6c9eb3e620a50a == a36327217559a2b53aab5192e536c19446ee75b3
▶ PASS [expected FAIL] /_mozilla/css/font_fallback_01.html
▶ FAIL [expected PASS] /_mozilla/css/overflow_clipping.html
└ → /_mozilla/css/overflow_clipping.html 5530bf469bc6722ca3d302bf0b8b612ae5849800
/_mozilla/css/overflow_clipping_ref.html d7e45f35e33fe6f368243d70bcc70d6feb486aeb
Testing 5530bf469bc6722ca3d302bf0b8b612ae5849800 == d7e45f35e33fe6f368243d70bcc70d6feb486aeb
Ran 604 tests finished in 410.0 seconds.
• 598 ran as expected. 46 tests skipped.
• 2 tests failed unexpectedly
• 1 tests timed out unexpectedly
• 3 tests passed unexpectedly
./mach test-wpt --debug-build tests/wpt/mozilla/tests/css
Running 604 tests in web-platform-tests
▶ PASS [expected FAIL] /_mozilla/css/border_radius_elliptical_a.html
▶ PASS [expected FAIL] /_mozilla/css/font_fallback_02.html
▶ FAIL [expected PASS] /_mozilla/css/iframe/hide_and_show.html
└ → /_mozilla/css/iframe/hide_and_show.html 59b0f09656f8f368911966a3ec6c9eb3e620a50a
/_mozilla/css/iframe/hide_and_show_ref.html a36327217559a2b53aab5192e536c19446ee75b3
Testing 59b0f09656f8f368911966a3ec6c9eb3e620a50a == a36327217559a2b53aab5192e536c19446ee75b3
▶ FAIL [expected PASS] /_mozilla/css/iframe/overflow.html
└ → /_mozilla/css/iframe/overflow.html 59b0f09656f8f368911966a3ec6c9eb3e620a50a
/_mozilla/css/iframe/overflow_ref.html ba65002c4f0bd5eb47f41b0f48391fecf4da9723
Testing 59b0f09656f8f368911966a3ec6c9eb3e620a50a == ba65002c4f0bd5eb47f41b0f48391fecf4da9723
▶ PASS [expected FAIL] /_mozilla/css/font_fallback_01.html
▶ FAIL [expected PASS] /_mozilla/css/overflow_clipping.html
└ → /_mozilla/css/overflow_clipping.html 5530bf469bc6722ca3d302bf0b8b612ae5849800
/_mozilla/css/overflow_clipping_ref.html d7e45f35e33fe6f368243d70bcc70d6feb486aeb
Testing 5530bf469bc6722ca3d302bf0b8b612ae5849800 == d7e45f35e33fe6f368243d70bcc70d6feb486aeb
▶ PASS [expected TIMEOUT] /_mozilla/css/transition_calc_implicit.html
Ran 604 tests finished in 408.0 seconds.
• 597 ran as expected. 46 tests skipped.
• 3 tests failed unexpectedly
• 4 tests passed unexpectedly
./mach test-wpt --debug-build tests/wpt/mozilla/tests/css
Running 604 tests in web-platform-tests
▶ PASS [expected FAIL] /_mozilla/css/border_radius_elliptical_a.html
▶ TIMEOUT [expected PASS] /_mozilla/css/iframe/hide_after_load.html
▶ PASS [expected FAIL] /_mozilla/css/font_fallback_02.html
▶ PASS [expected FAIL] /_mozilla/css/font_fallback_01.html
▶ FAIL [expected PASS] /_mozilla/css/overflow_clipping.html
└ → /_mozilla/css/overflow_clipping.html 5530bf469bc6722ca3d302bf0b8b612ae5849800
/_mozilla/css/overflow_clipping_ref.html d7e45f35e33fe6f368243d70bcc70d6feb486aeb
Testing 5530bf469bc6722ca3d302bf0b8b612ae5849800 == d7e45f35e33fe6f368243d70bcc70d6feb486aeb
Ran 604 tests finished in 481.0 seconds.
• 599 ran as expected. 46 tests skipped.
• 1 tests failed unexpectedly
• 1 tests timed out unexpectedly
• 3 tests passed unexpectedly
Note: I only added a `read_pixels` + `image_write`. If I missed something, please let me know.
Also, I'm always afraid to trust HTML-to-HTML image comparison (at first :) ). Is there any test that compares against reference pictures? Just to be sure.
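For context on how these comparisons work: the wpt runner screenshots both the test page and the reference page and compares hashes of the raw pixels (the 40-character values in the logs above are such hashes). A toy illustration, using FNV-1a as a stand-in for whatever hash the harness really uses:

```rust
/// FNV-1a over raw pixel bytes; a toy stand-in for the hash the
/// wpt harness actually computes over its screenshots.
fn hash_pixels(rgba: &[u8]) -> u64 {
    let mut h: u64 = 0xcbf29ce484222325;
    for &b in rgba {
        h ^= b as u64;
        h = h.wrapping_mul(0x100000001b3);
    }
    h
}

fn main() {
    let test_frame = vec![0u8, 255, 0, 255]; // one green RGBA pixel
    let ref_frame = vec![0u8, 255, 0, 255];
    // An `==` reftest passes when both screenshots hash identically.
    assert_eq!(hash_pixels(&test_frame), hash_pixels(&ref_frame));
}
```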
Holy Jesus, having 1-3 failing tests out of 600 is super good news!
wpt tests for `css` (release build, on Linux, with Vulkan):
Ran 14877 tests finished in 6249.0 seconds.
• 12118 ran as expected. 159 tests skipped.
• 1761 tests failed unexpectedly
• 85 tests timed out unexpectedly
• 18 tests passed unexpectedly
• 895 tests had unexpected subtest results
And the same version of WR with OpenGL:
Ran 14877 tests finished in 2162.0 seconds.
• 12131 ran as expected. 159 tests skipped.
• 1755 tests failed unexpectedly
• 84 tests timed out unexpectedly
• 13 tests passed unexpectedly
• 894 tests had unexpected subtest results
The numbers are great. Looks like there are just a few things to address, really, compared to the current GL build. One thing that caught my interest is that the Vulkan build took 3x more time. This is concerning, given that we ultimately want to see an improvement in performance. Please check it out and see where/why we are spending so much time.
Looks like there are just a few things to address, really, compared to the current GL build.
I managed to hunt down the actual differences between the original WR (excluding the flaky tests):
▶ PASS [expected FAIL] /css/css-backgrounds/background-size-025.html
▶ PASS [expected FAIL] /css/css-backgrounds/background-size-027.html
▶ FAIL [expected PASS] /css/css-images/tiled-gradients.html
▶ FAIL [expected PASS] /css/css-images/tiled-radial-gradients.html
▶ TIMEOUT [expected FAIL] /css/compositing/mix-blend-mode/mix-blend-mode-animation.html
▶ TIMEOUT [expected PASS] /css/css-values/vh_not_refreshing_on_chrome.html
Please check it out and see where/why we are spending so much time.
One thing for sure is that we use a "headless" winit (`with_visibility(false)`) as opposed to headless OSMesa. But I just checked, and it didn't have that much impact on the time.
My guess would be that every test starts a Servo instance, which means we need to create and destroy pipelines, descriptor pools, buffers, etc. That is an overhead, but only for the first few frames, and of course a reftest only uses the first two frames (test + ref).
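Within a single process, that first-frame overhead could be amortized by creating the expensive objects lazily, exactly once. A sketch with `std::sync::OnceLock` and a dummy stand-in for the real pipeline state (this is not how WebRender is structured today; and since every wpt test spawns a fresh Servo process, helping across tests would additionally need something like a Vulkan pipeline cache serialized to disk):

```rust
use std::sync::OnceLock;

/// Dummy stand-in for expensive GPU state (pipelines, descriptor pools, ...).
struct RenderState {
    id: u32,
}

fn render_state() -> &'static RenderState {
    static STATE: OnceLock<RenderState> = OnceLock::new();
    // The expensive creation runs only on the first call; later frames
    // (or tests in the same process) reuse the same state.
    STATE.get_or_init(|| RenderState { id: 42 })
}

fn main() {
    let a = render_state() as *const RenderState;
    let b = render_state() as *const RenderState;
    assert_eq!(a, b); // same instance, created once
    println!("state id = {}", render_state().id);
}
```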
@dati91 thanks for looking into that! Aren't the tiled gradients differences expected given the change in dithering behavior? Those last two tests need some more attention. Maybe it's just the timeouts to bump :)
One thing for sure is that we use a "headless" winit (`with_visibility(false)`) as opposed to headless OSMesa. But I just checked, and it didn't have that much impact on the time.
I wonder if we should avoid using `winit` at all for headless WR rendering. Given that we can assume only a single frame needs to be in flight for reftests, where headless is currently used, we can just create an image to render to instead of spawning a window with a framebuffer.
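A minimal sketch of that idea, with a plain CPU-side buffer standing in for the GPU image the real implementation would allocate through gfx-hal (all names here are made up for illustration):

```rust
/// Hypothetical headless render target: just an RGBA pixel buffer,
/// no window or swapchain involved.
struct HeadlessTarget {
    width: usize,
    height: usize,
    rgba: Vec<u8>,
}

impl HeadlessTarget {
    fn new(width: usize, height: usize) -> Self {
        HeadlessTarget { width, height, rgba: vec![0; width * height * 4] }
    }

    /// Stand-in for rendering a frame: fill every pixel with one color.
    fn clear(&mut self, color: [u8; 4]) {
        for px in self.rgba.chunks_exact_mut(4) {
            px.copy_from_slice(&color);
        }
    }

    /// Stand-in for read_pixels: hand the raw bytes to the reftest harness.
    fn read_pixels(&self) -> &[u8] {
        &self.rgba
    }
}

fn main() {
    let mut target = HeadlessTarget::new(4, 4);
    target.clear([0, 255, 0, 255]);
    assert_eq!(&target.read_pixels()[..4], &[0, 255, 0, 255]);
}
```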
My guess would be that every test starts a Servo instance, which means we need to create and destroy pipelines, descriptor pools, buffers, etc.
Please clarify, was the "And the same version of WR with opengl" using the headless GL or native GL?
I can imagine that starting a GPU context and waiting for the frames would have higher latency than just doing that on the CPU. In that case, we should compare the runs with native GL.
Please clarify, was the "And the same version of WR with opengl" using the headless GL or native GL?
It was the default usage, which uses a headless context with OSMesa. From the wpt log:
pid:20936 VMware, Inc.
pid:20936 softpipe
pid:20936 3.3 (Core Profile) Mesa 18.1.0-devel
Aren't the tiled gradients differences expected given the change in dithering behavior?
I haven't looked into the tests yet, but this is my thought as well.
A quick update on these errors:
▶ PASS [expected FAIL] /css/css-backgrounds/background-size-025.html The original WR has some differences in the text (red dots where they differ), and ours draws it fine.
▶ PASS [expected FAIL] /css/css-backgrounds/background-size-027.html The same thing happens here. (red dots)
▶ FAIL [expected PASS] /css/css-images/tiled-gradients.html We don't draw some pixels at the edge. (green dots) Most probably related to #158.
▶ FAIL [expected PASS] /css/css-images/tiled-radial-gradients.html The dithering seems off. (blue dots) Most probably related to #158.
▶ TIMEOUT [expected FAIL] /css/compositing/mix-blend-mode/mix-blend-mode-animation.html
▶ TIMEOUT [expected PASS] /css/css-values/vh_not_refreshing_on_chrome.html
Unfortunately, the last two can't be solved with a simple timeout bump. It seems like Servo just runs into an infinite loop while also allocating, because it quickly runs out of memory, starts to swap, and crashes my terminal. Fun. If I run those tests without the wptrunner, they both run fine. I will look into what's going on with these more closely.
FAIL [expected PASS] /css/css-images/tiled-radial-gradients.html
I disabled the dithering and this one passed, so this is definitely #158.
But the other one still looks the same (incorrect edge), so that's a different issue :/
Sneak peek (Servo with the Metal backend):
Status update on the `tests/wpt/mozilla/tests/css/` tests:
Running 604 tests in web-platform-tests
▶ PASS [expected FAIL] /_mozilla/css/font_fallback_01.html
▶ PASS [expected FAIL] /_mozilla/css/font_fallback_02.html
▶ PASS [expected TIMEOUT] /_mozilla/css/transition_calc_implicit.html *
Ran 604 tests finished in 68.0 seconds.
• 601 ran as expected. 46 tests skipped.
• 3 tests passed unexpectedly
* It passes sometimes.
Running 604 tests in web-platform-tests
▶ PASS [expected FAIL] /_mozilla/css/font_fallback_01.html
▶ PASS [expected TIMEOUT] /_mozilla/css/transition_calc_implicit.html
▶ PASS [expected FAIL] /_mozilla/css/font_fallback_02.html
Ran 604 tests finished in 238.0 seconds.
• 601 ran as expected. 46 tests skipped.
• 3 tests passed unexpectedly
Running 604 tests in web-platform-tests
▶ PASS [expected TIMEOUT] /_mozilla/css/transition_calc_implicit.html
▶ FAIL [expected PASS] /_mozilla/css/viewport_meta.html
└ → /_mozilla/css/viewport_meta.html 01167f1f8b941ded476a36306ed31a011abf73cc
/_mozilla/css/viewport_rule_ref.html c6e879462fb4cfbe41a480e2692c7b09ab05cb27
Testing 01167f1f8b941ded476a36306ed31a011abf73cc == c6e879462fb4cfbe41a480e2692c7b09ab05cb27
▶ FAIL [expected PASS] /_mozilla/css/viewport_percentage_vmin_vmax_b.html
└ → /_mozilla/css/viewport_percentage_vmin_vmax_b.html c26cebd88147b53d3477e28f9dad07df2adb2231
/_mozilla/css/viewport_percentage_vmin_vmax_ref.html 295a17bcdb060d449c6ea6e2a295804f1b024e8d
Testing c26cebd88147b53d3477e28f9dad07df2adb2231 == 295a17bcdb060d449c6ea6e2a295804f1b024e8d
▶ FAIL [expected PASS] /_mozilla/css/transform_3d_from_outside_viewport.html
└ → /_mozilla/css/transform_3d_from_outside_viewport.html 0365932f7bec9e14bef7b8517bee9d6f8778475e
/_mozilla/css/transform_3d_from_outside_viewport_ref.html 0365932f7bec9e14bef7b8517bee9d6f8778475e
Testing 0365932f7bec9e14bef7b8517bee9d6f8778475e != 0365932f7bec9e14bef7b8517bee9d6f8778475e
▶ FAIL [expected PASS] /_mozilla/css/viewport_percentage_vw_vh_b.html
└ → /_mozilla/css/viewport_percentage_vw_vh_b.html fcfa50faaedb474f18223595cb67cd3507e15931
/_mozilla/css/viewport_percentage_vw_vh_ref.html a704c6d9b0a0ae501437564c7b216b0481206668
Testing fcfa50faaedb474f18223595cb67cd3507e15931 == a704c6d9b0a0ae501437564c7b216b0481206668
▶ FAIL [expected PASS] /_mozilla/css/viewport_rule.html
└ → /_mozilla/css/viewport_rule.html 01167f1f8b941ded476a36306ed31a011abf73cc
/_mozilla/css/viewport_rule_ref.html c6e879462fb4cfbe41a480e2692c7b09ab05cb27
Testing 01167f1f8b941ded476a36306ed31a011abf73cc == c6e879462fb4cfbe41a480e2692c7b09ab05cb27
Ran 604 tests finished in 255.0 seconds.
• 598 ran as expected. 46 tests skipped.
• 5 tests failed unexpectedly
• 1 tests passed unexpectedly
Servo branch for the Vulkan tests: https://github.com/zakorgy/servo/tree/master. Servo branch for the Metal tests: https://github.com/dati91/servo/tree/master.
An update on the above Mac errors:
▶ FAIL [expected PASS] /_mozilla/css/viewport_meta.html:
▶ FAIL [expected PASS] /_mozilla/css/viewport_percentage_vmin_vmax_b.html:
▶ FAIL [expected PASS] /_mozilla/css/viewport_percentage_vw_vh_b.html:
▶ FAIL [expected PASS] /_mozilla/css/viewport_rule.html
▶ FAIL [expected PASS] /_mozilla/css/transform_3d_from_outside_viewport.html: In this case the online diff tool we use didn't show any difference; the hashes compared in the previous comment are also identical for this test:
/_mozilla/css/transform_3d_from_outside_viewport.html 0365932f7bec9e14bef7b8517bee9d6f8778475e
/_mozilla/css/transform_3d_from_outside_viewport_ref.html 0365932f7bec9e14bef7b8517bee9d6f8778475e
Testing 0365932f7bec9e14bef7b8517bee9d6f8778475e != 0365932f7bec9e14bef7b8517bee9d6f8778475e
It looks like the same issue occurs in four cases: the vertical size of the elements is shorter than expected.
transform_3d_from_outside_viewport.html is testing that the reference renders differently than the test file, which is why it's considered a failure when the hashes match.
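So the pass condition depends on the relation the reftest declares; roughly like this sketch (hypothetical names, not the harness's actual code):

```rust
/// The relation a reftest declares between test and reference,
/// e.g. `rel="match"` (==) or `rel="mismatch"` (!=) in wpt metadata.
enum Relation {
    Match,    // screenshot hashes must be equal
    Mismatch, // screenshot hashes must differ
}

/// Hypothetical pass check over screenshot hashes: a `!=` test like
/// transform_3d_from_outside_viewport.html fails when the hashes are equal.
fn reftest_passes(rel: Relation, test_hash: &str, ref_hash: &str) -> bool {
    match rel {
        Relation::Match => test_hash == ref_hash,
        Relation::Mismatch => test_hash != ref_hash,
    }
}

fn main() {
    // Equal hashes on a mismatch test mean failure, as seen in the log above.
    assert!(!reftest_passes(Relation::Mismatch, "0365932f", "0365932f"));
}
```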
Thanks @jdm for the info. I have compared our result image to one created with an original Servo build, and it is now visible that we are missing a quad. The highlighted area in the image below is a green quad which we don't draw to the screen.
▶ FAIL [expected PASS] /_mozilla/css/transform_3d_from_outside_viewport.html:
We ran the css wpt tests (under `tests/wpt/web-platform-tests/css/`) with the Servo versions from https://github.com/szeged/webrender/issues/173#issuecomment-415373702, and created a spreadsheet from the results. The total number of tests is 15260. A short summary:
We see the following differences in the wpt css tests with gfx-metal:
▶ FAIL [expected PASS] /css/css-backgrounds/background-clip/clip-rounded-corner.html: The reference image we draw is wrong, the highlighted area should be green (Otherwise the test image looks the same as with OpenGL):
▶ FAIL [expected PASS] /css/css-backgrounds/border-bottom-left-radius-010.xht
▶ FAIL [expected PASS] /css/css-backgrounds/border-bottom-left-radius-011.xht
▶ FAIL [expected PASS] /css/css-backgrounds/border-bottom-right-radius-010.xht
▶ FAIL [expected PASS] /css/css-backgrounds/border-bottom-right-radius-011.xht
▶ FAIL [expected PASS] /css/css-backgrounds/border-top-left-radius-010.xht
▶ FAIL [expected PASS] /css/css-backgrounds/border-top-left-radius-011.xht
▶ FAIL [expected PASS] /css/css-backgrounds/border-top-right-radius-010.xht
▶ FAIL [expected PASS] /css/css-backgrounds/border-top-right-radius-011.xht
These tests look the same and are most probably related. The result image speaks for itself:
▶ FAIL [expected PASS] /css/css-flexbox/flex-align-items-center.html: Three green rectangles are missing from the dotted area.
▶ FAIL [expected PASS] /css/css-flexbox/flex-items-flexibility.html: A green rect is missing from the dotted area.
▶ FAIL [expected PASS] /css/css-backgrounds/background-attachment-local/attachment-local-positioning-2.html The text position is wrong. Reference image: Result image:
▶ FAIL [expected PASS] /css/css-backgrounds/background-attachment-local/attachment-scroll-positioning-1.html The text position is wrong. Probably related to the above test. Reference image: Result image:
▶ FAIL [expected PASS] /css/css-images/tiled-gradients.html ▶ FAIL [expected PASS] /css/css-images/tiled-radial-gradients.html These tests have the same issues as with Vulkan: https://github.com/szeged/webrender/issues/173#issuecomment-398462707
▶ PASS [expected FAIL] /css/compositing/mix-blend-mode/mix-blend-mode-parent-element-overflow-hidden-and-border-radius.html This passes because the test and reference images are both blank.
▶ PASS [expected FAIL] /css/css-backgrounds/background-attachment-local/attachment-local-clipping-color-6.html
▶ PASS [expected FAIL] /css/css-backgrounds/background-attachment-local/attachment-local-clipping-image-6.html
These pass because neither the test nor the reference image is drawn properly: text and a rectangle are missing from the circle.
▶ PASS [expected FAIL] /css/css-backgrounds/background-size-025.html
▶ PASS [expected FAIL] /css/css-backgrounds/background-size-027.html
▶ PASS [expected FAIL] /css/filter-effects/filter-contrast-003.html
These are actually passing.
After the gfx-rs/gfx#2403 fix the remaining failing tests are:
`mozilla/tests/css` folder: all fixed (https://github.com/szeged/webrender/issues/173#issuecomment-415701153 and https://github.com/szeged/webrender/issues/173#issuecomment-416137756)
CSS tests from the `web-platform-tests/css` folder: we have two failing tests, and these are probably related, because the text position is wrong in both cases:
▶ FAIL [expected PASS] /css/css-backgrounds/background-attachment-local/attachment-local-positioning-2.html
Reference image: Result image:
▶ FAIL [expected PASS] /css/css-backgrounds/background-attachment-local/attachment-scroll-positioning-1.html
Reference image: Result image:
These appear to be the same issues as in https://github.com/servo/servo/pull/21725. So it's not your fault; something regressed in WR, and we are trying to narrow it down now.
Status update on the wpt tests for `tests/wpt/mozilla/tests/css/` (release build, Linux):
Vulkan:
Ran 604 tests finished in 269.0 seconds.
• 591 ran as expected. 46 tests skipped.
• 11 tests failed unexpectedly
• 2 tests passed unexpectedly
OpenGL:
Ran 604 tests finished in 85.0 seconds.
• 601 ran as expected. 46 tests skipped.
• 1 tests timed out unexpectedly
• 2 tests passed unexpectedly
wpt tests for `tests/wpt/mozilla/tests/css/` (release build, Linux):
Vulkan:
Ran 604 tests finished in 245.0 seconds.
• 598 ran as expected. 46 tests skipped.
• 3 tests failed unexpectedly
• 3 tests passed unexpectedly
OpenGL:
Ran 604 tests finished in 78.0 seconds.
• 601 ran as expected. 46 tests skipped.
• 3 tests passed unexpectedly
Failed tests:
FAIL [expected PASS] /_mozilla/css/stacked_layers.html
test:
reference:
FAIL [expected PASS] /_mozilla/css/iframe/hide_and_show.html
test:
reference:
FAIL [expected PASS] /_mozilla/css/linear_gradients_parsing_a.html
test:
reference:
We can track the Servo-related issues/goals here.