garris / BackstopJS

Catch CSS curve balls.
http://backstopjs.org
MIT License

Puppeteer dev branch! #704

garris opened this issue 6 years ago

garris commented 6 years ago

This is happening!

This branch... https://github.com/garris/BackstopJS/tree/puppeteer

is tracking this branch... https://github.com/krisdigital/BackstopJS

Interested in contributing? ...please coordinate with @krisdigital.

What you need to know...

krisdigital commented 6 years ago

Hello to all potential puppet players!

Here is a little road map for all who want to try it out or even contribute. In general, for me it works much more reliably, and it is also faster.

Take it for a spin and let me know what you think 🚀

garris commented 6 years ago

Just bumped to 3.2.0 and pushed to NPM @beta!

TESTERS & COMMENTS! 🙏

npm install backstopjs@beta

Aeotrin commented 6 years ago

Works for me :) NOTE: the engine needs to be set to puppet, and there is a Node version requirement of > v8 as well.

This also fixed the issue of a white screenshot I was having when running against a local (.local or .dev) domain #706 .
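To try the beta, point the engine field of backstop.json at the new renderer. A minimal sketch — the "puppet" value follows the note above (later releases use "puppeteer"), so check the docs for your installed version:

```json
{
  "engine": "puppet"
}
```

As noted above, run it under Node > v8.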

matthiaskomarek commented 6 years ago

I use the reference urls for our test suite with the following command:

backstop reference && backstop test

And it looks like the test suite hangs when an error occurs. In the console I get:

COMMAND | Command `reference` ended with an error after [95.102s]
COMMAND | Error: Navigation Timeout Exceeded: 60000ms exceeded
                    at Promise.then (/Users/matthiaskomarek/Documents/Projects/zurich/visual-regression/node_modules/puppeteer/lib/NavigatorWatcher.js:71:21)
                    at <anonymous>
(node:35439) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): Error: Navigation Timeout Exceeded: 60000ms exceeded
(node:35439) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

After this, it just stops running, but doesn't terminate the terminal process.

If no error occurs, everything is running fine!
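The symptom (a warning, then an idle process) matches how Node 8 handles an unhandled promise rejection: it only prints a warning and leaves the event loop running. A hypothetical guard — not BackstopJS code — that would surface the engine error as a result instead:

```javascript
// Hypothetical sketch: settle a rejected command promise into a
// result object so the runner can report the failure and move on,
// instead of leaving an unhandled rejection that idles the process.
function guardCommand(commandPromise) {
  return commandPromise.then(
    (value) => ({ ok: true, value }),
    (err) => ({ ok: false, error: err.message })
  );
}

// Example: a navigation timeout still settles the run.
guardCommand(Promise.reject(new Error('Navigation Timeout Exceeded: 60000ms exceeded')))
  .then((result) => console.log(result.ok, result.error));
// prints: false Navigation Timeout Exceeded: 60000ms exceeded
```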

krisdigital commented 6 years ago

@matthiaskomarek do you get a different behavior using Chromy in the stable branch? Not sure if it has something to do with Puppeteer..

AkA84 commented 6 years ago

Hey @krisdigital, thanks a lot for doing this! I'm testing it out and I saw that when I set debugWindow: true, no Chrome window is opened when I run the tests. Has the param name changed?

EDIT: I also see no difference between debug: false and debug: true

krisdigital commented 6 years ago

@AkA84 Both true. I did not implement the debugWindow option, but it is not a problem to add... I never could find any use for the debug option though, as it throws out just too much text; I think it only makes sense when you have no debugWindow option, like with PhantomJS... https://github.com/garris/BackstopJS/pull/713

AkA84 commented 6 years ago

@krisdigital thanks a lot again, and yeah, I agree: I didn't find the debug option useful unless I wanted to stare at a big dump of HTML for some reason!

matthiaskomarek commented 6 years ago

Hey @krisdigital, I have tried the same with the current 3.1.21 branch, and if something fails with the Chrome engine, the script goes on and just shows an empty entry in the report. So the hung script is an issue with the puppeteer engine.

krisdigital commented 6 years ago

@matthiaskomarek Thanks for checking, I will look into it!

AkA84 commented 6 years ago

Has anyone had problems with running captures in parallel (say, 5 at a time)? I get random errors (puppeteer can't find an element, for example) for scenarios with onReady scripts that run just fine when doing one capture at a time.

I'm still not sure if this is something wrong with my setup/scripts, but just wanted to bring it up in case someone encountered the same issue. At the moment I can rely on the results only if I don't do parallel capturing, which really is a shame :(

garris commented 6 years ago

@krisdigital this issue will probably be addressed by the work happening in #712.

garris commented 6 years ago

@AkA84 we are still working on the migration. Once all of our feature tests pass, we'll be looking at why some parallel cases seem to work fine but others have issues. This could be a performance issue in Backstop, but it could also be the network, the server, or something else in the environment.

iain17 commented 6 years ago

Hey @krisdigital We're super excited for puppeteer. I'm currently trying to get it to run in Docker with the latest changes from https://github.com/garris/BackstopJS/tree/puppeteer.

One thing I noticed is that we seem to create a new browser instance for each test case. We're currently experimenting with just having one browser instance and creating only new page objects (tabs) for each test. Let me know what you think.

My repo is at: https://github.com/iain17/BackstopJS/tree/puppeteer
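The one-browser-many-tabs idea can be sketched generically. This is a hypothetical illustration, not the actual BackstopJS runner; `launch` stands in for `puppeteer.launch`:

```javascript
// Hypothetical sketch: share one browser process across the run and
// open a fresh page (tab) per scenario, instead of launching a whole
// browser for each test case.
async function withSharedBrowser(launch, scenarios, runScenario) {
  const browser = await launch(); // one browser process for the run
  try {
    const results = [];
    for (const scenario of scenarios) {
      const page = await browser.newPage(); // fresh tab per scenario
      try {
        results.push(await runScenario(page, scenario));
      } finally {
        await page.close();
      }
    }
    return results;
  } finally {
    await browser.close();
  }
}
```

The trade-off discussed below still applies: tabs share a browser session, so scenarios that need isolated cookies or storage would need a different strategy.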

kiran-redhat commented 6 years ago

This will increase the speed significantly: keeping the browser open and reusing the same instance.

krisdigital commented 6 years ago

@iain17 The current behaviour with new browser instances mirrors the behaviour of Chromy. Opening a tab is probably more resource-friendly; see this discussion: https://github.com/GoogleChrome/puppeteer/issues/85 Maybe it could be provided as an option for users who do not need a new session for each scenario; afaik this would also be possible with Chromy.

krisdigital commented 6 years ago

@garris @matthiaskomarek Just made a pull request that should handle that problem https://github.com/garris/BackstopJS/pull/718

garris commented 6 years ago

📣 Puppeteer branch is merged to master and pushed to npm @canary! 🎆

npm install backstopjs@canary

krisdigital commented 6 years ago

🎉🍾🍻

cactusa commented 6 years ago

Just trying puppeteer out in v3.2.3 on a docker setup and got an error:

BackstopJS v3.2.3
Loading config:  /src/backstop.config.js 

COMMAND | Executing core for `test`
createBitmaps | Selected 45 of 45 scenarios.
      COMMAND | Command `test` ended with an error after [0.06s]
      COMMAND | Error: Failed to launch chrome!
                [0406/123126.823853:ERROR:zygote_host_impl_linux.cc(88)] Running as root without --no-sandbox is not supported. See https://crbug.com/638180.

                TROUBLESHOOTING: https://github.com/GoogleChrome/puppeteer/blob/master/docs/troubleshooting.md

                    at onClose (/usr/local/lib/node_modules/backstopjs/node_modules/puppeteer/lib/Launcher.js:246:14)
                    at Interface.helper.addEventListener (/usr/local/lib/node_modules/backstopjs/node_modules/puppeteer/lib/Launcher.js:235:50)
                    at emitNone (events.js:110:20)
                    at Interface.emit (events.js:207:7)
                    at Interface.close (readline.js:367:8)
                    at Socket.onend (readline.js:147:10)
                    at emitNone (events.js:110:20)
                    at Socket.emit (events.js:207:7)
                    at endReadableNT (_stream_readable.js:1059:12)
                    at _combinedTickCallback (internal/process/next_tick.js:138:11)
>> Exited with code: 1.
>> Error executing child process: Error: Process exited with code 1.

Anyone experiencing the same?

krisdigital commented 6 years ago

@cactusa see the link in the error: https://github.com/GoogleChrome/puppeteer/blob/master/docs/troubleshooting.md, especially the sandbox section, also this PR https://github.com/garris/BackstopJS/pull/722

cactusa commented 6 years ago

@krisdigital Just pulled the latest commit to try out the PR you linked above, but it didn't work for me; absolutely no change at all. I still get that same error. And I rebuilt the docker image as well and followed the documentation from the readme.

FYI @VesterDe

garris commented 6 years ago

@krisdigital or @here Anybody know how to handle CSP issues with puppeteer? I am getting errors like...

######## Error running Puppeteer ######### Error: Evaluation failed: EvalError: Refused to evaluate a string as JavaScript because 'unsafe-eval' is not an allowed source of script in the following Content Security Policy directive ...

VesterDe commented 6 years ago

@cactusa Hey, did you also add the engineOptions to your config file?

Specifically, this line:

"engineOptions": {"args": ["--no-sandbox", "--disable-setuid-sandbox"]}

That solved your exact problem for me. I'm not sure how to help if that doesn't change the error, at least...
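For context, that line sits at the top level of the config file. A sketch of the relevant fragment (other fields elided; the "engine" value is an assumption — the thread mentions "puppet" earlier, newer releases use "puppeteer"):

```json
{
  "engine": "puppeteer",
  "engineOptions": {
    "args": ["--no-sandbox", "--disable-setuid-sandbox"]
  }
}
```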

garris commented 6 years ago

@VesterDe unfortunately no. Same exact behavior. 😢

This is weird -- I didn't have this problem with the Chromy engine. 🤔

krisdigital commented 6 years ago

@garris This seems to be a known issue https://github.com/GoogleChrome/puppeteer/issues/1814 ...

cactusa commented 6 years ago

@VesterDe Yes I did indeed include that exact line in my config. 😞

garris commented 6 years ago

Thanks guys! Fixed by adding this onBefore script 👉 https://github.com/garris/BackstopJS/commit/67801c259f340c986bd0c25281a7eb5a79c7bc25
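The linked commit is the authoritative fix; for illustration, one way an onBefore script can sidestep CSP is a sketch like this, assuming Puppeteer's page.setBypassCSP (available since Puppeteer 1.2):

```javascript
// onBefore script sketch (hypothetical): tell Chrome to ignore the
// page's Content Security Policy so BackstopJS can evaluate its
// capture helpers in the page context.
module.exports = async (page, scenario, viewport) => {
  await page.setBypassCSP(true);
};
```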

garris commented 6 years ago

I am hoping this will be a much better fix in the near future! https://github.com/GoogleChrome/puppeteer/pull/2324

cactusa commented 6 years ago

Cool. Update from me - I finally had time to make backstop run with puppeteer, but most of my scenarios now fail for two reasons. For some reason the font renders differently now, which shouldn't be possible if I am running a docker container with consistent versions.

[screenshot: 2018-04-18 at 14:47:21]

The second and bigger issue is that for most scenarios the captured test image is just grey, nothing else on it. Naturally it detects close to 100% difference.

[screenshot: 2018-04-18 at 14:34:39]

So far I have tried adding a delay to each scenario, and also tried with 'asyncCaptureLimit': 1. Couldn't see any other issues posted with this same problem. Any suggestions?

VesterDe commented 6 years ago

I can confirm that I get both of those errors intermittently as well, even if running reference and test one after another on the same machine.

Are your test images gray or empty? Mine are actually just WIDTHxHEIGHT of empty pixels; maybe the gray is just the CSS on the diffing tool?

Do any of your reference images appear empty as well? I also get empty references sometimes...

Although, as far as I can tell, this never happens if there are no selectors present for the run: if I'm just capturing the entire website in one image, it's never empty...

cactusa commented 6 years ago

Not sure, but I think it's just a grey image of the same size as the reference. I have spotted that some of the test images are half captured and half grey. I have one example where the test capture is of an area slightly above the intended selector...

I am also getting this error at the very start:

>> (node:1) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 exit listeners added. Use emitter.setMaxListeners() to increase limit

And this error on a couple scenarios where the test image is completely missing:

ENGINE ERROR: Navigation Timeout Exceeded: 60000ms exceeded

I guess I am not switching to puppeteer just yet. I was really hoping for this to work, because it doesn't randomly hang executions as Chromy does, and it is faster 😢

VesterDe commented 6 years ago

Ok, I've found a bit of a curve ball here... I'd be really grateful if someone can confirm this for me.

I've isolated the failing cases on one of my projects, and it seems that the screenshot will be empty if any of the numbers pertaining to the bounding box of the selected element have decimals. So, if I log the bounding boxes, these values produce a screenshot:

Screenshoting header:
{"x":0,"y":0,"width":800,"height":66}
Screenshoting footer:
{"x":0,"y":948,"width":800,"height":703}

And these produce an empty file:

Screenshoting #testimonials:
{"x":0,"y":3475.296875,"width":800,"height":140}
Screenshoting #cta1:
{"x":0,"y":421.203125,"width":800,"height":918.109375}

This is... unusual... but seems consistent on my machine. What also happens is that as soon as one selector in an array fails for this reason, all the remaining selectors in the scenario fail.

It does provide an insight into the randomness we've been seeing, because while width and height are constant, X and Y are relative to the current viewport, and may have different values on different runs.

I've tested this out on a standalone puppeteer run and it seems to happen as well. If someone could reproduce these findings we can make an issue with the puppeteer team.

Edit: Perhaps this issue will lead to a good outcome.

Edit2: Indeed, as per the above issue, Puppeteer@1.1.1 doesn't have this problem, it seems to start with 1.2.0.

krisdigital commented 6 years ago

@VesterDe Great debugging! Maybe the Puppeteer version should be locked to 1.1.1 until the bug is fixed then..

garris commented 6 years ago

@VesterDe Thank you for taking the time to investigate this!

cactusa commented 6 years ago

@garris is it OK to move puppeteer to 1.1.1 as it is more stable? I can test and make a pull request.

garris commented 6 years ago

Looks like puppeteer is at 1.4.0. Yes, we definitely want to bump this.

cactusa commented 6 years ago

@garris Sadly version 1.4.0 still exhibits the same bug. Some test images are blank some are partially captured. Will try with 1.1.1 next

Update: Weirdly v1.1.1 has the exact same problem. Maybe puppeteer is not the problem here 😭

jsmecham commented 6 years ago

I am also experiencing this issue with the latest BackstopJS and Puppeteer. Sometimes the images will be blank, sometimes they will be cropped incorrectly and sometimes they will display part of the content and white for the rest of it. Have not been able to find a workaround, yet (tried delays, etc).

garris commented 6 years ago

Is this fixable by adding a Math.floor() to our capture flow?

cactusa commented 6 years ago

@garris could you point me to the file I need to make the change in, and I'll try to test this as soon as I have some spare time.

garris commented 6 years ago

Here is where selector screenshots are called by backstop...

https://github.com/garris/BackstopJS/blob/master/core/util/runPuppet.js#L365

You could look into changing this to a different puppeteer call where you specify the screen coordinates explicitly using rounded boundingBox values.
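A hypothetical helper for that approach (not the merged fix): snap the fractional boundingBox to whole pixels, expanding the far edge so rounding never crops the element, before passing it as the clip option to page.screenshot:

```javascript
// Hypothetical sketch: round a fractional boundingBox to integer
// pixel coordinates for page.screenshot({ clip }).
function roundClip(box) {
  const x = Math.floor(box.x);
  const y = Math.floor(box.y);
  return {
    x,
    y,
    // Ceil the far edge so the rounded clip still covers the element.
    width: Math.ceil(box.x + box.width) - x,
    height: Math.ceil(box.y + box.height) - y
  };
}

// One of the failing boxes logged earlier in the thread:
console.log(roundClip({ x: 0, y: 3475.296875, width: 800, height: 140 }));
// prints: { x: 0, y: 3475, width: 800, height: 141 }
```

Integer boxes pass through unchanged, so screenshots that already worked would be unaffected.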

cactusa commented 6 years ago

@VesterDe I have been doing some debugging of my own and I managed to reproduce the same results as you, but in my case I get the empty screenshots even for entries that do not have decimals in their boundingBox object. I think the issue is caused by something else.

@garris I am testing a bug fix for this and will be making a pull request shortly. I have basically done what you suggested, but without rounding the numbers and it worked.

cactusa commented 6 years ago

My pull request https://github.com/garris/BackstopJS/pull/808

VesterDe commented 6 years ago

Cool, can't wait to test it out.

Edit: At first glance, your pull request seems to fix it for me! Great job @cactusa :)