hoodiehq / hoodie-css

Framework for all hoodie sites
http://hoodiehq.github.io/hoodie-css/

Question: Regression testing our CSS on remote sites is tough #38

Closed: lewiscowper closed this issue 8 years ago

lewiscowper commented 9 years ago

The problems we're currently facing on the feature/build-automation branch:

1. The casperjs/phantomcss tests do not exit with an exit code of 0/1 (as far as I can figure out), which means our server doesn't stop either (I believe that stopping would be the correct behaviour, looking through npm-run-all). There's a sketch of one possible fix just after this list.

1 (addendum): We have to spin up a separate "server" type thing to proxy to each site (i.e. "hood.ie", "faq.hood.ie" and "docs.hood.ie"), which means we have to set up three different test suites, one per site, so that the appropriate links and pages are tested (e.g. "faq.hood.ie" has no "/blog", whereas "hood.ie" does).

2. Our "server" type thing doesn't start up immediately, which can mean we take screenshots of a page that failed to load. We could solve this with a wait or similar, but that would be prone to breakage, and because we're testing our CSS against a live site, a fixed sleep/wait is not ideal in case the live site is under heavy load, for example. (The glue-script sketch further down polls the server instead of sleeping.)
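Here's the sketch I mentioned for point 1. As far as I can tell from the PhantomCSS docs, the suite itself can report pass/fail through the process exit code. The file name, URL, port, paths and option names below are all invented, and the option names may differ between PhantomCSS versions, so treat this as a rough idea rather than working config:

```js
// visual-test.js (invented name), run with `casperjs visual-test.js`
var phantomcss = require('phantomcss');
var casper = require('casper').create();

phantomcss.init({
  // option names follow the PhantomCSS README and may differ between versions
  casper: casper,
  libraryRoot: './node_modules/phantomcss',
  screenshotRoot: './screenshots',
  failedComparisonsRoot: './failures'
});

casper.start('http://localhost:8080/'); // the local proxy in front of hood.ie (port invented)
casper.viewport(1280, 800);

casper.then(function () {
  phantomcss.screenshot('body', 'homepage');
});

casper.then(function () {
  phantomcss.compareAll(); // diff against the stored baseline screenshots
});

casper.run(function () {
  // getExitStatus() should be 0 when every comparison passed and non-zero otherwise,
  // so the casperjs process itself reports pass/fail to npm / npm-run-all.
  casper.exit(phantomcss.getExitStatus());
});
```

For the other half (stopping the server once the tests finish), npm-run-all's parallel mode apparently has a `--race` flag that kills the remaining tasks when one of them finishes, though its docs say that only happens when the finishing task exits with code zero, so it may not cover a failing run.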

Our main problem is that the kind of testing we need to do just doesn't seem to have been catered for, because it's a super edge case. We want to take CSS running locally and apply it to a series of live sites. This means that the standard regression testing model, where we would have the site we want to test and run the tests from there, is incompatible with our workflow.

So basically, everything we try to do winds up being hacks on hacks on hacks. The worst part is, we won't need to regression test in this way after the refactor, because we'll be able to isolate components and test how they render without worrying about breaking the pages, since our components will be the things rendered on the live site too.

But yeah, if anyone knows a way around either of those errors, that would be great. It could even just be a script to tie together the two scripts we need (each set is one script to set up the proxy and another to run the casperjs test suite), so that when scriptA exits it closes scriptB down too. A rough sketch of that idea follows.
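For the record, the shape of that glue script could be something like this (a rough sketch only; the commands, port and file names are invented, not our actual setup). It also sidesteps problem 2 by polling the proxy until it answers instead of sleeping for a fixed time:

```js
// run-visual-tests.js: starts the proxy, polls until it answers, runs the
// CasperJS suite, then tears the proxy down and exits with the suite's code.
var spawn = require('child_process').spawn;
var http = require('http');

var PROXY_URL = 'http://localhost:8080/';
var proxy = spawn('node', ['css-proxy.js'], { stdio: 'inherit' });

function waitForProxy(retriesLeft) {
  http.get(PROXY_URL, function (res) {
    res.resume(); // the body doesn't matter, only that the proxy answered
    runTests();
  }).on('error', function () {
    if (retriesLeft === 0) {
      console.error('Proxy never came up');
      proxy.kill();
      process.exit(1);
    }
    setTimeout(function () { waitForProxy(retriesLeft - 1); }, 500);
  });
}

function runTests() {
  var tests = spawn('casperjs', ['visual-test.js'], { stdio: 'inherit' });
  tests.on('exit', function (code) {
    proxy.kill();                     // scriptA finished, so close scriptB down too
    process.exit(code === 0 ? 0 : 1); // propagate pass/fail to npm / CI
  });
}

waitForProxy(20);
```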

kevinSuttle commented 9 years ago

May I suggest: https://speedcurve.com/blog/visual-diffs-on-every-deploy/

martin-hewitt commented 9 years ago

Would BackstopJS (https://garris.github.io/BackstopJS/) be of any use?

sthulb commented 9 years ago

When I was at the BBC, we (mostly @daveblooman) developed a tool called Wraith, which allows for visual regression testing.

lewiscowper commented 9 years ago

@kevinSuttle That won't work for us, because we'd need to deploy our code to test it. We want to test it without necessarily committing, and certainly without deploying. Our tests need to run as part of our testing process, and much like you don't deploy your JavaScript to run tests on it, we want to run tests on the CSS in its current state, so we can determine whether the deploy should go ahead.

Also, it's a paid-for service, and however low the cost, this is an OSS project, so we'd like to use free software (beer + speech) where possible.

@martin-hewitt, we've got a CasperJS setup already; what we need is a good way to test our local CSS on a remote site, via a proxy or similar, for which we're currently using a custom commit of https://www.npmjs.com/package/livestyle.
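To make the shape of that concrete, the proxy idea roughly boils down to the sketch below. This is only an illustration, not our livestyle setup; the host, port, paths and the link-matching regex are all invented:

```js
// css-proxy.js: fetches pages from the live site, rewrites the stylesheet
// <link> to point at the locally built CSS, and passes everything else through.
var http = require('http');
var https = require('https');
var fs = require('fs');

var TARGET_HOST = 'hood.ie';         // the site under test
var LOCAL_CSS = './dist/hoodie.css'; // invented path to the locally built CSS

http.createServer(function (req, res) {
  // Serve the local stylesheet directly.
  if (req.url === '/local/hoodie.css') {
    res.writeHead(200, { 'Content-Type': 'text/css' });
    fs.createReadStream(LOCAL_CSS).pipe(res);
    return;
  }

  // Fetch everything else from the live site.
  https.get({
    host: TARGET_HOST,
    path: req.url,
    headers: { 'accept-encoding': 'identity' } // uncompressed body so it can be rewritten
  }, function (upstream) {
    var type = upstream.headers['content-type'] || '';

    // Non-HTML responses (images, fonts, ...) are streamed straight through.
    if (type.indexOf('text/html') === -1) {
      res.writeHead(upstream.statusCode, upstream.headers);
      upstream.pipe(res);
      return;
    }

    // HTML gets its stylesheet link pointed at the local build instead.
    var body = '';
    upstream.setEncoding('utf8');
    upstream.on('data', function (chunk) { body += chunk; });
    upstream.on('end', function () {
      // The href pattern is a guess; the real <link> would need to be matched exactly.
      body = body.replace(
        /<link[^>]+hoodie-css[^>]*>/,
        '<link rel="stylesheet" href="/local/hoodie.css">'
      );
      res.writeHead(200, { 'Content-Type': 'text/html' });
      res.end(body);
    });
  });
}).listen(8080, function () {
  console.log('Proxying http://localhost:8080/ -> https://' + TARGET_HOST);
});
```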

@sthulb Unfortunately Wraith falls out by being a Ruby-based tool, and we'd really like to accomplish this with a JS or Node.js based tool if at all possible. Thank you though :)

dblooman commented 9 years ago

:(

lewiscowper commented 9 years ago

If it comes down to it and Wraith supports what we need it to, we can consider it, but I'm still not sure that we have a great fit there.

Edit: I guess I can ask @daveblooman and @sthulb while they're here.

So we have this repository, which serves up CSS via gh-pages.

The gh-pages link is then used in the <head> tag of hood.ie, faq.hood.ie and docs.hood.ie (or that's the intent anyway). When testing, we need to take the CSS from this repository and apply it to the remote sites (ideally without cloning and running them all locally, because the main site's repository is very heavy due to all the blog content ever). We want something that we can run on our own machines, CI servers, etc.

So we need to capture the current state of each site at the time of the test and take screenshots to compare against. Then we plug in our current dev CSS, run the same tests, and compare the screenshots.
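As an aside, the "plug in our current dev CSS" step might not strictly need a proxy at all; the test itself could swap the stylesheet in-page before screenshotting. This is only an idea sketch, assuming a casper/phantomcss setup like the one earlier in this thread, and the file path, URL and CLI flag are invented:

```js
// Hypothetical "plug in the dev CSS" helper for a CasperJS/PhantomCSS run.
var fs = require('fs');                      // the PhantomJS fs module
var localCss = fs.read('./dist/hoodie.css'); // the locally built CSS

function useLocalCss() {
  casper.evaluate(function (css) {
    // Drop the existing stylesheets (real code would match only the hoodie-css link)...
    [].slice.call(document.querySelectorAll('link[rel="stylesheet"]'))
      .forEach(function (link) { link.parentNode.removeChild(link); });
    // ...and inject the local build inline instead.
    var style = document.createElement('style');
    style.textContent = css;
    document.head.appendChild(style);
  }, localCss);
}

casper.thenOpen('http://hood.ie/', function () {
  // First run (no flag) screenshots the live CSS and becomes the baseline;
  // pass --dev-css on the second run to compare the local build against it.
  if (casper.cli.has('dev-css')) {
    useLocalCss();
  }
  phantomcss.screenshot('body', 'homepage');
});
```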

Double bonus, we can't easily do a staging environment because everything is served up from gh-pages.

To be honest, we can do some npm script magic to run gem install if we need to; we just need some way of testing our changes without deploying them properly, because deploying means pushing to the gh-pages branch, which then means potential breakage of the site, reverts, and git history getting messy.

kevinSuttle commented 9 years ago

@lewiscowper http://tldr.huddle.com/blog/css-testing/ ?

lewiscowper commented 9 years ago

@kevinSuttle So the tools they're talking about there, PhantomCSS mixed with CasperJS, are the ones we're currently using to create our test suites, so that's along the same lines. However, the problem we're having is that regression testing is almost always done from the same repository/project as the content, and all our content is spread across multiple repos, with the CSS coming from this global repository. Unfortunately, without regression testing in place, we can't change things and rely on the outcome working without manually testing every page on every site. Hence the need to run some kind of proxy server that takes our local CSS and loads it instead of the page's CSS. It's a tough problem, and I don't think any regression testing software is designed to do it outright, which is why we're using livestyle just now.

kevinSuttle commented 9 years ago

Gotcha. Yeah that does sound tricky.

danielputerman commented 9 years ago

@lewiscowper check us out: http://www.applitools.com. Just to name a few pros:

If you want, I'd be glad to chat and see what can we do to help. Feel free to contact me: daniel.puterman at applitools.com