codeforamerica / chime

A city-focused content management system
http://chimecms.org/
BSD 3-Clause "New" or "Revised" License
35 stars · 12 forks

Configure acceptance tests for nightly runs #507

Closed · migurski closed this 8 years ago

migurski commented 8 years ago

Acceptance tests now upload data to S3 in a publicly-accessible JSON file.

Sample data:

[
  {
    "start": 1440721931.888421
  }, 
  {
    "status": "done", 
    "browser": "Windows 7 IE 9.0", 
    "ok": true, 
    "elapsed": 120.39418697357178
  }, 
  {
    "status": "errored", 
    "browser": "Windows 8.1 Firefox 35.0", 
    "exception": "(<class 'selenium.common.exceptions.NoSuchElementException'>, NoSuchElementException(), <traceback object at 0x7fef36cd8488>)", 
    "ok": false, 
    "elapsed": 123.96230697631836
  }, 
  {
    "status": "failed", 
    "browser": "Windows 7 IE 8.0", 
    "exception": "(<type 'exceptions.AssertionError'>, AssertionError('\\'You deleted\\' not found in u\"Saved changes to the article-916104 article! Remember to submit this change for feedback when you\\'re ready to go live.\"',), <traceback object at 0x7fef36cf3cf8>)", 
    "ok": false, 
    "elapsed": 148.9050600528717
  }, 
  {
    "ok": false, 
    "end": 1440722146.508441
  }
]
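The upload path isn't shown in this thread, but assembling and publishing a document with this shape is straightforward. Below is a minimal sketch, assuming a record layout matching the sample above (a `start` record, one record per browser run, then an `end` record whose `ok` is the conjunction of all runs); the function name and bucket/key are hypothetical:

```python
import json


def build_report(runs, start, end):
    """Assemble the results JSON: a start record, the per-browser
    run records, and an end record summarizing overall success."""
    records = [{"start": start}]
    records.extend(runs)
    records.append({"ok": all(r.get("ok", False) for r in runs), "end": end})
    return json.dumps(records, indent=2)


# Publishing to S3 with boto3 (an assumption; the project may use a
# different client) would look roughly like:
#
#   import boto3
#   boto3.client("s3").put_object(
#       Bucket="chime-test-results", Key="latest.json",
#       Body=build_report(runs, start, end),
#       ContentType="application/json", ACL="public-read")
```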
migurski commented 8 years ago

Updated the sample JSON per @wpietri’s suggestion.

wpietri commented 8 years ago

This looks good to me. I think the OUTPUT_FILE code more properly belongs in the superclass in that we want it to happen for all acceptance tests rather than just this one, but since we currently have only one, it's a distinction without a difference. However, I'd rather not merge the Travis bit until the nightly job is up, running, and producing a report that people will actually see on a daily basis.

tmaybe commented 8 years ago

Latest version looks fine to me too. There are a few things that could be changed to make the file more human-readable and useful, if that's desired – like readable dates and links to BrowserStack jobs (though from what I hear that's not really possible).

Also, will we have a hook into Slack so we'll see when these tests have been run?
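The readable-dates suggestion amounts to rendering the epoch floats in the `start`/`end` records (e.g. `1440721931.888421` in the sample) as ISO-8601 strings. A one-liner sketch, assuming UTC is acceptable:

```python
from datetime import datetime, timezone


def readable(ts):
    """Render an epoch float like 1440721931.888421 as an
    ISO-8601 UTC timestamp string."""
    return datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()
```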

tmaybe commented 8 years ago

Oh, and saving archived test results maybe?

tmaybe commented 8 years ago

None of my suggestions are intended to delay merging this PR though.

migurski commented 8 years ago

I’m putting the finishing touches on the nightly job now!

tmaybe commented 8 years ago

It looks like if the tests are run n times per OS/browser, we will see every test represented in the JSON. Might be cool to have one entry per OS/browser combination, with a pass/fail ratio and a list of exceptions.
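That per-browser rollup could be done either at upload time or by the reporting script. A small sketch of the grouping, using the field names from the sample JSON above (the function name is hypothetical):

```python
from collections import defaultdict


def summarize(records):
    """Collapse repeated runs into one entry per OS/browser
    combination, with pass/fail counts and the exceptions seen."""
    by_browser = defaultdict(
        lambda: {"passed": 0, "failed": 0, "exceptions": []})
    for rec in records:
        browser = rec.get("browser")
        if browser is None:  # skip the start/end bookkeeping records
            continue
        entry = by_browser[browser]
        if rec.get("ok"):
            entry["passed"] += 1
        else:
            entry["failed"] += 1
            if "exception" in rec:
                entry["exceptions"].append(rec["exception"])
    return dict(by_browser)
```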

wpietri commented 8 years ago

Ah, I was thinking that the summarization would be taken care of by whatever script was doing the user-visible reporting. E.g., that there would be an HTML page that would load this stuff with JS. Or that some script would grab the data after a run and send us email.

tmaybe commented 8 years ago

I'm fine with that. I wasn't sure whether this was supposed to be user-visible or consumed by another service (like your dashboard?).

migurski commented 8 years ago

I believe this is ready to merge.