department-of-veterans-affairs / va.gov-team

Public resources for building on and in support of VA.gov. Visit complete Knowledge Hub:
https://depo-platform-documentation.scrollhelp.site/index.html

Implement automated testing of login flow through integration with Jenkins #842

Closed: brandonrapp closed this issue 5 years ago

brandonrapp commented 5 years ago

Problem

Our CI does not currently include E2E tests that run against the actual API and simulate a logged in session.

Goal

We should have a Jenkins job we can integrate into our release deploy process that runs E2E tests and reports any failures severe enough to warrant blocking the deploy.

Artifact

Notes

Part of this work involves introducing a new framework (TestCafe) into our build and deploy process. PR

Consider setting up a Docker container that already sets up a logged in session.
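The Docker idea could be sketched roughly as below. The image name, script path, and session-seeding approach are all assumptions for illustration, not anything the team has built:

```
# Hypothetical image that performs login once and persists the resulting
# session for tests to reuse, so individual test runs skip the ID.me flow.
FROM node:10

WORKDIR /app
COPY package.json yarn.lock ./
RUN yarn install
COPY . .

# scripts/login-and-save-session.js is a hypothetical script that walks the
# login flow with a test account and writes cookies to /app/session.json.
ENTRYPOINT ["node", "scripts/login-and-save-session.js"]
```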

U-DON commented 5 years ago

After further thought, I will hold off on implementing a full E2E test as a deploy-blocking Jenkins job, as I don't believe the test, as designed, is suitable for CI. I elaborated on this discovery in much more detail in the PR where I introduce TestCafe, but I'll summarize here. https://github.com/department-of-veterans-affairs/vets-website/pull/10300#issuecomment-522643678

Integrating with the API will involve external services, and while we are able to mock many interactions, others might be tricky or require special handling. One such interaction is the very login flow we wanted these tests to cover: the currently proposed automated tests go through an actual login flow with ID.me, since I found it difficult to mock that whole process.

Because the tests would run across multiple browsers and depend on external services, which reduces the reliability of the results, I don't see it being practical to run them for every deploy. I wouldn't be comfortable with tests blocking a deploy if an un-mocked external service happened to be down at the time of the build. However, I will propose a parallel job that runs at the deploy cutoff (2pm Eastern) and reports failures for us to investigate and determine whether to postpone the deploy.
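The proposed non-blocking job could look something like the declarative pipeline below. The cron expression uses Jenkins's `TZ=` prefix to pin the schedule to Eastern time; the stage name and the `yarn test:e2e-login` script are hypothetical:

```
// Hypothetical Jenkinsfile for a non-blocking E2E run at the 2pm ET deploy cutoff.
pipeline {
  agent any
  triggers {
    // Jenkins cron accepts a TZ= line; run weekdays at 14:00 Eastern.
    cron('TZ=America/New_York\n0 14 * * 1-5')
  }
  stages {
    stage('E2E login tests') {
      steps {
        // Hypothetical npm script that runs the TestCafe suite against the real API.
        sh 'yarn test:e2e-login'
      }
    }
  }
  post {
    failure {
      // Report the failure for investigation rather than blocking the deploy.
      echo 'E2E login tests failed; investigate before deploying.'
    }
  }
}
```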

We discussed our overall testing strategy, and consumer-driven contract testing (CDCT) was an interesting idea that came out of that and seems like it would be worth some discovery (#1005). The notion behind it is that the consumers of the API should verify their expectations of API responses with the API itself, and a framework that implements consumer-driven contract testing will aid in preserving this contract.

I think the real impetus behind the desire for full E2E testing is a lack of confidence in our current E2E tests: they mock API responses without any verification that those mocks match what the API actually returns. That approach leaves us prone to missing errors that arise when the API changes its response format. CDCT seems like a promising approach to bridging this gap.

I'd like to revisit the full E2E testing after we consider how CDCT fits into the bigger picture of our testing strategy.