Open CJ-Wright opened 6 years ago
Make sure it is worth the effort. There will be some cost-benefit analysis that should be done as we make this more elaborate.
Moving this to cycle 2, as the cycle 1 release is getting loaded up now; this is important but lower priority (not mission critical).
It would be good to have acceptance tests first.
We should have a simulated beamline repo whose job it is to run the xpdAcq/xpdAn integration and acceptance tests on a CI. This will a) give us a place to put all the interesting breaking use cases we run into, b) allow us to run tests on a CI that normally require our physical presence at the beamline, and c) mean that our acceptance tests at the beamline are more likely to be fully featured and successful.
Expected Behavior
Use cases get written up as tests. Acceptance tests get written up as tests. CI runs the tests. We fix breakages.
Current Behavior
None of this happens.
Possible Solution
One interesting part of this is that we can provide all the inputs (motors, detectors, etc.) to the tests as pytest fixtures. When it comes time to run the acceptance tests at the beamline, we just import all the tests and swap the simulated fixtures for the real hardware, then run the same tests against it.
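A minimal sketch of the fixture-swapping idea. The `SimulatedMotor` class and the `XPD_AT_BEAMLINE` environment variable here are hypothetical placeholders, not part of xpdAcq/xpdAn; at the beamline the fixture would construct the real ophyd device instead of the simulated one.

```python
import os

import pytest


class SimulatedMotor:
    """Stand-in for a real motor: stores a position and moves instantly."""

    def __init__(self):
        self.position = 0.0

    def move(self, target):
        self.position = target


@pytest.fixture
def motor():
    # The one place that decides which hardware the tests see.
    # On CI this returns the simulated motor; at the beamline the
    # branch below would return the real device instead.
    if os.environ.get("XPD_AT_BEAMLINE"):
        raise NotImplementedError("construct the real ophyd motor here")
    return SimulatedMotor()


def test_motor_moves(motor):
    # The test body is identical for simulated and real hardware.
    motor.move(1.5)
    assert motor.position == 1.5
```

Because the test only talks to the fixture, the CI suite and the beamline acceptance run share one test body; only the fixture changes.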
Context
Integration tests may save us from a world of pain. Acceptance tests were requested by the BLS.
Priority
High