ISISComputingGroup / IBEX

Top level repository for IBEX stories

IOC Test Framework: Testing with a device #3936

Open LiamPanchaud opened 5 years ago

LiamPanchaud commented 5 years ago

As a developer, it would be useful if the test suite could run not only in devsim and recsim (interfacing with the emulator and/or the IOC) but also with the device.

We rely on python emulators to ensure that the IOC and command set will work properly with a device, but with unfamiliar/complex devices it can be difficult to know that the emulator is behaving as the device would. Allowing the test suite to run against the device would help catch parts of the emulator that do not correctly reflect the device. Such errors can cause problems to slip through the net.

It would also allow us to troubleshoot actual devices on which our IOCs have been deployed. If an instrument scientist comes to us with a problem, a first step could be to run the test suite against the device. This could help catch issues introduced by power cycles, front panel changes, etc.

It could be implemented by adding another TestMode and some more test case decorators, so that tests designed to run against real devices can be ignored by build servers.
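A minimal sketch of what this could look like, assuming a mode flag and a skip decorator (the names `TestModes.DEVICE`, `IOC_TEST_MODE`, and `device_only` are hypothetical, not the framework's actual API):

```python
import os
import unittest


class TestModes:
    RECSIM = "recsim"
    DEVSIM = "devsim"
    DEVICE = "device"   # proposed new mode: run against real hardware


def current_mode():
    # Assumed for illustration: mode selected via an environment variable.
    return os.environ.get("IOC_TEST_MODE", TestModes.RECSIM)


def device_only(test_func):
    """Skip a test unless the suite is running against a real device,
    so build servers (recsim/devsim) ignore it."""
    return unittest.skipUnless(
        current_mode() == TestModes.DEVICE,
        "requires a physical device",
    )(test_func)
```

Build servers would then run with `IOC_TEST_MODE` unset (or set to recsim/devsim) and the device tests would be reported as skipped rather than failed.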

John-Holt-Tessella commented 5 years ago

I do not quite understand what this ticket does. Can you give us an example or two of a test that would not be caught by booting up the IOC and looking at the alarms panel?

KathrynBaker commented 5 years ago

Errors in dealing with calibrations, or the “it isn’t talking” support call when the alarm isn’t there – it is far easier to run a test asking for the basic behaviours and realise that this is an undocumented requirement. In development it has already been seen that an IOC can work with an emulator while the device itself behaves differently; we run those tests that we can against the device, and we improve our emulators and testing abilities longer term without having to notice everything manually. It would also prove a simpler way to do full end-to-end testing when we move to a new version of EPICS/ASYN/STREAM, if we want to change the way we provide access to the devices, or if we just think the device is dodgy.

I know those aren’t examples of tests – but they are examples of use cases. Most of those wouldn’t necessarily be caught by alarms, and knowing whether the fault is in the protocol file or elsewhere can be handy. You know when you ran your tests, so you can look at the error log with better knowledge of what was going on.

John-Holt-Tessella commented 5 years ago

Notes after a brief discussion (See Tom's example): Allows you to run a manual test plan for a device. This is reproducible and consistent between people and requires less device knowledge. We can record results to see when devices were last tested. E.g.

@ManualTest(run_if="Set point to 30 ok?")
def test_GIVEN_device_WHEN_set_setpoint_to_30_THEN_set_point_readback_is_30():

We can run against the emulator but also against the device.

E.g. 2:

@ManualTest()
def test_GIVEN_device_with_temp_10_WHEN_get_temp_THEN_readback_is_10():
    expected_result = user_answer("What is the temp shown?", default=10)
    assert_that_pv_is_about("TEMP", expected_result, 0.1)

At expected_result the user is prompted to type in the current reading; if it is run as an emulator/recsim test, the default value is used.
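A sketch of how `user_answer` might behave, assuming a `manual` flag that would be set when running against a device (the signature and flag are assumptions, not the framework's actual API):

```python
def user_answer(question, default, manual=False):
    """Prompt the tester for a reading when running manually against a
    device; fall back to the default in emulator/recsim runs."""
    if not manual:
        return default
    reply = input("{} [{}]: ".format(question, default))
    return float(reply) if reply.strip() else default
```

In a recsim/devsim run (`manual=False`) the call returns the default immediately, so the existing automated tests are unaffected.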

The overhead for each IOC is thinking what tests can be run manually.
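The `ManualTest` decorator used in both examples could be sketched roughly as below; this is a hypothetical illustration of the `run_if` yes/no prompt, not the framework's actual implementation:

```python
import functools


def ManualTest(run_if=None):
    """Wrap a test so that, when run manually, the operator can first be
    asked a yes/no question (run_if); answering "n" skips the test body."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if run_if is not None:
                reply = input(run_if + " [y/n]: ")
                if reply.strip().lower().startswith("n"):
                    return  # operator declined; treat as skipped
            return func(*args, **kwargs)
        return wrapper
    return decorator
```

A real implementation would also need to report the skip to the test runner and record the operator's answers, so results can show when a device was last tested.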

ChrisM-S commented 5 years ago

How far are we from being able to run some/all of these as diagnostics on an instrument? There are several tests which would, as @LiamPanchaud and @KathrynBaker point out, potentially isolate the cause of errors on a beamline. They would also potentially allow validation of new firmware versions and other changes on equipment we do not control. Is the test library already included with the client?

John-Holt-Tessella commented 5 years ago

This ticket has not been prioritised, so nowhere close. I suspect that were we to implement this it would take a day, but this would be without any tests that run on instruments. It would then make each IOC take longer, so we need some discussion beforehand. However, Kathryn, Liam and Tom were all keen when I questioned the utility, and I have been persuaded.

ChrisM-S commented 5 years ago

I was more talking about the technical distance from one working manual test (as discussed) to executing that one test through the framework on an instrument (if the device was moved and the port number macro changed). Would the scaling to more tests depend only on having them written? Which I accept might be a longer-term thing!