In GitLab by @EthanJamesLew on Aug 14, 2020, 10:40
This sounds good, and I believe the time response is akin to something that @cslockett mentioned. This issue may start to inform the analysis part of CSAF.
In GitLab by @zutshi on Aug 14, 2020, 13:31
I really like the Control System Fuzzer idea. But let's start with a static/manual test suite, and then move on to an automatic fuzzer. The manually written tests should be cheap to build, but will highlight the objectives.
Step responses are easy and cheap to build, and when tried with edge cases for initial conditions (x0) and params they can highlight potentially unstable responses. That said, I would love frequency-based tests, as they complement time-domain testing.
Software-in-the-loop (SIL) testing is what I had in mind. We can provide an API to help users write and check specifications over the time-domain step response and the frequency-domain response. For example,
Test_StepResponse.settling_time(controller, 10.0, plant, steady_state=..., plant_params=...)  # settle within 10 s
Test_StepResponse.max_overshoot(controller, 2.0, plant, steady_state=..., plant_params=...)
...
Test_FreqResponse...
Once we have the specifications written down, or even a simple API, we can think about fuzzing techniques to find violations.
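A hypothetical sketch of what such checks could look like over a simulated step-response trace; the function names, the 2% settling band, and the commented-out simulate_step() call are illustrative assumptions, not an existing CSAF API:

```python
# Illustrative step-response checks over (t, y) arrays from a unit-step simulation.
import numpy as np

def settling_time(t, y, y_final, tol=0.02):
    """Time after which y stays within tol * |y_final| of y_final."""
    band = tol * abs(y_final)
    outside = np.where(np.abs(y - y_final) > band)[0]
    if outside.size == 0:
        return t[0]
    last = outside[-1]
    return t[last + 1] if last + 1 < len(t) else float("inf")

def max_overshoot(y, y_final):
    """Peak overshoot as a fraction of the steady-state value."""
    return max(0.0, (np.max(y) - y_final) / abs(y_final))

# Example spec check against a simulated trace (simulate_step is hypothetical):
# t, y = simulate_step(controller, plant, plant_params=...)
# assert settling_time(t, y, y_final=1.0) <= 10.0
# assert max_overshoot(y, y_final=1.0) <= 0.02
```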
In GitLab by @podhrmic on Aug 14, 2020, 14:42
I like the step response idea. I will start with that and see what comes up.
In GitLab by @mattclark on Aug 17, 2020, 08:19
All, sorry for the delay. I agree that the control design may well have been accomplished in another tool. However, I disagree with the following assumptions:
1) Not including Python's Controls module. I don't believe we should assume that the designer won't want to tweak their design in some way, so including the Python Controls module is necessary for at least limited early analysis. Let's assume that a linear controller is designed as a recovery controller rather than as the main controller; recovery controllers may be assumed to be part of the CSAF design, therefore simple controls design should be included. Michal, this also continues to allow the tie between future offers around the Autopilot design and Notebooks leveraging CSAF.
2) I do not agree that non-linear MIMO systems do not need impulse, step, and frequency response analysis. The analysis results simply have a slightly weaker strength from an evidence perspective (meaning you can't prove for the entire state space that the impulse and step responses hold; you have to sample all over the state space). In simulation, highly nonlinear systems still exhibit characteristic linear behaviors. I would rather see our tools exercise, in simulation: A. Input-by-input and combinatorial step responses of a MIMO system, looking for the same criteria of overshoot, undershoot, and steady-state response. B. Frequency analysis, which still has significant applicability; here I recommend a frequency sweep of the inputs to monitor phase and gain tracking error.
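A minimal sketch of such an input frequency sweep, assuming a black-box simulate(u_func, t) callback that returns the tracked output; the least-squares sine fit used to recover gain and phase is an illustrative choice, not part of any existing tool:

```python
import numpy as np

def sweep_gain_phase(simulate, freqs_hz, amplitude=1.0, t_end=20.0, fs=500.0):
    """Drive the system with sinusoids and estimate gain/phase at each frequency."""
    t = np.arange(0.0, t_end, 1.0 / fs)
    results = []
    for f in freqs_hz:
        w = 2.0 * np.pi * f
        y = simulate(lambda tau, w=w: amplitude * np.sin(w * tau), t)
        mask = t > t_end / 2.0                       # discard the transient portion
        # Fit y ~= a*sin(w t) + b*cos(w t) by least squares
        basis = np.column_stack([np.sin(w * t[mask]), np.cos(w * t[mask])])
        a, b = np.linalg.lstsq(basis, np.asarray(y)[mask], rcond=None)[0]
        gain = np.hypot(a, b) / amplitude
        phase_deg = np.degrees(np.arctan2(b, a))
        results.append((f, gain, phase_deg))
    return results   # compare against the commanded input to get tracking error
```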
Does this make sense?
In GitLab by @mattclark on Aug 17, 2020, 09:08
I should also add that I do think the SVD approach is a great one.
In GitLab by @zutshi on Aug 17, 2020, 11:15
@mattclark
Therefore, simple controls design should be included.
Can you say more about the specific features you might be thinking of? Classical design techniques like LQR and PID, or something very specific to the recovery-controller context? It sounds like the latter, so what kind of APIs should we support then?
In GitLab by @podhrmic on Aug 18, 2020, 14:33
In my understanding, we include (i.e., let the user import) the Control Systems Library, nothing less, nothing more.
In GitLab by @bauer-matthews on Sep 10, 2020, 08:17
@podhrmic Just to check where we are at on this. You are starting with some manual tests specific to the pendulum example? And from there we can think about packaging that into a more generic test library / test orchestration system.
In GitLab by @cslockett on Sep 10, 2020, 17:46
OK, this is a bit extensive and we don't need to implement or support all of these, but as promised, I'm outlining a basic functional manual test methodology for multi-loop, nested control systems. This is not addressing state-space coverage testing, just basic manual controls testing. In this case I'm thinking about our F-16 (outer-loop autopilot, inner loop, and the inclusion of safety controllers).
I'll outline a notional test plan and then provide thoughts about which aspects we do in simulation. This plan starts from a real CPS system in which you are hooking up hardware to a controller for the first time and validating controller designs on a plant.
So in design and simulation you wouldn't do all of this, but I want to get the test engineer's thinking process down for reference so we can be both smart and comprehensive about this...
The purpose is to show a practical, somewhat comprehensive methodology for functional and performance testing. We would need to expand it for formal verification (FV), fuzz testing, and code-coverage tests.
Group A: Basic Functional Controller/Sensor/Feedback Response
1) Signal polarity checks: a 1 Hz sine input for the open-loop response to check feedback/sensor polarity and expected gains. Is the feedback/sensor signal positive or negative? Always do this before you close the loop with real HW. Not needed in simulation.
2) Impulse and step response: check basic dynamic response and gains.
3) Verify in open loop that any actuator/sensor limiters are in fact doing what they should.
Clever testing methodologies can combine these into just a few tests.
Objective: polarity and gain checks are as expected; check signal/sensor quality (not needed in simulation).
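For the linear case, a minimal sketch of item 2 using the python-control package; the second-order transfer function below is only a stand-in, not one of our models:

```python
import control

plant = control.tf([1.0], [1.0, 0.8, 1.0])      # example second-order plant
closed_loop = control.feedback(plant, 1)         # unity feedback

t_s, y_step = control.step_response(closed_loop)
t_i, y_imp = control.impulse_response(closed_loop)

info = control.step_info(closed_loop)            # rise time, overshoot, settling time, ...
print("DC gain:", control.dcgain(closed_loop))
print("Overshoot [%]:", info["Overshoot"])
print("Settling time [s]:", info["SettlingTime"])
```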
Group B: System ID for Multi-Loop Control Systems, a.k.a. Frequency-Sweep Transfer Functions. Perform for each inner-loop/outer-loop controller as makes sense in the system:
1) Open-loop Bode plot of the inner-loop controller(s) (i.e., there may be multiple controllers, such as a safety controller)
2) Open-loop Bode plot of the inner-loop controller(s) + plant
3) Closed-loop Bode plot of the inner loop and controller
4) Open-loop Bode plot of the outer-loop controller plus the closed inner loop and plant
5) Closed-loop Bode plot of the full system (outer and inner loops closed with the plant)
Objective: collect system-ID artifacts of the controllers, plant dynamics, and system dynamics in open- and closed-loop configurations.
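A minimal sketch of how a few of these artifacts could be collected for a linear stand-in inner loop, again with python-control; the controller and plant below are placeholders:

```python
import numpy as np
import control

inner_ctrl = control.tf([2.0, 1.0], [1.0, 0.0])   # example PI controller
plant = control.tf([1.0], [1.0, 0.8, 1.0])        # example plant

open_loop = inner_ctrl * plant                     # items 1/2: open-loop controller (+ plant)
closed_loop = control.feedback(open_loop, 1)       # item 3: closed inner loop

w = np.logspace(-2, 2, 200)
control.bode_plot([open_loop, closed_loop], w)     # open- and closed-loop Bode plots

gm, pm, wcg, wcp = control.margin(open_loop)       # gain/phase margins from the sweep
print("Gain margin:", gm, "Phase margin [deg]:", pm)
```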
Group C: Functional System Mode, Performance, and Safety Testing
1) Once open- and closed-loop responses are as expected, exercise all sensor and controller mode switches in a methodical manner. Generally this is done with some pre-planning so you can write an automated test that steps through all modes/controllers.
Group D: an extensive set of testing related to our research areas: fuzz testing, LEC testing, and advanced test suites for training/learning systems.
That's all for now; hope this helps show the basic methodology. Very simple, very manual, and something that can be fully automated.
In GitLab by @podhrmic on Sep 15, 2020, 13:34
Yes, !26 will provide a prototype. Once merged, I am happy to close this.
In GitLab by @podhrmic on Sep 16, 2020, 10:31
In CSAF:
In GitLab by @podhrmic on Sep 19, 2020, 17:26
It turned out that the other outstanding issues took me more time than expected, and as a result the control system tests are not ready yet.
Once we merge !40, I suggest we meet and discuss the next steps: either adding examples with what we have, or continuing to develop the analyzer (I estimate I need 8 hours to do so).
FYI @EthanJamesLew @bauer-matthews @cslockett @mattclark
In GitLab by @bauer-matthews on Oct 12, 2020, 06:45
changed time estimate to 1d
In GitLab by @bauer-matthews on Oct 12, 2020, 11:52
We want to develop a very clean and generic mechanism for defining tests (a la issue #69). The test harness is the most important thing. Then we want to have one-off tests specific to the F16 as time permits. We need tests for a system with an LEC as well as for classical controllers.
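One possible, purely illustrative shape for such a generic test definition; the names below are assumptions, not the design being worked out in issue #69:

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Dict

@dataclass
class ControlTest:
    name: str
    system_config: str                      # path to the system configuration
    initial_conditions: Dict[str, Any]
    spec: Callable[[Any], bool]             # predicate over the simulated trajectory
    params: Dict[str, Any] = field(default_factory=dict)

def run_tests(tests, simulate):
    """Run each test through a user-supplied simulate(config, x0, params) -> trajectory."""
    return {t.name: t.spec(simulate(t.system_config, t.initial_conditions, t.params))
            for t in tests}
```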
In GitLab by @podhrmic on Oct 12, 2020, 16:33
I think the last thing that needs to happen before closing this is to capture the discussion above in some sort of doc for future reference, perhaps updating the README.md
In GitLab by @podhrmic on Oct 28, 2020, 15:27
mentioned in merge request !61
In GitLab by @podhrmic on Nov 10, 2020, 13:00
mentioned in commit fe758b98214141e4db4920ae9a3f7e5044a4600d
In GitLab by @podhrmic on Aug 13, 2020, 14:41
a.k.a. Controls Regression Tests, Baseline Controller Test, Control System Fuzzer
Assumptions
I am assuming that the user of our tool will use various tools and techniques for their control system design, such as Matlab, Simulink, or the Python Controls toolbox. They will often use a linearized model of the plant and typical methods for measuring the controller's performance (with the linear plant), such as overshoot, damping ratios, pole placement, and frequency response (Bode plots, or for MIMO systems sigma values and SVD).
Once the controller is designed, they will likely implement it either manually or through code autogeneration, and end up with some discrete controller that in some way resembles the original design. It could also just be a black box with a neural network, fuzzy rules, etc.
As a result, I don't think it makes sense to reproduce the functionality of Matlab's control system toolbox or to include Python's Controls module. For non-linear MIMO systems the impulse and step responses are less important, and Bode frequency plots make less sense. I assume the user already did their homework and analyzed their controller in some way ahead of time.
Implications
We are given a controller as a black box: it can be a simple LQR, or a complex non-linear LEC (a neural network, for example). We need to test the controller's robustness with respect to plant perturbations, and especially look for the corner cases.
[Please correct me if I am wrong here, MP] In industry, testing of the final implemented controller is typically qualitative: either simulation with a non-linear plant while observing the outputs, or some software/hardware-in-the-loop setup with multiple simulation runs. We want to provide a more automated version of this, look for the edge cases, and provide examples of failures if we can find them.
We also need specifications of what the controller should actually achieve; then we can compare the resulting simulation trajectory against the specifications (treat them as assertions).
Proposed first step
I have two ideas:
[Time response] Probing the operating region
A controller is designed around some operating region, which can be easily specified. Our tool would sample the initial conditions x0 from the operating region, run the simulation, and compare the reference signal with the actual signal. Some measure of error would be necessary; based on the amount of error, the controller either passes or fails. This technique could be used both for the LLC (tracking ps, Nz, etc.) and for autopilots (for example, an altitude-hold autopilot: check whether the desired altitude is achieved in the specified time). Running multiple simulations in parallel would be necessary; speeding up CSAF and 0MQ might be necessary as well, although it is unclear how much time to spend on that.
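A minimal sketch of this probing loop, assuming a black-box simulate(x0) callback returning the time vector, the actual output, and the reference; the uniform sampling and the normalized RMS error metric are illustrative choices. Since each sample is independent, the runs parallelize trivially.

```python
import numpy as np

def probe_operating_region(simulate, x0_low, x0_high, n_samples=100,
                           err_tol=0.05, seed=0):
    rng = np.random.default_rng(seed)
    failures = []
    for _ in range(n_samples):
        x0 = rng.uniform(x0_low, x0_high)              # sample initial conditions
        t, y, y_ref = simulate(x0)
        # Normalized RMS tracking error between reference and actual signal
        err = np.sqrt(np.mean((np.asarray(y) - np.asarray(y_ref)) ** 2))
        err /= np.max(np.abs(y_ref)) + 1e-12
        if err > err_tol:
            failures.append((x0, err))
    return failures                                    # empty list => controller passes
```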
[Frequency response] Generalized SVD for nonlinear systems
Since Bode plots don't accurately capture the frequency-domain behavior of MIMO systems, Singular Value Decomposition of the frequency response is used for linear state-space systems. If I can get a nonlinear state-space model of the system, I can maybe compute the SVD analytically and plot the sigma min/max values as well. But if I cannot get a state-space representation of the system, is there anything else I can do to get similar information? Further research is needed here.
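For the linear case, a minimal sketch of the sigma min/max computation directly from the state-space frequency response G(jw) = C (jwI - A)^(-1) B + D; the A, B, C, D matrices here would come from a linearization of the closed-loop system:

```python
import numpy as np

def sigma_bounds(A, B, C, D, omegas):
    """Return min/max singular values of the frequency response over omegas."""
    n = A.shape[0]
    sig_min, sig_max = [], []
    for w in omegas:
        G = C @ np.linalg.solve(1j * w * np.eye(n) - A, B) + D
        s = np.linalg.svd(G, compute_uv=False)
        sig_min.append(s[-1])
        sig_max.append(s[0])
    return np.array(sig_min), np.array(sig_max)

# omegas = np.logspace(-2, 2, 200)
# smin, smax = sigma_bounds(A, B, C, D, omegas)   # plot both on a log-log scale
```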
Summary
Please let me know what you think. I intend to use the new CSAF framework for the simulations etc. (hence I opened the issue here), and will probably provide this "tester" as a script.
@zutshi @mattclark @bauer-matthews @cslockett @elew