Closed olivecha closed 3 months ago
Hey @olivecha 👋
Thanks for bringing this up!
We've created a validation function to help users detect unexpected changes in behaviour caused by different MATLAB/Octave versions or operating systems.
This validation function compares results against the Test1.mat reference data and should be run with the user's data of interest.
Best,
Thiago
I looked at and ran the validation function, and it does not behave like a software test. Tests should not be run against arbitrary user data (they could be, to be more robust, but simpler tests need to pass first). For your project, a simple test would be to run the main function and compare its output with a known reference:
```matlab
% Run the segmentation on the test data shipped with the repository
load("Data/Test1.mat");
fff_segmenter(Acoustic_signal, Dir_X, Dir_Y, 200e3);
% Load the freshly computed result
load("Segmentation results/points segmentation results Test1.mat")
% Load a reference copy of the segmentation results saved earlier
% in another folder, e.g. "test_data/"
load("test_data/Points segmentation results test.mat")
% Do this for each result array
test1 = all(abs(result_contour(:) - test_contour(:)) < 1e-10);
if test1
    disp('Contour matches the test reference')
else
    disp('Test failed when comparing contour')
end
```
This would allow you to validate that changes in the code don't break the existing behaviour (e.g. when accepting pull requests) and that other users' environments provide the expected functionality (various MATLAB/Octave versions).
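As a sketch, the per-array checks could be collected into a single standalone test script. The file paths and array names (`result_contour`, `test_contour`) below are taken from the snippet above and are assumptions about the repository layout; the `check` helper is hypothetical:

```matlab
% run_tests.m -- hypothetical regression test script; paths and array
% names are assumptions based on the comparison sketched above.
tol = 1e-10;   % tolerance for floating-point differences across versions

% Re-run the segmentation on the reference input
load("Data/Test1.mat");
fff_segmenter(Acoustic_signal, Dir_X, Dir_Y, 200e3);

% Load the fresh result and the saved reference
load("Segmentation results/points segmentation results Test1.mat")
load("test_data/Points segmentation results test.mat")

% One check per result array; add a line for each array the segmenter saves
failed = 0;
failed = failed + check("contour", result_contour, test_contour, tol);

if failed == 0
    disp('All regression tests passed')
else
    error('%d regression test(s) failed', failed)
end

function n = check(name, a, b, tol)
    % Returns 1 if the arrays differ in size or by more than tol, else 0
    if isequal(size(a), size(b)) && all(abs(a(:) - b(:)) < tol)
        fprintf('PASS: %s\n', name);
        n = 0;
    else
        fprintf('FAIL: %s\n', name);
        n = 1;
    end
end
```

A script like this can be run directly after any code change, so a single command tells you whether all result arrays still match the stored reference.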
@olivecha
Understood. I have updated the validation function to follow the type of test that you proposed.
Cheers, Thiago
Closing as this is solved
It would be important for users to be able to validate that the script has the expected behaviour. You could simply add a testing script which performs the segmentation on a dataset supplied in the repository and compares the result against a saved previous result (a regression test). This would catch unexpected changes in behaviour due to different MATLAB versions or operating systems.