PIC-IRIS / PH5

Library of PH5 clients, APIs, and utilities

[BUG] Multiple Failing Test Cases #422

Closed. timronan closed this issue 4 years ago.

timronan commented 4 years ago

Describe the bug When running runtests.py there are multiple failing tests, making the PH5 test suite unreliable.


To Reproduce Set up the ph5 environment, then run ./runtests.py.

Expected behavior All tests should pass so that new code can be incorporated.

Screenshots

======================================================================
FAIL: test_array_t (ph5.core.tests.test_ph5api.TestPH5API)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/tronan/Desktop/Projects/PH5/ph5/core/tests/test_ph5api.py", line 153, in test_array_t
    self.assertEqual(channel[0]['response_table_n_i'], 0)
AssertionError: 7 != 0

======================================================================
FAIL: test_cut (ph5.core.tests.test_ph5api.TestPH5API)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/tronan/Desktop/Projects/PH5/ph5/core/tests/test_ph5api.py", line 942, in test_cut
    traces[0].data[0])
AssertionError: 1317166976 != 1331852800

======================================================================
FAIL: test_response_t (ph5.core.tests.test_ph5api.TestPH5API)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/tronan/Desktop/Projects/PH5/ph5/core/tests/test_ph5api.py", line 468, in test_response_t
    16)
AssertionError: 1.88039941931e-05 != 1.85966491699e-05 within 16 places

----------------------------------------------------------------------
Ran 103 tests in 21.165s

FAILED (failures=3, expected failures=2)
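
For context on the third failure: unittest's assertAlmostEqual(a, b, places) checks that round(a - b, places) == 0, so 16 places is an extremely tight tolerance for values of order 2e-05. The snippet below (illustrative only, not PH5 code) reproduces the arithmetic from the log:

```python
import unittest


class AlmostEqualSemantics(unittest.TestCase):
    # Illustrative only, using the two values from the test_response_t
    # failure above. assertAlmostEqual(a, b, places) passes only when
    # round(a - b, places) == 0, so places=16 is far tighter than the
    # ~2.1e-07 gap between these values.
    def test_tolerance_from_log(self):
        a = 1.88039941931e-05
        b = 1.85966491699e-05
        self.assertNotEqual(round(a - b, 16), 0)  # hence the FAIL above


if __name__ == '__main__':
    unittest.main()
```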
timronan commented 4 years ago

Travis CI tests against different data than what is included in the GitHub directory /PH5/ph5/test_data. All data sources need to be consistent so we can write test cases that pass both in the integration pipeline and on local machines; the test suite has to behave consistently to be valuable. One way to detect this kind of data drift is sketched after the screenshots below.

Travis CI test result when local tests pass

Screen Shot 2020-08-14 at 11 47 02 AM

Local test result when Travis CI tests pass

Screen Shot 2020-08-14 at 12 00 29 PM
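
A way to make that consistency checkable would be to commit a checksum manifest for /PH5/ph5/test_data and verify it on both CI and local machines. The helper below is purely a sketch; nothing like it exists in PH5, and the function names and manifest format are assumptions for illustration:

```python
# Hypothetical helper, not part of PH5: compare local test data against
# a committed manifest of SHA-256 digests so CI and local machines can
# prove they are testing against the same files.
import hashlib
import os


def sha256_of(path, block_size=65536):
    """Return the SHA-256 hex digest of the file at `path`."""
    digest = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(block_size), b''):
            digest.update(chunk)
    return digest.hexdigest()


def verify_test_data(data_dir, manifest):
    """Return the relative paths whose digests differ from `manifest`.

    `manifest` maps a path relative to `data_dir` to its expected digest.
    """
    return [rel for rel, expected in manifest.items()
            if sha256_of(os.path.join(data_dir, rel)) != expected]
```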
dsentinel commented 4 years ago

Please don't post screen grabs of text.

I think you need to run create_ph5.sh before running the tests.

timronan commented 4 years ago

Noted on both counts. You are correct about running the create_ph5.sh script. We should consider updating the CONTRIBUTING.md document to reflect this. Currently, CONTRIBUTING.md states:

Create a unit test(s) for your contribution and make sure it passes. If a test module doesn't already exist for the module you are updating, create one in the appropriate test package following the test_ file-name pattern and then update the PH5/runtests.py script.

This should say something about the create_ph5.sh script; something similar to the sentence below could be added.

To verify that the test suite passes, execute create_ph5.sh and then runtests.py.

Another option would be to have runtests.py execute create_ph5.sh automatically. This would make the test suite more robust and prevent users from wrongly concluding that tests are failing, especially since the majority of the tests pass even when create_ph5.sh has not been run. A sketch of that wiring follows.
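
A minimal sketch of the idea, assuming create_ph5.sh sits next to runtests.py (this is a proposal, not existing PH5 code):

```python
# Sketch only: runtests.py would rebuild the test PH5 archives before
# discovering tests, so a stale or missing archive never shows up as
# spurious test failures. The script location is an assumption.
import os
import subprocess
import sys


def ensure_test_ph5():
    script = os.path.join(os.path.dirname(os.path.abspath(__file__)),
                          'create_ph5.sh')
    try:
        subprocess.check_call(['sh', script])
    except (subprocess.CalledProcessError, OSError) as err:
        sys.exit('create_ph5.sh failed (%s); aborting the test run.' % err)


if __name__ == '__main__':
    ensure_test_ph5()
    # ...existing runtests.py test discovery and execution follows...
```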