peterbrittain / asciimatics

A cross platform package to do curses-like operations, plus higher level APIs and widgets to create text UIs and ASCII art animations
Apache License 2.0
3.61k stars 238 forks

Rain test fails randomly #359

Closed bmwiedemann closed 1 year ago

bmwiedemann commented 1 year ago

Describe the bug

Rain test fails after 2038-01-19

To Reproduce

On openSUSE, I build the package with:

osc co openSUSE:Factory/python-asciimatics
osc build --vm-type=kvm --noservice --clean --build-opt=--vm-custom-opt="-rtc base=2038-01-20T00:00:00" --alternative-project=home:bmwiedemann:reproducible openSUSE_Tumbleweed

Expected behavior

Tests should continue to pass in the future.

Screenshots

 =================================== FAILURES ===================================
 ___________________________ TestParticles.test_rain ____________________________

 self = <tests.test_particles.TestParticles testMethod=test_rain>

     def test_rain(self):
         """     
         Test that Rain works as expected.
         """     
         screen = MagicMock(spec=Screen, colours=8)
         canvas = Canvas(screen, 10, 40, 0, 0)
         effect = Rain(canvas, 200)
 >       self.check_effect(canvas,
                           effect,
                           lambda value: self.assertIn(chr(value[0]), ' `\\v'))

 tests/test_particles.py:98:
 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
 tests/test_particles.py:43: in check_effect
     self.assertTrue(changed, "failed at step %d %s" % (i, view))
 E   AssertionError: False is not true : failed at step 0

==== 1 failed, 149 passed, 41 skipped, 1 deselected, 966 warnings in 3.67s =====

Additional context

See also https://en.wikipedia.org/wiki/Year_2038_problem

As part of my work on reproducible builds for openSUSE, I check that software still gives identical build results in the future. The usual offset is +16 years, because that is how long I expect some software will be used in some places. This showed up failing tests in our package build. See https://reproducible-builds.org/ for why this matters.
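For background, the 2038-01-19 date is where a signed 32-bit time_t overflows; the exact rollover instant can be checked with the Python standard library alone:

```python
import datetime

# A signed 32-bit time_t holds at most 2**31 - 1 seconds past the Unix epoch.
limit = 2**31 - 1
rollover = datetime.datetime.fromtimestamp(limit, tz=datetime.timezone.utc)
print(rollover)  # 2038-01-19 03:14:07+00:00
```

Any test environment whose clock is set past this instant will expose code paths that assume 32-bit timestamps.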

peterbrittain commented 1 year ago

Interesting... I completely agree that future stability is important, but will struggle to test this as I don't have an openSUSE build env.

First thing to check here is how reproducible this failure is... These tests are for deliberately randomized effects and so sometimes hit failures due to the random seed. If so, I can tweak the warm-up for this test to let it run a little before checking output.

Does this always fail like this, or was it a one off?
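The warm-up idea above could look like the toy sketch below. `FlakyEffect` is a hypothetical stand-in, not asciimatics code: its early frames may render nothing, like rain particles that have not spawned yet, so the test advances it a few frames before asserting.

```python
import random
import unittest

class FlakyEffect:
    """Hypothetical stand-in for a randomised effect (not asciimatics code)."""
    def __init__(self):
        self.frames = 0

    def update(self):
        self.frames += 1
        if self.frames <= 5 and random.random() < 0.5:
            return ""                      # early frame: nothing rendered yet
        return random.choice(" `\\v")      # the characters the real test checks for

class TestWarmUp(unittest.TestCase):
    def test_effect_after_warm_up(self):
        effect = FlakyEffect()
        for _ in range(10):                # warm-up: run a few frames first
            effect.update()
        self.assertIn(effect.update(), " `\\v")
```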

bmwiedemann commented 1 year ago

Hmm. I cannot reproduce it anymore, so it probably really is more of a random failure, independent of the date. Maybe use a fixed random seed then or let the test retry on failure?
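The retry idea could be a small decorator. A minimal sketch, assuming the `retry` helper is written from scratch (it is hypothetical, not part of asciimatics or unittest):

```python
import functools

def retry(times=3):
    """Hypothetical decorator: re-run a flaky test up to `times` times before failing."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            last_error = None
            for _ in range(times):
                try:
                    return fn(*args, **kwargs)
                except AssertionError as exc:
                    last_error = exc       # remember the failure and try again
            raise last_error
        return inner
    return wrap

# Usage on a flaky check: passes as soon as one attempt succeeds.
calls = {"n": 0}

@retry(times=3)
def flaky_check():
    calls["n"] += 1
    assert calls["n"] >= 3                 # fails on the first two attempts
    return "ok"

print(flaky_check())  # ok
```

Retries hide genuine regressions more readily than a fixed seed does, which is one reason the seeding approach was preferred below.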

peterbrittain commented 1 year ago

Yeah - seeding the RNG is a good idea. I'll push something to master.

stale[bot] commented 1 year ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

peterbrittain commented 1 year ago

Naughty stalebot! I'll fix this...

stale[bot] commented 1 year ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

bmwiedemann commented 1 year ago

I added random.seed(42) in various places but still got random failures in TestParticles.test_drop_screen and test_rain, unless I added it here:

--- asciimatics-1.14.0.orig/asciimatics/effects.py
+++ asciimatics-1.14.0/asciimatics/effects.py
@@ -12,7 +12,7 @@ from builtins import object
 from builtins import range
 from future.utils import with_metaclass
 from abc import ABCMeta, abstractmethod, abstractproperty
-from random import randint, random, choice
+from random import randint, random, choice, seed
 from math import sin, cos, pi
 from asciimatics.paths import DynamicPath
 from asciimatics.screen import Screen
@@ -54,6 +54,7 @@ class Effect(with_metaclass(ABCMeta, obj
         :param stop_frame: Stop index for the effect.
         :param delete_count: Number of frames before this effect is deleted.
         """
+        seed(42)
         self._screen = screen
         self._start_frame = start_frame
         self._stop_frame = stop_frame
peterbrittain commented 1 year ago

Interesting... Given that the random numbers are deterministic once the seed is set, the only reason it wouldn't follow the same path is because of some systemic difference. Are you running the test with multiple threads, or something like that?

bmwiedemann commented 1 year ago

My guess is that randomness is used before the tests run random.seed(42), e.g. in https://github.com/peterbrittain/asciimatics/blob/31b3fe579af6ef08bcd8e01e33aeaf8d46805353/asciimatics/effects.py#L509

That would explain why it became deterministic when I added the seed early in asciimatics/effects.py.
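That guess is easy to demonstrate in isolation. In this sketch, `draws` is a stand-in for the random calls an effect makes, not asciimatics code: a single extra draw between seeding and the test body offsets the whole stream.

```python
import random

def draws(n=3):
    """Stand-in for the random numbers an effect consumes per frame."""
    return [random.random() for _ in range(n)]

random.seed(42)
baseline = draws()

random.seed(42)
random.random()            # an extra draw first, e.g. during object construction
shifted = draws()

# The stream is offset by one value: shifted starts where baseline's second draw was.
assert shifted[0] == baseline[1]
assert baseline != shifted
```

Seeding inside Effect.__init__, as in the diff above, works because it resets the stream after any such early draws have happened.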

peterbrittain commented 1 year ago

That doesn't quite make sense. Those should only get invoked once the class is constructed (i.e. in the test). I'd also like to avoid seeding inside the real code...

Did you try using the setUp() method of the unittest class to set the seed? Something like this:

import unittest
from random import seed

class TestParticles(unittest.TestCase):
    def setUp(self):
        seed(42)