mk-fg / python-pulse-control

Python high-level interface and ctypes-based bindings for PulseAudio (libpulse)
https://pypi.org/project/pulsectl/
MIT License

test_get_peak_sample sometimes fails with "AssertionError: 0 not greater than 0" #58

Closed: sbraz closed this issue 3 years ago

sbraz commented 3 years ago

Hi, if I run the test suite multiple times, I can see this test failing. Could it be because /dev/urandom doesn't contain reliable data?

Here's one trace with pytest:

self = <pulsectl.tests.test_with_dummy_instance.DummyTests testMethod=test_get_peak_sample>

    def test_get_peak_sample(self):
        # Note: this test takes at least multiple seconds to run
        with pulsectl.Pulse('t', server=self.sock_unix) as pulse:
                source_any = max(s.index for s in pulse.source_list())
                source_nx = source_any + 1

                time.sleep(0.3) # make sure previous streams die
                peak = pulse.get_peak_sample(source_any, 0.3)
                self.assertEqual(peak, 0)

                stream_started = list()
                def stream_ev_cb(ev):
                        if ev.t != 'new': return
                        stream_started.append(ev.index)
                        raise pulsectl.PulseLoopStop
                pulse.event_mask_set('sink_input')
                pulse.event_callback_set(stream_ev_cb)

                paplay = subprocess.Popen(
                        ['paplay', '--raw', '/dev/urandom'], env=dict(
                                PATH=os.environ['PATH'], XDG_RUNTIME_DIR=self.tmp_dir ) )
                try:
                        if not stream_started: pulse.event_listen()
                        stream_idx, = stream_started
                        si = pulse.sink_input_info(stream_idx)
                        sink = pulse.sink_info(si.sink)
                        source = pulse.source_info(sink.monitor_source)

                        # First poll can randomly fail if too short, probably due to latency or such
                        peak = pulse.get_peak_sample(sink.monitor_source, 3)
                        self.assertGreater(peak, 0)

                        peak = pulse.get_peak_sample(source.index, 0.3, si.index)
>                       self.assertGreater(peak, 0)
E      AssertionError: 0 not greater than 0

And another with plain unittest:

======================================================================
FAIL: test_get_peak_sample (pulsectl.tests.test_with_dummy_instance.DummyTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/var/tmp/portage/dev-python/pulsectl-21.5.6/work/pulsectl-21.5.6/pulsectl/tests/test_with_dummy_instance.py", line 610, in test_get_peak_sample
    self.assertGreater(peak, 0)
AssertionError: 0 not greater than 0

----------------------------------------------------------------------
Ran 16 tests in 9.657s

FAILED (failures=1)
mk-fg commented 3 years ago

If I run the test suite multiple times, I can see this test failing. Could it be because /dev/urandom doesn't contain reliable data?

Yeah, given the relatively small interval there's a chance of rolling something undetectable there; it should probably be replaced by some PRNG output from a pre-determined seed.

mk-fg commented 3 years ago

Should be fixed by 29125d8. Not sure if there's a faster PRNG than just sha512 in a loop that works on both py2 and py3.
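
For reference, a minimal sketch of that kind of seeded generator, chaining sha512 digests from a fixed seed so every run feeds byte-identical noise to paplay --raw. The function name, seed value, output path and size here are made up for illustration, not the actual test code:

    import hashlib

    def prng_noise(seed=b'pulsectl-test', size=5 * 2**20):
        # Chain sha512 digests from a fixed seed until enough bytes are produced,
        # so the generated "white noise" is byte-identical on every run.
        buf, block = bytearray(), seed
        while len(buf) < size:
            block = hashlib.sha512(block).digest()
            buf.extend(block)
        return bytes(buf[:size])

    # Dump it to a raw file for paplay --raw to play back during the test.
    with open('test-noise.raw', 'wb') as dst:
        dst.write(prng_noise())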

mk-fg commented 3 years ago

It randomly occurred to me that copying what the earlier code did for no reason might not be the best approach, and that I'd messed up the numbers a bit and ended up making an 80M file via sha512. Replaced it with a 0xff-padded 5M file, which should be long enough and seems to work output-wise.
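
For illustration, the replacement described above amounts to something like this (file name is illustrative, not necessarily what the commit writes):

    # Write a ~5M file of repeated 0xff bytes for paplay --raw to play back.
    with open('test-noise.raw', 'wb') as dst:
        dst.write(b'\xff' * (5 * 2**20))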

sbraz commented 3 years ago

No worries, thanks for fixing it :) Could you please tag releases on git to make it easier to compare them?

sbraz commented 3 years ago

Tests still fail with git master:

======================================================================
FAIL: test_get_peak_sample (pulsectl.tests.test_with_dummy_instance.DummyTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/var/tmp/portage/dev-python/pulsectl-9999/work/pulsectl-9999/pulsectl/tests/test_with_dummy_instance.py", line 615, in test_get_peak_sample
    self.assertEqual(peak, 0)
AssertionError: 3.0517578125e-05 != 0

----------------------------------------------------------------------
Ran 16 tests in 10.252s
mk-fg commented 3 years ago

No worries, thanks for fixing it :) Could you please tag releases on git to make it easier to compare them?

I'd have to tag almost every commit then, which doesn't seem like a good idea.

Tests still fail with git master:

Hm, guess I'll check what sound the 0xff padding translates to, but if it's not something too faint I'll probably just disable that test.

mk-fg commented 3 years ago

Hm, guess I'll check what sound the 0xff padding translates to, but if it's not something too faint I'll probably just disable that test.

Yeah, that was probably the issue: no actual sound frequencies or anything in there, just the same sample repeated. Reverted back to the sha512-prng for consistent white noise; it should work the same on every run, I think, but if it's still unreliable and easy to reproduce, maybe try raising the timeouts in the get_peak_sample calls within that test and see if that immediately helps?
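
As a rough sanity check of why the 0xff padding came out as near-silence (this assumes paplay --raw defaults to signed 16-bit samples, which isn't stated in the thread):

    import struct

    # Each byte pair of the 0xff padding decodes to the 16-bit sample 0xffff == -1,
    # i.e. a constant DC level one LSB below zero, with no frequency content at all.
    sample, = struct.unpack('<h', b'\xff\xff')
    print(sample, abs(sample) / 32768.0)  # -1 3.0517578125e-05

That normalized value matches the 3.0517578125e-05 from the assertEqual failure quoted earlier, which fits the "just the same sample repeated" explanation.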

Shouldn't be a big deal to just disable this buggy test instead, if these things are more trouble to maintain than the code they're supposed to help with :)

sbraz commented 3 years ago

The sha512-prng version can fail a different part of the test:

Traceback (most recent call last):
  File "/var/tmp/portage/dev-python/pulsectl-21.5.11/work/pulsectl-21.5.11/pulsectl/tests/test_with_dummy_instance.py", line 622, in test_get_peak_sample
    self.assertEqual(peak, 0)
AssertionError: 0.965850830078125 != 0

I'll simply disable it for now.

I'd have to tag almost every commit then, which doesn't seem like a good idea.

I just mean the recent versions, not older ones of course. And I wouldn't be shocked if you had a git tag every time you release a new version.

sbraz commented 3 years ago

Ah, I see you literally have one version per commit, so I don't know what the right solution is. But in the future, when I bump the package and want to compare versions, I'd like an easy way to view all commits between the two.

mk-fg commented 3 years ago

Default-disabled it in 29135cf as well. I suspect it might be due to audio delays and whatnot on a specific system, and I don't really know what to raise these timeouts to without making the whole thing annoyingly slow, so it should be fine to just disable it.
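
For packagers wondering what a default-disabled test usually looks like, it is typically an opt-in skip guard along these lines; the environment variable name below is made up, not necessarily what 29135cf uses:

    import os, unittest

    class DummyTests(unittest.TestCase):

        @unittest.skipUnless(
            os.environ.get('PULSECTL_TEST_PEAKS'),  # hypothetical opt-in switch
            'Flaky peak-detection test, set PULSECTL_TEST_PEAKS=1 to enable it' )
        def test_get_peak_sample(self):
            ...  # body as shown in the traceback earlier in this issue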

mk-fg commented 3 years ago

when I bump the package and want to compare versions, I'd like an easy way to view all commits between the two

Maybe git has some kind of "N commits from date" commit-spec? That'd work with this version scheme.

Otherwise I'd probably just look at the diff between year-months, as versions correspond to those; there aren't that many commits, and if you're looking at the log manually anyway, there shouldn't be much need to be more precise.