Open ghost opened 10 years ago
hmm interesting, issuing "jack_samplerate" returns "44100", while Cadence shows "48000" as the current sample rate. The log also says the sample rate was set to "44100". I wonder where Cadence gets its current sample rate information from? Maybe the bug is just Cadence displaying the wrong sample rate, while under the hood everything is working correctly?
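For what it's worth, this is roughly how I cross-check the engine against the driver settings (a sketch only: the two functions at the top are stand-ins so it runs anywhere, and the fake "dp" output format is my guess — on a real box delete them and let the actual tools run):

```shell
# Compare the rate the running engine reports with the driver parameter via DBus.
# Stand-ins below replace the real CLI tools so this snippet is self-contained:
jack_samplerate() { echo 44100; }                      # stand-in for the real tool
jack_control()    { echo "rate: 44100 (default 48000)"; }  # stand-in, invented output

engine_rate=$(jack_samplerate)       # what the running engine actually uses
jack_control dp | grep -i rate       # what the selected driver is configured for
echo "engine rate: $engine_rate"
```

If the two disagree, that would point at Cadence reading the driver parameter rather than asking the engine.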
My master machine is configured with a Buffer Size of 512 samples and a Sample Rate of 44100 Hz. When a client machine connects to the master, it is supposed to retrieve the master's Buffer Size and Sample Rate and set its own to match. Usually this happens, but sometimes (about 1 time in 3) it does not.
For instance, I tell Cadence to "Force Restart", then I see the Buffer Size switch to 1024, then switch to 512 samples, as expected. But the Sample Rate sometimes switches to 44100 Hz and sometimes doesn't, ending up at 48000 Hz instead. Obviously, this completely undermines synchronization between master and slave.
Surprisingly, no error pops up in this case, and otherwise the master and the client appear to be connected correctly.
To work around it, I have to "Force Restart" one or more times until the Sample Rate is set correctly.
My ~/.config/jack/conf.xml: http://pastebin.com/NhWijabk
Log as registered by Cadence "Logs": http://pastebin.com/v1DXed9q
It seems the "net" driver is loaded correctly, but then for some reason the settings for the "alsa" driver are also loaded, which might (?) somehow confuse jack.
If this behavior is in fact a jack bug, maybe as a workaround Cadence could re-select the correct driver with "jack_control ds" after loading the settings?
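Until then, the manual dance I do could be scripted along these lines (a sketch under my setup's assumptions: 44100 Hz target, a retry limit of 5 that I picked arbitrarily, and stand-in functions at the top so the sketch is self-contained — on a real system they'd be the actual `jack_control` and `jack_samplerate` binaries):

```shell
# Stand-ins for the real CLI tools, so this sketch runs anywhere:
jack_control()    { echo "jack_control $*"; }  # stand-in for the real CLI
jack_samplerate() { echo 44100; }              # stand-in: report current engine rate

WANTED_RATE=44100   # the master's rate (my setup)
tries=0
rate=""
while [ "$rate" != "$WANTED_RATE" ] && [ "$tries" -lt 5 ]; do
  jack_control ds net >/dev/null   # re-select the net driver after settings load
  jack_control stop  >/dev/null
  jack_control start >/dev/null
  rate=$(jack_samplerate)          # check what the engine actually came up at
  tries=$((tries + 1))
done
echo "rate=$rate after $tries tries"
```

Essentially just "Force Restart until the rate sticks", which is what I end up doing by hand anyway.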