Spotifyd / spotifyd

A spotify daemon
https://spotifyd.rs
GNU General Public License v3.0

mpris interface unresponsive (maybe due to credential cache panic) #1246

Open JJ-Author opened 11 months ago

JJ-Author commented 11 months ago

Description: the MPRIS interface on the system bus becomes unresponsive after some idle time overnight (the instance is still visible, but not introspectable).

Running `busctl introspect org.mpris.MediaPlayer2.spotifyd.instance365 /org/mpris/MediaPlayer2` fails with `Failed to introspect object /org/mpris/MediaPlayer2 of service org.mpris.MediaPlayer2.spotifyd.instance365: Connection timed out`.

There is an error about credential caching in the log which I do not understand.

Then I tried this busctl call: `busctl call org.mpris.MediaPlayer2.spotifyd.instance365 /rs/spotifyd/Controls rs.spotifyd.Controls TransferPlayback`, which failed with `Call failed: Connection timed out`. After that, spotifyd crashed and was restarted by my systemd config; once restarted, it was working properly again.

To Reproduce

  1. Start spotifyd fresh with MPRIS on the system bus.
  2. Play music with the Android Spotify app and select spotifyd as the device.
  3. Pause the music and wait 8 hours without letting anything else connect to Spotify.
  4. Run the first busctl introspect command for your instance; it times out (I tried 3 times).
  5. Run the second busctl command; it times out again.
  6. Check the logs; in my case spotifyd had crashed and then started working again.

Expected behavior: MPRIS should be introspectable and responsive all the time. If there is a Spotify API login problem, spotifyd should automatically try to log in again.

Logs

```
Dec 13 09:59:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:01:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:03:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:05:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:06:34 DietPi spotifyd[365]: Token cache write ignored (not configured)
Dec 13 10:06:34 DietPi spotifyd[365]: The application panicked (crashed).
Dec 13 10:06:34 DietPi spotifyd[365]: Message: RSpotify not authenticated
Dec 13 10:06:34 DietPi spotifyd[365]: Location: /root/.cargo/registry/src/index.crates.io-1cd66030c949c28d/rspotify-0.12.0/src/clients/base.rs:106
Dec 13 10:06:34 DietPi spotifyd[365]: Backtrace omitted. Run with RUST_BACKTRACE=1 environment variable to display it.
Dec 13 10:06:34 DietPi spotifyd[365]: Run with RUST_BACKTRACE=full to include source snippets.
Dec 13 10:07:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:09:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:11:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:13:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:15:14 DietPi spotifyd[365]: kMessageTypePlay "" webapi-44219fb334174bc6b2c634d8f9e4f6eb 1651865768 1702457260546 kPlayStatusPause
Dec 13 10:15:14 DietPi spotifyd[365]: Sending status to server: [kPlayStatusStop]
Dec 13 10:15:14 DietPi spotifyd[365]: The application panicked (crashed).
Dec 13 10:15:14 DietPi spotifyd[365]: Message: called `Result::unwrap()` on an `Err` value: PoisonError { .. }
Dec 13 10:15:14 DietPi spotifyd[365]: Location: src/dbus_mpris.rs:84
Dec 13 10:15:14 DietPi spotifyd[365]: Backtrace omitted. Run with RUST_BACKTRACE=1 environment variable to display it.
Dec 13 10:15:14 DietPi spotifyd[365]: Run with RUST_BACKTRACE=full to include source snippets.
Dec 13 10:15:14 DietPi spotifyd[365]: drop Spirc[0]
Dec 13 10:15:14 DietPi spotifyd[365]: Shutting down player thread ...
Dec 13 10:15:14 DietPi spotifyd[365]: drop PlayerInternal[0]
Dec 13 10:15:14 DietPi spotifyd[365]: PlayerInternal thread finished.
Dec 13 10:15:14 DietPi spotifyd[365]: drop Session[0]
Dec 13 10:15:14 DietPi spotifyd[365]: drop MercuryManager
Dec 13 10:15:14 DietPi spotifyd[365]: drop Dispatch
Dec 13 10:15:14 DietPi systemd[1]: spotifyd.service: Main process exited, code=exited, status=101/n/a
Dec 13 10:15:14 DietPi systemd[1]: spotifyd.service: Failed with result 'exit-code'.
Dec 13 10:15:19 DietPi systemd[1]: spotifyd.service: Scheduled restart job, restart counter is at 1.
Dec 13 10:15:19 DietPi systemd[1]: Stopped Spotifyd (DietPi).
Dec 13 10:15:19 DietPi systemd[1]: Started Spotifyd (DietPi).
Dec 13 10:15:19 DietPi spotifyd[978]: Loading config from "/mnt/dietpi_userdata/spotifyd/spotifyd.conf"
Dec 13 10:15:19 DietPi spotifyd[978]: CliConfig { config_path: Some("/mnt/dietpi_userdata/spotifyd/spotifyd.conf"), no_daemon: true, verbose: true, pid: None, shared_config: SharedConfigValues { username: Some("taken out for privacy"), username_cmd: None, password: None, password_cmd: None, use_keyring: false, use_mpris: Some(true), dbus_type: Some(System), on_song_change_hook: None, cache_path: Some("/mnt/dietpi_userdata/spotifyd/cache"), no-audio-cache: true, backend: Some(Alsa), volume_controller: Some(SoftVolume), device: None, control: None, mixer: None, device_name: Some("DietPi"), bitrate: Some(Bitrate320), audio_format: None, initial_volume: Some("40"), volume_normalisation: false, normalisation_pregain: None, zeroconf_port: None, proxy: None, device_type: Some(Speaker), autoplay: true, max_cache_size: None } }
Dec 13 10:15:19 DietPi spotifyd[978]: Found user shell: Some("/bin/sh")
Dec 13 10:15:19 DietPi spotifyd[978]: No password specified. Checking password_cmd
Dec 13 10:15:19 DietPi spotifyd[978]: No password_cmd specified
Dec 13 10:15:19 DietPi spotifyd[978]: No proxy specified
Dec 13 10:15:19 DietPi spotifyd[978]: Using software volume controller.
Dec 13 10:15:19 DietPi spotifyd[978]: Ignoring blacklisted access point ap-gew4.spotify.com:4070
Dec 13 10:15:19 DietPi spotifyd[978]: Ignoring blacklisted access point ap-gew4.spotify.com:443
Dec 13 10:15:19 DietPi spotifyd[978]: Ignoring blacklisted access point ap-gew4.spotify.com:80
Dec 13 10:15:19 DietPi spotifyd[978]: Ignoring blacklisted access point ap-gue1.spotify.com:443
Dec 13 10:15:19 DietPi spotifyd[978]: Failed to resolve Access Point: Unable to resolve any viable access points.
Dec 13 10:15:19 DietPi spotifyd[978]: Using fallback "ap.spotify.com:443"
Dec 13 10:15:19 DietPi spotifyd[978]: Connecting to AP "ap.spotify.com:443"
Dec 13 10:15:20 DietPi spotifyd[978]: Authenticated as "xxxxx" !
Dec 13 10:15:20 DietPi spotifyd[978]: new Session[0]
Dec 13 10:15:20 DietPi spotifyd[978]: Session[0] strong=2 weak=1
```

Compilation flags

Versions (please complete the following information):

danielmcmillan commented 7 months ago

I see the same issue. I started playback using the TransferPlayback dbus method and after some time (much less than 8 hours) the log reports a panic, but the process is still alive so it doesn't get restarted. It didn't crash again when doing introspect and TransferPlayback, but as soon as I opened the Spotify app on another machine I got the second panic and the process died.

Here it is with the backtrace enabled

Backtrace

```
Couldn't fetch metadata from spotify: Nothing playing at the moment.
Token cache write ignored (not configured)
The application panicked (crashed).
Message: RSpotify not authenticated
Location: /home/daniel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/rspotify-0.12.0/src/clients/base.rs:106
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ BACKTRACE ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
⋮ 8 frames hidden ⋮
 9: core::panicking::panic_display::h2c6e530b26584522 at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/core/src/panicking.rs:196
10: core::panicking::panic_str::h4c43cc2df5473fd1 at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/core/src/panicking.rs:171
11: core::option::expect_failed::h39c56eaa3bce6881 at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/core/src/option.rs:1988
12: rspotify::clients::base::BaseClient::auth_headers::hadc1668e0558c207 at :
13: rspotify::clients::base::BaseClient::api_get::h9b23681b5b7dd344 at :
14: spotifyd::dbus_mpris::get_device_id::h325050474d62ce30 at :
15: dbus_crossroads::ifacedesc::IfaceBuilder::method_with_cr::{{closure}}::ha001171c8a86d58c at :
16: dbus_crossroads::crossroads::Crossroads::handle_message_inner::hdd316cc5c3a84b5e at :
17: dbus_crossroads::crossroads::Crossroads::handle_message::hda6f3896ef4ed02f at :
18: spotifyd::dbus_mpris::create_dbus_server::{{closure}}::{{closure}}::h2f08b8943cbbdcd2 at :
19: ::process_one::h1c8860bd6188ec2d at :
20: as core::future::future::Future>::poll::h58d27e434e2502c6 at :
21: spotifyd::dbus_mpris::create_dbus_server::{{closure}}::{{closure}}::h88aa61c939a0c92d at :
22: tokio::runtime::task::core::Core::poll::h2e2c8cdcb2917e71 at :
23: tokio::runtime::task::harness::Harness::poll::h9fec6ef6669f3748 at :
24: tokio::runtime::scheduler::multi_thread::worker::Context::run_task::h866b916a77bae828 at :
25: tokio::runtime::scheduler::multi_thread::worker::Context::run::h08ee4a2fe6d3662a at :
26: tokio::runtime::context::scoped::Scoped::set::h947f25f0a8746071 at :
27: tokio::runtime::context::runtime::enter_runtime::h825bec3258685d62 at :
28: tokio::runtime::scheduler::multi_thread::worker::run::h5c562ff06d9e8e2c at :
29: as core::future::future::Future>::poll::hba477662a9b8bfe5 at :
30: tokio::runtime::task::core::Core::poll::h4ab980a4bdf7aac5 at :
31: tokio::runtime::task::harness::Harness::poll::h2c319639f591823e at :
32: tokio::runtime::blocking::pool::Inner::run::hffcdd47c378bf96a at :
33: std::sys_common::backtrace::__rust_begin_short_backtrace::ha869a53ca4b5b754 at :
34: core::ops::function::FnOnce::call_once{{vtable.shim}}::h75b4f91f1a148f7f at :
35: as core::ops::function::FnOnce>::call_once::h65e29fa99a122141 at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/alloc/src/boxed.rs:2015
36: as core::ops::function::FnOnce>::call_once::h0e96a7fd708b99fb at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/alloc/src/boxed.rs:2015
37: std::sys::pal::unix::thread::Thread::new::thread_start::h1a4b48df802ee0a8 at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/sys/pal/unix/thread.rs:108
38: start_thread at ./nptl/pthread_create.c:442
39: thread_start at ./misc/../sysdeps/unix/sysv/linux/aarch64/clone.S:79
Run with COLORBT_SHOW_HIDDEN=1 environment variable to disable frame filtering.
The application panicked (crashed).
Message: called `Result::unwrap()` on an `Err` value: PoisonError { .. }
Location: src/dbus_mpris.rs:84
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ BACKTRACE ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
⋮ 9 frames hidden ⋮
10: ::poll::h9b68c668beb53c18 at :
11: as core::future::future::Future>::poll::hb787d52c83951b93 at :
12: tokio::runtime::runtime::Runtime::block_on::h5c2d6b3a3959e000 at :
13: spotifyd::main::h9bb04d2e3cab8d76 at :
14: std::sys_common::backtrace::__rust_begin_short_backtrace::h50466c551e020831 at :
15: std::rt::lang_start::{{closure}}::hf00986dc282a29a9 at :
16: core::ops::function::impls:: for &F>::call_once::hdbbf4bc5b948549a at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/core/src/ops/function.rs:284
17: std::panicking::try::do_call::hd9aa31d1e8310f6a at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/panicking.rs:554
18: std::panicking::try::h5b4be1ce5148e504 at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/panicking.rs:518
19: std::panic::catch_unwind::h66d85bc43d2d9f6e at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/panic.rs:142
20: std::rt::lang_start_internal::{{closure}}::h2b8a5b503f27f998 at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/rt.rs:148
21: std::panicking::try::do_call::hd7f77e71a4e73fa4 at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/panicking.rs:554
22: std::panicking::try::h63f06620cf880361 at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/panicking.rs:518
23: std::panic::catch_unwind::h4b03c83abc824798 at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/panic.rs:142
24: std::rt::lang_start_internal::h0ddfe42b2029814a at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/rt.rs:148
25: main at :
26: __libc_start_call_main at ./csu/../sysdeps/nptl/libc_start_call_main.h:58
27: __libc_start_main_impl at ./csu/../csu/libc-start.c:360
```
eladyn commented 7 months ago

Hmm, I think I see the problem in the current code: When the application is idle, we currently don't refresh the token and as such, the dbus calls make use of an invalid token.
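
To illustrate the idea, here is a minimal sketch of the missing guard: check token validity before every MPRIS-triggered Web API call and refresh on demand. The types and the `fetch_fresh_token` helper are hypothetical stand-ins, not spotifyd's actual code (rspotify's real `Token` tracks expiry in a similar way).

```rust
use std::time::{Duration, Instant};

/// Hypothetical cached Web API token for the sketch.
struct CachedToken {
    access_token: String,
    expires_at: Instant,
}

impl CachedToken {
    fn is_expired(&self) -> bool {
        Instant::now() >= self.expires_at
    }
}

/// Hypothetical stand-in for however spotifyd obtains a fresh Web API token.
fn fetch_fresh_token() -> CachedToken {
    CachedToken {
        access_token: "fresh-access-token".into(),
        expires_at: Instant::now() + Duration::from_secs(3600),
    }
}

/// Refresh on demand, so an idle daemon can still serve MPRIS calls
/// instead of panicking with "RSpotify not authenticated".
fn ensure_valid(token: &mut CachedToken) {
    if token.is_expired() {
        *token = fetch_fresh_token();
    }
}

fn main() {
    let mut token = fetch_fresh_token();
    ensure_valid(&mut token);
    println!("calling the Web API with {}", token.access_token);
}
```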

linknum23 commented 5 months ago

We are seeing the same thing with our recent build, when MPRIS is polled but spotifyd has been idle for a while.

amplipi authbind[3093]: The application panicked (crashed).
May 06 15:09:59 amplipi authbind[3093]: Message:  called `Result::unwrap()` on an `Err` value: PoisonError { .. }
May 06 15:09:59 amplipi authbind[3093]: Location: src/dbus_mpris.rs:84
May 06 15:09:59 amplipi authbind[3093]: Backtrace omitted. Run with RUST_BACKTRACE=1 environment variable to display it.
May 06 15:09:59 amplipi authbind[3093]: Run with RUST_BACKTRACE=full to include source snippets.
2opremio commented 1 month ago

I am experiencing the same problem and I am willing to take a stab at it. @eladyn any pointers on where/how to solve it?

eladyn commented 1 month ago

@2opremio I'm definitely not immersed in the issue right now, but if I recall correctly, the problem is that the API token is only refreshed when the Player sends out PlayerEvents. When there is no playback and only MPRIS calls, however, no refreshing takes place. If you need more guidance, I can have a look tomorrow, or you can join the Matrix room so we can discuss it there.
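
For illustration, one possible direction is a periodic refresh that does not depend on PlayerEvents. This is only a sketch under the assumption that a timer-based refresh is acceptable; `refresh_api_token` is a hypothetical helper standing in for whatever spotifyd would actually do to obtain and install a fresh token.

```rust
// Requires tokio with the `rt-multi-thread`, `macros`, and `time` features.
use std::time::Duration;

/// Hypothetical helper: fetch a new Web API token and hand it to the rspotify client.
async fn refresh_api_token() -> Result<(), Box<dyn std::error::Error>> {
    // Placeholder body for the sketch.
    Ok(())
}

#[tokio::main]
async fn main() {
    // Spotify access tokens are short-lived (roughly an hour), so refresh well before expiry.
    let mut interval = tokio::time::interval(Duration::from_secs(30 * 60));
    loop {
        interval.tick().await;
        if let Err(e) = refresh_api_token().await {
            eprintln!("token refresh failed: {e}");
        }
    }
}
```

In spotifyd itself this would presumably run as a background task on the existing tokio runtime rather than in `main`.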

linknum23 commented 1 month ago

The workaround for us was to subscribe to the PropertiesChanged event on the MPRIS interface instead of just polling on some interval. Honestly, this is probably the better way to do it anyway. If I recall correctly, by the time the PropertiesChanged event is fired, the token has already been refreshed. Here's a link to the Python we use to interface with MPRIS now: https://github.com/micro-nova/AmpliPi/blob/main/streams/MPRIS_metadata_reader.py
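
For reference, here is a rough Rust equivalent of that subscription using the `dbus` crate (the same D-Bus bindings spotifyd itself builds on). It listens for PropertiesChanged on the system bus, matching the `dbus_type = System` configuration from this issue, instead of polling; this is a sketch, not the code linked above.

```rust
use std::time::Duration;

use dbus::blocking::Connection;
use dbus::message::MatchRule;

fn main() -> Result<(), dbus::Error> {
    // The spotifyd instance in this issue runs its MPRIS interface on the system bus.
    let conn = Connection::new_system()?;

    // Listen for PropertiesChanged signals from any MPRIS player object
    // instead of polling the interface on a timer.
    let rule = MatchRule::new_signal("org.freedesktop.DBus.Properties", "PropertiesChanged")
        .with_path("/org/mpris/MediaPlayer2");

    conn.add_match(rule, |_: (), _, msg| {
        println!("PropertiesChanged: {:?}", msg);
        true // keep the subscription alive
    })?;

    // Pump the connection so the callback gets invoked.
    loop {
        conn.process(Duration::from_millis(1000))?;
    }
}
```

The `dbus` crate links against libdbus, so that needs to be installed on the system.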

JJ-Author commented 1 month ago

Thanks for sharing @linknum23, but I honestly don't understand how this would be a workaround for the issue I described. The issue does not involve polling, and I was not really interested in property changes; I wanted to call D-Bus functions and let a simple, custom piece of media-controller software inspect the current MPRIS players, their capabilities, the functions available, and the identity string that exists for each client.

As a note / food for thought w.r.t. your use case (track metadata?): given the indication that the issue is rooted in invalid credentials/tokens when fetching metadata via the API and exposing it via MPRIS, I doubt that the PropertiesChanged event is triggered once the cached token is invalid, because spotifyd cannot fetch new metadata from the API and therefore would not know about any changes. I agree that once the token is fresh it can detect changes and likely emit the signal, but the token will not refresh on its own without further external stimuli (e.g. controlling via Spotify Connect); that seems to be the issue. But maybe that external stimulus is inherent to your use case, which could be why it seems to work for you.

linknum23 commented 1 month ago

Sorry, I probably could have given more context on that. You are correct, our use case is tracking the session metadata. For us, using PropertiesChanged avoids touching the D-Bus interface while it is in this bad state. A play request, either via Spotify Connect or through spotifyd, provides the external stimulus in our case.