GoXLR-on-Linux / goxlr-utility

An unofficial GoXLR App replacement for Linux, Windows and MacOS

Error while trying to run the daemon #26

Closed: jlimas closed this issue 1 year ago

jlimas commented 1 year ago

Hello! I just found out about this utility after using the goxlr script from the other repo for a long time. This is awesome, and thanks for the work!

I was able to compile the daemon and set everything up, and it all seemed to be working. But today I restarted the PC, and when I try to start the daemon to make some changes I get the following message:

20:15:22 [INFO] Starting 8 workers
20:15:22 [INFO] Started GoXLR configuration interface at http://127.0.0.1:14564/
20:15:22 [INFO] Tokio runtime found; starting in existing Tokio runtime
20:15:24 [INFO] Connected to possible GoXLR device at Bus 001 Device 010: ID 1220:8fe4
20:15:24 [INFO] Configuring GoXLR Mini, Profile: Go XLR Linux, Mic Profile: Rode Podmic
thread 'tokio-runtime-worker' panicked at 'insertion index (is 2) should be <= len (is 1)', library/alloc/src/vec/mod.rs:1347:13
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
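For context, that panic message comes straight from the Rust standard library: Vec::insert requires the insertion index to be at most the vector's current length. A minimal sketch (illustrative only, not the GoXLR code) that triggers the exact same message:

    fn main() {
        let mut v = vec![0u8]; // len is 1
        // Panics: 'insertion index (is 2) should be <= len (is 1)'
        v.insert(2, 1);
    }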

This is the output with the backtrace enabled, as suggested:

RUST_BACKTRACE=1 goxlr-daemon
20:17:53 [INFO] Starting 8 workers
20:17:53 [INFO] Started GoXLR configuration interface at http://127.0.0.1:14564/
20:17:53 [INFO] Tokio runtime found; starting in existing Tokio runtime
20:17:54 [INFO] Connected to possible GoXLR device at Bus 001 Device 010: ID 1220:8fe4
20:17:54 [INFO] Configuring GoXLR Mini, Profile: Go XLR Linux, Mic Profile: Rode Podmic
thread 'tokio-runtime-worker' panicked at 'insertion index (is 2) should be <= len (is 1)', library/alloc/src/vec/mod.rs:1347:13
stack backtrace:
   0: rust_begin_unwind
             at /rustc/e092d0b6b43f2de967af0887873151bb1c0b18d3/library/std/src/panicking.rs:584:5
   1: core::panicking::panic_fmt
             at /rustc/e092d0b6b43f2de967af0887873151bb1c0b18d3/library/core/src/panicking.rs:142:14
   2: alloc::vec::Vec<T,A>::insert::assert_failed
             at /rustc/e092d0b6b43f2de967af0887873151bb1c0b18d3/library/alloc/src/vec/mod.rs:1347:13
   3: alloc::vec::Vec<T,A>::insert
   4: goxlr_profile_loader::profile::ProfileSettings::load
   5: goxlr_profile_loader::profile::Profile::load
   6: goxlr_daemon::profile::ProfileAdapter::from_named
   7: goxlr_daemon::profile::ProfileAdapter::from_named_or_default
   8: goxlr_daemon::device::Device<T>::new
   9: goxlr_daemon::primary_worker::handle_changes::{{closure}}
  10: tokio::runtime::task::core::CoreStage<T>::poll
  11: tokio::runtime::task::harness::Harness<T,S>::poll
  12: std::thread::local::LocalKey<T>::with
  13: tokio::runtime::thread_pool::worker::Context::run_task
  14: tokio::runtime::thread_pool::worker::Context::run
  15: tokio::macros::scoped_tls::ScopedKey<T>::set
  16: tokio::runtime::thread_pool::worker::run
  17: <tokio::runtime::blocking::task::BlockingTask<T> as core::future::future::Future>::poll
  18: tokio::runtime::task::harness::Harness<T,S>::poll
  19: tokio::runtime::blocking::pool::Inner::run
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
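Reading the trace bottom-up, the panic fires inside goxlr_profile_loader::profile::ProfileSettings::load (frame 4) via Vec::insert (frame 3), so the loader appears to be inserting a profile entry at an index beyond the list built so far. A guess at the shape of the bug and a defensive fix, with purely illustrative names (this is not the actual goxlr-profile-loader source):

    // Hypothetical sketch: `parsed` pairs an index read from the profile
    // file with the entry that should sit at that position.
    fn load_entries(parsed: Vec<(usize, String)>) -> Vec<String> {
        let mut entries = Vec::new();
        for (index, entry) in parsed {
            // Vec::insert panics if `index` jumps past entries.len(),
            // e.g. a profile listing position 2 before position 1 exists.
            // Clamping the index avoids the panic:
            let at = index.min(entries.len());
            entries.insert(at, entry);
        }
        entries
    }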

Let me know if you also need the backtrace full output, thanks again!

Edit: Forgot to mention that the UI only displays this error when trying to connect. [screenshot attached]

FrostyCoolSlug commented 1 year ago

Hi, do you have the ability to compile and test this with the dev-0.4.0 branch? I've made some fixes to profile loading which may address this.
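For anyone following along, building and running a branch usually looks something like this (steps are a sketch and may differ from the project's documented build process):

    git clone https://github.com/GoXLR-on-Linux/goxlr-utility.git
    cd goxlr-utility
    git checkout dev-0.4.0
    cargo build --release
    ./target/release/goxlr-daemon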

Thanks

jlimas commented 1 year ago

Awesome! Let me try that.

I just noticed that if I remove the profile it works again. I kept a backup, so let me compile the branch and I'll report back.
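(For reference, by "remove the profile" I mean moving the saved profile file out of the daemon's data directory; the path and filename below are assumptions from my setup and may differ on yours.)

    # Path and extension are assumptions; check where your daemon keeps profiles.
    mkdir -p ~/goxlr-profile-backup
    mv ~/.local/share/goxlr-utility/profiles/'Go XLR Linux.goxlr' ~/goxlr-profile-backup/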

jlimas commented 1 year ago

Yes! I was able to load my other profile after building from the dev-0.4.0 branch. Thanks for the quick response!

FrostyCoolSlug commented 1 year ago

Perfect, and no problem! I'll look into getting those fixes into main soon; I just have one more bug to fix :)