JJGadgets opened 2 years ago
Hi, thanks for the report.
Do you have the WAYLAND_DISPLAY env variable set? With what value?
WAYLAND_DISPLAY is wayland-1, set using exec dbus-update-activation-environment DISPLAY SWAYSOCK WAYLAND_DISPLAY XDG_CURRENT_DESKTOP=sway in Sway's config. I tried unsetting it, but that doesn't change anything.
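For what it's worth, one way to double-check what a running process actually sees (as opposed to the current shell) is to inspect its environment under /proc; the pgrep lookup below is illustrative and assumes a single wluma instance:

```shell
# Value in the current shell:
printenv WAYLAND_DISPLAY

# Value as seen by the running wluma process:
tr '\0' '\n' < "/proc/$(pgrep -x wluma)/environ" | grep '^WAYLAND_DISPLAY='
```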
Restarting Sway seems to have fixed it (I had unsaved work prior), but now I get a new error.
thread 'predictor-eDP-1' panicked at 'Unable to compute luma percent: TIMEOUT', src/frame/capturer/wlroots.rs:128:26
stack backtrace:
0: 0x557ddbec46b0 - <std::sys_common::backtrace::_print::DisplayBacktrace as core::fmt::Display>::fmt::h18674753585b8cc5
1: 0x557ddbee716c - core::fmt::write::h650970577346edc1
2: 0x557ddbec0fb5 - std::io::Write::write_fmt::hcdbe4458fe2ebdfb
3: 0x557ddbec64cb - std::panicking::default_hook::{{closure}}::h2b3ca2e1a25274db
4: 0x557ddbec6043 - std::panicking::default_hook::hb3d80776b693aaeb
5: 0x557ddbd5933e - wluma::main::{{closure}}::hac5ad5b4efe56de1
6: 0x557ddbec6bc9 - std::panicking::rust_panic_with_hook::habc6079310c0728a
7: 0x557ddbec6670 - std::panicking::begin_panic_handler::{{closure}}::ha449aee990d62948
8: 0x557ddbec4b54 - std::sys_common::backtrace::__rust_end_short_backtrace::h54cc540f2a5a6bf2
9: 0x557ddbec65d9 - rust_begin_unwind
10: 0x557ddbd33231 - core::panicking::panic_fmt::he85288327cd30385
11: 0x557ddbd33323 - core::result::unwrap_failed::ha180eafd08eaf142
12: 0x557ddbd7ae2e - wayland_client::proxy::Main<I>::quick_assign::{{closure}}::h5d0332f5ba01d6f6
13: 0x557ddbd7b886 - wayland_commons::filter::Filter<E>::send::h4188a2956c92af27
14: 0x557ddbd6a3a5 - wayland_client::imp::proxy::ProxyInner::assign::{{closure}}::h894514efdf4d0a49
15: 0x557ddbe3eafe - scoped_tls::ScopedKey<T>::with::h18d79b63d6ea52a1
16: 0x557ddbe3d3df - wayland_client::imp::proxy::proxy_dispatcher::{{closure}}::h3a388c2a97c405fe
17: 0x557ddbe40d0a - wayland_client::imp::proxy::proxy_dispatcher::h59caa90d4624e586
18: 0x7f1dfce4b04c - <unknown>
19: 0x7f1dfce4b2cc - wl_display_dispatch_queue_pending
20: 0x557ddbd7a433 - scoped_tls::ScopedKey<T>::set::h2c340a82916e5f54
21: 0x557ddbd5cc09 - <wluma::frame::capturer::wlroots::Capturer as wluma::frame::capturer::Capturer>::run::h3171b8c751a86480
22: 0x557ddbd5ee46 - std::sys_common::backtrace::__rust_begin_short_backtrace::h1b6ace0267256e64
23: 0x557ddbd82a58 - core::ops::function::FnOnce::call_once{{vtable.shim}}::h2d19e92af92472d4
24: 0x557ddbec9f43 - std::sys::unix::thread::Thread::new::thread_start::h2582dd3e3de2d3c5
25: 0x7f1dfda9aeae - start_thread
at /builddir/glibc-2.32/nptl/pthread_create.c:463:8
26: 0x7f1dfd8852ff - __GI___clone
at /builddir/glibc-2.32/misc/../sysdeps/unix/sysv/linux/x86_64/clone.S:95
27: 0x0 - <unknown>
Relevant config:
[[output.backlight]]
name = "eDP-1"
path = "/sys/class/backlight/amdgpu_bl0"
capturer = "wlroots"
Additionally, if the MANGOHUD=1 env var is present, MangoHud will attempt to hook into wluma's Vulkan capture and will confuse wluma about where to find config files. (EDIT: I confused the stdout messages; I believe the MangoHud messages come from MangoHud itself, not wluma, so I don't think it's actually an issue? I'll leave it up to you to decide.)
[2022-05-29T09:38:29Z INFO wluma] Continue adjusting brightness and wluma will learn your preference over time.
skipping config: /usr/bin/MangoHud.conf [ not found ]
skipping config: /home/jj-void/.config/MangoHud/wluma.conf [ not found ]
parsing config: /home/jj-void/.config/MangoHud/MangoHud.conf [ ok ]
Unknown option 'procmem'
thread 'predictor-eDP-1' panicked at 'Unable to compute luma percent: TIMEOUT', src/frame/capturer/wlroots.rs:128:26
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
Would you like me to move these issues into their own issues for tracking, or leave them in this issue?
Hello! Thanks for the reports!
I believe the MangoHud messages come from MangoHud itself, not wluma, so I don't think it's actually an issue?
Good catch, indeed these are not our errors 🙂
Unable to compute luma percent: TIMEOUT
Now this is an issue we should try to get fixed... To be honest, I don't really know where to get started, as I can't reproduce it on my hardware and there isn't much in the stack trace that could give us details... Overall, we can see that the error happens here:
This function should get the contents of the screen using Vulkan and evaluate how much "light" the screen is currently displaying.
The function is here:
But as we don't get any error from it, just TIMEOUT, I don't quite know where it gets stuck...
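For context, "evaluating how much light the screen is displaying" boils down to averaging per-pixel luminance over the captured frame. A rough sketch of such a luma-percent computation, assuming tightly packed RGBA8 pixel data (the function name and layout here are illustrative, not wluma's actual implementation):

```rust
/// Average luma of an RGBA8 buffer, as a percentage (0–100).
/// Illustrative only; not wluma's actual code.
fn luma_percent(pixels: &[u8]) -> u8 {
    // Rec. 601 luma approximation, in integer math to avoid floats.
    let total: u64 = pixels
        .chunks_exact(4)
        .map(|px| (299 * px[0] as u64 + 587 * px[1] as u64 + 114 * px[2] as u64) / 1000)
        .sum();
    let count = (pixels.len() / 4) as u64;
    ((total * 100) / (count * 255)) as u8
}

fn main() {
    let white = vec![255u8; 4 * 4]; // 4 fully white RGBA pixels
    assert_eq!(luma_percent(&white), 100);
    let black = vec![0u8; 4 * 4]; // 4 fully black RGBA pixels
    assert_eq!(luma_percent(&black), 0);
}
```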
Would you be able to try to debug this a little on your end? Set RUST_LOG=trace, for example, and run cargo run, and maybe try to add some log prints to see how far the code reaches before it gets stuck with the timeout? It should hopefully give us some hints!
Sure, I'm comfortable with debugging (but have no Rust experience lmao). The only issue is that I was actually trying out wluma while procrastinating, and I have an exam and some projects coming up in the next week, so I hope you don't mind if replies are delayed.
❯ RUST_LOG=trace RUST_BACKTRACE=full MANGOHUD=0 cargo run
Finished dev [unoptimized + debuginfo] target(s) in 0.05s
Running `target/debug/wluma`
[2022-05-29T16:28:42Z DEBUG wluma] Using Config {
als: Webcam {
video: 0,
thresholds: {
20: "dim",
75: "outdoors",
10: "dark",
0: "night",
50: "bright",
33: "normal",
},
},
output: [
Backlight(
BacklightOutput {
name: "eDP-1",
path: "/sys/class/backlight/amdgpu_bl0",
capturer: Wlroots,
min_brightness: 1,
},
),
Backlight(
BacklightOutput {
name: "keyboard-thinkpad",
path: "/sys/bus/platform/devices/thinkpad_acpi/leds/tpacpi::kbd_backlight",
capturer: None,
min_brightness: 0,
},
),
],
}
[2022-05-29T16:28:42Z INFO wluma] Continue adjusting brightness and wluma will learn your preference over time.
[2022-05-29T16:28:42Z TRACE wluma::als::webcam] ALS (webcam): outdoors (100)
[2022-05-29T16:28:42Z TRACE wluma::predictor::controller] Prediction: 2 (lux: outdoors, luma: 0)
[2022-05-29T16:28:42Z DEBUG wluma::frame::capturer::wlroots] Using output 'Chimei Innolux Corporation 0x14F2 0x00000000 (eDP-1)' for config 'eDP-1'
[2022-05-29T16:28:42Z TRACE wluma::als::webcam] ALS (webcam): outdoors (100)
[2022-05-29T16:28:42Z TRACE wluma::predictor::controller] Prediction: 2 (lux: outdoors, luma: 0)
[2022-05-29T16:28:42Z TRACE wluma::als::webcam] ALS (webcam): outdoors (100)
[2022-05-29T16:28:42Z TRACE wluma::als::webcam] ALS (webcam): night (7)
[2022-05-29T16:28:42Z TRACE wluma::predictor::controller] Prediction: 2 (lux: outdoors, luma: 0)
[2022-05-29T16:28:42Z TRACE wluma::als::webcam] ALS (webcam): night (7)
[2022-05-29T16:28:42Z TRACE wluma::als::webcam] ALS (webcam): night (7)
[2022-05-29T16:28:43Z TRACE wluma::predictor::controller] Prediction: 2 (lux: outdoors, luma: 0)
[2022-05-29T16:28:43Z TRACE wluma::als::webcam] ALS (webcam): night (7)
[2022-05-29T16:28:43Z TRACE wluma::als::webcam] ALS (webcam): night (7)
[2022-05-29T16:28:43Z TRACE wluma::predictor::controller] Prediction: 2 (lux: outdoors, luma: 0)
[2022-05-29T16:28:43Z TRACE wluma::als::webcam] ALS (webcam): night (7)
[2022-05-29T16:28:43Z TRACE wluma::als::webcam] ALS (webcam): night (7)
[2022-05-29T16:28:43Z TRACE wluma::predictor::controller] Prediction: 2 (lux: outdoors, luma: 0)
[2022-05-29T16:28:43Z TRACE wluma::als::webcam] ALS (webcam): night (7)
thread 'predictor-eDP-1' panicked at 'Unable to compute luma percent: TIMEOUT', src/frame/capturer/wlroots.rs:128:26
stack backtrace:
0: 0x5654a9c69970 - <std::sys_common::backtrace::_print::DisplayBacktrace as core::fmt::Display>::fmt::h18674753585b8cc5
1: 0x5654a9c8c42c - core::fmt::write::h650970577346edc1
2: 0x5654a9c66275 - std::io::Write::write_fmt::hcdbe4458fe2ebdfb
3: 0x5654a9c6b78b - std::panicking::default_hook::{{closure}}::h2b3ca2e1a25274db
4: 0x5654a9c6b303 - std::panicking::default_hook::hb3d80776b693aaeb
5: 0x5654a9746ab3 - <alloc::boxed::Box<F,A> as core::ops::function::Fn<Args>>::call::h267ec243cda9bc74
at /builddir/rustc-1.57.0-src/library/alloc/src/boxed.rs:1705:9
6: 0x5654a978094b - wluma::main::{{closure}}::hdb3f5ec7d8d1fdbf
at /home/jjgadgets/GitRepos/Others/wluma/src/main.rs:14:9
7: 0x5654a9c6be89 - std::panicking::rust_panic_with_hook::habc6079310c0728a
8: 0x5654a9c6b930 - std::panicking::begin_panic_handler::{{closure}}::ha449aee990d62948
9: 0x5654a9c69e14 - std::sys_common::backtrace::__rust_end_short_backtrace::h54cc540f2a5a6bf2
10: 0x5654a9c6b899 - rust_begin_unwind
11: 0x5654a96b84f1 - core::panicking::panic_fmt::he85288327cd30385
12: 0x5654a96b85e3 - core::result::unwrap_failed::ha180eafd08eaf142
13: 0x5654a9736fa9 - core::result::Result<T,E>::expect::he2a6be7db10ae094
at /builddir/rustc-1.57.0-src/library/core/src/result.rs:1258:23
14: 0x5654a970a656 - wluma::frame::capturer::wlroots::Capturer::capture_frame::{{closure}}::h0fc0b92e14d53419
at /home/jjgadgets/GitRepos/Others/wluma/src/frame/capturer/wlroots.rs:125:32
15: 0x5654a97d9cfa - wayland_client::proxy::Main<I>::quick_assign::{{closure}}::h65104fcbcdddba8f
at /home/jj-void/.cargo/registry/src/github.com-1ecc6299db9ec823/wayland-client-0.29.4/src/proxy.rs:273:64
16: 0x5654a974d69e - wayland_commons::filter::Filter<E>::send::h3b15d5eeca60db6b
at /home/jj-void/.cargo/registry/src/github.com-1ecc6299db9ec823/wayland-commons-0.29.4/src/filter.rs:100:13
17: 0x5654a978a92a - wayland_client::imp::proxy::ProxyInner::assign::{{closure}}::h5b52008c349baa43
at /home/jj-void/.cargo/registry/src/github.com-1ecc6299db9ec823/wayland-client-0.29.4/src/native_lib/proxy.rs:257:57
18: 0x5654a9a2b913 - <alloc::boxed::Box<F,A> as core::ops::function::Fn<Args>>::call::hf44587b9a59e1353
at /builddir/rustc-1.57.0-src/library/alloc/src/boxed.rs:1705:9
[2022-05-29T16:28:43Z TRACE wluma::als::webcam] ALS (webcam): night (7)
19: 0x5654a9a37a77 - wayland_client::imp::proxy::proxy_dispatcher::{{closure}}::{{closure}}::ha4dcda653b7205e7
at /home/jj-void/.cargo/registry/src/github.com-1ecc6299db9ec823/wayland-client-0.29.4/src/native_lib/proxy.rs:418:25
20: 0x5654a9a2f7a0 - scoped_tls::ScopedKey<T>::with::h4f0d14c9cf7340c4
at /home/jj-void/.cargo/registry/src/github.com-1ecc6299db9ec823/scoped-tls-1.0.0/src/lib.rs:171:13
21: 0x5654a9a36b48 - wayland_client::imp::proxy::proxy_dispatcher::{{closure}}::h60ac95c15c264364
at /home/jj-void/.cargo/registry/src/github.com-1ecc6299db9ec823/wayland-client-0.29.4/src/native_lib/proxy.rs:415:21
22: 0x5654a9a2db3f - std::panicking::try::do_call::hd280f90a87a7ef49
at /builddir/rustc-1.57.0-src/library/std/src/panicking.rs:403:40
23: 0x5654a9a2dced - __rust_try
24: 0x5654a9a2d874 - std::panicking::try::h758184dfaa0a49bc
at /builddir/rustc-1.57.0-src/library/std/src/panicking.rs:367:19
25: 0x5654a9a2a031 - std::panic::catch_unwind::h07e84d2f22a90025
at /builddir/rustc-1.57.0-src/library/std/src/panic.rs:133:14
26: 0x5654a9a35895 - wayland_client::imp::proxy::proxy_dispatcher::h85fba8b93f8ac34b
at /home/jj-void/.cargo/registry/src/github.com-1ecc6299db9ec823/wayland-client-0.29.4/src/native_lib/proxy.rs:387:15
27: 0x7ff2b065304c - <unknown>
28: 0x7ff2b06532cc - wl_display_dispatch_queue_pending
29: 0x5654a97078af - wayland_client::imp::event_queue::EventQueueInner::dispatch::{{closure}}::he063b936677b01f1
at /home/jj-void/.cargo/registry/src/github.com-1ecc6299db9ec823/wayland-client-0.29.4/src/native_lib/event_queue.rs:43:17
30: 0x5654a9710df3 - scoped_tls::ScopedKey<T>::set::h0e532a7b754c6483
at /home/jj-void/.cargo/registry/src/github.com-1ecc6299db9ec823/scoped-tls-1.0.0/src/lib.rs:137:9
31: 0x5654a9707bd5 - wayland_client::imp::event_queue::with_dispatch_meta::h8dfe767772c9004e
at /home/jj-void/.cargo/registry/src/github.com-1ecc6299db9ec823/wayland-client-0.29.4/src/native_lib/event_queue.rs:24:5
32: 0x5654a9707809 - wayland_client::imp::event_queue::EventQueueInner::dispatch::h51ed14c90b3e81f8
at /home/jj-void/.cargo/registry/src/github.com-1ecc6299db9ec823/wayland-client-0.29.4/src/native_lib/event_queue.rs:41:9
33: 0x5654a974f577 - wayland_client::event_queue::EventQueue::dispatch::hf4bd7a73ad968f97
at /home/jj-void/.cargo/registry/src/github.com-1ecc6299db9ec823/wayland-client-0.29.4/src/event_queue.rs:152:9
34: 0x5654a97510e2 - <wluma::frame::capturer::wlroots::Capturer as wluma::frame::capturer::Capturer>::run::h1f04bdceedd5d6cd
at /home/jjgadgets/GitRepos/Others/wluma/src/frame/capturer/wlroots.rs:60:13
35: 0x5654a9781030 - wluma::main::{{closure}}::{{closure}}::he2387331803f511a
at /home/jjgadgets/GitRepos/Others/wluma/src/main.rs:87:29
36: 0x5654a974f25c - std::sys_common::backtrace::__rust_begin_short_backtrace::hb09d013859fddd38
at /builddir/rustc-1.57.0-src/library/std/src/sys_common/backtrace.rs:123:18
37: 0x5654a9792bd0 - std::thread::Builder::spawn_unchecked::{{closure}}::{{closure}}::hdfdac7fca1df38a5
at /builddir/rustc-1.57.0-src/library/std/src/thread/mod.rs:483:17
38: 0x5654a97e8250 - <core::panic::unwind_safe::AssertUnwindSafe<F> as core::ops::function::FnOnce<()>>::call_once::h6c0cebfcdde1e9b7
at /builddir/rustc-1.57.0-src/library/core/src/panic/unwind_safe.rs:271:9
39: 0x5654a971081a - std::panicking::try::do_call::h81534e8a3112760c
at /builddir/rustc-1.57.0-src/library/std/src/panicking.rs:403:40
40: 0x5654a9710d3d - __rust_try
41: 0x5654a9710451 - std::panicking::try::hdd0ba3b931e62611
at /builddir/rustc-1.57.0-src/library/std/src/panicking.rs:367:19
42: 0x5654a96cb520 - std::panic::catch_unwind::h65f38094707f0e98
at /builddir/rustc-1.57.0-src/library/std/src/panic.rs:133:14
43: 0x5654a97921c6 - std::thread::Builder::spawn_unchecked::{{closure}}::hd03b7490c7647741
at /builddir/rustc-1.57.0-src/library/std/src/thread/mod.rs:482:30
44: 0x5654a96b9b6e - core::ops::function::FnOnce::call_once{{vtable.shim}}::h3bfee36e17ec1fcd
at /builddir/rustc-1.57.0-src/library/core/src/ops/function.rs:227:5
45: 0x5654a9c6f203 - std::sys::unix::thread::Thread::new::thread_start::h2582dd3e3de2d3c5
46: 0x7ff2b14a3eae - start_thread
at /builddir/glibc-2.32/nptl/pthread_create.c:463:8
47: 0x7ff2b128e2ff - __GI___clone
at /builddir/glibc-2.32/misc/../sysdeps/unix/sysv/linux/x86_64/clone.S:95
48: 0x0 - <unknown>
Ok no problem, take your time. Could you try adding some more debug logging in this luma_percent function and share the logs? Thanks! (exams first please)
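If it helps narrow down where the time goes, one simple approach is to wrap each suspect step in a small timing helper that prints how long it took. A sketch using only the standard library (the timed helper and labels are hypothetical, not existing wluma code):

```rust
use std::time::Instant;

/// Run a closure and print how long it took; useful for bracketing
/// the steps inside a function to see which one stalls.
fn timed<T>(label: &str, f: impl FnOnce() -> T) -> T {
    let start = Instant::now();
    let result = f();
    eprintln!("{} took {:?}", label, start.elapsed());
    result
}

fn main() {
    // Example: time an arbitrary computation.
    let sum = timed("sum 1..=100", || (1..=100u32).sum::<u32>());
    assert_eq!(sum, 5050);
}
```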
I'm experiencing the same issue. Here are the details of my environment and some logs associated. If it helps.
Hardware: ThinkPad T14 Gen 1 (AMD), Aukey PC-W3 webcam, AMDGPU
Software: Arch Linux, River, wluma built from source
Thanks for reporting. To be honest, it doesn't immediately strike me as the same issue; perhaps file it as a new one, so that it's not forgotten in the shadow of this one?
By the way, do you have only an integrated GPU, or also a separate one? (You can answer in the new ticket so that we move the discussion to one place.)
Steps for reproducing the issue
Hardware: ThinkPad T14 Gen 1 (AMD), Chicony webcam, AMDGPU
Software: Void Linux, SwayWM
1. Build and install wluma from source (make build && sudo make install)
2. Run wluma from WezTerm
What is the buggy behavior?
The wlroots capturer panics wluma on launch, with error "NoCompositorListening".
What is the expected behavior?
I can use wlroots capture without errors.
Logs
Version
Latest commit from master, which should also mean 4.1.2?
Environment