bloomberg / memray

Memray is a memory profiler for Python
https://bloomberg.github.io/memray/
Apache License 2.0

Native run on PyQt-WebEngine hangs with jemalloc #336

Closed: toofar closed this 1 year ago

toofar commented 1 year ago


Current Behavior

Hello, me again! Now that I can do native runs with PyQt WebEngine, I went back to some scenarios I had previously tested with non-native runs (for context, I'm looking into https://github.com/qutebrowser/qutebrowser/issues/1476). I'm investigating how different memory allocators might reduce the memory load of a particular application.

When running an application based on PyQt WebEngine with the jemalloc library in LD_PRELOAD, the run hangs. With other mallocs (glibc, mimalloc, tcmalloc) the run completes fine.

Here are the stack traces; it seems to be hung on a semaphore, which sounds like it could be some fun timing issue. It also looks like it's coming from DBus-related code, so it probably doesn't need all of QtWebEngine to reproduce.

(gdb) thread apply all bt

Thread 3 (Thread 0x7f09d24236c0 (LWP 18456) "python3"):
#0  futex_wait (private=0, expected=2, futex_word=0x7f09e86d5700) at ../sysdeps/nptl/futex-internal.h:146
#1  __GI___lll_lock_wait (futex=futex@entry=0x7f09e86d5700, private=0) at ./nptl/lowlevellock.c:49
#2  0x00007f09e8382262 in lll_mutex_lock_optimized (mutex=0x7f09e86d5700) at ./nptl/pthread_mutex_lock.c:48
#3  ___pthread_mutex_lock (mutex=0x7f09e86d5700) at ./nptl/pthread_mutex_lock.c:93
#4  0x00007f09e86923a0 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#5  0x00007f09e8621944 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#6  0x00007f09e8621b8f in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#7  0x00007f09e862259d in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#8  0x00007f09e86addf9 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#9  0x00007f09e6eedaea in memray::tracking_api::Tracker::prepareNativeTrace (trace=std::optional [no contained value]) at src/memray/_memray/tracking_api.h:237
#10 0x00007f09e6eeed11 in memray::tracking_api::Tracker::trackAllocation (func=memray::hooks::Allocator::MMAP, size=2097152, ptr=0x7f09de600000) at src/memray/_memray/tracking_api.h:218
#11 memray::intercept::mmap (addr=, length=2097152, prot=, flags=, fd=, offset=) at src/memray/_memray/hooks.cpp:224
#12 0x00007f09e8694b5a in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#13 0x00007f09e8694bc2 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#14 0x00007f09e8689994 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#15 0x00007f09e863bc89 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#16 0x00007f09e863cc5f in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#17 0x00007f09e863769c in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#18 0x00007f09e8621878 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#19 0x00007f09e86abc5a in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#20 0x00007f09e86abef8 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#21 0x00007f09e86ad926 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#22 0x00007f09e8622162 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#23 0x00007f09e6eee4c5 in memray::hooks::SymbolHook::operator()(unsigned long) const (this=0x7f09e6f6c6c0 ) at src/memray/_memray/hooks.h:100
#24 memray::intercept::malloc (size=168) at src/memray/_memray/hooks.cpp:169
#25 0x00007f09e8935ece in malloc (size=) at ../include/rtld-malloc.h:56
#26 allocate_dtv_entry (size=, alignment=16) at ../elf/dl-tls.c:684
#27 allocate_and_init (map=0x7f09e6161b00) at ../elf/dl-tls.c:709
#28 tls_get_addr_tail (ti=0x7f09e3a0a9c0, dtv=0x7f09e621f410, the_map=0x7f09e6161b00) at ../elf/dl-tls.c:907
#29 0x00007f09e8939468 in __tls_get_addr () at ../sysdeps/x86_64/tls_get_addr.S:55
#30 0x00007f09e36cfaa9 in QThreadPrivate::start(void*) () from /usr/local/Qt-6.5.0/lib/libQt6Core.so.6
#31 0x00007f09e837efd4 in start_thread (arg=) at ./nptl/pthread_create.c:442
#32 0x00007f09e83ff66c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81

Thread 2 (Thread 0x7f09e47ff6c0 (LWP 18455) "python3"):
#0  __futex_abstimed_wait_common64 (private=0, cancel=true, abstime=0x7f09e47fe410, op=393, expected=0, futex_word=0x7f09e61df468) at ./nptl/futex-internal.c:57
#1  __futex_abstimed_wait_common (futex_word=futex_word@entry=0x7f09e61df468, expected=expected@entry=0, clockid=clockid@entry=0, abstime=abstime@entry=0x7f09e47fe410, private=private@entry=0, cancel=cancel@entry=true) at ./nptl/futex-internal.c:87
#2  0x00007f09e837bd9b in __GI___futex_abstimed_wait_cancelable64 (futex_word=futex_word@entry=0x7f09e61df468, expected=expected@entry=0, clockid=clockid@entry=0, abstime=abstime@entry=0x7f09e47fe410, private=private@entry=0) at ./nptl/futex-internal.c:139
#3  0x00007f09e837e6dc in __pthread_cond_wait_common (abstime=0x7f09e47fe410, clockid=0, mutex=0x7f09e61df418, cond=0x7f09e61df440) at ./nptl/pthread_cond_wait.c:503
#4  ___pthread_cond_timedwait64 (cond=0x7f09e61df440, mutex=0x7f09e61df418, abstime=0x7f09e47fe410) at ./nptl/pthread_cond_wait.c:643
#5  0x00007f09e6f0b3cf in __gthread_cond_timedwait (__abs_timeout=0x7f09e47fe410, __mutex=, __cond=) at /opt/rh/devtoolset-10/root/usr/include/c++/10/x86_64-redhat-linux/bits/gthr-default.h:872
#6  std::condition_variable::__wait_until_impl > > (__lock=..., __lock=..., __atime=..., this=) at /opt/rh/devtoolset-10/root/usr/include/c++/10/condition_variable:232
#7  std::condition_variable::wait_until > > (__atime=..., __lock=..., this=) at /opt/rh/devtoolset-10/root/usr/include/c++/10/condition_variable:141
#8  std::condition_variable::wait_until >, memray::tracking_api::Tracker::BackgroundThread::start():::: > (__p=..., __atime=..., __lock=..., this=) at /opt/rh/devtoolset-10/root/usr/include/c++/10/condition_variable:158
#9  std::condition_variable::wait_for, memray::tracking_api::Tracker::BackgroundThread::start():::: > (__rtime=..., __rtime=..., __p=..., __lock=..., this=) at /opt/rh/devtoolset-10/root/usr/include/c++/10/condition_variable:185
#10 operator() (__closure=) at src/memray/_memray/tracking_api.cpp:690
#11 std::__invoke_impl > (__f=...) at /opt/rh/devtoolset-10/root/usr/include/c++/10/bits/invoke.h:60
#12 std::__invoke > (__fn=...) at /opt/rh/devtoolset-10/root/usr/include/c++/10/bits/invoke.h:95
#13 std::thread::_Invoker > >::_M_invoke<0> (this=) at /opt/rh/devtoolset-10/root/usr/include/c++/10/thread:264
#14 std::thread::_Invoker > >::operator() (this=) at /opt/rh/devtoolset-10/root/usr/include/c++/10/thread:271
#15 std::thread::_State_impl > > >::_M_run(void) (this=) at /opt/rh/devtoolset-10/root/usr/include/c++/10/thread:215
#16 0x00007f09e80d44a3 in ?? () from /lib/x86_64-linux-gnu/libstdc++.so.6
#17 0x00007f09e837efd4 in start_thread (arg=) at ./nptl/pthread_create.c:442
#18 0x00007f09e83ff66c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81

Thread 1 (Thread 0x7f09e88fc5c0 (LWP 18454) "python3"):
#0  syscall () at ../sysdeps/unix/sysv/linux/x86_64/syscall.S:38
#1  0x00007f09e36d16ce in QSemaphore::acquire(int) () from /usr/local/Qt-6.5.0/lib/libQt6Core.so.6
#2  0x00007f09e3596d36 in void doActivate(QObject*, int, void**) () from /usr/local/Qt-6.5.0/lib/libQt6Core.so.6
#3  0x00007f09e57730ed in QDBusConnectionManager::connectToBus(QDBusConnection::BusType, QString const&, bool) () from /usr/local/Qt-6.5.0/lib/libQt6DBus.so.6
#4  0x00007f09e5773227 in QDBusConnectionManager::busConnection(QDBusConnection::BusType) () from /usr/local/Qt-6.5.0/lib/libQt6DBus.so.6
#5  0x00007f09e5774748 in QDBusConnection::sessionBus() () from /usr/local/Qt-6.5.0/lib/libQt6DBus.so.6
#6  0x00007f09dfc9efc1 in QGenericUnixServices::QGenericUnixServices() () from /usr/local/Qt-6.5.0/lib/libQt6Gui.so.6
#7  0x00007f09def29390 in QXcbIntegration::QXcbIntegration(QList const&, int&, char**) () from /usr/local/Qt-6.5.0/plugins/platforms/../../lib/libQt6XcbQpa.so.6
#8  0x00007f09e2ff13dc in QXcbIntegrationPlugin::create(QString const&, QList const&, int&, char**) () from /usr/local/Qt-6.5.0/plugins/platforms/libqxcb.so
#9  0x00007f09df7b2584 in init_platform(QString const&, QString const&, QString const&, int&, char**) () from /usr/local/Qt-6.5.0/lib/libQt6Gui.so.6
#10 0x00007f09df7b5ca1 in QGuiApplicationPrivate::createPlatformIntegration() () from /usr/local/Qt-6.5.0/lib/libQt6Gui.so.6
#11 0x00007f09df7b6810 in QGuiApplicationPrivate::createEventDispatcher() () from /usr/local/Qt-6.5.0/lib/libQt6Gui.so.6
#12 0x00007f09e3540a55 in QCoreApplicationPrivate::init() () from /usr/local/Qt-6.5.0/lib/libQt6Core.so.6
#13 0x00007f09df7b99f9 in QGuiApplicationPrivate::init() () from /usr/local/Qt-6.5.0/lib/libQt6Gui.so.6
#14 0x00007f09e4b830f9 in QApplicationPrivate::init() () from /usr/local/Qt-6.5.0/lib/libQt6Widgets.so.6
#15 0x00007f09e552b939 in sipQApplication::sipQApplication(int&, char**, int) () from /mnt/ssd/user/qute-basedirs/venv-6.5/lib/python3.11/site-packages/PyQt6/QtWidgets.abi3.so
#16 0x00007f09e552ba07 in init_type_QApplication () from /mnt/ssd/user/qute-basedirs/venv-6.5/lib/python3.11/site-packages/PyQt6/QtWidgets.abi3.so
#17 0x00007f09e6a3e7d2 in sipSimpleWrapper_init (self=0x7f09e5fa2320, args=0x7f09e64b5a20, kwds=0x0) at sip_core.c:8788
#18 0x0000000000518b75 in _PyObject_MakeTpCall ()
#19 0x00000000005372e0 in _PyEval_EvalFrameDefault ()
#20 0x00000000005242fb in PyEval_EvalCode ()
#21 0x000000000058fdde in ?? ()
#22 0x000000000053bb47 in ?? ()
#23 0x000000000053ba7c in PyObject_Vectorcall ()
#24 0x000000000053741b in _PyEval_EvalFrameDefault ()
#25 0x00000000005242fb in PyEval_EvalCode ()
#26 0x000000000058fdde in ?? ()
#27 0x000000000053bb47 in ?? ()
#28 0x000000000053ba7c in PyObject_Vectorcall ()
#29 0x000000000052c5f0 in _PyEval_EvalFrameDefault ()
#30 0x000000000055da51 in _PyFunction_Vectorcall ()
#31 0x00000000006523bd in ?? ()
#32 0x0000000000651e47 in Py_RunMain ()
#33 0x0000000000629b47 in Py_BytesMain ()
#34 0x00007f09e831d18a in __libc_start_call_main (main=main@entry=0x629ab0, argc=argc@entry=6, argv=argv@entry=0x7ffd329232a8) at ../sysdeps/nptl/libc_start_call_main.h:58
#35 0x00007f09e831d245 in __libc_start_main_impl (main=0x629ab0, argc=6, argv=0x7ffd329232a8, init=, fini=, rtld_fini=, stack_end=0x7ffd32923298) at ../csu/libc-start.c:381
#36 0x00000000006299e1 in _start ()

And here is a script to reproduce the freeze in a container (remove --native, or the whole -m memray run --native part, to see it run through fully for all mallocs):

docker run -i debian:unstable bash -s <<"EOF"
cat >testbrowser_webengine.py <<"EOP"
#!/usr/bin/env python3
import sys
from PyQt6.QtCore import QUrl, QTimer
from PyQt6.QtWidgets import QApplication
from PyQt6.QtWebEngineWidgets import QWebEngineView

app = QApplication(sys.argv)
wv = QWebEngineView()

def kill():
    app.quit()
QTimer.singleShot(10, kill)

wv.load(QUrl.fromUserInput("https://example.com/"))
wv.show()

app.exec()
EOP
apt update
apt install -y wget xvfb python3-venv python3-pyqt6.qtwebengine libjemalloc2 libmimalloc2.0 libtcmalloc-minimal4
export DEBUGINFOD_URLS=https://debuginfod.debian.net
export QTWEBENGINE_CHROMIUM_FLAGS=--no-sandbox  # running as root in container
python3 -m venv --system-site-packages memray-test
. memray-test/bin/activate
# pip3 install memray  # need fix from #334
# I uploaded the wheel here because JavaScript seems to be required to download from GitHub Actions
wget -O memray-1.7.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl https://files.catbox.moe/xglmxq.whl
pip3 install memray-1.7.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
# run with a few mallocs, it works fine with all of them except jemalloc
for lib in "" /usr/lib/x86_64-linux-gnu/libmimalloc.so.2 /usr/lib/x86_64-linux-gnu/libtcmalloc_minimal.so.4 /usr/lib/x86_64-linux-gnu/libjemalloc.so.2 ;do
  export LD_PRELOAD=$lib
  printf '\n=====\nRunning with LD_PRELOAD=%s\n=====\n\n' "$LD_PRELOAD"
  xvfb-run python3 -m memray run --native testbrowser_webengine.py  # hangs here w/ jemalloc
  echo "Finished run, generating flamegraph"
  python3 -m memray flamegraph `ls -tr *.bin | tail -n1`
done
EOF

Expected Behavior

No response

Steps To Reproduce

see above

Memray Version

latest build from https://github.com/bloomberg/memray/actions/runs/4427648999

Python Version

3.11

Operating System

Linux

Anything else?

No response

godlygeek commented 1 year ago

Hm. At a glance, without firing up a debugger, I'm thinking that jemalloc's malloc isn't reentrant, and we're calling back into it at a time that it wasn't prepared for. I'll try to dig into it in a day or so...

pablogsal commented 1 year ago

> Hm. At a glance, without firing up a debugger, I'm thinking that jemalloc's malloc isn't reentrant, and we're calling back into it at a time that it wasn't prepared for. I'll try to dig into it in a day or so...

We may want to exclude libunwind from the list of libraries we patch.

pablogsal commented 1 year ago

Yeah, it seems that jemalloc is indeed not reentrant: https://github.com/jemalloc/jemalloc/issues/501

pablogsal commented 1 year ago

On the other hand, there are a few things I don't like here:

Thread 3 (Thread 0x7f09d24236c0 (LWP 18456) "python3"):
#0  futex_wait (private=0, expected=2, futex_word=0x7f09e86d5700) at ../sysdeps/nptl/futex-internal.h:146
#1  __GI___lll_lock_wait (futex=futex@entry=0x7f09e86d5700, private=0) at ./nptl/lowlevellock.c:49
#2  0x00007f09e8382262 in lll_mutex_lock_optimized (mutex=0x7f09e86d5700) at ./nptl/pthread_mutex_lock.c:48
#3  ___pthread_mutex_lock (mutex=0x7f09e86d5700) at ./nptl/pthread_mutex_lock.c:93
#4  0x00007f09e86923a0 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#5  0x00007f09e8621944 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#6  0x00007f09e8621b8f in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#7  0x00007f09e862259d in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#8  0x00007f09e86addf9 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#9  0x00007f09e6eedaea in memray::tracking_api::Tracker::prepareNativeTrace (trace=std::optional [no contained value]) at src/memray/_memray/tracking_api.h:237
#10 0x00007f09e6eeed11 in memray::tracking_api::Tracker::trackAllocation (func=memray::hooks::Allocator::MMAP, size=2097152, ptr=0x7f09de600000) at src/memray/_memray/tracking_api.h:218
#11 memray::intercept::mmap (addr=, length=2097152, prot=, flags=, fd=, offset=) at src/memray/_memray/hooks.cpp:224
#12 0x00007f09e8694b5a in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#13 0x00007f09e8694bc2 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#14 0x00007f09e8689994 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#15 0x00007f09e863bc89 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#16 0x00007f09e863cc5f in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#17 0x00007f09e863769c in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#18 0x00007f09e8621878 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#19 0x00007f09e86abc5a in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#20 0x00007f09e86abef8 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#21 0x00007f09e86ad926 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#22 0x00007f09e8622162 in ?? () from /usr/lib/x86_64-linux-gnu/libjemalloc.so.2
#23 0x00007f09e6eee4c5 in memray::hooks::SymbolHook::operator()(unsigned long) const (this=0x7f09e6f6c6c0 ) at src/memray/_memray/hooks.h:100
#24 memray::intercept::malloc (size=168) at src/memray/_memray/hooks.cpp:169

One is that we are tracking both the malloc and the underlying mmap. The other is that I can see our friend __tls_get_addr in the stack; I don't think it's causing problems, but it makes me suspicious.

pablogsal commented 1 year ago

Well, it seems that I cannot debug this on my aarch64 laptop 😓

A problem internal to GDB has been detected,
further debugging may prove unreliable.
----- Backtrace -----
0xaaaabe679b9b ???
0xaaaabe9bcefb ???
0xaaaabe9bd0e3 ???
0xaaaabeb64873 ???
0xaaaabe976ebb ???
0xaaaabe8cf49b ???
0xaaaabe5ca2bf ???
0xaaaabe971a37 ???
0xaaaabe971f47 ???
0xaaaabe971ff7 ???
0xaaaabe8782d7 ???
0xffff914f5387 _td_fetch_value
    ./nptl_db/fetch-value.c:115
0xffff914f230f td_ta_map_lwp2thr
    ./nptl_db/td_ta_map_lwp2thr.c:194
0xaaaabe807d8f ???
0xaaaabe809347 ???
0xaaaabe973e73 ???
0xaaaabe7d081f ???
0xaaaabe7dc033 ???
0xaaaabeb64d03 ???
0xaaaabeb657f7 ???
0xaaaabe980787 ???
0xaaaabe980a6b ???
0xaaaabe818dcb ???
0xaaaabe818f1b ???
0xaaaabe81acff ???
0xaaaabe81b733 ???
0xaaaabe5c1183 ???
0xffff9883777f __libc_start_call_main
    ../sysdeps/nptl/libc_start_call_main.h:58
0xffff98837857 __libc_start_main_impl
    ../csu/libc-start.c:381
0xaaaabe5c73af ???
0xffffffffffffffff ???
---------------------
/build/gdb-yCDzia/gdb-13.1/gdb/thread.c:85: internal-error: inferior_thread: Assertion `current_thread_ != nullptr' failed.
A problem internal to GDB has been detected,
further debugging may prove unreliable.
Quit this debugging session? (y or n) [answered Y; input not from terminal]

pablogsal commented 1 year ago

This is the stack I am getting in aarch64 after adding a recursion guard to malloc:

(venv) root@64e4bedf6306:/src# eu-stack -p 7999 --verbose
PID 7999 - process
TID 7999:
#0  0x0000ffffb392b654     __futex_abstimed_wait_common64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:57:12
#1  0x0000ffffb392b654     __futex_abstimed_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:87:9
#2  0x0000ffffb392b654     __GI___futex_abstimed_wait_cancelable64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:139:10
#3  0x0000ffffb392e190 - 1 __pthread_cond_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:503:10
#4  0x0000ffffb392e190 - 1 ___pthread_cond_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:618:10
#5  0x0000ffffa4f219f4 - 1 base::ConditionVariable::Wait() - /usr/lib/aarch64-linux-gnu/libQt6WebEngineCore.so.6.4.2
#6  0x006affffa4f226b4 - 1 - /usr/lib/aarch64-linux-gnu/ld-linux-aarch64.so.1
#7  0x006affffa4f226b4 - 1 - /usr/lib/aarch64-linux-gnu/ld-linux-aarch64.so.1
#8  0x0009ffffa4f228bc - 1 - /usr/lib/aarch64-linux-gnu/ld-linux-aarch64.so.1
#9  0x0010ffffa3690428 - 1 - /usr/lib/aarch64-linux-gnu/ld-linux-aarch64.so.1
#10 0x0050ffffa47ee8ec - 1 - /usr/lib/aarch64-linux-gnu/ld-linux-aarch64.so.1
#11 0x0062ffffa1e76694 - 1 - /usr/lib/aarch64-linux-gnu/ld-linux-aarch64.so.1
#12 0x0047ffffa1e77d74 - 1 - /usr/lib/aarch64-linux-gnu/ld-linux-aarch64.so.1
#13 0x0054ffffa1e5cd98 - 1 - /usr/lib/aarch64-linux-gnu/ld-linux-aarch64.so.1
#14 0x0067ffffa4e4d230 - 1 - /usr/lib/aarch64-linux-gnu/ld-linux-aarch64.so.1
#15 0x0000ffffa4e4d6f8 - 1 QWebEnginePage::QWebEnginePage(QObject*) - /usr/lib/aarch64-linux-gnu/libQt6WebEngineCore.so.6.4.2
#16 0x0000ffffb24c38c0 - 1
eu-stack: dwfl_thread_getframes tid 7999 at 0xffffb24c38bf in <unknown>: No DWARF information found
TID 8000:
#0  0x0000ffffb392b654     __futex_abstimed_wait_common64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:57:12
#1  0x0000ffffb392b654     __futex_abstimed_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:87:9
#2  0x0000ffffb392b654     __GI___futex_abstimed_wait_cancelable64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:139:10
#3  0x0000ffffb392e7a8 - 1 __pthread_cond_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:503:10
#4  0x0000ffffb392e7a8 - 1 ___pthread_cond_clockwait64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:682:10
#5  0x0000ffffb392e7a8 - 1 ___pthread_cond_clockwait64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:670:1
#6  0x0000ffffb213bc3c - 1 - /host_virtiofs/Users/pgalindo3/github/memray/src/memray/_memray.cpython-311-aarch64-linux-gnu.so
#7  0x0000ffffb213bc3c - 1 - /host_virtiofs/Users/pgalindo3/github/memray/src/memray/_memray.cpython-311-aarch64-linux-gnu.so
#8  0x0000ffffb21388bc - 1 - /host_virtiofs/Users/pgalindo3/github/memray/src/memray/_memray.cpython-311-aarch64-linux-gnu.so
#9  0x0000ffffb2138944 - 1 - /host_virtiofs/Users/pgalindo3/github/memray/src/memray/_memray.cpython-311-aarch64-linux-gnu.so
#10 0x0000ffffb2138ac4 - 1 - /host_virtiofs/Users/pgalindo3/github/memray/src/memray/_memray.cpython-311-aarch64-linux-gnu.so
#11 0x0000ffffb2138b34 - 1 - /host_virtiofs/Users/pgalindo3/github/memray/src/memray/_memray.cpython-311-aarch64-linux-gnu.so
#12 0x0000ffffb375e9dc - 1 execute_native_thread_routine - /usr/lib/aarch64-linux-gnu/libstdc++.so.6.0.30
    ../../../../../src/libstdc++-v3/src/c++11/thread.cc:82:18
#13 0x0000ffffafd4eafc - 1
#14 0x0000ffffafd4eafc - 1
eu-stack: dwfl_thread_getframes tid 8000 at 0xffffafd4eafb in <unknown>: No DWARF information found
TID 8001:
#0  0x0000ffffb392b93c     futex_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ../sysdeps/nptl/futex-internal.h:146:13
#1  0x0000ffffb392b93c     __GI___lll_lock_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/lowlevellock.c:49:7
#2  0x0000ffffb3931f10 - 1 lll_mutex_lock_optimized - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_mutex_lock.c:48:5
#3  0x0000ffffb3931f10 - 1 ___pthread_mutex_lock - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_mutex_lock.c:93:7
#4  0x0000ffffb3bf4040 - 1 malloc_mutex_lock_final - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    include/jemalloc/internal/mutex.h:151:2
#5  0x0000ffffb3bf4040 - 1 je_malloc_mutex_lock_slow - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/mutex.c:90:2
#6  0x0000ffffb3ba9a68 - 1 malloc_mutex_lock - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    include/jemalloc/internal/mutex.h:217:4
#7  0x0000ffffb3ba9a68 - 1 je_arena_choose_hard - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/jemalloc.c:534:3
#8  0x0000ffffb3ba9d30 - 1 arena_choose_impl - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    include/jemalloc/internal/jemalloc_internal_inlines_b.h:46:9
#9  0x0000ffffb3baa254 - 1 arena_choose - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    include/jemalloc/internal/jemalloc_internal_inlines_b.h:88:9
#10 0x0000ffffb3baa254 - 1 tcache_alloc_small - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    include/jemalloc/internal/tcache_inlines.h:56:11
#11 0x0000ffffb3baa254 - 1 arena_malloc - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    include/jemalloc/internal/arena_inlines_b.h:151:11
#12 0x0000ffffb3baa254 - 1 iallocztm - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    include/jemalloc/internal/jemalloc_internal_inlines_c.h:55:8
#13 0x0000ffffb3baa254 - 1 imalloc_no_sample - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/jemalloc.c:2398:9
#14 0x0000ffffb3baa254 - 1 imalloc_body - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/jemalloc.c:2573:16
#15 0x0000ffffb3baa254 - 1 imalloc - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/jemalloc.c:2687:10
#16 0x0000ffffb3baa254 - 1 je_malloc_default - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/jemalloc.c:2722:2
#17 0x0000ffffb3c03fa8 - 1 fallback_impl<false> - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/jemalloc_cpp.cpp:98:28
#18 0x0000ffffb2117d74 - 1 - /host_virtiofs/Users/pgalindo3/github/memray/src/memray/_memray.cpython-311-aarch64-linux-gnu.so
#19 0x0000ffffb2117d74 - 1 - /host_virtiofs/Users/pgalindo3/github/memray/src/memray/_memray.cpython-311-aarch64-linux-gnu.so
#20 0x0000ffffb2116100 - 1 - /host_virtiofs/Users/pgalindo3/github/memray/src/memray/_memray.cpython-311-aarch64-linux-gnu.so
#21 0x0000ffffb3bf6c88 - 1 os_pages_map - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/pages.c:149:9
#22 0x0000000000200000 - 1
#23 0x0000000000200000 - 1
eu-stack: dwfl_thread_getframes tid 8001 at 0x1fffff in <unknown>: No DWARF information found
TID 8002:
#0  0x0000ffffb392b654     __futex_abstimed_wait_common64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:57:12
#1  0x0000ffffb392b654     __futex_abstimed_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:87:9
#2  0x0000ffffb392b654     __GI___futex_abstimed_wait_cancelable64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:139:10
#3  0x0000ffffb392e190 - 1 __pthread_cond_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:503:10
#4  0x0000ffffb392e190 - 1 ___pthread_cond_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:618:10
#5  0x0000ffff9c4c35ec - 1 cnd_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:135:13
#6  0x0000ffff9ca27474 - 1 pipe_semaphore_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/gallium/auxiliary/os/os_thread.h:108:7
#7  0x0000ffff9ca27474 - 1 thread_function - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/gallium/drivers/llvmpipe/lp_rast.c:1184:7
#8  0x0000ffff9c4c34fc - 1 impl_thrd_routine - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:67:29
#9  0x0000ffffb392edd8 - 1 start_thread - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_create.c:442:8
#10 0x0000ffffb3997e9c - 1 thread_start - /usr/lib/aarch64-linux-gnu/libc.so.6
    ../sysdeps/unix/sysv/linux/aarch64/clone.S:79
TID 8003:
#0  0x0000ffffb392b654     __futex_abstimed_wait_common64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:57:12
#1  0x0000ffffb392b654     __futex_abstimed_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:87:9
#2  0x0000ffffb392b654     __GI___futex_abstimed_wait_cancelable64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:139:10
#3  0x0000ffffb392e190 - 1 __pthread_cond_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:503:10
#4  0x0000ffffb392e190 - 1 ___pthread_cond_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:618:10
#5  0x0000ffff9c4c35ec - 1 cnd_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:135:13
#6  0x0000ffff9ca27474 - 1 pipe_semaphore_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/gallium/auxiliary/os/os_thread.h:108:7
#7  0x0000ffff9ca27474 - 1 thread_function - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/gallium/drivers/llvmpipe/lp_rast.c:1184:7
#8  0x0000ffff9c4c34fc - 1 impl_thrd_routine - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:67:29
#9  0x0000ffffb392edd8 - 1 start_thread - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_create.c:442:8
#10 0x0000ffffb3997e9c - 1 thread_start - /usr/lib/aarch64-linux-gnu/libc.so.6
    ../sysdeps/unix/sysv/linux/aarch64/clone.S:79
TID 8004:
#0  0x0000ffffb392b654     __futex_abstimed_wait_common64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:57:12
#1  0x0000ffffb392b654     __futex_abstimed_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:87:9
#2  0x0000ffffb392b654     __GI___futex_abstimed_wait_cancelable64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:139:10
#3  0x0000ffffb392e190 - 1 __pthread_cond_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:503:10
#4  0x0000ffffb392e190 - 1 ___pthread_cond_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:618:10
#5  0x0000ffff9c4c35ec - 1 cnd_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:135:13
#6  0x0000ffff9ca27474 - 1 pipe_semaphore_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/gallium/auxiliary/os/os_thread.h:108:7
#7  0x0000ffff9ca27474 - 1 thread_function - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/gallium/drivers/llvmpipe/lp_rast.c:1184:7
#8  0x0000ffff9c4c34fc - 1 impl_thrd_routine - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:67:29
#9  0x0000ffffb392edd8 - 1 start_thread - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_create.c:442:8
#10 0x0000ffffb3997e9c - 1 thread_start - /usr/lib/aarch64-linux-gnu/libc.so.6
    ../sysdeps/unix/sysv/linux/aarch64/clone.S:79
TID 8005:
#0  0x0000ffffb392b654     __futex_abstimed_wait_common64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:57:12
#1  0x0000ffffb392b654     __futex_abstimed_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:87:9
#2  0x0000ffffb392b654     __GI___futex_abstimed_wait_cancelable64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:139:10
#3  0x0000ffffb392e190 - 1 __pthread_cond_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:503:10
#4  0x0000ffffb392e190 - 1 ___pthread_cond_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:618:10
#5  0x0000ffff9c4c35ec - 1 cnd_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:135:13
#6  0x0000ffff9ca27474 - 1 pipe_semaphore_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/gallium/auxiliary/os/os_thread.h:108:7
#7  0x0000ffff9ca27474 - 1 thread_function - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/gallium/drivers/llvmpipe/lp_rast.c:1184:7
#8  0x0000ffff9c4c34fc - 1 impl_thrd_routine - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:67:29
#9  0x0000ffffb392edd8 - 1 start_thread - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_create.c:442:8
#10 0x0000ffffb3997e9c - 1 thread_start - /usr/lib/aarch64-linux-gnu/libc.so.6
    ../sysdeps/unix/sysv/linux/aarch64/clone.S:79
TID 8006:
#0  0x0000ffffb392b654     __futex_abstimed_wait_common64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:57:12
#1  0x0000ffffb392b654     __futex_abstimed_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:87:9
#2  0x0000ffffb392b654     __GI___futex_abstimed_wait_cancelable64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:139:10
#3  0x0000ffffb392e190 - 1 __pthread_cond_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:503:10
#4  0x0000ffffb392e190 - 1 ___pthread_cond_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:618:10
#5  0x0000ffff9c4c35ec - 1 cnd_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:135:13
#6  0x0000ffff9ca27474 - 1 pipe_semaphore_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/gallium/auxiliary/os/os_thread.h:108:7
#7  0x0000ffff9ca27474 - 1 thread_function - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/gallium/drivers/llvmpipe/lp_rast.c:1184:7
#8  0x0000ffff9c4c34fc - 1 impl_thrd_routine - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:67:29
#9  0x0000ffffb392edd8 - 1 start_thread - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_create.c:442:8
#10 0x0000ffffb3997e9c - 1 thread_start - /usr/lib/aarch64-linux-gnu/libc.so.6
    ../sysdeps/unix/sysv/linux/aarch64/clone.S:79
TID 8007:
#0  0x0000ffffb392b654     __futex_abstimed_wait_common64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:57:12
#1  0x0000ffffb392b654     __futex_abstimed_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:87:9
#2  0x0000ffffb392b654     __GI___futex_abstimed_wait_cancelable64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:139:10
#3  0x0000ffffb392e190 - 1 __pthread_cond_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:503:10
#4  0x0000ffffb392e190 - 1 ___pthread_cond_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:618:10
#5  0x0000ffff9c4c35ec - 1 cnd_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:135:13
#6  0x0000ffff9ca24324 - 1 lp_cs_tpool_worker - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/gallium/drivers/llvmpipe/lp_cs_tpool.c:49:10
#7  0x0000ffff9c4c34fc - 1 impl_thrd_routine - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:67:29
#8  0x0000ffffb392edd8 - 1 start_thread - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_create.c:442:8
#9  0x0000ffffb3997e9c - 1 thread_start - /usr/lib/aarch64-linux-gnu/libc.so.6
    ../sysdeps/unix/sysv/linux/aarch64/clone.S:79
TID 8008:
#0  0x0000ffffb392b654     __futex_abstimed_wait_common64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:57:12
#1  0x0000ffffb392b654     __futex_abstimed_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:87:9
#2  0x0000ffffb392b654     __GI___futex_abstimed_wait_cancelable64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:139:10
#3  0x0000ffffb392e190 - 1 __pthread_cond_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:503:10
#4  0x0000ffffb392e190 - 1 ___pthread_cond_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:618:10
#5  0x0000ffff9c4c35ec - 1 cnd_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:135:13
#6  0x0000ffff9ca24324 - 1 lp_cs_tpool_worker - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/gallium/drivers/llvmpipe/lp_cs_tpool.c:49:10
#7  0x0000ffff9c4c34fc - 1 impl_thrd_routine - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:67:29
#8  0x0000ffffb392edd8 - 1 start_thread - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_create.c:442:8
#9  0x0000ffffb3997e9c - 1 thread_start - /usr/lib/aarch64-linux-gnu/libc.so.6
    ../sysdeps/unix/sysv/linux/aarch64/clone.S:79
TID 8009:
#0  0x0000ffffb392b654     __futex_abstimed_wait_common64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:57:12
#1  0x0000ffffb392b654     __futex_abstimed_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:87:9
#2  0x0000ffffb392b654     __GI___futex_abstimed_wait_cancelable64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:139:10
#3  0x0000ffffb392e190 - 1 __pthread_cond_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:503:10
#4  0x0000ffffb392e190 - 1 ___pthread_cond_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:618:10
#5  0x0000ffff9c4c35ec - 1 cnd_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:135:13
#6  0x0000ffff9ca24324 - 1 lp_cs_tpool_worker - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/gallium/drivers/llvmpipe/lp_cs_tpool.c:49:10
#7  0x0000ffff9c4c34fc - 1 impl_thrd_routine - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:67:29
#8  0x0000ffffb392edd8 - 1 start_thread - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_create.c:442:8
#9  0x0000ffffb3997e9c - 1 thread_start - /usr/lib/aarch64-linux-gnu/libc.so.6
    ../sysdeps/unix/sysv/linux/aarch64/clone.S:79
TID 8010:
#0  0x0000ffffb392b654     __futex_abstimed_wait_common64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:57:12
#1  0x0000ffffb392b654     __futex_abstimed_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:87:9
#2  0x0000ffffb392b654     __GI___futex_abstimed_wait_cancelable64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:139:10
#3  0x0000ffffb392e190 - 1 __pthread_cond_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:503:10
#4  0x0000ffffb392e190 - 1 ___pthread_cond_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:618:10
#5  0x0000ffff9c4c35ec - 1 cnd_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:135:13
#6  0x0000ffff9ca24324 - 1 lp_cs_tpool_worker - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/gallium/drivers/llvmpipe/lp_cs_tpool.c:49:10
#7  0x0000ffff9c4c34fc - 1 impl_thrd_routine - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:67:29
#8  0x0000ffffb392edd8 - 1 start_thread - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_create.c:442:8
#9  0x0000ffffb3997e9c - 1 thread_start - /usr/lib/aarch64-linux-gnu/libc.so.6
    ../sysdeps/unix/sysv/linux/aarch64/clone.S:79
TID 8011:
#0  0x0000ffffb392b654     __futex_abstimed_wait_common64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:57:12
#1  0x0000ffffb392b654     __futex_abstimed_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:87:9
#2  0x0000ffffb392b654     __GI___futex_abstimed_wait_cancelable64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:139:10
#3  0x0000ffffb392e190 - 1 __pthread_cond_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:503:10
#4  0x0000ffffb392e190 - 1 ___pthread_cond_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:618:10
#5  0x0000ffff9c4c35ec - 1 cnd_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:135:13
#6  0x0000ffff9ca24324 - 1 lp_cs_tpool_worker - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/gallium/drivers/llvmpipe/lp_cs_tpool.c:49:10
#7  0x0000ffff9c4c34fc - 1 impl_thrd_routine - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:67:29
#8  0x0000ffffb392edd8 - 1 start_thread - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_create.c:442:8
#9  0x0000ffffb3997e9c - 1 thread_start - /usr/lib/aarch64-linux-gnu/libc.so.6
    ../sysdeps/unix/sysv/linux/aarch64/clone.S:79
TID 8012:
#0  0x0000ffffb392b654     __futex_abstimed_wait_common64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:57:12
#1  0x0000ffffb392b654     __futex_abstimed_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:87:9
#2  0x0000ffffb392b654     __GI___futex_abstimed_wait_cancelable64 - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/futex-internal.c:139:10
#3  0x0000ffffb392e190 - 1 __pthread_cond_wait_common - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:503:10
#4  0x0000ffffb392e190 - 1 ___pthread_cond_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_cond_wait.c:618:10
#5  0x0000ffff9c4c35ec - 1 cnd_wait - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:135:13
#6  0x0000ffff9c481794 - 1 util_queue_thread_func - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/util/u_queue.c:290:10
#7  0x0000ffff9c4c34fc - 1 impl_thrd_routine - /usr/lib/aarch64-linux-gnu/dri/armada-drm_dri.so
    ../src/c11/impl/threads_posix.c:67:29
#8  0x0000ffffb392edd8 - 1 start_thread - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_create.c:442:8
#9  0x0000ffffb3997e9c - 1 thread_start - /usr/lib/aarch64-linux-gnu/libc.so.6
    ../sysdeps/unix/sysv/linux/aarch64/clone.S:79
TID 8013:
#0  0x0000ffffb392b93c     futex_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ../sysdeps/nptl/futex-internal.h:146:13
#1  0x0000ffffb392b93c     __GI___lll_lock_wait - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/lowlevellock.c:49:7
#2  0x0000ffffb3931f10 - 1 lll_mutex_lock_optimized - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_mutex_lock.c:48:5
#3  0x0000ffffb3931f10 - 1 ___pthread_mutex_lock - /usr/lib/aarch64-linux-gnu/libc.so.6
    ./nptl/pthread_mutex_lock.c:93:7
#4  0x0000ffffb3bf4040 - 1 malloc_mutex_lock_final - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    include/jemalloc/internal/mutex.h:151:2
#5  0x0000ffffb3bf4040 - 1 je_malloc_mutex_lock_slow - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/mutex.c:90:2
#6  0x0000ffffb3ba9a68 - 1 malloc_mutex_lock - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    include/jemalloc/internal/mutex.h:217:4
#7  0x0000ffffb3ba9a68 - 1 je_arena_choose_hard - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/jemalloc.c:534:3
#8  0x0000ffffb3c01f10 - 1 arena_choose_impl - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    include/jemalloc/internal/jemalloc_internal_inlines_b.h:46:9
#9  0x0000ffffb3c01f10 - 1 arena_choose_impl - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    include/jemalloc/internal/jemalloc_internal_inlines_b.h:32:1
#10 0x0000ffffb3c01f10 - 1 arena_choose - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    include/jemalloc/internal/jemalloc_internal_inlines_b.h:88:9
#11 0x0000ffffb3c01f10 - 1 je_tsd_tcache_data_init - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/tcache.c:740:11
#12 0x0000ffffb3c02198 - 1 je_tsd_tcache_enabled_data_init - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/tcache.c:644:3
#13 0x0000ffffb3c03a0c - 1 tsd_data_init - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/tsd.c:244:9
#14 0x0000ffffb3c03a0c - 1 je_tsd_fetch_slow - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/tsd.c:297:5
#15 0x0000ffffb3baa028 - 1 tsd_fetch_impl - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    include/jemalloc/internal/tsd.h:422:10
#16 0x0000ffffb3baa028 - 1 tsd_fetch - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    include/jemalloc/internal/tsd.h:448:9
#17 0x0000ffffb3baa028 - 1 imalloc - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/jemalloc.c:2681:15
#18 0x0000ffffb3baa028 - 1 je_malloc_default - /usr/lib/aarch64-linux-gnu/libjemalloc.so.2
    src/jemalloc.c:2722:2
#19 0x0000ffffb2115154 - 1 - /host_virtiofs/Users/pgalindo3/github/memray/src/memray/_memray.cpython-311-aarch64-linux-gnu.so
#20 0x0000ffffb2115154 - 1 - /host_virtiofs/Users/pgalindo3/github/memray/src/memray/_memray.cpython-311-aarch64-linux-gnu.so
#21 0x0000ffffb3cd9698 - 1 malloc - /usr/lib/aarch64-linux-gnu/ld-linux-aarch64.so.1
    ../include/rtld-malloc.h:56:10
#22 0x0000ffffb3cd9698 - 1 allocate_dtv_entry - /usr/lib/aarch64-linux-gnu/ld-linux-aarch64.so.1
    ./elf/dl-tls.c:684:19
#23 0x0000ffffb3cd9698 - 1 allocate_and_init - /usr/lib/aarch64-linux-gnu/ld-linux-aarch64.so.1
    ./elf/dl-tls.c:709:31
#24 0x0000ffffb3cd9698 - 1 tls_get_addr_tail - /usr/lib/aarch64-linux-gnu/ld-linux-aarch64.so.1
    ./elf/dl-tls.c:907:31
#25 0x0000ffff8d9c6afc - 1
#26 0x0000ffff8d9c6afc - 1
eu-stack: dwfl_thread_getframes tid 8013 at 0xffff8d9c6afb in <unknown>: No DWARF information found

It seems there are at least two threads allocating, but the calls don't look re-entrant (I'm assuming the normal call path runs from os_pages_map through je_malloc_default and we're just in the middle of it). Maybe I'm missing something and we are still malloc-ing inside memray somehow.

pablogsal commented 1 year ago

It seems this is enough to solve the problem:

diff --git a/src/memray/_memray/elf_shenanigans.cpp b/src/memray/_memray/elf_shenanigans.cpp
index 98ac32a..99221db 100644
--- a/src/memray/_memray/elf_shenanigans.cpp
+++ b/src/memray/_memray/elf_shenanigans.cpp
@@ -166,7 +166,7 @@ phdrs_callback(dl_phdr_info* info, [[maybe_unused]] size_t size, void* data) noe
         patched.insert(info->dlpi_name);
     }

-    if (strstr(info->dlpi_name, "/ld-linux") || strstr(info->dlpi_name, "linux-vdso.so.1")) {
+    if (strstr(info->dlpi_name, "/ld-linux") || strstr(info->dlpi_name, "linux-vdso.so.1") || strstr(info->dlpi_name, "jemalloc")) {
         // Avoid chaos by not overwriting the symbols in the linker.
         // TODO: Don't override the symbols in our shared library!
         return 0;
diff --git a/src/memray/_memray/hooks.cpp b/src/memray/_memray/hooks.cpp
index b46a037..f388cdc 100644
--- a/src/memray/_memray/hooks.cpp
+++ b/src/memray/_memray/hooks.cpp
@@ -165,8 +165,11 @@ void*
 malloc(size_t size) noexcept
 {
     assert(hooks::malloc);
-
-    void* ptr = hooks::malloc(size);
+    void* ptr;
+    {
+        tracking_api::RecursionGuard guard;
+        ptr = hooks::malloc(size);
+    }
     tracking_api::Tracker::trackAllocation(ptr, size, hooks::Allocator::MALLOC);
     return ptr;
 }

I am still unsure how exactly the re-entrancy is happening because my gdb is busted :(

godlygeek commented 1 year ago

Here's the stack that we're deadlocking at on x86-64: https://gist.github.com/godlygeek/4cf3924b3d2be95f69a670f93672f0b1

Two threads are in jemalloc; the one that caused the deadlock is probably Thread 3.

jayaddison commented 1 year ago

Well, seems that I cannot debug this on my aarch64 laptop :sweat:

A problem internal to GDB has been detected,
further debugging may prove unreliable.
----- Backtrace -----
0xaaaabe679b9b ???
0xaaaabe9bcefb ???
0xaaaabe9bd0e3 ???
0xaaaabeb64873 ???
0xaaaabe976ebb ???
0xaaaabe8cf49b ???
0xaaaabe5ca2bf ???
0xaaaabe971a37 ???
0xaaaabe971f47 ???
0xaaaabe971ff7 ???
0xaaaabe8782d7 ???
0xffff914f5387 _td_fetch_value
  ./nptl_db/fetch-value.c:115
0xffff914f230f td_ta_map_lwp2thr
  ./nptl_db/td_ta_map_lwp2thr.c:194
0xaaaabe807d8f ???
0xaaaabe809347 ???
0xaaaabe973e73 ???
0xaaaabe7d081f ???
0xaaaabe7dc033 ???
0xaaaabeb64d03 ???
0xaaaabeb657f7 ???
0xaaaabe980787 ???
0xaaaabe980a6b ???
0xaaaabe818dcb ???
0xaaaabe818f1b ???
0xaaaabe81acff ???
0xaaaabe81b733 ???
0xaaaabe5c1183 ???
0xffff9883777f __libc_start_call_main
  ../sysdeps/nptl/libc_start_call_main.h:58
0xffff98837857 __libc_start_main_impl
  ../csu/libc-start.c:381
0xaaaabe5c73af ???
0xffffffffffffffff ???
---------------------
/build/gdb-yCDzia/gdb-13.1/gdb/thread.c:85: internal-error: inferior_thread: Assertion `current_thread_ != nullptr' failed.
A problem internal to GDB has been detected,
further debugging may prove unreliable.
Quit this debugging session? (y or n) [answered Y; input not from terminal]

Hi @pablogsal - this question is off-topic for memray and PyQt, unfortunately, but I've encountered a similar stack trace on a Debian accessibility thread and was wondering how you'd interpret it.

(that thread is also about a potential process/threading deadlock situation, but as far as I can tell, jemalloc isn't in use there)