santigimeno opened 9 months ago
Maybe related to https://github.com/nodejs/node/issues/50725
Does this reproduce on the main branch?
Yes, it does, with a broadly similar backtrace:
Program terminated with signal SIGSEGV, Segmentation fault.
#0 0x00005610fa34b9a7 in std::__atomic_base<long>::store (__m=std::memory_order_relaxed, __i=20034711383513, this=<optimized out>) at /usr/include/c++/11/bits/atomic_base.h:477
477 __atomic_store_n(&_M_i, __i, int(__m));
[Current thread is 1 (Thread 0x7fe341534880 (LWP 492627))]
(gdb) thread apply all bt
...
Thread 2 (Thread 0x7fe33affd640 (LWP 492708)):
#0 v8::base::AsAtomicImpl<int>::SetBits<unsigned int> (mask=32768, bits=32768, addr=<optimized out>) at /usr/include/c++/11/bits/atomic_base.h:569
#1 v8::internal::MarkBit::Set<(v8::internal::AccessMode)0> (this=<optimized out>) at ../deps/v8/src/heap/marking.h:68
#2 v8::internal::Marking::GreyToBlack<(v8::internal::AccessMode)0> (markbit=..., markbit=...) at ../deps/v8/src/heap/marking.h:429
#3 v8::internal::MarkingStateBase<v8::internal::ConcurrentMarkingState, (v8::internal::AccessMode)0>::GreyToBlack (this=<optimized out>, obj=...) at ../deps/v8/src/heap/marking-state-inl.h:76
#4 v8::internal::ConcurrentMarkingVisitor::ShouldVisit (object=..., this=0x7fe33affc960) at ../deps/v8/src/heap/concurrent-marking.cc:204
#5 0x00005610f9de87a0 in v8::internal::HeapVisitor<int, v8::internal::ConcurrentMarkingVisitor>::VisitSeqOneByteString (this=0x7fe33affc960, map=..., object=...) at ../deps/v8/src/heap/objects-visiting-inl.h:119
#6 0x00005610f9df6649 in v8::internal::ConcurrentMarking::RunMajor (this=0x56110055b500, delegate=delegate@entry=0x7fe33affcc50, code_flush_mode=..., mark_compact_epoch=<optimized out>, should_keep_ages_unchanged=<optimized out>) at ../deps/v8/src/heap/concurrent-marking.cc:459
#7 0x00005610f9df767d in v8::internal::ConcurrentMarking::JobTaskMajor::Run (this=0x5611005b3d80, delegate=0x7fe33affcc50) at ../deps/v8/src/heap/concurrent-marking.cc:306
#8 0x00005610fb118535 in v8::platform::DefaultJobWorker::Run (this=0x5611005b3d50) at ../deps/v8/src/libplatform/default-job.h:147
#9 0x00005610f9756f97 in node::(anonymous namespace)::PlatformWorkerThread (data=0x5611005057b0) at ../src/node_platform.cc:43
#10 0x00007fe3415cdac3 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
#11 0x00007fe34165fa40 in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
Thread 1 (Thread 0x7fe341534880 (LWP 492627)):
#0 0x00005610fa34b9a7 in std::__atomic_base<long>::store (__m=std::memory_order_relaxed, __i=20034711383513, this=<optimized out>) at /usr/include/c++/11/bits/atomic_base.h:477
#1 std::atomic_store_explicit<long> (__m=std::memory_order_relaxed, __i=20034711383513, __a=<optimized out>) at /usr/include/c++/11/atomic:1319
#2 v8::base::Relaxed_Store (value=20034711383513, ptr=<optimized out>) at ../deps/v8/src/base/atomicops.h:316
#3 v8::base::AsAtomicImpl<long>::Relaxed_Store<unsigned long> (new_value=20034711383513, addr=<optimized out>) at ../deps/v8/src/base/atomic-utils.h:110
#4 v8::internal::TaggedField<v8::internal::Object, 0, v8::internal::V8HeapCompressionScheme>::Relaxed_Store (offset=24, value=..., host=...) at ../deps/v8/src/objects/tagged-field-inl.h:153
#5 v8::internal::TaggedField<v8::internal::Object, 0, v8::internal::V8HeapCompressionScheme>::store (value=..., offset=24, host=...) at ../deps/v8/src/objects/tagged-field-inl.h:97
#6 v8::internal::TorqueGeneratedJSCollection<v8::internal::JSCollection, v8::internal::JSObject>::set_table (this=0x7fff5cc9e980, value=..., mode=v8::internal::UPDATE_WRITE_BARRIER) at /home/sgimeno/software/node/out/Debug/obj/gen/torque-generated/src/objects/js-collection-tq-inl.inc:21
#7 0x00005610fa35c904 in v8::internal::HeapObject::RehashBasedOnMap<v8::internal::Isolate> (this=0x7fff5cc9e9b0, isolate=<optimized out>) at ../deps/v8/src/objects/objects.cc:2432
#8 0x00005610fa6546da in v8::internal::Deserializer<v8::internal::Isolate>::Rehash (this=this@entry=0x7fff5cc9eae0) at ../deps/v8/src/snapshot/deserializer.h:78
#9 0x00005610fa64dc90 in v8::internal::ContextDeserializer::Deserialize (this=this@entry=0x7fff5cc9eae0, isolate=isolate@entry=0x56110050c8e0, global_proxy=..., global_proxy@entry=..., embedder_fields_deserializer=...) at ../deps/v8/src/snapshot/context-deserializer.cc:53
#10 0x00005610fa64dfb9 in v8::internal::ContextDeserializer::DeserializeContext (isolate=isolate@entry=0x56110050c8e0, data=data@entry=0x7fff5cc9ed20, can_rehash=<optimized out>, global_proxy=global_proxy@entry=..., embedder_fields_deserializer=...) at ../deps/v8/src/snapshot/context-deserializer.cc:22
#11 0x00005610fa6876d7 in v8::internal::Snapshot::NewContextFromSnapshot (isolate=isolate@entry=0x56110050c8e0, global_proxy=..., global_proxy@entry=..., context_index=context_index@entry=3, embedder_fields_deserializer=...) at ../deps/v8/src/snapshot/snapshot.cc:208
#12 0x00005610fa026ab1 in v8::internal::Genesis::Genesis (this=0x7fff5cc9eea0, isolate=0x56110050c8e0, maybe_global_proxy=..., global_proxy_template=..., context_snapshot_index=3, embedder_fields_deserializer=..., microtask_queue=0x0) at ../deps/v8/src/init/bootstrapper.cc:6660
#13 0x00005610fa029964 in v8::internal::Bootstrapper::CreateEnvironment (this=0x561100526250, maybe_global_proxy=..., maybe_global_proxy@entry=..., global_proxy_template=..., global_proxy_template@entry=..., extensions=extensions@entry=0x7fff5cc9efe0, context_snapshot_index=context_snapshot_index@entry=3, embedder_fields_deserializer=..., microtask_queue=microtask_queue@entry=0x0) at ../deps/v8/src/init/bootstrapper.cc:338
#14 0x00005610f9a5fdc5 in v8::InvokeBootstrapper<v8::internal::Context>::Invoke (microtask_queue=0x0, embedder_fields_deserializer=..., context_snapshot_index=3, extensions=0x7fff5cc9efe0, global_proxy_template=..., maybe_global_proxy=..., i_isolate=0x56110050c8e0, this=<synthetic pointer>) at ../deps/v8/src/api/api.cc:6618
#15 v8::CreateEnvironment<v8::internal::Context> (microtask_queue=0x0, embedder_fields_deserializer=..., context_snapshot_index=3, maybe_global_proxy=..., maybe_global_template=..., extensions=0x7fff5cc9efe0, i_isolate=0x56110050c8e0) at ../deps/v8/src/api/api.cc:6722
#16 v8::NewContext (external_isolate=external_isolate@entry=0x56110050c8e0, extensions=0x7fff5cc9efe0, extensions@entry=0x0, global_template=..., global_template@entry=..., global_object=..., global_object@entry=..., context_snapshot_index=context_snapshot_index@entry=3, embedder_fields_deserializer=..., microtask_queue=microtask_queue@entry=0x0) at ../deps/v8/src/api/api.cc:6763
#17 0x00005610f9a609a4 in v8::Context::FromSnapshot (external_isolate=0x56110050c8e0, context_snapshot_index=<optimized out>, embedder_fields_deserializer=..., extensions=0x0, global_object=..., microtask_queue=0x0) at ../deps/v8/src/api/api.cc:6798
#18 0x00005610f94c532f in node::CreateEnvironment (isolate_data=0x561100566df0, context=..., args=std::vector of length 2, capacity 2 = {...}, exec_args=std::vector of length 2, capacity 2 = {...}, flags=node::EnvironmentFlags::kDefaultFlags, thread_id=..., inspector_parent_handle=std::unique_ptr<node::InspectorParentHandle> = {...}) at ../src/api/environment.cc:452
#19 0x00005610f9701342 in node::NodeMainInstance::CreateMainEnvironment (this=0x7fff5cc9f310, exit_code=0x7fff5cc9f264) at ../src/node_main_instance.cc:136
#20 0x00005610f970100b in node::NodeMainInstance::Run (this=0x7fff5cc9f310) at ../src/node_main_instance.cc:84
#21 0x00005610f95f574a in node::StartInternal (argc=4, argv=0x5611004a9400) at ../src/node.cc:1374
#22 0x00005610f95f581c in node::Start (argc=4, argv=0x7fff5cc9f548) at ../src/node.cc:1381
#23 0x00005610fb0ffb64 in main (argc=4, argv=0x7fff5cc9f548) at ../src/node_main.cc:97
Version
v20.10.0
Platform
Linux sgimeno-N8xxEZ 5.15.0-89-generic #99-Ubuntu SMP Mon Oct 30 20:42:41 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux
Subsystem
src
What steps will reproduce the bug?
Running
test/parallel/test-domain-error-types.js
under load would rarely cause this. To make the crash happen more consistently I had to run multiple instances in parallel while reducing the gc-interval:
How often does it reproduce? Is there a required condition?
Using the command above, almost always. It seems that the GC kicking in while the snapshot is being deserialized causes the crash.
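The exact command was not included above; a hypothetical sketch of the described setup (several parallel instances of the test with a reduced V8 GC interval, where the flag value, instance count, and binary path are guesses) might look like:

```shell
# Hypothetical repro driver (paths and values are placeholders):
# run several instances of the failing test in parallel while forcing
# frequent GCs via V8's --gc-interval flag, which Node.js forwards to V8.
for i in $(seq 1 8); do
  ./out/Debug/node --gc-interval=100 test/parallel/test-domain-error-types.js &
done
wait
```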
What is the expected behavior? Why is that the expected behavior?
Not crashing.
What do you see instead?
Variations of these thread backtraces can be extracted from the core dump.
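For reference, per-thread backtraces like the ones above can be pulled out of a core dump with something along these lines (the binary and core file paths are placeholders):

```shell
# Load the crashing node binary together with its core dump and print a
# backtrace for every thread, non-interactively (paths are placeholders).
gdb ./out/Debug/node ./core -batch -ex 'thread apply all bt'
```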
Additional information
No response