denoland / deno_core

The core engine at the heart of Deno
MIT License

Fatal process out of memory: Failed to reserve virtual memory for CodeRange #916

Closed · louis030195 closed this issue 3 weeks ago

louis030195 commented 4 weeks ago
deno_core = { version = "0.311.0", optional = true }
deno_ast = { version = "0.38.2", features = ["transpiling"], optional = true }
#
# Fatal process out of memory: Failed to reserve virtual memory for CodeRange
#
==== C stack trace ===============================

    0   screenpipe                          0x00000001027cfc98 v8::base::debug::StackTrace::StackTrace() + 24
    1   screenpipe                          0x00000001027d5af0 v8::platform::(anonymous namespace)::PrintStackTrace() + 24
    2   screenpipe                          0x00000001027c6210 v8::base::FatalOOM(v8::base::OOMType, char const*) + 68
    3   screenpipe                          0x000000010282a3b8 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, v8::OOMDetails const&) + 616
    4   screenpipe                          0x00000001029eff94 v8::internal::Heap::SetStackStart() + 0
    5   screenpipe                          0x000000010295dba8 v8::internal::Isolate::Init(v8::internal::SnapshotData*, v8::internal::SnapshotData*, v8::internal::SnapshotData*, bool) + 1564
    6   screenpipe                          0x000000010295e2f8 v8::internal::Isolate::InitWithSnapshot(v8::internal::SnapshotData*, v8::internal::SnapshotData*, v8::internal::SnapshotData*, bool) + 12
    7   screenpipe                          0x0000000102de5ad0 v8::internal::Snapshot::Initialize(v8::internal::Isolate*) + 460
    8   screenpipe                          0x000000010283eea0 v8::Isolate::Initialize(v8::Isolate*, v8::Isolate::CreateParams const&) + 344
    9   screenpipe                          0x000000010283efbc v8::Isolate::New(v8::Isolate::CreateParams const&) + 32
    10  screenpipe                          0x00000001027b8360 v8::isolate::Isolate::new::h0a5c619b6084e341 + 164
    11  screenpipe                          0x000000010274d698 deno_core::runtime::setup::create_isolate::h43f7780524624353 + 724
    12  screenpipe                          0x000000010276f42c deno_core::runtime::jsruntime::JsRuntime::new_inner::h6d05ca2ba0988f22 + 7276
    13  screenpipe                          0x000000010276d4f8 deno_core::runtime::jsruntime::JsRuntime::new::h7b60c681af2638b4 + 108
    14  screenpipe                          0x0000000100fbae8c _ZN15screenpipe_core5pipes5pipes6run_js28_$u7b$$u7b$closure$u7d$$u7d$17h006fc4225d5b821eE + 904
    15  screenpipe                          0x0000000100fb9d74 futures_util::stream::stream::StreamExt::poll_next_unpin::hce92ea0f4fd90e15 + 4296
    16  screenpipe                          0x000000010110a5d4 _ZN72_$LT$core..pin..Pin$LT$P$GT$$u20$as$u20$core..future..future..Future$GT$4poll17h48170fcbfa2f0c24E + 80
    17  screenpipe                          0x0000000101180af0 _ZN88_$LT$tokio..future..poll_fn..PollFn$LT$F$GT$$u20$as$u20$core..future..future..Future$GT$4poll17hc62754d069a4f9b5E + 292
    18  screenpipe                          0x00000001010e7c60 _ZN10screenpipe4main28_$u7b$$u7b$closure$u7d$$u7d$17h524b03b0df84b9a4E + 39844
    19  screenpipe                          0x00000001010d6868 tokio::runtime::park::CachedParkThread::block_on::h5280d81c0b7b4cf6 + 212
    20  screenpipe                          0x000000010109630c tokio::runtime::context::runtime::enter_runtime::h255fb501dd3e45f3 + 376
    21  screenpipe                          0x00000001012292a0 tokio::runtime::runtime::Runtime::block_on::h2b187f4984928f06 + 112
    22  screenpipe                          0x00000001010a3f90 screenpipe::main::he886e40573f82a41 + 172
    23  screenpipe                          0x00000001010720ac std::sys::backtrace::__rust_begin_short_backtrace::h459f278f47b570a5 + 12
    24  screenpipe                          0x00000001011ed8ec _ZN3std2rt10lang_start28_$u7b$$u7b$closure$u7d$$u7d$17h500060577f2d92c3E + 24
    25  screenpipe                          0x000000010439a3f4 std::rt::lang_start_internal::hdd117cb81a316264 + 808
    26  screenpipe                          0x00000001010a4170 main + 52
    27  dyld                                0x00000001954320e0 start + 2360
Trace/BPT trap: 5

code https://github.com/mediar-ai/screenpipe/blob/main/screenpipe-core/src/pipes.rs

this is weird

when i build locally with cargo build --release it works, i don't have this issue, but when i build in github ci on a macos runner, the binary crashes with this error when i use my deno features

https://github.com/mediar-ai/screenpipe/blob/main/.github/workflows/release-app.yml https://github.com/mediar-ai/screenpipe/blob/4ea365ad2655343c350e6dd2159d287c820e0699/.github/workflows/release-app.yml#L154

only on macos, works fine on windows

do you know anything i could change?

tried bumping the version, without success

claude

here are 10 possible reasons for this issue, focusing on the differences between your local build and the github ci macos runner:

  1. memory constraints: the ci runner might have less available memory than your local machine, causing the out of memory error.

  2. deno version mismatch: there could be a version incompatibility between deno_core and other dependencies in the ci environment.

  3. macos version differences: the macos version on the runner might be different from your local machine, affecting memory management.

  4. cargo caching issues: improper caching of dependencies in the ci pipeline could lead to inconsistent builds.

  5. compiler optimizations: different optimization levels between local and ci builds might affect memory usage.

  6. environment variables: missing or different environment variables in the ci could impact the build process.

  7. third-party library conflicts: interactions between deno and other libraries might behave differently in the ci environment.

  8. architecture differences: subtle differences in cpu architecture between your local machine and the runner could affect memory allocation.

  9. ci-specific limitations: github actions might impose resource limits that aren't present locally.

  10. dependency resolution: the ci might resolve dependencies slightly differently, pulling in versions that conflict in unexpected ways.

to troubleshoot, you could try:
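One low-cost check for reason 1 (memory constraints) is a diagnostic step in the workflow itself, run before the build. This is a sketch with an illustrative step name; the commands are standard macOS tools available on the hosted runners:

```yaml
      # Hypothetical diagnostic step; place it before the build step in
      # .github/workflows/release-app.yml to see what the runner really has.
      - name: Inspect runner resources
        run: |
          sysctl hw.memsize   # total physical memory in bytes (macOS)
          vm_stat             # page-level memory statistics
          ulimit -a           # per-process resource limits for this shell
```

Comparing this output against the same commands run locally would quickly confirm or rule out a raw-memory difference between the two environments.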

liashood commented 3 weeks ago

Increase the amount of logging and debugging information in your code to accurately locate where the memory issue is. This will help in identifying specific sections that may require optimization.

try this:

jobs:
  build:
    runs-on: macos-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v2

      - name: Set up Rust
        uses: actions-rs/toolchain@v1
        with:
          toolchain: stable
          override: true

      - name: Build project
        run: cargo build --release
        env:
          RUSTFLAGS: "-C target-cpu=native"
          V8_FLAGS: "--max-old-space-size=4096"  # Adjust this value as needed
louis030195 commented 3 weeks ago

> Increase the amount of logging and debugging information in your code to accurately locate where the memory issue is. This will help in identifying specific sections that may require optimization.
>
> try this: (same workflow snippet as above)

optimisation?

my JS code is this:

console.log("hi")
louis030195 commented 3 weeks ago

@liashood

same issue with the V8_FLAGS: "--max-old-space-size=4096"

what's RUSTFLAGS: "-C target-cpu=native" for? is it necessary?

bartlomieju commented 3 weeks ago

@devsnek please take a look

devsnek commented 3 weeks ago

This backtrace doesn't make much sense to me, CodeRange allocation is not part of setting up the stack. Maybe try a debug build to get a better backtrace?
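When a crash only reproduces under release codegen, a middle ground is to keep optimizations but add debug info, so the release backtrace comes out symbolized. A sketch, assuming a standard Cargo workspace:

```toml
# Add to the top-level Cargo.toml. Keeps release optimizations (so the
# crash should still reproduce) while emitting debug info for backtraces.
[profile.release]
debug = true     # or "line-tables-only" on newer toolchains, for smaller binaries
strip = false    # make sure symbols aren't stripped from the final binary
```

Note that any post-build signing or packaging step that strips the binary would undo this, so it needs to hold for the artifact that actually crashes.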

louis030195 commented 3 weeks ago

[screenshot]

> This backtrace doesn't make much sense to me, CodeRange allocation is not part of setting up the stack. Maybe try a debug build to get a better backtrace?

does not crash in debug mode :(

louis030195 commented 3 weeks ago

so weird

/Applications/screenpipe.app/Contents/MacOS/screenpipe -> crash

~/Downloads/screenpipe -> works

Can the Fatal process out of memory: Failed to reserve virtual memory for CodeRange be related to a permissions issue with file access? @devsnek

louis030195 commented 3 weeks ago

nvm, it's not the path: i took the CI build, ran it, moved it to different paths, and it works

i sign the CLI using an apple license; i doubt this could be the issue but i'm checking

louis030195 commented 3 weeks ago

actually i'm confused, the error does not happen on my friend's computer

louis030195 commented 3 weeks ago

1. download the build pre tauri + apple signing

https://github.com/mediar-ai/screenpipe/blob/390633ab28297c547a91ffcb90c953e4679d481e/.github/workflows/release-app.yml#L198

2. run ~/Downloads/spp6

works

3. download the build post tauri + apple signing

https://github.com/mediar-ai/screenpipe/blob/390633ab28297c547a91ffcb90c953e4679d481e/.github/workflows/release-app.yml#L205

4. run /Applications/screenpipe.app/Contents/MacOS/screenpipe

crash:

[screenshot]
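The pattern above, where the pre-signing binary runs but the signed .app dies while reserving the CodeRange, is consistent with a known V8-on-macOS failure mode: under the hardened runtime, a process cannot map writable-executable (JIT) memory unless its signature carries the JIT entitlements. A sketch of an entitlements file to pass to the signing step follows; the filename is hypothetical, and whether Tauri's signing step applies the hardened runtime here is an assumption worth verifying:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<!-- entitlements.plist (hypothetical filename): pass to codesign via
     - -entitlements when signing with the hardened runtime enabled. -->
<plist version="1.0">
<dict>
    <!-- Allow V8 to create MAP_JIT executable memory. -->
    <key>com.apple.security.cs.allow-jit</key>
    <true/>
    <!-- Fallback for allocations that don't go through MAP_JIT. -->
    <key>com.apple.security.cs.allow-unsigned-executable-memory</key>
    <true/>
</dict>
</plist>
```

The effective entitlements on each binary can be compared with `codesign -d --entitlements - <path>`, run once against the working ~/Downloads copy and once against /Applications/screenpipe.app/Contents/MacOS/screenpipe.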

louis030195 commented 3 weeks ago

@devsnek are you available for a call in the upcoming hours? offering a $300 OSS bounty for this, quite urgent

https://cal.com/louis030195/abcd

i bet we can solve it quick over a call

devsnek commented 3 weeks ago

I don't know much about macOS code signing and I don't have a mac, so I can't really advise you further here.

louis030195 commented 3 weeks ago

not sure how but i fixed the issue