cpsdqs / cohost-dl

download cohost onto your computer
https://cloudwithlightning.net/random/chostin/cohost-dl/
MIT License

Deno ran out of JS heap memory generating index for all posts #24

Closed · Anonymous1157 closed this issue 1 month ago

Anonymous1157 commented 1 month ago

I'm not really sure if this is A Me Problem, a bug in Deno, a bug in cohost-dl, or what. I think the archival process was very close to finished when it failed, so I may have captured all the important data already, but the archive will be annoying to use without the archive global feed!

System is Gentoo with Deno from cargo install deno.

generating index for all posts
that’s 41747 posts

<--- Last few GCs --->

[57397:0x560dd70c1000] 33353780 ms: Scavenge 1376.5 (1382.5) -> 1374.6 (1381.7) MB, pooled: 0 MB, 1.17 / 0.00 ms  (average mu = 0.758, current mu = 0.542) allocation failure; 
[57397:0x560dd70c1000] 33353907 ms: Mark-Compact 1423.4 (1429.6) -> 1414.3 (1423.4) MB, pooled: 6 MB, 89.27 / 0.00 ms  (average mu = 0.759, current mu = 0.760) allocation failure; scavenge might not succeed

<--- JS stacktrace --->

#
# Fatal JavaScript out of memory: Reached heap limit
#
==== C stack trace ===============================

    deno(+0x4ed2f53) [0x560da06ecf53]
    deno(+0x4524aeb) [0x560d9fd3eaeb]
    deno(+0x4522178) [0x560d9fd3c178]
    deno(+0x45564cc) [0x560d9fd704cc]
    deno(+0x4674ef7) [0x560d9fe8eef7]
    deno(+0x46730a3) [0x560d9fe8d0a3]
    deno(+0x4669ca1) [0x560d9fe83ca1]
    deno(+0x466a772) [0x560d9fe84772]
    deno(+0x464ffda) [0x560d9fe69fda]
    deno(+0x502fe42) [0x560da0849e42]
    deno(+0x4d93d36) [0x560da05add36]
./run.sh: line 3: 57397 Trace/breakpoint trap   deno run --allow-env --allow-ffi --allow-net --allow-read --allow-write=out main.ts
cpsdqs commented 1 month ago

You can give it more memory with e.g. --v8-flags=--max-old-space-size=8192 for 8 GB. I don’t know if the archive global feed will have usable performance with so many posts though…
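For context, here is how that flag would slot into the run.sh wrapper. This is a sketch: the exact contents of run.sh aren't shown in the thread, so the command below is reconstructed from the deno invocation visible in the crash output above, with the suggested --v8-flags option added. V8's --max-old-space-size takes a value in megabytes, so 8192 raises the old-generation heap limit to 8 GB.

```shell
#!/bin/sh
# run.sh (sketch) — command reconstructed from the crash output above.
# --v8-flags=--max-old-space-size=8192 raises V8's heap limit to 8 GB
# (the default limit is what the "Reached heap limit" abort hit).
deno run \
  --v8-flags=--max-old-space-size=8192 \
  --allow-env --allow-ffi --allow-net --allow-read --allow-write=out \
  main.ts
```

Note that --v8-flags must come before main.ts, since anything after the script path is passed to the script itself rather than to deno.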

Anonymous1157 commented 1 month ago

I simply ran it again and it finished this time? I'm not sure how or why... I did set ulimit -s 65536 on a whim, but I doubt that actually did anything. I just assumed I would be able to reproduce the error; sorry to bother!

The archive global feed is working and the performance is scary fast actually.

cpsdqs commented 1 month ago

huh, alright! then I suppose everything is good & nice in the world