Yeah, RAM is an issue. I scrapped multithreaded reading because the way I initially made it ate all the RAM immediately.
I also get the same problem when trying it with a 5 GB tar.gz file. After using all the system memory, the process just gets killed.
Would attaching a RAM limiter to the executable resolve the problem?
I ran into a similar issue with a Python script I'd written, where I tried that. Mine was generating patterns from a word list, and the solution I found was actually to have it write out the buffer every 100 entries it generated, which kept memory down. When I tried a RAM limiter, it just stopped buffering into RAM, so I'd guess this would have a similar issue. I'm not familiar with Rust yet, otherwise I'd see if I could replicate it in a fork. But it might be as simple as having it check the number of frames every time it loops in the for-each, and if it's more than 500 or so, write them to the file, wipe the buffer, and go back to the loop (something like the sketch below).
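For what it's worth, a minimal sketch of that flush-every-500-frames idea in Rust; `encode_frame`, the chunk size, and the output format here are hypothetical stand-ins for the project's real pixel-packing logic:

```rust
use std::fs::OpenOptions;
use std::io::{BufWriter, Write};

const FLUSH_THRESHOLD: usize = 500; // frames held in RAM before flushing

// Placeholder for the real encoding step that turns a chunk of input
// bytes into one frame's worth of output bytes.
fn encode_frame(chunk: &[u8]) -> Vec<u8> {
    chunk.to_vec()
}

fn embed(data: &[u8], out_path: &str) -> std::io::Result<()> {
    let file = OpenOptions::new().create(true).append(true).open(out_path)?;
    let mut writer = BufWriter::new(file);
    let mut frames: Vec<Vec<u8>> = Vec::new();

    for chunk in data.chunks(1024) {
        frames.push(encode_frame(chunk));
        // Once the buffer reaches the threshold, drain it to disk so
        // memory use stays bounded no matter how large the input is.
        if frames.len() >= FLUSH_THRESHOLD {
            for frame in frames.drain(..) {
                writer.write_all(&frame)?;
            }
        }
    }
    // Write out whatever is left after the loop ends.
    for frame in frames.drain(..) {
        writer.write_all(&frame)?;
    }
    writer.flush()
}
```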
I also wonder if there's a use case for bringing back the good ole multi-part archives and then choosing the series of archives: have it do 1 GB at a time by splitting the archive into 1 GB chunks, then just append the video outputs at the end (a rough sketch of the splitting step is below).
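A rough sketch of just the splitting step, assuming a hypothetical `.partN` naming scheme (the encode-and-concatenate steps are left out):

```rust
use std::fs::File;
use std::io::{Read, Write};

const PART_SIZE: usize = 1024 * 1024 * 1024; // 1 GiB per part

// Split the input into PART_SIZE pieces; returns the part file names.
fn split_into_parts(input_path: &str) -> std::io::Result<Vec<String>> {
    let mut input = File::open(input_path)?;
    let mut read_buf = vec![0u8; 64 * 1024 * 1024]; // 64 MiB read buffer
    let mut parts = Vec::new();
    let mut part_index = 0usize;
    let mut written = 0usize;
    let mut part = new_part(input_path, part_index, &mut parts)?;

    loop {
        let n = input.read(&mut read_buf)?;
        if n == 0 {
            break; // end of input
        }
        let mut offset = 0;
        while offset < n {
            // Roll over to the next part once the current one is full.
            if written == PART_SIZE {
                part_index += 1;
                written = 0;
                part = new_part(input_path, part_index, &mut parts)?;
            }
            let take = (PART_SIZE - written).min(n - offset);
            part.write_all(&read_buf[offset..offset + take])?;
            offset += take;
            written += take;
        }
    }
    Ok(parts)
}

fn new_part(base: &str, index: usize, parts: &mut Vec<String>) -> std::io::Result<File> {
    let name = format!("{base}.part{index}");
    parts.push(name.clone());
    File::create(&name)
}
```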
https://github.com/guymarshall/password_generator/blob/master/src/main.rs
In this file, on lines 25-27, I check whether the data has reached a certain size and, if so, dump it to storage. Feel free to use my code and adapt it for your program to prevent runaway RAM usage!
It's a dirty fix, but if you're running Linux on a VM you can allocate a section of a drive as Linux swap and get around the problem.
That could work! But to make the application as cross-platform and portable as possible, would it be better to build a garbage-collector of sorts instead of relying on any one OS feature? Good idea though!
I'm not sure how much use this will prove. However, I did decide to try and force it not to overflow, so as to prevent a crash, by tweaking how my distro handles memory as a whole.
I got this error:
thread '<unnamed>' panicked at 'Failed to create new Mat: Error { code: -4, message: "OpenCV(4.7.0) /usr/src/debug/opencv/opencv-4.7.0/modules/core/src/alloc.cpp:73: error: (-4:Insufficient memory) Failed to allocate 2764800 bytes in function 'OutOfMemoryError'\n" }', src/embedsource.rs:20:14
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
thread '<unnamed>' panicked at 'Failed to create new Mat: Error { code: -4, message: "OpenCV(4.7.0) /usr/src/debug/opencv/opencv-4.7.0/modules/core/src/alloc.cpp:73: error: (-4:Insufficient memory) Failed to allocate 2764800 bytes in function 'OutOfMemoryError'\n" }', src/embedsource.rs:20:14
thread '<unnamed>' panicked at 'Failed to create new Mat: Error { code: -4, message: "OpenCV(4.7.0) /usr/src/debug/opencv/opencv-4.7.0/modules/core/src/alloc.cpp:73: error: (-4:Insufficient memory) Failed to allocate 2764800 bytes in function 'OutOfMemoryError'\n" }', src/embedsource.rs:20:14
thread '<unnamed>' panicked at 'Failed to create new Mat: Error { code: -4, message: "OpenCV(4.7.0) /usr/src/debug/opencv/opencv-4.7.0/modules/core/src/alloc.cpp:73: error: (-4:Insufficient memory) Failed to allocate 2764800 bytes in function 'OutOfMemoryError'\n" }', src/embedsource.rs:20:14
thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: Any { .. }', src/etcher.rs:517:41
Etching video ended in 27783ms
When trying to encode larger files, the RAM usage seems to continually increase until the computer locks up and the program eventually crashes. Would it be possible to handle larger files by writing the output frames in specified chunk sizes, appending them to the final file? (A rough sketch of what I mean follows at the end of this comment.)
Test file: archlinux-2023.02.01-x86_64.iso
System Details: Kernel: 6.1.12-arch1-1 arch: x86_64 bits: 64 RAM: 64 GiB
Otherwise this is amazing! And it seems others have already made the same suggestions I was thinking of. Keep up the excellent work!
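A minimal sketch of that streaming idea, assuming a recent version of the `opencv` crate (one where `VideoWriter::fourcc` takes `char` arguments); `chunk_to_frame`, the output name, and the frame size/fps are placeholders, not the project's actual API:

```rust
use opencv::core::{Mat, Scalar, Size, CV_8UC3};
use opencv::prelude::*;
use opencv::videoio::VideoWriter;

// Placeholder: the real code would pack the chunk's bits into pixels.
fn chunk_to_frame(chunk: &[u8], size: Size) -> opencv::Result<Mat> {
    let _ = chunk;
    Mat::new_rows_cols_with_default(size.height, size.width, CV_8UC3, Scalar::all(0.0))
}

fn stream_encode(chunks: impl Iterator<Item = Vec<u8>>) -> opencv::Result<()> {
    let size = Size::new(1280, 720);
    let fourcc = VideoWriter::fourcc('m', 'p', '4', 'v')?;
    let mut writer = VideoWriter::new("output.mp4", fourcc, 30.0, size, true)?;

    for chunk in chunks {
        let frame = chunk_to_frame(&chunk, size)?;
        // Each frame is written and dropped immediately, so memory use
        // stays flat regardless of how large the input file is.
        writer.write(&frame)?;
    }
    writer.release()?;
    Ok(())
}
```

Because each `Mat` goes out of scope right after `write`, peak memory stays at roughly one frame plus the encoder's own buffers, rather than growing with the input.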