Open skinkie opened 4 years ago
Can you test the new version 0.1.5? On a 46MB (10MB gzipped) file, I'm getting 30MB memory usage now (using time -v <command>, Maximum resident set size).
Alternatively, send me the file (or a similarly sized one) so I can test it locally.
thread 'main' panicked at 'called Result::unwrap() on an Err value: ParseIntError { kind: Overflow }', /home/skinkie/.cargo/registry/src/github.com-1ecc6299db9ec823/sdr-heatmap-0.1.5/src/lib.rs:45:24
note: run with RUST_BACKTRACE=1 environment variable to display a backtrace
Obviously I can provide you the file somewhere!
Was your scan above ~4.2GHz?
6GHz ;)
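The ~4.2 GHz cutoff matches u32::MAX (4,294,967,295), so the crash is consistent with the frequency column being parsed as a 32-bit integer. A minimal sketch (not the actual sdr-heatmap code) reproducing the failure and the u64 fix:

```rust
fn main() {
    // hackrf_sweep writes frequencies in Hz; 6 GHz exceeds u32::MAX (~4.29e9).
    let hz = "6000000000";

    // Parsing into u32 fails with ParseIntError { kind: Overflow }:
    assert!(hz.parse::<u32>().is_err());

    // Parsing into u64 covers the HackRF's full sweep range:
    let freq: u64 = hz.parse().unwrap();
    println!("parsed {} Hz", freq);
}
```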
fedab33fbc636181c8d299be4812b3c047e75f66 should fix it. Can you test the latest master? If it doesn't work just upload the file (e.g. https://transfer.sh/) so I can test the changes myself immediately.
The file is too big for transfer.sh. How would I do the cargo thing for master?
Transfer.sh should handle up to 10GB. Or mega.nz for up to 50GB, but you need to create an account.
If you're using cargo install, then you can do: cargo install --git https://github.com/j2ghz/sdr-heatmap.git --rev fedab33 -f
I am trying to visualize a 6GB file (1.6GB gzip-compressed) that was generated by hackrf_sweep. Granted, the Python code was limited to a single thread (and I never saw it finish rendering), while this application does show multicore performance, but it instantly consumes all my RAM (16GB), and I fail to see why that memory usage is justifiable. The CSV file is an ASCII representation of "floats", but the actual dynamic range of the values could be represented in only two bytes per value. Could you put some effort towards a better RAM ratio?
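As a rough illustration of the two-bytes-per-value idea (a sketch, not the library's actual storage format): hackrf_sweep power readings span roughly -100 to 0 dB, so encoding them as i16 centi-dB keeps 0.01 dB resolution in 2 bytes instead of the 4 or 8 bytes of an f32/f64:

```rust
// Hypothetical compaction scheme: encode a dB reading as 16-bit
// fixed-point (centi-dB), halving memory versus f32.
fn pack_db(db: f32) -> i16 {
    (db * 100.0).round().clamp(i16::MIN as f32, i16::MAX as f32) as i16
}

fn unpack_db(v: i16) -> f32 {
    v as f32 / 100.0
}

fn main() {
    let reading = -73.52_f32; // a typical hackrf_sweep power value in dB
    let packed = pack_db(reading);
    // Round-trip error stays below half the 0.01 dB quantization step:
    assert!((unpack_db(packed) - reading).abs() < 0.005);
    println!("{} dB stored in {} bytes", reading, std::mem::size_of::<i16>());
}
```

Values outside the representable range are clamped rather than wrapped, which is usually acceptable for a heatmap's color scale.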