Open plasmaphase opened 3 months ago
Noisereduce iterates over chunks, so length doesn't matter. It looks like it thinks your data has 268416000 channels, which probably means you just need to transpose the input data.
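A minimal sketch of the transpose fix, assuming the file was loaded with `scipy.io.wavfile.read` (which returns a `(n_samples, n_channels)` array, while noisereduce reads axis 0 as channels). A small zero array stands in for the real wav data so the example is self-contained:

```python
import numpy as np

# Hypothetical stand-in for what scipy.io.wavfile.read returns:
# a (n_samples, n_channels) array -- here 10 s of silent stereo audio.
rate = 44100
data = np.zeros((rate * 10, 2), dtype=np.float32)

# Passed as-is, axis 0 (the 441000 samples) would be treated as channels.
print(data.shape)    # (441000, 2)

# Transposing gives the (n_channels, n_samples) layout instead.
print(data.T.shape)  # (2, 441000)

# With the real library, the call would then be:
# reduced_noise = nr.reduce_noise(y=data.T, sr=rate)
```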
Tim Sainburg (https://timsainburg.com/), Postdoctoral Fellow, Harvard Medical School
Is there a way this could be used on larger wav files without needing so much memory? I'm processing 2 GB wav files and I get this error:
File "/mnt/vault/Audio/remnoise.py", line 6, in <module>
    reduced_noise = nr.reduce_noise(y=data, sr=rate)
File "/home/tron/.local/lib/python3.10/site-packages/noisereduce/noisereduce.py", line 186, in reduce_noise
    return sg.get_traces()
File "/home/tron/.local/lib/python3.10/site-packages/noisereduce/spectralgate/base.py", line 222, in get_traces
    filtered_chunk = self.filter_chunk(start_frame=0, end_frame=end_frame)
File "/home/tron/.local/lib/python3.10/site-packages/noisereduce/spectralgate/base.py", line 148, in filter_chunk
    padded_chunk = self._read_chunk(i1, i2)
File "/home/tron/.local/lib/python3.10/site-packages/noisereduce/spectralgate/base.py", line 140, in _read_chunk
    chunk = np.zeros((self.n_channels, i2 - i1))
numpy.core._exceptions._ArrayMemoryError: Unable to allocate 117. TiB for an array with shape (268416000, 60002) and data type float64
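The allocation size follows directly from the shape in the traceback: 268416000 "channels" (actually samples, since the array wasn't transposed) times a 60002-frame chunk, at 8 bytes per float64 value, is roughly 117 TiB:

```python
# Shape taken from the traceback: np.zeros((self.n_channels, i2 - i1))
n_channels, n_frames = 268416000, 60002
bytes_needed = n_channels * n_frames * 8   # float64 is 8 bytes per value
print(f"{bytes_needed / 2**40:.0f} TiB")   # -> 117 TiB
```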