Closed saraalemadi closed 5 years ago
You'll want to go through it in batches. Pseudocode:
BATCH_SIZE = 50
NUM_GEN = 1000000
for i in range(0, NUM_GEN, BATCH_SIZE):
    _z = (np.random.rand(BATCH_SIZE, 100) * 2.) - 1.
    _G_z = ...
    for j in range(BATCH_SIZE):
        clip = _G_z[j]
        librosa.output.write_wav(...)
Hi @chrisdonahue,
I am trying to generate a very large dataset (1M audio clips) from a trained WaveGAN model. I was able to incorporate a for loop to generate 4 audio clips; however, I noticed that when I increase that to a million, I get an error related to this line specifically in generate.py:
_z = (np.random.rand(64, 100) * 2.) - 1.
Any tips on how to resolve this issue?
Thanks, Sara
Full generate.py code with my modifications for your reference: