Currently, during the generation of IQ samples for OFDM signals, the generated signal can be much longer than `num_iq_samples`. This is handled by truncating the IQ samples before returning them. However, depending on the bandwidth of the OFDM signal, this can lead to unnecessary memory/CPU use during generation, since the pre-truncation signal length grows roughly with the inverse of the relative bandwidth. Given the current hardcoded minimum bandwidth of 0.2, the OFDM signal generator can use up to 5x more memory than required.
I've been experimenting with generating signals with a much larger number of IQ samples and smaller relative bandwidths, and in that case the excess memory/CPU use is far larger.
Here I have fixed the issue in the same way it is avoided elsewhere (e.g. `ChirpSSDataset._generate_samples()` or `FSKBasebandModulator()`): by limiting the number of symbols generated to a value that yields a final signal length only slightly longer than `num_iq_samples`. A minimal sketch of the idea is shown below.
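As a rough illustration of the approach (not the exact diff), the symbol count can be derived from the requested output length and the per-symbol sample count instead of a fixed, worst-case count. The names `num_symbols_needed`, `samples_per_symbol`, `fft_size`, and `cp_len` below are illustrative, not the actual fields in the generator:

```python
import numpy as np

def num_symbols_needed(num_iq_samples: int, samples_per_symbol: int) -> int:
    """Smallest symbol count whose modulated length is >= num_iq_samples.

    samples_per_symbol is the per-OFDM-symbol length after the IFFT and
    cyclic-prefix insertion (illustrative name, not the real attribute).
    """
    # Round up so the generated signal is only slightly longer than
    # num_iq_samples; the small surplus is removed by the existing truncation.
    return int(np.ceil(num_iq_samples / samples_per_symbol))

# Hypothetical example: instead of generating enough symbols to cover
# num_iq_samples at the minimum bandwidth (0.2), i.e. up to ~5x too many
# samples, the cap keeps the overhead below one symbol.
num_iq_samples = 4096
fft_size, cp_len = 64, 16                  # hypothetical OFDM parameters
samples_per_symbol = fft_size + cp_len
symbols = num_symbols_needed(num_iq_samples, samples_per_symbol)
iq = np.zeros(symbols * samples_per_symbol, dtype=np.complex64)  # stand-in for modulation
iq = iq[:num_iq_samples]                   # existing truncation still applies
```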