Closed pseeth closed 2 years ago
Looks good to me. This may not be within the scope of this PR, but there are cases in which it may be beneficial to fix a transform across all batch indices (e.g. have `Choose` pick the same effect for each batch item, or have `RoomImpulseResponse` apply the same impulse response to all batch items). This could be accomplished by a constructor argument in `BaseTransform` that, if `True`, would result in all states given to `.batch_instantiate()` being deterministically replaced with the first state in the batch. This would allow toggling batch-randomness on/off for individual transforms, including `Choose`.
```python
class BaseTransform:
    def __init__(self, keys: list = [], name: str = None, prob: float = 1.0, batch_randomize: bool = True):
        ...

    def batch_instantiate(
        self,
        states: list,
        signal: AudioSignal = None,
    ):
        if not self.batch_randomize:
            # Replace every state with the first one, so every batch
            # item is instantiated identically.
            states = [states[0]] * len(states)
        kwargs = []
        for state in states:
            kwargs.append(self.instantiate(state, signal))
        kwargs = util.collate(kwargs)
        return kwargs
```
I believe this means the modified states would then propagate down the hierarchy of any composed transforms within the transform in question, which seems like reasonable behavior. This way, all future transforms could be written with the new `Choose` approach of randomizing across batch indices, with the constructor argument giving the option to fix transforms across batch indices.
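To make the proposed semantics concrete, here is a minimal, self-contained sketch (toy `instantiate` that just draws one number per state; all names here are hypothetical, not the library's API):

```python
import numpy as np

def batch_instantiate(states, batch_randomize=True):
    """Toy model: each 'instantiation' draws one value from its state (a seed)."""
    if not batch_randomize:
        # Every batch item shares the first state, so all items
        # instantiate to the same parameters.
        states = [states[0]] * len(states)
    return [int(np.random.RandomState(seed).randint(100)) for seed in states]

seeds = [0, 1, 2, 3]
varied = batch_instantiate(seeds)                        # per-item randomness
fixed = batch_instantiate(seeds, batch_randomize=False)  # identical across the batch
```

With `batch_randomize=False`, `fixed` contains one repeated value, which is the "same effect for each batch item" behavior described above.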
Hmm, I think it's a good idea, but it should be a separate PR. `batch_instantiate` is a helper function, but if you look in `CSVDataset`, the transforms are actually instantiated on separate threads and then collated together afterwards.
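For context, the threads-then-collate pattern being described can be sketched roughly like this (the `instantiate` and `collate` stand-ins below are hypothetical simplifications, not `CSVDataset`'s actual code):

```python
from concurrent.futures import ThreadPoolExecutor

def instantiate(state, signal=None):
    # Stand-in for transform.instantiate(): returns a kwargs dict for one item.
    return {"seed": state}

def collate(list_of_kwargs):
    # Stand-in for util.collate(): merge a list of dicts into a dict of lists.
    return {k: [d[k] for d in list_of_kwargs] for k in list_of_kwargs[0]}

states = [10, 11, 12]
with ThreadPoolExecutor() as pool:
    kwargs = collate(list(pool.map(instantiate, states)))
# kwargs == {"seed": [10, 11, 12]}
```

Because each item is instantiated independently before collation, a batch-level "use the first state everywhere" switch would have to be coordinated across those threads, which is why it doesn't drop cleanly into `batch_instantiate` alone.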
This adds two things:
And then some fixes:
- `hanning` got removed from scipy 1.9.0, as it was just an alias for `hann`. That broke some code, so I took out all references to `hanning` and replaced them with `hann` in the codebase.
- The `profile_transforms` script needed some tweaks to accommodate `GlobalVolumeNorm`, as well as excluding `Repeat` and `RepeatUpTo`.
- Because of the changes to `Choose`, the regression data for `RepeatUpTo` had to be fixed.
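For reference, the `hanning` → `hann` rename is a straight substitution, since the removed name was only an alias:

```python
from scipy.signal.windows import hann  # the `hanning` alias was removed in scipy 1.9.0

# Equivalent to the old scipy.signal.hanning(5): a symmetric Hann window.
w = hann(5)  # [0.0, 0.5, 1.0, 0.5, 0.0]
```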