Currently we can filter samples for things like "If d1 was below -1 at any timestep in the sample". But when you look at that sample, d1 might actually be quite high. This makes it quite hard to flick through the set of samples and identify commonalities, since you need to look down to check that what's depicted is actually a timestep where d1 is below -1.
If there were an option (a default, preferably) that decided which timesteps are used in the filtering process, that'd be very useful. Even just two options, ts=4 (the gradient timestep) and ts=[all], would be great.
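To make the request concrete, here's a minimal sketch of what timestep-scoped filtering could look like. All names here (`filter_samples`, the `ts` parameter, samples as per-timestep d1 value lists) are hypothetical illustrations, not the tool's actual API:

```python
from typing import Iterable, List, Sequence, Union

def filter_samples(
    samples: Iterable[Sequence[float]],  # each sample: d1 value per timestep (hypothetical shape)
    threshold: float = -1.0,
    ts: Union[int, str] = "all",  # a single timestep index, or "all"
) -> List[Sequence[float]]:
    """Keep samples where d1 is below `threshold` at the chosen timestep(s)."""
    kept = []
    for d1_values in samples:
        if ts == "all":
            # current behaviour: match if d1 dips below the threshold at ANY timestep
            match = any(v < threshold for v in d1_values)
        else:
            # proposed default: only test the one timestep (e.g. ts=4)
            match = d1_values[ts] < threshold
        if match:
            kept.append(d1_values)
    return kept

samples = [
    [0.5, -1.5, 0.2, 0.1, 0.9],   # dips below -1 early, but d1 is high at ts=4
    [0.0, 0.0, 0.0, 0.0, -2.0],   # below -1 at ts=4 itself
]
print(len(filter_samples(samples, ts="all")))  # both samples match
print(len(filter_samples(samples, ts=4)))      # only the second matches
```

With `ts=4` as the default, every sample that passes the filter would actually show the condition at the depicted timestep, which is the flicking-through use case described above.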