Closed · AtMostafa closed this 2 years ago
Wouldn't calling `remove_low_firing_neurons` before z-scoring eliminate this?
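In plain numpy terms, the idea is that filtering out quiet units first means no zero-variance columns ever reach the z-scoring step. A minimal sketch (the helper below is a made-up stand-in, not `remove_low_firing_neurons`' actual signature):

```python
import numpy as np

# Hypothetical stand-in for remove_low_firing_neurons: drop units whose
# mean rate falls below a threshold, so no zero-variance columns remain.
def drop_quiet_units(rates, threshold=0.01):
    keep = rates.mean(axis=0) >= threshold
    return rates[:, keep]

rates = np.array([[0.0, 2.0, 5.0],
                  [0.0, 4.0, 7.0]])
active = drop_quiet_units(rates)  # the silent first unit is removed
z = (active - active.mean(axis=0)) / active.std(axis=0)  # no NaNs now
```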
I'm a bit uncomfortable with #108. I think I'd prefer throwing errors/warnings or dealing with NaNs rather than silently modifying the data.
You mean adding something like `remove_low_firing_neurons(df, signal, 0.01)` inside the `z_score_signal` function?
Implicitly expecting the user to have run `remove_low_firing_neurons` beforehand is not ideal, and finding the source of the error takes a long time if later analyses can't handle NaNs.
BTW, is it really modifying the data? I guess if later analyses depend on having std==1, you're right!
No, I think the z-scoring function should just do the one thing it's supposed to.
I think numpy warns you when you're dividing by zero, right? If not, we could warn the user that they have some quiet neurons.
I start every notebook with `remove_low_firing_neurons` (but then never z-score).
Leaving non-spiking neurons in gives you all-zero columns, which might mess up analyses too.
By modifying the data I mostly meant we're not applying the same transformation to every unit.
I agree with the warning, let's just leave the rest as is.
I just checked, and numpy gives this warning by default:

```
RuntimeWarning: invalid value encountered in true_divide
  trial_data[signal] = [(s - col_mean) / col_std for s in trial_data[signal]]
```
Is this okay or do we want to add another warning ourselves?
I wonder why I didn't get any warnings then... anyway, as long as there is some indication, it should be ok.
In `z_score_signal` (here), there should be a check for units without any spikes; otherwise dividing by the output of `np.std` causes NaN values and problems downstream.
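A minimal sketch of what such a check could look like, in plain numpy (the function name and warning text below are made up for illustration, not pyaldata's actual code):

```python
import warnings
import numpy as np

def z_score_columns(X):
    # Sketch of a guarded column-wise z-score; not the actual
    # z_score_signal implementation.
    col_mean = X.mean(axis=0)
    col_std = X.std(axis=0)
    quiet = col_std == 0
    if quiet.any():
        warnings.warn(
            f"{quiet.sum()} unit(s) have zero variance (no spikes?); "
            "their z-scores will be NaN",
            RuntimeWarning,
        )
    return (X - col_mean) / col_std
```

This keeps the function doing its one job (the data is returned unmodified apart from the z-scoring itself) while giving the user an explicit pointer to the quiet units instead of a bare numpy divide warning.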