catalystneuro / datta-lab-to-nwb

MIT License

regular timestamps calculated #109

Closed bendichter closed 2 months ago

bendichter commented 3 months ago

https://github.com/catalystneuro/datta-lab-to-nwb/blob/7540d94f152fcee2707131872d5410cd2e5f2d45/src/datta_lab_to_nwb/markowitz_gillis_nature_2023_keypoint/keypointinterface.py#L53

@pauladkisson why are we computing the timestamps here? It seems like we should be using starting time and sampling rate.

bendichter commented 3 months ago

Also, more importantly, is this the first stream? I couldn't find any logic to synchronize this with other streams in the conversion code.

pauladkisson commented 3 months ago

Yes, this should be using starting time and sampling rate -- it looks like a mistake made amid the confusion of the temporal alignment work.

We ended up adding the raw fiber photometry response data without temporal alignment, but with a comment warning that the raw data is not aligned. This is because the Datta lab was unable to provide the alignment information we needed. All the other data streams (processed photometry, keypoints, moseq-extract output, etc.) are already aligned.
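For regularly sampled data, NWB lets you store just a `starting_time` and a `rate` instead of an explicit timestamps array; the two representations are equivalent for regular sampling. A minimal sketch of that equivalence (the `starting_time`, `rate`, and `n_samples` values here are illustrative, not taken from the repo):

```python
# Assumed illustrative values, not from the actual dataset.
starting_time = 0.0  # seconds
rate = 30.0          # Hz
n_samples = 5

# What the interface was doing: precomputing every timestamp.
explicit_timestamps = [starting_time + i / rate for i in range(n_samples)]

# Equivalent compact representation: any timestamp can be
# recovered on demand from two scalars.
def timestamp(i):
    return starting_time + i / rate

recovered = [timestamp(i) for i in range(n_samples)]
print(explicit_timestamps == recovered)
```

In pynwb terms, this corresponds to constructing a `TimeSeries` with `starting_time=` and `rate=` rather than `timestamps=`.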

bendichter commented 3 months ago

ok got it, thanks!

CodyCBakerPhD commented 3 months ago

@pauladkisson did you want to regenerate the files with starting time + rate or just accept the less efficient timestamps and close the issue?

pauladkisson commented 3 months ago

I can go ahead and regenerate the files; it should be an easy fix.