catalystneuro / datta-lab-to-nwb

MIT License

Requested Data #20

Closed pauladkisson closed 4 months ago

pauladkisson commented 1 year ago

Important data for the datta-lab-to-nwb conversion that is not yet provided.

Fiber Photometry

Behavior

Optogenetics

Velocity-Modulation

Keypoints

kaijfox commented 1 year ago

Sorry for the delay here but we've tracked down most of this now. A couple of quick things:

Thanks!

pauladkisson commented 1 year ago

Hey @kaijfox, good to hear from you!

Could you clarify what you're looking for in terms of reference frames?

The reference_frame is a description of the "zero-position" for a Spatial Series like the mouse's position or orientation. For example, "top left", "bottom right", etc.
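To make the "zero-position" idea concrete, here is a minimal NumPy sketch of why reference_frame matters: the same centroid positions take different numeric values depending on which corner is the origin. The arena height and all coordinate values are made up for illustration.

```python
import numpy as np

# Assumed arena height for this illustration only.
ARENA_HEIGHT_MM = 400.0

# Positions recorded with the origin at the TOP-LEFT corner, y increasing downward.
centroid_top_left = np.array([[10.0, 20.0],
                              [100.0, 350.0]])  # columns: (x_mm, y_mm)

# Re-express the same positions against a BOTTOM-LEFT origin, y increasing upward.
centroid_bottom_left = centroid_top_left.copy()
centroid_bottom_left[:, 1] = ARENA_HEIGHT_MM - centroid_top_left[:, 1]

print(centroid_bottom_left)  # [[ 10. 380.] [100.  50.]]
```

Without the reference_frame string recorded alongside the SpatialSeries, a downstream user cannot tell which of these two conventions the stored numbers follow.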

Unfortunately we didn't record raw opto power series.

No worries, we will just have to work with the data we have.

I believe the position info for the velocity experiments is supposed to be in the dataframes already on Zenodo.

XY-position information (centroid_x_mm, centroid_y_mm) is not present in the optoda_raw_data/closed_loop_behavior_velocity_conditioned.parquet dataframe, and I didn't see any other files related to the velocity modulation experiments. If I am overlooking a file already on Zenodo, please lmk.
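The column check described above can be sketched as follows. In practice the frame would come from pd.read_parquet("closed_loop_behavior_velocity_conditioned.parquet"); here we mock a small frame with assumed column names so the example is self-contained.

```python
import pandas as pd

# Hypothetical stand-in for the real parquet file; only for illustration.
df = pd.DataFrame({"velocity_mm_s": [12.3, 15.1],
                   "session_id": ["s1", "s1"]})

# The position columns the conversion expects but that appear to be absent.
expected = {"centroid_x_mm", "centroid_y_mm"}
missing = expected - set(df.columns)

print(sorted(missing))  # lists whichever expected columns the frame lacks
```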

Do you have a standard way you like to receive raw data like this? We're looking at ~11TB.

I think this google drive folder should be able to handle all the data, but I will double-check with @bendichter.

If you get a chance, please also take a look at some of the clarifying questions (#14), which will help me a lot to move forward on this conversion!

Thanks!

CodyCBakerPhD commented 1 year ago

I think this google drive folder should be able to handle all the data, but I will double-check with @bendichter.

The drive is where it will end up, yes, but @bendichter will need to set up a Globus transfer link to that location. Otherwise, if you try drag-and-drop or other Drive-specific methods, the storage will count as personal usage (capped at whatever plan the user has) instead of our usage (uncapped).

It's also bandwidth-throttled, so you'll only be able to transfer about 1 TB a day. But as long as both Globus endpoints remain active, you should only need to trigger the transfer once, and it will be smart enough to reattempt every day.

kaijfox commented 1 year ago

Great, a Globus endpoint will be very helpful. Once we get that, I'll upload a couple of sessions from each experiment so you have some stuff to work with right away before we hit the bandwidth limit, and then we can fill it out with the rest of the data as a second step.

On the reference_frame issue, I'm going to include that in #14, which Win is going to look at in the next couple days since he was around for the data collection.

I'll look into the position information for closed_loop_behavior_velocity_conditioned.parquet. It seems likely that the info simply wasn't included there because it's not directly used in the analysis, but it will be present in the moseq-extract data once that's on the Google Drive, so I'll let you know which files to look in specifically.

bendichter commented 1 year ago

@kaijfox can you see if you have access to: https://app.globus.org/file-manager?origin_id=a0e2af7a-11c6-4118-b15a-f53c219e05e1&origin_path=%2FMy%20Drive%2Fdata%2FDatta-lab-to-nwb%2F

I just gave access to "Kai Jordan Fox (kfox@access-ci.org)"

kaijfox commented 1 year ago

Ah, I believe I'm kfox1@access-ci.org

bendichter commented 1 year ago

@kaijfox ok, added!

kaijfox commented 1 year ago

Preliminary raw data should now be in that Google Drive.

The root directory contains a number of experiment directories, each of which contains several session directories. The experiments used in the analyses for the paper are listed in a TOML file analysis_configuration.TEMPLATE found in the root directory of dattalab/dopamine-reinforces-spontaneous-behavior.

Each session directory contains:

If the experiment included photometry, the session folder will contain:

A couple of notes

I'm sure questions will come up as you dig into the files; just let me know!

bendichter commented 1 year ago

@CodyCBakerPhD , @pauladkisson this looks like it might be a good opportunity for us to develop hooks in NeuroConv for custom temporal alignment procedures.
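One core step such a custom-alignment hook would perform is shifting a secondary stream's timestamps so that a shared sync event lines up with the primary clock. This is only a minimal NumPy sketch of that idea; all times are invented, and it is not NeuroConv's actual API.

```python
import numpy as np

# Hypothetical sync-pulse times observed on each device's own clock (seconds).
primary_sync_time = 5.0    # e.g. the behavior acquisition clock
secondary_sync_time = 3.2  # e.g. the photometry acquisition clock

# Unaligned timestamps from the secondary stream, on its own clock.
secondary_timestamps = np.array([3.2, 3.3, 3.4])

# Shift so the shared sync event coincides on the primary clock.
aligned = secondary_timestamps + (primary_sync_time - secondary_sync_time)

print(aligned)  # [5.  5.1 5.2]
```

Hooks in NeuroConv could let conversion authors plug in lab-specific versions of this step (constant offsets, clock-drift regression, etc.) before writing to NWB.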