AllenNeuralDynamics / aind-foraging-behavior-bonsai-basic

Basic analyses of foraging behavior .nwb from Bonsai
MIT License

Convert old bpod nwb to new bonsai nwb format for backward-compatibility of the current pipeline #28

Closed hanhou closed 5 months ago

hanhou commented 5 months ago

TL;DR: working with nwb is sooooooooo frustrating...

hanhou commented 5 months ago

I'm seeing a weird bug:

  1. The nwb is generated successfully and can be loaded locally.

  2. After uploading to S3 (by "registering an external data asset" from the result folder of Run 2450362), the file can still be opened if downloaded from S3.

  3. However, the nwb cannot be loaded if I attach the data asset as a LINK to S3 (see the sketch after this list for where the error message comes from):

    ConstructError: (root/acquisition/bpod_backup_BehavioralEvents/choice GroupBuilder {'attributes': {'comments': 'no comments', 'description': 'time (second) relative to the first trial start (aligned with ephys)', 'namespace': 'core', 'neurodata_type': 'TimeSeries', 'object_id': 'dbfd1c57-8e24-4177-8943-5e5ba8d8933f'}, 'groups': {}, 'datasets': {}, 'links': {}}, "Could not construct TimeSeries object due to: either 'timestamps' or 'rate' must be specified")

  4. I'm trying:

    1. to IMPORT, not LINK, a new data asset to aind-behavior-data/foraging_nwb_bpod/ (doesn't work!)
    2. to register Run 2450362 as a CO internal asset directly... (doesn't work!)
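
For context, the message inside the ConstructError above is the same one pynwb raises when a TimeSeries is built with neither timestamps nor rate, which suggests the timestamps dataset is not being resolved when the file is read through the S3-linked asset. A minimal sketch (not from this repo; the name and values are made up) that reproduces the message:

from pynwb import TimeSeries

# Building a TimeSeries with data but neither 'timestamps' nor 'rate'
# fails with the same message seen in the ConstructError above
try:
    ts = TimeSeries(name='choice', data=[0.1, 0.5, 0.9], unit='seconds')
except Exception as e:
    print(e)  # expected: either 'timestamps' or 'rate' must be specified
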
hanhou commented 5 months ago

[screenshot attached]

hanhou commented 5 months ago

Seems like some weird interaction between nwb and CO data assets:

  1. If the nwb is in /data/foraging_nwb_bpod, it doesn't work
  2. If I copy the same file to /scratch, it works!

But other nwbs in /data/foraging_nwb_bonsai always work... Does it have something to do with the total size of the folder?

hanhou commented 5 months ago

This does not work (reading directly from the attached data asset):

from pynwb import NWBHDF5IO

fileN = '473360_2021-08-12_16-52-05.nwb'
nwb_file = '/root/capsule/data/foraging_nwb_bpod/' + fileN

# Read the file in place under /data
io = NWBHDF5IO(nwb_file, mode='r')
nwb = io.read()

This works (copying the same file to /scratch first):

import shutil

# Copy to /scratch, then read the copy
nwb_file_copy = '/root/capsule/scratch/' + fileN
shutil.copy(nwb_file, nwb_file_copy)

io = NWBHDF5IO(nwb_file_copy, mode='r')
nwb = io.read()
hanhou commented 5 months ago

- Well, the workaround of copying to /scratch DOES NOT WORK for multiprocessing (see the sketch below)...

- And I confirmed that this is related to the size of the data attached to the capsule. When I only convert a subset of the nwbs and put them into a capsule, everything is fine even without copying to /scratch.
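
For reference, a minimal sketch (assuming multiprocessing.Pool and a made-up file list; not the actual conversion code) of the copy-to-/scratch pattern that fails when run in worker processes:

import shutil
from multiprocessing import Pool
from pynwb import NWBHDF5IO

def load_one(fileN):
    # Copy one nwb from the attached data asset to /scratch, then read the copy
    src = '/root/capsule/data/foraging_nwb_bpod/' + fileN
    dst = '/root/capsule/scratch/' + fileN
    shutil.copy(src, dst)
    with NWBHDF5IO(dst, mode='r') as io:
        nwb = io.read()
        return nwb.identifier  # return something picklable, not the NWBFile itself

if __name__ == '__main__':
    files = ['473360_2021-08-12_16-52-05.nwb']  # placeholder subset
    with Pool(4) as pool:
        results = pool.map(load_one, files)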

hanhou commented 5 months ago

Seems like the nwb behaves differently when loaded from the VS Code cloud workstation versus from a "Reproducible Run"???

All of the above problems only exist in the VS Code cloud workstation...