catalystneuro / neuroconv

Create NWB files by converting and combining neural data in proprietary formats and adding essential metadata.
https://neuroconv.readthedocs.io
BSD 3-Clause "New" or "Revised" License

[Bug]: Issue with DLC interface not loading #1114

Open vigji opened 1 week ago

vigji commented 1 week ago

What happened?

I am reproducing the odd behavior I already described in #967: when running DeepLabCutInterface conversions, the conversion fails silently, without adding the correct fields to the NWB file.

This time it was more painful, as I had forgotten about the previously raised issue, and here it happened 'in production'. I was running a batch conversion of multiple experiments:

for exp in experiments:
    load_stuff()                         # load this experiment's raw data
    interfaces_pipe = make_interfaces()  # build the converter / interfaces
    interfaces_pipe.run_conversion()

This resulted in only the first experiment missing all the DLC-related data, which drove me completely nuts for a good day of debugging (these are also quite slow operations), until I realized it sounded like the previous issue. I fixed it by adding a seemingly unrelated

from ndx_pose import PoseEstimation, PoseEstimationSeries

at the beginning of the script (alternatively, by running the conversion of the first experiment twice :D).

This was massively annoying; I strongly suggest looking deeper into this, or changing the DLC example to document it.

This time it happened on a different machine and a different OS! (Same setup otherwise: always VSCode, running a script.)

Steps to Reproduce

The following code results in the bug:

from datetime import datetime
from zoneinfo import ZoneInfo
from pathlib import Path
from neuroconv.datainterfaces import DeepLabCutInterface
import numpy as np

file_path = ".../....h5"
config_file_path = ".../config.yaml"
path_to_save_nwbfile = ".../test_nwb.nwb"

timestamps = np.array([1, 2, 3])  # dummy timestamps; providing them speeds up processing

interface = DeepLabCutInterface(file_path=file_path, config_file_path=config_file_path, 
                                subject_name="ind1", verbose=False)
interface.set_aligned_timestamps(timestamps)

metadata = interface.get_metadata()

session_start_time = datetime(2020, 1, 1, 12, 30, 0, tzinfo=ZoneInfo("US/Pacific"))
metadata["NWBFile"].update(session_start_time=session_start_time)

interface.run_conversion(nwbfile_path=path_to_save_nwbfile, metadata=metadata)

from pynwb import NWBHDF5IO

# confirm that the pose data ended up in the file's processing module
with NWBHDF5IO(path_to_save_nwbfile, "r") as io:
    read_nwbfile = io.read()
    print(read_nwbfile.processing)

The print output is:

Fields:
  data_interfaces: {
    PoseEstimation <class 'pynwb.core.NWBDataInterface'>
  }
  description: processed behavioral data
}

And there is no data inside, unless I run the code twice or add the initial import, in which case I get

Fields:
  data_interfaces: {
    PoseEstimation <class 'ndx_pose.pose.PoseEstimation'>
  }
  description: processed behavioral data
}

and the data can be accessed.
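Once the file is written correctly, the pose data can be read back along these lines (a sketch: the `"behavior"` processing-module name is an assumption inferred from the "processed behavioral data" description printed above, and `path_to_save_nwbfile` is the path from the reproduction script):

```python
from pynwb import NWBHDF5IO

# Read back the converted file and inspect the PoseEstimation container.
# "behavior" is the assumed processing-module name; adjust if it differs.
with NWBHDF5IO(path_to_save_nwbfile, "r") as io:
    nwbfile = io.read()
    pose_est = nwbfile.processing["behavior"]["PoseEstimation"]
    for name, series in pose_est.pose_estimation_series.items():
        print(name, series.data[:].shape)  # one series per tracked body part
```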

Traceback

No response

Operating System

Windows

Python Executable

Conda

Python Version

3.10

Package Versions

neuroconv == 0.6.4

h-mayorquin commented 1 week ago

Hey, for provenance, here is the exact link to your previous report:

https://github.com/catalystneuro/neuroconv/pull/967#issuecomment-2244737817

Thanks a bunch. I am able to reproduce the error. We will look into it.

vigji commented 1 week ago

Great, thanks!