catalystneuro / tye-lab-to-nwb

NWB Conversion project for the Tye lab at the Salk Institute.
MIT License

Discontinuous timestamps in AUX.continuous data files #29

Closed laurelrr closed 11 months ago

laurelrr commented 11 months ago

Working on subject 41, I received this error:

(tye_lab_to_nwb_env) lkeyes@node32:~/Projects/GIT/tye-lab-to-nwb$ python src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_all_sessions.py
  0%|                                                                          | 0/1 [00:00<?, ?it/s]Source data is valid!
concurrent.futures.process._RemoteTraceback:
"""
Traceback (most recent call last):
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/concurrent/futures/process.py", line 256, in _process_worker
    r = call_item.fn(*call_item.args, **call_item.kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_session.py", line 145, in session_to_nwb
    converter = NeurotensinValenceNWBConverter(source_data=source_data)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/neuroconv/src/neuroconv/nwbconverter.py", line 64, in __init__
    self.data_interface_objects = {
                                  ^
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/neuroconv/src/neuroconv/nwbconverter.py", line 65, in <dictcomp>
    name: data_interface(**source_data[name])
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/neuroconv/src/neuroconv/datainterfaces/ecephys/openephys/openephysdatainterface.py", line 40, in __new__
    return OpenEphysLegacyRecordingInterface(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/neuroconv/src/neuroconv/datainterfaces/ecephys/openephys/openephyslegacydatainterface.py", line 42, in __init__
    available_streams = self.get_stream_names(folder_path=folder_path)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/neuroconv/src/neuroconv/datainterfaces/ecephys/openephys/openephyslegacydatainterface.py", line 57, in get_stream_names
    stream_names, _ = OpenEphysLegacyRecordingExtractor.get_streams(folder_path=folder_path)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/spikeinterface/extractors/neoextractors/neobaseextractor.py", line 142, in get_streams
    neo_reader = get_reader(cls.NeoRawIOClass, **neo_kwargs)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/spikeinterface/extractors/neoextractors/neobaseextractor.py", line 14, in get_reader
    neo_reader.parse_header()
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neo/rawio/baserawio.py", line 179, in parse_header
    self._parse_header()
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neo/rawio/openephysrawio.py", line 118, in _parse_header
    assert np.all(diff == RECORD_SIZE), \
AssertionError: Not continuous timestamps for 104_AUX1.continuous. Maybe because recording was paused/stopped.
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_all_sessions.py", line 92, in <module>
    parallel_convert_sessions(
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_all_sessions.py", line 83, in parallel_convert_sessions
    future.result()
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
AssertionError: Not continuous timestamps for 104_AUX1.continuous. Maybe because recording was paused/stopped.

CodyCBakerPhD commented 11 months ago

Looks like this has occurred before; there might even be a fix on neo main (there has been no neo release yet this year...)

Otherwise Alessio had a suggested workaround: https://github.com/NeuralEnsemble/python-neo/issues/1210

Should be an easy fix

weiglszonja commented 11 months ago

Thank you @laurelrr for opening this issue!

So basically the fix would be for us to include an extra step by copying the legacy files and saving them as spikeinterface binary recordings (https://github.com/NeuralEnsemble/python-neo/issues/1210)?

import spikeinterface.extractors as se

recording = se.read_openephys("...")

recording_saved = recording.save(folder="my-binary-folder", n_jobs=8, progress_bar=True)

# continue analysis with recording_saved (which is now a binary recording)

CodyCBakerPhD commented 11 months ago

So basically the fix would be for us to include an extra step by copying the legacy files and saving them as spikeinterface binary recordings (https://github.com/NeuralEnsemble/python-neo/issues/1210)?

I'd try the unreleased master branch first, in hopes that https://github.com/NeuralEnsemble/python-neo/pull/1213 fixed it, before going to that workaround. I guess we would need the data to check on our end whether that fixes anything, or we could ask @laurelrr to pip install git+https://github.com/NeuralEnsemble/python-neo.git@master and try that session again?
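
(As a quick sanity check after installing from master, a minimal sketch for confirming which neo version the environment actually picks up is below; the exact dev version string will vary.)

import neo
print(neo.__version__)  # a master install should report a version newer than the latest PyPI release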

laurelrr commented 11 months ago

Sure, let me try that now.

laurelrr commented 11 months ago

No, I think it is the same error:

(tye_lab_to_nwb_env) lkeyes@node32:~/Projects/GIT/tye-lab-to-nwb$ python src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_all_sessions.py
  0%|                                                                                                                                                                                  | 0/1 [00:00<?, ?it/s]Source data is valid!
concurrent.futures.process._RemoteTraceback:
"""
Traceback (most recent call last):
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/concurrent/futures/process.py", line 256, in _process_worker
    r = call_item.fn(*call_item.args, **call_item.kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_session.py", line 145, in session_to_nwb
    converter = NeurotensinValenceNWBConverter(source_data=source_data)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/neuroconv/src/neuroconv/nwbconverter.py", line 64, in __init__
    self.data_interface_objects = {
                                  ^
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/neuroconv/src/neuroconv/nwbconverter.py", line 65, in <dictcomp>
    name: data_interface(**source_data[name])
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/neuroconv/src/neuroconv/datainterfaces/ecephys/openephys/openephysdatainterface.py", line 40, in __new__
    return OpenEphysLegacyRecordingInterface(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/neuroconv/src/neuroconv/datainterfaces/ecephys/openephys/openephyslegacydatainterface.py", line 42, in __init__
    available_streams = self.get_stream_names(folder_path=folder_path)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/neuroconv/src/neuroconv/datainterfaces/ecephys/openephys/openephyslegacydatainterface.py", line 57, in get_stream_names
    stream_names, _ = OpenEphysLegacyRecordingExtractor.get_streams(folder_path=folder_path)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/spikeinterface/extractors/neoextractors/neobaseextractor.py", line 142, in get_streams
    neo_reader = get_reader(cls.NeoRawIOClass, **neo_kwargs)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/spikeinterface/extractors/neoextractors/neobaseextractor.py", line 14, in get_reader
    neo_reader.parse_header()
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neo/rawio/baserawio.py", line 178, in parse_header
    self._parse_header()
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neo/rawio/openephysrawio.py", line 120, in _parse_header
    raise ValueError(
ValueError: Not continuous timestamps for 104_AUX1.continuous. Maybe because recording was paused/stopped.
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_all_sessions.py", line 92, in <module>
    parallel_convert_sessions(
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_all_sessions.py", line 83, in parallel_convert_sessions
    future.result()
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
ValueError: Not continuous timestamps for 104_AUX1.continuous. Maybe because recording was paused/stopped.

weiglszonja commented 11 months ago

I’d be happy to take a look at that session @laurelrr to see how to fix this issue! Can you share this session with us?

laurelrr commented 11 months ago

Yes, I can do that.

laurelrr commented 11 months ago

OK, I uploaded the neural data via Globus. This should have the events, plex, and recordings. Let me know if I missed something.

weiglszonja commented 11 months ago

Thank you @laurelrr, I managed to download the data and look more into this issue. With that fix in neo, you can suppress this error by using the ignore_timestamps_errors argument, like this:

from neo.rawio import OpenEphysRawIO
io = OpenEphysRawIO("/Users/weian/data/hao", ignore_timestamps_errors=True)
# parse header to see we got it right
io.parse_header()

The reason you still see that error is that we need to enable this argument here and in our interface. This is an easy fix (if @CodyCBakerPhD also agrees); I'll work on it tomorrow so the fix is ready to use in this pipeline.
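
For illustration, once that argument is exposed on the interface, calling it from this pipeline's side could look roughly like the sketch below (the ignore_timestamps_errors parameter is what the branch adds; the folder path is a placeholder):

from neuroconv.datainterfaces import OpenEphysRecordingInterface

# Sketch only: assumes the branch exposes neo's ignore_timestamps_errors option
# on the OpenEphys legacy interface; the folder path is a placeholder.
interface = OpenEphysRecordingInterface(
    folder_path="path/to/openephys/session",
    ignore_timestamps_errors=True,
)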

CodyCBakerPhD commented 11 months ago

@weiglszonja That sounds great, thanks for digging into it!

weiglszonja commented 11 months ago

@laurelrr While we have to wait on other dependencies to be able to merge my fix into neuroconv, you can already use the fix if you install neuroconv pinned to this branch as:

 pip install git+https://github.com/catalystneuro/neuroconv@add_ignore_timestamps_errors_to_interface

You can pull the latest changes from this repo and rerun those sessions where you had these issues (the other one with the date issue should also be fixed).

Let me know if the issue persists.

laurelrr commented 11 months ago

Hello Szonja, I tried installing your changes as you suggested (in my conda environment) using these commands:

pip install git+https://github.com/catalystneuro/neuroconv@add_ignore_timestamps_errors_to_interface
git pull

I changed line 90 of src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_all_sessions.py to point to the Excel file set up for my one subject. Then, I attempted to re-run processing on that subject.
However, I see the error shown below. Could you let me know if I misunderstood your instructions for correctly pulling your most recent changes?

lkeyes@node32:/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb$ python src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_all_sessions.py
  0%|                                                                                                                                                                                 | 0/1 [00:00<?, ?it/s]Source data is valid!
concurrent.futures.process._RemoteTraceback:
"""
Traceback (most recent call last):
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/concurrent/futures/process.py", line 256, in _process_worker
    r = call_item.fn(*call_item.args, **call_item.kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_session.py", line 149, in session_to_nwb
    converter = NeurotensinValenceNWBConverter(source_data=source_data)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neuroconv/nwbconverter.py", line 65, in __init__
    self.data_interface_objects = {
                                  ^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neuroconv/nwbconverter.py", line 66, in <dictcomp>
    name: data_interface(**source_data[name])
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neuroconv/datainterfaces/ecephys/openephys/openephysdatainterface.py", line 45, in __new__
    return OpenEphysLegacyRecordingInterface(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neuroconv/datainterfaces/ecephys/openephys/openephyslegacydatainterface.py", line 56, in __init__
    available_streams = self.get_stream_names(
                        ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neuroconv/datainterfaces/ecephys/openephys/openephyslegacydatainterface.py", line 17, in get_stream_names
    stream_names, _ = OpenEphysLegacyRecordingExtractor.get_streams(
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/spikeinterface/extractors/neoextractors/neobaseextractor.py", line 141, in get_streams
    neo_kwargs = cls.map_to_neo_kwargs(*args, **kwargs)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: OpenEphysLegacyRecordingExtractor.map_to_neo_kwargs() got an unexpected keyword argument 'ignore_timestamps_errors'
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_all_sessions.py", line 93, in <module>
    parallel_convert_sessions(
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_all_sessions.py", line 83, in parallel_convert_sessions
    future.result()
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
TypeError: OpenEphysLegacyRecordingExtractor.map_to_neo_kwargs() got an unexpected keyword argument 'ignore_timestamps_errors'

CodyCBakerPhD commented 11 months ago

Maybe

pip uninstall neuroconv

to ensure all neuroconv references in the environment are removed (including the previous static version in site-packages),

then reinstall from the dev branch:

pip install git+https://github.com/catalystneuro/neuroconv@add_ignore_timestamps_errors_to_interface

laurelrr commented 11 months ago

OK, I tried that. It seems like it gives me the same error:

(tye_lab_to_nwb_env) lkeyes@node32:/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb$ python src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_all_sessions.py
  0%|                                                                                                                                                                                 | 0/1 [00:00<?, ?it/s]Source data is valid!
concurrent.futures.process._RemoteTraceback:
"""
Traceback (most recent call last):
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/concurrent/futures/process.py", line 256, in _process_worker
    r = call_item.fn(*call_item.args, **call_item.kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_session.py", line 149, in session_to_nwb
    converter = NeurotensinValenceNWBConverter(source_data=source_data)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neuroconv/nwbconverter.py", line 65, in __init__
    self.data_interface_objects = {
                                  ^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neuroconv/nwbconverter.py", line 66, in <dictcomp>
    name: data_interface(**source_data[name])
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neuroconv/datainterfaces/ecephys/openephys/openephysdatainterface.py", line 45, in __new__
    return OpenEphysLegacyRecordingInterface(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neuroconv/datainterfaces/ecephys/openephys/openephyslegacydatainterface.py", line 56, in __init__
    available_streams = self.get_stream_names(
                        ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neuroconv/datainterfaces/ecephys/openephys/openephyslegacydatainterface.py", line 17, in get_stream_names
    stream_names, _ = OpenEphysLegacyRecordingExtractor.get_streams(
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/spikeinterface/extractors/neoextractors/neobaseextractor.py", line 141, in get_streams
    neo_kwargs = cls.map_to_neo_kwargs(*args, **kwargs)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: OpenEphysLegacyRecordingExtractor.map_to_neo_kwargs() got an unexpected keyword argument 'ignore_timestamps_errors'
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_all_sessions.py", line 93, in <module>
    parallel_convert_sessions(
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_all_sessions.py", line 83, in parallel_convert_sessions
    future.result()
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
TypeError: OpenEphysLegacyRecordingExtractor.map_to_neo_kwargs() got an unexpected keyword argument 'ignore_timestamps_errors'

CodyCBakerPhD commented 11 months ago

Oh, but did you update the other dependencies in the current environment?

pip install git+https://github.com/neuralensemble/python-neo@master
pip install git+https://github.com/spikeinterface/spikeinterface@main
laurelrr commented 11 months ago

No, I missed those. Trying now...

laurelrr commented 11 months ago

Oooh, looks promising! It's running without an error. I'll keep you posted on the progress. Thanks!

laurelrr commented 11 months ago

Darn, got a new error:

(tye_lab_to_nwb_env) lkeyes@node32:/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb$ python src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_al
  0%|                                                                          | 0/1 [00:00<?, ?it/s]Source data is valid!
concurrent.futures.process._RemoteTraceback:
"""
Traceback (most recent call last):
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/concurrent/futures/process.py", line 256, in _process_worker
    r = call_item.fn(*call_item.args, **call_item.kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_session.py", line 149, in session_to_nwb
    converter = NeurotensinValenceNWBConverter(source_data=source_data)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neuroconv/nwbconverter.py", line 65, in __init__
    self.data_interface_objects = {
                                  ^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/site-packages/neuroconv/nwbconverter.py", line 66, in <dictcomp>
    name: data_interface(**source_data[name])
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: Can't instantiate abstract class NeurotensinDeepLabCutInterface with abstract method add_to_nwbfile
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_all_sessions.py", line 93, in <module>
    parallel_convert_sessions(
  File "/nadata/snlkt/home/lkeyes/Projects/GIT/tye-lab-to-nwb/src/tye_lab_to_nwb/neurotensin_valence/neurotensin_valence_convert_all_sessions.py", line 83, in parallel_convert_sessions
    future.result()
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/home/lkeyes/anaconda3/envs/tye_lab_to_nwb_env/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
TypeError: Can't instantiate abstract class NeurotensinDeepLabCutInterface with abstract method add_to_nwbfile

CodyCBakerPhD commented 11 months ago

@laurelrr Sorry about that - opened #33 which shouldn't take too long to fix
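
(For context, the TypeError above means the interface no longer satisfies neuroconv's abstract base class: newer neuroconv versions declare add_to_nwbfile as an abstract method that every data interface must implement. A minimal sketch of what such an implementation looks like is below; the class name and body are illustrative, not the actual fix tracked in #33.)

from typing import Optional

from neuroconv.basedatainterface import BaseDataInterface
from pynwb import NWBFile


class MinimalCustomInterface(BaseDataInterface):
    """Toy interface illustrating the required add_to_nwbfile hook."""

    def __init__(self, file_path: str, verbose: bool = True):
        super().__init__(file_path=file_path, verbose=verbose)

    def add_to_nwbfile(self, nwbfile: NWBFile, metadata: Optional[dict] = None) -> None:
        # A real interface would write its data (e.g., pose estimation) into nwbfile here.
        pass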