Closed dysprague closed 8 months ago
Could you clarify that the error comes up only when running the inspector, and not when writing the file or using any of the containers in question? @dysprague
What is your timeline in terms of publishing the extension? I should have some time next week to take a look! @dysprague
The error only comes up when running inspector. I was reading and writing and using these files with behavior as expected and only triggered this issue when running the NWB inspector.
I'm not in a huge rush but hoping to publish in the next two weeks or so if you could take a look this week that would be great!
Thanks!
@CodyCBakerPhD I just wanted to reach out since this issue seems like it might be related to nwb-inspector. Is there perhaps a different way to call the inspector when extensions are involved?
The error only comes up when running inspector.
I think this is in part because in your example you are defining and registering a custom class via PlaneExtension(PlaneSegmentation). Because of this, HDMF doesn't need to auto-generate the class when you read/write data in the example. However, when running the inspector, the extension is probably loaded from the schema cached in the NWBFile, and the custom class has not been registered, so HDMF needs to auto-generate a class for PlaneExtension via the generate_class function.
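Conceptually, the auto-generation path builds a Python class at runtime from the cached spec. Here is a minimal, hypothetical sketch of that idea using plain type(); this is an illustration only, not HDMF's actual generate_class implementation:

```python
# Hedged sketch: NOT HDMF's generate_class, just an illustration of
# building a class at runtime from a spec-like description.

class PlaneSegmentation:
    """Stand-in for the PyNWB base class."""
    columns = ("id", "pixel_mask")

def generate_class(name, base, spec_fields):
    # dynamically create a subclass, the way a class generator would when
    # no custom Container class has been registered for the type
    return type(name, (base,), dict(spec_fields))

PlaneExtension = generate_class(
    "PlaneExtension", PlaneSegmentation, {"__doc__": "auto-generated"}
)

assert issubclass(PlaneExtension, PlaneSegmentation)
assert PlaneExtension.columns == ("id", "pixel_mask")
```

When the class is registered ahead of time, this runtime path is skipped entirely, which is why the read/write example works but the inspector run does not.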
@oruebel Let me see if I understand. PlaneSegmentation is a defined class, so it makes sense not to autogenerate. However, the inspector does not directly deal with extensions, so even though the class is registered in the extension, because it is not in HDMF it will try to autogenerate from the schema in the extension, which is possibly what is leading to some issues.
PlaneSegmentation is a defined class so it makes sense not to autogenerate.

PlaneSegmentation is defined in the NWB schema and has a defined class in PyNWB. However, PlaneExtension is part of the extension. The read/write example shown in the issue defines and registers its own custom Container class for PlaneExtension, so that code does not need auto-generation.

However, the inspector does not directly deal with extensions such that even though the class is registered in the extension

I assume the extension is not actually installed and imported when running the nwb-inspector. I.e., the inspector needs to load the schema for the extension from the file, and since there is no Container for PlaneExtension, HDMF needs to auto-generate the class.
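In other words, registering a Container class short-circuits auto-generation. A simplified, hypothetical registry sketch (not PyNWB's actual register_class machinery) of why a registered class avoids auto-generation while an unregistered one does not:

```python
# Hedged sketch: a toy type registry, not PyNWB's register_class.
REGISTRY = {}

def register_class(neurodata_type):
    def decorator(cls):
        REGISTRY[neurodata_type] = cls
        return cls
    return decorator

class PlaneSegmentation:
    pass

@register_class("PlaneExtension")
class PlaneExtension(PlaneSegmentation):
    pass

def get_container_class(neurodata_type):
    # returns the registered class, or None, signalling that the caller
    # (e.g. the inspector's read path) must auto-generate one instead
    return REGISTRY.get(neurodata_type)

assert get_container_class("PlaneExtension") is PlaneExtension
assert get_container_class("VolumeSegmentation") is None
```

If the inspector runs in an environment where the extension module was never imported, the registration decorator never executes, and only the auto-generation fallback remains.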
@dysprague can you provide a link to your extension and I'll take a look.
The inspector didn't have an issue with the BEADL extension. I'll try running the inspector with the extension installed and see.
@mavaylon1 Providing the link to the extension here.
I used PlaneExtension in the original comment to recreate the error in the most basic way, but the object that is actually triggering the error in my workflow is the VolumeSegmentation object.
Just checking in on if there has been any progress on this. Let me know if you need anything else from me.
@dysprague Taking a look today! I'll keep you posted.
@dysprague I don't see the spec for your PlaneExtension in the repo. Do you have it there and I am missing something?
@dysprague have you run pip install -e . on your extension?
@dysprague Any word on the spec?
@mavaylon1 Sorry, I was out of office Thursday and Friday and just saw this! I have run pip install -e . on the extension.
The object that is triggering the error in my extension is the VolumeSegmentation object which inherits from the PlaneSegmentation object. I wrote the PlaneExtension object here to show the minimal case that creates the error, but VolumeSegmentation triggers that exact same error.
Hey @dysprague I am having trouble actually reproducing your error. These are the versions I am using and the steps I did. Let's see if we can get to the same output.
from pynwb import NWBFile, NWBHDF5IO
from pynwb import register_class
from pynwb.file import Subject
from pynwb.ophys import PlaneSegmentation
from hdmf.utils import docval, popargs, get_docval
from ndx_multichannel_volume import CElegansSubject, OpticalChannelReferences, OpticalChannelPlus, ImagingVolume, VolumeSegmentation, MultiChannelVolume
from pynwb.ophys import OnePhotonSeries, OpticalChannel, ImageSegmentation, Fluorescence, CorrectedImageStack, MotionCorrection, RoiResponseSeries, ImagingPlane
import numpy as np
from datetime import datetime
from dateutil import tz

class PlaneExtension(PlaneSegmentation):

    __fields__ = ({'name': 'imaging_plane', 'child': True},
                  {'name': 'reference_images', 'child': True})

    @docval(*get_docval(PlaneSegmentation.__init__, 'imaging_plane', 'name', 'reference_images', 'id', 'description', 'columns', 'colnames'))
    def __init__(self, **kwargs):
        super().__init__(**kwargs)

    @docval(*get_docval(PlaneSegmentation.add_roi, 'pixel_mask', 'voxel_mask', 'image_mask', 'id'),
            allow_extra=True)
    def add_roi(self, **kwargs):
        """Add a Region Of Interest (ROI) data to this"""
        return super().add_roi(**kwargs)

    @docval(*get_docval(PlaneSegmentation.create_roi_table_region, 'description', 'region', 'name'))
    def create_roi_table_region(self, **kwargs):
        return self.create_region(**kwargs)

nwbfile = NWBFile(
    session_description = 'Add a description for the experiment/session. Can be just long form text',
    identifier = '20230322-21-41-10',
    # Specify date and time of trial. Datetime entries are in order Year, Month, Day, Hour, Minute, Second. Not all entries are necessary
    session_start_time = datetime(2023, 3, 22, 21, 41, 10, tzinfo=tz.gettz("US/Pacific")),
    lab = 'FOCO lab',
    institution = 'UCSF',
    related_publications = ''
)

nwbfile.subject = Subject(
    subject_id="001",
    age="P90D",
    description="mouse 5",
    species="Mus musculus",
    sex="M",
)

device = nwbfile.create_device(
    name="Microscope",
    description="My two-photon microscope",
    manufacturer="The best microscope manufacturer",
)

optical_channel = OpticalChannel(
    name="OpticalChannel",
    description="an optical channel",
    emission_lambda=500.0,
)

imaging_plane = nwbfile.create_imaging_plane(
    name="ImagingPlane",
    optical_channel=optical_channel,
    imaging_rate=30.0,
    description="a very interesting part of the brain",
    device=device,
    excitation_lambda=600.0,
    indicator="GFP",
    location="V1",
    grid_spacing=[0.01, 0.01],
    grid_spacing_unit="meters",
    origin_coords=[1.0, 2.0, 3.0],
    origin_coords_unit="meters",
)

img_seg = ImageSegmentation(
    name = 'NeuroPALSegmentation'
)

ps = PlaneExtension(
    name = 'PlaneSeg',
    description = 'test plane extension',
    imaging_plane = imaging_plane,
)

for _ in range(30):
    # randomly generate example starting points for the region
    x = np.random.randint(0, 95)
    y = np.random.randint(0, 95)
    z = np.random.randint(0, 95)
    # define an example 4 x 3 region of pixels of weight '1'
    pixel_mask = []
    for ix in range(x, x + 4):
        for iy in range(y, y + 3):
            pixel_mask.append([ix, iy, 1])
    # add pixel mask to plane segmentation
    ps.add_roi(pixel_mask=pixel_mask)

img_seg.add_plane_segmentation(ps)

ophys_module = nwbfile.create_processing_module(
    name="ophys", description="optical physiology processed data"
)
ophys_module.add(img_seg)

with NWBHDF5IO("test_plane_extend.nwb", "w") as io:
    io.write(nwbfile)
Try this in a new environment and check your versions. It is similar to the code you have, but I commented out the register_class for PlaneExtension.
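To compare environments quickly, one way to dump the versions of the relevant packages is with the standard-library importlib.metadata module (the package names below are the ones discussed in this thread; adjust as needed):

```python
import importlib.metadata as md

# print versions of the packages relevant to reproducing this issue;
# a package that is not installed is reported rather than raising
for pkg in ("pynwb", "hdmf", "nwbinspector", "ndx-multichannel-volume"):
    try:
        print(pkg, md.version(pkg))
    except md.PackageNotFoundError:
        print(pkg, "not installed")
```

Running this in both environments makes it easy to spot a pynwb or hdmf version mismatch.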
@rly Can you also follow those steps? It seems to write the nwbfile just fine. I run the inspector as well and have no issues. Granted, I ran it without Dandi settings.
@dysprague can you run the inspector as follows: nwbinspector <path_to_file>
- Cloned your extension and ran pip install -e .

@mavaylon1 if the extension is installed with pip install -e, I'm wondering whether NWBInspector/PyNWB ends up using the PlaneSegmentation Container class that is defined in the extension. Just as a sanity check, can you try running the nwbinspector in a conda environment where PyNWB and the inspector are installed but the extension is not? From the error, it seems that the issue arises in the class generation.
@oruebel The extension does not define a PlaneSegmentation class. It imports it from pynwb.
The extension does not define a PlaneSegmentation class. It imports it from pynwb.
Sorry, I meant the PlaneExtension class, not PlaneSegmentation.
@mavaylon1 Confirming that when I run the code that you provide, this error is not triggered. This is both with and without the dandi config flag.
https://github.com/focolab/ndx-multichannel-volume/tree/test I just added the PlaneExtension class to the schema in a new git branch of the extension. When I register the class in the ndx-multichannel-volume.py file and then otherwise run all of the same code as the previous test (where the class was just defined but not registered), I still trigger the same error as I described initially.
I used the code below to generate the file after running pip install . from the test branch of the extension.
from pynwb import NWBFile, NWBHDF5IO
from pynwb import register_class
from pynwb.file import Subject
from pynwb.ophys import PlaneSegmentation
from hdmf.utils import docval, popargs, get_docval
from ndx_multichannel_volume import CElegansSubject, OpticalChannelReferences, OpticalChannelPlus, ImagingVolume, VolumeSegmentation, MultiChannelVolume, PlaneExtension
from pynwb.ophys import OnePhotonSeries, OpticalChannel, ImageSegmentation, Fluorescence, CorrectedImageStack, MotionCorrection, RoiResponseSeries, PlaneSegmentation, ImagingPlane
import numpy as np
from datetime import datetime
from dateutil import tz
#Below is the code used to define the extended class
#the following code was used to create the minimum working example failing NWB file
nwbfile = NWBFile(
session_description = 'Add a description for the experiment/session. Can be just long form text',
#Can use any identity marker that is specific to an individual trial. We use date-time to specify trials
identifier = '20230322-21-41-10',
#Specify date and time of trial. Datetime entries are in order Year, Month, Day, Hour, Minute, Second. Not all entries are necessary
session_start_time = datetime(2023, 3, 22, 21, 41, 10, tzinfo=tz.gettz("US/Pacific")),
lab = 'FOCO lab',
institution = 'UCSF',
related_publications = ''
)
nwbfile.subject = Subject(
subject_id="001",
age="P90D",
description="mouse 5",
species="Mus musculus",
sex="M",
)
device = nwbfile.create_device(
name="Microscope",
description="My two-photon microscope",
manufacturer="The best microscope manufacturer",
)
optical_channel = OpticalChannel(
name="OpticalChannel",
description="an optical channel",
emission_lambda=500.0,
)
imaging_plane = nwbfile.create_imaging_plane(
name="ImagingPlane",
optical_channel=optical_channel,
imaging_rate=30.0,
description="a very interesting part of the brain",
device=device,
excitation_lambda=600.0,
indicator="GFP",
location="V1",
grid_spacing=[0.01, 0.01],
grid_spacing_unit="meters",
origin_coords=[1.0, 2.0, 3.0],
origin_coords_unit="meters",
)
img_seg = ImageSegmentation(
name = 'NeuroPALSegmentation'
)
ps = PlaneExtension(
name = 'PlaneSeg',
description = 'test plane extension',
imaging_plane = imaging_plane,
)
for _ in range(30):
# randomly generate example starting points for region
x = np.random.randint(0, 95)
y = np.random.randint(0, 95)
z = np.random.randint(0, 95)
# define an example 4 x 3 region of pixels of weight '1'
pixel_mask = []
for ix in range(x, x + 4):
for iy in range(y, y + 3):
pixel_mask.append([ix, iy, 1])
# add pixel mask to plane segmentation
ps.add_roi(pixel_mask=pixel_mask)
img_seg.add_plane_segmentation(ps)
ophys_module = nwbfile.create_processing_module(
name="ophys", description="optical physiology processed data"
)
ophys_module.add(img_seg)
with NWBHDF5IO("/Users/danielysprague/foco_lab/data/Test/test.nwb","w") as io:
io.write(nwbfile)
Then run nwbinspector on the resulting file.
Hi @dysprague we found the issue and are working on a fix: https://github.com/hdmf-dev/hdmf/issues/959 The issue is with how we auto-generate classes and not anything on your end. I do have some tips and ideas that I will add on a ticket to your extension that are outside of this issue ticket. The fix should be added within the next few days! I'll keep you posted.
What happened?
I am working on an extension of PyNWB, and one object that I am creating is an extension of the PyNWB PlaneSegmentation object. I was able to write and read NWB files created with this extended object, but when I tried to run the NWBInspector to verify the NWB file, I got an error: "AttributeError: type object 'MultiContainerInterface' has no attribute 'columns'".
After doing some careful tracing of the error, it appears that the error is caused because the hdmf/build/classgenerator.py file adds the MultiContainerInterface CustomGenerator to the end of bases on line 70, so when hdmf/utils/ExtenderMeta tries to call the DynamicTable function '__gather_columns' (hdmf/common/table.py, line 280), it ends up trying to add columns from the MultiContainerInterface object rather than the PlaneSegmentation object, which actually has the columns attribute.
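The failure mode described above can be sketched with plain Python stand-in classes (these are illustrations, not the real HDMF types): once a mixin without the expected attribute is appended to bases, any code that assumes every base carries that attribute fails on the mixin.

```python
# Hedged sketch: stand-in classes illustrating the bases-ordering problem
# described above, not the actual HDMF classgenerator code.

class PlaneSegmentationLike:
    columns = ("id", "pixel_mask")   # the attribute the metaclass needs

class MultiContainerInterfaceLike:
    pass                             # mixin with no 'columns' attribute

# the generator appends the mixin to the end of bases, as described above
bases = (PlaneSegmentationLike, MultiContainerInterfaceLike)

# code that assumes every base carries 'columns' then fails on the mixin,
# mirroring the reported AttributeError
missing = [b.__name__ for b in bases if not hasattr(b, "columns")]
assert missing == ["MultiContainerInterfaceLike"]
```

Attribute lookup through the generated class's MRO would still succeed; the problem only appears in code that inspects individual base classes directly.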
I was able to recreate this error with a very simple extension of the PlaneSegmentation object which basically just inherits all of the same methods and variables without any changes.
I was able to avoid this error by adding an elif after line 380 in hdmf/build/classgenerator.py which checks whether bases[0] is a subclass of Container. However, when I do this I get further errors that the DynamicTable is empty, so I think there may be some cascading issues.
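The workaround described above amounts to guarding the base lookup. A hypothetical sketch of that guard with stand-in classes (not the actual HDMF patch):

```python
# Hedged sketch of the guard described above, using stand-in classes.

class Container:
    pass

class PlaneSegmentation(Container):
    columns = ("id",)

class CustomClassGenerator:
    pass

def pick_parent(bases):
    # only treat the first base as the parent Container when it actually
    # is one; otherwise fall back to searching the remaining bases
    if issubclass(bases[0], Container):
        return bases[0]
    for b in bases[1:]:
        if issubclass(b, Container):
            return b
    return None

assert pick_parent((PlaneSegmentation, CustomClassGenerator)) is PlaneSegmentation
assert pick_parent((CustomClassGenerator, PlaneSegmentation)) is PlaneSegmentation
```

A guard like this fixes the attribute lookup but, as noted, may not address downstream assumptions about base ordering elsewhere in the generator.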
I am trying to publish this extension and get other labs using it very soon, so I would appreciate it if someone could take a look!
Steps to Reproduce
Traceback
Operating System
macOS
Python Executable
Conda
Python Version
3.11
Package Versions
environment_for_issue.txt