CodyCBakerPhD opened this issue 1 year ago
@alessandratrapani @weiglszonja @pauladkisson This is the first issue in which to compile all our ideas / pain points for improving the ophys representations
Ideas that I have come across:

- `BackgroundResponseSeries` to represent background 'neuropil' activity without calling it an 'ROI'. Would that link to a separate `PlaneSegmentation` table too?
- `emission_lambda` is dependent on the indicator and is used to represent the filter used by the optic channel
- `excitation_lambda` is a property of the optic channel
- `target_indicator` as a property of `Subject`
- `Microscope`, a subtype of `Device`, which could have properties describing a laser or LED

Current draft proposal:
- `OpticChannel` is a new neurodata type that can exist independently of the `ImagingPlane`
- `Microscope` is a new neurodata type, a subtype of `Device`
- `ImagingPlane` is modified to only describe physical space, either disjoint planes or a contiguous volume
- `MicroscopySeries` is a new neurodata type that describes 1P, 2P, 3P, light-sheet, widefield, and confocal images (`None` for others)
- drop the `unit` attribute because it is otherwise always set to `n.a.` or `a.u.` (quantal flux can be calculated after the fact, but raw imaging would never have it)
- `resolution` is not relevant to microscopy; it is more of an ephys-relevant value

`SegmentationImages` should also have a link to the plane segmentation they belong to. Currently they are added to an `Images` container, and the only way to determine which plane they belong to is relying on the name of the container.
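To make the relationships in the draft proposal concrete, here is a minimal sketch using plain Python dataclasses (not actual pynwb/spec code; field names such as `illumination_type` are illustrative assumptions, not agreed schema):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Device:
    name: str

@dataclass
class Microscope(Device):
    # Proposed subtype of Device that could carry laser/LED properties.
    illumination_type: Optional[str] = None  # e.g. "laser" or "LED" (hypothetical field)

@dataclass
class OpticChannel:
    # Stands alone; no longer created inside an ImagingPlane.
    name: str
    emission_lambda: float  # nm; describes the channel's filter

@dataclass
class ImagingPlane:
    # Describes only physical space: disjoint planes or a contiguous volume.
    name: str
    origin_coords: List[float]
    grid_spacing: List[float]

@dataclass
class MicroscopySeries:
    # Generic series covering 1P, 2P, 3P, light-sheet, widefield, confocal.
    name: str
    microscope: Microscope
    optic_channel: OpticChannel
    imaging_plane: ImagingPlane

# The same OpticChannel object can now be reused across different imaging planes:
green = OpticChannel(name="green", emission_lambda=513.0)
plane_a = ImagingPlane("plane_a", origin_coords=[0.0, 0.0, 100.0], grid_spacing=[1.0, 1.0])
plane_b = ImagingPlane("plane_b", origin_coords=[0.0, 0.0, 200.0], grid_spacing=[1.0, 1.0])
scope = Microscope("2p-scope", illumination_type="laser")
series_a = MicroscopySeries("series_a", scope, green, plane_a)
series_b = MicroscopySeries("series_b", scope, green, plane_b)
assert series_a.optic_channel is series_b.optic_channel
```

The key structural change this illustrates is that the optic channel and microscope are first-class, shareable objects, while the imaging plane is reduced to a purely spatial description.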
@weiglszonja This would apply to all other summary images as well, right?
@CodyCBakerPhD Yes, like what we see for Pinto with the contrast, PCA and vasculature "masks".
looping @h-mayorquin in
Hey guys, I found a nice paper on future trends in microscopy here. There are a few new developments that I think we should watch for, such as barcode labeling, which uses multiple fluorophores per molecule to let scientists look at more markers simultaneously.
Based on the discussion in ndx-holographic-stimulation, should we also include changes in the Ogen module? Or should we limit those to a separate extension?
If we're extending ogen to include optogenetic stimulation delivered from a 2P system, and the metadata for all of that overlaps, then probably yes.
From https://github.com/NeurodataWithoutBorders/helpdesk/discussions/64#discussioncomment-8026504 posted by @ehennestad
I would argue that the indicator should not be coupled to an optical channel. True, optical channels should be optimised to capture signal only from the indicator of interest and block out other indicators, but this might not always be true (bleedthrough is a common problem). Furthermore, multiple indicators are present in an imaging plane, regardless of the configuration (filters) of the optical channel and whether an optical channel is active or not. On the other hand, I understand that the imaging plane is an abstraction, and in the ideal case it is coupled to a "perfect" optical channel, thus capturing only one indicator. I don't have a strong opinion either way; I was just curious what common practice is.
I really liked this suggestion though:
but it should maybe be its own neurodata type so that it can link to the optic channel, since an optic channel exists to target the emission from a specific indicator (not several indicators); perhaps the optic channel field would be `target_indicator`
I still think it makes most sense to define the indicator somewhere else (e.g. associated with the subject) and keep the optical channel independent from it. The optical channel is the same from one session to the next, independent of what kind of subject or indicator is being imaged.
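The decoupling argued for above could look like the following sketch (plain dataclasses for illustration only; the `indicator` and `target_indicator` field names and placements are assumptions under discussion, not settled schema):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Subject:
    subject_id: str
    # The indicator lives with the subject: it describes what was expressed
    # or injected, not the imaging hardware. (Field placement is illustrative.)
    indicator: str

@dataclass
class OpticChannel:
    # The channel describes hardware only, so it stays identical from session
    # to session regardless of which subject or indicator is being imaged.
    name: str
    emission_lambda: float  # nm
    # Optional, loose coupling: the indicator the filter set was chosen for.
    # "Target" acknowledges bleed-through; it is an intent, not a guarantee.
    target_indicator: Optional[str] = None

mouse = Subject(subject_id="m001", indicator="GCaMP6f")
green = OpticChannel(name="green", emission_lambda=513.0, target_indicator="GCaMP6f")
```

With this shape, the same `OpticChannel` definition can be reused unchanged across sessions and subjects, while the subject record still says which indicator was actually present.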
@alessandratrapani @pauladkisson FYI: Multichannel volumetric imaging NEP https://github.com/nwb-extensions/nep-review/issues/3
Something else that's come up in my latest conversion is a slightly irregular Z-axis (depth) on each frame acquisition, such that a regular `grid_spacing` as we have now doesn't exactly capture that descriptor of the volume.
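One way to capture an irregular depth axis (a sketch only; `plane_depths_um` is a hypothetical field name, and the numbers are made-up measurements) would be to store explicit per-plane Z coordinates instead of a single Z step:

```python
# Hypothetical: explicit per-plane Z coordinates instead of a scalar Z step.
# A regular grid_spacing like [1.1, 1.1, 30.0] assumes uniform depth steps;
# an irregular volume needs the actual depth of each acquired plane.
plane_depths_um = [0.0, 28.5, 61.0, 89.5, 122.0]  # measured, not uniform

# The irregularity shows up in the successive differences:
steps = [b - a for a, b in zip(plane_depths_um, plane_depths_um[1:])]
print(steps)  # the steps differ, so no single grid_spacing entry describes Z

# In-plane (X/Y) spacing can stay regular while Z is listed explicitly:
grid_spacing_xy_um = [1.1, 1.1]
```

This keeps the common case cheap (X/Y stay a two-element spacing) while letting Z be a vector whose length equals the number of planes.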
i) Create a separate neurodata type for `OpticChannel`s, rather than them being a special thing generated by the `ImagingPlane` table. This will give the ability to reuse the same optic channels across different `ImagingPlane`s.
ii) A generic `MicroscopySeries` to replace `OnePhotonSeries` and `TwoPhotonSeries`; I also saw a recent talk where 3P was gaining popularity, and we've used `OnePhotonSeries` for light-sheet data before, an association that raised eyebrows.