AllenNeuralDynamics / aind-data-schema

A library that defines AIND data schema and validates JSON.
MIT License

Add support for lick detection #723

Closed: mochic closed this issue 8 months ago

mochic commented 9 months ago

The NP* rigs and behavior boxes use lick sensors. We would like to add a reference to a lick detector on reward delivery. We mainly need to know whether it is capacitive or piezoelectric and what its model number is.
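For illustration, a minimal sketch of the kind of schema entry this implies, assuming a pydantic-style model as elsewhere in this library. The class and field names below (LickSensorType, LickDetector, lick_detector_name) are hypothetical and not part of the current aind-data-schema API; in practice the detector would extend whatever Device base class the library already defines rather than a bare BaseModel.

```python
from enum import Enum
from typing import Optional
from pydantic import BaseModel, Field


class LickSensorType(str, Enum):
    """Sensing technology used by the lick detector (illustrative)."""
    CAPACITIVE = "Capacitive"
    PIEZOELECTRIC = "Piezoelectric"


class LickDetector(BaseModel):
    """Hypothetical lick detector device entry."""
    name: str = Field(..., title="Device name")
    lick_sensor_type: LickSensorType = Field(..., title="Lick sensor type")
    model: Optional[str] = Field(None, title="Model number")
    serial_number: Optional[str] = Field(None, title="Serial number")


class RewardSpout(BaseModel):
    """Hypothetical reward spout that references its lick detector by name."""
    name: str
    lick_detector_name: Optional[str] = Field(
        None, description="Name of the LickDetector attached to this spout"
    )
```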

saskiad commented 8 months ago
bruno-f-cruz commented 8 months ago

Before this gets merged, I just wanted to add a few points that might be relevant in the short term, and probably the long term as well...

  1. We will also need to add information to the schemas for a new lick detector developed in house (https://github.com/AllenNeuralDynamics/harp.device.lickety-split). This is a full Harp device whose sole purpose is capacitive lick detection.

  2. I'm not sure a LickDetector class as a first-class device solves this problem either. Unless the lick detector is its own device and we will never consider adding a different kind of "LickDetector" (see point 1, though...), establishing a Device subclass based on its application seems a bit odd. The way I see it, if the lick detector in question is indeed its own device, we should model it as such, but perhaps under a different name (maybe the manufacturer's name, as is done with OpenEphysAcquisitionBoard). Alternatively, if the lickometer is simply an input to a DAQ device's I/O, it should be modeled as such (probably as a channel).

  3. I still think that having a reference in RewardSpout is warranted, but something more general like Union[Device, DAQChannel] might be worth considering (see the sketch after this list).

  4. Finally, if there is indeed a case where the schemas should model the application (e.g. Lickometer) in addition to the device (e.g. nidaq:channel0), I think it might be generally more useful to come up with a distinct class that models these concepts. We could think of ways to map application:device to make the relationship explicit.
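To make points 3 and 4 concrete, here is a rough sketch using placeholder classes rather than the real Device, DAQChannel, and RewardSpout definitions from this repo; the names and fields are only illustrative of the shape being proposed, not an actual implementation.

```python
from typing import Optional, Union
from pydantic import BaseModel, Field


# Stand-ins for the existing aind-data-schema classes; fields are illustrative only.
class Device(BaseModel):
    name: str


class DAQChannel(BaseModel):
    device_name: str
    channel_name: str


class RewardSpout(BaseModel):
    """Point 3: let the spout reference either a standalone device or a DAQ channel."""
    name: str
    lick_detector: Optional[Union[Device, DAQChannel]] = Field(
        None, description="Whatever physically detects licks at this spout"
    )


class Lickometer(BaseModel):
    """Point 4: model the application separately and map it onto hardware explicitly."""
    name: str
    device: Union[Device, DAQChannel] = Field(
        ..., description="Hardware that implements this lickometer"
    )
```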