Open alanruttenberg opened 6 months ago

It looks like telescope and microscope both satisfy the definition of camera and so would be subclasses. Currently they are siblings.
I see how it could be interpreted this way, but the operative parts of the definitions are distinct.
Cameras record, while microscopes enable users to see and telescopes aid in observation. I think the emphasis here should be on 'record', and this could be added as a clarifying annotation.
Unless this is not a convincing interpretation.
The problem is that some telescopes and microscopes are built to record. The telescopes I deal with in my domain are built to record; they are cameras that happen to have a telescope's lens structure. So there will be multiple inheritance. This needs to be factored better. There is the optics aspect (what kind of lens or mirror is used) and a separate aspect of whether the instrument is used to record or not. All things with lenses or mirrors in optical systems share some properties, such as focal length. Imaging sensors have a distinct set of properties (spectral sensitivity, detector size, detector technology).
Off the top of my head, there should probably be a function realized in recording an image, and Camera should be defined in terms of that function. The spectral range should also be factored out into a relationship so it can be used in defined classes, and those can be re-used for optical filters: there's only Filter in CCO, but I will need to represent optical filters of various sorts. That's another multiple inheritance issue. An optical filter would be both an optical instrument and a Filter. Filter is again the sort of class that should be a defined class, defined in terms of a function. Also notice that there are parallel subclasses Infrared Camera and Infrared Telescope, and nothing that links them both as being sensitive to infrared.
By defined class I mean a class that has an equivalentClasses axiom as part of its definition.
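Roughly what I have in mind, as a Manchester-syntax sketch. Every name below (has_function, is_sensitive_to, the class labels) is a placeholder for discussion, not an actual CCO IRI or term:

```
Prefix: : <http://example.org/imaging-refactor#>
Ontology: <http://example.org/imaging-refactor>

# Placeholder names throughout; none of these are actual CCO IRIs or labels.

ObjectProperty: has_function
ObjectProperty: is_sensitive_to    # stand-in for a spectral-range relation

Class: MaterialArtifact
Class: OpticalInstrument
    SubClassOf: MaterialArtifact
Class: Telescope
    SubClassOf: OpticalInstrument

Class: ImageRecordingFunction
Class: LightFilteringFunction
Class: InfraredRange

# Camera defined by the recording function, independently of its optics.
Class: Camera
    EquivalentTo: MaterialArtifact and (has_function some ImageRecordingFunction)

# Optical filter defined by a filtering function, so it falls under both the
# optics hierarchy and a function-based Filter class by inference.
Class: OpticalFilter
    EquivalentTo: OpticalInstrument and (has_function some LightFilteringFunction)

# Spectral sensitivity factored into a relation shared by the parallel classes.
Class: InfraredCamera
    EquivalentTo: Camera and (is_sensitive_to some InfraredRange)
Class: InfraredTelescope
    EquivalentTo: Telescope and (is_sensitive_to some InfraredRange)

# A defined class that links the two infrared subclasses.
Class: InfraredSensitiveInstrument
    EquivalentTo: is_sensitive_to some InfraredRange
```

With axioms along these lines a reasoner would classify Infrared Camera and Infrared Telescope under the same infrared-sensitive defined class, and anything asserted to have the recording function would come out as a Camera, without hand-asserted multiple inheritance.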
Also note the definition of the parent 'imaging instrument': A Material Artifact that is designed to produce images (visual representations) of an entity.
A visual representation is an information entity. If a telescope merely helps you see something by focusing light, it is not producing a representation or an image. Being a representation is more than just perceiving what is there.
That's a good point; it seems to produce material entities which bear representative ICEs. I think 'produce' is fine because the instrument does produce an image even if it may not record one (as a camera does). It seems to me that imaging instruments and their subclasses create concretizations using light, and those concretizations bear information which represents the entities that interact with that light.
But this might not be accurate because the light isn't 'captured' in a simple telescope, but just concentrated so as to let the light hit your eye (or a camera) in a particular way.
> But this might not be accurate because the light isn't 'captured' in a simple telescope, but just concentrated so as to let the light hit your eye (or a camera) in a particular way.
Right. Unless you consider the pattern of activation of your vision cells or brain to be a concretization, which is pushing it. There's also a difference between a screen that has a concretization, like your phone's camera continuously displaying what's in front of it on the screen, and what happens when you tap to take an image and it is saved to nonvolatile storage. Another example of that is a ground glass screen onto which some telescopes project.
But more importantly, this needs to be factored better. The focusing element can be mixed and matched with whether or not there is a sensor making a recording. Sensors will have a hierarchy, focusing elements will have a hierarchy, and they will be combined using defined classes of artifacts that have both as parts, together with a specific function, as in the sketch below.
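For example, under the same placeholder-name caveat as above (has_part stands in for whatever parthood relation is appropriate):

```
Prefix: : <http://example.org/imaging-refactor#>
Ontology: <http://example.org/imaging-refactor-parts>

# Placeholder names again; not actual CCO terms.

ObjectProperty: has_part
ObjectProperty: has_function

Class: MaterialArtifact

# Focusing elements form one hierarchy.
Class: FocusingElement
Class: ObjectiveLens
    SubClassOf: FocusingElement
Class: PrimaryMirror
    SubClassOf: FocusingElement

# Imaging sensors form another, with their own properties
# (spectral sensitivity, detector size, detector technology).
Class: ImagingSensor
Class: CCDSensor
    SubClassOf: ImagingSensor

Class: ImageRecordingFunction

# The combinations are then defined classes rather than hand-built siblings:
# anything with a focusing element and an imaging sensor as parts, and a
# recording function, is classified here by the reasoner.
Class: RecordingOpticalInstrument
    EquivalentTo: MaterialArtifact
        and (has_part some FocusingElement)
        and (has_part some ImagingSensor)
        and (has_function some ImageRecordingFunction)
```

My recording telescopes would then be classified as both telescopes (by their optics) and recording instruments (by their parts and function), without asserting the multiple inheritance by hand.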
Alan, can we work on refactoring both this and the optical lenses at the same time? Offline perhaps until we have something concrete?
Yes that makes sense.