It would be useful to have simulated devices for development. Each manipulator would have an angle and a true transformation matrix. Commands would take effect after some delay, so they should run in a separate thread. The camera would generate an image from the true position of the manipulator, including a simulation of defocus. The image could be built from saved images of a pipette (which we could rotate, for example) plus noise (generated, for example, from two or three images of the same configuration); we might also want to use images of a coverslip, or of a slice/culture (e.g. to simulate movements of the slice).
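A minimal sketch of this idea, with several assumptions: the class names (`SimulatedManipulator`, `SimulatedCamera`), the axis angle, the delay value, and the rendering are all made up for illustration. Instead of saved pipette images, the pipette tip is drawn as a bright disc, and defocus is approximated by spreading the same total intensity over a disc whose radius grows with the distance from the focal plane; Gaussian noise stands in for the noise estimated from repeated images.

```python
import threading
import time

import numpy as np


class SimulatedManipulator:
    """Hypothetical simulated manipulator: a true position in its own axes,
    an axis angle defining the true transformation matrix, and move commands
    that take effect after a delay on a worker thread."""

    def __init__(self, angle_deg=30.0, delay=0.05):
        theta = np.radians(angle_deg)
        # True manipulator-to-stage transformation (rotation about y).
        self.M = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                           [0.0, 1.0, 0.0],
                           [-np.sin(theta), 0.0, np.cos(theta)]])
        self.position = np.zeros(3)  # true position, manipulator coordinates
        self.delay = delay
        self._lock = threading.Lock()

    def move_to(self, target):
        """Issue a move command; it completes asynchronously after `delay`."""
        def worker():
            time.sleep(self.delay)
            with self._lock:
                self.position = np.asarray(target, dtype=float)
        t = threading.Thread(target=worker)
        t.start()
        return t  # caller may join() to wait for the move to finish

    def true_stage_position(self):
        with self._lock:
            return self.M @ self.position


class SimulatedCamera:
    """Renders the pipette tip from the manipulator's true position.
    Defocus: the tip's fixed total intensity is spread over a disc whose
    radius grows with |z - focal_z|, so the spot dims and widens out of
    focus. Gaussian noise stands in for sensor noise."""

    def __init__(self, shape=(64, 64), focal_z=0.0, noise_std=2.0, seed=0):
        self.shape = shape
        self.focal_z = focal_z
        self.noise_std = noise_std
        self.rng = np.random.default_rng(seed)

    def snap(self, manipulator):
        x, y, z = manipulator.true_stage_position()
        h, w = self.shape
        img = np.zeros(self.shape)
        radius = 1.0 + 0.5 * abs(z - self.focal_z)  # defocus blur radius
        yy, xx = np.mgrid[0:h, 0:w]
        disc = (xx - x) ** 2 + (yy - y) ** 2 <= radius ** 2
        if disc.any():
            img[disc] = 255.0 / disc.sum()  # dimmer per pixel when defocused
        img += self.rng.normal(0.0, self.noise_std, self.shape)
        return img
```

Saved images of a coverslip or slice could later replace the blank background, with small random shifts of the slice image simulating tissue movement.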