jvitordm opened this issue 1 year ago
You can do it via software, but a software trigger has to be sent to the cameras one by one, because USB has no broadcast communication capability. Depending on your system setup, this means the cameras will not take their images at exactly the same moment.
What is your requirement on the maximum deviation between the points in time at which your cameras take an image?
Hi @thiesmoeller, so, I'm able to acquire images at 12 fps. With that, I understand that if I were able to always combine the last acquired frames, I'd have at most approx. a 0.08 s difference between frames, which is not great, but also not the end of the world for my application. I was wondering how much better I could get with triggering from software... If that's still not good enough, I'd then consider moving to a hardware trigger (my seller recommended this option: https://www.baslerweb.com/en/products/accessories-and-bundles/dart-i-o-board-starter-kit/ - I'd have to check if it works with the NVIDIA Orin dev kit I'm using...)
Any comments and code sample suggestions are highly appreciated, thanks!
The inter-camera delay would be 400-600 µs when you send the software triggers one by one.
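A minimal sketch of that back-to-back software triggering, based on the official samples and assuming the cameras enumerate over USB. Note that the perf_counter reading only measures the host-side trigger loop, not the true inter-exposure delay:

```python
import time
from pypylon import pylon

tlf = pylon.TlFactory.GetInstance()
devices = tlf.EnumerateDevices()

cameras = pylon.InstantCameraArray(len(devices))
for i, cam in enumerate(cameras):
    cam.Attach(tlf.CreateDevice(devices[i]))
    # Same software trigger setup as in grabstrategies.py, applied per camera
    cam.RegisterConfiguration(pylon.SoftwareTriggerConfiguration(),
                              pylon.RegistrationMode_ReplaceAll,
                              pylon.Cleanup_Delete)

cameras.StartGrabbing(pylon.GrabStrategy_LatestImageOnly)

# USB has no broadcast, so the triggers go out one by one, back to back.
t0 = time.perf_counter()
for cam in cameras:
    if cam.WaitForFrameTriggerReady(200, pylon.TimeoutHandling_ThrowException):
        cam.ExecuteSoftwareTrigger()
print(f"host-side trigger loop: {(time.perf_counter() - t0) * 1e6:.0f} µs")

# Collect one result per camera; GetCameraContext tells which camera it came from.
for _ in range(len(devices)):
    res = cameras.RetrieveResult(5000, pylon.TimeoutHandling_ThrowException)
    print(res.GetCameraContext(), res.GrabSucceeded())
    res.Release()

cameras.StopGrabbing()
```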
Hi @SMA2016a, that's interesting. I have a big object to be captured with the 4 cameras that will be moving at approx. 0.25 m/s, so if I'm able to capture the images with such a small delay, the maximum displacement would be around 0.15 mm, which is great.
Should I be able to implement it based on one of the samples, such as https://github.com/basler/pypylon/blob/26009c3f0b4490ab06800da6feebf7e15349b5cb/samples/grabmultiplecameras.py or https://github.com/basler/pypylon/blob/26009c3f0b4490ab06800da6feebf7e15349b5cb/samples/grabstrategies.py ?
Would I then also be able to get timestamps to check the delay? Thank you!
Yes, these are the samples you need. When you start your application, read out each camera's clock in ticks once. Then you know the time offset between the cameras. This value needs to be subtracted from the timestamp you get in the grab result.
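A hedged sketch of that latch-once-and-subtract idea, assuming the cameras expose the standard TimestampLatch / TimestampLatchValue nodes (worth verifying for the daA3840-45uc in pylon Viewer) and that the grab result reports the camera timestamp via GetTimeStamp() in the same tick unit:

```python
from pypylon import pylon

tlf = pylon.TlFactory.GetInstance()
devices = tlf.EnumerateDevices()

cameras = pylon.InstantCameraArray(len(devices))
for i, cam in enumerate(cameras):
    cam.Attach(tlf.CreateDevice(devices[i]))
    cam.Open()

# Latch every camera's free-running clock once at start-up.
# The latched values are the per-camera offsets to subtract later.
start_ticks = []
for cam in cameras:
    cam.TimestampLatch.Execute()              # assumption: node available on this model
    start_ticks.append(cam.TimestampLatchValue.GetValue())

cameras.StartGrabbing(pylon.GrabStrategy_LatestImageOnly)
# ... trigger the cameras as shown above ...
res = cameras.RetrieveResult(5000, pylon.TimeoutHandling_ThrowException)
idx = res.GetCameraContext()
# Timestamp relative to this camera's own start-up latch; comparing these
# values across cameras shows how far apart the exposures really were.
rel_ticks = res.GetTimeStamp() - start_ticks[idx]
```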
That's great, will work on it and let you know if I have any further doubts. Thank you!
@thiesmoeller
Talking about the hardware trigger, here is a quote from your reply in https://github.com/basler/pypylon/issues/470#issuecomment-1046035362:
"For HW Trigger, you can also use one of the cameras as "main-camera" and trigger the others from the first camera`s timing. You just have to wire up the IO-Ports of the cameras Or use the UserOutput of one camera to trigger itself and the other ones."
Is it then possible to use one camera to trigger the others once it identifies, for example, that the object is present? Alternatively, I'm using an NVIDIA Orin (https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/jetson-orin/); could I use it to send the trigger signal? What would you recommend?
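For reference, a sketch of the "main camera" variant quoted above, with all trigger inputs wired to the main camera's output line. The specific line and enum names (Line1, Line2, UserOutput1) are assumptions; which lines exist, their direction, and their wiring depend on the camera model and the I/O board, so check them in pylon Viewer first:

```python
from pypylon import pylon

# cams: a plain list of already attached and opened pypylon.InstantCamera objects,
# e.g. cams = list(cameras) after the set-up shown earlier in this thread.
def configure_hw_trigger(cams):
    main = cams[0]

    # Main camera: drive an output line from a user-settable bit.
    main.LineSelector.SetValue("Line2")        # assumed output-capable line
    main.LineMode.SetValue("Output")
    main.LineSource.SetValue("UserOutput1")

    # All cameras (including the main one, if its own input is wired to that line)
    # wait for a rising edge on their trigger input.
    for cam in cams:
        cam.TriggerSelector.SetValue("FrameStart")
        cam.TriggerMode.SetValue("On")
        cam.TriggerSource.SetValue("Line1")    # assumed input line
        cam.TriggerActivation.SetValue("RisingEdge")

def fire_trigger(main):
    # One synchronized trigger: toggle the user output bit that drives the line.
    main.UserOutputSelector.SetValue("UserOutput1")
    main.UserOutputValue.SetValue(True)
    main.UserOutputValue.SetValue(False)
```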
Update: I was able to get some good initial results with the implementation from https://github.com/basler/pypylon/issues/91
Nevertheless, it is still a latest-image approach. I was not able to implement a software trigger for multiple cameras once the camera variable is initialized as below (in grabmultiplecameras.py): cameras = pylon.InstantCameraArray(2)
I then cannot use the following for the software trigger (from grabstrategies.py): camera.RegisterConfiguration(pylon.SoftwareTriggerConfiguration(), pylon.RegistrationMode_ReplaceAll, pylon.Cleanup_Delete)
The following error is raised: AttributeError: 'InstantCameraArray' object has no attribute 'RegisterConfiguration'
As you are running on an Orin, you could use its GPIO to generate the trigger signal for your cameras.
As all cameras run on their own time source, they will slightly drift relative to each other. A HW trigger is the only way to properly synchronize the cameras.
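A rough sketch of generating that trigger from the Orin, assuming NVIDIA's Jetson.GPIO package and a header pin wired (with level shifting and protection appropriate for the camera's GPIO specification) to the trigger inputs of all cameras, which themselves are configured for hardware trigger as sketched above:

```python
import time
import Jetson.GPIO as GPIO   # assumption: standard Jetson.GPIO package on the Orin dev kit

TRIGGER_PIN = 7              # hypothetical 40-pin header pin; pick one that is GPIO-capable

GPIO.setmode(GPIO.BOARD)
GPIO.setup(TRIGGER_PIN, GPIO.OUT, initial=GPIO.LOW)

try:
    while True:
        # One rising edge per second; all cameras see the same electrical edge.
        GPIO.output(TRIGGER_PIN, GPIO.HIGH)
        time.sleep(0.001)    # pulse width; respect the camera's minimum trigger pulse width
        GPIO.output(TRIGGER_PIN, GPIO.LOW)
        time.sleep(0.999)
finally:
    GPIO.cleanup()
```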
To your RegisterConfiguration question: you have to do this for each camera in the InstantCameraArray, just as if you were setting e.g. ExposureTime.
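A small sketch of that per-camera loop, based on the attach pattern from grabmultiplecameras.py:

```python
from pypylon import pylon

tlf = pylon.TlFactory.GetInstance()
devices = tlf.EnumerateDevices()

cameras = pylon.InstantCameraArray(2)
for i, cam in enumerate(cameras):
    cam.Attach(tlf.CreateDevice(devices[i]))
    # RegisterConfiguration is a per-InstantCamera call, not available on the array itself
    cam.RegisterConfiguration(pylon.SoftwareTriggerConfiguration(),
                              pylon.RegistrationMode_ReplaceAll,
                              pylon.Cleanup_Delete)

cameras.StartGrabbing(pylon.GrabStrategy_LatestImageOnly)
```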
Hello there,
I wanted to sync images from 4x daA3840-45uc cameras. The requirements are not that strict - I only need a single frame from each of them approximately every 1 s. So I wanted to try a software sync before going to a hardware sync, if that's really needed.
I'm new to pypylon, so any help on how to start with it is appreciated, Thanks!