diarmaidocualain opened this issue 1 year ago
Each frame has a timestamp (the start of the exposure), which you can read out in the callback. You can associate the frame IDs with the timestamps to check in your code what the actual capture times of the frames are. Which cameras are you using? Please also check the User Guide sections on hardware triggers and the GPIO interfaces; note in particular that with a Rolling Shutter camera you can currently only trigger at half of the maximum FPS.
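As a sketch of recording those values in the callback (this assumes a vmbpy-style streaming handler where the frame exposes `get_id()` and `get_timestamp()`; the buffer and handler names are my own, not from the original post):

```python
import queue

# Thread-safe buffer of (frame_id, timestamp) records filled by the callback.
frame_records = queue.Queue()

def frame_handler(cam, stream, frame):
    # vmbpy-style streaming callback: record the frame's ID and the
    # timestamp of the start of exposure, then requeue the frame so the
    # camera can reuse the buffer.
    frame_records.put((frame.get_id(), frame.get_timestamp()))
    cam.queue_frame(frame)
```

The same handler can be registered on both cameras, each writing to its own queue, so the two (ID, timestamp) streams can be compared afterwards.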
You can also check the timing of the trigger and the exposure by using the GPIOs and a logic analyzer to observe the ExposureActive output of both cameras.
The settings for that are almost the same as the ones for the trigger of the Controller, just on a different line:
LineSelector = LineX
LineMode = Output
LineSource = ExposureActive
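Those three features can also be set from Python; a minimal sketch, assuming vmbpy-style attribute access to camera features (the line name 'Line1' is only an example, check which GPIO lines your camera exposes):

```python
def enable_exposure_active_output(cam, line='Line1'):
    """Route the ExposureActive signal to a GPIO line so a logic
    analyzer can compare trigger and exposure timing on both cameras.

    `line` is hardware-dependent; 'Line1' is just an illustrative default.
    """
    cam.LineSelector.set(line)
    cam.LineMode.set('Output')
    cam.LineSource.set('ExposureActive')
```

Run this on both cameras (with the chosen line wired to the analyzer) and compare the two ExposureActive traces.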
If you find a discrepancy here, it is most likely that the settings are not correct. Please check them in Vimba Viewer and attach screenshots of the various I/O, trigger, PixelFormat, and ExposureTime settings.
Also note that if the exposure time is longer than the period between triggers on the follower, the excess triggers will be ignored.
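To make that constraint concrete: a trigger arriving before the previous exposure has finished is dropped, so the trigger period must be at least the exposure time, which bounds the usable trigger rate at 1/exposure. A quick arithmetic helper (pure Python, no camera needed; the function name is my own):

```python
def max_trigger_rate_hz(exposure_time_us):
    # Triggers arriving before the previous exposure ends are ignored,
    # so the trigger period must be >= exposure time. With the exposure
    # given in microseconds, the rate ceiling in Hz is 1e6 / exposure.
    return 1_000_000 / exposure_time_us

# e.g. a 25 ms (25000 us) exposure caps the follower's trigger rate at 40 Hz
```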
Thanks Teresa-AlliedVision for the information. I was not aware that there was a timestamp that could be queried from the frame in vmbpy. Yes, I can use this to sync up the pairs of frames by matching the closest timestamps. I will look for documentation that covers it. I am using the https://www.alliedvision.com/en/products/alvium-configurator/alvium-1800-u/2050/ model cameras. I will have a look at the documentation you are referring to. Thanks!
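One way to do that matching: for each controller record, pick the follower record with the nearest timestamp and reject pairs further apart than a tolerance. A minimal sketch, assuming the buffers are lists of `(frame_id, timestamp)` tuples as collected in the callbacks; note that timestamps from two independent USB cameras may run on separate clocks, so an offset calibration between the two time bases may be needed first:

```python
def pair_by_timestamp(buf_a, buf_b, tolerance):
    """Pair (frame_id, timestamp) records from two cameras by nearest
    timestamp, discarding candidates further apart than `tolerance`
    (same unit as the timestamps). Each follower frame is used once."""
    pairs = []
    b_remaining = sorted(buf_b, key=lambda r: r[1])
    for id_a, ts_a in sorted(buf_a, key=lambda r: r[1]):
        if not b_remaining:
            break
        # Follower record with the closest timestamp to this controller frame.
        best = min(b_remaining, key=lambda r: abs(r[1] - ts_a))
        if abs(best[1] - ts_a) <= tolerance:
            pairs.append((id_a, best[0]))
            b_remaining.remove(best)
    return pairs
```

For large buffers a linear merge over the two sorted lists would be cheaper than the `min()` scan, but this version keeps the idea obvious.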
Yes, the 2050 is a Rolling Shutter (RS) sensor, so with the hardware trigger you don't get the full frame rate, only half at the moment. With the upcoming firmware 13 you will be able to achieve almost the full frame rate with hardware trigger; until then, I recommend setting the controller to half the max FPS and setting DeviceLinkThroughputLimit (the data rate of the camera in bytes/s) as high as your hardware allows before you get incomplete frames. A higher data rate gives you a higher maximum FPS in general.
Hi, please let me know if this is the wrong area to ask this question; I'm not sure where else to go with it.
I am looking to synchronize frames captured by two Allied Vision USB cameras. To that end, I followed this guide on hardwiring them: https://forum.alliedvision.com/t/how-to-synchronize-alvium-usb-cameras/232
I followed the steps and was able to make a cable and trigger a capture from two cameras, a controller and a follower. However, I cannot understand how to pair the synchronized frames with each other after capture. I am using vmbpy in Python. The cameras are set up as above, and then instructed to start streaming in the main code as follows:
You can see that they are set up with a callback handler function. In this function, I write the frames and their frame IDs to a buffer.
I then have a buffer of frames and their IDs from each camera. However, when I match up pairs of frames by their ID, they are not in sync as I would have expected.
Would you have any suggestions on how to capture pairs of frames that are in sync with a reasonable FPS rate?
Thanks,