basler / pypylon

The official python wrapper for the pylon Camera Software Suite
http://www.baslerweb.com
BSD 3-Clause "New" or "Revised" License

Ensure synchronized grabbing from multiple cameras #598

Open dahleri opened 1 year ago

dahleri commented 1 year ago

Hi, I am building a stereo application where it is important to ensure that the processed frames were acquired synchronously, at the same point in time, in order to triangulate points from the different views. I am syncing the image acquisition through PTP and a periodic signal. The computing time for the stereo algorithm is very unstable and sometimes takes longer than one frame period at the fixed framerate. I struggle to decide on the most appropriate grabbing strategy.

I would like to use "LatestImageOnly" to be able to compare the output from my stereo application to other sensors, but for some reason it sometimes skips frames from one or more of the cameras. I don't know why this happens, but could it be because the images are not transferred at the same time, so the code skips one of the frames if it is not ready in the buffer yet? Is there a way to get around this problem?

I must keep track of when the images were acquired in order to compare my output to other sensors, so I don't want a queue building up. At the same time, I must ensure that the images were acquired at the same time and cannot allow one camera to skip an image and lose sync.

How would you approach this problem? Thank you very much in advance!

SMA2016a commented 1 year ago

[I would like to use "LatestImageOnly" to be able to compare the output from my stereo application to other sensors, but for some reason it sometimes skips frames from one or more of the cameras] Most probably the image processing takes longer than 1/fps. Use GrabStrategy_OneByOne instead of LatestImageOnly.

Set the packet size to 8000 and tune the inter-packet delay: test a value by increasing it in steps of 500.

dahleri commented 1 year ago

Thank you for your response! To clarify, it's not a problem that we sometimes skip images. This is rather expected, since I know the processing sometimes takes longer than 1/fps. The problem is when only one camera skips an image, because we then lose sync, which can be seen by comparing the timestamps.

I am confused about how this happens. I would ideally always retrieve the latest acquired images from both cameras and be able to trust that they both were taken at the same time.

I will try adjusting the packet size and inter-packet delay, but do you think they will have any impact if the problem is not a buffer underrun?

thiesmoeller commented 1 year ago

From a system architecture point of view, I would recommend decoupling the stereo processing from the capturing of the image pairs.

The pylon APIs will only take care of the images per camera. But your application has to handle "LatestPairOnly"...

For example, you could capture the pairs and push them into a fixed-length queue, and run your stereo code on the other end of the queue.
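The decoupling suggested above can be sketched with Python's standard library alone (no camera API needed for the pattern): one thread produces image pairs into a bounded queue and drops the oldest pair when the stereo thread falls behind, so latency stays bounded without losing left/right pairing. The simulated grab loop and all names here are illustrative.

```python
import queue
import threading

pair_queue = queue.Queue(maxsize=2)  # fixed length: bounds the latency

def grab_loop(n_pairs):
    """Producer: stands in for a thread that calls RetrieveResult on both cameras."""
    for frame_id in range(n_pairs):
        pair = (f"left_{frame_id}", f"right_{frame_id}")  # stand-in for two grab results
        while True:
            try:
                pair_queue.put_nowait(pair)
                break
            except queue.Full:
                try:
                    pair_queue.get_nowait()  # drop the oldest pair ("latest pairs only")
                except queue.Empty:
                    pass
    pair_queue.put(None)  # sentinel: no more pairs

def stereo_loop(results):
    """Consumer: stands in for the (possibly slow) stereo processing."""
    while True:
        pair = pair_queue.get()
        if pair is None:
            break
        results.append(pair)

results = []
producer = threading.Thread(target=grab_loop, args=(5,))
consumer = threading.Thread(target=stereo_loop, args=(results,))
producer.start(); consumer.start()
producer.join(); consumer.join()

# Pairs may be dropped under load, but each consumed pair stays left/right matched,
# and the newest pair is never dropped.
print(results[-1])  # → ('left_4', 'right_4')
```

Because whole pairs are enqueued and dropped together, the two views can never get out of step with each other, regardless of how slow the stereo side is.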

dahleri commented 1 year ago

Do you mean that the acquisition and processing are done in separate threads and buffers stored in a customised buffer factory?

Will have a look at the samples Grab_UsingBufferFactory and Grab_UsingGrabLoopThread to test if that solves my problem. Will let you know the results as soon as I have managed to implement the suggestions.

thiesmoeller commented 1 year ago

You don't have to use a buffer factory. Just have one grab thread that handles the cameras and provides pairs of images in a queue. Another thread then does the stereo processing and reads from this queue.

dahleri commented 1 year ago

After some initial tests my problem still remains. I am currently testing the grab loop without my stereo application running, in order to verify my code before threading it. The grab loop takes less than 1/fps, so that should not be the problem. The behaviour seems quite random: sometimes it works the way I want, and sometimes not. I will demonstrate the code and output with some C++ pseudocode and images below:

I don't know why our results vary; how can we make it more stable? Our desired behaviour is to keep the timestamps between the right and left cameras in sync.

// Block that configures the cameras with CActionTriggerConfiguration

cameras.Open();
// Same configuration for both cameras
CBooleanParameter(nodemap, "PtpEnable").SetValue(true);
CFloatParameter(nodemap, "BslPeriodicSignalPeriod").SetValue(1'000'000 / float(c_frameRate));
CEnumParameter(nodemap, "TriggerSelector").SetValue("FrameStart");
CEnumParameter(nodemap, "TriggerMode").SetValue("On");
CEnumParameter(nodemap, "TriggerSource").SetValue("PeriodicSignal1");
CFloatParameter(nodemap, "BslPeriodicSignalDelay").SetValue(0.0);

// Block that enables PTP and lets the cameras synchronise with PtpDataSetLatch over ~30 seconds

cameras.StartGrabbing(GrabStrategy_LatestImageOnly);
while (true)
{
    cameras[0].RetrieveResult(5000, ptrGrabResult_Left, TimeoutHandling_ThrowException);
    cameras[1].RetrieveResult(5000, ptrGrabResult_Right, TimeoutHandling_ThrowException);

    // Print statements for debugging purposes:
    cout << "Left ID:   " << ptrGrabResult_Left->GetID() << "  and timestamp:  " << ptrGrabResult_Left->GetTimeStamp() << endl;
    cout << "Right ID:  " << ptrGrabResult_Right->GetID() << "  and timestamp:  " << ptrGrabResult_Right->GetTimeStamp() << endl;
    cout << "Difference between Left and Right: " << std::fixed << setprecision(2) << (double)ptrGrabResult_Left->GetTimeStamp() - (double)ptrGrabResult_Right->GetTimeStamp() << endl;
    cout << "Buffer ready - Left: " << cameras[0].NumReadyBuffers.GetValue() << " Right: " << cameras[1].NumReadyBuffers.GetValue() << endl;
    cout << "Skipped left: " << ptrGrabResult_Left->GetNumberOfSkippedImages() << " Right: " << ptrGrabResult_Right->GetNumberOfSkippedImages() << endl;
}

The output is shown in the following images:

// 1: Desired behaviour [PNG image]
// 2: Error from start [PNG image]
// 3: Skipped one frame and achieved desired behaviour (still not acceptable) [PNG image]

thiesmoeller commented 1 year ago

Hi @dahleri ,

the issue with your setup is that you enforce an order in which the data from the two cameras is received. This can lead to your problem: you wait for the left buffer while the right buffer is already overwritten by the 'LatestImageOnly' policy.
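This race can be illustrated with a small deterministic toy model (no real camera API is used; all names are made up): each camera keeps only its newest frame, mimicking the 'LatestImageOnly' policy, and retrieving left before right lets a newer right frame slip in.

```python
class LatestOnlySlot:
    """One-slot buffer: a toy model of the 'LatestImageOnly' grab strategy."""
    def __init__(self):
        self.frame = None
    def deliver(self, frame_id):   # camera side: a new frame overwrites the slot
        self.frame = frame_id
    def retrieve(self):            # application side: take whatever is newest
        frame, self.frame = self.frame, None
        return frame

left, right = LatestOnlySlot(), LatestOnlySlot()

# Frame 0 arrives on both cameras at the same trigger.
left.deliver(0)
right.deliver(0)

# The application retrieves from the left camera first...
got_left = left.retrieve()

# ...and before it gets to the right camera, the next trigger already
# overwrites the right slot with frame 1.
right.deliver(1)

got_right = right.retrieve()
print(got_left, got_right)  # → 0 1  (the pair is out of sync)
```

The same sequencing with real cameras produces exactly the mismatched timestamps shown in the screenshots above, which is why the retrieve order must not be fixed.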

It is recommended in these setups to use the Background loop pattern as described in https://github.com/basler/pypylon-samples/blob/main/notebooks/grabstrategies.ipynb or in https://github.com/basler/pypylon/blob/master/samples/grabusinggrabloopthread.py

You can attach the same OnImageGrabbed handler to both cameras, so you have a single place to sort the incoming buffers. On each camera you can set a CameraContext that gets reflected in the GrabResult.
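The pairing logic inside such a shared handler can be sketched in plain Python (grab results are simulated as (context, timestamp) tuples; with pypylon the callback would be `ImageEventHandler.OnImageGrabbed` and the context would come from `grabResult.GetCameraContext()`). Since PTP-triggered cameras fire on the same periodic signal, frames belonging together share a timestamp; in practice you would match timestamps within a tolerance of the signal period rather than exactly. All names here are illustrative.

```python
pending = {}   # timestamp -> {camera context: frame}
pairs = []     # completed (left, right) pairs, in arrival order

def on_image_grabbed(context, timestamp):
    """Shared callback for both cameras; context 0 = left, 1 = right."""
    slot = pending.setdefault(timestamp, {})
    slot[context] = (context, timestamp)     # stand-in for the grab result
    if len(slot) == 2:                       # both cameras delivered this timestamp
        pairs.append((slot[0], slot[1]))
        del pending[timestamp]

# Buffers may arrive in any order across the two cameras.
for context, ts in [(0, 100), (1, 100), (1, 200), (0, 200)]:
    on_image_grabbed(context, ts)

print(pairs)  # → [((0, 100), (1, 100)), ((0, 200), (1, 200))]
```

Because pairs are completed by timestamp rather than by retrieve order, a dropped frame on one camera simply leaves an incomplete entry in `pending` (which can be aged out), instead of desynchronising all subsequent pairs.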