deepzf opened 6 years ago
To make a stereo camera you need a "sync" pin on each camera; this is the only way to synchronize two cameras.
I do not believe the Raspberry Pi camera has such a feature.
I will just leave it here: as I understand it, there is a way to get a digital signal when a frame is taken: https://www.raspberrypi.org/forums/viewtopic.php?t=190314 But I am still not sure how to use it to ensure two frames are taken at the same time.
Very interesting news on the hardware pulses (been away from this for far too long!). However, I'd strongly recommend reading the Camera Hardware chapter to understand what actually happens when you request a frame be captured, and thus why synchronization is a seriously hard task.
Actually, I should add something to the docs about the GPIO pulses ...
A bit late for this comment but I have managed to sync both cameras on a compute module by using multiprocessing. Here is an example:
@maykef Well done, mate! I was sure it was possible to do with the compute module ;-) Not sure how, since I don't have one. Maybe not easy. I believe people mostly want to find a way to do it with separate Pis.
My point is how to sync cameras on different Pi boards.
Not sure multiprocessing can run different processes on different machines; I'll have a quick look into it.
@deepzf This might help you: http://www.micsymposium.org/mics_2017_proceedings/docs/MICS_2017_paper_4.pdf Once you have set up the cluster, you may be able to manipulate the cameras in a multiprocessing fashion. The only drawback I see with this is that you need a switch, which can add space, weight and complications to processes that should be simpler.
@maykef Do you have any more to say about what you did? Maybe you have a blog post or something?
@soswow This is my entry from 2 years ago: https://www.instructables.com/id/A-Raspberry-Pi-Multispectral-Camera/ This is from this year: https://publiclab.org/questions/maykef/10-12-2018/cheap-multispectral-camera What is it that you want to do?
@maykef I guess I am interested in this like everyone else here - to do stereo CV
Thought so. Someone already developed this: http://stereopi.com All I can tell you is that triggering the camera in still mode (rolling shutter, overheads and the like) is one thing, and syncing frames for stereo vision is quite another. I have seen a few projects where people managed to sync several cameras, so it's got to be possible.
I've seen this one and even pre-ordered already, I believe.
So you are on your way then!
This answer comes kind of late, but here it goes, in case you are interested.
Not so long ago, I used the standard GPIO board with the compute module to acquire synchronized frames for stereo calibration of a pair of cameras, but I had two main issues:
:exclamation: the camera connectors were so fragile that I always ended up breaking them
:exclamation: it requires an additional converter from the 22-way to the 15-way cable to connect to the camera, which is just a bother
So what I ended up using were two Raspberry Pi 3 boards, one as master, the other as slave. I have, e.g., GPIO20 connected between them so the master triggers the slave to start a video, and GPIO21 connected so the master can tell the slave that the video has finished. (:exclamation: if both Raspberry Pis are not powered by, for example, your own laptop, you should share a GND pin between them to ensure that the slave can reliably detect the pulse from the master.)
In practice, whenever you hit ENTER in the master console, the master sends a pulse on GPIO20 and records a video. On the slave side, a callback associated with this pin triggers a video recording.
With this approach I was able to capture frames synchronized almost to the millisecond (as in the examples). I never actually tried with more than a pair of Raspberry Pis, but in principle it should work with n of them.
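The wiring just described can be sketched roughly as below. This is a hedged sketch, not ninja-asa's actual code: the pin numbers (GPIO20/GPIO21) come from the comment, the recording calls are commented placeholders, and `RPi.GPIO` falls back to a tiny recording stub of my own when the library is absent, so the control flow can be exercised off-Pi:

```python
import time

try:
    import RPi.GPIO as GPIO                 # the real library on a Pi
except ImportError:
    class GPIO:                             # stub (assumption) so the logic runs off-Pi
        BCM, OUT, IN, HIGH, LOW = 'BCM', 'OUT', 'IN', 1, 0
        PUD_DOWN, RISING = 'PUD_DOWN', 'RISING'
        calls = []                          # records (pin, level) writes for inspection
        @staticmethod
        def setmode(mode): pass
        @staticmethod
        def setup(pin, direction, pull_up_down=None): pass
        @classmethod
        def output(cls, pin, level): cls.calls.append((pin, level))
        @staticmethod
        def add_event_detect(pin, edge, callback=None): pass
        @staticmethod
        def cleanup(): pass

START_PIN, DONE_PIN = 20, 21                # wired board-to-board; GND must be shared

def master_record(duration_s=5):
    """Master side: raise GPIO20 to start, record, then pulse GPIO21 when done."""
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(START_PIN, GPIO.OUT)
    GPIO.setup(DONE_PIN, GPIO.OUT)
    GPIO.output(START_PIN, GPIO.HIGH)       # slave's callback starts its recording
    # camera.start_recording('master.h264') # picamera call on real hardware
    time.sleep(duration_s)
    # camera.stop_recording()
    GPIO.output(START_PIN, GPIO.LOW)
    GPIO.output(DONE_PIN, GPIO.HIGH)        # tell the slave the video has finished
    GPIO.output(DONE_PIN, GPIO.LOW)
    GPIO.cleanup()

def slave_listen():
    """Slave side: attach callbacks to the two pins driven by the master."""
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(START_PIN, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
    GPIO.setup(DONE_PIN, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
    GPIO.add_event_detect(START_PIN, GPIO.RISING,
                          callback=lambda ch: print('start recording'))
    GPIO.add_event_detect(DONE_PIN, GPIO.RISING,
                          callback=lambda ch: print('stop recording'))
```

On the slave you would call `slave_listen()` once at startup and start/stop the picamera recording inside the two callbacks.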
Hi @ninja-asa,
First of all, well done. Now, as I understand it, the reason you ditched the CM was the connection between the cameras and the CM board, is that correct? When you say you connect two RPi 3 boards, one as master, the other as slave, is that done over the network, or did you simply connect the GPIOs of both to initialise the video capture? It sounds interesting, and I can actually replicate it with a bunch of RPis I've got and compare the performance with the compute module.
@ninja-asa Interesting. I assume you are doing for same reason (stereo CV) I wonder what is the threshold in terms of time difference for stereo CV to be still practical?
@maykef 1) Actually I still use the CM; the number of GPIOs can be really handy. The syncing I proposed was just a workaround that for me works best to streamline the stereo-calibration process. Until then, I must have spent countless hours verifying connections, trying to understand why the cameras wouldn't be detected by the compute module: was it the flat cable between the CM and the camera adapter? Was it the flat cable between the adapter and the camera? Was one of the jumpers connecting the camera-related GPIOs broken? This happened to me quite often, and with all the connecting/disconnecting I would end up breaking those black supports on the Compute Module board. This frustration is much smaller with just two RPi 3 boards.
2) No network, only hard-wired GPIO, the simplest and most straightforward approach possible. (But it is really important to have the RPis share GND, otherwise the slave may not clearly detect the master's HIGH.) (Actually I do use the network, but only to remotely access the master RPi.)
3) By the way, thank you for sharing the StereoPi project. Did not know about it but really interesting
@soswow I guess it really depends on the context of the stereo you intend to do, the expected/target working distances, the maximum velocity you expect in your view and, of course, your hardware.
Hi @ninja-asa,
I suspect that GPIO connections won't allow camera shots to sync; I actually tried this a long time ago already. I'll give it another try and report here.
Guys, StereoPi is a project of our team. As far as I know, there is no hardware sync (at the sensor level) in Compute Module based solutions. But our experiments with depth maps and with livestreaming stereoscopic video to users (Oculus Go) show that synchronization issues are undetectable. If you plan to use a depth map for drone races at 100 miles per hour, of course it is better to use hardware sync, and also sensors with a global shutter. But in that case it is better to look for industrial-level solutions. StereoPi is designed for beginners who are studying OpenCV and stereoscopic vision, and for that scenario the synchronization implemented in the stock Raspbian kernel is perfect.
I'm going to start a project using several Pi Zero W boards with picamera, but I don't know how to make them capture at exactly the same time. I have run some tests with several different settings, but I still get roughly one unsynchronized frame every twenty frames. Is there any solution to this problem?