I have been playing with Pi4/HQ Cam and OpenCV; dewarping, perspective transformation etc.
I would like to try a new challenge, but I am a little overwhelmed by the options and lack the confidence to hit that ‘purchase now’ button and part with several hundred dollars, only to learn two weeks later, when it doesn’t work, that I should have bought X instead.
I would like to build a dual-camera RPi4/HQ camera system to capture a soccer game/full soccer pitch (2 x HQ cameras with 90-degree lenses, stitched frames).
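(For the stitching part, this is roughly what I have in mind, using OpenCV’s built-in stitcher; the file names below are just placeholders for grabs from the two cameras.)

```python
import cv2

# Placeholder frame grabs from the two HQ cameras.
left = cv2.imread("cam_left.jpg")
right = cv2.imread("cam_right.jpg")

# The high-level stitcher finds features in the overlap region,
# estimates a homography and blends the seam automatically.
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch([left, right])

if status == cv2.Stitcher_OK:
    cv2.imwrite("pitch_panorama.jpg", panorama)
else:
    print(f"Stitching failed with status {status}")
```

For a fixed rig I would probably calibrate once, keep the homography, and warp each frame with cv2.warpPerspective rather than re-running feature matching on every frame.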
After researching and reading, the concept/design that I like is the one that uses 2 x Pi Zero and NFS over USB to a Pi4 host, such as this:
My thinking is to then use OAK-1 to do the heavy lifting in terms of image processing and future AI work.
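To give an idea of what I mean by the OAK-1 doing the heavy lifting: something along these lines, assuming the DepthAI Python API and pushing host-side frames in through an XLinkIn node. The blob path, input size, and file name are placeholders rather than a working model, so treat this as a sketch of the shape of it.

```python
import cv2
import depthai as dai

pipeline = dai.Pipeline()

# Frames are pushed from the host (e.g. the stitched panorama) rather
# than taken from the OAK-1's own sensor.
xin = pipeline.create(dai.node.XLinkIn)
xin.setStreamName("frames")

# Placeholder network blob; any model compiled for the Myriad X would go here.
nn = pipeline.create(dai.node.NeuralNetwork)
nn.setBlobPath("model.blob")
xin.out.link(nn.input)

xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("nn")
nn.out.link(xout.input)

with dai.Device(pipeline) as device:
    q_in = device.getInputQueue("frames")
    q_out = device.getOutputQueue("nn", maxSize=4, blocking=False)

    frame = cv2.imread("pitch_panorama.jpg")          # placeholder input image
    frame = cv2.resize(frame, (300, 300))             # match the model's input size

    img = dai.ImgFrame()
    img.setType(dai.ImgFrame.Type.BGR888p)
    img.setWidth(300)
    img.setHeight(300)
    img.setData(frame.transpose(2, 0, 1).flatten())   # HWC -> planar CHW bytes
    q_in.send(img)

    result = q_out.get()                              # NNData with the raw outputs
```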
The main thing that I am unclear about is frame synchronisation using this approach:
how to do it
any impact on frame rates / other issues…
I am also a bit concerned about how much I can plug into a Pi4 host… if I am connecting two Pi Zeros with HQ cams (using the USB/OTG ports), an SSD to handle all of the I/O, and then an OAK-1…
I am thinking maybe I just bite the bullet and use a product such as ArduCam’s Camarray. It is a big jump in cost, but does seem to make life much simpler.
I am hoping that maybe somebody has beaten a path before me and can give some guidance. Thank you in advance for any assistance.
The ArduCam CamArray is definitely a much more elegant solution. However, like you said, it is a bit of a price jump. If the Arducam boards are still operating the same way they were a couple of years ago, you aren’t actually accessing the cameras simultaneously, just consecutively in a loop.
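Roughly speaking, the access pattern looks like the sketch below. select_channel() is a hypothetical stand-in for whatever channel-switching call the adapter actually exposes (an I2C register write on the boards I used), so treat it as an illustration rather than Arducam’s API.

```python
import time
import cv2

def select_channel(channel):
    # Hypothetical stand-in for the adapter's channel-switch call
    # (on the Arducam multiplexer boards this is an I2C register write).
    pass

cap = cv2.VideoCapture(0)      # the multiplexer exposes a single video device

frames = []
for ch in (0, 1):
    select_channel(ch)         # route the chosen camera through the mux
    time.sleep(0.05)           # give the sensor a moment to settle after switching
    ok, frame = cap.read()
    if ok:
        frames.append(frame)
cap.release()

# The two frames are captured one after the other, not at the same instant,
# so fast motion on the pitch will show a small time offset between them.
```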
The solution Laurent has given on the Pi forums would still likely be a good overall option (especially if you already have the two HQ cams and a Pi 3/4). From the limited documentation I’ve read, Nginx and the PiCamera library should be a great option.
I’d just make sure that if you are streaming it to a webserver (in either scenario), you have a wired connection going from the Pi that is acting as the host.
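Not Laurent’s actual code, but as a rough sketch of the Pi Zero side using the picamera library, recording H.264 segments onto a share that the Pi4 host mounts; the mount point, resolution, and segment length below are just assumptions.

```python
import picamera

# Assumed mount point for the share the Pi4 host picks up frames from.
SHARE = "/mnt/host_share"

with picamera.PiCamera(resolution=(1920, 1080), framerate=30) as camera:
    # Record H.264 and roll over to a new file every 10 seconds so the
    # host can process completed segments as they appear on the share.
    camera.start_recording(f"{SHARE}/cam0_000.h264", format="h264")
    for i in range(1, 6):
        camera.wait_recording(10)
        camera.split_recording(f"{SHARE}/cam0_{i:03d}.h264")
    camera.stop_recording()
```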
Thank you for the heads up regarding ArduCam CamArray Owen. I will look into it.
If some of the time and energy of my youth remained, I would probably go for the Compute Module and IO Board (dual CSI), but that risks a hole that I won’t have the time or energy to dig myself out of.
Frame synchronisation is a bit of a tricky one, but it is commonly required for stereo machine vision applications (e.g. depth mapping). Fortunately, support for this was added in August last year: https://www.raspberrypi.org/forums/viewtopic.php?t=281913
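If the hardware sync route turns out to be awkward with the two-Pi-Zero design, a common software fallback is to timestamp frames on each Pi (with the clocks aligned via NTP/PTP) and then pair the closest matches on the host. A rough sketch of that matching step:

```python
def pair_frames(times_a, times_b, max_skew=0.010):
    """Pair each frame timestamp from camera A with the closest timestamp
    from camera B, keeping only pairs within max_skew seconds (10 ms here).
    Both lists are assumed to be sorted in ascending order."""
    pairs = []
    j = 0
    for ta in times_a:
        # Advance through B while the next timestamp is at least as close to ta.
        while j + 1 < len(times_b) and abs(times_b[j + 1] - ta) <= abs(times_b[j] - ta):
            j += 1
        if times_b and abs(times_b[j] - ta) <= max_skew:
            pairs.append((ta, times_b[j]))
    return pairs

# Example: two 30 fps streams with a small offset between the cameras.
a = [i / 30 for i in range(5)]
b = [i / 30 + 0.004 for i in range(5)]
print(pair_frames(a, b))
```

With both streams at 30 fps you would typically end up with pairs a few milliseconds apart, which is usually fine for stitching a wide pitch view but not good enough for depth work.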
There are some really good forum topics on the official Pi forums about this:
Get all of the synchronising signals from the same source. Then, if the cameras are some distance apart, make sure the cables are exactly the same length.
This was how TV studios did it (not sure about now). I don’t think things have changed that much in that regard, as the cable-length method is very stable and reliable. Everything is timed (by electronic measurement) so that all signals arrive back at Master Control at EXACTLY the same time. You could have a similar thing on a much smaller scale, but the same principle applies.
Cheers Bob