Closed by TekuConcept 8 years ago
No one would happen to have a microHDMI (not miniHDMI) to HDMI/DVI adapter, would they?
Acquired the following details:
Learned of the LogiBone cape for the BeagleBone. As a valuable FPGA extension, it can be used for video capture. Made contact with the LogiBone team, and they said they may provide educational discounts or even donate one to us for free. (Haven't heard back from them yet.) Otherwise, they are about $75-$80 apiece, and pre-releases don't begin for another week or so.
Successfully designed an adapter between two OV5642 cameras and PRU1 on the bone. Successfully allocated 32MB of contiguous DDR memory for real-time camera buffer space. (Set the bootargs in U-Boot: mem=480M out of 512M.)
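The memory cap above can be applied from the boot configuration. On typical BeagleBone Debian images, one common pattern is setting `optargs` in `uEnv.txt`, which the stock boot scripts append to `bootargs`; treat this as a sketch, since the exact variable names depend on the image's boot scripts:

```
# /boot/uEnv.txt (sketch; variable names vary between image versions)
# Limit the kernel to 480M of the 512M so the top 32M stays unmanaged,
# leaving a contiguous region free for the camera frame buffer.
optargs=mem=480M
```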
Can you guide me through this? I need to interface the BBB with the OV5642 for a drone.
USU's RoboSub team has designed a custom cape to capture from two OmniVision 5642 cameras synchronously. The firmware, assembly, and C code are still under development and are not ready for public release just yet.
Nevertheless, the task is very straightforward: you feed the camera an XCLK signal (8-50MHz), and out comes a PCLK that goes straight into one of the PRU pins on the bone, along with 8 data pins. When PCLK is read high with the assembly instruction 'WBS,' the data bits are read in through the R30/R31 registers and then written to RAM with 'SBBO.'
When the picture has finished being captured, the C code uses an address to read the image from memory and process it.
USUb Electrical Engineer ~ Chris Walker
PS I too own a multirotor: a Y6B, and I have a similar desire - yet another reason why I petitioned for this research here at Utah State University.
Hello Chris, I've retaken my BeagleBone Black project; I had put it on hold because of my final-year workload. Can you guide me through what the PRU has to do, or how I take pictures with the OmniVision cameras?
The objective of the PRU is to read a byte from the camera on the rising edge of the camera's PCLK. It then writes the byte to RAM, where your C program will read it. The frame (or picture) finishes writing on the rising edge of VSYNC.
Unfortunately, I can't explain any further since I do not have a working solution yet; I just got our Altera MAX II working (required for our stereo vision). I would like to use the GPMC instead of the PRU for higher throughput (100MHz vs 50MHz).
By the way, if you need something working now, there is always the LogiBone cape, though it may be a little pricey in my opinion. :)
(Currently irrelevant to this project; may be reopened in a future build.)
Read in the HSYNC buffer from the camera and write it to the processor. VSYNC validates a single frame. (The OV7670 defaults to approximately 30FPS at 32kp/frame, while the OV5642 defaults to approximately 15FPS at 5Mp/frame.)
Create a raw YUV image file from the buffers.