esp-rs / esp-hal

no_std Hardware Abstraction Layers for ESP32 microcontrollers
https://docs.esp-rs.org/esp-hal/
Apache License 2.0

Camera Drivers #1475

Open Dominaezzz opened 5 months ago

Dominaezzz commented 5 months ago

I want to add camera drivers for ESP32 (I2S), ESP32S2 (I2S and some others) and ESP32S3 (LCD_CAM). I already have a working driver (esp-pacs + esp-idf-hal) for the ESP32 and ESP32S3.

I have two blockers:

1) I need someone with suitable hardware to test/run whatever example I conjure up in my PRs. This issue is meant to advertise the need for this. Anyone with either chip and the OV2640, OV3660 or OV5640 camera would do fine. I can write an example for any of these three cameras. Some example devkits for this include:

2) A suitable example to write. Camera initialisation happens over SCCB (pretty much I2C) and is a largely opaque process, as documentation on these OmniVision cameras is quite poor. I don't want to attempt writing a full-blown library just for an example, and I really want to keep it as simple as possible. Would it be acceptable to just have one big array of tuples describing which registers to write to, their values, and some delays between some of them? Or do you need examples to be something that people can play around with more?
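To illustrate, a minimal sketch of that idea, assuming an embedded-hal I2C bus for the SCCB writes. The register addresses, values, and the helper name are placeholders, not a real OV2640/OV3660/OV5640 init sequence:

```rust
use embedded_hal::{delay::DelayNs, i2c::I2c};

// (register, value, post-write delay in ms) -- placeholder entries only.
const CAMERA_INIT: &[(u8, u8, u32)] = &[
    (0xFF, 0x01, 0),  // bank select (placeholder)
    (0x12, 0x80, 10), // soft reset, then wait for the sensor to settle
    // ... the rest of the register writes ...
];

// Walk the table, writing each register over SCCB (plain I2C writes for these
// sensors) and sleeping where the table asks for a delay.
fn init_camera<I2C: I2c>(
    i2c: &mut I2C,
    addr: u8,
    delay: &mut impl DelayNs,
) -> Result<(), I2C::Error> {
    for &(reg, val, wait_ms) in CAMERA_INIT {
        i2c.write(addr, &[reg, val])?;
        if wait_ms > 0 {
            delay.delay_ms(wait_ms);
        }
    }
    Ok(())
}
```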

MabezDev commented 5 months ago

I have two boards, an esp32 and esp32s3 board with an OV2640 so I can test those (I might have an s2 board kicking around too, but I'd have to dig it out).

> A suitable example to write. Camera initialisation happens over SCCB (pretty much I2C) and is a largely opaque process, as documentation on these OmniVision cameras is quite poor. I don't want to attempt writing a full-blown library just for an example, and I really want to keep it as simple as possible. Would it be acceptable to just have one big array of tuples describing which registers to write to, their values, and some delays between some of them? Or do you need examples to be something that people can play around with more?

I think a tuple of register + data is fine, at least to get something working. Writing the esp CAM driver is enough work; creating a generic camera driver is a whole other project. I also do something similar in my keyboard firmware: https://github.com/MabezDev/mkey/blob/e9ea41c31231d2101b1c52aeb7b5a8e1345c7aa4/firmware/src/main.rs#L311-L617 because I don't have time to write a full driver (yet?).

In terms of an example though, we need something that someone can run and know that the camera works. I don't know the easiest route for this. My gut says a simple HTTP server that serves a snapshot of the camera framebuffer periodically, but maybe there is a simpler one?

Dominaezzz commented 5 months ago

> but maybe there is a simpler one?

I've never set up an HTTP server on no_std so I don't know how hard it is. Know of any examples? My initial idea for this (which is the example I have in my project) was to dump the JPEG from the camera as hex onto the console and use `xxd -r -p uart.txt image.jpg` to convert it to a JPEG file. Somewhat tedious, but it works haha. It may be more interesting to stream the footage over serial instead.
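For reference, a minimal sketch of that hex-dump idea, assuming `frame` already holds the captured JPEG bytes and that esp-println is used for console output:

```rust
// Print the captured JPEG buffer as hex so the host can rebuild it with
// `xxd -r -p uart.txt image.jpg`.
fn dump_frame_as_hex(frame: &[u8]) {
    for chunk in frame.chunks(32) {
        for byte in chunk {
            esp_println::print!("{:02x}", byte);
        }
        esp_println::println!();
    }
}
```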

> I have two boards, an esp32 and esp32s3 board with an OV2640 so I can test those (I might have an s2 board kicking around too, but I'd have to dig it out).

Fantastic!

> I think a tuple of register + data is fine, at least to get something working.

Sweet

MabezDev commented 5 months ago

> I've never set up an HTTP server on no_std so I don't know how hard it is. Know of any examples? My initial idea for this (which is the example I have in my project) was to dump the JPEG from the camera as hex onto the console and use `xxd -r -p uart.txt image.jpg` to convert it to a JPEG file. Somewhat tedious, but it works haha. It may be more interesting to stream the footage over serial instead.

Ah yeah, I didn't think of that; that's probably the lowest barrier to start with. I thought we had an HTTP server example in esp-wifi, but it's just a client I think (and hand-rolled HTTP, not a real crate), so let's go with the UART approach initially.

Dominaezzz commented 5 months ago

Now that the LCD_CAM Camera driver is done, I'll be doing the ESP32's I2S camera mode to take pictures from the same camera (once I'm done/bored with my current SPI project).

Noxime commented 1 month ago

Hi, I'm working on a camera project with the ESP32-S3 and OV2640 (with extension to different sensors possible). For my purposes it would be quite useful if I could DMA in on a line-by-line basis, instead of the full frame. Fitting one line of the image into SRAM is much easier than a full 1600x1200 frame 😄 Would this be difficult / possible to implement?

If test hardware is needed, I have an ESP32-CAM board and a Seeed XIAO-S3 Sense board, both with OV2640. Thank you for your work.

Dominaezzz commented 1 month ago

> Would this be difficult / possible to implement?

Probably, but it depends on what you want the camera to be doing between each line.

With the current APIs in the HAL you can stream the image in from the camera, so you don't have to wait for the full frame before consuming the image data. (I personally do something like this for an MJPEG HTTP server.)

Have you tried the example in the repo yet?

Noxime commented 1 month ago

> Have you tried the example in the repo yet?

Yes, I've got the camera working but to fully illustrate my situation:

I am making a basic machine-vision firmware. It looks at a 640x480 30 Hz video stream and finds corners in it. I want to use the camera in YUV422 mode so I can fairly easily discard all the colour data and just work with 8-bit luma values. Reading in the full frame as-is is 600 KiB, so it can't fit in SRAM, and the DMA cannot write into PSRAM.

My plan is to work on a line-by-line basis. Two line-sized buffers (640*2 bytes each) are used. While the first one is being filled by DMA, the CPU copies only the Y values from the second line buffer to PSRAM. Then at the end of a line, the buffers are switched and the next line is started.
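A rough sketch of that Y-extraction step, assuming YUYV byte ordering (luma samples at even offsets; the actual ordering depends on how the sensor is configured) and a made-up function name:

```rust
// Copy only the luma (Y) bytes of one YUV422 (YUYV) line into the destination
// row, e.g. a 640-byte slice of a frame buffer living in PSRAM.
// `line` is 640 * 2 bytes of YUYV data: Y0 U0 Y1 V0 Y2 U1 Y3 V1 ...
fn extract_luma(line: &[u8], luma_row: &mut [u8]) {
    for (dst, src) in luma_row.iter_mut().zip(line.iter().step_by(2)) {
        *dst = *src;
    }
}
```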

If I understand correctly, at the moment read_dma triggers the frame acquisition and holds the reference until the whole frame has been written to the RXBUF. In practice I cannot read VGA data unless in JPEG mode, because the data size is too big, and I don't think I have enough CPU cycles to decode the JPEGs from the camera at 30 Hz.

Dominaezzz commented 1 month ago

> and the DMA cannot write into PSRAM.

It can; the HAL APIs just don't allow it yet. Or I suppose that's what you mean.

> If I understand correctly, at the moment read_dma triggers the frame acquisition and holds the reference until the whole frame has been written to the RXBUF.

You want read_dma_circular.
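Roughly, the line-by-line consumption could look like the sketch below. It assumes the circular transfer handle returned by read_dma_circular exposes a pop() that copies out whatever bytes have arrived so far (as other circular DMA transfers in the HAL do); exact names, buffer setup, and error handling differ between esp-hal versions, so treat this as pseudocode rather than the actual API:

```rust
// Sketch only: consume the incoming frame one YUV422 line at a time while the
// circular DMA transfer keeps filling its ring buffer in the background.
// `camera` and the DMA `buffer` are assumed to be set up as in the repo example.
let mut line = [0u8; 640 * 2]; // one VGA line of YUV422
let mut transfer = camera.read_dma_circular(&mut buffer).unwrap();
for _row in 0..480 {
    let mut filled = 0;
    while filled < line.len() {
        // pop() is assumed to copy out however many bytes are currently available
        filled += transfer.pop(&mut line[filled..]).unwrap();
    }
    // extract the Y bytes from `line` into the PSRAM frame buffer here
}
```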

Noxime commented 1 month ago

> You want read_dma_circular.

Thank you, this solved my issue for now 😄 It would be extra wonderful if DmaTransferRxCircular provided a zero-copy API (pop_with(f: fn(&mut [u8]))), but that's not camera-specific.

Dominaezzz commented 6 days ago

> It would be extra wonderful if DmaTransferRxCircular provided a zero-copy API

A zero-copy API (for Camera only) is coming in https://github.com/esp-rs/esp-hal/pull/2242.