nikivanov / watney

Watney is a low-cost 3D-printable FPV rover
GNU General Public License v3.0

esp32 + esp32cam #35

Closed — eerison closed this issue 10 months ago

eerison commented 10 months ago

Hi @nikivanov

thank you for the project, it's really nice <3

I was wondering if you have any example using an ESP32? I guess it could reduce the cost of the project.

And to make the mic + speaker work, did you use the I2S protocol or something else?

nikivanov commented 10 months ago

@eerison The software stack runs on Python / Linux and relies on the Raspberry Pi's hardware video encoding. The ESP32 can do neither of those things. It could be rewritten to use an ESP32 / esp32cam, but the video quality and framerate would be drastically worse. Besides that, the ESP32 most likely isn't powerful enough to encode and decode audio either. But to answer your other question - yes! Both the microphone and the audio amp use I2S - one uses the left channel, and the other uses the right one.
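Since the mic and the amp share one stereo I2S bus, whichever side is reading the bus just has to deinterleave the frames and keep the channel it cares about. A minimal sketch of that idea (the frame layout and helper name are illustrative, not taken from the project's code):

```python
# Hypothetical sketch: an I2S peripheral exposes one stereo stream with
# samples interleaved as [L0, R0, L1, R1, ...]. If the mic sits on the
# left channel, keeping every other sample recovers the mic signal.

def split_i2s_channels(interleaved):
    """Split an interleaved stereo buffer into (left, right) sample lists."""
    left = interleaved[0::2]   # e.g. microphone samples (left channel)
    right = interleaved[1::2]  # e.g. amplifier side (right channel)
    return left, right

frames = [10, 0, 11, 0, 12, 0]  # mic data on the left, silence on the right
mic, _ = split_i2s_channels(frames)
print(mic)  # → [10, 11, 12]
```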

eerison commented 10 months ago

Hey @nikivanov

Do you mind saying where exactly you handle I2S and the camera so they appear in the browser?

nikivanov commented 10 months ago

I2S functionality resides exclusively on the Raspberry Pi device, the server. The browser simply gets an audio/video feed from the server. The technology behind it is WebRTC - the same technology used for audio / video calls in the browser. The browser does not know where the AV signals come from.

Here are some code links that set up the audio:

Hope that helps!

nikivanov commented 10 months ago

As for the camera, things are much simpler there - the server starts this file when the WebRTC session is started. It acquires the hardware-encoded H.264 video stream from the Pi, adapts it to RTP (Real-time Transport Protocol), and then pipes it to a local UDP port, where it gets picked up by the Janus WebRTC server.
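The shape of that video path can be sketched as a gst-launch-style GStreamer pipeline string. Everything here is an assumption for illustration - the device path, caps, payload type, and port are not the project's actual configuration - but it shows the hand-off: parse the hardware-encoded H.264, packetize it, and push it to a local UDP port for Janus to consume.

```python
# Illustrative only: a pipeline string of the kind passed to
# gst-launch-1.0 or Gst.parse_launch(). Port and device are hypothetical;
# a Janus streaming mountpoint would listen on the matching UDP port.

JANUS_VIDEO_PORT = 8004  # hypothetical; must match the Janus streaming config

def build_video_pipeline(device="/dev/video0", port=JANUS_VIDEO_PORT):
    """Assemble a camera -> RTP -> UDP pipeline description."""
    return " ! ".join([
        f"v4l2src device={device}",             # camera exposing H.264 directly
        "video/x-h264,width=1280,height=720",   # request the encoded stream
        "h264parse",                            # align to NAL units for payloading
        "rtph264pay config-interval=1 pt=96",   # wrap H.264 in RTP packets
        f"udpsink host=127.0.0.1 port={port}",  # hand off to Janus locally
    ])

print(build_video_pipeline())
```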

In general, the pattern is to have Janus handle all WebRTC-related stuff. Janus either pushes audio bytes to a UDP socket, or reads bytes from a UDP socket to play back or record audio / video. Those bytes are either picked up by GStreamer in the case of playback, or pushed by GStreamer to those UDP sockets in the case of recording.
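The two audio directions of that pattern can be sketched the same way. Ports, the ALSA device names, and the Opus codec choice are all assumptions here, not the project's exact pipelines - the point is that GStreamer sits on one end of a local UDP socket and Janus on the other:

```python
# Illustrative gst-launch-style pipeline strings (hypothetical ports/devices).

def playback_pipeline(port=8006):
    """Browser -> Janus -> UDP -> GStreamer -> speaker (playback direction)."""
    return " ! ".join([
        f"udpsrc port={port} caps=application/x-rtp",  # RTP pushed by Janus
        "rtpopusdepay",           # unwrap the RTP packets
        "opusdec",                # decode Opus to raw PCM
        "audioconvert",
        "alsasink device=hw:0",   # I2S DAC / amplifier
    ])

def record_pipeline(port=8008):
    """Microphone -> GStreamer -> UDP -> Janus -> browser (record direction)."""
    return " ! ".join([
        "alsasrc device=hw:0",    # I2S microphone
        "audioconvert",
        "opusenc",                # encode for WebRTC transport
        "rtpopuspay",             # wrap in RTP packets
        f"udpsink host=127.0.0.1 port={port}",  # Janus reads from this socket
    ])
```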

nikivanov commented 10 months ago

@eerison I'm closing this issue since your questions have been answered. Feel free to reach out if you have any followups.