Closed — Semmu closed this issue 2 years ago.
Fyi it looks like Micro-RTSP has a package available:
https://platformio.org/lib/show/6071/Micro-RTSP
This would make it easy(er) to integrate into ESPHome as a component.
I'm also interested in streaming the camera sans Home Assistant. I started investigating whether interfacing with the Web Server to stream images using HTTP multipart/x-mixed-replace would work; that is what Home Assistant does for its front end. This solution would require adding another, long-lived request handler to the Web Server and hooking into the ESP32Camera add_image_callback.
While in theory this could work, I'm still unfamiliar with the hardware and the libraries and wonder if there's something I'm missing.
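For reference, each frame in a multipart/x-mixed-replace response is preceded by a boundary line and per-part headers. A minimal sketch of building one part's preamble (the boundary value and helper name here are illustrative, not taken from ESPHome's web server):

```cpp
#include <cstdio>
#include <string>

// Build the bytes that precede one JPEG frame in a
// multipart/x-mixed-replace stream. The boundary value is arbitrary,
// as long as it matches the one announced in the Content-Type header.
std::string multipart_part_header(const std::string &boundary, size_t jpeg_len) {
  char buf[128];
  std::snprintf(buf, sizeof(buf),
                "--%s\r\n"
                "Content-Type: image/jpeg\r\n"
                "Content-Length: %zu\r\n"
                "\r\n",
                boundary.c_str(), jpeg_len);
  return std::string(buf);
}

// The response itself would start with:
//   HTTP/1.1 200 OK
//   Content-Type: multipart/x-mixed-replace; boundary=frame
// and then repeat: part header + JPEG bytes + "\r\n", for every frame.
```

The long-lived handler would then loop over frames from the camera callback, writing one part per frame.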
Hey yes, I also would be interested in this feature, so I can integrate them with my Homebridge Cameras (which are gathered by FFMPEG via RTSP), so RTSP Support would be appreciated!
Yup, would be lovely :)
Same here. I almost went down the road of creating RTSP cams and then feeding them into motionEye and then into HASS, but losing the ease of use of ESPHome and its OTA updates was kind of a deal breaker. So an RTSP library for the ESP32 cam module would be great.
+1 Would be very welcome!
+1 Yes please
+1 same here
+1 would love that feature
+1 from me too.
+1 Yes please!
+1 Yes Please, would be awesome to have that feature
+1 please
+1 as well please.
+1 please
+1 please
I do run Home Assistant, and it is quite good at its main task. Cameras are not part of that, even though there was an attempt at it a while ago. Watching, recording, anything... it just can't handle them well. It is a pain. It would be great if a program with cameras as its main goal could access these devices.
Please add this feature!
this would be great! + 1
Just a note, recently I found this modified version of the stock ESP32 cam webserver example firmware, which seems to include more features and some fixes, so I think this would be the best project to include/integrate somehow.
No promises, but I'm attempting to implement this.
~I'm a little blocked by https://github.com/esphome/issues/issues/1884 though :/~ Not actually blocked; I just needed to revert my platform version :-)
https://github.com/esphome/esphome/compare/dev...crossan007:feature/rtsp-server
So, I'm making some progress on this 🎉
I've got a new component working which is enabled via YAML. It accepts a link to the camera component's id, similar to how the various sensors reference GPIO pins. This keeps the RTSP bits out of firmware images which don't request them (thereby preventing image-size bloat).
I am still working through getting the RTSP protocol implementation running on the AsyncTCP library; I started a new PlatformIO library (https://github.com/crossan007/ESPAsyncRTSPServer-esphome) which is roughly based on the AsyncWebServer's implementation and a lot of consultation of https://datatracker.ietf.org/doc/html/rfc2326.
At this point, it accepts a connection from VLC, and then crashes 🤣🤣
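For anyone following along: the core of an RTSP exchange (RFC 2326) is plain text. The client sends OPTIONS/DESCRIBE/SETUP/PLAY requests, and the server must echo the client's CSeq header in every response, which is usually the first thing to get right before VLC will proceed. A rough sketch of pulling the method and CSeq out of a raw request (hand-rolled parsing for illustration, not the actual library code):

```cpp
#include <sstream>
#include <string>

// Extract the method ("DESCRIBE", "SETUP", ...) and the CSeq header
// from a raw RTSP request. RFC 2326 requires every response to carry
// back the same CSeq value the client sent.
struct RtspRequest {
  std::string method;
  int cseq = -1;
};

RtspRequest parse_rtsp_request(const std::string &raw) {
  RtspRequest req;
  std::istringstream in(raw);
  std::string line;
  if (std::getline(in, line))  // e.g. "DESCRIBE rtsp://cam/stream RTSP/1.0"
    req.method = line.substr(0, line.find(' '));
  while (std::getline(in, line)) {
    if (line.rfind("CSeq:", 0) == 0)  // header line starts with "CSeq:"
      req.cseq = std::stoi(line.substr(5));
  }
  return req;
}
```

A matching response would then start with `RTSP/1.0 200 OK` followed by `CSeq: <same value>`.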
Example YAML:
```yaml
esp32_camera:
  external_clock:
    pin: GPIO0
    frequency: 20MHz
  i2c_pins:
    sda: GPIO26
    scl: GPIO27
  data_pins: [GPIO5, GPIO18, GPIO19, GPIO21, GPIO36, GPIO39, GPIO34, GPIO35]
  vsync_pin: GPIO25
  href_pin: GPIO23
  pixel_clock_pin: GPIO22
  power_down_pin: GPIO32
  name: ${devicename} cam
  resolution: 800x600
  id: cam1

rtsp_server:
  port: 8675
  camera: cam1
```
Getting there!
I'm very glad you started working on this and the progress is awesome!
It would really be a great success !!
It "works", except something is wonky with the JPEG parsing, causing video like the above screenshot.
I've made a little progress on it since then, but I've mostly been bashing my head against Wireshark captures trying to figure out what I'm messing up in the RTP packet format 😧
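In case it helps anyone debugging the same thing: for MJPEG over RTP the relevant spec is RFC 2435, which puts an 8-byte JPEG payload header after the standard 12-byte RTP header. A sketch of the fixed fields below (this is my reading of the RFCs, not the library's actual code; the type/Q/quantization-table handling is simplified, and width/height are passed already divided by 8 as the spec encodes them):

```cpp
#include <cstdint>
#include <vector>

// Build the 12-byte RTP header (RFC 3550) plus the 8-byte JPEG payload
// header (RFC 2435) for one packet of a fragmented JPEG frame.
// fragment_offset counts bytes of the JPEG scan already sent; the
// marker bit is set only on the final packet of a frame.
std::vector<uint8_t> rtp_jpeg_header(uint16_t seq, uint32_t timestamp, uint32_t ssrc,
                                     uint32_t fragment_offset, bool last_packet,
                                     uint8_t width_div8, uint8_t height_div8) {
  std::vector<uint8_t> h;
  // --- RTP header ---
  h.push_back(0x80);                              // V=2, no padding/extension/CSRC
  h.push_back((last_packet ? 0x80 : 0x00) | 26);  // marker bit + payload type 26 (JPEG)
  h.push_back(seq >> 8);
  h.push_back(seq & 0xFF);
  for (int i = 3; i >= 0; --i) h.push_back((timestamp >> (8 * i)) & 0xFF);
  for (int i = 3; i >= 0; --i) h.push_back((ssrc >> (8 * i)) & 0xFF);
  // --- JPEG payload header (RFC 2435, section 3.1) ---
  h.push_back(0);  // type-specific
  for (int i = 2; i >= 0; --i) h.push_back((fragment_offset >> (8 * i)) & 0xFF);
  h.push_back(1);    // type: 1 = YUV 4:2:0 baseline, no restart markers
  h.push_back(255);  // Q >= 128 means quantization tables are sent in-band
  h.push_back(width_div8);
  h.push_back(height_div8);
  return h;
}
```

With Q in the 128..255 range, the first packet of each frame must additionally carry the Quantization Table header, which is one of the easy things to miss when VLC renders garbage.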
You're doing amazing work @crossan007 :D
It works! Video frames now stream from the camera to VLC.
I'm hunting down some lingering instabilities/random crashes, but I'm getting closer!
One thing I think I need to do is move the RTP frame sending out of the "new frame" callback (an ISR, I think that's what it's called?) and into the main "loop". It seems like the chip sometimes disconnects from WiFi while sending frames, which I think is because I'm not yielding any time to the WiFi stack.
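The "move work out of the callback" idea can be sketched independently of the camera API: the image callback only stashes the latest frame under a lock, and the main loop() picks it up and does the slow network send. The type and class names here are illustrative, not ESPHome's actual ones:

```cpp
#include <cstdint>
#include <memory>
#include <mutex>
#include <vector>

using Frame = std::vector<uint8_t>;

// Hand-off between the camera's "new frame" callback (producer) and
// the main loop (consumer). Keeping only the newest frame means a
// slow network send drops frames instead of backing up memory.
class FrameHandoff {
 public:
  // Called from the camera callback: just stash the frame and return fast.
  void on_new_frame(std::shared_ptr<Frame> frame) {
    std::lock_guard<std::mutex> lock(mutex_);
    latest_ = std::move(frame);
  }
  // Called from loop(): take the pending frame, if any, for sending.
  std::shared_ptr<Frame> take() {
    std::lock_guard<std::mutex> lock(mutex_);
    return std::move(latest_);  // leaves latest_ empty
  }

 private:
  std::mutex mutex_;
  std::shared_ptr<Frame> latest_;
};
```

Since loop() is driven by the scheduler, the WiFi stack gets its time back between frames, which should address the disconnects described above.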
Alright. It's probably got some bugs, but here goes:
PlatformIO library: https://platformio.org/lib/show/12429/ESPAsyncRTSPServer-esphome
I started a merge request with ESPHome; I'm not sure it's quite ready for prime time yet, so the MR is in draft state.
C++ is not my strongest language, so I welcome any "suggestions" for stupid things I may have done.
You can try my freshly made external_component exposing the camera over HTTP (snapshot and mjpg stream): https://github.com/ayufan/esphome-components#25-esp32_camera_web_server. Only a single stream is allowed at a time (as implemented by esp_http_server).
```yaml
external_components:
  - source: github://ayufan/esphome-components

esp32_camera_web_server:
  # define only what is needed
  # only a single stream is supported at a given time
  - port: 8080
    mode: stream
  - port: 8081
    mode: snapshot
```
I did some tests with the RTSP server and it works (mostly) very well (I'm getting some error messages from time to time and a few unexpected resets). A very quick test of the HTTP server also looks good, although I'm not able to add the camera to the Surveillance camera app on my Synology NAS with it (which works just fine using RTSP).
@JanPeter1 I'm unsure how you configured the Synology NAS. The output from HTTP is an mjpg stream, with the limitation that only a single stream can be open at a given time.
@ayufan: hmm... the strange thing is that your server runs fine with a browser (Chrome), VLC and motionEye, but the Surveillance camera app is not detecting the camera. After one or a few frames (depending on the frame size), httpd_resp_send_chunk() returns ESP_ERR_HTTPD_RESP_SEND (not ESP_ERR_HTTPD_RESP_HDR - sorry for the confusion). The error code is 104 - connection reset by peer - so the client had closed its socket. This simple web server: https://github.com/bnbe-club/rtsp-video-streamer-diy-14 also works fine with the Surveillance app (and it sends very similar data, with some minor differences that I have already tested). Maybe some low-level buffer for the HTTP stream is smaller?
@JanPeter1
Interesting. The client needs to receive frames, otherwise there will be a tx timeout and the connection will be force-closed on the server side. Maybe this is what is happening? That the app delays receiving the next frames to do more of a slow fetch?
@ayufan: It seems that the Surveillance station does not like the chunked transfer encoding. I did a dirty hack to directly send the data to the underlying socket without “transfer-encoding: chunked” and additionally removed the “Content-Length” from the JPEG header, and now the stream is accepted both by the browser and by the Synology app. Since the raw data looks correct to me with your original code, I guess that's a flaw on the Surveillance app side. Thanks a lot for your effort and the nice and easy to use extension. Hope the merge request gets accepted soon.
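For context on what the hack above strips out: with Transfer-Encoding: chunked, every write is wrapped in a hex length line and a trailing CRLF, which is exactly the LEN/DATA/CRLF cycle being intercepted. A tiny illustration of the framing (the helper name is mine, for illustration only):

```cpp
#include <cstdio>
#include <string>

// Wrap one write in HTTP/1.1 chunked framing: a hexadecimal byte
// count, CRLF, the payload, CRLF. The stream is terminated by a
// zero-length chunk ("0\r\n\r\n").
std::string chunk_encode(const std::string &payload) {
  char len[16];
  std::snprintf(len, sizeof(len), "%zx\r\n", payload.size());
  return std::string(len) + payload + "\r\n";
}
```

A client that does not decode chunked encoding (apparently the Surveillance app here) would see the hex length lines and trailing CRLFs as part of the image data, which explains the behavior.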
Interesting. Why did you remove Content-Length? I might understand the transfer-encoding, but the CL I'm not sure about :)
I see that this upstream code has one bug: the boundary should be written after the frame, not before the frame.
Right, the Content-Length was OK for the Surveillance app; it just doesn't understand the chunked encoding. As far as I can see, the boundary has to be sent right after the header and then after each image - so the point where you are sending it looks good to me. If you are interested, here is what I added to your code to inspect the sent data and to get rid of the chunked encoding (I know, it's a bit ugly - unfortunately the http_server has no functionality to send chunks of data without the chunked encoding):
```cpp
#include <sys/socket.h>

int send_data(httpd_handle_t, int sockfd, const char *buf, unsigned int buf_len, int flags) {
  enum ChunkType { HEADER, LEN, DATA, CRLF };
  static ChunkType chunk_cnt = HEADER;  // chunked-data state - do not send LEN and CRLF to the socket
  static bool crlf = false;             // last data was CRLF - start chunked data after two consecutive CRLFs
  int ret = 0;
  if (buf == NULL) {
    return HTTPD_SOCK_ERR_INVALID;
  }
  switch (chunk_cnt) {
    case HEADER: {  // header data - remove "Transfer-Encoding: chunked"
      static const char *encoding = "Transfer-Encoding: chunked\r\n";
      String txt = buf;
      txt.replace(encoding, "");
      ret = send(sockfd, txt.c_str(), txt.length(), flags);
      // if we have removed the transfer encoding, claim that it was sent anyway
      if (txt.length() != buf_len) {
        ESP_LOGV(TAG, "removed '%s'", encoding);
        ret += strlen(encoding);
      }
      ESP_LOGV(TAG, "sending: %s", buf);
    } break;
    case LEN:  // length of the current chunk - don't send
      ESP_LOGV(TAG, "not sending chunk length: 0x%s", buf);
      ret = buf_len;
      chunk_cnt = DATA;
      break;
    case DATA:  // data part - send it
      ESP_LOGV(TAG, "sending data (%d bytes)", buf_len);
      ret = send(sockfd, buf, buf_len, flags);
      chunk_cnt = CRLF;
      break;
    case CRLF:  // trailing CRLF - ignore
      ESP_LOGV(TAG, "CRLF not sent");
      ret = buf_len;
      chunk_cnt = LEN;
      break;
  }
  // check for the two consecutive CRLFs that separate the header from the body
  if ((buf_len == 2) && (buf[0] == '\r') && (buf[1] == '\n')) {
    if (crlf == true) chunk_cnt = LEN;  // start counting data chunks
    crlf = true;
  } else {
    crlf = false;
  }
  if (ret < 0) {
    ESP_LOGD(TAG, "error: %d, errno: %d", ret, errno);
    chunk_cnt = HEADER;
    return HTTPD_SOCK_ERR_FAIL;
  }
  return ret;
}
```
and call `httpd_sess_set_send_override(req->handle, httpd_req_to_sockfd(req), send_data);` inside `streaming_handler_()`.
Cool. The boundary sending moment likely depends on the presence or absence of `Content-Length:`, since the client needs to know when to stop reading. I have a somewhat simpler way (than the above), but it will require a little manual generation of HTTP. Not a big deal, because there's no easy way for non-chunked sending in `esp_http`.
@JanPeter1 I pushed a small change that removes the usage of chunked encoding. Can you validate it via https://github.com/ayufan/esphome-components?
Hi @ayufan: this works great now in the Surveillance app on my Synology NAS, in VLC on Windows and in a browser (Chrome and Edge on Windows). Unfortunately, motionEye on my NAS does not detect the camera (request timed out although several frames get transmitted). I'll do some more tests later (with one of my test versions motionEye was working as well earlier).
Works great on my Blue Iris.
@gastonMM Would you mind sharing your BI config? I've got the webpage working with stream option configured, and can see BI establishes http connection when testing/finding ports via esphome logs, but doesn't stream video when settings are saved. I get No Signal and not Failed to connect, so some connection appears to be happening. I've tried both stream and snapshot, and many port config combinations.
My settings are: Network IP Camera, http://192.168.1.20:8080, Make: Generic/ONVIF, Model: *RTSP H.264/265/mjpg/mpeg4, rtsp_port: 8080, onvif_port: 8080, main: /, onvif_source: VideoSource_1
@gastonMM That did the trick, thank you very much!
@ayufan Do you have a plan to officially put this into the ESPHome project?
Hi, are there any updates on this? I also cannot get this to work with motionEye, unfortunately. Many thanks.
My `esp32_camera_web_server` is being reviewed upstream:
- Add `esp32_camera_web_server:` to expose mjpg/jpg images (esphome#2237)
- Add `esp32_camera_web_server:` usage docs (esphome-docs#1595)
If you are interested in helping out with the docs, in particular how to configure various software, that would be great :)
@ayufan any thoughts on why this doesn't work on MotionEye?
**Describe the problem you have / What new integration you would like**
Hi there,
I would like to access the ESP32 cam stream without using Home Assistant. I know this has been asked before, but please still read on.
**Please describe your use case for this integration and alternatives you've tried:**
So I've checked existing feature requests and found https://github.com/esphome/feature-requests/issues/205 and https://github.com/esphome/issues/issues/245, but I think they did not do justice to the cause.
ESPHome is awesome, but right now using this firmware for ESP32-based cams locks them to Home Assistant, because they cannot be viewed any other way, despite being cheap and powerful little cameras. And using Home Assistant is not always possible or desired.
There are a lot of existing firmwares that expose standard RTSP/mjpeg streams, but they are not part of the ESPHome ecosystem, so they lack the elegance of simple configuration and hassle-free tooling (no need to install the Arduino IDE, for example).
Could we please have another component for cameras that is not strictly tied to Home Assistant? These could be integrated, for example:
**Additional context**
I personally don't want to run Home Assistant, because I do not require any of its features, but I would want to be able to use ESPHome-based firmwares, because they work well and are easy to configure.