Closed eravion closed 6 months ago
With the current code it is not possible, and I don't know whether live555 (the library I'm using) can implement this scenario.
I asked ChatGPT; it proposed the following code.
```cpp
#include <liveMedia/liveMedia.hh>
#include <liveMedia/RTSPClient.hh>
#include <liveMedia/BasicUsageEnvironment.hh>
#include <liveMedia/H264VideoRTPSink.hh>
#include <liveMedia/MediaSession.hh>

int main(int argc, char** argv) {
    TaskScheduler* scheduler = BasicTaskScheduler::createNew();
    UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);
    RTSPClient* rtspClient = RTSPClient::createNew(*env, "rtsp://129.180.1.1:554/my_channel", 0, nullptr, 0);
    MediaSession* mediaSession = MediaSession::createNew(*env, "my_channel", "Stream from my_channel", "Session description");
    H264VideoRTPSink* videoSink = H264VideoRTPSink::createNew(*env, rtspClient->createNewStream(0), 96); // 96 is the RTP payload format
    mediaSession->addSubsession(videoSink);
    mediaSession->initiate();
    env->taskScheduler().doEventLoop(); // Start streaming

    // Cleanup
    delete rtspClient;
    delete scheduler;
    return 0;
}
```
No idea if it is working or not...
This code is not clear to me.
The full exchange is here: https://chat.openai.com/share/4e645b57-1967-449e-a09a-05e9b542dcb3 Does it help?
Broadcasting RTSP using the Live555 library involves setting up a server to stream the content and making it accessible to clients.
I think it's not exactly what you are looking for. You want to send the stream to a server, or am I wrong?
Yes, that's what the ffmpeg command is doing:

`ffmpeg -f v4l2 -i /dev/video0 -c:v h264_omx -preset medium -tune zerolatency -b:v 500k -f rtsp rtsp://output-url`

It sends the stream to the destination "output-url" using the RTSP protocol.
Another option is to use ffmpeg on the camera. A compatible version is available here: https://johnvansickle.com/ffmpeg/ Both the armhf and armel builds are working fine.
Did you try it on the cam? Is the cpu enough to run ffmpeg?
Yes, it is working fine.
I wrote a simple script that I launch manually, to restart ffmpeg in case it crashes. But I don't know how to start it at camera startup, or whether it is possible to add it to the web UI?
```shell
#!/bin/bash

STREAM_URL=$1    # rtsp://127.0.0.1/ch0_0.h264
STREAM_NAME=$2   # my_camera_name
DESTINATION=$3   # ip or fqdn of destination

while true; do
    # need an IP for ffmpeg, DNS resolution issue
    IP_RTSP=`ping -c 1 ${DESTINATION} | grep ${DESTINATION} | awk '{ print $3}' | sed 's/(//' | sed 's/)://' | sed 's/)//' | grep -v ping`
    ffmpeg -i ${STREAM_URL} -c:v copy -c:a copy -rtsp_transport tcp -f rtsp rtsp://${IP_RTSP}/${STREAM_NAME}
    if [ $? -eq 0 ]; then
        echo "FFmpeg ended."
        break
    else
        echo "FFmpeg crash, restart in progress..."
        sleep 5
    fi
done
```
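The `ping | grep | awk | sed` chain above is fragile, since it depends on the exact word layout of ping's output. A hedged alternative is to parse only ping's first output line, which has the fairly stable form `PING host (1.2.3.4) ...`; `extract_ip` is a hypothetical helper name, not part of the script above:

```shell
#!/bin/sh
# Hypothetical helper (sketch): pull the IPv4 address out of ping's first
# output line, e.g. "PING my.server.example (192.0.2.10) 56(84) bytes of data."
extract_ip() {
    # keep only the digits-and-dots between the first "(" and the first ")"
    printf '%s\n' "$1" | sed -n 's/^PING [^(]*(\([0-9.]*\)).*/\1/p'
}
```

In the script this would replace the long pipeline with `IP_RTSP=$(extract_ip "$(ping -c 1 "${DESTINATION}" | head -n 1)")`. Note that the busybox ping on the camera may print a slightly different first line, so check the format there before relying on it.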
With an nginx server on the camera, we could eventually also view the video in a browser!
You can use the startup.sh file; it's the last script executed by system.sh.
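Building on that tip, a sketch of what the startup.sh addition could look like. The script path `/tmp/sd/stream_to_server.sh`, the helper name `launch_in_background`, and the example arguments are all assumptions, not confirmed paths:

```shell
#!/bin/sh
# Sketch of a startup.sh fragment: launch a script in the background so
# startup.sh itself is not blocked by the ffmpeg restart loop.
launch_in_background() {
    script=$1; shift
    [ -x "$script" ] || return 1          # skip silently if not installed
    nohup "$script" "$@" >/dev/null 2>&1 &
}

# Hypothetical example: start the restart-wrapper script at boot
launch_in_background /tmp/sd/stream_to_server.sh \
    rtsp://127.0.0.1/ch0_0.h264 my_camera_name my.server.example || true
```

`nohup` plus `&` detaches the loop from the boot sequence, so a hung ffmpeg cannot stall the rest of system.sh.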
You can also try this ffmpeg: https://github.com/roleoroleo/yi-hack-utils/tree/main/Allwinner-v2 It's partially static (no libraries but built with the proper libc).
Yes, it is OK with this version. It uses less memory and less CPU.
Good point
Next step: an automatic startup, or a startup managed by the UI.
@eravion
Great work 🔥 Can you try .m3u8? Then we can watch the video stream directly in the browser.
Like this:

`ffmpeg -rtsp_transport tcp -i "rtsp://127.0.0.1/ch0_0.h264" -y -tune zerolatency -x264-params keyint=3 -hls_time 3 -hls_list_size 3 -start_number 1 camera.m3u8`
Hello @Giuserver, I tried the 3 ffmpeg binaries. For the two from https://johnvansickle.com/ffmpeg/, the result is NOK: the camera hangs once ffmpeg starts. No file in the folder, no ssh/http access to the camera; it takes a power cycle to get it back.
With the binary from https://github.com/roleoroleo/yi-hack-utils/tree/main/Allwinner-v2 it failed with this error:
`Unrecognized option 'tune'.`
Not sure this device is powerful enough.
I'm building the new version of ffmpeg and will share it ASAP. In the meantime I also compiled live555HLSProxy to create an HLS video. If you are interested, I can share it.
This is the www folder with all the new js files to play the HLS video: www.tar.gz Overwrite your /tmp/sd/yi-hack/www (after a backup).
This is the hls proxy: live555HLSProxy.gz
```shell
mkdir /tmp/live
mkdir /tmp/sd/yi-hack/www/live
mount --bind /tmp/live /tmp/sd/yi-hack/www/live
cd /tmp
live555HLSProxy rtsp://127.0.0.1/ch0_0.h264 live
```
At the moment live555HLSProxy is built with just 2 segments, 5 seconds long each. Maybe that's not enough.
I'm not a good js programmer, so it's not perfect.
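The five commands above can be collected into a small script so a reboot only needs one call. A sketch under assumptions: `setup_live_dir` is a hypothetical helper name, and the bind-mount sits behind an explicit flag because it requires root:

```shell
#!/bin/sh
# Sketch: prepare the RAM-backed dir written by live555HLSProxy and the
# directory exposed by the web server, then (optionally) bind-mount them.
setup_live_dir() {
    ram_dir=$1    # e.g. /tmp/live
    www_dir=$2    # e.g. /tmp/sd/yi-hack/www/live
    mkdir -p "$ram_dir" "$www_dir" || return 1
    # pass "mount" as the third argument to actually bind-mount (needs root)
    if [ "${3:-}" = "mount" ]; then
        mount --bind "$ram_dir" "$www_dir" || return 1
    fi
}

# On the camera this would be:
#   setup_live_dir /tmp/live /tmp/sd/yi-hack/www/live mount
#   cd /tmp && live555HLSProxy rtsp://127.0.0.1/ch0_0.h264 live
```

Writing the segments to /tmp/live keeps the HLS churn in RAM rather than on the SD card; the bind-mount just makes that directory visible under the web root.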
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Hello,
Today the camera "exposes" an RTSP stream. Could a feature be added to stream/broadcast RTSP from the camera? More or less what we can do with a Raspberry Pi and ffmpeg:

`ffmpeg -f v4l2 -i /dev/video0 -c:v h264_omx -preset medium -tune zerolatency -b:v 500k -f rtsp rtsp://output-url`

Where rtsp://output-url targets a server listening for incoming RTSP streams (like https://github.com/bluenviron/mediamtx).
Thanks.