Open laheller opened 5 years ago
Interesting idea. There are presumably additional controls to limit the amount of bandwidth used and so on?
Just check the mentioned software first to see how it works. As for the bandwidth, I would look for an API library with such a feature; otherwise I'm not sure. Or maybe ffmpeg can do that.
@openastroproject @fooons @mssc89
Documentation used for the below sample.
Broadcast from a webcam to YouTube via ffmpeg from the command line:
ffmpeg -hide_banner -f lavfi -i anullsrc=r=16000:cl=mono -rtbufsize 50M -thread_queue_size 8 -f v4l2 -framerate 30 -video_size 1280x720 -i /dev/video0 -vcodec libx264 -preset ultrafast -b:v 1500k -metadata title="Open Astro Project" -metadata description="Open Astro Project" -acodec aac -ar 48000 -f flv "rtmp://a.rtmp.youtube.com/live2/your-keyz-goez-here"
The above command uses a null audio source (YouTube also requires an audio track in the broadcast) and the V4L2 device /dev/video0.
Required packages are (or might be): ffmpeg and v4l-utils.
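For anyone who wants to verify the encoder settings before going live, here is a minimal dry run (my own sketch, assuming ffmpeg is installed): the same codec options as above, but with ffmpeg's synthetic testsrc video source in place of the webcam and a local FLV file in place of the RTMP endpoint, so neither a camera nor a stream key is needed.

```shell
# Dry run of the broadcast pipeline: synthetic test pattern instead of
# /dev/video0, local file instead of the YouTube RTMP URL. Output path
# is a placeholder; -t 2 limits the infinite lavfi sources to 2 seconds.
ffmpeg -hide_banner -y \
       -f lavfi -i anullsrc=r=16000:cl=mono \
       -f lavfi -i testsrc=size=1280x720:rate=30 \
       -vcodec libx264 -preset ultrafast -b:v 1500k \
       -acodec aac -ar 48000 \
       -t 2 -f flv /tmp/test-stream.flv
```

If this produces a playable /tmp/test-stream.flv, the same options should work against the RTMP URL.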
@openastroproject @fooons @mssc89 Hello
Any news or update on the above? Do you plan to implement such a feature (at least experimentally)?
BR,
Ladislav
I will look at how this might be possible for the 1.7.0 release, which is going to be some months away. At the moment I am trying to get 1.6.0 finished and released.
Actually, mounting the live display from oacapture as a virtual camera at /dev/videoX should not be too hard, and would be more versatile. I may look at this in my spare time and create a fork, but I make no guarantees.
Hi @openastroproject @mssc89
Any update? I would like to use this feature as a (hopefully better) alternative to the AllSky camera, which btw has no such feature.
BR,
Ladislav
I will begin working on it from September, since I will also need this feature for my own project. I haven't watched this project for quite some time; maybe something has moved on in this area?
I have been thinking about how it might best be done, but I haven't got around to writing code yet. Still working on a DSLR interface for oalive, though I think I'm getting close with that now. I haven't forgotten about it :)
As @mssc89 wrote in his older comment:
Actually, mounting live display from oacapture as virtual camera in /dev/videoX should not be too hard, and would be more versatile.
I think this approach could be the best, since with a virtual camera device you can do almost anything: use it in other applications, etc.
Update:
A virtual video device can be created by loading the v4l2loopback kernel module (the v4l2loopback-utils package provides related helper tools). From the command line:
sudo modprobe v4l2loopback
Then, using ffmpeg, it is easy to push the stream from the camera to the virtual device (here /dev/video1):
<stdout_stream_from_camera_using_oacapture> | ffmpeg -i - -vcodec rawvideo -pix_fmt yuv420p -threads 0 -f v4l2 /dev/video1
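As a side note, the v4l2loopback module accepts parameters that make the setup more predictable; the ones below are documented module options, though the device number and label here are just examples. This is a sketch that requires root and the module to be installed.

```shell
# Load the loopback module with an explicit device number and a friendly
# name, so /dev/video1 is guaranteed to be the virtual camera.
# exclusive_caps=1 makes some consumers (e.g. browsers) accept the device.
sudo modprobe v4l2loopback video_nr=1 card_label="oacapture" exclusive_caps=1

# Example: push a real camera into the loopback device to test it
# (device paths are examples, not oacapture's actual output).
ffmpeg -f v4l2 -i /dev/video0 -pix_fmt yuv420p -f v4l2 /dev/video1
```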
Another possibility, that may be simpler to implement, would be to have oacapture write a data stream to a named pipe (I think it should be possible to write a stream of PPM images in binary format, which would be relatively straightforward to generate). ffmpeg could then be used to send those images wherever.
Ok, I have a new output format for oacapture now that writes 8-bit mono/RGB/BGR data to a named pipe as a sequence of PPM files. It looks as though it works ok in that if I cat the named pipe into a normal file I get the output I expect. I'll check it in once I untangle my various development branches.
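For reference, this is the byte layout such a stream would have (my own minimal sketch, not oacapture's actual code): each frame is a binary PPM, i.e. a "P6" header giving the dimensions and maximum sample value, followed immediately by raw RGB bytes, with successive frames simply concatenated. Writing two tiny frames to a regular file makes the layout easy to inspect:

```shell
# One frame = P6 header + width*height*3 raw RGB bytes; frames are just
# concatenated back to back. Here: two 2x2 all-red frames.
emit_frame() {
    printf 'P6\n2 2\n255\n'                                    # 11-byte header
    printf '\377\000\000\377\000\000\377\000\000\377\000\000'  # 4 RGB pixels
}
{ emit_frame; emit_frame; } > /tmp/ppm-stream.bin              # 46 bytes total
```

cat-ing such a stream into a file, as described above, should give one inspectable PPM after another.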
I believe ffmpeg should be able to process a stream of ppm file data from the named pipe into a video stream of some sort, or even into a V4L2 loopback device. I can't work out how to do either at the moment however.
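One possible answer to both questions, assuming the pipe carries concatenated binary PPM frames: ffmpeg's image2pipe demuxer together with its ppm decoder should parse such a stream, and the decoded frames can then be sent on to a loopback device. The pipe name and device number below are placeholders.

```shell
# Read concatenated PPM frames from the named pipe and feed them into a
# v4l2loopback device; -framerate tells ffmpeg how to pace the frames,
# since a bare PPM stream carries no timing information.
ffmpeg -f image2pipe -framerate 30 -vcodec ppm -i /tmp/oacapture.fifo \
       -pix_fmt yuv420p -f v4l2 /dev/video1
```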
@openastroproject
Yes, this solution looks more universal, since ffmpeg can receive any data from stdin when called in the following way:
<oacapture data to pipe> | ffmpeg -i -
The trailing "-" on the command line means ffmpeg reads its input data from stdin.
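With PPM data specifically, the stdin form would additionally need the demuxer and decoder named, since there is no filename to infer them from. A self-contained sketch (the frame generator stands in for oacapture, and the output path is a placeholder):

```shell
# Generate three tiny 16x16 black PPM frames and pipe them to ffmpeg over
# stdin; "-i -" reads from stdin as described above.
for i in 1 2 3; do
    printf 'P6\n16 16\n255\n'                   # PPM header for one frame
    dd if=/dev/zero bs=768 count=1 2>/dev/null  # 16*16*3 raw RGB bytes
done | ffmpeg -hide_banner -y -f image2pipe -framerate 30 -vcodec ppm -i - \
              -vcodec mpeg4 -pix_fmt yuv420p /tmp/pipe-test.mp4
```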
It's also a bit more portable as MacOS doesn't have V4L2 for example, and provides a little more flexibility given that ffmpeg options aren't the most stable of things (though I admit they've got better in recent years).
It would be nice to add direct support for YouTube/Facebook too and I see no reason not to do that, even in the next release if time allows, but this change should at least allow the opportunity to work out the exact ffmpeg options required to get it all to happen.
Hi @openastroproject
VLC and OBS have such a feature: basically, they transcode the live stream coming from the camera (via ffmpeg?) and push it to RTMP endpoints such as:
rtmp://a.rtmp.youtube.com/live2/your-keyz-goes-here
rtmp://live-api-s.facebook.com:80/rtmp/stream_key
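For reference, the generic ffmpeg form for pushing an already-encoded stream to either of those endpoints looks like this (a sketch: the input file and stream key are placeholders; -re paces reading at the native frame rate, and FLV is the container RTMP expects):

```shell
# Push a local source to an RTMP ingest URL without re-encoding; swap the
# input for the camera/encoder options shown earlier in the thread.
ffmpeg -re -i input.flv -c copy -f flv \
       "rtmp://a.rtmp.youtube.com/live2/your-keyz-goes-here"
```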
BR,
Ladislav