Open deedos opened 11 years ago
Nice, Even though it's actually slightly different from our scope (tetra does a lot of optimizations specific to our webcams) We'll be looking into that.
Thanks.

On Oct 31, 2013 2:26 PM, "Daniel Roviriego" notifications@github.com wrote:
To be able to stream WebM to a Flumotion Server, it would be nice to implement the Flumotion protocol (FGDP) as an output sink from mbc-tetra. That would allow a direct link to a Flumotion Server. Another way of implementing such a thing would be a lossless sink from tetra (no encoding); one could then catch this stream in Flumotion and encode from there (and thus use the FGDP protocol for streaming 100% open source in WebM).
FGDP is implemented in Flumotion here: https://github.com/Flumotion/flumotion/tree/master/flumotion/component/consumers/fgdp and lucas santos implemented FGDP in landell (a GStreamer Python application which is, in theory, quite similar to mbc-tetra): https://github.com/lucasa/landell-fgdp/blob/master/sltv/output/fgdp.py
Actually the feature would only be an enhancement for outputting (a sink). Since I could not install and test mbc-tetra, I could not test its outputs, but I have seen something about a UDP output, right? If we could have a high-quality mux from there, we could make a localhost network and Flumotion would grab that stream and re-stream it in WebM to another Flumotion Server in the cloud. What other outputs are implemented?
I clearly see how it's useful, you don't need to preach to me :)
We currently have file and UDP output. Our current goal is to connect it with mbc-playout and for that we need a bit of infrastructure work (that is the current priority).
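As a rough illustration of how an external consumer could pick up that UDP output on localhost, something like the pipeline below should work. Note this is a sketch under assumptions: the port number and the guess that the stream is MPEG-TS muxed are mine, not tetra's documented behavior.

```shell
# Hypothetical consumer for tetra's UDP output over localhost.
# Port 5004 and the MPEG-TS assumption are guesses; adjust to the actual mux.
gst-launch-1.0 udpsrc port=5004 caps="video/mpegts" \
    ! tsdemux \
    ! decodebin \
    ! autovideosink
```

Flumotion (or any other GStreamer-based tool) could attach the same way and re-encode to WebM from there.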
Anyway, it's definitely a feature we want to have.
Ok, that's great. Could you give me instructions for installing?
Which input cards (v4l2src? decklinksrc? pipeline?) are supported at the moment?
As soon as I install, I'll try the UDP method for interacting with Flumotion.
On Oct 31, 2013 3:42 PM, "Daniel Roviriego" notifications@github.com wrote:
> Ok, that's great. Could you give me instructions for installing?
You need gst + pygi. That should be the only dep (I'm on the phone right now; in a few hours I'll get to a machine and can confirm).
> Which input cards (v4l2src? decklinksrc? pipeline?) are supported at the moment?
We ingest from v4l2 and have special optimizations for c920 1080p webcams.
> As soon as I install, I'll try the UDP method for interacting with Flumotion.
Awesome, Adrian should pick this up as soon as he's in front of a computer.
Cheers,
> Which input cards (v4l2src? decklinksrc? pipeline?) are supported at the moment?
For now we aim only at the Logitech C920 and C910, with specific emphasis on the C920 and newer models that have built-in H.264 compression (so we can trade picture quality for number of cameras), but one of our goals is to also support Decklink cards and frame grabbers.
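To illustrate the built-in compression that makes the C920 attractive, pulling the camera's on-board H.264 stream looks roughly like the sketch below. The device path, resolution, and decoder element are my assumptions, not tetra's actual pipeline.

```shell
# Hypothetical sketch: read the C920's on-board H.264 instead of raw frames.
# /dev/video0 and the caps are assumptions; adjust to your device.
gst-launch-1.0 v4l2src device=/dev/video0 \
    ! video/x-h264,width=1920,height=1080,framerate=30/1 \
    ! h264parse \
    ! avdec_h264 \
    ! autovideosink
```

Because the camera does the encoding, the host CPU cost per camera drops, which is the picture-quality-for-camera-count trade mentioned above.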
> As soon as I install, I'll try the UDP method for interacting with Flumotion.
Currently our outputs are only the composited and compressed video and audio, via TCP and RTSP; the latter needs more thought. From the discussion above with xaiki, what we can easily do is add another output with shmsink so the raw streams can be picked up with Flumotion, gst-launch, etc.
> From the discussion above with xaiki, what we can easily do is add another output with shmsink so the raw streams can be picked up with Flumotion, gst-launch, etc.
This should be a quick hack (3 lines of UI, 4 lines of code); let's just do it.
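A minimal sketch of the producer side of that hack, with videotestsrc standing in for tetra's composited stream (the socket path, caps, and property values are assumptions):

```shell
# Hypothetical producer side: raw video over a shared-memory socket.
# videotestsrc stands in for tetra's composited output.
gst-launch-1.0 videotestsrc is-live=true \
    ! video/x-raw,format=I420,width=640,height=480,framerate=24/1 \
    ! shmsink socket-path=/tmp/tetra-video.shm shm-size=10000000 wait-for-connection=false
```

Any shmsrc-capable consumer (Flumotion, another gst-launch process) can then attach to the socket and encode independently.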
That's awesome! That's for sure the cleanest and best way to do it: raw!
Done! It's at https://github.com/inaes-tic/mbc-tetra/tree/gtk3-swift-decoupling-clean
One very important thing that I had completely forgotten: Flumotion is not yet ported to GStreamer 1.0 (there is a GSoC for the task: http://www.googblogs.com/tag/gsoc/page/2/), so using mbc-tetra and Flumotion on the same machine would be a bit difficult, I suppose. What do you think? Should we go back to the other idea of catching the stream on a secondary machine, in the best quality possible (over localhost)?
You can; on Debian at least there is no problem having both installed at the same time.
cheers.
> One very important thing that I had completely forgotten: Flumotion is not yet ported to GStreamer 1.0 (there is a GSoC for the task: http://www.googblogs.com/tag/gsoc/page/2/), so using mbc-tetra and Flumotion on the same machine would be a bit difficult, I suppose.
I think they can live together, but the caps format is different. I just managed to get audio from 1.2 to 0.10, but I'm having some issues specifying the video format.
For the record, I used

```
gst-launch shmsrc socket-path=tetra-audio.shm ! audio/x-raw-int,rate=48000,channels=2,format=S16LE,endianness=1234,signed=true,depth=16 ! autoaudiosink
```

and the caps for 1.2 were:

```
audio/x-raw, rate=(int)48000, channels=(int)2, format=(string)S16LE
```
Ok, so I managed to get audio and video with GStreamer 0.10; what I missed was that I had to specify the video format as a fourcc, so the caps became

```
video/x-raw-yuv,format=(fourcc)I420,width=640,height=480,framerate=24/1
```

in my test example.
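Putting those caps together with the shmsrc approach, a 0.10 video consumer would look roughly like the sketch below. The socket path is an assumption; the caps are the ones that worked above.

```shell
# Hypothetical GStreamer 0.10 consumer for tetra's raw video over shm.
# Note the escaped \(fourcc\) so the shell passes the parentheses through.
gst-launch-0.10 shmsrc socket-path=tetra-video.shm \
    ! video/x-raw-yuv,format=\(fourcc\)I420,width=640,height=480,framerate=24/1 \
    ! ffmpegcolorspace \
    ! autovideosink
```

The same shm socket can thus feed a 0.10-based Flumotion worker while tetra itself runs on GStreamer 1.x.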