Closed CCMeta closed 3 years ago
The answer is I don't know. I've yet to try it.
From a 2 second Google search it seems like Xamarin.Android + FFmpeg should work. Xamarin.iOS + FFmpeg seems possible but trickier.
I have been able to deploy the Xamarin.Forms example app to my iPhone, so if you do try with iOS and get stuck on a programming issue I can at least look into it.
So it seems worth a try at least. I'll get on it. Thanks!~
What's your goal with using FFmpeg on Android? Video encoding? Or do you need to hook into the audio/video devices as well?
I've been playing around with Xamarin Android a bit and it seems that native libraries can be bundled into the application relatively painlessly. The painful bit seems to be building the native libraries, but I suspect once it's been done once it's straightforward. If you only need video encoding then building libvpx for Android and using `SIPSorceryMedia.Encoders` could be easier than the FFmpeg route.
Yes, the goal is video encoding, and it seems I also need to play streams such as RTP, RTSP, and SDP directly. The best outcome would be using just Xamarin.Forms so I can run it on both iOS and Android.
Is the `SIPSorceryMedia.Encoders` you mentioned the same sort of thing as `SIPSorceryMedia.FFmpeg`? And do I just need to use one of them, is that OK?
If Android can work this way, I'm curious to check iOS too.
Thanks for your kind note.~
The encoding and device integration with sipsorcery is a bit convoluted. This is due to the lack of available C# libraries to do either of those things, and where they are available, they aren't cross-platform.
The current situation is:

- `SIPSorceryMedia.Encoders`: wrapper for libvpx (video encoding) including a win-x64 native libvpx dll.
- `SIPSorceryMedia.FFmpeg`: wrapper for the FFmpeg libraries (audio and video encoding + more) with included win x86 and x64 dlls, BUT also with the ability to use FFmpeg dlls on Linux and macOS. Not tested on Android or iOS.
- `SIPSorceryMedia.Windows`: hooks into the WinRT media controls for webcam capture.

Ultimately the plan is to get rid of `SIPSorceryMedia.Encoders` and replace it with a C# port of libvpx or an H264 encoder, but I have no idea if/when that will happen.
After that, different solutions will still be required for using webcam and audio devices on each platform, in the same way `SIPSorceryMedia.Windows` currently facilitates.
Can we build several `ffmpeg` binaries, one for each platform? I still think the encoding task has almost nothing to do with C# libraries; it's all work for each platform's `ffmpeg` build. So I like `SIPSorceryMedia.FFmpeg` a little more.
And the FFmpeg.AutoGen repo doesn't seem able to serve `SIPSorceryMedia.FFmpeg` well on all platforms, especially the ARM family. If possible, I would rather avoid it and just use C# strings to run `ffmpeg` directly.
e.g.: first, build the `ffmpeg` binaries for all platforms into a dist folder. Then, if an app runs on both android-arm and android-x86, we run a different command for each:

```
# for android-x86
/bin/x86/ffplay audio.wav
# for android-arm
/bin/arm64/ffplay audio.wav
```
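A per-ABI dispatch like this could be sketched in plain shell; the `/bin/x86` and `/bin/arm64` paths are the hypothetical dist layout from my example, not real paths:

```shell
#!/bin/sh
# Select the ffplay binary that matches the current ABI.
# /bin/x86/ffplay and /bin/arm64/ffplay are illustrative dist paths.
case "$(uname -m)" in
  x86_64|i686)   FFPLAY=/bin/x86/ffplay ;;
  aarch64|arm64) FFPLAY=/bin/arm64/ffplay ;;
  *)             FFPLAY="" ;;
esac

if [ -n "$FFPLAY" ]; then
  echo "would run: $FFPLAY audio.wav"
else
  echo "no ffplay build for ABI $(uname -m)" >&2
fi
```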
If you're willing, please point out any problems with this approach. Many thanks.
> Can we build several `ffmpeg` binaries, one for each platform?
It's possible, but over time it would end up being a big maintenance burden. Any time a native library is required, cross-platform support can quickly spiral out of control...
FFmpeg does have the advantage that it can be installed independently as a separate package on a number of OSes, thereby removing the need to package native libraries. Like any software, the FFmpeg libraries will also change over time, so there's maintenance involved either way.
> I still think the encoding task has almost nothing to do with C# libraries.
There are no free or open source C# `VP8` or `H264` libraries that I know of. But that doesn't mean C# isn't suitable. I actually think C# would be fine for video encoding in client applications where there are likely to be a small number of streams.
I'm leaning towards spending time trying to port `libvpx` or `openh264` rather than doing more native library builds and wrappers...
So this is a bit like game theory :( If some hardware device only supports HEVC, I'd need to feed it into something like `ffmpeg` to encode new output for SIP.
Suppose I stick with my approach; everything I need is just `SIPSorceryMedia.Abstractions`, right?
I know nothing about codecs, so I'm counting on you next! Thanks!
> If some hardware device only supports HEVC, I'd need to feed it into something like `ffmpeg` to encode new output for SIP.
To support a wide range of codecs, especially video ones, native wrappers are the only viable option. Of the native libraries FFmpeg is one of the best, if not the best, options since it wraps up a bunch of other libraries in a consistent interface.
For Android and/or iOS, if you can work out how to get FFmpeg installed then it shouldn't be too much of a leap to use this `SIPSorceryMedia.FFmpeg` library to use that install for VP8 and H264 encoding and decoding.
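As an aside, a quick way to see whether a given FFmpeg install carries the codecs discussed here (VP8, H264, Opus) is to query the binary itself. A guarded sketch, assuming nothing about whether ffmpeg is actually installed:

```shell
#!/bin/sh
# Probe an ffmpeg install (if any) for the codecs sipsorcery uses.
if command -v ffmpeg >/dev/null 2>&1; then
  for codec in vp8 h264 opus; do
    if ffmpeg -hide_banner -codecs 2>/dev/null | grep -qw "$codec"; then
      echo "$codec: available"
    else
      echo "$codec: missing"
    fi
  done
else
  echo "ffmpeg not found on PATH"
fi
```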
> Suppose I stick with my approach; everything I need is just `SIPSorceryMedia.Abstractions`, right?
If your question is:

> If I want to write an audio or video endpoint to work with the main SIPSorcery library, are all the required interfaces to implement in `SIPSorceryMedia.Abstractions`?

then the answer is yes. The `SIPSorceryMedia.Abstractions` package has been designed to be the interface between the main library and audio/video capabilities.
I was quite sick this month until just recently; sorry about the late comment. Two things:
1. How to build a URI for `rtp://` or `rtcp://`? I see there is a demo named `webrtc-to-ffplay`, but it only builds an `sdp` file, with no URI. I want to make a `webrtc-to-uri`. Could you give me some suggestions on this?
2. `OnRtpPacketReceived => RTPPacket.PayloadType` is always equal to `111`; is this normal? I think some libraries, even ones following RFC 3551, still cannot recognize the dynamic value `111`.

Also, these days I found that Android and iOS both have `MediaCodec`, but it's quite basic. I want to try them without any `ffmpeg` first.
> How to build a URI for `rtp://` or `rtcp://`? I see there is a demo named `webrtc-to-ffplay`, but it only builds an `sdp` file, with no URI. I want to make a `webrtc-to-uri`. Could you give me some suggestions on this?
`FFmpeg` supports the `rtp` scheme (nothing supports an `rtcp` scheme; RTCP is part of RTP, did you mean `rtsp`?). But to be able to consume an RTP stream you need to know what the payload IDs correspond to. RTP does not have any way to negotiate media formats etc. That's where the SDP file comes in.
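To make the role of the SDP file concrete: the snippet below writes a minimal SDP describing a single Opus RTP stream (the address, port and payload ID are illustrative values) and then hands it to ffplay. Without the `a=rtpmap` line ffplay would have no way to know what payload ID 111 means.

```shell
#!/bin/sh
# Describe an incoming RTP stream in a minimal SDP file.
# 127.0.0.1:5004 and payload ID 111 are illustrative values.
cat > session.sdp <<'EOF'
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Opus RTP stream
c=IN IP4 127.0.0.1
t=0 0
m=audio 5004 RTP/AVP 111
a=rtpmap:111 opus/48000/2
EOF

# Play the stream the SDP describes (requires FFmpeg to be installed):
# ffplay -protocol_whitelist file,udp,rtp -i session.sdp
echo "wrote session.sdp"
```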
Unfortunately `FFmpeg` does not support WebRTC, so you won't be able to use it directly without some sort of bridge application along the lines of my demos. You could try `gstreamer` instead. It's supposed to be very similar to `FFmpeg` and does support WebRTC. I don't have any experience using it yet.
> My `OnRtpPacketReceived => RTPPacket.PayloadType` is always equal to `111`; is this normal? I think some libraries, even ones following RFC 3551, still cannot recognize the dynamic value `111`.
It's probably `OPUS`, and yes, dynamic payload IDs are perfectly normal and part of the standard. That's what you need the SDP exchange for: each peer needs to know what the dynamic payload IDs correspond to and then use them to negotiate common formats.
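For example, the audio section of a typical WebRTC SDP offer (values illustrative) is what binds the dynamic ID 111 to Opus; both peers see this in the offer/answer exchange and can then interpret incoming RTP packets with payload type 111 correctly:

```
m=audio 9 UDP/TLS/RTP/SAVPF 111
a=rtpmap:111 opus/48000/2
a=fmtp:111 minptime=10;useinbandfec=1
```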
It never occurred to me that playing an SDP file would be this complex. My plan:
First, try the Xamarin Android SDK `Codec`;
Second, try the native Java Gradle library `org.webrtc:google-webrtc`;
Third, try `gstreamer` instead of `FFMPEG`.
It seems like a long way to the finish. Thanks for the reply.
Could I build an RTSP stream (such as `rtsp://localhost:port/video.sdp`) with the `CreateRtpSession` method?
It's 2:00 AM my time, and I'm excited to tell you: the WebRTC RTP media channel on Xamarin-Android now connects without any FFmpeg, GStreamer, or native WebRTC Java code. It just uses a NuGet package called VLC, nothing more. If you want more information soon, tell me.
Deep thanks for all your kind help.
Wow that's big news. I'd love to see how you've done it!
https://github.com/CCMeta/Xamarin-Forms-demo
Run this demo and you will see all the code. The demo is rough and cannot make an RTC offer to anything by itself. You need to open https://shadow-board.ccmeta.com/web-rtc-canvas-with-css.html to make an RTC offer to the Xamarin Android app. When the RTC connection fires `OnRTPPacket` it calls `CreateLocalRTPSession`, like your ffmpeg demo code, and then you click the start button in the Android UI and the audio comes out. I've only written the Xamarin side receiving audio from the other client, so you need to open web-rtc-canvas-with-css.html with microphone access. This demo is just a shack; I only want to show you the core point, which is the NuGet package LibVLC.
Just two folders are necessary: `Xamarin-Forms-demo` and `Xamarin-Forms-demo.Android`.
If I understand your code correctly, does the information below represent what you're doing?
Browser Sending:
Canvas (video)/Microphone -> RTCPeerConnection (javascript) -> network
Android Receiving:
network -> RTCPeerConnection (C#) -> RTPSession -> VLC player
That's a neat trick with the loopback `RTPSession` to VLC player.
Yes, exactly as you describe. LibVLC can do a lot: decoding, rendering, converting and more. As far as I know, this package should work on all kinds of platforms. At the beginning I expected a lag at the decode step, not so "real-time", but now it runs well enough: the delay is around 100 ms.
I don't think there's anything I can contribute to this lib, but I'd like to. :)
Thanks for your help. I'll close this.
Hey, here I am again :P
I've been watching this lib for a long while, and I know it's for Linux, Mac, and Windows. Let me raise a possibility: if I build the `FFmpeg` library for iOS or Android and make sure it can be found from the console (environment variables), will this `SIPSorceryMedia.FFmpeg` then work as well as it does on desktop platforms? I want to run some small tests, so do you recommend I try this? Or is this just a fantasy for me? T_T Waiting for your answer, and thanks.
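For the "found in the console (environment variables)" part, a minimal sketch of what discoverability means in practice (plain POSIX shell, nothing iOS/Android specific):

```shell
#!/bin/sh
# Is an ffmpeg executable reachable through PATH?
if command -v ffmpeg >/dev/null 2>&1; then
  echo "ffmpeg binary found at: $(command -v ffmpeg)"
else
  echo "ffmpeg binary not found on PATH"
fi

# Native wrappers load the FFmpeg shared libraries (libavcodec,
# libavformat, ...) through the dynamic linker, so the library search
# path matters too, e.g. LD_LIBRARY_PATH on Linux/Android.
echo "LD_LIBRARY_PATH=${LD_LIBRARY_PATH:-<unset>}"
```

Note this only covers the CLI binary; a wrapper library needs the shared libraries on the dynamic linker's search path as well, which is usually the harder part on mobile platforms.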