microsoft / MixedReality-WebRTC

MixedReality-WebRTC is a collection of components to help mixed reality app developers integrate audio and video real-time communication into their applications and improve their collaborative experience.
https://microsoft.github.io/MixedReality-WebRTC/
MIT License

Support iOS and Android SDK #28

Open anujb opened 5 years ago

anujb commented 5 years ago

Summary:

Support iOS and Android devices

Value proposition:

Creates a standard communications model for first-party and third-party devices (iOS, Android, Magic Leap) and introduces network effects that can be leveraged with existing investments in cloud/edge infrastructure across providers (Azure, AWS, Verizon, etc.).

Background:

Existing native SDKs:

iOS WebRTC SDK: https://webrtc.org/native-code/ios/
Android WebRTC SDK: https://webrtc.org/native-code/android/
Cross-platform support: TBD

djee-ms commented 5 years ago

Hi @anujb, thanks for opening that task.

Indeed, the underlying Google implementation has some form of support for iOS and Android. However, this has never been tested with the current project, so it would require some work to make sure it can be leveraged, and to set up testing and CI infrastructure for those platforms to ensure proper support.

At the moment this looks like a time investment, and we have other areas of focus with clear immediate customer needs, so this is not a priority. I am happy to reprioritize if there is a strong signal that this can be helpful to customers.

Note however that we do plan to support iOS and Android deployment via Unity at some point. Would that work for you, or are you looking for iOS/Android support for non-Unity native apps (C++ and/or C#)?

valentasm1 commented 4 years ago

Maybe it is obvious, but I want to know if it will be possible to use it in native Xamarin.Android or even Xamarin.Forms? By the way, I am impressed by your work. The library looks astonishing, and I will for sure use it in future projects.

djee-ms commented 4 years ago

Thanks @valentasm1! I don't think anyone ever tried Xamarin, and I don't have any experience with it, so I cannot tell you if it works. I can only tell you that this is not an officially supported scenario/platform.

AtosNicoS commented 4 years ago

It looks like Android support was added with the linked PR. From the READMEs it looks to me like Android support only works in Android Studio? Is there any information on Xamarin.Android available yet?

djee-ms commented 4 years ago

Android support is limited to Unity deployments. There is no plan to add generic Android support (native apps or Xamarin) at this time.

drejx commented 4 years ago

Hello!

I would like to dive into using the MixedReality SDK for Unity mobile, but I'm curious about how Android and iOS support is faring. I see that work for Android support has already been merged to master (woot!), but from poking around it appears as though it's not functional yet? For example, there are some crashes/deadlocks (#335, #329) and video capture is not implemented (#246).

The only documentation I found so far is how to build the Android MixedReality .aar from sources:

https://github.com/microsoft/MixedReality-WebRTC/tree/master/tools/build/android

If I'm not mistaken, to get started with MixedReality-WebRTC on Android I need to build Google's Android libwebrtc from sources? Is it not possible to use the pre-built Android libs from Google here:

https://bintray.com/google/webrtc/google-webrtc/1.0.30039#files/org%2Fwebrtc%2Fgoogle-webrtc%2F1.0.30039

Or does it need to be built from the Google source in order to link and produce mrwebrtc.aar?

Apologies if some of the questions are newbish, but I'm just starting to dip my toes in the water and don't know if I should dive right in!

Regarding iOS I understand that is on the roadmap. Is there an approximate timeline when work on it will begin?

Thanks, Andrej

djee-ms commented 4 years ago

from poking around it appears as though it's not functional yet?

It's not in a great state, as it's missing video capture (#246), which directly triggers #335, so they're essentially the same unit of work (kind of). #329 is easily worked around; I just have to find a time slot to investigate how to make the change permanent, since it's in a Unity-generated file. Other than that, if you are not capturing video on Android but only receiving, and/or using audio or data, everything works. So this is not a great experience but shouldn't be an immediate blocker unless you need video capture. Note that remote video rendering (displaying video received from a non-Android remote peer) works.

I need to build Google's Android libwebrtc from sources?

Yes, the checked-in README files should describe the process. There are some bash scripts to run which will do all the work for you. It requires a Linux environment (or WSL2, but I'd recommend a proper Linux environment; I use a VM in Hyper-V locally); this is a constraint from Google. Checkout is large and slow, but only needs to be done once. Building is reasonably fast after that. Start with tools/build/libwebrtc, then after that tools/build/android.

Is it not possible to use the pre-built Android libs from Google here

As far as I know, they deprecated them around the same time we were starting to look at this. https://groups.google.com/d/msg/discuss-webrtc/Ozvbd0p7Q1Y/M4WN2cRKCwAJ: "We have decided to discontinue the distribution of precompiled libraries for Android and iOS."

does it need to be built from the Google source in order to link and produce mrwebrtc.aar?

Yes. Currently we build mrwebrtc.aar by dynamically linking libwebrtc.aar, which is pretty bad. We want to get rid of libwebrtc.aar entirely, because it contains many things we don't need (the entire Java API surface), and get a single self-contained mrwebrtc.aar. But we initially had issues with static linking, so we decided to postpone that, move forward with completing the PR that added Android support, and revisit later.

Regarding iOS I understand that is on the roadmap. Is there an approximate timeline when work on it will begin?

I don't have any more info at this time. We are actively working on the 2.0 release; we are roughly feature complete and need to fix and polish the last few items, so that is our work for the next few weeks or so.

drejx commented 4 years ago

Hi @djee-ms,

Thank you for taking the time to respond to my questions.

So this is not a great experience but shouldn't be an immediate blocker unless you need video capture. Note that remote video rendering (displaying video received from a non-Android remote peer) works.

Understood. In my case we do need local/outgoing video in the short term, but as you say it's not a showstopper. As long as local/outgoing video will be available in the next few weeks :)

I was also wondering if the following features are available with the Android version:

  1. Support for ExternalVideoTrackSource for the client application to supply the video frames.
  2. Volume control for audio. This is pretty key in video game applications where voice volume needs to be independently controlled from the music and sfx.

Regarding iOS, although it's not on a specified timeline anywhere, can you say if it's planned to be on the next roadmap? It's something key for planning mobile development since Android and iOS are like peas in a pod 😄

Cheers, Andrej

drejx commented 4 years ago

@djee-ms Hi Jerome,

Sending out a friendly poke to see if you could take a look at my follow-up questions above. 🙏

Thanks, Andrej

djee-ms commented 4 years ago

Hi @drejx, sorry I missed it!

As long as local/outgoing video will be available in the next few weeks :)

It will.

I was also wondering if the following features are available with the Android version:

  1. Support for ExternalVideoTrackSource for the client application to supply the video frames.

Yes, this is independent of the platform. Actually this should already work; did you try?
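For reference, here is a rough sketch of what feeding frames through ExternalVideoTrackSource from application code can look like in C#. This is only an illustration: the type and member names used (ExternalVideoTrackSource.CreateFromI420ACallback, LocalVideoTrack.CreateFromSource, FrameRequest.CompleteRequest, the I420AVideoFrame fields) are assumptions based on the 2.x C# API and should be verified against the release you actually build against.

```csharp
// Sketch only: drive an ExternalVideoTrackSource with app-provided I420 frames.
// API names are assumed from the Microsoft.MixedReality.WebRTC 2.x C# library
// and should be checked against the release actually used.
using System;
using System.Runtime.InteropServices;
using Microsoft.MixedReality.WebRTC;

public class CustomVideoSender : IDisposable
{
    private const int Width = 640;
    private const int Height = 480;

    // App-owned I420 planes; fill these with the content to send.
    private readonly byte[] _yPlane = new byte[Width * Height];
    private readonly byte[] _uPlane = new byte[Width * Height / 4];
    private readonly byte[] _vPlane = new byte[Width * Height / 4];

    private ExternalVideoTrackSource _source;
    private LocalVideoTrack _track;

    public void Start(PeerConnection pc)
    {
        // The library pulls frames on demand through the request callback.
        _source = ExternalVideoTrackSource.CreateFromI420ACallback(OnFrameRequested);
        _track = LocalVideoTrack.CreateFromSource(_source,
            new LocalVideoTrackInitConfig { trackName = "custom_video" });

        // Attach the track to a video transceiver for sending.
        var transceiver = pc.AddTransceiver(MediaKind.Video);
        transceiver.LocalVideoTrack = _track;
    }

    private void OnFrameRequested(in FrameRequest request)
    {
        // Pin the managed planes so their addresses can be passed to native code.
        var y = GCHandle.Alloc(_yPlane, GCHandleType.Pinned);
        var u = GCHandle.Alloc(_uPlane, GCHandleType.Pinned);
        var v = GCHandle.Alloc(_vPlane, GCHandleType.Pinned);
        try
        {
            var frame = new I420AVideoFrame
            {
                width = Width,
                height = Height,
                dataY = y.AddrOfPinnedObject(),
                dataU = u.AddrOfPinnedObject(),
                dataV = v.AddrOfPinnedObject(),
                dataA = IntPtr.Zero, // no alpha plane
                strideY = Width,
                strideU = Width / 2,
                strideV = Width / 2,
                strideA = 0
            };
            request.CompleteRequest(in frame);
        }
        finally
        {
            y.Free();
            u.Free();
            v.Free();
        }
    }

    public void Dispose()
    {
        _track?.Dispose();
        _source?.Dispose();
    }
}
```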

  2. Volume control for audio. This is pretty key in video game applications where voice volume needs to be independently controlled from the music and sfx.

What is the feature needed here exactly? I would expect the volume to be controlled by the rendering/output code on the receiver side, no? I think a priori (didn't check) that by default 1) there's no volume associated with an audio track other than the intrinsic amplitude of the raw audio data itself (so there's no gain variable), and 2) automatic gain control (AGC) is active on the source when using a microphone. When using a custom audio source (not available yet; an external audio track is coming soon too), you're in control of what you produce. Can you describe what you are looking for?

can you say if it's planned to be on the next roadmap? It's something key for planning mobile development since Android and iOS are like peas in a pod.

ACK on the request and the criticality for cross-platform dev, which I completely understand and agree with; unfortunately I cannot share anything about our roadmap beyond 2.0 at this time, sorry.

drejx commented 4 years ago

Support for ExternalVideoTrackSource for the client application to supply the video frames.

Yes, this is independent of the platform. Actually this should already work; did you try?

Not yet, but I was curious since it's a key feature for my case at least. I've been setting up the Android build and am now starting to fiddle with the code. But I imagine you're trying to keep the mrwebrtc API surface the same across all platforms.

Volume control for audio. This is pretty key in video game applications where voice volume needs to be independently controlled from the music and sfx.

What is the feature needed here exactly?

My question was based on my previous experience with the Windows (WebRTC) version, where there is no control on the client/user side over the WebRTC voice volume of remote audio streams. As a use case example, say there is a 2-player game/app and you want to increase the voice volume of the remote (incoming) stream because you can't hear the other player speak.
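Following up on the earlier point that volume would be controlled by the rendering/output code on the receiver side, a possible Unity-side workaround for this use case is sketched below. This is not an API provided by MixedReality-WebRTC; it assumes the remote peer's audio ends up playing through a standard Unity AudioSource, and the "VoiceVolume" mixer parameter is a hypothetical, project-defined name.

```csharp
// Sketch only: receiver-side voice volume control in Unity, independent of music/SFX.
// Assumes the remote WebRTC audio is routed through the AudioSource referenced below.
using UnityEngine;
using UnityEngine.Audio;

public class VoiceVolumeControl : MonoBehaviour
{
    [SerializeField] private AudioMixer mixer;        // mixer exposing a "VoiceVolume" parameter (project-defined)
    [SerializeField] private AudioSource remoteVoice; // AudioSource playing the remote peer's audio

    // Called e.g. from a UI slider with a value in [0, 1].
    public void SetVoiceVolume(float linear)
    {
        // Option 1: per-source linear volume on the voice AudioSource.
        remoteVoice.volume = linear;

        // Option 2: adjust a dedicated mixer group (in decibels), so voice is
        // controlled independently of the music and SFX groups.
        float dB = Mathf.Lerp(-80f, 0f, linear);
        mixer.SetFloat("VoiceVolume", dB);
    }
}
```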

...iOS planned to be on the next roadmap?

ACK on the request and the criticality for cross-platform dev, which I completely understand and agree with; unfortunately I cannot share anything about our roadmap beyond 2.0 at this time, sorry.

Oh, that's too bad. Would you be able to say when you think there will be an update/announcement related to "beyond 2.0" so I can be on the lookout? I'm sure I'm not the only one ;)

Cheers, Andrej