Closed ShoziX closed 2 years ago
@zhangtao1104 I just saw V2.5 was released, does that update happen to incorporate this? 🤞
Not yet, we plan to update the Unity SDK at the end of September.
@zhangtao1104 I haven't quite given up on this yet. I've got the Agora SDK integrated into my app, but this is the last step. Currently I'm using the camera feed, but this obviously becomes an issue when using ARCore and ARKit, because they also need access to the camera.
I'm just completely stuck on this wrapping native sdk step.
I'm hoping I can get this working in the next couple weeks because I've got some conferences coming up in 4 weeks and I'd really like to get this feature working :)
I also love the agora platform and would love to stick with it.
Is there any other way to get the SetExternalVideoSource and PushVideoFrame functions working? 🤞
OK, our customer support engineer will contact you. Please communicate with him first, so that we can provide you with more help.
Do you have any update on this? This would be a great feature for many developers I think.
Yes, you can download the Agora SDK from the Unity Asset Store; it already supports this. @biemann @ShoziX @ARUtility
void Start()
{
    mRect = new Rect(0, 0, Screen.width, Screen.height);
    mTexture = new Texture2D((int)mRect.width, (int)mRect.height, TextureFormat.RGBA32, false);
}

void Update()
{
    StartCoroutine(cutScreen());
}

IEnumerator cutScreen()
{
    yield return new WaitForEndOfFrame();
    //videoBytes = Marshal.AllocHGlobal(Screen.width * Screen.height * 4);
    mTexture.ReadPixels(mRect, 0, 0);
    mTexture.Apply();
    Renderer rend = GetComponent<Renderer>();
    rend.material.mainTexture = mTexture;
    byte[] bytes = mTexture.GetRawTextureData();
    int size = Marshal.SizeOf(bytes[0]) * bytes.Length;
    IRtcEngine rtc = IRtcEngine.QueryEngine();
    if (rtc != null)
    {
        int a = rtc.PushVideoFrame((int)mRect.width, (int)mRect.height, bytes);
        Debug.Log(" pushVideoFrame = " + a);
    }
}
Can you show how to call SetExternalVideoSource()?
SetExternalVideoSource(true, false)
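For context, a minimal sketch of where that call fits — the app ID and channel name below are placeholders, not from this thread; the key point is that SetExternalVideoSource must run after engine creation and before joining the channel:

public class ExternalSourceSetup : MonoBehaviour
{
    private IRtcEngine mRtcEngine;

    void Start()
    {
        mRtcEngine = IRtcEngine.GetEngine("YOUR_APP_ID");   // placeholder app ID
        // Tell the engine we will push frames ourselves instead of
        // capturing from the device camera.
        // (true = enable external source, false = raw byte buffers rather than texture IDs)
        mRtcEngine.SetExternalVideoSource(true, false);
        mRtcEngine.JoinChannel("YOUR_CHANNEL", null, 0);    // placeholder channel name
    }
}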
Assets/PushVideo.cs(45,25): error CS1061: Type `agora_gaming_rtc.IRtcEngine' does not contain a definition for `PushVideoFrame' and no extension method `PushVideoFrame' of type `agora_gaming_rtc.IRtcEngine' could be found. Are you missing an assembly reference?
Did you try with PushExternVideoFrame instead?
This is my definition of CutScreen(). I modified the version given by Jake on Agora's Slack channel:
private IEnumerator CutScreen() {
yield return new WaitForEndOfFrame();
mTexture.ReadPixels(mRect, 0, 0);
mTexture.Apply();
byte[] bytes = mTexture.GetRawTextureData();
IRtcEngine rtc = IRtcEngine.QueryEngine();
if (rtc != null)
{
ExternalVideoFrame externalVideoFrame = new ExternalVideoFrame()
{
type = ExternalVideoFrame.VIDEO_BUFFER_TYPE.VIDEO_BUFFER_RAW_DATA,
format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_BGRA,
buffer = bytes,
stride = (int) mRect.width,
height = (int) mRect.height,
cropLeft = 0,
cropTop = 0,
cropRight = 0,
cropBottom = 0,
rotation = 0,
timestamp = i++ // i: an int field used as a running frame counter
};
rtc.PushExternVideoFrame(externalVideoFrame);
}
}
Did you try with PushExternVideoFrame instead?
Not getting any reference for PushExternVideoFrame either!!
Did you download the newest SDK package from the Unity Asset Store? @kkarannn
You can send me the file named AgoraGamingRtcEngine.cs and I can help you check it.
According to the file you provided, I can confirm that you didn't download the SDK from the Unity Asset Store. Please download it from the Unity Asset Store and try again. @kkarannn
I downloaded the new SDK. rtc.PushExternVideoFrame(externalVideoFrame) is now getting resolved, but I get: "Failure to initialize! Your hardware does not support this application, sorry!"
Are you running the application on an Android phone?
I think it is because the Android plugin includes the arm64-v8a architecture.
You have two choices; pick either one:
1: Delete the arm64-v8a architecture. (Google has mandatory requirements for the arm64 architecture of Android applications, so only do this if you're just debugging locally rather than publishing.)
2: Change the scripting backend in Build Settings from Mono to IL2CPP. Mono doesn't support building for arm64, but IL2CPP does.
@kkarannn
Please try again following my suggestions; it should work.
I got it working!! But there are a couple of issues moving forward. I appreciate your quick responses, thank you!
1. Is there a way to have ExternalVideoFrame.VIDEO_PIXEL_FORMAT RGBA instead of BGRA? The colour format appears wrong on my spectating device; the colours are off, and I guess BGRA is causing it.
2. Also, on my ARCore device which is pushing the external video, the screen is duplicated and flipped!
1: About the BGRA, you can set the TextureFormat to TextureFormat.BGRA32 and try again.
2: About the flip, you can flip the GameObject before you bind the VideoSurface to the GameObject you want to render the video on.
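Putting both suggestions together, a hedged sketch — the mTexture/mRect field names are borrowed from the CutScreen code above, and the quad name is a placeholder:

public class PushSetup : MonoBehaviour
{
    private Texture2D mTexture;
    private Rect mRect;

    void Start()
    {
        mRect = new Rect(0, 0, Screen.width, Screen.height);
        // 1: Capture in BGRA32 so the raw bytes match VIDEO_PIXEL_BGRA,
        //    avoiding the red/blue channel swap seen on the remote side.
        mTexture = new Texture2D((int)mRect.width, (int)mRect.height,
                                 TextureFormat.BGRA32, false);

        // 2: Flip the receiving quad before binding the VideoSurface,
        //    so the remote feed is not rendered upside down.
        GameObject quad = GameObject.Find("VideoQuad"); // placeholder object name
        Vector3 s = quad.transform.localScale;
        quad.transform.localScale = new Vector3(s.x, -s.y, s.z);
        quad.AddComponent<VideoSurface>();
    }
}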
@kkarannn can you please share working code for ARFoundation? I have been trying for a long time and am stuck.
Please see the README file in the latest SDK regarding ARFoundation.
For now, there is no way for me to modify the texture that is being transmitted over the network. I am building an AR app in Unity, and I want the feed to be shared after the AR processing is done; if I show game objects in the camera view, they should also be shared over the network. If I had access to the texture being sent from my device I could handle this myself, but right now the DLL handles everything other than which GameObject the feed gets rendered on. Is there anything I am missing, or any lead through which I can make it work?
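One possible approach (a sketch under assumptions, not an official Agora answer — the camera and texture names are mine): render the AR camera, including its virtual objects, into a RenderTexture, read that back, and push it yourself with PushExternVideoFrame, so you control exactly what is transmitted:

public class CompositedFeedPusher : MonoBehaviour
{
    public Camera arCamera;          // assumed: the camera that draws camera feed + AR objects
    private RenderTexture mRenderTex;
    private Texture2D mTexture;
    private int mFrame;              // running frame counter for timestamps

    void Start()
    {
        mRenderTex = new RenderTexture(Screen.width, Screen.height, 24);
        mTexture = new Texture2D(Screen.width, Screen.height, TextureFormat.BGRA32, false);
        arCamera.targetTexture = mRenderTex;   // camera now renders into our texture
    }

    void Update()
    {
        StartCoroutine(PushComposited());
    }

    IEnumerator PushComposited()
    {
        yield return new WaitForEndOfFrame();
        // Read the composited frame (camera image + AR content) back to the CPU.
        RenderTexture.active = mRenderTex;
        mTexture.ReadPixels(new Rect(0, 0, mRenderTex.width, mRenderTex.height), 0, 0);
        mTexture.Apply();
        RenderTexture.active = null;

        IRtcEngine rtc = IRtcEngine.QueryEngine();
        if (rtc != null)
        {
            ExternalVideoFrame frame = new ExternalVideoFrame
            {
                type = ExternalVideoFrame.VIDEO_BUFFER_TYPE.VIDEO_BUFFER_RAW_DATA,
                format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_BGRA,
                buffer = mTexture.GetRawTextureData(),
                stride = mRenderTex.width,
                height = mRenderTex.height,
                rotation = 0,
                timestamp = mFrame++
            };
            rtc.PushExternVideoFrame(frame);
        }
    }
}

Caveat: assigning targetTexture stops that camera from rendering to the screen, so in practice you would use a second camera for local display, or blit mRenderTex back to the screen.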