Unity-Technologies / com.unity.webrtc

WebRTC package for Unity

[BUG]: VideoReceiveSample received texture not updating #1000

Closed Julius-Caesar6 closed 8 months ago

Julius-Caesar6 commented 8 months ago

Package version

3.0.0-pre.6

Environment

* OS: mac OS Big Sur
* Unity version: 2021.3.10f1

Steps To Reproduce

  1. From the WebRTC package samples, open the 'VideoReceive' sample.
  2. Open the script VideoReceiveSample.cs.
  3. Go to line 83, where the RawImage texture is set to the texture of the VideoStreamTrack.
  4. Below line 83, add a line that casts the received video texture `tex` to a Texture2D (`tex as Texture2D`), get its pixels, and print the values of the first pixel with Debug.Log.
  5. Enter play mode, start the call, and add the track.

Current Behavior

The WebRTC connection works and the video is received live on the RawImage on the right side. The created Texture2D, however, is never updated and always shows the same pixel values. Setting receiveImage.texture to the Texture2D object works and produces a live image, but the pixel data read from the Texture2D object stays static and only ever gives a single value.

Expected Behavior

The pixel data of the texture received from the VideoStreamTrack in OnVideoReceived should update live.

Anything else?

No response

karasusan commented 8 months ago

@Julius-Caesar6 Can you give me a code snippet to replicate your issue?

Julius-Caesar6 commented 8 months ago

Of course. In the VideoReceive sample, in VideoReceiveSample.cs from line 81 (`video.OnVideoReceived += tex =>`), replace line 83 (`receiveImage.texture = tex;`) with this code:

```csharp
// Original line where the RawImage texture is set to tex.
// receiveImage.texture = tex;
Texture2D originalReceivedTexture = tex as Texture2D;

// Setting the RawImage texture to the Texture2D also works and gives the correct live view.
// receiveImage.texture = originalReceivedTexture;

// Get the pixels of the texture into an array of Colors.
Color[] texturePixels = originalReceivedTexture.GetPixels();

// Pixel manipulation ... (for example grayscaling)

// Apply the pixels back to a new texture.
Texture2D processedTexture = new Texture2D(tex.width, tex.height);
processedTexture.SetPixels(texturePixels);
processedTexture.Apply();

// In this instance, in play mode the RawImage only shows a gray image.
receiveImage.texture = processedTexture;
```

So even without manipulating the pixels at all, just reading them from one texture and setting them on another loses the live aspect. I don't necessarily need to set the pixels on a new texture (in this snippet it is only for illustration); I just need the live pixel values of `tex`.

Thank you for your time and help!

karasusan commented 8 months ago

@Julius-Caesar6 Hi, thanks for the details, I replicated your issue. `originalReceivedTexture` is updated asynchronously on the GPU. The CPU-side memory of the texture is not updated, so `GetPixels` returns a stale (gray) color buffer.

Can you try `ReadPixels` instead of `GetPixels`? `ReadPixels` reads pixel data back to CPU memory from the active render target. Keep in mind that reading back to CPU memory is generally expensive, so please check the performance.
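A minimal sketch of that suggestion inside the `OnVideoReceived` handler (this is an illustration, not the sample's actual code; it assumes `tex` is the texture delivered by the callback and that it can be blitted to a temporary RenderTexture):

```csharp
// Copy the GPU-side texture into a temporary RenderTexture,
// then read it back to CPU memory with ReadPixels.
RenderTexture tmp = RenderTexture.GetTemporary(tex.width, tex.height, 0);
Graphics.Blit(tex, tmp);

RenderTexture previous = RenderTexture.active;
RenderTexture.active = tmp;

// ReadPixels reads from the currently active RenderTexture into the Texture2D.
Texture2D readable = new Texture2D(tex.width, tex.height, TextureFormat.RGBA32, false);
readable.ReadPixels(new Rect(0, 0, tex.width, tex.height), 0, 0);
readable.Apply();

RenderTexture.active = previous;
RenderTexture.ReleaseTemporary(tmp);

// GetPixels on the readable copy now reflects the current frame.
Color firstPixel = readable.GetPixels()[0];
Debug.Log(firstPixel);
```

Note this does a full GPU-to-CPU readback every frame the callback fires, which is the performance cost mentioned above.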

Julius-Caesar6 commented 8 months ago

@karasusan Thank you for your reply! I will try your recommendations and see if I can get it working.