Unity-Technologies / Unity-Robotics-Hub

Central repository for tools, tutorials, resources, and documentation for robotics simulation in Unity.
Apache License 2.0

Unity performance drastically drops when receiving images #185

Closed Jordi24 closed 3 years ago

Jordi24 commented 3 years ago

For a project, we are trying to stream four CompressedImage topics, each publishing at 30 Hz, to Unity. In the search for a ROS-Unity connection that can handle this, we came across your Unity Robotics Hub, which promises faster speeds and better performance than, for example, the ROS# package.

After following the guides, we managed to successfully set up a connection. However, as soon as we start publishing the image data (even from just one camera), the Unity screen basically freezes and the FPS shown in Unity's statistics widget drops drastically from ~190 FPS to 0.1 FPS. At the moment we are using the settings shown in the screenshot below.

Is there anything specific that we need to change/set correctly before the connection can handle images?

(screenshot of the ROS connection settings)

mpinol commented 3 years ago

Hi @Jordi24

How are you capturing the image data? Is it occurring in Update or FixedUpdate?

Based on your description my guess is that the image capture and publish commands are occurring on the main thread and blocking the execution of the rest of the scene.

I would suggest putting the image capture and publish functions in a coroutine and see if that helps. https://docs.unity3d.com/ScriptReference/Coroutine.html
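A minimal sketch of that suggestion (editorial illustration only; CaptureImage and PublishImage are hypothetical stand-ins for the project's own capture and publish code):

```csharp
using System.Collections;
using UnityEngine;

public class CoroutineImagePublisher : MonoBehaviour
{
    void Start()
    {
        StartCoroutine(CaptureAndPublishLoop());
    }

    private IEnumerator CaptureAndPublishLoop()
    {
        while (true)
        {
            // Wait until rendering has finished before reading the frame,
            // so the capture does not stall the middle of a render pass.
            yield return new WaitForEndOfFrame();
            byte[] image = CaptureImage(); // hypothetical capture helper
            PublishImage(image);           // hypothetical ROS publish helper
        }
    }

    private byte[] CaptureImage() { /* hypothetical */ return null; }
    private void PublishImage(byte[] data) { /* hypothetical */ }
}
```

Note that coroutines still run on the main thread; spreading the work across frames like this can smooth out hitches, but it does not make the work itself cheaper.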

I hope this helps and let me know how it goes!

Jordi24 commented 3 years ago

Hi @mpinol, thanks for your reply.

Currently, I use the following code to capture the image data and show it on screen:

using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using CompressedImage = RosMessageTypes.Sensor.CompressedImage;

public class SubscriberCamera : MonoBehaviour
{
    public string TopicName;

    private Texture2D texture2D;
    [SerializeField] private MeshRenderer meshRenderer;

    private void Start()
    {
        texture2D = new Texture2D(1, 1);
        meshRenderer.material = new Material(Shader.Find("Standard"));
        ROSConnection.instance.Subscribe<CompressedImage>(TopicName, GetImage);
    }

    private void GetImage(CompressedImage Message)
    {
        texture2D.LoadImage(Message.data);
        meshRenderer.material.SetTexture("_MainTex", texture2D);
    }
}

As you can see, no Update or FixedUpdate is involved in the process. However, I did some further research just now and noticed that when I comment out the lines in the GetImage function, the FPS is just fine while the images are still being received. Thus, I figured that it is not receiving the images but the texture2D.LoadImage call that is causing the problems. Do you have any suggestions for making this process quicker and more efficient? It is important to mention that we want to display the images as soon as they are received.

tgroechel commented 3 years ago

Following an older thread on image loading, it looks like someone made an async/multithreaded texture loader last October that you may want to check out here.

mpinol commented 3 years ago

Hey @Jordi24 ,

Sorry for the delay in following up. I have been asking around with those more familiar with the Texture APIs, and they mostly pointed toward a solution like the one tgroechel suggested.

I looked into my earlier suggestion of using a coroutine more and don't believe it would help as the texture calls will need to be executed on the main thread.

I also found that Texture2D.LoadImage decodes the image data (and can compress the resulting texture), which could cause a slowdown versus doing something like,

tx = new Texture2D(img.Width, img.Height, TextureFormat.ARGB32, false);
tx.LoadRawTextureData(bytTx);
tx.Apply(); // upload the raw pixel data to the GPU

Doing it this way loads the raw texture data without any decoding and might be a little faster than using the LoadImage API.

Have you made any progress on your issue?

Jordi24 commented 3 years ago

Hi @mpinol (and also @tgroechel), thank you for your messages thus far.

Over the last few weeks I have been very busy, but in the meantime I also continued working on this issue. Unfortunately, the problem is still not solved. The solution suggested by @tgroechel keeps creating new threads for newly received images, which causes my RAM to quickly fill up to the point that Windows forces me to close Unity, without any satisfactory result on screen.

@mpinol: When I feed the LoadRawTextureData function the Message.data byte array, it keeps giving me "not enough data provided (will result in overread)" errors, even though I am quite sure that I have set the correct resolution and TextureFormat. Any ideas on how I could solve this?

Please keep in mind that we have four cameras, each publishing at 30 Hz, which results in 120 JPEG CompressedImage-type messages per second. Thus, the main issue here is speed: how can we make sure that Unity processes and shows the images fast enough? I feel it should be possible to get this streaming of images working and I am looking forward to your suggestions 😄.

mpinol commented 3 years ago

hey @Jordi24 ,

I have run some tests locally and have been unable to reproduce this issue. Do you have a project that exhibits this issue that you would not mind sharing? Even a very simple project would be helpful. Can you please send it to unity-robotics@unity3d.com.

Do you see the same fps drop when publishing only from a single camera?

Getting the correct TextureFormat, height, and width settings can be a little tricky sometimes. What is the total length of the Message.data byte array and what is the expected height and width of the image?

Jordi24 commented 3 years ago

Hi @mpinol, thanks for your message.

Unfortunately, I am not allowed to share the project and the bag files we use for testing. However, for the camera images, we use the standard ROSConnection script that comes from the GitHub combined with this custom code:

using System.Collections;
using System.IO;
using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using CompressedImage = RosMessageTypes.Sensor.CompressedImage;

public class SubscriberCamera : MonoBehaviour
{
    public string TopicName;

    private bool messageProcessed = false;
    private Texture2D texture2D;
    [SerializeField] private MeshRenderer meshRenderer;

    private int frameCount = 0;
    private int receivedCount = 0;
    private float dt = 0.0f;
    public int FPS = 0;         // Frames Per Second
    public int RPS = 0;         // Received Per Second
    private int updateRate = 1; // 1 update per sec

    private void Start()
    {
        texture2D = new Texture2D(1, 1);
        meshRenderer.material = new Material(Shader.Find("Standard"));
        ROSConnection.instance.Subscribe<CompressedImage>(TopicName, GetImage);
    }

    private void Update()
    {       
        if(messageProcessed)
        {
            meshRenderer.material.mainTexture = texture2D;
            frameCount++;
            messageProcessed = false;
        }

        UpdateFPS();
    }

    private void GetImage(CompressedImage Message)
    {
        receivedCount++;

        if(!messageProcessed)
        {
            StartCoroutine(ProcessImage(Message.data));
        }
    }

    private void UpdateFPS()
    {
        dt += Time.deltaTime;
        if(dt > 1.0f/updateRate)
        {
            FPS = Mathf.RoundToInt(frameCount / dt);
            RPS = Mathf.RoundToInt(receivedCount / dt);
            frameCount = 0;
            receivedCount = 0;
            dt -= 1.0f/updateRate;
        }
    }

    private IEnumerator ProcessImage(byte[] ReceivedImage)
    {
        texture2D.LoadImage(ReceivedImage);
        messageProcessed = true;

        yield return null;
    }
}

Testing gives the following results:

(results screenshot omitted)

Of course, this makes sense, as 4 * 7 = 28 processed images per second.

Thus, my idea is that if we could speed up the texture2D.LoadImage function in the IEnumerator ProcessImage, the problem would be solved.


Concerning your second remark: the images are expected to have a 1920x1208 resolution (no typo here), and because they are formatted as JPEG, I assumed the texture format would be TextureFormat.RGB24, although I also tested TextureFormat.RGBA32 (both unsuccessfully).

Running Debug.Log(Message.data.Length) gives seemingly random results. A snippet of the log sequence: 123185, 175224, 144423, 310366, 123782, 142630, 123790, 174902, 304706.
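For reference, an editorial sanity check on those numbers (assuming the resolution stated above):

```csharp
// Raw pixel sizes that LoadRawTextureData would expect at 1920 x 1208:
int rgb24Bytes  = 1920 * 1208 * 3; // 6,958,080 bytes for TextureFormat.RGB24
int rgba32Bytes = 1920 * 1208 * 4; // 9,277,440 bytes for TextureFormat.RGBA32
// The logged Message.data lengths (~123 kB to ~310 kB) are far smaller and
// vary frame to frame, which is consistent with JPEG-compressed payloads
// rather than raw pixel buffers of a fixed size.
```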

Jordi24 commented 3 years ago

@mpinol @tgroechel, unfortunately we are still not able to solve the issue. Any suggestions from your side? (please see my previous message for the current best-performing solution)

mpinol commented 3 years ago

hey @Jordi24 ,

The issue of loading textures is a bit out of my wheelhouse. However, one developer I asked suggested adding the false flag when creating a new Texture2D, which should hopefully resolve the error you were seeing before. I believe the flag tells Unity not to generate a mipmap chain for the texture.

var texture = new Texture2D(570, 129, TextureFormat.RGBA32, false); // false = no mipmap chain
texture.LoadRawTextureData(imageData.bytes); // expects 570 * 129 * 4 = 294,120 raw bytes
texture.Apply(); // upload the pixel data to the GPU

Aside from that I cannot really offer any other guidance on improving the performance of loading textures.

That being said, would you mind briefly describing your use case? What are the images being streamed into Unity, and how are they being used? There might be some other implementation approach that is more performant.

Jordi24 commented 3 years ago

Hi @mpinol,

Thanks for your reply. Unfortunately, the suggested false flag did not solve the problem. Do you have any suggestions for whom I could contact to discuss the texture loading?

The use case for us is the following: we have four cameras connected to ROS, each publishing at 30 Hz. Thus, in total, 120 images are sent over the ROS connection in four separate CompressedImage topics every second. In the end, we would like to create a real-time, live view of these cameras in Unity. As said in my previous messages, receiving the 120 images is not the issue, but showing them in Unity is. To my knowledge, one always needs to use textures when showing images in Unity. If there is any other method to show the received CompressedImages from ROS on screen in Unity, please let me know!

mpinol commented 3 years ago

Hey @Jordi24

It sounds like asking in the general graphics forum would be a good place to start. I asked internally and am still waiting to hear back but will update you if I get any other suggestions.

A-Ivan commented 3 years ago

Hi @Jordi24

I haven't tested it out yet, but there is a GitHub repo (Ros_reality) from a group that used Unity and ROS for VR. They used the ROS# repo, which will be different from what Unity uses today, but I think it will be interesting for you to see how multiple images are handled and displayed in Unity via the C# scripts they used.

In the Ros_reality project, they send RGB images and depth images from the robot (ROS) to Unity, so it may be interesting for you to take a look at and see if something in there helps you.

Here is the main repo page: https://github.com/h2r/ros_reality
Here is the directory with the camera/image scripts: https://github.com/h2r/ros_reality/tree/master/Assets/Scripts
Here is the fork this group maintains of ROS# with some Unity code for sensor visualization that may help: https://github.com/dwhit/ros-sharp
As a bonus, here is how camera calibration was done (it may also help): https://github.com/ShibataLab/kinect_baxter_calibration

This is something I am also interested in, so if it works for you, and if you can, let me know if this helps solve your problem.

Jordi24 commented 3 years ago

Hi @A-Ivan,

Sorry for my late reply. The team is very busy at the moment and I am not sure in what timeframe we will have a moment to look at your suggestions. I will let you know when we have some results.

hyounesy commented 3 years ago

Hi @Jordi24, I am closing this issue as there is no further action to take at the moment. Feel free to reopen or open a new issue if you have updates or questions.

Threepain commented 1 year ago

Sorry to reopen this question. In my project, I subscribed to a topic of type /sensor_msgs/Image in ROS and displayed it on a Texture in Unity, but the result is displayed as a red question mark. I have tried loading the image data with different APIs, but to no effect. I want to know what I should do. I used texture.LoadImage(imageData); if I change it to texture.LoadRawTextureData(imageData); texture.Apply();, no question mark is displayed, but meaningless, chaotic colors appear.
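An editorial sketch of the usual mapping for raw sensor_msgs/Image data (field names msg.width, msg.height, and msg.data follow the standard ROS message layout; the encoding is assumed to be rgb8):

```csharp
using UnityEngine;

// For a raw sensor_msgs/Image (unlike CompressedImage, the payload is
// uncompressed pixels), LoadRawTextureData is appropriate, but the
// TextureFormat must match the message 'encoding' (e.g. rgb8 -> RGB24,
// bgra8 -> BGRA32) and the texture dimensions must match
// msg.width x msg.height exactly; a mismatch produces "not enough data"
// errors or scrambled colors. Note also that ROS images are stored
// top-down while Unity textures are bottom-up, so the result may appear
// vertically flipped.
public static class RawImageToTexture
{
    public static Texture2D Convert(uint width, uint height, byte[] data)
    {
        var tex = new Texture2D((int)width, (int)height, TextureFormat.RGB24, false);
        tex.LoadRawTextureData(data); // data.Length must equal width * height * 3
        tex.Apply();                  // upload the pixel data to the GPU
        return tex;
    }
}
```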