micro-ROS / micro-ROS-Agent

ROS 2 package using Micro XRCE-DDS Agent.
Apache License 2.0

Repeated creation of publishers causes memory usage to grow #144

Closed laogui-wind closed 2 years ago

laogui-wind commented 2 years ago


Steps to reproduce the issue

Just run the following command in a loop to reproduce it: ros2 topic pub --once <topic forwarded by agent> <msg-type> <msg-data>
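For example, a minimal loop (the topic name and message type below are placeholders for whatever topic the agent is actually forwarding):

#!/bin/bash
# Each iteration spawns a short-lived publisher, publishes once, and exits.
# /some/forwarded/topic and std_msgs/msg/String are placeholders.
while true
do
    ros2 topic pub --once /some/forwarded/topic std_msgs/msg/String "{data: 'hello'}"
done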

Actual behavior

Observing continuously with htop shows the memory usage growing slowly.

pablogs9 commented 2 years ago

Does this happen if you do not use the micro-ROS Agent? For example, if you subscribe using ros2 topic echo ...?

laogui-wind commented 2 years ago

Sorry, I didn't describe the problem exactly. The actual situation is that the memory usage of the micro_ros_agent application keeps increasing.
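For reference, the same figure htop shows as RES can be read directly from /proc; a one-liner, assuming the agent process is named micro_ros_agent:

# Print the resident set size (VmRSS, in kB) of the agent process.
grep VmRSS /proc/$(pidof -s micro_ros_agent)/status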

pablogs9 commented 2 years ago

Which version of the micro-ROS agent are you using?

laogui-wind commented 2 years ago

The commit id is bc2287cc6e6a685f8796183aee5a9f3050a63d83.

pablogs9 commented 2 years ago

The latest commit on the micro-ROS Agent foxy branch is 4dd8ffe63bee1e9d1e0400d734706d0be9c775ad, as you can see here: https://github.com/micro-ROS/micro-ROS-Agent/commits/foxy

Please use the latest version and let us know if the problem persists.
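For reference, one way to update an agent that was built from source in a colcon workspace (the ~/uros_ws path is just an example, not taken from this issue):

# Pull the latest foxy branch and rebuild only the agent package.
cd ~/uros_ws/src/micro-ROS-Agent
git checkout foxy
git pull origin foxy
cd ~/uros_ws
colcon build --packages-select micro_ros_agent
source install/local_setup.bash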

laogui-wind commented 2 years ago

OK, thank you!

laogui-wind commented 2 years ago

After updating to the latest version, we found that micro_ros_agent still shows memory growth.

pablogs9 commented 2 years ago

Could you provide steps for replicating this?

The micro-ROS client code is important for understanding how the whole system behaves.

laogui-wind commented 2 years ago

Steps to reproduce the issue

We reproduce this based on the tutorial at https://micro.ros.org/docs/tutorials/core/first_application_linux/. In the first terminal, we create an agent:

ros2 run micro_ros_agent micro_ros_agent udp4 --port 8888

PS: commit id is 4dd8ffe63bee1e9d1e0400d734706d0be9c775ad

In the second terminal, create a subscriber on the topic /microROS/ping:

ros2 topic echo /microROS/ping std_msgs/msg/Header

In the third terminal, run the following script to repeatedly create publishers:

#!/bin/bash

source /opt/ros/foxy/setup.bash

# Each --once invocation creates a fresh publisher on /microROS/ping,
# publishes a single message, and then tears the publisher down.
while true
do
    ros2 topic pub --once /microROS/ping std_msgs/msg/Header "{'frame_id' : '1'}"
done

Actual behavior

Observing the agent with htop, the RES value keeps increasing (htop screenshots from 2022-06-07 attached).

pablogs9 commented 2 years ago

I have replicated your whole environment on micro-ROS Humble and the RES memory does not seem to increase. Could you run the test on the newest release?

laogui-wind commented 2 years ago

My test is based on the foxy branch; the commit id is 4dd8ffe63bee1e9d1e0400d734706d0be9c775ad. What is the id of the latest release you are talking about? In addition, the above test may take one to two hours to show obvious symptoms.
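Since the growth is slow, a script along these lines could log the trend instead of watching htop (the process name and sampling interval here are illustrative):

#!/bin/bash
# Append a timestamped RSS sample (in kB) every 60 seconds while the agent is alive.
PID=$(pidof -s micro_ros_agent)
while kill -0 "$PID" 2>/dev/null
do
    echo "$(date +%s) $(ps -o rss= -p "$PID")" >> agent_rss.log
    sleep 60
done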

pablogs9 commented 2 years ago

I'm talking about v3.0.2 / Humble. I'll take the time considerations into account in my next test.

laogui-wind commented 2 years ago

OK, thanks for your suggestion.

laogui-wind commented 2 years ago

It would be difficult for us to change the ROS version. So, is there any commit on the humble branch that could be merged into the foxy branch to solve this problem?
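If a fix lands on humble first, backporting it locally would look roughly like this (the commit hash is a placeholder, and whether it applies cleanly to foxy is not guaranteed):

cd ~/uros_ws/src/micro-ROS-Agent
git checkout foxy
git fetch origin humble
git cherry-pick <fix-commit-sha>
cd ~/uros_ws && colcon build --packages-select micro_ros_agent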

pablogs9 commented 2 years ago

I have tested the environment with the micro-ROS Agent and I have been able to replicate your scenario. I have also replicated the environment using a standalone Micro XRCE-DDS Agent, and there is no memory leak there. So we can infer that there is a memory leak somewhere in the micro-ROS Agent, probably in the graph manager, given the nature of the scenario (each ros2 topic pub creates a new participant).

I will investigate more tomorrow.
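For anyone wanting to repeat that comparison: the standalone agent, which does not include the micro-ROS graph manager, can be started with the Micro XRCE-DDS Agent binary (flag spelling may vary between releases):

# Standalone Micro XRCE-DDS Agent on the same UDP port, no ROS 2 graph manager involved.
MicroXRCEAgent udp4 -p 8888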

laogui-wind commented 2 years ago

Well, thanks for your quick reply and great assistance!!!

pablogs9 commented 2 years ago

Hello @laogui-wind, could you check this patch: https://github.com/micro-ROS/micro-ROS-Agent/pull/147 ?
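One way to try the patch before it is merged, assuming the agent sources live in a colcon workspace (the paths and local branch name here are illustrative):

cd ~/uros_ws/src/micro-ROS-Agent
# Fetch the pull request head into a local branch and rebuild the agent against it.
git fetch origin pull/147/head:pr-147
git checkout pr-147
cd ~/uros_ws
colcon build --packages-select micro_ros_agent
source install/local_setup.bash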

laogui-wind commented 2 years ago

Ok, we're testing the latest commit, thanks again for your help.

pablogs9 commented 2 years ago

Please report back so we can close this issue.

laogui-wind commented 2 years ago

After a period of testing, there is no memory leak anymore. I think the problem has been solved. Thanks for your help.