[Open] AshleyRoth opened this issue 9 months ago
Hi Ashley
Certainly.
Connect one camera and run ros2 run pylon_instant_camera node. A line like this should appear in the log:
Using device [daA1600-60uc] with full name [2676:ba03:2:2:4], user defined name [] and serial number [123456789].
Repeat with the other camera.
You can use either the full name, the user-defined name or the serial number to select a particular camera. I just added the serial number option because I found it to be more reliable than the full name (which, for USB cameras, includes topology information that can change).
You may also want to remap the namespace so the output topic will not clash:
ros2 run pylon_instant_camera node --ros-args --param serial_number:="12345678" -r __ns:=/front_camera
ros2 run pylon_instant_camera node --ros-args --param serial_number:="23456789" -r __ns:=/back_camera
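If you prefer to start both instances with a single command, a Python launch file can do the same thing. This is only a sketch: it assumes the package and executable names used above (pylon_instant_camera, node), and the serial numbers are placeholders you would replace with your own.

# two_cameras.launch.py (sketch)
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='pylon_instant_camera',
            executable='node',
            namespace='front_camera',                    # same effect as -r __ns:=/front_camera
            parameters=[{'serial_number': '12345678'}],  # placeholder serial number
        ),
        Node(
            package='pylon_instant_camera',
            executable='node',
            namespace='back_camera',                     # same effect as -r __ns:=/back_camera
            parameters=[{'serial_number': '23456789'}],  # placeholder serial number
        ),
    ])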
Good luck with your project!
Hello!
I will check it today and report back.
I'm afraid that if I run two instances of the program, performance will decrease.
Thanks.
Sorry for the late reply. I checked, and it works.
I want to ask: how can I reduce the load on the CPU? When two cameras are running, the CPU load is high and "buffer error" messages appear.
@fhwedel-hoe By the way, I want to ask something else: the camera's FPS is lower than in PylonViewer. Why is that?
> I checked, and it works.
Nice to hear that.
> How can I reduce the load on the CPU? When two cameras are running, the CPU load is high and "buffer error" messages appear.
I reduced the CPU overhead as much as I could. As part of its communication design, ROS2 always creates a copy of the data that is published; that is what causes a large share of the CPU load on my set-up. Keep in mind that I cannot know for sure what causes the CPU load in your particular set-up.
There has been extensive discussion at https://github.com/ros-perception/image_common/issues/216. Bottom line: if you want speed, do not use ROS2 for image transport. Have one ROS node connect to one camera, do the image analysis immediately within that node, and publish only the relevant data (small amounts, e.g. the positions of detected markers). Of course, you will then not be able to view the image via ROS2 tools. :(
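To illustrate that architecture, here is a rough sketch of such a node. It assumes pypylon and rclpy are available; find_marker is a made-up placeholder for the actual analysis, and none of this is code from this repository.

# Sketch: grab directly from the camera, analyse in-process,
# publish only small results instead of full images.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PointStamped
from pypylon import pylon

def find_marker(image):
    # placeholder for the actual image analysis (image is a numpy array)
    return 0.0, 0.0

def main():
    rclpy.init()
    node = Node('camera_analysis')
    pub = node.create_publisher(PointStamped, 'marker_position', 10)

    camera = pylon.InstantCamera(pylon.TlFactory.GetInstance().CreateFirstDevice())
    camera.StartGrabbing(pylon.GrabStrategy_LatestImageOnly)
    try:
        while rclpy.ok() and camera.IsGrabbing():
            result = camera.RetrieveResult(5000, pylon.TimeoutHandling_ThrowException)
            if result.GrabSucceeded():
                x, y = find_marker(result.Array)   # full frame stays inside this process
                msg = PointStamped()
                msg.header.stamp = node.get_clock().now().to_msg()
                msg.point.x, msg.point.y = x, y
                pub.publish(msg)                   # only a few bytes cross ROS2
            result.Release()
    finally:
        camera.StopGrabbing()
        rclpy.shutdown()

if __name__ == '__main__':
    main()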
> The camera's FPS is lower than in PylonViewer. Why is that?
Unfortunately, I have no idea. By default, this node just connects to the camera and starts grabbing; the settings used in PylonViewer are kept. However, I have been told that some Basler cameras revert to their default settings when power-cycled. You can save the PylonViewer settings into a file and supply that file to this node via the parameter camera_settings_pfs (it expects a path to the .pfs file).
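For example (the path is just a placeholder; the parameter name camera_settings_pfs is the one mentioned above):
ros2 run pylon_instant_camera node --ros-args --param camera_settings_pfs:=/path/to/my_camera_settings.pfs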
@fhwedel-hoe Hi, thanks for the answer.
One last question: why did you decide to write your own frame-capture code instead of using the official driver?
The official node uses the camera in software trigger mode, i.e. the node has the authority: it requests a frame, waits for the camera to expose and transfer it, and only then requests the next one.
This driver can use the "overlap" mode available in dart cameras (and possibly other models, too); you need to configure your camera to use that in PylonViewer. Here the camera has the authority: it keeps capturing on its own and the node simply retrieves whatever the camera delivers.
Since the camera can transfer one frame while capturing the next one, the effective frame rate is doubled.
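To make the difference concrete, here is a rough pypylon sketch of the two modes of operation. This is not code from either driver; it assumes pypylon is installed and a single camera is connected, and overlap itself still has to be enabled in PylonViewer beforehand.

from pypylon import pylon

camera = pylon.InstantCamera(pylon.TlFactory.GetInstance().CreateFirstDevice())

# Camera has the authority (free run, what this node does): the camera exposes
# frames on its own; with overlap enabled, the next exposure can start while the
# previous frame is still being transferred.
camera.StartGrabbing(pylon.GrabStrategy_OneByOne)
for _ in range(10):
    result = camera.RetrieveResult(5000, pylon.TimeoutHandling_ThrowException)
    result.Release()
camera.StopGrabbing()
camera.Close()

# Node has the authority (software trigger, like the official driver): every frame
# must be requested explicitly, so exposure and transfer happen strictly in sequence.
camera.RegisterConfiguration(pylon.SoftwareTriggerConfiguration(),
                             pylon.RegistrationMode_ReplaceAll, pylon.Cleanup_Delete)
camera.StartGrabbing(pylon.GrabStrategy_OneByOne)
for _ in range(10):
    if camera.WaitForFrameTriggerReady(1000, pylon.TimeoutHandling_ThrowException):
        camera.ExecuteSoftwareTrigger()
    result = camera.RetrieveResult(5000, pylon.TimeoutHandling_ThrowException)
    result.Release()
camera.StopGrabbing()
camera.Close()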
Hello!
Please tell me: how can I connect two cameras and publish their images to different topics?