carla-simulator / carla

Open-source simulator for autonomous driving research.
http://carla.org
MIT License

Radar point cloud not appropriate #6345

Open UmangBDhanani opened 1 year ago

UmangBDhanani commented 1 year ago

I am trying to convert the raw radar output from CARLA (depth, azimuth, elevation, velocity) into the Cartesian coordinate system so I can visualize the points in X, Y, Z. I have applied the 3D polar-to-Cartesian conversion rules to do so. But the point cloud I get in ROS from these X, Y, Z coordinates does not seem proper and it is spread all over the place. My horizontal FOV of the radar is 170°. Please help if anybody already has an idea about it.

[Screenshot from 2023-03-26 22-24-38]

6NeverGiveup commented 1 year ago

I suggest taking a look at the manual_control.py file; the radar part of it has a visualization, and the point cloud it displays really does look like this, i.e. fairly scattered, so I think it may actually be fine. What are you planning to do with it?

UmangBDhanani commented 1 year ago

I am planning to display the radar points in the X, Y, Z coordinate frame of the sensor. However, the radar output from CARLA does not directly provide X, Y, Z information; instead it outputs the azimuth, elevation, depth, and velocity of every detected point. I simply converted these 3D polar outputs into the Cartesian system using the standard mathematical equations, but the X, Y, Z values after conversion, when plotted, produce the figure shown above. The code used to convert from polar to Cartesian is also attached.

UmangBDhanani commented 1 year ago

The points displayed above are spread in all directions, while in practice the radar only has a forward-facing field of view in front of the vehicle. Hence the point cloud should form in front of the car, in the X direction of the vehicle or radar sensor.
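
For reference, a minimal sketch of the conversion described above, assuming CARLA's radar detections give azimuth and altitude in radians in a sensor frame with X forward, Y right, Z up (so all converted points from a forward-facing radar should have positive X):

    import math

    def radar_detection_to_xyz(depth, azimuth, altitude):
        # Standard spherical-to-Cartesian conversion in the sensor frame
        x = depth * math.cos(altitude) * math.cos(azimuth)
        y = depth * math.cos(altitude) * math.sin(azimuth)
        z = depth * math.sin(altitude)
        return x, y, z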

JosenJiang commented 1 year ago

You can try the method in /PythonAPI/Example/manual_control.py:

    # Inside the radar sensor callback, where radar_data is a carla.RadarMeasurement
    radar_points_list = []
    for detect in radar_data:
        azi = math.degrees(detect.azimuth)
        alt = math.degrees(detect.altitude)
        # manual_control.py subtracts 0.25 from the depth so the drawn dots
        # can be properly seen; that offset is omitted here
        fw_vec = carla.Vector3D(x=detect.depth)
        carla.Transform(
            carla.Location(),
            carla.Rotation(
                pitch=alt,
                yaw=azi,
                roll=0.0)).transform(fw_vec)
        radar_points_list.append([fw_vec.x, -fw_vec.y, fw_vec.z, detect.velocity])
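
For completeness, here is a minimal sketch of how the snippet above might be wired up, assuming a `world` and a `vehicle` actor already exist; the blueprint attributes and mounting transform are illustrative assumptions, not values from manual_control.py:

    import math
    import carla

    def radar_callback(radar_data):
        # Run the loop from the snippet above on radar_data here,
        # then hand radar_points_list to ROS, matplotlib, etc.
        pass

    radar_bp = world.get_blueprint_library().find('sensor.other.radar')
    radar_bp.set_attribute('horizontal_fov', '170')  # matches the FOV mentioned above
    radar = world.spawn_actor(
        radar_bp,
        carla.Transform(carla.Location(x=2.0, z=1.0)),  # illustrative mounting pose
        attach_to=vehicle)
    radar.listen(radar_callback)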

s0966066980 commented 1 year ago

> I am trying to convert the raw radar output from CARLA (depth, azimuth, elevation, velocity) into the Cartesian coordinate system so I can visualize the points in X, Y, Z. I have applied the 3D polar-to-Cartesian conversion rules to do so. But the point cloud I get in ROS from these X, Y, Z coordinates does not seem proper and it is spread all over the place. My horizontal FOV of the radar is 170°. Please help if anybody already has an idea about it. [Screenshot from 2023-03-26 22-24-38]

May I ask if you have succeeded? I am very interested in your code; I am currently doing the same thing. I am trying to map the radar detections onto a 2D image, similar to the result of /PythonAPI/Example/manual_control.py, but I still can't get it to work. My inputs are the image and the azimuth, elevation, depth, and velocity of each detection.
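
A minimal sketch of one common way to project sensor-frame points onto a camera image is below; it assumes the points have already been transformed into the camera's frame (X forward, Y right, Z up, CARLA's convention) and that the intrinsics are derived from the camera's image size and FOV. This is not taken from manual_control.py:

    import numpy as np

    def project_to_image(points, image_w, image_h, fov_deg):
        # points: (N, 3) array in the camera frame (x forward, y right, z up)
        focal = image_w / (2.0 * np.tan(np.radians(fov_deg) / 2.0))
        K = np.array([[focal, 0.0, image_w / 2.0],
                      [0.0, focal, image_h / 2.0],
                      [0.0, 0.0, 1.0]])
        # Re-order axes to the standard camera convention (x right, y down, z forward)
        cam = np.stack([points[:, 1], -points[:, 2], points[:, 0]], axis=1)
        cam = cam[cam[:, 2] > 0]          # keep only points in front of the camera
        uvw = (K @ cam.T).T
        return uvw[:, :2] / uvw[:, 2:3]   # pixel (u, v) coordinates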

S-Lyu commented 4 months ago

@UmangBDhanani, @s0966066980

You can try this code. I wrote it because the method in /PythonAPI/Example/manual_control.py runs slowly.

import numpy as np
from scipy.spatial.transform import Rotation

def get_radar_data(publisher, radar_data, location, radar_yaw=None, is_back=False):
    '''Calculate pointcloud from the radar raw data then publish.

    Args:
        publisher: topic publisher for ROS.
        radar_data: radar raw data.
        location: Y-axis translation of the radar sensor relative to the ego vehicle.
        radar_yaw: Z-axis rotation (yaw, in degrees) of the radar sensor relative to the ego vehicle.
        is_back: whether the radar sensor is mounted at the back and facing backwards.
    Returns:
        pointcloud: calculated pointcloud.
    '''
    raw_data = np.frombuffer(radar_data.raw_data, dtype=np.float32)
    raw_data = raw_data.reshape([len(radar_data), 4])

    azimuths = raw_data[:, 1]
    altitudes = raw_data[:, 2]
    depths = raw_data[:, 3]
    velocities = raw_data[:, 0]

    rotations = Rotation.from_rotvec(
        np.array([np.zeros_like(depths), altitudes, -azimuths]).T
    ).as_matrix()
    pointcloud = np.zeros_like(raw_data)
    pointcloud[:, 0] = depths
    pointcloud[:, 3] = velocities

    pointcloud[:, :3] = np.einsum('ijk, ij -> ik', rotations, pointcloud[:, :3])
    if radar_yaw is not None:
        yaw = Rotation.from_euler('z', radar_yaw, degrees=True).as_matrix()
        pointcloud[:, :3] = (yaw @ pointcloud[:, :3].T).T
    if is_back:
        pointcloud[:, 0] *= -1
    pointcloud[:, 1] += location

    publisher.publish(pointcloud)
    return pointcloud

This calculates the same result as the method in /PythonAPI/Example/manual_control.py while greatly reducing the operation time.

[Image: timeit_result]
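
As a usage illustration only: the function above publishes the raw NumPy array directly; in a ROS 1 setup the array would typically be wrapped in a sensor_msgs/PointCloud2 message first. A hedged sketch, assuming an initialized rospy node and an (N, 4) `pointcloud` array of [x, y, z, velocity] rows (the field layout and frame_id are assumptions, not part of the code above):

    import rospy
    from std_msgs.msg import Header
    from sensor_msgs.msg import PointField
    from sensor_msgs import point_cloud2

    fields = [
        PointField('x', 0, PointField.FLOAT32, 1),
        PointField('y', 4, PointField.FLOAT32, 1),
        PointField('z', 8, PointField.FLOAT32, 1),
        PointField('velocity', 12, PointField.FLOAT32, 1),
    ]
    header = Header(stamp=rospy.Time.now(), frame_id='ego_vehicle/radar_front')
    msg = point_cloud2.create_cloud(header, fields, pointcloud.tolist())
    publisher.publish(msg)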

This shows the point cloud from 4 radar sensors attached to the ego vehicle (front-left, front-center, front-right, back-center).

[Image: Radar]

The tested CARLA environment is shown below; the red bounding box is the ego vehicle.

[Image: environment]