carla-simulator / carla

Open-source simulator for autonomous driving research.
http://carla.org
MIT License

Generate Velodyne-like lidar data with the lidar sensor #7060

Open hrnbot opened 5 months ago

hrnbot commented 5 months ago

Thanks for contributing to CARLA!

If you are reporting an issue, please use the following outline:

CARLA version:
Platform/OS: Windows
Problem you have experienced: I want to generate Velodyne-like results.
What you expected to happen: I want a dense point cloud like the first attached image, but it looks like the second attached image instead.

Steps to reproduce: I am using the settings below.

# Imports used by this snippet (module level in the full script).
import carla
import cv2
import numpy as np

self.carla_world = self.world
self.carla_bp_lib = self.carla_world.get_blueprint_library()
self.lidar_obj = self.carla_bp_lib.find("sensor.lidar.ray_cast")

# Intended sensor parameters (note: the set_attribute calls below use
# hard-coded values instead of these variables).
n_channels = 64
h_fov_deg = 360
rotation_rate_ms = 20
self.range_m = 30
role_name = 'ego_lidar'

self.lidar_obj.set_attribute('channels', str(32))
self.lidar_obj.set_attribute('horizontal_fov', str(360))
self.lidar_obj.set_attribute('rotation_frequency', str(500))
self.lidar_obj.set_attribute('range', str(60))
# self.lidar_obj.set_attribute('dropoff_general_rate', '0.1')
# self.lidar_obj.set_attribute('dropoff_intensity_limit', '0.1')
# self.lidar_obj.set_attribute('dropoff_zero_intensity', '0.75')
self.lidar_obj.set_attribute('points_per_second', str(1000000))
self.lidar_obj.set_attribute('upper_fov', str(30))
self.lidar_obj.set_attribute('lower_fov', str(-30))

# Spawn the lidar 2.5 m above the ego vehicle with a rigid attachment.
self.carla_bp_lidar = self.world.spawn_actor(
    blueprint=self.lidar_obj,
    transform=carla.Transform(carla.Location(0, 0, 2.50)),
    # transform=carla.Transform(carla.Location(x=20, z=0.0)),
    attach_to=self.player,
    attachment_type=carla.AttachmentType.Rigid)

self.carla_bp_lidar.listen(lambda data: self.show_lidar_data(data))
def show_lidar_data(self, lidar_data):
  # print("lidar_data--", lidar_data.shape)
  # print("Add lidar", self.lidar_queue.qsize())

  image_size = (608, 608)
  image = np.zeros((image_size[0], image_size[1], 3), dtype=np.uint8)

  # Define a scale factor to fit the LiDAR data within the image dimensions
  # This will depend on the range of your LiDAR
  scale = min(image_size) / (2.0 * self.range_m)  # Adjust self.range_m to your LiDAR's range

  # Take the points from the LiDAR data
  # Each point is (x, y, z, intensity)
  points = np.frombuffer(lidar_data.raw_data, dtype=np.dtype('f4'))
  points = np.copy(np.reshape(points, (int(points.shape[0] / 4), 4)))  # Make a writable copy

  # Normalize the intensity values to 255
  intensities = (points[:, 3] - np.min(points[:, 3])) / (np.max(points[:, 3]) - np.min(points[:, 3]))
  intensities = (intensities * 255).astype(np.uint8)

  # Transform 3D coordinates to 2D image coordinates
  points[:, :2] *= scale
  points[:, :2] += (0.5 * image_size[0], 0.5 * image_size[1])
  points[:, :2] = np.fabs(points[:, :2])  # Absolute value (mirror the y-axis for visualization)
  points = points.astype(np.int32)

  # Draw the points on the image
  for i in range(points.shape[0]):
    x, y = points[i, :2]
    if 0 <= x < image_size[0] and 0 <= y < image_size[1]:
      image[y, x, :] = intensities[i]

  # Save the image
  # cv2.imwrite('lidar_image.png', image)
  cv2.imshow("Lidar", image)
  cv2.waitKey(1)

What I want to do is run a model pretrained on the Velodyne dataset inside the simulator, or collect a dataset from the simulator to annotate for training such a model.

PatrickPromitzer commented 5 months ago

The points_per_second budget is split across all the point clouds you get within one second of simulation time.

You have set:

rotation_frequency = 500 (default is 10)
points_per_second = 1000000

Even with a high rotation_frequency you still have rotations_per_second = rotation_frequency, so the same point budget is spread over more rotations.

If you set fixed_delta_seconds in the CARLA settings, it is calculated like this:

Example:
points per point cloud = points_per_second * fixed_delta_seconds
100000 = 1000000 * 0.1
50000 = 1000000 * 0.05

Within one point cloud, the points also have to be split between the channels.

Example:
points per row = (points_per_second * fixed_delta_seconds) / channels
3125 = (1000000 * 0.1) / 32
1562.5 = (1000000 * 0.05) / 32

To increase the density of the points, you have to increase the number of points per rotation, for example:

rotation_frequency = 10
points_per_second = 100,000,000

points per point cloud = points_per_second * fixed_delta_seconds
10,000,000 = 100,000,000 * 0.1

points per row = (points_per_second * fixed_delta_seconds) / channels
312,500 = (100,000,000 * 0.1) / 32
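As a quick sanity check, the same arithmetic can be written out in a few lines of plain Python. These are just local variables mirroring the sensor attributes from the examples above, not CARLA API calls:

# Point-budget arithmetic for the default rotation_frequency and a 0.1 s step.
points_per_second = 1_000_000
rotation_frequency = 10        # rotations per simulated second
fixed_delta_seconds = 0.1      # simulation step from the world settings
channels = 32

points_per_cloud = points_per_second * fixed_delta_seconds     # 100,000
points_per_rotation = points_per_second / rotation_frequency   # 100,000
points_per_row = points_per_cloud / channels                   # 3,125

print(points_per_cloud, points_per_rotation, points_per_row)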

hrnbot commented 5 months ago

How do I set fixed_delta_seconds? The other problem I am getting is that I only receive about 10% of the lidar sweep per frame when the rotation frequency is set low.

Is there any way to resolve that?

PatrickPromitzer commented 5 months ago

This is the code to set the fixed_delta_seconds.

import carla

fixed_delta_seconds = 0.1

# server_address, server_port and the map path come from your own configuration.
client = carla.Client(server_address, server_port)
world = client.load_world(self.map_parameters.carla_map_path)
settings = world.get_settings()
settings.fixed_delta_seconds = fixed_delta_seconds  # do not decrease below 0.1 !!
world.apply_settings(settings)

With that, you always have 0.1 seconds (simulation time) between each batch of lidar (sensor) data you get from CARLA.

Edit: I forgot to remove one "self."
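Putting the two suggestions together, here is a minimal, self-contained sketch (not taken from the thread) that runs in synchronous mode and sets rotation_frequency = 1 / fixed_delta_seconds so every callback carries a full 360° sweep. The host/port, spawn transform, and attribute values are assumptions for illustration:

import carla

fixed_delta_seconds = 0.1

# Assumed connection defaults: a local CARLA server on port 2000.
client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

# Synchronous mode with a fixed step, so sensor data arrives once per tick.
settings = world.get_settings()
settings.synchronous_mode = True
settings.fixed_delta_seconds = fixed_delta_seconds
world.apply_settings(settings)

lidar_bp = world.get_blueprint_library().find("sensor.lidar.ray_cast")
lidar_bp.set_attribute("channels", "32")
lidar_bp.set_attribute("range", "60")
lidar_bp.set_attribute("points_per_second", "1000000")
# One full rotation per simulation step -> a complete sweep in every callback.
lidar_bp.set_attribute("rotation_frequency", str(1.0 / fixed_delta_seconds))

# Assumed roof-height transform; attach_to a vehicle as in the snippet above if needed.
lidar = world.spawn_actor(lidar_bp, carla.Transform(carla.Location(z=2.5)))
lidar.listen(lambda data: print("frame", data.frame, "points", len(data.raw_data) // 16))

try:
    for _ in range(100):
        world.tick()  # advance the simulation by fixed_delta_seconds
finally:
    lidar.destroy()
    settings.synchronous_mode = False
    settings.fixed_delta_seconds = None
    world.apply_settings(settings)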

Blyron commented 2 months ago

Hello! Are you happy to close this issue?

hrnbot commented 2 months ago

I have not got a satisfactory answer yet...

Blyron commented 2 months ago

Okay, we will look into this