metadriverse / scenarionet

ScenarioNet: Scalable Traffic Scenario Management System for Autonomous Driving
Apache License 2.0

Extracting leading vehicle in nuScenes after conversion #65

Open · samueleruffino99 opened this issue 9 months ago

samueleruffino99 commented 9 months ago

Hello, thank you very much for your work. Do you think it would be possible to extract the leading vehicle, given the ego position, from the nuScenes dataset after conversion? What I am thinking about is:

I am not quite sure whether to use just the lanes or other objects as well, actually. Anyway, I was wondering whether you have some implementation that extracts the lane from the vehicle position, or something like that, ideally in curvilinear coordinates (in MetaDrive as well).

QuanyiLi commented 9 months ago

Yeah, we do have such tools. If the map is loaded with ScenarioMap, you can get all lanes via map.road_network and the polygon of each lane via lane.polygon. If a lane has no polygon, we generate one from its centerline, so this API will definitely return a polygon. With these polygons, you can easily find which polygon the point (vehicle position) falls in, and hence get the lane id and lane centerline. You can prompt GPT to write the point-in-polygon code :).

Actually, there is a more advanced function, ray_localization(), that lets you do this directly. Check: https://github.com/metadriverse/metadrive/blob/c326ae5f6b409ed90d2bcda8b5bc12689c8c03b5/metadrive/component/navigation_module/edge_network_navigation.py#L159 It returns the set of lanes that the point is on. This one might be faster, because the point-in-polygon calculation is done with the physics engine API.
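
For reference, a minimal sketch of the polygon-based lookup described above, using shapely for the point-in-polygon test. The exact structure of `map.road_network.graph` varies across MetaDrive versions, so treat the iteration below as an assumption to adapt:

```python
from shapely.geometry import Point, Polygon

def lanes_containing_point(scenario_map, position):
    """Return (lane_id, lane) pairs whose polygon contains a 2D position."""
    pt = Point(position[0], position[1])
    hits = []
    # Assumption: road_network.graph maps lane ids to entries that expose
    # the lane object directly or via a `.lane` attribute.
    for lane_id, entry in scenario_map.road_network.graph.items():
        lane = getattr(entry, "lane", entry)
        if Polygon(lane.polygon).contains(pt):
            hits.append((lane_id, lane))
    return hits
```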

For curvilinear coordinates, check PointLane: https://github.com/metadriverse/metadrive/blob/c326ae5f6b409ed90d2bcda8b5bc12689c8c03b5/metadrive/component/lane/point_lane.py#L17 The polyline of a lane is indeed its centerline, so just create a PointLane object from the lane centerline. Then you can get the longitudinal and lateral position of an object with lane.local_coordinates(position).
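
A minimal sketch of that step, assuming `lane` was found as above, that its polyline is an (N, 2) array, and a nominal lane width (the exact PointLane constructor signature may differ across MetaDrive versions):

```python
import numpy as np
from metadrive.component.lane.point_lane import PointLane

# Assumption: the lane exposes its centerline as `lane.polyline`
centerline = np.asarray(lane.polyline)
point_lane = PointLane(centerline, width=3.5)  # 3.5 m is an assumed width

# Longitudinal and lateral position of a vehicle on this lane
longitudinal, lateral = point_lane.local_coordinates(vehicle_position)
```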

samueleruffino99 commented 9 months ago

Do you also have some tool to get the leading vehicle? Thank you very much for your help!

QuanyiLi commented 9 months ago

No... Only this function, which does something similar: https://github.com/metadriverse/metadrive/blob/c326ae5f6b409ed90d2bcda8b5bc12689c8c03b5/metadrive/policy/idm_policy.py#L82
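
Not the linked IDM-policy implementation, but a hedged sketch of the same idea: project all candidates into the ego lane's curvilinear frame and keep the nearest one ahead. All names here are illustrative, and states are assumed to expose a 2D `.position`:

```python
import numpy as np

def find_leading_vehicle(ego_lane, ego_state, other_states, max_lateral=2.0):
    """Hypothetical helper: nearest object ahead of the ego on its lane."""
    ego_long, _ = ego_lane.local_coordinates(ego_state.position[:2])
    leader, best_gap = None, np.inf
    for state in other_states:
        longi, lat = ego_lane.local_coordinates(state.position[:2])
        gap = longi - ego_long
        # Keep only objects ahead of the ego and close to the lane centerline
        if 0 < gap < best_gap and abs(lat) < max_lateral:
            leader, best_gap = state, gap
    return leader
```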

samueleruffino99 commented 9 months ago

*(screenshot: 2024-02-22 151050)*

I am plotting here all the objects that intersect with the current lane the ego vehicle is on. However, it seems that the ego and the other vehicles are misaligned in some way (maybe I have a bug in the code). It is strange, since I am using the same function to plot the occupancy box given the state of an object:

```python
import numpy as np

def get_object_occupancy(self, state):
    """Extract the occupancy of the object, considering rotation but not velocity.

    Args:
        state (State): The state of the object.

    Returns:
        np.ndarray: The four box vertices, rotated according to the heading.
    """
    # Extract the position, heading, length, and width from the state object
    obj_position = state.position[:2]
    obj_heading = -state.heading
    obj_length = state.length
    obj_width = state.width

    # Compute the rotation matrix
    cos_angle = np.cos(obj_heading)
    sin_angle = np.sin(obj_heading)
    rotation_matrix = np.array([[cos_angle, -sin_angle],
                                [sin_angle, cos_angle]])

    # Define the vertices of the box relative to its center
    half_length = obj_length / 2
    half_width = obj_width / 2
    vertices_relative = np.array([[-half_length, -half_width],
                                  [half_length, -half_width],
                                  [half_length, half_width],
                                  [-half_length, half_width]])

    # Rotate the vertices and translate them to the object's position
    rotated_vertices = vertices_relative @ rotation_matrix.T + obj_position

    return rotated_vertices
```

Apparently it works for the ego but not for the other objects. Why could this happen? I have also checked the object sizes, and the length and width of the ego differ from those of the other objects (the width is larger for the ego, while the length is larger for the others).

QuanyiLi commented 9 months ago

Where is the scenario from? Waymo or nuScenes?

QuanyiLi commented 9 months ago

If you are testing with nuScenes data, it is a bug from nuScenes... We extract the length and width of all nuScenes objects with the same nuScenes API, but the ego car's width and length turn out to be swapped. So if it is nuScenes data, the ego's length is actually the width and the width is actually the length.
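
For anyone hitting this, a sketch of the workaround implied above. The `is_ego` flag is an assumption; how you identify the ego track depends on your own code:

```python
# Work around the nuScenes ego bug described above: the converted ego
# track has length and width swapped, so swap them back for the ego only.
if is_ego:
    obj_length, obj_width = state.width, state.length
else:
    obj_length, obj_width = state.length, state.width
```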

samueleruffino99 commented 9 months ago

Yes, I am using nuScenes. I switched width and length for the ego vehicle and it works. But apparently I have problems with the velocities as well (see the attached pictures). All the velocity directions are correct except the ego's, which should be rotated by 180 degrees. In the first image, the ego vehicle is moving in the up-left direction over time. This nuScenes scene was converted with your API a few months ago.

*(screenshot: 2024-02-23 144345)*

In this other picture, all the velocities are correct (converted a few days ago). Did you perhaps fix a bug about the velocities?

*(screenshot: 2024-02-23 144639)*

QuanyiLi commented 9 months ago

Yes, I guess so. The previous nuScenes converter may have been buggy; if the recent one is good, that's fine. Also, you can find some scenarios in MetaDrive/assets, which can serve as test cases as well.
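
For example, a minimal way to load the bundled scenarios might look like the sketch below. The paths and config keys are assumptions based on MetaDrive's example scripts, not a confirmed recipe:

```python
from metadrive.engine.asset_loader import AssetLoader
from metadrive.envs.scenario_env import ScenarioEnv

# Assumption: the packaged nuScenes scenarios live under MetaDrive's assets
env = ScenarioEnv(dict(
    data_directory=AssetLoader.file_path("nuscenes", unix_style=False),
    num_scenarios=3,
))
env.reset()
```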

samueleruffino99 commented 9 months ago

Actually, I am having problems with the current version. Anyway, I will check with your test cases and update you.