OpenDriveLab / TCP

[NeurIPS 2022] Trajectory-guided Control Prediction for End-to-end Autonomous Driving: A Simple yet Strong Baseline.
Apache License 2.0

Visualization waypoint #50

Closed azamshoaib closed 8 months ago

azamshoaib commented 10 months ago

@penghao-wu Hi, since tcp_agent is dumping the predicted waypoints, could you please share the visualization code for projecting them onto the BEV image?

penghao-wu commented 10 months ago

Hi, visualizing the waypoints in BEV should be easy, as you do not need to do any perspective projection. I use the following function to visualize the waypoints. If you want to visualize the waypoints in the front-view camera image, you should first do the perspective transformation.

from matplotlib.axes import Axes

def draw_trajectory_on_ax(ax: Axes, trajectories, labels, colors, line_type='o-', transparent=True, xlim=(-70, 70), ylim=(0, 120)):
    '''
    ax: matplotlib.axes.Axes, the axis to draw trajectories on
    trajectories: list of numpy arrays of shape (num_points, 2 or 3)
    labels: list of legend labels, one per trajectory
    colors: list of matplotlib colors, one per trajectory
    '''
    for trajectory, label, color in zip(trajectories, labels, colors):
        ax.plot(-1 * trajectory[:, 1], trajectory[:, 0], line_type, label=label, color=color)
    if xlim is not None:
        ax.set_xlim(*xlim)
    if ylim is not None:
        ax.set_ylim(*ylim)
    ax.legend()

    return ax
azamshoaib commented 10 months ago

@penghao-wu I have run your function, but the waypoints are being projected out of the image bounds.

penghao-wu commented 10 months ago

If you want to show them on the front-view image, you need to transform the waypoints using the camera's intrinsic and extrinsic matrices first.
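
A minimal sketch of that projection might look like the following; K, T_cam_ego, and the assumption that the waypoints lie on the ground plane (z = 0) are placeholders you would replace with your own camera configuration and axis conventions.

import numpy as np

def project_to_front_view(waypoints_xy, K, T_cam_ego, z_ground=0.0):
    """Sketch: project ego-frame waypoints (N, 2) onto the front-view image.

    waypoints_xy: (N, 2) ego-frame positions, assumed to lie on the ground plane.
    K:            (3, 3) camera intrinsic matrix (hypothetical placeholder).
    T_cam_ego:    (4, 4) ego-to-camera extrinsic matrix (hypothetical placeholder).
    Returns (N, 2) pixel coordinates (u, v).
    """
    n = waypoints_xy.shape[0]
    # Lift the 2D waypoints to homogeneous 3D points on the ground plane.
    pts_ego = np.concatenate(
        [waypoints_xy, np.full((n, 1), z_ground), np.ones((n, 1))], axis=1)  # (N, 4)
    pts_cam = (T_cam_ego @ pts_ego.T)[:3]   # (3, N) points in the camera frame
    uvw = K @ pts_cam                       # (3, N) homogeneous pixel coordinates
    return (uvw[:2] / uvw[2]).T             # (N, 2) pixels after the perspective divide

The resulting (u, v) points could then be drawn on the front-view image with ax.plot, just like in the BEV case.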

azamshoaib commented 10 months ago

I have already done the perspective projection of the waypoints, but I need to project them onto the BEV image. The function you gave for projecting the waypoints onto the BEV image is placing them out of bounds.

penghao-wu commented 10 months ago

For the predicted waypoints, x points to the left and y points upwards, so you might need to adjust them according to the definition of your BEV coordinate frame.

azamshoaib commented 10 months ago

@penghao-wu

import matplotlib.pyplot as plt
from matplotlib.axes import Axes
import json
import numpy as np
def draw_trajectory_on_ax(ax: Axes, trajectories, labels, colors, line_type='o-', transparent=True, xlim=(-70, 70), ylim=(0, 120)):
    for idx, (trajectory, label, color) in enumerate(zip(trajectories, labels, colors)):
        ax.plot(-1*trajectory[:, 1], trajectory[:, 0], line_type, label=label, color=color)
    if xlim is not None:
        ax.set_xlim(*xlim)
    if ylim is not None:
        ax.set_ylim(*ylim)
    ax.legend()
img = plt.imread('bev/0763.png')
with open('meta/0763.json', 'r') as f:
    data = json.load(f)
fig, ax = plt.subplots()
ax.imshow(img, extent=[-70, 70, 0, 120]) # Set extent based on your BEV image dimensions
wp_json = np.array([data['wp_1'], data['wp_2'], data['wp_3'], data['wp_4']])
wp_json[:, 1] = -wp_json[:, 1] # Flip y
wp_json = wp_json[:, [1, 0]] # Swap x and y
trajectories = [wp_json]
draw_trajectory_on_ax(ax, trajectories, ['Waypoints'], ['r'])
plt.show()

I have written this code using your visualization function. Can you please tell me where I am making a mistake? Thank you.

penghao-wu commented 10 months ago

Could you please share the BEV image and the JSON file?

azamshoaib commented 10 months ago

@penghao-wu 0763 JSON file:

{
    "speed": 5.222938537597656,
    "steer": 0.054270822865752015,
    "throttle": 0.6499840478102348,
    "brake": 0.0,
    "wp_4": [
        2.357274055480957,
        10.448390007019043
    ],
    "wp_3": [
        1.3261723518371582,
        8.031651496887207
    ],
    "wp_2": [
        0.6107930541038513,
        5.470467567443848
    ],
    "wp_1": [
        0.21580669283866882,
        2.746610641479492
    ],
    "aim": [
        0.21580669283866882,
        2.746610641479492
    ],
    "target": [
        37.07563781738281,
        31.653648376464844
    ],
    "desired_speed": 5.3593800862630205,
    "angle": 0.04991787848456755,
    "angle_last": 0.2567279621075312,
    "angle_target": 0.5501185481779519,
    "angle_final": 0.04991787848456755,
    "delta": 0.1364415486653643,
    "agent": "traj",
    "steer_ctrl": 0.08452033996582031,
    "steer_traj": 0.054270822865752015,
    "throttle_ctrl": 0.9460892677307129,
    "throttle_traj": 0.6499840478102348,
    "brake_ctrl": 0.0,
    "brake_traj": 0.0,
    "status": 0
}
penghao-wu commented 10 months ago

I am sorry for the previous mistake. The y axis should point upwards, so you should change the Flip y line to wp_json[:, 0] = -wp_json[:, 0] # Flip x. Besides, since you want to project the trajectory onto the image, you need to translate the trajectory so that the origin is at the center of the image, and scale the waypoints according to the camera parameters. You can also view this as a perspective transformation and conduct the projection accordingly.
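
Putting those changes together, a minimal sketch of the pixel-space overlay could look like the following; PIXELS_PER_METER and the assumption that the ego vehicle sits at the image centre are placeholders you would adjust to your BEV camera settings.

import json
import numpy as np
import matplotlib.pyplot as plt

PIXELS_PER_METER = 4.0  # assumed BEV resolution, not taken from the repo

img = plt.imread('bev/0763.png')
h, w = img.shape[:2]
with open('meta/0763.json', 'r') as f:
    data = json.load(f)

wp = np.array([data['wp_1'], data['wp_2'], data['wp_3'], data['wp_4']])  # each row is [x, y]
# Following the convention above (x to the left, y forward): flip x for image
# columns (columns grow to the right) and subtract y for image rows (rows grow downwards).
px = w / 2 - wp[:, 0] * PIXELS_PER_METER
py = h / 2 - wp[:, 1] * PIXELS_PER_METER

fig, ax = plt.subplots()
ax.imshow(img)
ax.plot(px, py, 'o-', color='r', label='Waypoints')
ax.legend()
plt.show()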