jingGM / DTG


dataset sample or structure #2

Open xiaofeifei-1 opened 1 week ago

xiaofeifei-1 commented 1 week ago

Hello! Your method is awesome, thanks for open-sourcing it! I'm trying to test your method, but there is no training data available. I'd like to use my own data instead. Could you explain the structure of your dataset or provide some samples, so that we can train on our own dataset? Also, how is the GPS target transformed to UTM? That part is confusing to me.

jingGM commented 1 week ago

We will publish our dataset soon. The dataset structure for this project is:

|- data_root
|- - files.pkl: each instance
|- - data.pkl: meta information, including the topological graph and all positions of the robot

You need to process the raw dataset into .pkl files. Each .pkl file includes: LiDAR points, the target in meters, past and ground-truth trajectories, 10 frames of velocities (according to the sensor frequency), the pose, and a local traversability map. We also have camera images for visualization.
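For reference, here is a minimal sketch of what one such instance might look like when written to and read from a .pkl file. All field names and shapes here are hypothetical placeholders; check the released dataset for the actual keys and formats.

```python
import io
import pickle

# Hypothetical training instance matching the fields listed above.
# Key names and array shapes are assumptions, not the project's real schema.
instance = {
    "lidar_points": [[1.2, 0.5, 0.1]],            # N x 3 point cloud, meters
    "target": [5.0, -2.0],                        # goal position in meters
    "past_trajectory": [[0.0, 0.0], [0.1, 0.0]],  # past positions
    "gt_trajectory": [[0.2, 0.0], [0.3, 0.1]],    # ground-truth future trajectory
    "velocities": [[0.5, 0.0]] * 10,              # 10 frames of (linear, angular) velocity
    "pose": [0.0, 0.0, 0.0],                      # x, y, yaw
    "traversability_map": [[0, 1], [1, 1]],       # local grid, 1 = traversable
}

# Serialize and deserialize the instance the same way a .pkl file would be handled.
buf = io.BytesIO()
pickle.dump(instance, buf)
buf.seek(0)
loaded = pickle.load(buf)
```

In practice you would replace the placeholder lists with arrays extracted from your own sensor logs, writing one .pkl file per instance.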

Converting GPS to meters is only used for the real-world experiments. During training, we choose targets in the global maps instead.
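To illustrate the GPS-to-meters step, here is a small self-contained sketch that converts a GPS fix into local meter offsets relative to an origin fix using an equirectangular (local-tangent-plane) approximation. This is not a true UTM projection; a real pipeline would use a proper conversion library (e.g. the `utm` or `pyproj` packages), but over short robot-scale distances the approximation is close.

```python
import math

# WGS-84 equatorial radius in meters.
EARTH_RADIUS_M = 6378137.0

def gps_to_local_meters(lat, lon, origin_lat, origin_lon):
    """Approximate east/north offsets in meters from an origin GPS fix.

    Equirectangular approximation: valid for short distances; a real-world
    setup would use a UTM conversion instead.
    """
    lat0 = math.radians(origin_lat)
    dlat = math.radians(lat - origin_lat)
    dlon = math.radians(lon - origin_lon)
    east = EARTH_RADIUS_M * dlon * math.cos(lat0)  # meters east of origin
    north = EARTH_RADIUS_M * dlat                  # meters north of origin
    return east, north

# Moving ~0.001 degrees north corresponds to roughly 111 meters.
east, north = gps_to_local_meters(40.001, -75.0, 40.0, -75.0)
```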

xiaofeifei-1 commented 1 week ago

Thank you for your quick reply!