Petrichor625 / HLTP

[IEEE TIV] Official PyTorch Implementation of "A Cognitive-Based Trajectory Prediction Approach for Autonomous Driving."

Preprocessing script for NGSIM data #3

Open IndefiniteBen opened 7 months ago

IndefiniteBen commented 7 months ago

I'm trying to reproduce your results. Could you please share a script for converting the NGSIM CSV/TXT data to the MAT format this repo expects? Edit: a conversion script exists; see my next comment.

I tried using the files from the BATraj NGSIM release, but when I run python evaluate_teacher.py I get the following error:

IndexError: Caught IndexError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "/home/ben/hubrisDev/HLTP/venv/lib/python3.9/site-packages/torch/utils/data/_utils/worker.py", line 302, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/ben/hubrisDev/HLTP/venv/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 58, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/ben/hubrisDev/HLTP/venv/lib/python3.9/site-packages/torch/utils/data/_utils/fetch.py", line 58, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/ben/hubrisDev/HLTP/loader2.py", line 54, in __getitem__
    lane_stu = self.getLane_stu(vehId, t, vehId, dsId)
  File "/home/ben/hubrisDev/HLTP/loader2.py", line 205, in getLane_stu
    refPos = refTrack[np.where(refTrack[:, 0] == t)][0, 5]
IndexError: index 5 is out of bounds for axis 1 with size 4

I get the same error if I fix the paths and try to run evaluate_teacher.py.
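
For reference, a quick check along these lines (the path and the 'tracks' key are assumptions based on loader2.py and the preprocessing script's output) confirms what the traceback says: the per-vehicle tracks in those files only have 4 columns, while getLane_stu indexes column 5.

    import scipy.io as scp

    # Inspect how many columns each per-vehicle track matrix has.
    # loader2.py's getLane_stu indexes column 5, so at least 6 columns are needed;
    # the path and key name below are examples, adjust them to your setup.
    mat = scp.loadmat('./data/dataset_t_v_t/TestSet.mat')
    tracks = mat['tracks']
    widths = {t.shape[1] for row in tracks for t in row if t.size > 0}
    print('track column counts found:', widths)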

IndefiniteBen commented 7 months ago

Apologies, I was mistaken. There is a script for NGSIM data: preprocessdata.m

My confusion came from using the "Export" button on the NGSIM data website, which produces CSVs. On the NGSIM data catalog page you instead need to download the zip attachments.

Extract the text files from the *-vehicle-trajectory-data.zip archives inside those attachments. The text files should be stored in a raw_data directory in the same folder as the HLTP repo folder (the script looks for them in ../raw_data/).
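
For clarity, the expected layout is roughly as follows (file names left as placeholders, since they depend on which NGSIM periods you download):

    parent_folder/
    ├── HLTP/        (this repo, where the preprocessing script is run from)
    └── raw_data/
        └── <NGSIM *-vehicle-trajectory-data .txt files>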

The script did not work until I changed line 159 to:

load('allData_s','traj');

IndefiniteBen commented 7 months ago

However, when I tried to use the generated MAT files for training, I got this error:

Traceback (most recent call last):
  File "/home/ben/hubrisDev/HLTP/train_teacher.py", line 132, in <module>
    main()
  File "/home/ben/hubrisDev/HLTP/train_teacher.py", line 46, in main
    trSet = ngsimDataset('./data/dataset_t_v_t/TrainSet.mat')
  File "/home/ben/hubrisDev/HLTP/loader2.py", line 19, in __init__
    self.D = scp.loadmat(mat_file)['traj']
  File "/home/ben/hubrisDev/HLTP/venv/lib/python3.9/site-packages/scipy/io/matlab/_mio.py", line 226, in loadmat
    MR, _ = mat_reader_factory(f, **kwargs)
  File "/home/ben/hubrisDev/HLTP/venv/lib/python3.9/site-packages/scipy/io/matlab/_mio.py", line 80, in mat_reader_factory
    raise NotImplementedError('Please use HDF reader for matlab v7.3 '
NotImplementedError: Please use HDF reader for matlab v7.3 files, e.g. h5py

I added ,'-v7' to the save commands for the Train/Val/TestSet (lines 190/193/196), and this seems to work for training the teacher.
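
As an aside, a v7.3 file can in principle also be read directly from Python with h5py instead of re-saving it. Below is an untested sketch (the file path is assumed from the traceback above); note that loader2.py would still need adapting, since h5py returns MATLAB arrays transposed and stores cell arrays like tracks as HDF5 references.

    import h5py
    import numpy as np

    # Untested sketch: reading a MATLAB v7.3 (HDF5) file without re-saving it as v7.
    # h5py exposes MATLAB arrays transposed (column-major vs. row-major), hence the .T.
    with h5py.File('./data/dataset_t_v_t/TrainSet.mat', 'r') as f:
        traj = np.array(f['traj']).T
        # 'tracks' is a MATLAB cell array stored as HDF5 object references, so each
        # cell must be dereferenced individually, e.g. f[f['tracks'][i][j]].
    print(traj.shape)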

Zhangxx1218 commented 7 months ago

Same problem! Did you solve it?

IndefiniteBen commented 7 months ago

Yes. I wrote how in my comments...

Petrichor625 commented 6 months ago

It seems like you're having trouble executing the HLTP code. Could you provide a screenshot of the specific lines of code where the error occurs? We tested it before uploading and it worked fine for us. This will help us understand the issue better and assist you accordingly.

IndefiniteBen commented 6 months ago

I think I've detailed it in my other comments, but I can do one better: here is a link to the specific line:

load('allData','traj');

When the preprocess_data_.m script runs in MATLAB, I get the following output:

Loading data...
Parsing fields...
Error using load
Unable to find file or directory 'allData'.

Error in preprocessdata (line 159)
load('allData','traj');

I therefore changed this line to load('allData_s','traj'); and the script runs correctly.

IndefiniteBen commented 6 months ago

Additionally (perhaps I should open a separate issue), I discovered another issue in the preprocessing after trying to use the generated MAT file with train_teacher.py. I got the error quoted in my comment above when running train_teacher.py; the relevant part is:

NotImplementedError: Please use HDF reader for matlab v7.3 files, e.g. h5py

I think you either use an older version of MATLAB or have set your default MAT-file format to v7. Modern releases of MATLAB save their MAT files as v7.3 by default, which is incompatible with your train_teacher.py script.

The solution is to explicitly save the Train/Val/TestSet MAT files as v7; that way it doesn't matter which version of MATLAB is used or which settings a user has chosen.

For example, change line 190 in preprocess_data_.m to:

save('TrainSet','traj','tracks','-v7');
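
A quick way to confirm the re-saved files work is a check like the one below (the path is an example, taken from the traceback above):

    import scipy.io as scp

    # If the files were saved with '-v7', scipy can load them and the
    # "Please use HDF reader for matlab v7.3 files" error goes away.
    for name in ('TrainSet', 'ValSet', 'TestSet'):
        d = scp.loadmat(f'./data/dataset_t_v_t/{name}.mat')
        print(name, d['traj'].shape, d['tracks'].shape)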