ncgarcia / modality-distillation

TensorFlow code for the paper 'Modality Distillation with Multiple Stream Networks for Action Recognition', ECCV 2018

How to prepare the training data? #2

Open claudehang opened 5 years ago

claudehang commented 5 years ago

imagenet_ckpt = '/home/pavis/Documents/ng/tf-ckpt/resnet_v1_50.ckpt'
uwa3dii_dir = '/media/pavis/3TB-HD2/datasets/uwa3dii/tfrecords/'
ntu_dir = '/datasets/ntu/tfrecords_aligned_improved/'
nwucla_dir = '/datasets/nwucla/tfrecords/'

Could you share the related data in these folders? Many thanks!

pmorerio commented 5 years ago

Hi, the slim ResNet-50 checkpoint can be downloaded from the official page. As for the tfrecords, they are cumbersome, especially for NTU. We can work out some way to share them; otherwise @ncgarcia may share the code to create them from the original videos.
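For reference, here is a minimal sketch (TF 1.x + tf.contrib.slim) of restoring that checkpoint into a ResNet-50 graph; the path is a placeholder standing in for `imagenet_ckpt` above, and the snippet is illustrative rather than the repo's actual training code:

```python
# Minimal sketch, assuming the official resnet_v1_50.ckpt from the
# TF-Slim model zoo; the path is a placeholder, adjust to imagenet_ckpt.
import tensorflow as tf
import tensorflow.contrib.slim as slim
from tensorflow.contrib.slim.nets import resnet_v1

images = tf.placeholder(tf.float32, [None, 224, 224, 3])
with slim.arg_scope(resnet_v1.resnet_arg_scope()):
    net, end_points = resnet_v1.resnet_v1_50(images, num_classes=None, is_training=False)

# Restore only the backbone variables that exist in the ImageNet checkpoint.
variables_to_restore = slim.get_variables_to_restore(include=['resnet_v1_50'])
saver = tf.train.Saver(variables_to_restore)
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.restore(sess, '/path/to/resnet_v1_50.ckpt')
```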

claudehang commented 5 years ago

Thanks~ I am a PyTorch user and new to TensorFlow, so I really appreciate your code and look forward to the later release of the data part!

claudehang commented 5 years ago

Could you just provide the .tfrecord files for train/val/test data? That way the training process can get started. Thanks a lot!

ncgarcia commented 5 years ago

Hi claudehang, I'm sorry for the late reply. I'm committing the code used to produce the tfrecords (commit c7a411a53a9fcda9a021a743d7f0e213ac845a37). You only need to download the dataset (UWA3D) and adjust the paths to the RGB and depth frame folders. The process is the same for the other datasets, apart from small adjustments to the filenames and labels. Thanks!
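In case it helps while that commit lands, here is a rough sketch of how per-video RGB and depth frame folders could be packed into a TFRecord; the feature names, folder layout, and label handling are assumptions for illustration, not the repo's actual schema:

```python
# Rough sketch: pack one video's RGB and depth JPEG frames into a TFRecord
# example. Feature keys ('rgb', 'depth', 'label', 'num_frames') and paths
# are hypothetical; adapt them to the repo's tfrecord-creation code.
import os
import tensorflow as tf

def _int64_feature(value):
    return tf.train.Feature(int64_list=tf.train.Int64List(value=[value]))

def write_video_example(writer, rgb_dir, depth_dir, label):
    rgb_frames = [open(os.path.join(rgb_dir, f), 'rb').read()
                  for f in sorted(os.listdir(rgb_dir))]
    depth_frames = [open(os.path.join(depth_dir, f), 'rb').read()
                    for f in sorted(os.listdir(depth_dir))]
    example = tf.train.Example(features=tf.train.Features(feature={
        'rgb': tf.train.Feature(bytes_list=tf.train.BytesList(value=rgb_frames)),
        'depth': tf.train.Feature(bytes_list=tf.train.BytesList(value=depth_frames)),
        'label': _int64_feature(label),
        'num_frames': _int64_feature(len(rgb_frames)),
    }))
    writer.write(example.SerializeToString())

# Example usage: one example per video, written to a split-specific file.
with tf.python_io.TFRecordWriter('/path/to/tfrecords/train.tfrecord') as writer:
    write_video_example(writer,
                        '/path/to/rgb/video_0001',
                        '/path/to/depth/video_0001',
                        label=3)
```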