guiggh / hand_pose_action

Dataset and code for the paper "First-Person Hand Action Benchmark with RGB-D Videos and 3D Hand Pose Annotations", CVPR 2018.

Baseline LSTM for action recognition #7

Closed. yasserboutaleb closed this issue 4 years ago

yasserboutaleb commented 4 years ago

Hello,

Will the code for the baseline LSTM for action recognition be open-sourced?

Best regards, Yasser BOUTALEB

guiggh commented 4 years ago

The implementation used in the paper's experiments was a vanilla LSTM in TensorFlow, based on this public repo. Don't forget to normalize the sequences (or download the already-normalized ones from the dataset folder) if you are trying to replicate the results from the paper!
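For reference, a minimal sketch of what such a vanilla LSTM classifier could look like in TensorFlow/Keras. The hidden size, optimizer, and feature layout here are assumptions for illustration, not the exact settings from the paper or the linked repo; the sequence length of 120 frames is taken from the padding discussion below.

```python
import tensorflow as tf

NUM_JOINTS = 21    # hand joints annotated in the dataset
NUM_COORDS = 3     # x, y, z per joint
MAX_FRAMES = 120   # fixed sequence length (see padding discussion below)
NUM_ACTIONS = 45   # action classes in the benchmark

# Vanilla LSTM over per-frame hand-pose vectors; Masking skips the
# zero-padded frames at the end of short sequences.
model = tf.keras.Sequential([
    tf.keras.layers.Masking(
        mask_value=0.0,
        input_shape=(MAX_FRAMES, NUM_JOINTS * NUM_COORDS)),
    tf.keras.layers.LSTM(100),                                 # hidden size is an assumption
    tf.keras.layers.Dense(NUM_ACTIONS, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```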

yasserboutaleb commented 4 years ago

Thank you for your quick reply and for sharing. Yes, I'm trying to replicate the results (the action recognition part).

yasserboutaleb commented 4 years ago

Hello again @guiggh,

I have one last question: did you pad the sequences with zeros to a fixed maximum size? Thank you in advance.

guiggh commented 4 years ago

Hi @Bestyasser, we used 120 frames as the fixed maximum size and padded shorter sequences with zeros up to that length. If a sequence is longer, we just cut its tail so it fits into 120 frames. Hope it helps!
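A small sketch of that pad-or-truncate scheme; the per-frame feature layout (one flattened hand-pose vector per frame) is an assumption here:

```python
import numpy as np

MAX_FRAMES = 120  # fixed length described above

def pad_or_truncate(seq, max_frames=MAX_FRAMES):
    """seq: (num_frames, feature_dim) array of per-frame hand-pose features.

    Shorter sequences are zero-padded at the end; longer ones are cut
    at the tail, as described above.
    """
    seq = np.asarray(seq, dtype=np.float32)
    if len(seq) >= max_frames:
        return seq[:max_frames]
    padding = np.zeros((max_frames - len(seq), seq.shape[1]), dtype=np.float32)
    return np.concatenate([seq, padding], axis=0)
```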