balakg / posewarp-cvpr2018

MIT License

Have you used the Human 3.6M dataset for your model? #1

Closed dulucas closed 6 years ago

dulucas commented 6 years ago

Hi Guha, glad that you shared your splendid work with us! Have you ever tested your method on the Human 3.6M dataset? I think this dataset might fit your method perfectly. Thanks!

dulucas commented 6 years ago

0_0_2_result 0_0_13_result 0_0_7_result It works on the H3.6M dataset. Here are some examples of the network's output without the GAN post-processing.

balakg commented 6 years ago

Thanks, we hadn't tried it yet, though we were going to. These results look good, let us know how it goes!

Lotayou commented 6 years ago

@lucasdu007 Nice job. The background seems a bit blurry though, as if the camera was shaking. Is it normal?

dulucas commented 6 years ago

@Lotayou It looks better if you use the GAN provided.

dulucas commented 6 years ago

@balakg I resized the H3.6M images to 224x224, changed the number of keypoints from 14 to 16, and left the rest unchanged. In each iteration I randomly choose a pair of images of the same person filmed by the same camera. I only used one camera's view, though I think it would be better to use all cameras' data, since that provides much more training data and, at the same time, four images of the same pose from different views, which in my opinion would largely boost performance. These photos are from iteration 44000; results become much better after applying the GAN (iteration 6000).
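The preprocessing described above (resize to 224x224, rescale keypoints, sample same-person same-camera pairs) could be sketched roughly like this. This is a minimal illustration, not the repo's actual data loader; the `(x, y)` keypoint format and the frame-list structure are assumptions.

```python
import random

IMG_SIZE = 224       # target side length, per the comment above
N_KEYPOINTS = 16     # H3.6M joint count (up from the 14 used in the paper)

def rescale_keypoints(keypoints, orig_w, orig_h, size=IMG_SIZE):
    """Map keypoints from the original image size to the 224x224 resized image.

    `keypoints` is assumed to be a list of (x, y) pixel coordinates.
    """
    return [(x * size / orig_w, y * size / orig_h) for (x, y) in keypoints]

def sample_pair(sequence, rng=random):
    """Pick a random (source, target) frame pair from one sequence.

    `sequence` is assumed to hold frames of one person filmed by one
    camera, so any two entries form a valid training pair.
    """
    src, tgt = rng.sample(sequence, 2)
    return src, tgt
```

The image itself would be resized the same way (e.g. with `cv2.resize` to `(224, 224)`); only the keypoint rescaling is shown here to keep the sketch dependency-free.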

UdonDa commented 6 years ago

@lucasdu007 Which part of H3.6M did you use? Please tell me!

dulucas commented 6 years ago

Actually all of it, @UdonDa. You should first extract each scene for each person, and make sure that the input pairs come from the same person and the same camera (though I suppose the network would benefit from videos of different viewpoints, which would provide more multi-view information about the same person).
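Grouping extracted frames so that pairs are always drawn from the same person and camera could look something like this. The per-frame record format (`subject`, `camera`, `path` keys) is a hypothetical example, not H3.6M's actual layout.

```python
from collections import defaultdict

def group_frames(entries):
    """Bucket frame records by (subject, camera).

    Sampling a pair within one bucket guarantees both frames show the
    same person from the same viewpoint, as described in the thread.
    Each entry is assumed to be a dict with 'subject', 'camera', and
    'path' keys (hypothetical record format).
    """
    groups = defaultdict(list)
    for entry in entries:
        groups[(entry["subject"], entry["camera"])].append(entry["path"])
    return groups
```

Using all four cameras would then just mean keeping all four buckets per subject rather than filtering to a single camera ID.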

UdonDa commented 6 years ago

@lucasdu007 Sounds like a lot of work. Thank you very much!