Closed: chuangyu-robotics closed this issue 3 years ago.
Hello Chuang,
I'm happy that ProxEmo interests you! Regarding the dataset, the action joints are from the ELMD dataset mentioned in the README. The original dataset was labeled for action recognition; our group had it relabeled for emotion. I have attached the data and the labels to this email (features_EMLD.h5, Google Drive link: https://drive.google.com/file/d/1GxUaP2WJ9h20zkHkHQlz7nVpjiDJEFkP/view?usp=sharing). You are welcome to use this database, but please cite the original paper that introduced the dataset in your work. It is linked below:
https://bitbucket.org/jonathan-schwarz/edinburgh_locomotion_mocap_dataset/src/master/
Let me know if you are able to download all the files, and feel free to reach out if you have any questions.
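For a quick sanity check once the file is downloaded, a minimal sketch like the following should work. It assumes only that features_EMLD.h5 is a standard HDF5 file; the group and dataset names inside it are not guaranteed, so the script just walks the file and prints whatever it finds:

```python
import h5py  # pip install h5py

# Walk features_EMLD.h5 and print every dataset it contains.
# Adjust the filename if your downloaded copy is named differently.
with h5py.File("features_EMLD.h5", "r") as f:
    def show(name, obj):
        # Print only leaf datasets, with their shape and dtype.
        if isinstance(obj, h5py.Dataset):
            print(f"{name}: shape={obj.shape}, dtype={obj.dtype}")
    f.visititems(show)
```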
Regards, Venkatraman
On Mon, 9 Nov 2020 at 05:18, Chuang YU notifications@github.com wrote:
ProxEmo is an interesting approach for human-robot interaction. Thank you for sharing your work here.
Could you tell me where I can download the complete ProxEmo database?
Thanks in advance!
Hi Venkatraman,
Yes, I can download it.
Thank you very much! It's very kind of you to share such detailed information.
Best wishes,
Chuang