zc-alexfan / hold

[CVPR 2024✨Highlight] Official repository for HOLD, the first method that jointly reconstructs articulated hands and objects from monocular videos without assuming a pre-scanned object template or 3D hand-object training data.
https://zc-alexfan.github.io/hold
MIT License

Questions about the ECCV'2024 Workshop on ARCTIC #11

Closed · ACondaway closed this issue 3 months ago

ACondaway commented 3 months ago

Hi, @zc-alexfan ! I am going to take part in the ECCV'2024 Workshop on ARCTIC, but I could use a better tutorial on obtaining the ARCTIC dataset. A few questions:

  1. Do we need to download the full ARCTIC dataset?
  2. Do we need to preprocess the ARCTIC dataset ourselves?

Thanks for any suggestions!

zc-alexfan commented 3 months ago

Hi, thanks for participating in our challenge. It will use a selection of clips from some ARCTIC sequences, because the task is still quite challenging and I want to keep the setting simple for people at first. I plan to release the clips, hopefully by this weekend.

While you are waiting, I would suggest studying the HOLD code (e.g., how to preprocess a single-hand sequence with an object), since that should take a week or two anyway. We also have a two-hand preprocessing script now, so you could try capturing a custom sequence of your own two hands interacting with an object and running HOLD on it (maybe start with two hands holding an object slowly and simply, to get a working sequence, then add more challenging motions and see how it goes). There is a rough frame-extraction sketch in the P.S. below.

Hope it helps.
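
P.S. As a starting point for a self-captured sequence, here is a minimal frame-extraction sketch. To be clear, this is not our actual preprocessing script: the output folder layout (`data/custom_seq/images`) and the file naming are placeholders made up for illustration, so follow the preprocessing instructions in the repo for the real interface.

```python
# Minimal sketch: sample frames from a self-captured video into a
# per-sequence image folder. The folder layout and naming below are
# placeholders, NOT HOLD's actual expected layout -- check the repo's
# preprocessing docs for the real interface.
import os

import cv2  # pip install opencv-python


def extract_frames(video_path: str, out_dir: str, target_fps: float = 10.0) -> int:
    """Save frames from video_path at roughly target_fps as numbered JPEGs."""
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    src_fps = cap.get(cv2.CAP_PROP_FPS) or 30.0  # fall back if FPS is unknown
    step = max(1, round(src_fps / target_fps))   # keep every `step`-th frame
    saved = idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            cv2.imwrite(os.path.join(out_dir, f"{saved:04d}.jpg"), frame)
            saved += 1
        idx += 1
    cap.release()
    return saved


# Example: a slow, simple two-hand grasp is the easiest first sequence.
n = extract_frames("my_two_hand_grasp.mp4", "data/custom_seq/images")
print(f"saved {n} frames")
```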

ACondaway commented 3 months ago

> I plan to release the clips, hopefully by this weekend.

Yeah, I am looking forward to it! Have a nice day~

ACondaway commented 3 months ago

> ...you could try capturing a custom sequence of your own two hands interacting with an object...

And I am going to capture some video myself to learn the preprocessing procedure~

zc-alexfan commented 3 months ago

I've now added instructions for preprocessing a custom sequence. You can find the details under "Two-hand setting: Bimanual category-agnostic reconstruction" in the README.md.

zc-alexfan commented 3 months ago

I've sent an email with the newest instructions. Marking this as resolved. Let me know how it goes.