mikeqzy / 3dgs-avatar-release

3DGS-Avatar: Animatable Avatars via Deformable 3D Gaussian Splatting
MIT License

About training on PeopleSnapshot #17

Open Connaught0 opened 2 months ago

Connaught0 commented 2 months ago

Special thanks for your excellent work! Recently, I tried to train on PeopleSnapshot, using the dataset after processing it with NeuralBody (the vertices in .npy files). However, I don't get the right result, as shown below. render_c01_f000756

What should I do to preprocess the dataset?

mikeqzy commented 3 weeks ago

Hi, thank you for your interest. To run on the PeopleSnapshot dataset you'll need to preprocess it into arah-format. Unfortunately I could not find the preprocessing script, but it can be easily adapted from https://github.com/taconite/arah-release/blob/main/preprocess_datasets/preprocess_ZJU-MoCap.py. Hope it helps!
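For anyone attempting the adaptation, here is a minimal sketch of the kind of conversion involved: taking per-frame SMPL parameters (as PeopleSnapshot-style arrays) and regrouping them into per-frame records. All function names, dictionary keys, and array shapes below are assumptions for illustration, not the actual arah-format specification; check the ZJU-MoCap preprocessing script linked above for the real layout.

```python
# Hypothetical sketch: regroup PeopleSnapshot-style SMPL parameter arrays
# into per-frame dicts, roughly in the spirit of arah-format preprocessing.
# Key names and shapes are assumptions, NOT the repo's actual format.
import numpy as np

def snapshot_to_frames(poses, betas, trans):
    """poses: (N, 72) axis-angle body poses per frame
    betas: (10,) shape coefficients shared across frames
    trans: (N, 3) global translations per frame
    Returns a list of N per-frame parameter dicts."""
    frames = []
    for i in range(poses.shape[0]):
        frames.append({
            "poses": poses[i].astype(np.float32),  # per-frame body pose
            "betas": betas.astype(np.float32),     # shared shape vector
            "trans": trans[i].astype(np.float32),  # per-frame translation
        })
    return frames

# Example with dummy data: 5 frames of zeroed SMPL parameters.
frames = snapshot_to_frames(np.zeros((5, 72)), np.zeros(10), np.zeros((5, 3)))
```

In a real script you would load these arrays from the PeopleSnapshot annotations, add camera intrinsics/extrinsics, and write one file per frame the way the ZJU-MoCap preprocessing does.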

GostInShell commented 3 weeks ago

Hi! @mikeqzy Thank you for contributing the code of your excellent work! I also have several questions regarding the details needed to achieve the reported results on the PeopleSnapshot dataset.

  1. For testing, do you also optimize the poses, as InstantAvatar does?
  2. Are the results obtained using a white background?
  3. Do you use the latent code of the last training frame as the input of the texture MLP for all testing poses?