Closed JosephKKim closed 1 year ago
Sorry for that wrong command; please check the following:
YOUR_BLENDER_PYTHON_PATH/python -m pip install -r prepare/requirements_render.txt
Please install the packages in prepare/requirements_render.txt for the specific Python used by your Blender.
Let me know if you still have more questions.
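As a concrete sketch of the step above, you can locate Blender's bundled interpreter and install into it. The /snap/blender search root is only an assumption taken from this thread; adjust it for your system:

```shell
# Hedged sketch: find Blender's bundled Python and install the render
# requirements into it. The /snap/blender search root is an assumption
# from this thread; adjust it for your install.
BLENDER_PY="$(find /snap/blender -type f -name 'python3*' 2>/dev/null | head -n 1)"
if [ -x "$BLENDER_PY" ]; then
  "$BLENDER_PY" -m pip install -r prepare/requirements_render.txt
else
  echo "Blender Python not found; set BLENDER_PY manually"
fi
```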
Thanks for the quick reply!
Even after typing the command you gave in the reply, I get this message:
bash: /snap/blender/3135/2.93/python: Is a directory
Do you have any idea?
Thanks
Also, can you please tell me the numpy and pandas versions in your environment?
You need to use the Python executable file rather than a folder; it seems /snap/blender/3135/2.93/python is a folder.
We will consider providing a Docker image for the running environment soon.
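If it helps, the interpreter usually lives in a bin/ subdirectory of that python folder. A quick way to locate it (the exact filename is version-specific; Blender 2.93 ships Python 3.9):

```shell
# List executable python files under Blender's python directory.
# "|| true" keeps the command from failing on machines without this path.
find /snap/blender/3135/2.93/python -type f -name 'python3*' 2>/dev/null || true
# typical result: .../python/bin/python3.9 (exact path varies by version)
```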
Thanks, I think I found the executable file! I didn't know much about Blender. BTW, I want to ask where I can find instructions for the datasets. Even running the demo file requires the dataset, but I cannot find dataset instructions in the README.
Thanks again for the fast reply!
You can visit https://github.com/EricGuo5513/HumanML3D for the dataset setup and then update HUMANML3D.ROOT
and HUMANML3D.SPLIT_ROOT
in https://github.com/ChenFengYe/motion-latent-diffusion/blob/main/configs/assets.yaml.
Actually, the demo should not require the dataset for most tasks. We will try to update that part to remove this dependency and provide the dataset setup details.
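For reference, the two fields to update look roughly like this. This is only an excerpt sketch: the exact nesting and the local paths are assumptions, so check assets.yaml itself and point both at wherever you unpacked HumanML3D:

```yaml
# configs/assets.yaml (sketch; paths are placeholders)
HUMANML3D:
  ROOT: ./datasets/humanml3d        # folder with new_joint_vecs/, texts/, etc.
  SPLIT_ROOT: ./datasets/humanml3d  # folder with train.txt / val.txt / test.txt
```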
Thanks for the reply! It would be really helpful if you could answer one more question. I already posted an issue on HumanML3D but didn't get a clear answer. They wrote that we should process the raw KIT dataset with the given ipynb file, but when I downloaded the KIT dataset from the official website, it was an npz file instead of an npy file... I think I misunderstood something. Can you tell me how to process the KIT files, if you remember the process? Thanks
I do not suggest downloading KIT from its official website; you can use the KIT data provided by HumanML3D.
The text below is from the README of HumanML3D:
For KIT-ML dataset, you could directly download [Here]. Due to the distribution policy of AMASS dataset, we are not allowed to distribute the data directly. We provide a series of script that could reproduce our HumanML3D dataset from AMASS dataset.
https://drive.google.com/drive/folders/1MnixfyGfujSP-4t8w_2QvjtTVpEKr97t
Thanks so much for the help. After downloading and organizing the HumanML3D dataset directory, I see that the demo reads npy files from the dataset. But I'm seeing ValueError: not enough values to unpack (expected 2, got 0).
While debugging I found that many motions go into bad_count, so the loop in dataset.py is skipped, leaving new_name_list and length_list empty.
It's true that new_name_list and length_list end up empty, but I think there is another underlying reason. Is this a bug, or am I doing something wrong again?
Thanks.
But, I'm seeing the ValueError: not enough values to unpack (expected 2, got 0)
Could you provide the file and line number for this error? I think you might have generated a wrong npy file.
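For context, this exact message is what Python raises when two names are unpacked from an empty zip, which matches every motion being skipped so the name/length lists stay empty; the zip-style unpack in dataset.py is my assumption. A one-line reproduction:

```shell
# Unpacking two values from an empty zip reproduces the error verbatim:
python3 -c 'name_list, length_list = zip(*[])' 2>&1 | tail -n 1
# ValueError: not enough values to unpack (expected 2, got 0)
```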
it is where the error occurs, and I am using HumanML3D dataset which is a bunch of npy files after unrar new_joint_vecs.rar file.
Before running the demo, could you please run the double-check process in HumanML3D? https://github.com/EricGuo5513/HumanML3D/blob/main/cal_mean_variance.ipynb I guess your npy files might be broken.
You can also try to skip the dataset if you only want to run the demo.
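This is not a substitute for the notebook's full check, but a quick stdlib-only scan can first confirm that each .npy file at least starts with the NumPy magic header; truncated or corrupted files fail this immediately. The dataset path below is an assumption, so point the glob at your new_joint_vecs folder:

```shell
python3 - <<'EOF'
# Quick sanity scan: every valid .npy file begins with the 6-byte magic
# header b"\x93NUMPY"; files failing this are truncated or corrupted.
import glob

bad = []
for path in sorted(glob.glob("HumanML3D/new_joint_vecs/*.npy")):
    with open(path, "rb") as fh:
        if fh.read(6) != b"\x93NUMPY":
            bad.append(path)
print(f"suspicious files: {len(bad)}")
EOF
```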
Okay, actually I am going to use this model for training/testing. I ran the cal_mean_variance.ipynb you referred to and encountered an error processing the code below.
Am I using the wrong npy files...? I thought I could skip this step after extracting the rar file provided by the repository...
I believe your npy files are wrong. These npy files could have been broken during your HumanML3D generation. You can refer to https://github.com/EricGuo5513/HumanML3D/issues/3 for help.
Hello, I fixed the npy files and then ran the demo: python demo.py --cfg ./configs/config_mld_humanml3d.yaml --cfg_assets ./configs/assets.yaml --example ./demo/example.txt
It seems it created npy and txt files in samples_
After that I followed all the instructions from the visualization part; I'd like you to clarify some details for me.
python -m fit --dir YOUR_NPY_FOLDER --save_folder TEMP_PLY_FOLDER --cuda
Does YOUR_NPY_FOLDER mean the folder containing the npy files from the demo?
In my case, samples.
And is TEMP_PLY_FOLDER where I want to save the ply files?
Added: what is the --cuda option? Thanks
You are right. YOUR_NPY_FOLDER means the folder containing the npy files from the demo. TEMP_PLY_FOLDER means anywhere you want to save the ply files. --cuda means using CUDA to speed up this optimization.
I also ran into this problem:
ValueError: not enough values to unpack (expected 2, got 0)
I have re-extracted the HumanML3D data several times, but the problem is not solved.
Hi, thanks for releasing this interesting work.
I am following the instructions you provided in the README file. After following the instructions from TEMOS, I found that the path to Blender on my Linux system is /snap/blender/3135/2.93. When I set this path as YOUR_BLENDER_PATH, the first command doesn't work. There is a blender file in /snap/blender/3135, so when I ran the command, it printed the Blender version followed by an OSError which says "GIT_REPOSITORY_PATH/motion-latent-diffusion/-m" could not be opened: No such file or directory. In this case, what should I do?
Thanks, Joseph.