Open FireMonkey796 opened 3 years ago
Thank you. The preprocessing of 3D keypoints doesn't appear in https://github.com/open-mmlab/mmpose/blob/master/docs/data_preparation.md. Maybe you could update it.
Also, which dataset files should I download, and how should I arrange them to run the script?
Thanks for your suggestion; we have added the 3D keypoint preprocessing to data_preparation.md.
To run the preprocessing script, you need to download "Videos", "D2_Positions", and "D3_Positions_mono" for each subject, as well as the metadata in the official dataset code package. The original files should be arranged like this:
```
original
├── s1
│   ├── Videos.tgz
│   ├── D2_Positions.tgz
│   └── D3_Positions_mono.tgz
├── s2
...
```
The paths to the original folder and the metadata file need to be specified in the arguments of the preprocessing script.
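For illustration, an invocation might look like the sketch below. The flag names `--original` and `--metadata` are hypothetical placeholders, not the script's documented interface; run the script with `--help` to see the actual arguments.

```shell
# Hypothetical flag names; check `python tools/dataset/preprocess_h36m.py --help`
# for the real ones.
# --original: the folder holding s1/, s2/, ... as laid out above
# --metadata: metadata.xml from the official software package
python tools/dataset/preprocess_h36m.py \
    --original /path/to/original \
    --metadata /path/to/metadata.xml
```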
Thank you. What is contained in the metadata file? Can it be downloaded here?
And I found a package error:

```
Traceback (most recent call last):
  File "/home/qwe/Desktop/mmpose/tools/dataset/preprocess_h36m.py", line 15, in <module>
    from spacepy import pycdf
```
Please find the "metadata.xml" file in the v1.2 software package.
@cherryjm Could you please help with the pycdf issue?
And I found a package error:

```
Traceback (most recent call last):
  File "/home/qwe/Desktop/mmpose/tools/dataset/preprocess_h36m.py", line 15, in <module>
    from spacepy import pycdf
  File "/home/qwe/anaconda3/envs/mm/lib/python3.7/site-packages/spacepy/pycdf/__init__.py", line 1288, in <module>
    'before import.').format(', '.join(_libpath)))
Exception: Cannot load CDF C library; checked . Try 'os.environ["CDF_LIB"] = library_directory' before import.
```
It is probably due to an unsuccessful installation of the NASA CDF library. Please refer to the official installation guide. If you still have trouble with pycdf, you can alternatively try another Python package, cdflib, which is easier to install.
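As a minimal sketch of the workaround: spacepy locates the compiled NASA CDF library at import time, so `CDF_LIB` must be set before importing pycdf. The install path `~/cdf/lib` and the commented-out file path below are assumptions for illustration; point them at your actual CDF installation and data.

```python
import os

# spacepy looks up the compiled NASA CDF library when pycdf is imported,
# so the environment variable must be set *before* the import.
# "~/cdf/lib" is an assumed install location -- use the directory that
# actually contains libcdf.so (Linux) / libcdf.dylib (macOS).
os.environ["CDF_LIB"] = os.path.expanduser("~/cdf/lib")

# Now the import should succeed (requires spacepy and the C library):
# from spacepy import pycdf
# with pycdf.CDF("S1/D3_Positions_mono/Directions.54138969.cdf") as f:
#     print(list(f.keys()))

# Alternatively, cdflib is pure Python and needs no C library at all:
# import cdflib
# cdf = cdflib.CDF("S1/D3_Positions_mono/Directions.54138969.cdf")
# print(cdf.cdf_info())
```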
Thank you! I have successfully run the script to preprocess the data. But I met another problem when training the Human3.6M keypoint model:

```
FileNotFoundError: Body3DH36MDataset: [Errno 2] No such file or directory: '/media/qwe/Windows/1TB_dataset/mmpose_3d_data/annotation_body3d/fps50/h36m_train.npz'
```

How can I obtain the folder "fps50"?
The folder "fps50" will be generated automatically by running the preprocessing script with `sample_rate=1`. The default `sample_rate` is 5, which results in 10 fps annotations.
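The relation between `sample_rate` and the annotation frame rate is just a division of the raw 50 fps video rate, which can be sanity-checked in a couple of lines:

```python
RAW_FPS = 50  # Human3.6M videos are captured at 50 fps

def annotation_fps(sample_rate: int) -> int:
    """Frame rate of the annotations produced when keeping every
    `sample_rate`-th frame of the raw 50 fps videos."""
    return RAW_FPS // sample_rate

# sample_rate=1 keeps every frame -> the annotation_body3d/fps50 folder
print(annotation_fps(1))  # 50
# the default sample_rate=5 -> annotation_body3d/fps10
print(annotation_fps(5))  # 10
```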
So the preprocessing script should be run twice to enable training the Human3.6M keypoint model, right? Once with sample_rate=1 and once with sample_rate=5.
It depends on which model you use. For example, we use 50 fps data to train and evaluate SimpleBaseline3D. Please refer to the configs for detailed data configurations.
Hello, can the v1.0 metadata.xml be used for dataset preprocessing? I don't have the v1.2 version.
Hello. What should I do if I want to train a Human3.6M 3D keypoint model? I can only find the document for Human3.6M mesh estimation, and only the preprocessing procedure for the test data.