sony / nnabla-examples

Neural Network Libraries https://nnabla.org/ - Examples
Apache License 2.0

Error running FOM Colab, animate.py #395

Open G-force78 opened 1 year ago

G-force78 commented 1 year ago

2023-10-17 10:57:05,909 [nnabla][INFO]: Initializing CPU extension...
usage: animate.py [-h] [--config CONFIG] [--params PARAMS] [--source SOURCE]
                  [--driving DRIVING] [--out-dir OUT_DIR] [--context {cudnn,cpu}]
                  [--output-png] [--fps FPS] [--only-generated] [--detailed]
                  [--full] [--adapt-movement-scale] [--unuse-relative-movement]
                  [--unuse-relative-jacobian]
animate.py: error: unrecognized arguments: \


NameError                                 Traceback (most recent call last)
in <cell line: 2>()
      1 get_ipython().system('python animate.py --source imgs/sample_src.png --driving imgs/sample_drv.mp4 --adapt-movement-scale --fps 24 \')
----> 2 --detailed --full

NameError: name 'detailed' is not defined
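
A note on what the traceback shows: only the first physical line reached the shell (the continuation backslash ended up inside the system() call), so the second line --detailed --full was evaluated as Python, which is exactly the NameError above. That usually happens when the backslash is not the very last character on the line, for example with trailing whitespace after it. Two ways to avoid relying on backslash continuation in a Colab cell, sketched here for illustration rather than taken from the notebook:

# Alternative 1: keep the whole command on a single line
!python animate.py --source imgs/sample_src.png --driving imgs/sample_drv.mp4 --adapt-movement-scale --fps 24 --detailed --full

# Alternative 2: a %%bash cell (the magic must be the first line of its own cell)
%%bash
python animate.py \
    --source imgs/sample_src.png \
    --driving imgs/sample_drv.mp4 \
    --adapt-movement-scale --fps 24 --detailed --full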

G-force78 commented 1 year ago

Also, how can it be used to generate with the ted384 model and yaml? Thanks

TomonobuTsujikawa commented 1 year ago

Thank you for creating the issue; we will check it soon.

TomonobuTsujikawa commented 1 year ago

Sorry, I could not reproduce your issue. Did you click the "RESTART RUNTIME" button after executing the first cell? Or did you add a backslash at the end of the line?

!python animate.py --source imgs/sample_src.png \
                  --driving imgs/sample_drv.mp4 \
                  --adapt-movement-scale --fps 24 --detailed --full

2023-10-18 02:52:36,823 [nnabla][INFO]: Initializing CPU extension...
2023-10-18 02:52:37,717 [nnabla][INFO]: Initializing CUDA extension...
2023-10-18 02:52:37,740 [nnabla][INFO]: Initializing cuDNN extension...
voxceleb_trained_info.yaml: 100% 1.71k/1.71k [00:00<00:00, 7.49MB/s]
pretrained_fomm_params.h5: 100% 228M/228M [00:24<00:00, 9.58MB/s]
Loading pretrained_fomm_params.h5 for image animation...
100% 125/125 [00:20<00:00,  6.15it/s]

TakuyaYashima commented 1 year ago

Hi, thanks for trying our demo. Regarding the 2nd question,

Also, how can it be used to generate with the ted384 model and yaml? Thanks

I suppose you mean this model (or, to be precise, its pretrained weights and config file). Whether it can be reused depends on the internal neural network architecture and the parameters' names, but I don't think the ted384 model can be applied to our model as is.

The best way would be to git clone our source code and modify it so that it can load their model.
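
As a first step toward that modification, one possible approach (a rough sketch, not code from this repository) is to compare the parameter names and shapes used by the nnabla FOMM implementation with the entries in the ted384 checkpoint, assuming the ted384 weights are the PyTorch .pth file distributed with articulated-animation. The file names below are placeholders:

import nnabla as nn
import torch

# Parameters used by this repository's demo (downloaded as pretrained_fomm_params.h5).
nn.load_parameters("pretrained_fomm_params.h5")
nnabla_params = nn.get_parameters(grad_only=False)

# Assumed: the ted384 checkpoint is a PyTorch file containing per-module state dicts.
ckpt = torch.load("ted384.pth", map_location="cpu")

print("--- nnabla parameter names ---")
for name, var in nnabla_params.items():
    print(name, var.shape)

print("--- ted384 checkpoint entries ---")
for key, value in ckpt.items():
    if hasattr(value, "items"):  # e.g. a sub-module's state_dict
        for pname, tensor in value.items():
            shape = tuple(tensor.shape) if hasattr(tensor, "shape") else type(tensor)
            print(key, pname, shape)
    else:
        print(key, type(value))

If the layer structure lines up, the remaining work is mostly renaming (and where necessary reshaping) the PyTorch tensors into the names nnabla expects and saving them with nn.save_parameters; if it does not, the network definitions themselves would have to be ported first.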

G-force78 commented 1 year ago

Hi, thanks for the quick reply. The modifications are what I would like to know about. Most of the code seems almost identical to articulated-animation, so I assume there would not be much to modify to get the 384 model to run?