facebookresearch / mmf

A modular framework for vision & language multimodal research from Facebook AI Research (FAIR)
https://mmf.sh/

Unable to run models in CPU for inference #1310

Open soonchangAI opened 1 year ago

soonchangAI commented 1 year ago

❓ Questions and Help

Hi, I would like to run some checkpointed models on CPU, but I am unable to do so; inference still runs on CUDA. In mmf/configs/defaults.yaml, I set:

training:
  device: cpu
evaluation:
  use_cpu: true
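Independently of MMF's config options, a common cause of this problem in any PyTorch-based framework is that the checkpoint was saved from a CUDA device, so its tensors are remapped to CUDA at load time. A minimal sketch of forcing CPU loading with plain PyTorch (the model and checkpoint path here are placeholders, not MMF internals):

```python
import torch
import torch.nn as nn

# Stand-in for a checkpointed model; MMF models are ordinary nn.Modules.
model = nn.Linear(4, 2)
torch.save(model.state_dict(), "checkpoint.pth")

# map_location="cpu" remaps any CUDA tensors stored in the checkpoint
# to CPU, so loading works even on a machine without a GPU.
state = torch.load("checkpoint.pth", map_location="cpu")
model.load_state_dict(state)
model.to("cpu").eval()

with torch.no_grad():
    out = model(torch.randn(1, 4))
print(out.device)  # cpu
```

If MMF's device setting is being ignored, checking whether the framework's checkpoint-loading path passes `map_location` through to `torch.load` is a reasonable place to start debugging.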