You can install detectron2 on CPU and run inference.
If you google the error message, all occurrences of it suggest a bug in binutils/elfutils. You may need to upgrade or downgrade your system or compiler toolchain.
Thanks, it worked after downgrading binutils.
This happened to me when running some tools from the DensePose project: they always tried to compile the model with CUDA, which I don't have installed because I'm running my experiments on macOS.
If you want to use the CPU for inference, you can add the option MODEL.DEVICE cpu to the command you are running, or add the same setting to the configuration file you are using:
MODEL:
DEVICE: "cpu"
Thanks @leopiney, I was able to run it on CPU using the MODEL.DEVICE option.
I also have the same error on my macOS. How can I use the MODEL.DEVICE cpu option, or where should I write it in the config file? Thank you!
Check https://detectron2.readthedocs.io/tutorials/getting_started.html#inference-with-pre-trained-models. For example, to run inference on CPU:
python demo/demo.py --config-file configs/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml \
--input input1.jpg input2.jpg \
--opts MODEL.WEIGHTS detectron2://COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl \
MODEL.DEVICE cpu
If you're talking about the error build/temp.linux-x86_64-3.7/home/harshit/PycharmProjects/detectron2/detectron2/layers/csrc/vision.o: file not recognized: file format not recognized, then I solved it by downgrading binutils to 2.30-5.
❓ Questions and Help
Can I install detectron2 on CPU? Currently, I'm getting the following error while building detectron2 on Manjaro.
Env:
Error: