Hi,
I followed all of the instructions in Setup/Installation and didn't see any errors reported. However, when I tried to run the model using the command:
python -m rf2aa.run_inference --config-name {your inference config}
it first printed a warning but kept running:
DGL backend not selected or invalid. Assuming PyTorch for now. Setting the default backend to "pytorch". You can change it in the ~/.dgl/config.json file or export the DGLBACKEND environment variable. Valid options are: pytorch, mxnet, tensorflow (all lowercase)
However, it then exited with another error message:
Error parsing override 'inference' missing EQUAL at ''
See https://hydra.cc/docs/1.2/advanced/override_grammar/basic for details
Set the environment variable HYDRA_FULL_ERROR=1 for a complete stack trace.
After that, it just keeps giving me this same error message whenever I try to run the model with the command above. Could you please help me look into this issue? Thanks!