arsalan993 opened this issue 3 years ago
I get the same error. I have a GPU, but I'm running Ubuntu on Windows, so I can't use it without installing the Windows Insider beta version of Windows.
If a GPU is required to run, it would be nice to have that stated in the documentation.
From what I can see, though, predictions on the input text were still made, so the error may not affect the results.
The error doesn't affect your desired output, because the output is generated before the error occurs. However, copying the command from the README as-is, with the argument `--mode` set to `splitpredict`, is what triggers the issue. Setting it to `predict` will run the prediction and stop before the error, but the results will not be as good as with `splitpredict`.
The issue lies in `openie6/run.py`. The line

```
rescored = rescore(inp_fp, model_dir=hparams.rescore_model, batch_size=256)
```

needs to be changed to

```
rescored = rescore(inp_fp, model_dir=hparams.rescore_model, batch_size=256, cuda_device=(0 if has_cuda else -1))
```
https://github.com/dair-iitd/openie6/blob/master/imojie/imojie/aggregate/score.py#L88 is called with `cuda_device=0`.
This results in a call to
https://github.com/dair-iitd/openie6/blob/master/imojie/allennlp/allennlp/commands/evaluate.py#L92
with a cuda device of 0, but as the documentation specifies, -1 is required when no GPU is available.
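The fix above boils down to choosing the AllenNLP device id from CUDA availability. A minimal sketch of that selection logic (the helper name `pick_cuda_device` is hypothetical; in `run.py` the `has_cuda` flag would presumably come from something like `torch.cuda.is_available()`):

```python
def pick_cuda_device(has_cuda: bool) -> int:
    """Map GPU availability to AllenNLP's cuda_device convention.

    AllenNLP expects 0 (the first GPU) when CUDA is usable and -1 for
    CPU-only, which is what the proposed fix passes to rescore().
    """
    return 0 if has_cuda else -1

# CPU-only machines (like the WSL setup above) should get -1:
print(pick_cuda_device(False))  # -1
print(pick_cuda_device(True))   # 0
```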
```
RUN sed -i 's|rescored = rescore(inp_fp, model_dir=hparams.rescore_model, batch_size=256)|rescored = rescore(inp_fp, model_dir=hparams.rescore_model, batch_size=256, cuda_device=(0 if has_cuda else -1))|' openie6/run.py
```

worked in my Dockerfile. A bit busy to make a PR, but I may tackle it later.
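The same substitution can be sanity-checked outside the Dockerfile on a scratch copy of the line (the `/tmp` path here is only for illustration):

```shell
# Write the original call into a scratch file, apply the same sed
# substitution as in the Dockerfile, and inspect the result.
printf '%s\n' 'rescored = rescore(inp_fp, model_dir=hparams.rescore_model, batch_size=256)' > /tmp/run_snippet.py
sed -i 's|rescored = rescore(inp_fp, model_dir=hparams.rescore_model, batch_size=256)|rescored = rescore(inp_fp, model_dir=hparams.rescore_model, batch_size=256, cuda_device=(0 if has_cuda else -1))|' /tmp/run_snippet.py
cat /tmp/run_snippet.py
```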
Although I set `gpus = 0` since I don't have any GPU installed, the script did predict the output and saved it in the prediction.txt file, but right before run.py exits, it throws this error.