ThilinaRajapakse / simpletransformers

Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
https://simpletransformers.ai/
Apache License 2.0
4.11k stars 727 forks

use_cuda= False generates ValueError #486

Closed NaRuecker closed 4 years ago

NaRuecker commented 4 years ago

Hi,

thank you so much for the wonderful and easy to use package!

I have been happily using it for a few days running a binary classification script. It works fine on Google Colab using the available GPU. Now I wanted to test a new environment that only has a CPU, and suddenly the use_cuda argument throws an error. I went back to Colab; same error there.

Here is my code: [image]

Here is the unexpected output: [image]

1. The error message seems wrong to me: `ValueError: 'use_cuda' set to True when cuda is unavailable. Make sure CUDA is available or set use_cuda=False.` Shouldn't it rather say `'use_cuda' set to True when cuda is available. Make sure CUDA is available or set use_cuda=False`?
2. I have set use_cuda=False, so why won't it run anymore?
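For what it's worth, the check behind that error is straightforward: it only fires when you ask for CUDA on a machine that doesn't have it, so the message text is actually correct. A minimal sketch of that validation logic (an assumption about the behavior, not the actual simpletransformers source):

```python
def resolve_device(use_cuda: bool, cuda_available: bool) -> str:
    """Sketch of the use_cuda check: error only when CUDA is requested
    but unavailable; otherwise pick the device accordingly."""
    if use_cuda and not cuda_available:
        raise ValueError(
            "'use_cuda' set to True when cuda is unavailable. "
            "Make sure CUDA is available or set use_cuda=False."
        )
    return "cuda" if use_cuda else "cpu"

# CPU-only machine with use_cuda=False passes cleanly:
print(resolve_device(use_cuda=False, cuda_available=False))  # cpu
```

So if the error still appears with use_cuda=False, the flag is most likely not reaching the constructor (e.g. it was placed in the args dict instead).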

Anyone any ideas?

ThilinaRajapakse commented 4 years ago

use_cuda is a constructor argument, not part of the model's args dict. It should work with something like this:

```python
model = ClassificationModel(model_type=mt, model_name=mn, use_cuda=False, args=model_args)
```
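To illustrate why putting use_cuda inside the args dict has no effect, here is a toy stand-in class (an assumption for illustration, not the real ClassificationModel): the constructor reads only its own use_cuda keyword, which defaults to True, and anything in args with that key is simply ignored.

```python
class ToyModel:
    """Toy stand-in for ClassificationModel's argument handling (assumption)."""

    def __init__(self, model_type, model_name, use_cuda=True, args=None):
        self.args = args or {}
        # The keyword argument wins; "use_cuda" inside args is never read.
        self.use_cuda = use_cuda

# Flag buried in the args dict: constructor default (True) still applies.
wrong = ToyModel("bert", "bert-base-cased", args={"use_cuda": False})
# Flag passed as a constructor keyword: takes effect as intended.
right = ToyModel("bert", "bert-base-cased", use_cuda=False)
print(wrong.use_cuda, right.use_cuda)  # True False
```

With use_cuda=True still in effect on a CPU-only machine, the ValueError above is exactly what you would expect.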
NaRuecker commented 4 years ago

Thank you Thilina! That worked!