obi-ml-public / ehr_deidentification

Robust de-identification of medical notes using transformer architectures
MIT License

How to infer using GPU? #8

Closed omeruth closed 10 months ago

omeruth commented 1 year ago

Could you please tell me where in the code GPUs are checked for and, if available, used for inference? Assume the train and test flags in the config file are set to false. Thanks

prajwal967 commented 1 year ago

This line begins running the forward pass: https://github.com/obi-ml-public/ehr_deidentification/blob/88751ab1f95d23d54ded39385adb8a27f57a6f72/src/robust_deid/sequence_tagging/sequence_tagger.py#L600

Checking for GPUs is handled by the HuggingFace Trainer object; we don't do anything explicitly. Just make sure the parameters in your config file are correct.
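Since the Trainer picks the device on its own, a quick sanity check that PyTorch can actually see a GPU before launching inference can be done as follows (an illustrative snippet, not part of the repository's code; it falls back gracefully if torch is not installed):

```python
import importlib.util

# Guarded import so the check also works on machines without PyTorch.
if importlib.util.find_spec("torch") is not None:
    import torch
    # The HuggingFace Trainer uses the same availability check internally
    # when deciding whether to place the model on a GPU.
    device = "cuda" if torch.cuda.is_available() else "cpu"
else:
    device = "cpu"  # torch not installed in this environment

print(f"Inference will run on: {device}")
```

If this prints "cpu" on a GPU machine, the usual culprits are a CPU-only PyTorch build or a CUDA driver mismatch, not the de-identification code itself.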

For inference, though, you will need to set do_predict to True.
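These flags follow the standard HuggingFace TrainingArguments names, so the relevant part of the config would look roughly like the fragment below (the exact surrounding keys depend on the repository's config schema, so treat this as a sketch):

```json
{
  "do_train": false,
  "do_eval": false,
  "do_predict": true
}
```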