AI4Finance-Foundation / FinRL

FinRL: Financial Reinforcement Learning. 🔥
https://ai4finance.org
MIT License

Making FinRL use GPU #359

Closed Abaminog closed 2 years ago

Abaminog commented 2 years ago

FinRL doesn't seem to recognise and use a GPU on my PC. Other machine learning algorithms based on Tensorflow do recognise and use it. Any word of advice on how to force FinRL to use a GPU for DRL model training?

Athe-kunal commented 2 years ago

Which library are you using for the RL models? SB3, RLlib, or ElegantRL?

Abaminog commented 2 years ago

I believe it's SB3, as per here: https://github.com/AI4Finance-Foundation/FinRL/blob/master/FinRL_StockTrading_NeurIPS_2018.ipynb

Athe-kunal commented 2 years ago

Then it should be using the GPU on a CUDA-enabled device. As per https://github.com/DLR-RM/stable-baselines3/blob/master/stable_baselines3/ppo/ppo.py#L90, the default device is `auto`, which selects the GPU when one is available. If it is not being detected for some reason, you can pass the `device` argument when declaring the model, for example `PPO(policy, env, ..., device='cuda')`. SB3 should detect on its own whether a CUDA-enabled GPU is present. If you are running on a local machine, make sure you have PyTorch installed with working CUDA and cuDNN support.
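To illustrate, here is a minimal sketch of the check and the fallback behavior. `resolve_device` is a simplified stand-in for SB3's internal `get_device` helper, not the actual library function; the `PPO` usage in the comment follows the SB3 API described above.

```python
import torch

def resolve_device(device: str, cuda_available: bool) -> str:
    """Simplified illustration of SB3's device selection:
    'auto' maps to 'cuda', and 'cuda' silently falls back
    to 'cpu' when no CUDA device is visible to PyTorch."""
    if device == "auto":
        device = "cuda"
    if device == "cuda" and not cuda_available:
        return "cpu"
    return device

# First thing to verify: does PyTorch itself see the GPU?
# If this prints False, the problem is the torch/CUDA install,
# not FinRL or SB3.
print("torch sees CUDA:", torch.cuda.is_available())
print("'auto' resolves to:",
      resolve_device("auto", torch.cuda.is_available()))

# Forcing the device explicitly when building the model:
# from stable_baselines3 import PPO
# model = PPO("MlpPolicy", env, device="cuda")
```

If `torch.cuda.is_available()` returns `False`, reinstall PyTorch with a CUDA build matching your driver before changing anything in FinRL.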