[Open] Eoin-McMahon opened this issue 4 years ago
The issue is described in the title: is there a pretrained model I can use, stored on Google Drive or somewhere similar? My system would take weeks to train if I let it.

@Eoin-McMahon, Google Colaboratory provides GPU resources; you could move the code into a Jupyter notebook and train there. Similar platforms such as Kaggle Kernels, Microsoft Azure, and IBM Watson also offer hosted Jupyter notebooks, but I do not remember which of them support GPUs.
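If you do move the code to Colab or a similar hosted notebook, it helps to verify that a GPU runtime is actually attached before kicking off training. A minimal stdlib-only sketch (it assumes the NVIDIA driver tools, i.e. `nvidia-smi`, are present on GPU runtimes, which is the case on Colab and Kaggle; your project's framework may offer its own check, e.g. `torch.cuda.is_available()` in PyTorch):

```python
import shutil
import subprocess

def gpu_available() -> bool:
    """Return True if an NVIDIA GPU runtime appears to be attached."""
    # nvidia-smi is only installed on machines with the NVIDIA driver,
    # so its absence means no GPU runtime.
    if shutil.which("nvidia-smi") is None:
        return False
    # If the tool exists, a zero exit code means it found a usable GPU.
    return subprocess.run(
        ["nvidia-smi"], capture_output=True
    ).returncode == 0

print("GPU runtime attached:", gpu_available())
```

On Colab, remember to also select a GPU runtime via Runtime → Change runtime type, otherwise this check will report no GPU even though the platform supports one.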