Closed aga-relation closed 1 year ago
Hi @aga-relation ! If you are looking at requirements.txt - I believe this is for the Streamlit app. I believe tensorflow-gpu was installed for training the transformer model, e.g. see this .yml.
Thank you very much @Carldeboer! Sadly, attempts to install from the .yml file produce many conflicts, even in a fresh conda environment...
@1edv Any advice?
Definitely!
Option 1: The code can be run using the Code Ocean capsule without requiring any installation: https://codeocean.com/capsule/8020974/tree/v1
Option 2: Use the docker image:
docker run -it --rm --entrypoint /bin/bash edv123456789/evolution_app
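A typical session with that image might look like the following sketch (the version check assumes TensorFlow is installed inside the image, which the discussion below suggests may not be the GPU build):

```shell
# Start an interactive shell in the published image (container removed on exit)
docker run -it --rm --entrypoint /bin/bash edv123456789/evolution_app

# Inside the container, confirm which TensorFlow build is present
python -c "import tensorflow as tf; print(tf.__version__)"
```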
Thanks for the suggestions! Sadly, I am having issues with both:
1) Code Ocean only generates predictions using the already trained model, there is no option to train from scratch
2) The docker image doesn't have tensorflow-gpu for tf 1.14 which is what is required to train the model...
Hi @aga-relation,
In response to your original question, we trained on TPUs, which did not require installing tensorflow-gpu.
However, if you wish to train on GPUs, you should be able to simply install tensorflow-gpu in a fresh conda environment after first running pip install -r requirements.txt. You may have to follow these instructions for installing tensorflow-gpu on your machine (including the CUDA / cuDNN steps).
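Putting those steps together, the setup might look roughly like this sketch (the environment name and Python version are illustrative; the tensorflow-gpu==1.14 pin follows the TF 1.14 requirement mentioned earlier in the thread, and assumes a matching CUDA 10.0 / cuDNN 7.x install):

```shell
# Fresh conda environment (name and Python version are illustrative)
conda create -n evolution python=3.7 -y
conda activate evolution

# Install the repo's requirements first, then the GPU build of TF 1.x
pip install -r requirements.txt
pip install tensorflow-gpu==1.14   # assumes CUDA 10.0 + cuDNN are already installed

# Sanity check: prints True if TF can see at least one GPU
python -c "import tensorflow as tf; print(tf.test.is_gpu_available())"
```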
You should also be able to train on Code Ocean by running the notebook at this path: /code/models/tpu_model/train_model.ipynb, but it may be easier to install tensorflow-gpu on your machine.
Good luck!
Hi! Just wanted to say thank you for all the above tips :) I managed to get the code running with the latest container from here (22.09-tf1-py3) and just installing pydot and seaborn.
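For anyone else following this route, the workflow would look roughly like the sketch below. The full image path on NVIDIA's registry is my assumption based on the 22.09-tf1-py3 tag mentioned above:

```shell
# Pull and enter NVIDIA's TF1 container (tag from the comment above;
# the nvcr.io path is assumed from the standard NGC registry layout)
docker run --gpus all -it --rm nvcr.io/nvidia/tensorflow:22.09-tf1-py3

# Inside the container, add the two packages the repo needs
pip install pydot seaborn
```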
That is great to hear, good luck with your projects!
Hi!
I am trying to re-train the model, but it looks like only tensorflow is installed, as opposed to tensorflow-gpu? Is there a reason for this? How were you able to train the GPU model without tensorflow-gpu installed? Thank you very much!