I am using a Titan X GPU with 12 GB of memory. It handles language-model training on the large protein dataset and fine-tuning on a smaller one without issue, but during classification on the small dataset CUDA suddenly runs out of memory. Have you run into this? How much GPU memory did you use during training?