Hi, not a bug, but I tried to run this on AWS on a Tesla V100-SXM2-16GB GPU (16 GB of memory) and I couldn't load the model; it ran out of memory. Anyone know what might be going on? It ran fine in Colab (at least loading the model) with the same 16 GB of GPU memory. Thanks!
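For anyone debugging the same thing, here is a minimal sketch of two things worth checking, assuming this is a PyTorch / Hugging Face transformers checkpoint (the model name below is a placeholder, not the actual model from this issue): whether another process on the AWS instance is already holding GPU memory, and whether loading in half precision fits where full precision does not.

```python
import torch
from transformers import AutoModelForCausalLM

# How much GPU memory is actually free before loading? On a
# preconfigured AWS instance, another process (or a stale kernel)
# may already be holding part of the 16 GB.
free, total = torch.cuda.mem_get_info()
print(f"free: {free / 1e9:.1f} GB / total: {total / 1e9:.1f} GB")

# Loading in float16 halves the weight memory versus the default
# float32; "gpt2-xl" is a hypothetical stand-in for the model here.
model = AutoModelForCausalLM.from_pretrained(
    "gpt2-xl",
    torch_dtype=torch.float16,   # half precision weights
    low_cpu_mem_usage=True,      # avoid a temporary full fp32 copy in RAM
).to("cuda")
```

If the free-memory check already shows less than 16 GB available, the difference from Colab is likely leftover processes rather than the model itself; `nvidia-smi` will show what is holding the memory.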