Closed mitmelon closed 2 years ago
Please, what are the hardware requirements to use this?
Thanks.

@mitmelon You could theoretically run it on any machine you like, as long as you have enough storage and memory to start the server (the large model is around 10 GB). It does run on a CPU alone, but that is very slow: a single prediction can take minutes. It is much faster on a GPU. I personally used an AWS p2 instance with a GPU to run the model and server.
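If it helps, here is a quick, best-effort way to sanity-check a machine against these rough requirements (about 10 GB of free disk for the model, enough RAM, and ideally an NVIDIA GPU). This is only a sketch using the Python standard library; the 10 GB and 16 GB thresholds and the `nvidia-smi` heuristic are my assumptions, not official requirements of this project:

```python
import os
import shutil

def check_resources(path=".", min_disk_gb=10, min_ram_gb=16):
    """Rough check of disk, RAM, and GPU availability.

    min_disk_gb / min_ram_gb are illustrative thresholds, not
    official requirements of this project.
    """
    report = {}

    # Free disk space where the ~10 GB model would be stored.
    free_gb = shutil.disk_usage(path).free / 1e9
    report["disk_ok"] = free_gb >= min_disk_gb

    # Total physical RAM (works on Linux/macOS via sysconf;
    # may be unavailable on other platforms).
    try:
        ram_bytes = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
        report["ram_ok"] = ram_bytes / 1e9 >= min_ram_gb
    except (ValueError, OSError, AttributeError):
        report["ram_ok"] = None  # unknown on this platform

    # Heuristic GPU check: is the NVIDIA driver tool on PATH?
    report["gpu_available"] = shutil.which("nvidia-smi") is not None

    return report

print(check_resources())
```

On a CPU-only box this will report `gpu_available: False`; the model will still run there, just with minutes-long predictions as noted above.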