Kent0n-Li / ChatDoctor


Aborted (core dumped) #27

Open coomiit opened 1 year ago

coomiit commented 1 year ago

Hello, I am a college student reading your paper. My server GPU has only 48 GB of memory. Does that mean I don't have enough GPU memory to run inference?

Kent0n-Li commented 1 year ago

For inference, 20 GB is enough.
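
For context: ChatDoctor's released checkpoints are LLaMA-based, and a 7B-parameter model in float16 needs roughly 14 GB for weights alone, which is consistent with the ~20 GB figure. Below is a minimal, unofficial sketch of single-GPU fp16 inference with Hugging Face `transformers`; the local weights path and the prompt are illustrative assumptions, not the repo's actual interface.

```python
# Unofficial sketch: fp16 inference on one GPU with ~20 GB of memory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./chatdoctor-weights"  # hypothetical local path to the fine-tuned weights

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16,  # halves memory use vs. float32
    device_map="auto",          # place the model on the available GPU
)

prompt = "Patient: I have had a sore throat for three days. What should I do?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```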

laotao commented 10 months ago

> For inference, 20 GB is enough.

Is it possible to run inference with two 8 GB GPUs?
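
In principle, two 8 GB cards give 16 GB total, which is tight for a 7B model in float16 (~14 GB of weights before activations and the KV cache), so quantization would likely be needed. Below is a minimal, unofficial sketch using `transformers` with 8-bit `bitsandbytes` quantization, sharding layers across both GPUs; the weights path and per-GPU memory caps are illustrative assumptions, not tested guidance from the authors.

```python
# Unofficial sketch: 8-bit quantized inference sharded across two small GPUs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_path = "./chatdoctor-weights"  # hypothetical local path to the fine-tuned weights

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),  # ~1 byte per parameter
    device_map="auto",                  # let accelerate split layers across GPUs
    max_memory={0: "7GiB", 1: "7GiB"},  # leave headroom on each 8 GB card
)
```

Generation then proceeds exactly as in the single-GPU sketch above; `device_map="auto"` routes intermediate activations between the two cards automatically.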