Closed ClashSAN closed 1 year ago
Hi! Running the quantized models on Android sounds pretty exciting. Were you able to run inference on a 4 GB device? It would also be interesting to know what kind of processor you used. Thanks!
I've never tried it on devices with 4 GiB or less. I work for an SoC company, so we have all kinds of fancy Android devices with large amounts of DRAM. I mainly test on devices with MediaTek 9000/9000+/9200 series chips.
Thank you for your reply. I hope to learn more through experimentation!