[Closed] learnermaxRL closed this issue 2 months ago
Can you please detail the system requirements? Can this run on a Mac M2 Air?
Hi, Groma-7B takes 30-40 GB of memory for inference on a single GPU. We have not tested it on CPU, though. You would likely need to quantize the model, as in LLaVA, to make it run on a Mac.