Open atanudasgupta opened 1 week ago
@atanudasgupta Since we don't have a Mac-M2 machine, we are not sure the model will fit on that kind of chip. Theoretically, it takes around 50 GB of vRAM to load the whole Aria model. If you have any experience running Aria on an M2, we would be glad to hear about it and discuss it with you.
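For context, the ~50 GB figure is consistent with a simple weights-only estimate: parameter count times bytes per parameter. A minimal sketch, assuming roughly 25B total parameters and half-precision (bf16/fp16, 2 bytes each) — activations and KV cache would need additional memory on top:

```python
# Rough weights-only memory estimate for loading a model in half precision.
# Assumptions (not from the thread): ~25e9 total parameters, 2 bytes/param.
def weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Return the approximate memory (decimal GB) needed to hold the weights."""
    return num_params * bytes_per_param / 1e9

print(f"~{weight_memory_gb(25e9):.0f} GB")  # roughly 50 GB, weights only
```

On Apple Silicon the GPU uses unified memory, so an M2 machine would need well over 50 GB of total RAM to host the full-precision weights; quantized variants would shrink this proportionally.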
Hi
Is it possible to run Aria on Mac-M2 GPUs?
Waiting for this great developer's adaptation https://github.com/Blaizzy/mlx-vlm/issues/39