GAIR-NLP / anole

Anole: An Open, Autoregressive, and Native Multimodal Model for Interleaved Image-Text Generation
https://huggingface.co/spaces/ethanchern/Anole
618 stars · 33 forks

how much vram do i need for fine tuning or inference? #11

Closed Manni1000 closed 1 month ago

Manni1000 commented 1 month ago

how much vram do i need for fine tuning or inference?

EthanC111 commented 1 month ago

Thank you for your interest! Releasing a quantized model is on our TODO list!

captainst commented 1 month ago

You probably need a GPU with compute capability >= 8.0 (for bf16 ops) and VRAM > 32GB.
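A quick way to check the compute-capability requirement on your own machine (a sketch only: `meets_bf16_requirement` is a hypothetical helper name, and the live check assumes PyTorch is installed):

```python
# Check whether a GPU meets the requirement mentioned above:
# compute capability >= 8.0 (Ampere or newer) for native bf16 ops.
# The pure-Python check is kept separate so it runs without a GPU.

def meets_bf16_requirement(major: int, minor: int) -> bool:
    """Compute capability 8.0 or newer supports bf16 operations."""
    return (major, minor) >= (8, 0)

if __name__ == "__main__":
    try:
        import torch  # only needed for the live hardware check
        if torch.cuda.is_available():
            major, minor = torch.cuda.get_device_capability(0)
            ok = meets_bf16_requirement(major, minor)
            print(f"Compute capability {major}.{minor}, bf16 ok: {ok}")
        else:
            print("No CUDA device found.")
    except ImportError:
        print("PyTorch not installed; skipping live check.")
```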

tellsiddh commented 1 month ago

> how much vram do i need for fine tuning or inference?

Just tried it and saw it use 28 GB of VRAM.
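For anyone who wants to reproduce a peak-VRAM measurement like this, here is a sketch using standard PyTorch CUDA memory APIs (`bytes_to_gb` is a hypothetical helper; nonzero numbers require an actual CUDA device and a real workload in place of the placeholder comment):

```python
# Measure peak VRAM allocated by PyTorch during a workload.
# reset_peak_memory_stats() clears the counter; max_memory_allocated()
# reports the high-water mark in bytes afterwards.

def bytes_to_gb(n: int) -> float:
    """Convert a byte count to gigabytes (1 GB = 1024**3 bytes)."""
    return round(n / 1024**3, 2)

if __name__ == "__main__":
    try:
        import torch
        if torch.cuda.is_available():
            torch.cuda.reset_peak_memory_stats()
            # ... run inference or a fine-tuning step here ...
            peak = torch.cuda.max_memory_allocated()
            print(f"Peak VRAM allocated: {bytes_to_gb(peak)} GB")
        else:
            print("No CUDA device found.")
    except ImportError:
        print("PyTorch not installed.")
```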

Manni1000 commented 1 month ago

thanks