Is it possible to use the weights on Hugging Face, e.g. https://huggingface.co/google/gemma-2b-it ? BTW, does it work on ROCm (AMD)?
Interested as well!
It's the same model, but this repo cannot use those weight formats directly. We have an internal export script that transforms other formats into our simple blob format, but it is not yet ready for production/open-sourcing. Please comment on #11, which discusses which input formats would be useful.
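For context, the weights this repo consumes are the pre-converted `.sbs` blobs distributed with the Gemma release (via Kaggle at the time of writing), not the safetensors/PyTorch checkpoints hosted on Hugging Face. A rough sketch of the intended usage, following the flags in the current README (the flag names and file names here are examples and may differ for your download):

```sh
# Point the gemma.cpp binary at the tokenizer and the blob-format weights
# (e.g. the 2b-it sfp variant). Paths below are placeholders.
./gemma \
  --tokenizer tokenizer.spm \
  --compressed_weights 2b-it-sfp.sbs \
  --model 2b-it
```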
We do not use ROCm; this repo is currently CPU-only, see #18.
@purefire Adding an export script is a high priority to unlock model variations / fine-tuning. Please chime in on https://github.com/google/gemma.cpp/issues/11 about the source format that would be most useful.
Hi @julien-c :wave: :)
Consolidating weight-export discussion in https://github.com/google/gemma.cpp/issues/11. We're working on making a script available soon; feel free to chime in with further comments there. Thanks!