garyyang85 opened 1 month ago
The issue occurs with gguf version 0.6.0. After upgrading gguf to the latest version, 0.9.1, a new error appears:
Traceback (most recent call last):
File "/mnt/vol-5ojabq86/ollama/convert.py", line 1440, in <module>
main()
File "/mnt/vol-5ojabq86/ollama/convert.py", line 1325, in main
metadata = gguf.Metadata.load(args.metadata, dir_model, model_name)
^^^^^^^^^^^^^
AttributeError: module 'gguf' has no attribute 'Metadata'
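One way to narrow this down is to check which gguf distribution pip actually installed; if the version is older than what the conversion script was written against, gguf.Metadata may simply not exist in that release. The helper name below (installed_version) is illustrative, not from the thread. A minimal sketch:

```python
import importlib.metadata

def installed_version(dist: str):
    """Return the version of a pip-installed distribution, or None if absent."""
    try:
        return importlib.metadata.version(dist)
    except importlib.metadata.PackageNotFoundError:
        return None

# If this prints an older version than the script expects, the missing
# gguf.Metadata attribute is explained by the stale package.
print("installed gguf version:", installed_version("gguf"))
```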
From your first output log, it looks like this issue is related to the Python environment.
ImportError: cannot import name 'BaseVocab' from 'gguf' (/root/miniconda3/envs/llama.cpp.311/lib/python3.11/site-packages/gguf/__init__.py)
Even though you haven't set NO_LOCAL_GGUF, it seems that your code is still using the gguf from the system Python environment rather than the gguf-py provided by the repo.
Here is the source code showing how convert_legacy_llama.py imports gguf; you can refer to it for guidance.
import os
import sys
from pathlib import Path

if 'NO_LOCAL_GGUF' not in os.environ:
    # use .parent.parent since we are in the "examples" directory
    sys.path.insert(1, str(Path(__file__).parent.parent / 'gguf-py'))
import gguf
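To confirm which copy of gguf Python actually resolves (the pip package vs. the repo's gguf-py), you can inspect the module's origin; module_origin here is a hypothetical helper for illustration, not part of llama.cpp:

```python
import importlib.util

def module_origin(name: str):
    """Return the file a module would be loaded from, or None if not found."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# A path under site-packages means the pip-installed gguf is shadowing
# the repo's gguf-py; a path under .../gguf-py/ means the local copy wins.
print("gguf resolves to:", module_origin("gguf"))
```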
There is also a similar issue, #8925; it might be a problem with the pip package.
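If the pip package is indeed the culprit, one possible workaround (an assumption on my part, not a confirmed fix from this thread) is to remove the pip-installed gguf and install the repo's gguf-py in its place:

```shell
# Assumes you are in the root of a llama.cpp checkout.
pip uninstall -y gguf          # drop the (possibly stale) PyPI package
pip install -e ./gguf-py       # use the in-repo gguf-py instead
```

The editable install (-e) keeps the package in sync with the checkout, so pulling new commits updates gguf without reinstalling.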
What happened?
I want to convert a gemma2 model to GGUF using convert-legacy-llama.py, which is inside the examples folder, but I got this error: ImportError: cannot import name 'BaseVocab' from 'gguf'. This is the same issue as https://github.com/ggerganov/llama.cpp/issues/7776, but I did not set NO_LOCAL_GGUF. Output: