run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License
36.74k stars 5.27k forks

[Bug]: Can't set global tokenizer #16902

Closed pzc163 closed 21 hours ago

pzc163 commented 4 days ago

Bug Description

Can't set the global tokenizer with the code below:

```python
from llama_index.core.utils import get_tokenizer, set_global_tokenizer
import tiktoken

set_global_tokenizer(tiktoken.encoding_for_model("gpt-4o").encode)
```

(screenshot attached)

Version

0.11.22

Steps to Reproduce

```python
from llama_index.core.utils import get_tokenizer, set_global_tokenizer
from llama_index.core.settings import Settings
import tiktoken

set_global_tokenizer(tiktoken.encoding_for_model("gpt-4o").encode)
```
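For context on what the call above is doing, the global-tokenizer pattern can be sketched as a module-level callable that other code looks up. This is an illustrative stand-in, not llama_index's actual internals; it shows why `.encode` (a bound method, i.e. a callable) is passed rather than the encoding object itself:

```python
# Illustrative sketch of a "global tokenizer" registry (NOT llama_index's
# real implementation): a module-level callable mapping text -> token ids.
from typing import Callable, List

# Default fallback tokenizer: one "token id" per character.
_global_tokenizer: Callable[[str], List[int]] = lambda text: [ord(c) for c in text]

def set_global_tokenizer(tokenizer: Callable[[str], List[int]]) -> None:
    """Install a callable as the process-wide tokenizer."""
    global _global_tokenizer
    if not callable(tokenizer):
        # Passing the encoding object instead of its .encode method
        # would fail a check like this.
        raise ValueError("tokenizer must be a callable, e.g. encoding.encode")
    _global_tokenizer = tokenizer

def get_tokenizer() -> Callable[[str], List[int]]:
    """Return whatever tokenizer is currently installed."""
    return _global_tokenizer
```

With tiktoken available, `set_global_tokenizer(tiktoken.encoding_for_model("gpt-4o").encode)` fits this signature because `encode` takes a string and returns a list of ints.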

Relevant Logs/Tracebacks

No response

logan-markewich commented 4 days ago

What python or pydantic version do you have? This works fine for me

logan-markewich commented 4 days ago

If you can reproduce in Google colab, that'd be helpful

pzc163 commented 4 days ago

> What python or pydantic version do you have? This works fine for me

Python == 3.10, pydantic == 2.9.2

My device is a MacBook Pro with M3 silicon.

logan-markewich commented 4 days ago

Works fine on Google Colab with the same code: https://colab.research.google.com/drive/1uSq5njwifr7fwRgLbZTi4SA9qlGgGnCm?usp=sharing

And it runs fine on my Mac with the same.

Try a fresh venv maybe
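The fresh-venv suggestion amounts to something like the following (a hypothetical sketch; PyPI package names `llama-index-core` and `tiktoken` assumed, adjust for your setup):

```shell
# Create and activate an isolated environment, then retry the repro there.
python3 -m venv fresh-venv
. fresh-venv/bin/activate
pip install --upgrade llama-index-core tiktoken

python - <<'EOF'
from llama_index.core.utils import get_tokenizer, set_global_tokenizer
import tiktoken

set_global_tokenizer(tiktoken.encoding_for_model("gpt-4o").encode)
print("tokenizer set:", get_tokenizer())
EOF
```

If this succeeds in the clean environment, the original failure likely came from a stale or conflicting dependency in the old one.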

pzc163 commented 4 days ago

> Works fine on Google Colab with the same code: https://colab.research.google.com/drive/1uSq5njwifr7fwRgLbZTi4SA9qlGgGnCm?usp=sharing
>
> And it runs fine on my Mac with the same.
>
> Try a fresh venv maybe

Thank you a lot!