Running into a tiktoken error when I try to run a simpler version of this:
import os
from llama_index.llms import LiteLLM, ChatMessage
# set env variable
os.environ["TOGETHERAI_API_KEY"] = "your-togetherai-api-key"
message = ChatMessage(role="user", content="Hey! how's it going?")
# chat call routed to Together AI via LiteLLM
llm = LiteLLM(model="together_ai/togethercomputer/llama-2-70b-chat")
chat_response = llm.chat([message])
Thanks for raising this @decentropy, will try and have this fixed today.
Same issue here. @krrishdholakia, any update?
If I look at the code, this is called because max_tokens is not set, and it then calls the OpenAI lib to try to figure out max_tokens. If I set it explicitly, I get a logging exception:
litellm/main.py", line 308, in completion
logging.update_environment_variables(model=model, user=user, optional_params=optional_params, litellm_params=litellm_params)
AttributeError: 'NoneType' object has no attribute 'update_environment_variables'
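For context, this is roughly how I set it, reusing the together_ai example from the top of this issue - a minimal sketch, and I'm assuming here that the LlamaIndex LiteLLM wrapper forwards max_tokens through to litellm.completion:
import os
from llama_index.llms import LiteLLM, ChatMessage
os.environ["TOGETHERAI_API_KEY"] = "your-togetherai-api-key"
# pass max_tokens explicitly so litellm does not try to look the model's
# context window up via the OpenAI/tiktoken helpers
llm = LiteLLM(
    model="together_ai/togethercomputer/llama-2-70b-chat",
    max_tokens=256,  # assumed kwarg, forwarded to litellm
)
chat_response = llm.chat([ChatMessage(role="user", content="Hey! how's it going?")])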
The AttributeError happens because the logger is always None. I tried passing it in, but that did not help. After I stubbed out a lot of the Logging calls that were erroring out, I end up at my:
chat_engine.chat(prompt)
call, and then I end up at litellm main.py:669:
response = openai.ChatCompletion.create(
model=model,
messages=messages,
api_base="https://api.deepinfra.com/v1/openai", # use the deepinfra api base
api_type="openai",
api_version=api_version, # default None
**optional_params,
)
Where it is trying to pass an empty set of messages... so something is not working when the completion call gets folded into LlamaIndex here.
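For reference, the messages argument there is the usual OpenAI-style list of role/content dicts, so an empty list means the prompt got dropped somewhere in the LlamaIndex-to-litellm conversion. A minimal sketch of the difference (values illustrative):
# what litellm's completion() should be handing to the provider
messages = [
    {"role": "user", "content": "Hey! how's it going?"},
]
# what actually arrives at main.py:669 on the failing path
messages = []  # empty, so there is nothing for the provider to complete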
I have a PR up on llama index to fix this https://github.com/run-llama/llama_index/pull/7885/files
Looks like people are trying to use Together AI & DeepInfra - will ensure they work on my PR before merging into llama index.
Okay
This PR is ready to be merged by llama index: https://github.com/run-llama/llama_index/pull/7885
I tested Together AI, DeepInfra, and OpenAI and can confirm they work.
@barttenbrinke @decentropy @smyja the PR is merged - can you confirm if the fix works for you?
There's a LlamaIndex link in the docs, but it only covers LangChain.
Here's what I tried... I have a simple text file with some info in /mydata that I want to index:
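Roughly this - a minimal sketch, assuming the standard SimpleDirectoryReader / ServiceContext wiring from the LlamaIndex docs (the /mydata path and the query string are just my local setup, and embeddings still default to OpenAI unless you override them):
import os
from llama_index import SimpleDirectoryReader, VectorStoreIndex, ServiceContext
from llama_index.llms import LiteLLM

os.environ["TOGETHERAI_API_KEY"] = "your-togetherai-api-key"

# route chat/completion calls to Together AI through LiteLLM
llm = LiteLLM(model="together_ai/togethercomputer/llama-2-70b-chat")
service_context = ServiceContext.from_defaults(llm=llm)

# load the text file from /mydata and build an in-memory vector index
documents = SimpleDirectoryReader("/mydata").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)

# querying is where the AttributeError below is raised
query_engine = index.as_query_engine()
response = query_engine.query("What does the file say?")
print(response)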
I get an error on querying: AttributeError: 'NoneType' object has no attribute 'update_environment_variables'. I tried both Together AI and DeepInfra and get the same error.