Shooting in the open here:
Hey @koshyviv ,
It's helpful to add debug logging so you can see where the requests are going. You probably also want to set the model type to `chat`, which will append `/chat/completions` to your base URI and use a chat-format payload. The default is `text`, which will make a request to `/completions` with a `prompt` in the JSON body.
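The difference between the two modes can be sketched roughly like this. This is an illustrative helper, not dspy's internal code; the endpoint paths and payload shapes follow the OpenAI API convention described above, and `make_request` is a hypothetical name:

```python
# Illustrative sketch of the two request shapes (not dspy internals).

def make_request(base_uri: str, model_type: str, text: str) -> tuple[str, dict]:
    """Return (url, json_body) for the given model_type."""
    if model_type == "chat":
        # chat mode appends /chat/completions and sends a messages payload
        url = base_uri + "/chat/completions"
        body = {"messages": [{"role": "user", "content": text}]}
    else:
        # the default text mode posts to /completions with a prompt field
        url = base_uri + "/completions"
        body = {"prompt": text}
    return url, body

url, body = make_request("http://127.0.0.1:5000/v1", "chat", "hello")
```

So if your server only serves one of the two endpoints, a mismatched `model_type` will show up as a 404 in the debug logs.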
import logging
import sys

root = logging.getLogger()
root.setLevel(logging.DEBUG)
handler = logging.StreamHandler(sys.stdout)
handler.setLevel(logging.INFO)
formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
handler.setFormatter(formatter)
root.addHandler(handler)
llm_mistral = OpenAI(
    api_base=openai.api_base,
    api_key='random_value',
    model='Mistral-7B-Instruct-v0.1',
    model_type="chat",
)
I'm having some other issues: when optimizing, the model seems to revert to a `text` model 😭
My issue, using an OpenAI-compatible API server, was the `api_base` missing a trailing `/`. Looking at `azure_openai_usage.log` made it more obvious:
HTTP Request: POST http://127.0.0.1:5000/v1chat/completions "HTTP/1.1 404 Not Found"
Something like this:
openai.api_base = "https://our_url/api/" # use the IP or hostname of your instance
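A minimal sketch of why the trailing slash matters, assuming the client joins the base and the endpoint path by plain string concatenation (`build_url` is a hypothetical helper, not dspy's actual code):

```python
# Sketch: how a missing trailing slash on api_base can produce
# the fused path /v1chat/completions seen in the 404 above.

def build_url(api_base: str, endpoint: str) -> str:
    """Hypothetical naive path joining, for illustration only."""
    return api_base + endpoint

# Without the trailing slash, the path segments fuse together:
build_url("http://127.0.0.1:5000/v1", "chat/completions")
# With the trailing slash, the URL is correct:
build_url("http://127.0.0.1:5000/v1/", "chat/completions")
```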
@ujjawal-ti which API do you use for streaming?
Did you solve this problem? I have the same error.
I ran into a similar problem. I suspect the cause was partly the URL suffix and partly an outdated openai version. I managed to get a result by running the following.
First, upgrade openai and dspy to the latest versions:
pip install --upgrade dspy openai
Then get a result with this demo:
import dspy

model = "gpt-3.5-turbo"
api_base = f"{host}/v1/"
api_key = f"{key}"
turbo = dspy.OpenAI(model=model, max_tokens=250, api_base=api_base, api_key=api_key, model_type="chat")
dspy.settings.configure(lm=turbo)
sentence = "it's a charming and often affecting journey."  # example from the SST-2 dataset.
classify = dspy.Predict('sentence -> sentiment')
classify(sentence=sentence).sentiment
The final output: "Sentence: it's a charming and often affecting journey.\nSentiment: Positive"
Hey, I'm trying to use my LLM on a vLLM server which is exposed as an API. Usually I create an OpenAI LLM instance with LangChain like below, and it works fine.
I want to use the
dspy.OpenAI
module in a similar way, but I'm getting the following error.
Any idea how to use dspy.OpenAI like above?