guidance-ai / guidance

A guidance language for controlling large language models.
MIT License

Assistant prompt not working when using a proxy server #868

Open ayush0x00 opened 4 months ago

ayush0x00 commented 4 months ago

The bug: I am using a proxy server that exposes an endpoint and, after processing the request, calls the Azure OpenAI endpoint. The server returns a response that is exactly the same as the one returned by calling the Azure endpoint directly. However, the returned completion is not shown in my Python notebook. When I call the Azure API directly, the notebook does show the received completion. I have attached screenshots of the notebook and of the response received from the server.

from guidance import models, system, user, assistant, gen

# Point AzureOpenAI at the local proxy instead of the Azure endpoint.
gpt_azure = models.AzureOpenAI(
    azure_endpoint="http://localhost:7777/openai/deployments/gpt-3.5-turbo/chat/completions?api-version=2023-05-15",
    model="gpt-3.5-turbo",
    api_key="",
)

with system():
    lm = gpt_azure + "You are a helpful assistant."

with user():
    lm += "Hello...how are you?"

with assistant():
    lm += gen(name="resp")

print(lm["resp"])
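For context, a minimal pass-through proxy of the kind described above can be sketched with only the Python standard library. This is a hypothetical reconstruction, not the actual proxy from the issue: the `AZURE_ENDPOINT` placeholder, header handling, and port are assumptions for illustration.

```python
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Placeholder for the real Azure OpenAI resource (assumption, not from the issue).
AZURE_ENDPOINT = "https://YOUR-RESOURCE.openai.azure.com"


class ProxyHandler(BaseHTTPRequestHandler):
    """Forwards an incoming chat-completions POST to the upstream
    Azure OpenAI endpoint and relays the response body unchanged."""

    def do_POST(self):
        # Read the client's request body.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)

        # Re-issue the same request (path and query string included) upstream.
        upstream = urllib.request.Request(
            AZURE_ENDPOINT + self.path,
            data=body,
            headers={
                "Content-Type": "application/json",
                "api-key": self.headers.get("api-key", ""),
            },
            method="POST",
        )
        with urllib.request.urlopen(upstream) as resp:
            payload = resp.read()
            # Relay status and body back to the client verbatim.
            self.send_response(resp.status)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(payload)))
            self.end_headers()
            self.wfile.write(payload)


def run(port=7777):
    # Serve on the same local port the notebook points at.
    HTTPServer(("localhost", port), ProxyHandler).serve_forever()
```

Note that a proxy like this buffers the entire upstream response before replying. If the client library streams completions (e.g. expects `text/event-stream` chunks), a buffering proxy can behave differently from the direct endpoint even though the final JSON body is identical, which may be relevant to the symptom reported here.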

The notebook looks like this: [Screenshot 2024-05-30 at 4:46:04 PM]

The response sent by the proxy server: [Screenshot 2024-05-30 at 4:47:40 PM]

System info: