langchain-ai / langsmith-sdk

LangSmith Client SDK Implementations
https://smith.langchain.com/
MIT License

Langsmith --expose is not working with Azure OpenAI services #153

Closed mrcmoresi closed 7 months ago

mrcmoresi commented 1 year ago

Issue you'd like to raise.

Hi everyone, I'm trying to deploy and use langsmith locally. I deployed it in a docker container using

langsmith start --expose --openai-api-key=<my azure OpenAi key>

The docker container looks good. I opened all the relevant ports to avoid any problem there; note that I'm running langsmith on a remote computer.

I set up the environment variables:

LANGCHAIN_TRACING_V2=true
LANGCHAIN_ENDPOINT=https://cc23-20-79-217-xxx.ngrok.io
LANGCHAIN_API_KEY=
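For clarity, here is the same configuration expressed as shell exports (a sketch; the API key placeholder is mine, and the ngrok URL is the tunnel created by --expose):

```shell
# Point the LangChain client at the remote LangSmith instance
export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_ENDPOINT=https://cc23-20-79-217-xxx.ngrok.io
export LANGCHAIN_API_KEY=<your api key>   # placeholder
```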

but the interface is not loading the projects.

when I try to access the langsmith endpoint it returns

{
"detail": "Not Found"
}

using the chat example that appears in this repo https://github.com/langchain-ai/langsmith-cookbook/tree/main/feedback-examples/streamlit

I can see in the endpoint https://cc23-20-79-217-xxx.ngrok.io that the runs are being tracked, but I can't see them in the frontend

Debugging the frontend, it fails while trying to fetch the tenants: it requests them from http://127.0.0.1:1984/tenants when, if I'm understanding it correctly, it should fetch them from http://20.79.217.xxx:1984/tenants.
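One way to confirm where the API is actually reachable (a diagnostic sketch; substitute your VM's real IP for the placeholder) is to query the tenants endpoint directly:

```shell
# This is the address the frontend is hardwired to; it only works
# when the browser runs on the same machine as the backend:
curl http://127.0.0.1:1984/tenants

# This is what a browser on another machine would need to reach
# (placeholder IP from the report above):
curl http://20.79.217.xxx:1984/tenants
```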

Could it be a problem with Azure OpenAI, or did I do something wrong in the installation?

Thanks in advance

Suggestion:

No response

hinthornw commented 1 year ago

Thank you for raising this issue! Sorry to hear it isn't functioning as expected.

Just to confirm, you are running the langsmith server on a remote VM/computer and wish to access the app from your local desktop?

The --expose functionality currently tunnels the ports for the logging endpoints, but it does not provide a tunnel for the web app, which (I believe) is what's causing the issue. It was intended for running the langsmith server on a local desktop and viewing the results while logging traces from a remote server or Colab notebook.

For self-hosting the server on a remote machine, you needn't use --expose. You could run langsmith start directly, and as long as the firewall is configured to allow traffic on port 1984 and the standard HTTP/HTTPS ports, you can access it at whatever URL you're hosting from.
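A sketch of that setup (assumptions: a Linux VM using ufw for the firewall; on most cloud providers you would open these ports in the provider's security-group rules instead):

```shell
# On the remote VM: start LangSmith without the tunnel
langsmith start --openai-api-key=<key>   # key placeholder; only used for NL search

# Allow the API port plus the standard web ports
# (ufw shown as one example firewall frontend)
sudo ufw allow 1984/tcp
sudo ufw allow 80/tcp
sudo ufw allow 443/tcp
```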

The OpenAI key is currently only used for one feature, the experimental natural-language search, so it is likely a red herring for the loading bug you're experiencing. The Azure OpenAI key won't work for that feature, however.

mrcmoresi commented 1 year ago

Hi @hinthornw, thanks for your answer. I'm currently running langsmith on a remote VM and the chatbot app on a different one. I will redeploy langsmith without the --expose flag, make sure traffic is allowed on port 1984, and come back with the results.

Update (NOTE: this is running on a cloud compute instance): I updated langsmith to 0.0.25 and deployed it again, this time without the --expose flag. I made sure traffic is allowed on port 1984, and I can access the API URL and make requests against it directly.

I'm still getting the same error: the frontend can't access the API.

I also ran it locally on my laptop, and there it looks like it's working. I would need it to run directly in the cloud, though; do you think that's possible?

hinthornw commented 1 year ago

Hi @mrcmoresi, definitely possible! I'll try to debug further; sorry I haven't found time to address this adequately. Thank you for your patience!

filipecasal commented 12 months ago

@hinthornw I think the problem is that the backend URL is hardcoded to 127.0.0.1, so if the backend is running on a different machine than the UI, the browser will try to reach 127.0.0.1 instead of the server's actual IP.

Maybe the backend IP used by the UI could be defined via an environment variable?

hinthornw commented 7 months ago

Going to close this issue, as the self-hosted offering has evolved a lot since then.