sabyasm opened this issue 1 year ago
What do you mean?
Only the OpenAI option is available during setup.
@sabyasm Have you tried logging in with a Cursor account first?
So I was able to see the Azure option after logging in. I set up my Azure details, but it still appears to attempt to reach out to OpenAI. I see in the chat interface there is an option between 3.5 and 4, but I never want to use anything but my Azure endpoint, and the deployment is specified. Any additional tips on debugging this? Also, once I log in to get to that window, can I log out to be sure my data never leaves my workstation or Azure?
@bioshazard This is my setup and it's working great! Except in New AI Project, which is a known bug atm.
@thejackwu, do I need to use the deployment name "cursor-ai", or do you just happen to use it? I have mine set up just like yours with a different deployment name, and "Using Key" succeeds green. What do you mean by New AI Project? I just want to use Chat after opening a folder. Thank you for the engagement! I wish it had more verbose logging. Also, are you on macOS?
@bioshazard Yeah, you don't have to use my deployment name. If you can see the green button, that should mean your Azure API is functional. To prevent the app from using OpenAI, I removed my OpenAI key from my settings once and for all, so it has to use the Azure one. I can also confirm from my Azure dashboard that it's calling the API.
By New AI Project I mean this button and screen.
Maybe you can pull up the Azure metrics and see if Cursor is calling it?
@thejackwu appreciate the support here. Will check Azure accordingly. I am also relaying through a localhost socket relay so maybe that is tripping things up. Seems to auth green just fine, but will inspect a bit more thanks. I have no OpenAI key anywhere.
EDIT: Not seeing anything in Azure. Gonna see if I can redirect the URL to some intermediate host to detect if it's attempting at all. If I was running on Linux I could strace it lmao, but macOS is a bit tougher.
EDIT2: Unable to troubleshoot further. It appears to just fail silently, and I'm not sure how to inspect what it is attempting on macOS. Will watch this and the other Azure issues, but I plan to give up otherwise. Thanks again for the support.
So I wrote an HTTP relay to middle-man the calls to Azure and was able to see the test prompt payload and even get the Azure config to go green. I can confirm now that it does not even attempt to reach out to that same endpoint again when I close the window and try to use the chat feature. Is there a way to turn on debug logging or something to better troubleshoot this issue? Gonna try a fresh install in case I cornered myself with an anti-golden path during installation. I think a new Cursor version came out, though, because now the Chat just kinda fails silently: no more "Can't reach OpenAI", it just seems to pretend I never submitted a chat message after waiting a sec. Sorry if I took over this thread inappropriately; let me know if my direction is out of scope and I can open a fresh one.
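For anyone who wants to try the same debugging trick, a relay like this needs only the Python stdlib. A minimal sketch (the ports and upstream URL below are placeholders, not the actual relay described here; point `UPSTREAM` at your Azure base URL):

```python
import http.server
import urllib.request

# Placeholder upstream: point this at your real Azure OpenAI base URL.
UPSTREAM = "http://127.0.0.1:9109"

class RelayHandler(http.server.BaseHTTPRequestHandler):
    """Prints each POST body, forwards it to UPSTREAM, and relays the reply."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        print("relay saw payload:", body.decode("utf-8", "replace"))
        req = urllib.request.Request(
            UPSTREAM + self.path,
            data=body,
            headers={"Content-Type": self.headers.get("Content-Type", "application/json")},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            data = resp.read()
            status = resp.status
        self.send_response(status)
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def log_message(self, *args):
        pass  # keep stdout limited to the payload lines printed above

# To run:  http.server.HTTPServer(("127.0.0.1", 9108), RelayHandler).serve_forever()
```

Then point Cursor's Azure base URL at `http://127.0.0.1:9108` and watch stdout for payloads.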
EDIT: Looks like deleting it from the Applications folder is insufficient to clear out all configs. Any tips on how to completely uninstall would also be appreciated. I think at this point, unless I find the lingering config and it works on re-install, I must truly give up here until I get more debug logging or something.
EDIT2: I found it at ~/Library/Application\ Support/Cursor/ and deleted it to get a truly fresh install, but even starting fresh -> login -> set Azure -> "what is NPM" in chat does not seem to even attempt to reach the same endpoint I was able to validate in the Azure settings. Even if this is some corporate limitation on my workstation, I should still be able to see some debug message about the nature of the failure.
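For what it's worth, the Azure deployment can be exercised directly, outside Cursor, to confirm the endpoint itself works. A minimal stdlib sketch (the resource name, deployment name, api-version, and key below are placeholders, not values from this thread):

```python
import json
import urllib.request

def build_azure_chat_request(base_url, deployment, api_version, api_key, messages):
    """Builds a chat-completions request for an Azure OpenAI deployment.

    Azure routes by deployment name and authenticates with an `api-key`
    header, unlike openai.com's `Authorization: Bearer` header.
    """
    url = (f"{base_url}/openai/deployments/{deployment}"
           f"/chat/completions?api-version={api_version}")
    body = json.dumps({"messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json", "api-key": api_key},
        method="POST",
    )

# Placeholder values: substitute your own resource, deployment, and key.
req = build_azure_chat_request(
    "https://my-resource.openai.azure.com",
    "my-deployment",
    "2023-05-15",
    "my-azure-key",
    [{"role": "user", "content": "what is NPM"}],
)
# urllib.request.urlopen(req) would send it; left commented to avoid a live call.
```

If this call succeeds from the same workstation where Cursor fails silently, the problem is in the app rather than the endpoint or a corporate network block.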
Hi @bioshazard @kamushadenes @thejackwu @sabyasm, I believe we can help with this issue. I'm the maintainer of LiteLLM: https://github.com/BerriAI/litellm
TLDR:
We allow you to use any LLM as a drop-in replacement for gpt-3.5-turbo.
If you don't have access to the LLM, you can use the LiteLLM proxy to make requests to it.
You can use LiteLLM in the following ways:
This calls the provider API directly:

```python
from litellm import completion
import os

## set ENV variables
os.environ["OPENAI_API_KEY"] = "your-key"
os.environ["COHERE_API_KEY"] = "your-key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)
```
This is great if you don't have access to Claude but want to use the open-source LiteLLM proxy to access it:

```python
from litellm import completion
import os

## set ENV variables
os.environ["OPENAI_API_KEY"] = "sk-litellm-5b46387675a944d2"  # [OPTIONAL] replace with your openai key
os.environ["COHERE_API_KEY"] = "sk-litellm-5b46387675a944d2"  # [OPTIONAL] replace with your cohere key

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)
```
Is there a way to set custom headers for Azure OpenAI in Cursor's model settings? Headers like Ocp-Apim-Subscription-Key, etc., for Azure OpenAI custom model deployments.
The current process supports an OpenAI API key, whereas Azure OpenAI keys aren't currently supported. Can you please incorporate this?
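For context, deployments fronted by Azure API Management typically expect the subscription key in an Ocp-Apim-Subscription-Key header instead of (or in addition to) the plain api-key header. A minimal sketch of building such a request (the gateway URL and key are hypothetical placeholders):

```python
import json
import urllib.request

def build_apim_request(gateway_url, subscription_key, messages, extra_headers=None):
    """Builds a chat request for an Azure OpenAI deployment fronted by
    Azure API Management, which authenticates via Ocp-Apim-Subscription-Key.

    `gateway_url` is a placeholder for the full APIM route to the
    chat-completions operation.
    """
    headers = {
        "Content-Type": "application/json",
        "Ocp-Apim-Subscription-Key": subscription_key,
    }
    headers.update(extra_headers or {})
    body = json.dumps({"messages": messages}).encode("utf-8")
    return urllib.request.Request(gateway_url, data=body, headers=headers, method="POST")

# Hypothetical gateway URL and key, for illustration only.
req = build_apim_request(
    "https://my-apim-gateway.azure-api.net/my-api/chat/completions?api-version=2023-05-15",
    "my-subscription-key",
    [{"role": "user", "content": "Hello"}],
)
```

This is the kind of request Cursor would need to be able to emit for APIM-fronted deployments, which is why a custom-headers field in the model settings would help.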