Open iamnvt opened 2 months ago
+1 for this ^
it can be as simple as let cursor respect 'OPENAI_API_BASE_URL' env
But it will make you lose access to OpenAI, and you also won't have codebase-aware coding.
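The suggestion above amounts to: read the base URL from an environment variable instead of hard-coding api.openai.com. A minimal sketch of how a client could honor such a variable (the variable name comes from the comment above; the default and helper are illustrative, not Cursor's internals):

```python
import os

def resolve_base_url(default="https://api.openai.com/v1"):
    """Return the API base URL, letting an env var override the default."""
    # 'OPENAI_API_BASE_URL' is the variable name proposed in this thread.
    return os.environ.get("OPENAI_API_BASE_URL", default).rstrip("/")

# With the override set, all requests would go to Groq's OpenAI-compatible endpoint.
os.environ["OPENAI_API_BASE_URL"] = "https://api.groq.com/openai/v1"
print(resolve_base_url())
```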
This would be awesome. Has anyone figured out a workaround for now? It would be a game changer, probably 10x my dev speed.
Yes! Just override the OpenAI base URL to Groq's: https://api.groq.com/openai/v1
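For anyone wiring this up outside the editor, the same override works with any OpenAI-compatible client: point the base URL at Groq and pass the Groq key as the bearer token. A minimal sketch using only the standard library (the model name and env-var handling are illustrative, not Cursor's internals):

```python
import json
import os
import urllib.request

def build_chat_request(prompt, model="llama3-70b-8192",
                       base_url="https://api.groq.com/openai/v1"):
    """Build an OpenAI-style chat completion request aimed at Groq."""
    payload = {"model": model,
               "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # Groq keys are passed exactly like OpenAI keys: a bearer token.
            "Authorization": f"Bearer {os.environ.get('GROQ_API_KEY', '')}",
        },
        method="POST",
    )

req = build_chat_request("Say hello")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` (with a real `GROQ_API_KEY` set) returns the familiar OpenAI-shaped chat completion JSON.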
@yangcheng Thank you! For some reason the speed seems about the same as the current GPT-4 in Cursor. Is this your experience?
For me the chat is noticeably faster. Command+K is about the same; maybe the bottleneck is somewhere else. Maybe you can try the 8B models to be sure?
It's about the same for small models on Groq :/
Today I made llm-router, enabling the use of ⌘K or ⌘L, followed by ⌘/, to toggle between OpenAI and Groq. If there's interest, I can open source it and cut a release.
@yangcheng Thanks! But how do we pass our API key? https://console.groq.com/docs/api-keys says:
API keys are required for accessing the APIs.
@spikecodes Set the base URL and API key to Groq. However, you will lose access to OpenAI models until you remove the base URL and replace your Groq key with your OpenAI key.
The llm-router option I mentioned above allows you to provide the Groq API key via an environment variable, enabling you to use OpenAI models and others, like Ollama, simultaneously.
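llm-router itself isn't shown in this thread, but the routing idea it describes — keep one OpenAI-compatible endpoint and fan out to different backends by model name — can be sketched roughly like this (the prefixes, URLs, and env-var names are assumptions for illustration, not llm-router's actual configuration):

```python
import os

# Hypothetical backend table: model-name prefix -> (base URL, key env var).
BACKENDS = {
    "groq/":   ("https://api.groq.com/openai/v1", "GROQ_API_KEY"),
    "ollama/": ("http://localhost:11434/v1",      "OLLAMA_API_KEY"),
}
DEFAULT = ("https://api.openai.com/v1", "OPENAI_API_KEY")

def route(model):
    """Pick an upstream for a request based on the model-name prefix."""
    for prefix, (base_url, key_env) in BACKENDS.items():
        if model.startswith(prefix):
            # Strip the routing prefix before forwarding upstream.
            return model[len(prefix):], base_url, os.environ.get(key_env, "")
    return model, DEFAULT[0], os.environ.get(DEFAULT[1], "")

print(route("groq/llama3-8b-8192")[:2])
```

The point of the design is that the editor only ever sees one base URL (the router's), so OpenAI, Groq, and local models stay available at the same time.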
Thanks @kcolemangt! I didn't realize I could pass the Groq API key the same way as the OpenAI API key.
Regarding llm-router, I don't think it would help me much as I don't really need to switch back and forth between models. Just want a good fast option.
Does anyone know why Cursor is still slow even when using Groq? I would imagine completion should be noticeably faster if the LLM part is done in 1/8th of the time. Is this a limitation of extensions built with VSCodium? Increasing the speed of Cursor seems like the next important milestone to make this the best AI editor. @truell20 @Sanger2000, do you know who would be the best person to ask about this?
Saw your tweet, very cool! How can we use it?
Yes please, god I need this.
This is the game changer for speed with Llama 3.