getcursor / cursor

The AI-powered code editor
https://cursor.sh

Add groq API #1407

Open iamnvt opened 2 months ago

iamnvt commented 2 months ago

This would be a game changer for speed with Llama 3.

spikecodes commented 2 months ago

+1 for this ^

yangcheng commented 2 months ago

it can be as simple as let cursor respect 'OPENAI_API_BASE_URL' env
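A minimal sketch of the suggested behavior: use `OPENAI_API_BASE_URL` when set, otherwise fall back to OpenAI's default endpoint. The env var name is the one proposed in this comment, not something Cursor currently reads.

```python
import os

# Sketch: respect OPENAI_API_BASE_URL if set, else use OpenAI's default.
# The variable name here is the one proposed in this thread (assumption),
# not an option Cursor actually supports today.
def resolve_base_url() -> str:
    return os.environ.get("OPENAI_API_BASE_URL", "https://api.openai.com/v1")
```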

iamnvt commented 2 months ago

But it will make you lose access to OpenAI, and you also won't have codebase-aware coding.



CalamariDude commented 2 months ago

This would be awesome. Has anyone figured out a workaround for now? It would be a game changer and would probably 10x my dev speed.

yangcheng commented 2 months ago

> This would be awesome. Has anyone figured out a workaround for now? It would be a game changer and would probably 10x my dev speed.

Yes! Just override the OpenAI base URL to Groq's: https://api.groq.com/openai/v1

(screenshot attached, 2024-04-24)
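To make concrete what "overriding the base URL" amounts to: Groq exposes an OpenAI-compatible API, so an OpenAI-style chat request simply goes to Groq's endpoint with a Groq key instead. A minimal sketch (the model name and `GROQ_API_KEY` env var are illustrative; this only builds the request, it doesn't send it):

```python
import json
import os

# Groq's OpenAI-compatible endpoint, as given in the comment above.
GROQ_BASE_URL = "https://api.groq.com/openai/v1"

def build_chat_request(model: str, prompt: str) -> tuple[str, dict, bytes]:
    """Build the URL, headers, and JSON body for an OpenAI-style chat completion."""
    url = f"{GROQ_BASE_URL}/chat/completions"
    headers = {
        # GROQ_API_KEY is an assumed env var holding your Groq key.
        "Authorization": f"Bearer {os.environ.get('GROQ_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, headers, body

url, headers, body = build_chat_request("llama3-70b-8192", "hello")
```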

CalamariDude commented 2 months ago

@yangcheng Thank you! For some reason the speed seems around the same as the current GPT-4 in Cursor. Is this your experience?

yangcheng commented 2 months ago

For me, chat is noticeably faster. Command+K is about the same; maybe the bottleneck is somewhere else. Maybe you can try the 8B models to be sure?

CalamariDude commented 2 months ago

> For me, chat is noticeably faster. Command+K is about the same; maybe the bottleneck is somewhere else. Maybe you can try the 8B models to be sure?

It's about the same for small models on Groq :/

kcolemangt commented 2 months ago

Today I made llm-router, enabling the use of ⌘K or ⌘L, followed by ⌘/, to toggle between OpenAI and Groq. If there's interest, I can open source it and cut a release.

spikecodes commented 2 months ago

@yangcheng Thanks! But how do we pass our API key? https://console.groq.com/docs/api-keys says:

> API keys are required for accessing the APIs.

kcolemangt commented 2 months ago

> @yangcheng Thanks! But how do we pass our API key? https://console.groq.com/docs/api-keys says:
>
> > API keys are required for accessing the APIs.

@spikecodes Set the base URL and API key to Groq. However, you will lose access to OpenAI models until you remove the base URL and replace your Groq key with your OpenAI key.

The llm-router option I mentioned above allows you to provide the Groq API key via an environment variable, enabling you to use OpenAI models and others, like OLLAMA, simultaneously.
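The idea behind routing by environment variable can be sketched as a small dispatch table keyed by a model-name prefix. To be clear, this is only an illustration of the concept, not llm-router's actual implementation; the backend URLs, env var names, and prefix convention are all assumptions:

```python
import os

# Hypothetical routing table: backend base URL plus the env var holding its
# key, selected by a model-name prefix like "groq/..." (illustrative only).
BACKENDS = {
    "groq": ("https://api.groq.com/openai/v1", "GROQ_API_KEY"),
    "openai": ("https://api.openai.com/v1", "OPENAI_API_KEY"),
    "ollama": ("http://localhost:11434/v1", "OLLAMA_API_KEY"),
}

def route(model: str) -> tuple[str, str]:
    """Return (base_url, api_key) for a model like 'groq/llama3-70b-8192'.

    Unprefixed models (e.g. 'gpt-4') fall through to the OpenAI backend.
    """
    prefix, _, _rest = model.partition("/")
    base_url, key_env = BACKENDS.get(prefix, BACKENDS["openai"])
    return base_url, os.environ.get(key_env, "")

base_url, key = route("groq/llama3-70b-8192")
```

A router running locally would then forward each request to the chosen base URL with the matching key, which is what lets OpenAI, Groq, and Ollama keys coexist.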

spikecodes commented 2 months ago

Thanks @kcolemangt! I didn't realize I could pass the Groq API key the same way as the OpenAI API key.

Regarding llm-router, I don't think it would help me much, as I don't really need to switch back and forth between models. I just want a good, fast option.

CalamariDude commented 2 months ago

Does anyone know why Cursor is still slow even when using Groq? I would imagine completion should be noticeably faster if the LLM part is done in 1/8th of the time. Is this a limitation of extensions built with VSCodium? Increasing the speed of Cursor seems like the next important milestone on the way to becoming the best AI editor. @truell20 @Sanger2000, do you know who would be the best person to ask about this?

yangcheng commented 2 months ago

> @spikecodes Set the base URL and API key to Groq. However, you will lose access to OpenAI models until you remove the base URL and replace your Groq key with your OpenAI key.
>
> The llm-router option I mentioned above allows you to provide the Groq API key via an environment variable, enabling you to use OpenAI models and others, like OLLAMA, simultaneously.

Saw your tweet, very cool! How can we use it?

krishnapraveen7 commented 1 month ago

> Today I made llm-router, enabling the use of ⌘K or ⌘L, followed by ⌘/, to toggle between OpenAI and Groq. If there's interest, I can open source it and cut a release.

Yes please, I really need this.

kcolemangt commented 1 month ago

https://github.com/kcolemangt/llm-router