getcursor / cursor

The AI Code Editor
https://cursor.com

Add support for Amazon Bedrock from AWS #1249

Open · dgallitelli opened this issue 6 months ago

dgallitelli commented 6 months ago

Is your feature request related to a problem? Please describe.
Currently, only OpenAI models are supported by Cursor. I would like to use AWS models from Amazon Bedrock, such as Amazon Titan, Anthropic Claude, or Meta Llama 2.

Describe the solution you'd like
In the Cursor Settings, I would like to be able to connect to my AWS account and configure my Bedrock model.

Additional context

Why only OpenAI? :(

ibrahimkettaneh commented 6 months ago

I would like to boost this. I would greatly appreciate this feature.

forkfork commented 5 months ago

This would help me enormously - specifically Claude 3 Sonnet via Bedrock (and later Opus).

stirredo commented 4 months ago

+1

Please consider Bedrock support now that Opus is supported. It would help people who use company AWS infrastructure for everything.

arvehisa commented 4 months ago

+1 Would very much appreciate it if Cursor could support Claude 3 via Amazon Bedrock.

josegtmonteiro commented 3 months ago

+1

yeralin commented 3 months ago

+1 This should be fairly straightforward since the Anthropic API is already integrated.

Sugi275 commented 2 months ago

+1 As shown at the following URL, the Anthropic SDK can call Claude models through Amazon Bedrock. https://docs.anthropic.com/en/api/claude-on-amazon-bedrock

from anthropic import AnthropicBedrock

client = AnthropicBedrock(
    # Authenticate by either providing the keys below or use the default AWS credential providers, such as
    # using ~/.aws/credentials or the "AWS_SECRET_ACCESS_KEY" and "AWS_ACCESS_KEY_ID" environment variables.
    aws_access_key="<access key>",
    aws_secret_key="<secret key>",
    # Temporary credentials can be used with aws_session_token.
    # Read more at https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_temp.html.
    aws_session_token="<session_token>",
    # aws_region changes the aws region to which the request is made. By default, we read AWS_REGION,
    # and if that's not present, we default to us-east-1. Note that we do not read ~/.aws/config for the region.
    aws_region="us-west-2",
)

message = client.messages.create(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello, world"}]
)
print(message.content)

I would greatly appreciate it if you could support calling Claude models using AWS IAM keys.

wilsonhou commented 1 month ago

+1 to this! would really appreciate bedrock support.

wooferclaw commented 1 month ago

+1 could be really useful to have this!

tonyrusignuolo commented 1 month ago

+1 would be useful for employees since they are only allowed to share code artifacts with Bedrock's internal models.

yeralin commented 1 month ago

@tonyrusignuolo

> +1 would be useful for employees since they are only allowed to share code artifacts with Bedrock's internal models.

I found https://continue.dev a good replacement for Cursor that works with Bedrock out of the box.

gwailoTr0n5000 commented 1 month ago

Disappointing that this still isn't implemented. May have to test out continue.dev in that case.

KaliCharan-V commented 1 month ago

Much required with Sonnet gaining popularity

metaskills commented 1 month ago

So GitHub Models came out yesterday. This is their inference platform similar to Amazon Bedrock. Google has Model Garden. Please support all of these. But yes, yes, yes please do add support for Amazon Bedrock first ;)

vinodvarma24 commented 3 weeks ago

Much needed

refactorthis commented 3 weeks ago

I would also like to bump this. Working in a corporate environment where everything must go through our AWS LZ. This would open up Cursor for corporate scenarios where strict backend control is required.

jubinpyli commented 2 weeks ago

+1

binarycrayon commented 2 weeks ago

subscribed +1

l4time commented 2 weeks ago

+1

As a workaround I suggest using this repository to work as a proxy: https://github.com/aws-samples/bedrock-access-gateway

Then, in Cursor, override the OpenAI base URL and set the custom API key.
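The proxy in question exposes an OpenAI-compatible REST API in front of Bedrock, so the override amounts to pointing standard OpenAI-style requests at the gateway. A minimal sketch of such a request, assuming a hypothetical gateway URL and API key (both are placeholders, not real endpoints):

```python
import json
import urllib.request

# Placeholders - substitute the values from your own gateway deployment.
GATEWAY_BASE_URL = "https://your-gateway.example.com/api/v1"
GATEWAY_API_KEY = "your-gateway-api-key"

# The gateway accepts OpenAI-style chat payloads and forwards them to Bedrock.
payload = {
    "model": "anthropic.claude-3-5-sonnet-20240620-v1:0",
    "messages": [{"role": "user", "content": "Hello, world"}],
}
req = urllib.request.Request(
    f"{GATEWAY_BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {GATEWAY_API_KEY}",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) would send the request; it is skipped here
# because the endpoint above is a placeholder.
print(req.full_url)
```

Cursor's OpenAI base URL override issues essentially this shape of request, which is why the gateway approach can work without Bedrock-specific support in Cursor itself.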

kingstarfly commented 2 weeks ago

+1

> As a workaround I suggest using this repository to work as a proxy: aws-samples/bedrock-access-gateway
>
> Then in cursor override the OpenAI base URL and set the custom API key.

@l4time I tried following your workaround and I've verified that the proxy is working.

However, I wasn't able to use Cursor's chat function - it says: "Seems like we are having an issue with your API key - please confirm the base URL is publicly accessible and the API key is correct in the settings. If this persists, please email us at hi@cursor.sh."

I wonder if you figured out a way around this?

adityapapu commented 2 weeks ago

+1

> I would also like to bump this. Working in a corporate environment where everything must go through our AWS LZ. This would open up Cursor for corporate scenarios where strict backend control is required.

Thandden commented 2 weeks ago

+1

dzhou1221 commented 2 weeks ago

+1

Eternaux commented 2 weeks ago

+1

martin-nginio commented 2 weeks ago

+1

ron137 commented 1 week ago

+1

orangewise commented 1 week ago

+1

JurgenLangbroek commented 1 week ago

+1

davidshtian commented 1 week ago

+1

juan-abia commented 1 week ago

Really looking forward to this feature. It would be huge.

kevin-longe-unmind commented 1 week ago

Is cursor going to add this?

hassanxelamin commented 1 week ago

waiting very patiently......

Hitesh-Sisara commented 1 week ago

looking for this feature as well

drankard commented 6 days ago

+1

MArsalanJaved commented 6 days ago

+1

interskh commented 5 days ago

+1

andrewrreed commented 4 days ago

+1

vdpappu commented 4 days ago

+1

mosschief commented 2 days ago

+1

dshvadskiy commented 10 hours ago

+1