getcursor / cursor: The AI Code Editor (https://cursor.com)

Implement Local AI Integration with Ollama for Offline AI Assistance #1811

Open · ThalesAugusto0 opened 1 month ago

ThalesAugusto0 commented 1 month ago

Description: We need to enhance Cursor IDE by implementing support for local AI models using Ollama, similar to the Continue extension for VS Code. This will enable developers to use AI-powered code assistance offline, ensuring privacy and reducing dependency on external APIs.

1. Ollama Integration:

Add options in Cursor’s settings to configure and manage local AI models, including the ability to switch between different AI providers, such as Ollama and any cloud-based alternatives. Implement a configuration UI that allows users to easily select and manage their local AI setups (a rough sketch of such a provider switch follows after this list).

2. Performance and Usability:

Optimize the interaction between Cursor and the local AI models to minimize latency and resource usage. Ensure that the local AI features are as seamless and user-friendly as their cloud-based counterparts, with clear feedback on model performance and any potential issues.
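To make the request concrete, here is a minimal sketch of what such a provider switch could look like, written against Ollama's OpenAI-compatible endpoint (default port 11434; the `apiKey` is a placeholder the local server ignores but the client library requires). The type and function names are illustrative, not Cursor's actual internals:

```ts
import OpenAI from "openai";

// Illustrative config shape; Cursor's real settings schema is not public.
type ProviderConfig = {
  name: "ollama" | "openai";
  baseURL: string;
  apiKey: string;
  model: string;
};

// Ollama serves an OpenAI-compatible API under /v1 on its default port.
const ollama: ProviderConfig = {
  name: "ollama",
  baseURL: "http://localhost:11434/v1",
  apiKey: "ollama", // unused by Ollama, but the client requires a value
  model: "llama3.1:8b", // substitute whatever model you have pulled
};

// Because both providers speak the same wire protocol, one code path serves both.
async function complete(cfg: ProviderConfig, prompt: string): Promise<string> {
  const client = new OpenAI({ baseURL: cfg.baseURL, apiKey: cfg.apiKey });
  const res = await client.chat.completions.create({
    model: cfg.model,
    messages: [{ role: "user", content: prompt }],
  });
  return res.choices[0]?.message?.content ?? "";
}
```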

Air1996 commented 1 month ago

Considering the size and effectiveness of the local model and the commercialization of the Cursor product, the likelihood of this proposal coming to fruition is quite small. 😂

ThalesAugusto0 commented 1 month ago

> Considering the size and effectiveness of the local model and the commercialization of the Cursor product, the likelihood of this proposal coming to fruition is quite small. 😂

The company doesn't need to do this. Since the code is open, why can't we, the development community, do it ourselves?

vertis commented 1 month ago

Cursor is not open source. This is an issues-only repo.

tcsenpai commented 1 month ago

Bumping this anyway. The company can still monetize off the thousands of devs who don't have a powerful GPU.

Mateleo commented 1 month ago

Check this: https://github.com/getcursor/cursor/issues/1380#issuecomment-2371534354

sneedger commented 1 month ago

> Check this: #1380 (comment)

The devs broke that as well (likely on purpose); they're in it for the money and don't care about you and me.

Mateleo commented 1 month ago

> > Check this: #1380 (comment)
>
> The devs broke that as well (likely on purpose); they're in it for the money and don't care about you and me.

For me it's working perfectly fine using Ollama + ngrok. I'm on the latest version of Cursor.
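For anyone trying to reproduce that setup, the usual sketch is to tunnel Ollama's default port through ngrok and point the OpenAI base-URL override in Cursor's model settings at the https forwarding URL (typically with `/v1` appended). The `--host-header` rewrite is the commonly cited fix for Ollama rejecting tunneled requests:

```sh
# Rewrite the Host header so tunneled requests look local to Ollama
ngrok http 11434 --host-header="localhost:11434"

# Alternatively, tell Ollama to accept requests from any origin before starting it
OLLAMA_ORIGINS="*" ollama serve
```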

tcsenpai commented 1 month ago

> > Check this: #1380 (comment)
>
> The devs broke that as well (likely on purpose); they're in it for the money and don't care about you and me.

I doubt the devs would do something as easily visible as breaking support specifically for Ollama (we're using an OpenAI-compatible endpoint here anyway, so it's pretty generic). In any case, the loss of quality from using 8B models this way isn't worth saving 20 bucks per month. They're not in danger.
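For reference, the "generic OpenAI endpoint" point can be checked directly: Ollama exposes an OpenAI-compatible `/v1/chat/completions` route, so a plain HTTP call works without any SDK (default port assumed; the model name is just an example):

```ts
// Sanity-check Ollama's OpenAI-compatible endpoint with a plain fetch (Node 18+).
const res = await fetch("http://localhost:11434/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3.1:8b", // substitute a model you have pulled
    messages: [{ role: "user", content: "Say hello" }],
  }),
});
const data = await res.json();
console.log(data.choices[0].message.content);
```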

sneedger commented 1 month ago

> > > Check this: #1380 (comment)
> >
> > The devs broke that as well (likely on purpose); they're in it for the money and don't care about you and me.
>
> For me it's working perfectly fine using Ollama + ngrok. I'm on the latest version of Cursor.

I applied your workaround properly, but I keep getting a 403 error from ngrok like many other people. Do I need to forward some port, or is something else required?