Closed: csreekrishna closed this issue 1 month ago
Bumping this. This is a great extension for VS Code, and it would be great if it were fully local (embeddings + chat)!
@kevin.chen up to you to decide if it should be prioritised now or later
Prioritize for later until we understand the retention benefit of embeddings
The primary advantage of this extension lies in data security. Many people are hesitant to send patented ideas or trade secrets to OpenAI. Running everything locally would ensure that sensitive information never leaves the machine.
This issue is marked as stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed automatically in 5 days.
Version
1.18
Areas for Improvement
What needs to be improved? Please describe how this affects the user experience and include a screenshot.
Ollama now supports embedding models. If Cody can utilize these Ollama embeddings, the context for all queries will be entirely local.
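As a rough sketch of what fully local context retrieval could look like: embed both the code snippets and the query through a local Ollama server, then rank by cosine similarity on-device. The endpoint and payload below follow Ollama's documented `/api/embeddings` route; the model name `nomic-embed-text` is just one example of an embedding model, and this is not Cody's actual integration.

```python
import json
import math
import urllib.request

def ollama_embed(text: str, model: str = "nomic-embed-text") -> list[float]:
    """Request an embedding from a local Ollama server.

    Uses Ollama's /api/embeddings route; no data leaves localhost.
    """
    payload = json.dumps({"model": model, "prompt": text}).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/embeddings",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Score two embedding vectors; used to rank snippets against a query."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0
```

With something like this, the repository and the query would both be embedded locally and context chosen by similarity, with no text sent to OpenAI.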
Describe the solution you'd like to see
Update Cody to enable local embeddings with Ollama. This will make my data safer and eliminate the need to use OpenAI.
Describe any alternatives that could be considered
No response
Additional context
No response