
Sourcegraph: Cody + Code Search

Use Cody, the AI coding assistant, plus Code Search directly from your JetBrains IDE.

Cody Features

Autocomplete: Cody writes code for you

Cody autocompletes single lines or whole functions in any programming language, configuration file, or documentation. It’s powered by the latest instant LLMs for accuracy and performance.

Example of using code autocomplete

Chat: Ask Cody about anything in your codebase

Cody can search for context across your entire codebase, not just your open files. It uses advanced code search to retrieve context from both local and remote repositories.

For example, you can ask Cody:

Example of chatting with Cody

Built-in commands

Cody has quick commands for common actions. Select the Commands tab, or right-click on a selection of code and choose one of the Ask Cody > ... commands.

We also welcome pull request contributions for new, useful commands!

Swappable LLMs

Cody supports multiple LLMs, including Anthropic Claude 3, OpenAI GPT-4o, and Mixtral models, with more coming soon. Cody Pro users can swap chat models on demand.

Usage

This plugin works for all Cody plans, including Cody Free, Cody Pro, and Cody Enterprise.

You can find detailed information about Cody's available plans on our website.

Programming language support

Cody works with any programming language because it uses LLMs trained on a broad range of code. It works especially well with Python, Go, JavaScript, and TypeScript.

Code search

Cody is powered by Sourcegraph’s code search and uses the context of your codebase to extend its capabilities. With context from your chosen repositories, Cody gives more accurate answers and generates more idiomatic code.

Cody Enterprise

Cody Enterprise can retrieve context from your entire remote codebase using code search. This allows Cody to understand and answer questions about any of your code, even the repositories that don't live on your local machine.

Contact us to set up a trial of Cody Enterprise. If you’re an existing Sourcegraph Enterprise customer, contact your technical advisor.


License

Cody's code is open source (Apache License 2.0).

Supported IDEs

The plugin works with all JetBrains IDEs. Versions 2022 and later are recommended.

Exception: due to a Java bug, search doesn't work in IDE versions 2021.1 and 2021.2 on Apple Silicon CPUs.

Installation

Install the plugin from the JetBrains Marketplace: open Settings > Plugins, select the Marketplace tab, and search for "Sourcegraph". Restart the IDE if prompted.

Settings

List of in-app settings and how to use them

Git remote setting

By default, the plugin uses the Git remote named origin to determine which repository on Sourcegraph corresponds to your local repository. If your origin remote doesn't match the repository on Sourcegraph, you can instead configure a Git remote named sourcegraph, which takes priority when creating Sourcegraph links.
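
For example, you could add such a remote from the command line (the URL below is a placeholder; use your repository's actual clone URL):

```
# Add a remote named "sourcegraph" that the plugin will prefer when building links
git remote add sourcegraph https://github.com/your-org/your-repo.git

# Verify the configured remotes
git remote -v
```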

Setting levels

You can configure the plugin on two levels:

  1. Project level: your default account, default branch name, and remote URL replacements
  2. Application level: all other settings are stored here

System Properties

Autocomplete system properties

Newly introduced features can be disabled via system properties:

  1. Disable formatting of autocomplete elements: cody.autocomplete.enableFormatting=false
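
For example, one way to set this property is through the IDE's custom VM options (Help > Edit Custom VM Options), adding a standard JVM -D flag and restarting the IDE:

```
# Appended to the IDE's custom VM options file
-Dcody.autocomplete.enableFormatting=false
```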

Managing Custom Keymaps

A screenshot of the JetBrains preferences panel inside the Keymap tab

You can configure JetBrains to set custom keymaps for Sourcegraph actions:

  1. Open the JetBrains preferences panel and go to the Keymap page.
  2. Filter by "sourcegraph" to see actions supplied by this plugin.
  3. Select an action to override its key binding and supply the new shortcut.

Use Ollama models for Chat & Commands

Cody offers experimental chat and command support with Ollama running locally:

  1. Install and run Ollama.
  2. Set the OLLAMA_HOST environment variable to 0.0.0.0 (see the example commands after this list).
    1. Refer to the official Ollama docs for how to set environment variables on your platform.
  3. Set the OLLAMA_ORIGINS environment variable.
  4. Restart your Ollama app so the environment variables take effect.
  5. Select a chat model (a model that includes instruct or chat, e.g., codegemma:instruct, llama3:instruct) from the Ollama Library.
  6. Pull the chat model locally (Example: ollama pull codegemma:instruct).
  7. Once the chat model is downloaded successfully, open Cody in your IDE.
  8. Open a new Cody chat.
  9. In the new chat window, you should see the chat model you've pulled in the dropdown list at the top.

Note: You can run ollama list in your terminal to see what Ollama models are currently available on your machine.
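
For reference, the terminal side of the steps above might look like this on Linux or macOS (a sketch; exactly how you set environment variables for the Ollama server depends on your platform and install method, so check the Ollama docs):

```
# Make the Ollama server listen on all interfaces
export OLLAMA_HOST=0.0.0.0

# Allow cross-origin requests; "*" permits all origins (adjust as needed)
export OLLAMA_ORIGINS="*"

# Pull a chat-capable model and confirm it is available locally
ollama pull codegemma:instruct
ollama list
```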

Questions & Feedback

If you have any questions, feedback, or bug reports, we'd appreciate it if you opened an issue on GitHub.

Uninstallation

To uninstall the plugin, open Settings > Plugins, find the Sourcegraph plugin under the Installed tab, and select Uninstall. Restart the IDE if prompted.