
Devoxx Genie


Devoxx Genie is a fully Java-based LLM Code Assistant plugin for IntelliJ IDEA, designed to integrate with local LLM providers such as Ollama, LMStudio, GPT4All, Llama.cpp and Exo, as well as cloud-based LLMs such as OpenAI, Anthropic, Mistral, Groq, Gemini, DeepInfra, DeepSeek, OpenRouter and Azure OpenAI.

We now also support LLM-driven web search with Google and Tavily.

With Claude 3.5 Sonnet, DevoxxGenie isn't just another developer tool... it's a glimpse into the future of software engineering. As we eagerly await Claude 3.5 Opus, one thing is clear: we're witnessing a paradigm shift in AI-Augmented Programming (AAP).

Marketplace

Hands-on with DevoxxGenie

DevoxxGenie Demo

More Video Tutorials:

Blog Posts:

Key Features:

GenieExample

We now also support streaming responses, which you can enable in the Settings page 🚀

https://github.com/devoxx/DevoxxGenieIDEAPlugin/assets/179457/8081d4f2-c5c4-4283-af1d-19061b7ae7bf

Start in 5 Minutes with a Local LLM

Start in 2 Minutes using a Cloud LLM

LLM Settings

In the IntelliJ IDEA settings you can modify the REST endpoints and the LLM parameters. Make sure to press Enter and Apply to save your changes.
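
If the plugin cannot reach your local provider, it helps to verify the REST endpoint outside the IDE first. The following minimal Java sketch (not part of the plugin) assumes Ollama's default base URL, http://localhost:11434, and lists the locally pulled models via Ollama's /api/tags endpoint; swap in whatever URL you configured.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OllamaEndpointCheck {
    public static void main(String[] args) throws Exception {
        // Assumption: Ollama runs on its default port; change this if you use a
        // different endpoint in the DevoxxGenie settings.
        String baseUrl = "http://localhost:11434";

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/api/tags")) // lists locally pulled models
                .GET()
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("HTTP " + response.statusCode());
        System.out.println(response.body());
    }
}
```

A 200 response with a JSON model list means the endpoint the plugin needs is reachable.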

We now also support cloud-based LLMs; you can paste your API keys on the Settings page.

DevoxxGenieSettings

Smart Model Selection and Cost Estimation

The language model dropdown is not just a list anymore; it's your compass for smart model selection.

Models

- See available context window sizes for each cloud model
- View associated costs upfront
- Make data-driven decisions on which model to use for your project
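
To illustrate the kind of decision this enables, here is a hedged Java sketch with hypothetical model names and window sizes (the real values are the ones shown in the dropdown) that picks the smallest model whose context window can hold a given prompt:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ModelSelectionSketch {
    public static void main(String[] args) {
        // Hypothetical models, ordered by ascending context window size (in tokens).
        Map<String, Integer> contextWindows = new LinkedHashMap<>();
        contextWindows.put("small-model", 16_000);
        contextWindows.put("medium-model", 128_000);
        contextWindows.put("large-model", 1_000_000);

        int promptTokens = 70_000; // e.g. a whole project of roughly 70K tokens

        contextWindows.entrySet().stream()
                .filter(e -> e.getValue() >= promptTokens)
                .findFirst()
                .ifPresentOrElse(
                        e -> System.out.println("Smallest fitting model: " + e.getKey()
                                + " (" + e.getValue() + " tokens)"),
                        () -> System.out.println("No model fits; trim the context first."));
    }
}
```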

Add Project to prompt & clipboard

You can now add the full project to your prompt IF your selected cloud LLM has a big enough context window.

AddFull

Calc Cost

Leverage the prompt cost calculator for precise budget management. Get real-time updates on how much of the context window you're using.

AddCalcProject

See the input/output costs and context window per cloud LLM. Eventually we'll also allow you to edit these values.

Cost
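
The arithmetic behind such an estimate is straightforward. The sketch below uses illustrative per-million-token prices and a hypothetical context window size (not the plugin's actual pricing table) to show how token counts translate into a dollar figure and a window percentage:

```java
public class PromptCostSketch {
    public static void main(String[] args) {
        // Illustrative prices in USD per one million tokens; real prices differ per provider and model.
        double inputPricePerMTok = 3.00;
        double outputPricePerMTok = 15.00;

        long inputTokens = 70_000;          // e.g. the full DevoxxGenie project (~70K tokens)
        long expectedOutputTokens = 2_000;  // rough guess for the size of the answer
        int contextWindow = 200_000;        // hypothetical window size of the selected model

        double cost = (inputTokens * inputPricePerMTok
                + expectedOutputTokens * outputPricePerMTok) / 1_000_000.0;

        System.out.printf("Estimated prompt cost: $%.4f%n", cost);
        System.out.printf("Context window used:   %.1f%%%n", 100.0 * inputTokens / contextWindow);
    }
}
```

With these numbers the prompt would cost roughly $0.24 and fill about 35% of the window.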

Handling Massive Projects?

"But wait," you might say, "my project is HUGE!" πŸ˜…

Fear not! We've got options:

  1. Leverage Gemini's Massive Context:

Gemini's colossal 1 million token window isn't just big; it's massive. We're talking about the capacity to digest approximately 30,000 lines of code in a single go. That's enough to swallow most codebases whole, from the tiniest scripts to some decent-sized projects.

But if that's not enough, you have more options...

  2. Smart Filtering:

The new "Copy Project" panel lets you:

- Exclude specific directories
- Filter by file extensions
- Remove JavaDocs to slim down your context

Filter
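
To get a feel for what such filtering amounts to, here is a hedged Java sketch (not the plugin's implementation) that walks a project directory, skips excluded folders, keeps only selected file extensions, and prints a rough token estimate based on an assumed four characters per token:

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Stream;

public class CopyProjectFilterSketch {
    public static void main(String[] args) throws IOException {
        Path projectRoot = Path.of(".");                                  // adjust to your project
        List<String> excludedDirs = List.of("build", ".git", ".idea");    // directories to skip
        List<String> includedExtensions = List.of(".java", ".kt", ".md"); // extensions to keep

        try (Stream<Path> files = Files.walk(projectRoot)) {
            long totalChars = files
                    .filter(Files::isRegularFile)
                    .filter(p -> excludedDirs.stream().noneMatch(
                            dir -> p.toString().contains(File.separator + dir + File.separator)))
                    .filter(p -> includedExtensions.stream().anyMatch(
                            ext -> p.getFileName().toString().endsWith(ext)))
                    .mapToLong(p -> {
                        try {
                            return Files.size(p); // bytes roughly equal characters for source files
                        } catch (IOException e) {
                            return 0L;
                        }
                    })
                    .sum();

            // Very rough heuristic: about 4 characters per token for source code.
            System.out.println("Estimated tokens: " + totalChars / 4);
        }
    }
}
```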

  3. Selective Inclusion:

Right-click to add only the most relevant parts of your project to the context.

RightClick

The Power of Full Context: A Real-World Example

The DevoxxGenie project itself, at about 70K tokens, fits comfortably within most high-end LLM context windows. This allows for incredibly nuanced interactions – we're talking advanced queries and feature requests that leave tools like GitHub Copilot scratching their virtual heads!

Local LLM Cluster with Exo

V0.2.7 also supports Exo, a local LLM cluster for Apple Silicon that lets you run Llama 3.1 8B, 70B and 405B on your own Apple computers.


Installation:

Requirements:

Build

The Gradle IntelliJ Plugin prepares a ZIP archive when running the buildPlugin task.
You'll find it in the build/distributions/ directory.

./gradlew buildPlugin 

Publish plugin

It is recommended to use the publishPlugin task for releasing the plugin.

./gradlew publishPlugin

Usage:

1) Select an LLM provider from the DevoxxGenie panel (right corner).
2) Select some code.
3) Enter a shortcode command (review, explain, generate unit tests) for the selected code, or enter a custom prompt.

Enjoy!