devoxx / DevoxxGenieIDEAPlugin
DevoxxGenie is a plugin for IntelliJ IDEA that uses local LLMs (Ollama, LMStudio, GPT4All, Llama.cpp and Exo) as well as cloud-based LLMs to help review, test, and explain your project code.
https://devoxx.com · MIT License · 156 stars · 24 forks
Issues (sorted by newest)
#257 · Support horizontal scrolling of code examples within prompt · stephanj · open · 1 month ago · 0 comments
#256 · "Shift+Enter" should submit prompt · stephanj · closed · 1 month ago · 0 comments
#255 · Feat #234: Introduce LMStudioChatModel to catch the token usage · stephanj · closed · 1 month ago · 0 comments
#254 · Refactored LMStudioService · stephanj · closed · 1 month ago · 0 comments
#253 · Refactored getModels to reuse isLMStudioRunning method · stephanj · closed · 1 month ago · 0 comments
#252 · Fix #251 - Check if LMStudio is running · stephanj · closed · 1 month ago · 0 comments
#251 · LMStudio check should happen every time LLM provider was changed · stephanj · closed · 1 month ago · 1 comment
#250 · Model Gardens Support - Claude 3.5 Sonnet via Vertex AI or Bedrock · davideconsonni · closed · 1 month ago · 2 comments
#249 · Calculate token cost shows consistent results after switching projects · mydeveloperplanet · closed · 1 month ago · 1 comment
#248 · Calc token cost is not correct anymore for DevoxxGenie code · mydeveloperplanet · closed · 1 week ago · 2 comments
#247 · Let user decide in options if current open file should be used in prompt context · stephanj · open · 1 month ago · 0 comments
#246 · Feat #245 · stephanj · closed · 1 month ago · 0 comments
#245 · Always show execution time · stephanj · closed · 1 month ago · 0 comments
#244 · The 👋 Jan integration is broken because of an upgrade of Jan · stephanj · closed · 3 weeks ago · 1 comment
#243 · [Refactoring] Make ChatResponsePanel more modular · stephanj · open · 1 month ago · 0 comments
#242 · Add LMStudio Model Selection · tobiashochguertel · closed · 1 month ago · 4 comments
#241 · [Feature] Consider using Prism.js for code rendering · stephanj · open · 1 month ago · 3 comments
#240 · Removed unused imports · stephanj · closed · 1 month ago · 0 comments
#239 · Remove unused imports · stephanj · closed · 1 month ago · 0 comments
#238 · Use <CODE_REVIEW></CODE_REVIEW> tags to enclose source code · stephanj · closed · 1 week ago · 1 comment
#237 · [Refactoring] OpenAIChatModelFactory, AnthropicChatModelFactory, .. · stephanj · open · 1 month ago · 0 comments
#236 · [Refactoring] large class ActionButtonsPanel · stephanj · open · 1 month ago · 0 comments
#235 · [Refactoring] large class DevoxxGenieToolWindowContent · stephanj · open · 1 month ago · 0 comments
#234 · Reuse the LLMStudio token usage in response · stephanj · closed · 1 month ago · 2 comments
#233 · Fix #232 Calc correct input tokens when using Ollama + FIX Project c… · stephanj · closed · 1 month ago · 0 comments
#232 · "Add full project to prompt" doesn't include the attached tokens in metric · stephanj · closed · 1 month ago · 1 comment
#231 · Use .gitignore in the "Copy Project to Prompt" · stephanj · closed · 3 weeks ago · 1 comment
#230 · Does Copy Project / Excluded Directories work? · dmcg · closed · 1 month ago · 2 comments
#229 · Feat #228 Show execution time even when no token usage is provided · stephanj · closed · 1 month ago · 0 comments
#228 · Show execution time for any LLM provider · stephanj · closed · 1 month ago · 0 comments
#227 · The plugin text should take into consideration the Appearance Zoom IDE setting · stephanj · closed · 3 weeks ago · 1 comment
#226 · The expandable "Using x References" panel is collapsed by default but still adds height for used files · stephanj · open · 1 month ago · 1 comment
#225 · Support new cloud provider: OpenRouter · stephanj · closed · 4 weeks ago · 1 comment
#224 · Fix #220 · stephanj · closed · 1 month ago · 0 comments
#223 · Feat #221 · stephanj · closed · 1 month ago · 0 comments
#222 · Issue 219 · stephanj · closed · 1 month ago · 0 comments
#221 · Add multiple selected files using right-click · stephanj · closed · 1 month ago · 2 comments
#220 · Fix "Using 0 references" · stephanj · closed · 1 month ago · 0 comments
#219 · Mention how many files when calculating total tokens · stephanj · closed · 1 month ago · 1 comment
#218 · Local LLMs do not return a tokenUsage, no cost can be calculated · mydeveloperplanet · closed · 1 month ago · 0 comments
#217 · Prompting local LLMs throws exception · mydeveloperplanet · closed · 1 month ago · 0 comments
#216 · Incorporate code into the prompt window context based on the selected AST options · stephanj · open · 1 month ago · 0 comments
#215 · Fix context window calc for settings · stephanj · closed · 1 month ago · 0 comments
#214 · Feat #213: Show input/output tokens, cost and execution time in foot… · stephanj · closed · 1 month ago · 0 comments
#213 · Show (optionally) input/output tokens per request · stephanj · closed · 1 month ago · 1 comment
#212 · Fix #211 Class initialization must not depend on services · stephanj · closed · 1 month ago · 0 comments
#211 · Class initialization must not depend on services. Consider using instance of the service on-demand instead. · stephanj · closed · 1 month ago · 0 comments
#210 · Feat #209 Upgraded to LangChain4j v0.33.0 · stephanj · closed · 1 month ago · 0 comments
#209 · Migrate to LangChain4J 0.33.0 · stephanj · closed · 1 month ago · 0 comments
#208 · Support the Beta Anthropic prompt-caching / cache_control feature · stephanj · open · 1 month ago · 2 comments