twinnydotdev / twinny
The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License · 2.3k stars · 126 forks
Issues (newest first)
Number · Title · Author · State · Age · Comments
#212 · Inline suggestion does not work when there is a .hg folder · sebastianelsner · closed · 1 month ago · 1 comment
#211 · Instructions for the configuration on macOS with llama.cpp · a-rbts · closed · 2 months ago · 3 comments
#210 · Twinny for Visual Studio Community 2022 · schwaben-github · closed · 2 months ago · 1 comment
#209 · VSCodium reports that it is not compatible with VS Code '1.81.1' · marcusgreen · closed · 2 months ago · 2 comments
#208 · Something wrong with the new update: no ability to add model name. What could the issue be? · spirobel · closed · 3 months ago · 5 comments
#207 · When using the vscode twinny plugin with remote-ssh for remote development, code suggestions are not working · 0x0101010 · closed · 2 months ago · 3 comments
#206 · development > main · rjmacarthy · closed · 3 months ago · 0 comments
#205 · Adds CodeGemma for FIM Completion · jeffistyping · closed · 3 months ago · 2 comments
#204 · Update README.md for correct model pull · sebastianelsner · closed · 3 months ago · 1 comment
#203 · Add custom providers · rjmacarthy · closed · 3 months ago · 0 comments
#202 · Outputs only "undefined" · Hangover3832 · closed · 2 months ago · 8 comments
#201 · Different shortcuts for single-line or multi-line suggestions · KizzyCode · open · 3 months ago · 2 comments
#200 · Support comments translation · ivaquero · closed · 2 months ago · 5 comments
#198 · Possibly my mistake, but I keep getting this error · russsavagedentsu · closed · 3 months ago · 2 comments
#197 · Ideal setup of parallel chat and FIM models · kirel · closed · 3 months ago · 2 comments
#196 · Generate commit messages from staged changes · rjmacarthy · closed · 3 months ago · 0 comments
#195 · [Feature] JetBrains plugins · SectionTN · closed · 3 months ago · 1 comment
#194 · No robot icon, no completion · jjlee · closed · 3 months ago · 3 comments
#193 · Updates to support fully configurable API, enabled support for LiteLLM · rjmacarthy · closed · 3 months ago · 0 comments
#192 · Cannot chat successfully with Ollama · brunoais · closed · 3 months ago · 2 comments
#191 · Default Ollama `Chat Api Path` points to the wrong URL path · brunoais · closed · 3 months ago · 2 comments
#190 · Enhance Twinny with LiteLLM (and indirectly OpenRouter) support · bvelker · closed · 3 months ago · 4 comments
#189 · et API Bearer Token is not working · huzhibing · closed · 3 months ago · 2 comments
#188 · Inline completion fixes · rjmacarthy · closed · 3 months ago · 0 comments
#186 · Refactoring · rjmacarthy · closed · 3 months ago · 0 comments
#185 · Multiline completion improvements · rjmacarthy · closed · 3 months ago · 0 comments
#184 · Clear completion on abort · rjmacarthy · closed · 3 months ago · 0 comments
#182 · (fix) Remove non-compliant OpenAI API properties from payload · rjmacarthy · closed · 3 months ago · 0 comments
#181 · Small fixes for completion · rjmacarthy · closed · 3 months ago · 0 comments
#180 · Oobabooga vs. Twinny · zaqhack · closed · 3 months ago · 11 comments
#179 · Error parsing JSON: TypeError: Cannot read properties of undefined (reading '0') · DenisBY · closed · 3 months ago · 3 comments
#178 · Automatic multiline completion · rjmacarthy · closed · 3 months ago · 0 comments
#177 · Support for web version of VS Code? · Malte0621 · open · 3 months ago · 4 comments
#176 · Add support for Deci-based models · Quadav · closed · 3 months ago · 1 comment
#175 · FIM requests not being sent/received at Ollama API · lewismacnow · closed · 3 months ago · 1 comment
#174 · Add starcoder2 (and dolphincoder) support to autocomplete (not complete yet) · hafriedlander · closed · 2 months ago · 4 comments
#173 · Fetch error: Error: Server responded with status code: 404 · JianJroh · closed · 3 months ago · 3 comments
#172 · Chat API endpoint incorrect for Ollama in default settings · jonaslund · closed · 4 months ago · 1 comment
#171 · COMPLETION_TIMEOUT = 20000 is causing chat generation to halt early · Yazorp · closed · 4 months ago · 1 comment
#170 · development > main · rjmacarthy · closed · 4 months ago · 0 comments
#169 · Improvement: skip completions cleverly · rjmacarthy · closed · 4 months ago · 0 comments
#168 · Display an indicator when twinny loses connection to the LLM provider · onel · closed · 3 months ago · 4 comments
#167 · development > main · rjmacarthy · closed · 4 months ago · 0 comments
#166 · Twinny stops working after machine goes into standby · onel · closed · 2 months ago · 2 comments
#165 · Improve docs with step-by-step setup · onel · closed · 4 months ago · 1 comment
#164 · Improve docs with step-by-step setup · onel · closed · 4 months ago · 2 comments
#163 · Use two Ollama servers, one for chat and one for FIM, to improve Twinny performance · sthufnagl · closed · 3 months ago · 6 comments
#162 · development > main · rjmacarthy · closed · 4 months ago · 0 comments
#161 · Problem with https/tls configuration · Marcinj21 · closed · 3 months ago · 11 comments
#160 · Fix prefix and suffix percentage in the final prompt · pacman100 · closed · 4 months ago · 0 comments