twinnydotdev / twinny
The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License · 2.29k stars · 125 forks
Issues (newest first)
- #267 Feature request: Customize template to shown in context menu — poisson-sg, opened 20 hours ago, 1 comment
- #266 undefined with deepseek-lite-ggml and llama.cpp — micsthepick, opened 1 day ago, 0 comments
- #265 Update README to use the correct quickstart link. — aqshafei, closed 1 day ago, 0 comments
- #264 Add support for VSCode 1.70.x again please — imClumsyPanda, opened 3 days ago, 3 comments
- #262 Document RAG — h3x4g0ns, opened 2 weeks ago, 1 comment
- #261 Interface lag after a number of messages in chat — bars0um, opened 2 weeks ago, 1 comment
- #260 Feature Request: newline shortcut in the prompt bindable — crimsonduelist, opened 3 weeks ago, 5 comments
- #259 Sidebar shortcut/binding doesn't focus/unfocus properly — crimsonduelist, opened 3 weeks ago, 0 comments
- #258 Feature Request: Addition of a Visual Studio Plugin for Twinny — iammeizu, opened 1 month ago, 1 comment
- #257 FIM completion flexible context — kv-gits, opened 1 month ago, 1 comment
- #255 Add OpenAI provider — JamesClarke7283, closed 3 weeks ago, 3 comments
- #254 Chat workspace on right of visual code — sinaudjango, closed 1 month ago, 1 comment
- #253 Context Length Option With File Context Enabled Doesn't Limit Length — anaseinea, closed 1 month ago, 1 comment
- #252 Codeqwen uses same FIM template as stable-code — BeRT2me, closed 1 month ago, 3 comments
- #251 invalid option provided option="" — jgilfoil, closed 1 month ago, 5 comments
- #250 Option to save provider configuration to disk — jakern, opened 1 month ago, 1 comment
- #249 Cannot read long model names when configuring provider — fbl100, closed 1 month ago, 1 comment
- #247 Fixing #212 — sebastianelsner, closed 1 month ago, 1 comment
- #246 FIM doesn't work with Keep Alive = -1 — anaseinea, closed 1 month ago, 1 comment
- #244 Code snippets in the chat window loose syntax highlighting occasionally — nicikiefer, opened 1 month ago, 6 comments
- #243 Multiline completion is confusing — yshui, closed 1 month ago, 1 comment
- #242 Configured providers but twinny not sending any requests to provider. — Duoquote, opened 1 month ago, 3 comments
- #241 Feature/embeddings — rjmacarthy, opened 1 month ago, 0 comments
- #240 Code completion works, but chat just spins the progress circle indefinitely — 2picus, closed 2 months ago, 2 comments
- #239 Code completion not working — andremald, opened 2 months ago, 27 comments
- #237 Feat: embedding workspace files for context — rjmacarthy, closed 1 month ago, 1 comment
- #236 Robot icon keeps spinning, no inference — handrew, closed 2 months ago, 1 comment
- #233 Edit and re-submit in chat mode — PkmX, opened 2 months ago, 4 comments
- #232 feat: open new chat window in new editor tab — hitzhangjie, opened 2 months ago, 1 comment
- #231 Incomplete Code Autocompletion and Non-Responsive Chat UI in Twinny Extension — YXTR, closed 2 months ago, 5 comments
- #230 How to configure proxy? — ilyanoskov, closed 2 months ago, 1 comment
- #229 Improve documentation for FIM — rburgst, closed 2 months ago, 2 comments
- #228 Change shortcuts for suggestions — nagman, closed 2 months ago, 2 comments
- #227 command 'twinny.showSidebar' not found — nagman, closed 2 months ago, 1 comment
- #226 Unable to select model when using Ollama — dchansen, closed 2 months ago, 3 comments
- #225 Wrong language: getting python suggestions in Javascript file — Lastofthefirst, closed 2 months ago, 1 comment
- #224 TypeError while attempting to use FIM from remote Open WebUI server — i-ate-a-vm, closed 2 months ago, 4 comments
- #223 Keyboard short cut: accept part of the FIM suggestion — Fadelis98, opened 2 months ago, 4 comments
- #222 Better multiline completions — rjmacarthy, closed 2 months ago, 0 comments
- #221 Type error fetch failed t.streamResponse — FrozzDay, closed 2 months ago, 4 comments
- #220 Keyboard Shortcuts - Add opt-in option — superlinkx, opened 2 months ago, 2 comments
- #219 Can't get the FIM to work with LM Studio — Wyzix33, closed 2 months ago, 3 comments
- #218 Extension does not load at all after update to v3.11.18 — mdbooth, closed 2 months ago, 2 comments
- #217 LM Studio supports "Multi Mode Session", must specify the model name — marcmcd, closed 2 months ago, 3 comments
- #216 Not showing all model after update to latest version — askareija, closed 2 months ago, 8 comments
- #215 Business use — CFISOFT, closed 2 months ago, 1 comment
- #214 Save conversation history using Memento api — rjmacarthy, closed 2 months ago, 0 comments
- #213 Unable to interact with ollama running VSCode with WSL2 — mdbooth, closed 2 months ago, 3 comments
- #212 Inline suggestion does not work when there is a .hg folder — sebastianelsner, closed 1 month ago, 1 comment
- #211 Instructions for the configuration on MacOS with llama.cpp — a-rbts, closed 2 months ago, 3 comments