srikanth235 / privy
An open-source alternative to GitHub Copilot that runs locally.
MIT License · 870 stars · 43 forks

Issues
#47 · Codespace ominous garbanzo 4rggpggg5ggf7rpg · 0xvashishth · closed 1 month ago · 0 comments
#46 · Broken Links in `CONTRIBUTING.md` · 0xvashishth · opened 1 month ago · 0 comments
#44 · Will this extension work with GPT4All? · kc8pdr205 · opened 2 months ago · 0 comments
#43 · Unexpected output using deepseek-coder v2 · pulpocaminante · opened 4 months ago · 1 comment
#42 · Bugfix: Wrong template is used when two different model families are used for chat and autocomplete · ldeninski · opened 5 months ago · 0 comments
#41 · Does not send requests to my provider URL · doanaktar · closed 3 months ago · 3 comments
#40 · Would love to see a separate font size setting for Privy · wwoodsTM · opened 5 months ago · 0 comments
#39 · Add config for separate Ollama instances for completion and chat · ser · opened 6 months ago · 0 comments
#38 · Remove hardcoded "Developer" user, move to config · nickheyer · opened 6 months ago · 1 comment
#37 · Autocompletion does not work in VSCode · arturshevchenko · opened 6 months ago · 7 comments
#35 · Support for LMStudio · srikanth235 · opened 6 months ago · 0 comments
#34 · Privy does request and complete but doesn't hint or fill · Marthaarman · opened 7 months ago · 9 comments
#26 · Add support for different context providers like docs, files etc. · srikanth235 · opened 7 months ago · 0 comments
#25 · Add status bar for displaying the status of inference server health · srikanth235 · opened 7 months ago · 0 comments
#24 · feat: hiding non-relevant settings · srikanth235 · closed 7 months ago · 0 comments
#23 · Privy not working on Mac for some reason · theashishmaurya · closed 7 months ago · 12 comments
#22 · Hide non-relevant Privy VSCode settings · srikanth235 · closed 7 months ago · 0 comments
#21 · Add support for timestamps in Privy logs · srikanth235 · closed 7 months ago · 1 comment
#20 · I don't know how to use autocomplete or it is not working for me · AtmanActive · opened 8 months ago · 9 comments
#19 · Appreciation and Attribution Inquiry · rjmacarthy · closed 8 months ago · 3 comments
#18 · privy.providerBaseUrl doesn't seem to be used right now - cannot repoint it from local ollama to remote ollama · kha84 · closed 8 months ago · 2 comments
#17 · Hotkey for triggering autocompletion? · kha84 · closed 8 months ago · 1 comment
#16 · Add support for custom templates · srikanth235 · opened 8 months ago · 0 comments
#15 · Add validation for autocomplete and chat model names · srikanth235 · opened 8 months ago · 0 comments
#14 · Deeper Ollama Integration · srikanth235 · opened 8 months ago · 0 comments
#13 · feat: add support for inline code completions · srikanth235 · closed 8 months ago · 0 comments
#12 · Add support for thumbs up and thumbs down to track LLM performance · srikanth235 · opened 8 months ago · 0 comments
#11 · Explore the integration of newly launched JS library by Ollama · srikanth235 · opened 8 months ago · 0 comments
#10 · Special tags are slipping into chat responses · srikanth235 · closed 8 months ago · 1 comment
#9 · Edit code functionality broken · srikanth235 · opened 8 months ago · 0 comments
#8 · feat: adding support for deepseek-coder models · srikanth235 · closed 8 months ago · 0 comments
#7 · Add support for aborting chat completion responses · srikanth235 · opened 8 months ago · 1 comment
#6 · Add support for DeepSeek Coder model · srikanth235 · closed 8 months ago · 0 comments
#5 · Provide support for inline code completions · srikanth235 · closed 8 months ago · 0 comments
#4 · feat: adding support for configuring custom Ollama models (#1) · srikanth235 · closed 8 months ago · 0 comments
#3 · How to work with model when using SSH remote connection? · nowak-ninja · closed 8 months ago · 2 comments
#2 · Update architecture.md · eltociear · closed 8 months ago · 1 comment
#1 · Add support for retrieving list of models from Ollama inside settings section · srikanth235 · closed 8 months ago · 0 comments