# Issues · tzachar/cmp-ai

MIT License · 186 stars · 33 forks
| # | Title | Author | Status | Comments |
|---|-------|--------|--------|----------|
| #33 | Huggingface Customization | Freaksed | closed 1 week ago | 2 |
| #32 | Make Codestral more configurable | mrloop | closed 1 week ago | 0 |
| #31 | Increasing max_tokens for Codestral does not work | tgy | closed 1 week ago | 3 |
| #30 | Shallow copy headers to avoid mutating the original | mrloop | closed 2 weeks ago | 0 |
| #29 | Feature: add Claude support | Ernesto905 | closed 3 weeks ago | 1 |
| #28 | Feature Request: Support for Claude | Ernesto905 | closed 2 weeks ago | 1 |
| #27 | cmp config is overridden post installing | TheNaman047 | opened 1 month ago | 0 |
| #26 | Feature: Support more Models out of the box | milafrerichs | closed 2 months ago | 3 |
| #25 | feat: add support for suffix parameter in ollama | milafrerichs | closed 2 months ago | 1 |
| #24 | Feature: add Tabby support | chmanie | closed 2 months ago | 1 |
| #23 | [Feature Request] Settings for own LLM server | Alpensin | closed 2 months ago | 3 |
| #22 | add more data to format item table for better lspkind integration | mrloop | closed 2 months ago | 4 |
| #21 | feature request: Tabbyml support | Kamilcuk | closed 3 months ago | 1 |
| #20 | Working cmp-ai configuration for NVChad? | awonglk | opened 3 months ago | 1 |
| #19 | Test whether hardcoded value overrides config | jackruder | closed 4 months ago | 0 |
| #18 | HF logic issue | devvit | closed 4 months ago | 1 |
| #17 | add support for Codestral by Mistral | bmichotte | closed 4 months ago | 0 |
| #16 | feat(ollama): support port-forward remote api | lkhphuc | closed 5 months ago | 1 |
| #15 | ollama completion does not work at all | OneOfOne | opened 5 months ago | 8 |
| #14 | ollama and requests: support proper streaming | maxwell-bland | opened 6 months ago | 10 |
| #13 | fix(ollama): Fix parameters passthrough | mdietrich16 | closed 6 months ago | 0 |
| #12 | Support for AWS codewhisperer | mateimicu | closed 3 months ago | 1 |
| #11 | Debounce, LlamaCpp support, expose prompt as setup option, fix passing parameters to model (ollama) | JoseConseco | opened 9 months ago | 9 |
| #10 | How to connect with a remote ollama server? | captainko | closed 9 months ago | 0 |
| #9 | Add delay when firing auto-complete | JoseConseco | closed 9 months ago | 2 |
| #8 | cannot change ollama model | JoseConseco | opened 9 months ago | 0 |
| #7 | feat: Add Ollama backend | nifoc | closed 10 months ago | 2 |
| #6 | OpenAI model not working | henryoliver | closed 11 months ago | 1 |
| #5 | Llama support | JoseConseco | closed 10 months ago | 4 |
| #4 | adds notify_callback key to setup table to override default notify | catgoose | closed 1 year ago | 3 |
| #3 | Bard Error: json.choices returns nil instead of table | mystilleef | closed 1 year ago | 3 |
| #2 | bard backed has hard-coded python executable path | onelesd | closed 1 year ago | 1 |
| #1 | doc: `plenary.nvim` is mentioned but not included in the `dependencies` | nyngwang | closed 1 year ago | 2 |