withcatai/node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Force a JSON schema on the model output at the generation level.

https://node-llama-cpp.withcat.ai
MIT License · 837 stars · 82 forks
Issues
| # | Title | Author | Status | Comments |
|-----|-------|--------|--------|----------|
| #25 | Failed to fetch llama.cpp release info: `RequestError [HttpError]: Not Found` | loretoparisi | closed 1 year ago | 2 |
| #24 | Error node-llama-cpp build | loretoparisi | closed 1 year ago | 2 |
| #23 | Feature request: Langchain support | chethankumar | closed 1 year ago | 6 |
| #22 | CUDA support is not working | pankaj-kumar34 | closed 1 year ago | 3 |
| #21 | fix: import | giladgd | closed 1 year ago | 1 |
| #20 | Fixed typo on LlamaGrammar import | nigel-daniels | closed 1 year ago | 1 |
| #19 | docs(README): add a section explaining Metal and CUDA support | giladgd | closed 1 year ago | 1 |
| #18 | chore: remove unused dependencies | giladgd | closed 1 year ago | 1 |
| #17 | feat: add support for Metal and CUDA in the `build` command | giladgd | closed 1 year ago | 1 |
| #16 | Feature request: CUDA support | pankaj-kumar34 | closed 1 year ago | 1 |
| #15 | Raw approach runs into odd response | nigel-daniels | closed 1 year ago | 1 |
| #14 | Set temperature request | nigel-daniels | closed 1 year ago | 1 |
| #13 | feat: add grammar support | giladgd | closed 1 year ago | 1 |
| #12 | ci: fix build | ido-pluto | closed 1 year ago | 1 |
| #11 | Failed to load model | nigel-daniels | closed 1 year ago | 2 |
| #10 | fix: llama.cpp interface breaking change | giladgd | closed 1 year ago | 1 |
| #9 | Passing grammars | arthurwolf | closed 1 year ago | 2 |
| #8 | fix: rename a param to match llama.cpp | giladgd | closed 1 year ago | 1 |
| #7 | docs(README): add missing import | giladgd | closed 1 year ago | 1 |
| #6 | feat: add TSDoc | giladgd | closed 1 year ago | 1 |
| #5 | feat: add support for some llama.cpp params on `LlamaModel` | giladgd | closed 1 year ago | 1 |
| #4 | fix: configurations | giladgd | closed 1 year ago | 1 |
| #3 | Fix/chat context | ido-pluto | closed 1 year ago | 1 |
| #2 | feat: general chat wrapper | ido-pluto | closed 1 year ago | 1 |
| #1 | feat(model): support emoji | ido-pluto | closed 1 year ago | 1 |