Closed BingKui closed 1 week ago
This is supported from version 0.1.18
🚀 The feature
I run a local model with node-llama-cpp, and I would like to be able to use it here.
Motivation, pitch
I am building an application around an on-device (end-side) large language model, using node-llama-cpp to run the model locally, and I hope to implement RAG on top of it.