convertsee-dev closed this issue 2 months ago
This is not a bug. The beta docs specifically state how to create a model: https://github.com/withcatai/node-llama-cpp/pull/105
import path from "path";
import {fileURLToPath} from "url";
import {getLlama} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: path.join(__dirname, "models", "dolphin-2.1-mistral-7b.Q4_K_M.gguf")
});
Thank you for your response, really appreciate it.
@convertsee-dev The solution was already given by @brandon-e2e in this ticket, so I closed it as completed.
Issue description
The LlamaModel constructor was changed to private, so it can no longer be called with new from TypeScript.
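For context, here is a minimal self-contained TypeScript sketch (using a hypothetical Model class, not the real node-llama-cpp API) of the pattern at play: a private constructor blocks direct instantiation with new and forces callers to go through a static async factory method instead.

```typescript
// Hypothetical illustration of the private-constructor + factory pattern.
class Model {
    // Private constructor: calling `new Model(...)` outside the class
    // fails with TS2673, the same error reported in this issue.
    private constructor(public readonly modelPath: string) {}

    // Static async factory: the only supported way to create an instance
    // (real loading code, e.g. reading model weights, would go here).
    static async load(modelPath: string): Promise<Model> {
        return new Model(modelPath);
    }
}

// new Model("model.gguf"); // TS2673: constructor of class 'Model' is private

Model.load("dolphin.gguf").then((model) => {
    console.log(model.modelPath); // "dolphin.gguf"
});
```

Libraries often make this change so that construction can be asynchronous or can depend on shared state (here, the llama instance) that a bare constructor cannot await.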
Expected Behavior
I should be able to follow the docs:
Actual Behavior
I get an error: Constructor of class 'LlamaModel' is private and only accessible within the class declaration.ts(2673)
Steps to reproduce
Run npm i node-llama-cpp@beta and attempt to follow the docs to instantiate a new LlamaModel:
My Environment
node-llama-cpp version
Additional Context
No response
Relevant Features Used
Are you willing to resolve this issue by submitting a Pull Request?
No, I don’t have the time and I’m okay to wait for the community / maintainers to resolve this issue.