Atome-FE / llama-node

Believe in AI democratization. llama for Node.js, backed by llama-rs, llama.cpp and rwkv.cpp; works locally on your laptop CPU. Supports llama/alpaca/gpt4all/vicuna/rwkv models.
https://llama-node.vercel.app/
Apache License 2.0

How can I install Types? #13

Closed Achalogy closed 1 year ago

Achalogy commented 1 year ago

I got a lot of error messages like this one:

Cannot find module 'llama-node/dist/llm/llama-rs' or its corresponding type declarations.

How can I install the types?

hlhr202 commented 1 year ago

Hi, I'm not sure which version you installed. I tried version 0.0.21 and the type definitions are actually working.


Achalogy commented 1 year ago

Me too, I have version 0.0.21 :( Maybe it's some config file?

tsconfig.json

{
  "compilerOptions": {
    "outDir": "./dist",
    "target": "es2022",
    "module": "nodenext",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "types": ["node"]
  },
  "include": ["./"]
}
Achalogy commented 1 year ago

Changed "module": "nodenext" to "module": "CommonJS", and it worked.
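For reference, that one-line change applied to the tsconfig above would look like this (everything besides "module" copied unchanged from the earlier comment):

```json
{
  "compilerOptions": {
    "outDir": "./dist",
    "target": "es2022",
    "module": "CommonJS",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "types": ["node"]
  },
  "include": ["./"]
}
```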

Achalogy commented 1 year ago

Can you show me which tsconfig you use? I'm still getting errors because tsc changes CommonJS imports to ES imports.

hlhr202 commented 1 year ago

@Achalogy Hi, I guess you are using this library without any bundler, just tsc.

First, set "type": "module" in your package.json.

Then in your tsconfig.json, use:

{
  "compilerOptions": {
    "target": "esnext",
    "module": "esnext",
    "moduleResolution": "node"
  }
}

Then, in the TS file that uses llama-node:

import { LLamaCpp } from "llama-node/dist/llm/llama-cpp.js";

Please be careful: there must be a file extension on llama-cpp.js. That is what enables you to run without any bundler.

hlhr202 commented 1 year ago

https://github.com/hlhr202/llama-node/tree/main/example

Updated full examples here.