run-llama / LlamaIndexTS

LlamaIndex in TypeScript
https://ts.llamaindex.ai

`npm run generate` does not work with `llamaindex v0.0.35 & v0.0.36` #225

Closed: emapeire closed this issue 9 months ago

emapeire commented 10 months ago

I have the same issue as #182 with the latest update, and it still does not work. When I ran npm run generate, I got this message:

npm run generate

> eloquent-js-chat@0.1.0 generate
> node app/api/chat/engine/generate.mjs

/Users/emapeire/Workspace/AI/eloquent-js-chat/node_modules/llamaindex/dist/index.js:605
      throw new Error("Set OpenAI Key in OPENAI_API_KEY env variable");
            ^

Error: Set OpenAI Key in OPENAI_API_KEY env variable
    at new OpenAISession (/Users/emapeire/Workspace/AI/eloquent-js-chat/node_modules/llamaindex/dist/index.js:605:13)
    at getOpenAISession (/Users/emapeire/Workspace/AI/eloquent-js-chat/node_modules/llamaindex/dist/index.js:621:15)
    at new OpenAI2 (/Users/emapeire/Workspace/AI/eloquent-js-chat/node_modules/llamaindex/dist/index.js:1540:81)
    at serviceContextFromDefaults (/Users/emapeire/Workspace/AI/eloquent-js-chat/node_modules/llamaindex/dist/index.js:2083:71)
    at file:///Users/emapeire/Workspace/AI/eloquent-js-chat/app/api/chat/engine/generate.mjs:41:26
    at file:///Users/emapeire/Workspace/AI/eloquent-js-chat/app/api/chat/engine/generate.mjs:48:3
    at ModuleJob.run (node:internal/modules/esm/module_job:217:25)
    at async ModuleLoader.import (node:internal/modules/esm/loader:316:24)
    at async loadESM (node:internal/process/esm_loader:34:7)
    at async handleMainPromise (node:internal/modules/run_main:66:12)

Node.js v20.9.0

However, I do not know why this happens, because I have the .env.local file with my OPEN_API_KEY key.
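A likely explanation (an assumption, not something confirmed in this thread): a plain `node` script only sees variables already in the environment, and it does not auto-load Next.js env files such as .env.local the way `next dev` does. A minimal sketch of loading the key explicitly at the top of generate.mjs, assuming the `dotenv` package is installed:

```js
// Hypothetical addition to the top of generate.mjs, assuming `dotenv` is
// installed (npm install dotenv). It loads .env.local first, then .env;
// dotenv never overwrites variables that are already set.
import dotenv from "dotenv";

dotenv.config({ path: ".env.local" });
dotenv.config();

if (!process.env.OPENAI_API_KEY) {
  throw new Error("OPENAI_API_KEY not found in .env.local or .env");
}
```

With the variables loaded this way, the serviceContextFromDefaults call later in the script would find the key in process.env.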

logan-markewich commented 10 months ago

@emapeire did you use the latest version of create-llama?

npx create-llama@latest

emapeire commented 10 months ago

@logan-markewich yes, currently I'm using the latest version.

In addition, I'll send you my package.json file below:

{
  "name": "eloquent-js-chat",
  "version": "0.1.0",
  "scripts": {
    "dev": "next dev",
    "build": "next build",
    "start": "next start",
    "lint": "next lint",
    "generate": "node app/api/chat/engine/generate.mjs"
  },
  "dependencies": {
    "@radix-ui/react-slot": "^1",
    "ai": "^2",
    "class-variance-authority": "^0.7",
    "llamaindex": "^0.0.35",
    "lucide-react": "^0.291",
    "next": "^13",
    "react": "^18",
    "react-dom": "^18",
    "react-markdown": "^8.0.7",
    "react-syntax-highlighter": "^15.5.0",
    "remark": "^14.0.3",
    "remark-code-import": "^1.2.0",
    "remark-gfm": "^3.0.1",
    "remark-math": "^5.1.1",
    "tailwind-merge": "^2"
  },
  "devDependencies": {
    "@types/node": "^20",
    "@types/react": "^18",
    "@types/react-dom": "^18",
    "@types/react-syntax-highlighter": "^15.5.6",
    "autoprefixer": "^10",
    "eslint": "^8",
    "eslint-config-next": "^13",
    "postcss": "^8",
    "tailwindcss": "^3",
    "typescript": "^5"
  }
}
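A side note on the dependency range above (standard npm semver behavior, not something stated in this thread): for versions below 0.1.0, a caret range such as ^0.0.35 only permits >=0.0.35 <0.0.36, so v0.0.36 is never picked up by a plain npm install; it has to be requested explicitly:

```sh
# Update to the newer release explicitly, since ^0.0.35 pins the 0.0.35 patch line
npm install llamaindex@0.0.36
```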
emapeire commented 10 months ago

@logan-markewich

Ok, I have updated llamaindex to v0.0.36, and now I get a new issue:

npm WARN deprecated fs-promise@2.0.3: Use mz or fs-extra^3.0 with Promise Support

I assumed this was going to cause some problems, so I ran npm run generate again, and this happened:

npm run generate

> eloquent-js-chat@0.1.0 generate
> node app/api/chat/engine/generate.mjs

node:internal/fs/sync:78
  return binding.openSync(
                 ^

Error: ENOENT: no such file or directory, open './test/data/05-versions-space.pdf'
    at Object.open (node:internal/fs/sync:78:18)
    at Object.openSync (node:fs:565:17)
    at Object.readFileSync (node:fs:445:35)
    at Object.<anonymous> (/Users/emapeire/Workspace/AI/eloquent-js-chat/node_modules/pdf-parse/index.js:15:25)
    at Module._compile (node:internal/modules/cjs/loader:1241:14)
    at Module._extensions..js (node:internal/modules/cjs/loader:1295:10)
    at Module.load (node:internal/modules/cjs/loader:1091:32)
    at Module._load (node:internal/modules/cjs/loader:938:12)
    at cjsLoader (node:internal/modules/esm/translators:284:17)
    at ModuleWrap.<anonymous> (node:internal/modules/esm/translators:234:7) {
  errno: -2,
  code: 'ENOENT',
  syscall: 'open',
  path: './test/data/05-versions-space.pdf'
}

Node.js v20.9.0

@yisding can you check this issue please?
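For reference, the path in that error comes from pdf-parse itself rather than from this project. A simplified sketch of the relevant logic, paraphrased from pdf-parse's published index.js (not from this repo): the module treats itself as running in debug mode when `module.parent` is falsy, which can happen under Node's ESM-to-CJS translator, and then reads a test fixture relative to the current working directory:

```js
// Paraphrased from pdf-parse/index.js (CommonJS). The debug branch runs
// whenever the module believes it is the entry point; under Node's ESM
// translator, module.parent is undefined, so isDebugMode becomes true.
const fs = require('fs');

let isDebugMode = !module.parent;

if (isDebugMode) {
  // Resolved relative to process.cwd(), not the package directory,
  // so it fails with ENOENT inside a scaffolded app.
  let PDF_FILE = './test/data/05-versions-space.pdf';
  let dataBuffer = fs.readFileSync(PDF_FILE); // <-- the crash in the log above
  // ...debug parsing of the fixture continues here...
}
```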

logan-markewich commented 10 months ago

@emapeire is that error complaining about missing data, or that it can't find a folder?

Which backend did you use -- Next.js or Express?

emapeire commented 10 months ago

is that error complaining about missing data, or that it can't find a folder?

@logan-markewich I'm using the default configuration and folder structure that comes with npx create-llama, so I don't know why this happens.

Which backend did you use -- Next.js or Express?

I'm using Next.js 13.5.6.

seldo commented 10 months ago

Your initial problem with the missing OPENAI_API_KEY was probably https://github.com/run-llama/LlamaIndexTS/issues/226, but we're having trouble reproducing your second error. We also get the fs-promise deprecation warning on install, but generate works fine for us.

marcusschiesser commented 9 months ago

@emapeire, we fixed this issue in the latest version. The generate script now always gets the key from .env, which Next.js also reads. Please try npx create-llama@latest again and let us know if the fix works for you. Thanks.
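In practice (a sketch of the setup described above, with a placeholder key value), a single .env file can then serve both the generate script and the Next.js app:

```sh
# .env -- read by both `npm run generate` and the Next.js app per the fix above
OPENAI_API_KEY=sk-...   # placeholder; substitute your real key
```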

emapeire commented 9 months ago

Thanks! It works!