Closed — emapeire closed this issue 9 months ago.
@emapeire did you use the latest version of create-llama? (`npx create-llama@latest`)
@logan-markewich yes, currently I'm using the latest version. In addition, I'll send you my `package.json` file below:
```json
{
  "name": "eloquent-js-chat",
  "version": "0.1.0",
  "scripts": {
    "dev": "next dev",
    "build": "next build",
    "start": "next start",
    "lint": "next lint",
    "generate": "node app/api/chat/engine/generate.mjs"
  },
  "dependencies": {
    "@radix-ui/react-slot": "^1",
    "ai": "^2",
    "class-variance-authority": "^0.7",
    "llamaindex": "^0.0.35",
    "lucide-react": "^0.291",
    "next": "^13",
    "react": "^18",
    "react-dom": "^18",
    "react-markdown": "^8.0.7",
    "react-syntax-highlighter": "^15.5.0",
    "remark": "^14.0.3",
    "remark-code-import": "^1.2.0",
    "remark-gfm": "^3.0.1",
    "remark-math": "^5.1.1",
    "tailwind-merge": "^2"
  },
  "devDependencies": {
    "@types/node": "^20",
    "@types/react": "^18",
    "@types/react-dom": "^18",
    "@types/react-syntax-highlighter": "^15.5.6",
    "autoprefixer": "^10",
    "eslint": "^8",
    "eslint-config-next": "^13",
    "postcss": "^8",
    "tailwindcss": "^3",
    "typescript": "^5"
  }
}
```
@logan-markewich Ok, I have updated llamaindex to v0.0.36 and I have this new issue right now:

```
npm WARN deprecated fs-promise@2.0.3: Use mz or fs-extra^3.0 with Promise Support
```

I assumed this was going to cause some problem, so I ran `npm run generate` again, and this happened:
```
npm run generate

> eloquent-js-chat@0.1.0 generate
> node app/api/chat/engine/generate.mjs

node:internal/fs/sync:78
  return binding.openSync(
                 ^

Error: ENOENT: no such file or directory, open './test/data/05-versions-space.pdf'
    at Object.open (node:internal/fs/sync:78:18)
    at Object.openSync (node:fs:565:17)
    at Object.readFileSync (node:fs:445:35)
    at Object.<anonymous> (/Users/emapeire/Workspace/AI/eloquent-js-chat/node_modules/pdf-parse/index.js:15:25)
    at Module._compile (node:internal/modules/cjs/loader:1241:14)
    at Module._extensions..js (node:internal/modules/cjs/loader:1295:10)
    at Module.load (node:internal/modules/cjs/loader:1091:32)
    at Module._load (node:internal/modules/cjs/loader:938:12)
    at cjsLoader (node:internal/modules/esm/translators:284:17)
    at ModuleWrap.<anonymous> (node:internal/modules/esm/translators:234:7) {
  errno: -2,
  code: 'ENOENT',
  syscall: 'open',
  path: './test/data/05-versions-space.pdf'
}

Node.js v20.9.0
```
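For context (an assumption based on pdf-parse's published source, not confirmed anywhere in this thread): pdf-parse's `index.js` guards a self-test behind `!module.parent`. When the CommonJS package is imported from an ES module such as `generate.mjs`, the loader leaves `module.parent` undefined, so the library believes it is being run directly and tries to read its own test fixture `./test/data/05-versions-space.pdf`, which does not ship inside `node_modules`, hence the ENOENT. A simplified sketch of that gate:

```javascript
// Simplified, hypothetical reconstruction of pdf-parse's debug gate.
// In CommonJS, module.parent is set when a file is require()d by another
// module; when the package is imported from an ES module (generate.mjs),
// it stays undefined, flipping the library into its self-test mode.
function isDebugMode(moduleParent) {
  return !moduleParent; // pdf-parse: `let isDebugMode = !module.parent;`
}

// undefined parent (ESM import) -> self-test tries to open
// './test/data/05-versions-space.pdf' and fails with ENOENT
console.log(isDebugMode(undefined));        // true
console.log(isDebugMode({ id: "caller" })); // false
```

This also explains why the error appears at `pdf-parse/index.js:15` before any of the app's own code runs.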
@yisding can you check this issue please?
@emapeire is that error complaining about missing data, or that it can't find a folder? Which backend did you use -- nextjs or express?

> is that error complaining about missing data, or that it can't find a folder?

@logan-markewich I'm using the default configuration and folders that come with `npx create-llama`, so I don't know why it happens.

> Which backend did you use -- nextjs or express?

I'm using nextjs 13.5.6.
Your initial problem with the missing `OPENAI_API_KEY` was probably https://github.com/run-llama/LlamaIndexTS/issues/226, but we're having trouble reproducing your second error. We also get the warning about `fs-promise` on install, but generate works fine for us.
@emapeire, we fixed this issue in the latest version. The `generate` script now always gets the key from `.env`, and NextJS uses `.env` as well. Please try `npx create-llama@latest` again and let us know if the fix works for you. Thanks.
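The fix can be sketched roughly as follows (a hypothetical illustration, not the actual create-llama code): a standalone Node script like `generate.mjs` does not go through Next.js's env loading, so it only sees variables that it (or its tooling) loads explicitly, e.g. from `.env`:

```javascript
// Hypothetical sketch: parse KEY=value lines from .env-style text.
// Next.js loads .env.local for the web app at runtime, but a plain
// `node` script only gets what it loads itself, which is why the
// generate script must read the key from .env.
function parseEnv(text) {
  const vars = {};
  for (const line of text.split("\n")) {
    // skip comments and blank lines; capture NAME=value pairs
    const m = line.match(/^\s*([A-Za-z_][A-Za-z0-9_]*)\s*=\s*(.*)$/);
    if (m) vars[m[1]] = m[2];
  }
  return vars;
}

const env = parseEnv("# comment\nOPENAI_API_KEY=sk-example\n");
console.log("OPENAI_API_KEY" in env); // true
```

In practice a library such as `dotenv` does this parsing, but the point is the same: the key has to live in a file the script actually reads.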
Thanks! It works.
I have the same issue as #182 with the latest update, and it still does not work. When I ran `npm run generate`, I got the same error message. However, I do not know why this happens, because I have the `.env.local` file with my `OPEN_API_KEY` key.
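Given the fix described earlier in the thread (the generate script reads the key from `.env`), two things are worth verifying here: the file name (`.env` vs `.env.local`) and the variable spelling (`OPENAI_API_KEY` vs `OPEN_API_KEY`). A hypothetical self-check sketch, not part of create-llama:

```javascript
// Hypothetical diagnostic: check whether the expected variable name is
// present, and flag the common near-miss spelling mentioned above.
const REQUIRED = "OPENAI_API_KEY"; // the name the library reads
const NEAR_MISS = "OPEN_API_KEY";  // easy-to-make misspelling

function diagnose(env) {
  if (env[REQUIRED]) return "ok";
  if (env[NEAR_MISS]) return `found ${NEAR_MISS}; rename it to ${REQUIRED}`;
  return `set ${REQUIRED} in .env`;
}

console.log(diagnose({ OPEN_API_KEY: "sk-example" }));
// found OPEN_API_KEY; rename it to OPENAI_API_KEY
```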