floydspace / serverless-esbuild

💨 A Serverless framework plugin to bundle JavaScript and TypeScript with extremely fast esbuild

Deployment failed after adding the llamaindex library #550

Open SyedAli00896 opened 3 months ago

SyedAli00896 commented 3 months ago

Hi, I am trying to use the llamaindex library in my project. It works fine in development mode, but when I tried to deploy it, bundling the code failed. Here are the errors:

```
✘ [ERROR] No loader is configured for ".node" files: node_modules/onnxruntime-node/bin/napi-v3/darwin/arm64/onnxruntime_binding.node

    node_modules/onnxruntime-node/dist/binding.js:9:8:
      9 │ require(`../bin/napi-v3/${process.platform}/${process.arch}/onnxruntime_binding.node`);
        ╵         ~~~~~~~~~~~~~~~~~

✘ [ERROR] No loader is configured for ".node" files: node_modules/onnxruntime-node/bin/napi-v3/linux/x64/onnxruntime_binding.node

    node_modules/onnxruntime-node/dist/binding.js:9:8:
      9 │ require(`../bin/napi-v3/${process.platform}/${process.arch}/onnxruntime_binding.node`);
        ╵         ~~~~~~~~~~~~~~~~~

✘ [ERROR] No loader is configured for ".node" files: node_modules/onnxruntime-node/bin/napi-v3/darwin/x64/onnxruntime_binding.node

    node_modules/onnxruntime-node/dist/binding.js:9:8:
      9 │ require(`../bin/napi-v3/${process.platform}/${process.arch}/onnxruntime_binding.node`);
        ╵         ~~~~~~~~~~~~~~~~~

✘ [ERROR] No loader is configured for ".node" files: node_modules/onnxruntime-node/bin/napi-v3/linux/arm64/onnxruntime_binding.node

    node_modules/onnxruntime-node/dist/binding.js:9:8:
      9 │ require(`../bin/napi-v3/${process.platform}/${process.arch}/onnxruntime_binding.node`);
        ╵         ~~~~~~~~~~~~~~~~~

✘ [ERROR] No loader is configured for ".node" files: node_modules/onnxruntime-node/bin/napi-v3/win32/arm64/onnxruntime_binding.node

    node_modules/onnxruntime-node/dist/binding.js:9:8:
      9 │ require(`../bin/napi-v3/${process.platform}/${process.arch}/onnxruntime_binding.node`);
        ╵         ~~~~~~~~~~~~~~~~~

✘ [ERROR] No loader is configured for ".node" files: node_modules/onnxruntime-node/bin/napi-v3/win32/x64/onnxruntime_binding.node

    node_modules/onnxruntime-node/dist/binding.js:9:8:
      9 │ require(`../bin/napi-v3/${process.platform}/${process.arch}/onnxruntime_binding.node`);
        ╵         ~~~~~~~~~~~~~~~~~
```
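For anyone hitting the same wall: esbuild cannot bundle native `.node` addons, so the usual workaround is to keep `onnxruntime-node` out of the bundle entirely. Below is a minimal, untested sketch of what that could look like in `serverless.yml`; it assumes the plugin's `external` option is available in the installed version, and that the externalized package still has to be shipped with the deployment artifact.

```yaml
# serverless.yml (sketch, untested): keep the native addon out of the bundle
# so esbuild never has to resolve the platform-specific .node binaries.
plugins:
  - serverless-esbuild

custom:
  esbuild:
    bundle: true
    # Assumption: `external` is supported by the plugin version in use.
    # The externalized package must still end up in node_modules of the
    # deployed artifact (e.g. via the plugin's packager or a Lambda layer).
    external:
      - onnxruntime-node
```

Externalizing leaves the dynamic `require()` of the binding untouched at runtime instead of asking esbuild to resolve the platform-specific binary at build time.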


Matangub commented 1 month ago

did you find any solution?

SyedAli00896 commented 1 month ago

> did you find any solution?

https://github.com/run-llama/LlamaIndexTS/issues/1110

The build issue was resolved with the help of this, but then a tiktoken issue came up, which I still haven't been able to fix. If you manage to run it successfully, please let me know as well. Thanks.
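If it helps narrow things down: the tiktoken npm package ships a WebAssembly binary, which esbuild likewise cannot bundle without extra loader configuration, so one untested option is to externalize it the same way as `onnxruntime-node`. This is an assumption about where the tiktoken error comes from, not a confirmed fix.

```yaml
# serverless.yml (sketch, untested): externalize tiktoken as well, assuming
# its bundling error is caused by the WebAssembly file it ships.
custom:
  esbuild:
    external:
      - onnxruntime-node
      - tiktoken
```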