run-llama / LlamaIndexTS

LlamaIndex in TypeScript
https://ts.llamaindex.ai
MIT License

NEXTJS Module parse failed: Unexpected character '�' #273

Closed dominicdev closed 3 months ago

dominicdev commented 9 months ago

Getting this error after testing; I'm trying to run it in an API route.

```js
import { Document, SimpleNodeParser } from "llamaindex";

const nodeParser = new SimpleNodeParser();
const nodes = nodeParser.getNodesFromDocuments([
  new Document({ text: "I am 10 years old. John is 20 years old." }),
]);
```

```
./node_modules/onnxruntime-node/bin/napi-v3/darwin/arm64/onnxruntime_binding.node
Module parse failed: Unexpected character '�' (1:0)
You may need an appropriate loader to handle this file type, currently no loaders are
configured to process this file. See https://webpack.js.org/concepts#loaders
(Source code omitted for this binary file)
```

Any ideas? The sample code I tested is above.

marcusschiesser commented 9 months ago

@dominicdev how does your next.config.js file look like? This is the one from create-llama:

/** @type {import('next').NextConfig} */
const nextConfig = {
  webpack: (config) => {
    // See https://webpack.js.org/configuration/resolve/#resolvealias
    config.resolve.alias = {
      ...config.resolve.alias,
      sharp$: false,
      "onnxruntime-node$": false,
      mongodb$: false,
    };
    return config;
  },
  experimental: {
    serverComponentsExternalPackages: ["llamaindex"],
    outputFileTracingIncludes: {
      "/*": ["./cache/**/*"],
    },
  },
};

module.exports = nextConfig;
dominicdev commented 9 months ago

> @dominicdev how does your next.config.js file look like? This is the one from create-llama: […]

Here's my next config:

```js
/** @type {import('next').NextConfig} */
const nextConfig = {
  typescript: {
    // !! WARN !!
    // Dangerously allow production builds to successfully complete even if
    // your project has type errors.
    // !! WARN !!
    ignoreBuildErrors: true,
  },
  experimental: {
    serverComponentsExternalPackages: ["pdf-parse"], // Puts pdf-parse in actual NodeJS mode with NextJS App Router
  },
};

module.exports = nextConfig;
```

dominicdev commented 9 months ago

Will try your config.

dominicdev commented 9 months ago

And it worked, thanks!

marcusschiesser commented 9 months ago

@dominicdev you're welcome, you can use npx create-llama to generate a NextJS project with the necessary next.config.js
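For readers landing here, the scaffolding step mentioned above would look roughly like this (a sketch; the `create-llama` CLI walks you through interactive prompts, so exact options may differ by version):

```shell
# Scaffold a Next.js project that ships with a working next.config.js
# for llamaindex (interactive prompts ask for project name, framework, etc.)
npx create-llama@latest
```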

InfiniteCodeMonkeys commented 3 months ago

This just solved my issue using llamaindex loaders in a NextJS v14 project running Node v20. I'm adding llamaindex to an existing project and didn't see the create-llama guide. Should we add this snippet to Getting Started / Environments / NextJS?

marcusschiesser commented 3 months ago

@InfiniteCodeMonkeys you mean withLlamaIndex (see https://github.com/run-llama/LlamaIndexTS?tab=readme-ov-file#nextjs) is not working for you?
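For anyone else finding this thread: per the README section linked above, the `withLlamaIndex` helper wraps your existing config so you don't have to write the webpack aliases by hand. A minimal sketch (check the README for the current import path and signature):

```javascript
// next.config.js — wrap an existing Next.js config with withLlamaIndex
const withLlamaIndex = require("llamaindex/next");

/** @type {import('next').NextConfig} */
const nextConfig = {
  // ...your existing Next.js options here
};

module.exports = withLlamaIndex(nextConfig);
```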

InfiniteCodeMonkeys commented 3 months ago

Ha! It's right there in the readme. I should have looked there first -- I went straight to the docs and missed this.