run-llama / create-llama

The easiest way to get started with LlamaIndex
MIT License

fix: import undefined issues in nextjs #80

Closed. sagech closed this 3 months ago.

sagech commented 3 months ago

Summary by CodeRabbit

Bug Fixes

changeset-bot[bot] commented 3 months ago

⚠️ No Changeset found

Latest commit: e1fe4342af1f9c74296830413eaecb9347d764bf

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets. When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types.

Click here to learn what changesets are, and how to add one.

Click here if you're a maintainer who wants to add a changeset to this PR
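For reference, a changeset is just a small markdown file placed under the `.changeset/` directory of the repository. A hypothetical changeset for this fix might look like the following (the package name and bump type are assumptions, not taken from the PR):

```md
---
"create-llama": patch
---

fix: import undefined issues in nextjs by declaring llamaindex as a commonjs external
```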

coderabbitai[bot] commented 3 months ago

Walkthrough

The update modifies the Webpack configuration in the Next.js template: a new external module, llamaindex, is declared using the commonjs format. This excludes llamaindex from the output bundle, so it is loaded via require at runtime instead of being rewritten by Webpack.

Changes

| File Path | Change Summary |
| --- | --- |
| `.../streaming/nextjs/webpack.config.mjs` | Added `llamaindex` as a `commonjs` external in Webpack |
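The change summarized above can be sketched roughly as follows. This is a hypothetical reconstruction, not the template's actual file contents: the helper name and the normalization logic are illustrative, and in `webpack.config.mjs` such a function would typically be the default export consumed by the Next.js `webpack` hook.

```javascript
// Hypothetical sketch: declare `llamaindex` as a commonjs external so
// Webpack emits require("llamaindex") at runtime instead of bundling
// (and rewriting) the package itself.
function withLlamaIndexExternal(config) {
  // `externals` may be undefined, an object, or an array depending on
  // the base config; normalize to an array before appending.
  if (!Array.isArray(config.externals)) {
    config.externals = config.externals ? [config.externals] : [];
  }
  config.externals.push({ llamaindex: "commonjs llamaindex" });
  return config;
}
```

With an external declared this way, Webpack leaves the module resolution to Node at runtime, which sidesteps bundler-induced breakage of the package's exports.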



marcusschiesser commented 3 months ago

@sagech Seems to be unrelated to https://github.com/run-llama/create-llama/issues/78 - do you have a simple test case of the error that should be fixed with this PR?

sagech commented 3 months ago

@marcusschiesser In my specific scenario, I ran create-llama and selected Chat + Nextjs, and Qdrant as the vector database. Full options listed below:

✔ What is your project named? … <REDACTED>
✔ Which template would you like to use? › Chat
✔ Which framework would you like to use? › NextJS
✔ Would you like to set up observability? › No
✔ Please provide your OpenAI API key (leave blank to skip): …
✔ Which data source would you like to use? › Use an example PDF
✔ Would you like to add another data source? › No
✔ Would you like to use LlamaParse (improved parser for RAG - requires API key)? … no / yes
✔ Would you like to use a vector database? › Qdrant
✔ Would you like to build an agent using tools? If so, select the tools here, otherwise just press enter ›
✔ How would you like to proceed? › Just generate code (~1 sec)
  1. The file ./app/api/chat/engine/index.ts is generated from https://github.com/run-llama/create-llama/blob/6fe240b8546e411a140c155be4af0eab1b74c2fb/templates/components/vectordbs/typescript/qdrant/index.ts#L2
  2. Whenever I chat (POST /api/chat), I get an error 500 (arising from the line above):
    [LlamaIndex] TypeError: llamaindex__WEBPACK_IMPORTED_MODULE_1__.QdrantVectorStore is not a constructor
    at getDataSource (webpack-internal:///(rsc)/./app/api/chat/engine/index.ts:21:19)
    at createChatEngine (webpack-internal:///(rsc)/./app/api/chat/engine/chat.ts:10:78)
    at POST (webpack-internal:///(rsc)/./app/api/chat/route.ts:54:96)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async /<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-route.runtime.dev.js:6:53446
    at async e_.execute (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-route.runtime.dev.js:6:44747)
    at async e_.handle (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/compiled/next-server/app-route.runtime.dev.js:6:54700)
    at async doRender (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/server/base-server.js:1377:42)
    at async cacheEntry.responseCache.get.routeKind (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/server/base-server.js:1599:28)
    at async DevServer.renderToResponseWithComponentsImpl (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/server/base-server.js:1507:28)
    at async DevServer.renderPageComponent (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/server/base-server.js:1924:24)
    at async DevServer.renderToResponseImpl (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/server/base-server.js:1962:32)
    at async DevServer.pipeImpl (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/server/base-server.js:920:25)
    at async NextNodeServer.handleCatchallRenderRequest (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/server/next-server.js:272:17)
    at async DevServer.handleRequestImpl (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/server/base-server.js:816:17)
    at async /<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/server/dev/next-dev-server.js:339:20
    at async Span.traceAsyncFn (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/trace/trace.js:154:20)
    at async DevServer.handleRequest (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/server/dev/next-dev-server.js:336:24)
    at async invokeRender (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/server/lib/router-server.js:174:21)
    at async handleRequest (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/server/lib/router-server.js:353:24)
    at async requestHandlerImpl (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/server/lib/router-server.js:377:13)
    at async Server.requestListener (/<REDACTED>/Projects/ml/testcreatellama/node_modules/.pnpm/next@14.2.3_react-dom@18.3.1_react@18.3.1__react@18.3.1/node_modules/next/dist/server/lib/start-server.js:141:13)
    POST /api/chat 500 in 2955ms
  3. If I make the changes to webpack.config.mjs like in this PR, then the error goes away
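The TypeError in the trace above is the generic failure mode when a bundler's module shim drops an export: the imported binding ends up `undefined`, and calling `new` on it throws exactly this error. A minimal standalone reproduction (the shim object and names are illustrative, not actual webpack internals):

```javascript
// Stands in for a broken webpack module object for "llamaindex"
// whose QdrantVectorStore export was not carried over.
const llamaindexShim = {};

let message = "";
try {
  // Mirrors `new QdrantVectorStore(...)` in the generated index.ts.
  new llamaindexShim.QdrantVectorStore();
} catch (err) {
  message = err.message; // e.g. "llamaindexShim.QdrantVectorStore is not a constructor"
}
```

Declaring the package as a commonjs external avoids the shim entirely, so the real constructor is available at runtime.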
marcusschiesser commented 3 months ago

@sagech got it, can you please try with create-llama@0.1.3 again? The issue should be fixed there

sagech commented 3 months ago

@marcusschiesser create-llama@0.1.3 works great for me!

marcusschiesser commented 3 months ago

@sagech great! Then I'll close this PR! Thanks for pointing me to this issue!