run-llama / LlamaIndexTS

LlamaIndex in TypeScript
https://ts.llamaindex.ai
MIT License

Whenever I try to use create-llama through CLI, it throws the error: The data stream is hanging. Did you forget to close it with `data.close()`? #927

Closed · crown-king closed this issue 3 weeks ago

crown-king commented 3 weeks ago

Here's a full copy of what I did on the CLI and the respective output:

Microsoft Windows [Version 10.0.22631.3593] (c) Microsoft Corporation. All rights reserved.

C:\Users\lamak>npx create-llama@latest --ask-models
√ What is your project named? ... my-app
√ Which template would you like to use? » Chat
√ Which framework would you like to use? » NextJS
√ Would you like to set up observability? » No
√ Which model provider would you like to use » Gemini
√ Please provide your Google API key (or leave blank to use GOOGLE_API_KEY env variable): .
√ Which LLM model would you like to use? » gemini-1.5-pro-latest
√ Which embedding model would you like to use? » text-embedding-004
√ Which data source would you like to use? » Use an example PDF
√ Would you like to add another data source? » No
√ Would you like to use LlamaParse (improved parser for RAG - requires API key)? ... no / yes
√ Would you like to use a vector database? » No, just store the data in the file system
√ Would you like to build an agent using tools? If so, select the tools here, otherwise just press enter »
√ How would you like to proceed? » Generate code, install dependencies, and run the app (~2 min)

Creating a new LlamaIndex app in C:\Users\lamak\my-app.

Using npm.

Initializing project with template: streaming

Using vector DB: none

No tools selected - use optimized context chat engine

Installing dependencies:

  • @radix-ui/react-collapsible
  • @radix-ui/react-hover-card
  • @radix-ui/react-slot
  • ai
  • ajv
  • class-variance-authority
  • clsx
  • dotenv
  • llamaindex
  • lucide-react
  • next
  • pdf2json
  • react
  • react-dom
  • react-markdown
  • react-syntax-highlighter
  • remark
  • remark-code-import
  • remark-gfm
  • remark-math
  • rehype-katex
  • supports-color
  • tailwind-merge
  • vaul
  • @llamaindex/pdf-viewer
  • @e2b/code-interpreter
  • uuid

Installing devDependencies:

  • @types/node
  • @types/react
  • @types/react-dom
  • @types/react-syntax-highlighter
  • autoprefixer
  • cross-env
  • eslint
  • eslint-config-next
  • eslint-config-prettier
  • postcss
  • prettier
  • prettier-plugin-organize-imports
  • tailwindcss
  • tsx
  • typescript
  • @types/uuid

npm warn deprecated inflight@1.0.6: This module is not supported, and leaks memory. Do not use it. Check out lru-cache if you want a good and tested way to coalesce async requests by a key value, which is much more comprehensive and powerful.
npm warn deprecated @humanwhocodes/config-array@0.11.14: Use @eslint/config-array instead
npm warn deprecated rimraf@3.0.2: Rimraf versions prior to v4 are no longer supported
npm warn deprecated glob@7.2.3: Glob versions prior to v9 are no longer supported
npm warn deprecated @humanwhocodes/object-schema@2.0.3: Use @eslint/object-schema instead
npm warn deprecated fs-promise@2.0.3: Use mz or fs-extra^3.0 with Promise Support
npm warn deprecated dommatrix@1.0.3: dommatrix is no longer maintained. Please use @thednp/dommatrix.

added 986 packages, and audited 989 packages in 2m

291 packages are looking for funding
run npm fund for details

3 high severity vulnerabilities

Some issues need review, and may require choosing a different dependency.

Run npm audit for details.

Created '.env' file. Please check the settings.

Generating context data...

Copying data from path: C:\Users\lamak\AppData\Local\npm-cache\_npx\7bfc2205dda2d438\node_modules\create-llama\dist\templates\components\data\101.pdf

Running npm run generate to generate the context data.

my-app@0.1.0 generate
tsx app\api\chat\engine\generate.ts

Using 'gemini' model provider
Generating storage context...
No valid data found at path: cache\doc_store.json starting new store.
No valid data found at path: cache\index_store.json starting new store.
No valid data found at path: cache\vector_store.json starting new store.
Storage context successfully generated in 10.826s.
Finished generating storage.
Initialized a git repository.

Success! Created my-app at C:\Users\lamak\my-app
Now have a look at the README.md (file://C:\Users\lamak\my-app/README.md) and learn how to get started.

Running app in C:\Users\lamak\my-app...

my-app@0.1.0 dev
next dev

▲ Next.js 14.2.4

  • Local: http://localhost:3000
  • Environments: .env

✓ Starting...
✓ Ready in 2.6s
○ Compiling / ...
✓ Compiled / in 10s (2917 modules)
GET / 200 in 11380ms
○ Compiling /favicon.ico ...
✓ Compiled /api/chat/config in 5.7s (2812 modules)
✓ Compiled in 0ms (1489 modules)
✓ Compiled in 0ms (1489 modules)
✓ Compiled /api/chat in 1ms (1489 modules)
GET /favicon.ico 200 in 4956ms
✓ Compiled (2363 modules)
GET /api/chat/config 200 in 6989ms
Using 'gemini' model provider
GET /api/chat/config 200 in 932ms
GET /api/chat/config 200 in 11ms
GET /api/chat/config 200 in 10ms
The data stream is hanging. Did you forget to close it with data.close()?
POST /api/chat 200 in 7783ms
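For context on the warning at the end of the log: "The data stream is hanging. Did you forget to close it with data.close()?" is emitted by the `ai` package when a side-channel data stream is never closed, so the client's read loop waits forever. Below is a minimal, self-contained sketch of that failure mode using only the standard `ReadableStream` API (not the AI SDK itself); `makeClosedStream` and `drain` are illustrative names, not anything from the template.

```typescript
// A producer that closes its stream correctly. If controller.close() were
// omitted, drain() below would never resolve -- the same hang the AI SDK's
// warning is describing for an unclosed StreamData.
function makeClosedStream(): ReadableStream<string> {
  return new ReadableStream<string>({
    start(controller) {
      controller.enqueue("hello");
      controller.enqueue("world");
      controller.close(); // analogous to data.close() in the AI SDK
    },
  });
}

// Consume a stream to completion, collecting its chunks.
async function drain(stream: ReadableStream<string>): Promise<string[]> {
  const chunks: string[] = [];
  const reader = stream.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break; // reached only after the producer closes the stream
    if (value !== undefined) chunks.push(value);
  }
  return chunks;
}

drain(makeClosedStream()).then((chunks) => console.log(chunks));
```

With the `controller.close()` call removed, `drain` blocks on `reader.read()` indefinitely, which on the server side surfaces as the hanging-stream warning.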

marcusschiesser commented 3 weeks ago

This is related to create-llama rather than LlamaIndexTS; it is tracked in https://github.com/run-llama/create-llama/issues/125
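The usual shape of this class of bug is a data stream that gets closed only on the happy path, so any early return or thrown error leaves it open. A hedged sketch of the close-on-completion pattern, using a hypothetical `FakeStreamData` stand-in (not the real `StreamData` API from the `ai` package, whose fix lives in the linked issue):

```typescript
// FakeStreamData is a minimal stand-in for an appendable, closable side channel.
class FakeStreamData {
  closed = false;
  appended: unknown[] = [];
  append(value: unknown): void {
    if (this.closed) throw new Error("cannot append after close");
    this.appended.push(value);
  }
  close(): void {
    this.closed = true;
  }
}

// Simulated token stream from an LLM.
async function* fakeLlmStream(): AsyncGenerator<string> {
  yield "Hello, ";
  yield "world!";
}

// Pipe tokens to the client; close `data` in a finally block so it is
// released whether the stream finishes cleanly or throws mid-way.
async function streamChat(data: FakeStreamData): Promise<string> {
  let text = "";
  try {
    for await (const token of fakeLlmStream()) {
      text += token;
      data.append({ token }); // side-channel metadata alongside the text
    }
  } finally {
    data.close(); // without this, the client-side reader hangs forever
  }
  return text;
}
```

The `try`/`finally` placement is the point: closing before the loop truncates the stream, and closing only after a successful loop reproduces the hang on any error path.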