run-llama / LlamaIndexTS

LlamaIndex in TypeScript
https://ts.llamaindex.ai

TypeError: Cannot redefine property: format #1172

Closed: AndreMaz closed this issue 2 weeks ago

AndreMaz commented 2 weeks ago

I'm developing a Next.js-based app, and after updating llamaindex to v0.5.21 I started to see the following error:

⨯ TypeError: Cannot redefine property: format
    at Function.defineProperty ()
    at eval (webpack-internal:///(rsc)/../../node_modules/@llamaindex/core/dist/prompts/index.js:28:8)
    at (rsc)/../../node_modules/@llamaindex/core/dist/prompts/index.js (apps/web/.next/server/vendor-chunks/@llamaindex.js:80:1)
    at __webpack_require__ (apps/web/.next/server/webpack-runtime.js:33:43)
    at eval (webpack-internal:///(rsc)/../../node_modules/llamaindex/dist/index.edge.js:222:82)
    at (rsc)/../../node_modules/llamaindex/dist/index.edge.js (apps/web/.next/server/vendor-chunks/llamaindex.js:560:1)

Initially I assumed that the error was caused by https://github.com/run-llama/LlamaIndexTS/pull/1154 (hence my comments in there)

In my attempt to isolate the problem I checked out the latest commit that was working (llamaindex v0.5.16) and then started bumping llamaindex step by step in my pnpm-workspace.yaml, which looks like:

catalog:
  dotenv: ^16.4.5
  eslint: ^9.8.0
  prettier: ^3.3.3
  typescript: ^5.5.4
  zod: ^3.23.8
  llamaindex: 0.5.16
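
The web app itself references llamaindex through pnpm's catalog protocol, so bumping the version only means editing the workspace file above. Here is a hypothetical, abridged excerpt of apps/web/package.json (the real dependency list is much longer):

{
  "dependencies": {
    "llamaindex": "catalog:"
  }
}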

Results:
v0.5.16 - OK
v0.5.17 - OK
v0.5.18 - OK
v0.5.19 - OK
v0.5.20 - OK
v0.5.21 - Error starts here
v0.5.22 - Error continues here

Rolling back to v0.5.20 makes the error disappear. This leads me to believe that the culprit is some commit between 0.5.20 and 0.5.21.

Here are the changes in pnpm-lock.yaml

--- <unnamed>
+++ <unnamed>
@@ -13,8 +13,8 @@
       specifier: ^9.8.0
       version: 9.8.0
     llamaindex:
-      specifier: 0.5.20
-      version: 0.5.20
+      specifier: 0.5.21
+      version: 0.5.21
     prettier:
       specifier: ^3.3.3
       version: 3.3.3
@@ -233,7 +233,7 @@
         version: 1.2.1
       llamaindex:
         specifier: 'catalog:'
-        version: 0.5.20(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0))(@aws-sdk/credential-providers@3.624.0(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0)))(@notionhq/client@2.2.15(encoding@0.1.13))(encoding@0.1.13)(socks@2.8.3)(typescript@5.5.4)
+        version: 0.5.21(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0))(@aws-sdk/credential-providers@3.624.0(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0)))(@notionhq/client@2.2.15(encoding@0.1.13))(encoding@0.1.13)(socks@2.8.3)(typescript@5.5.4)
       lowlight:
         specifier: ^3.1.0
         version: 3.1.0
@@ -1061,8 +1061,8 @@
     resolution: {integrity: sha512-30iZtAPgz+LTIYoeivqYo853f02jBYSd5uGnGpkFV0M3xOt9aN73erkgYAmZU43x4VfqcnLxW9Kpg3R5LC4YYw==}
     engines: {node: '>=6.0.0'}

-  '@anthropic-ai/sdk@0.21.1':
-    resolution: {integrity: sha512-fqdt74RTdplnaFOYhwNjjK/Ec09Dqv9ekYr7PuC6GdhV1RWkziqbpJBewn42CYYqCr92JeX6g+IXVgXmq9l7XQ==}
+  '@anthropic-ai/sdk@0.27.1':
+    resolution: {integrity: sha512-AKFd/E8HO26+DOVPiZpEked3Pm2feA5d4gcX2FcJXr9veDkXbKO90hr2C7N2TL7mPIMwm040ldXlsIZQ416dHg==}

   '@auth/core@0.27.0':
     resolution: {integrity: sha512-3bydnRJIM/Al6mkYmb53MsC+6G8ojw3lLPzwgVnX4dCo6N2lrib6Wq6r0vxZIhuHGjLObqqtUfpeaEj5aeTHFg==}
@@ -1982,6 +1982,10 @@
     resolution: {integrity: sha512-IndcI5hzlNZ7GS96RV3Xw1R2kaDuXEp7tRIy/KlhidpN/BQ1qh1NZt3377dMLTa44xDUNKT7hnXkA/oUAzD/lg==}
     engines: {node: '>=16.11.0'}

+  '@discoveryjs/json-ext@0.6.1':
+    resolution: {integrity: sha512-boghen8F0Q8D+0/Q1/1r6DUEieUJ8w2a1gIknExMSHBsJFOr2+0KUfHiVYBvucPwl3+RU5PFBK833FjFCh3BhA==}
+    engines: {node: '>=14.17.0'}
+
   '@emotion/babel-plugin@11.11.0':
     resolution: {integrity: sha512-m4HEDZleaaCH+XgDDsPF15Ht6wTLsgDTeR3WYj9Q/k76JtWhrJjcP4+/XlG8LGT/Rol9qUfOIztXeA84ATpqPQ==}

@@ -2751,8 +2755,8 @@
   '@llamaindex/cloud@0.2.2':
     resolution: {integrity: sha512-E0zMO+SEn8V2+T3o9C8v0mmEEnPePR0nsYJBiUtJGdzFTYl7WL4Z/MBihDe1ogFrszOyuYmel7WaVHMaI4oCbg==}

-  '@llamaindex/core@0.1.9':
-    resolution: {integrity: sha512-KcjGrDiIgjfEYytj0ZaBrPqvw9+4chYfpJR/cALx3p4sXlL4FhYDRzjs2vEo91dMn7N0Y73THEwu+u0wJSQwZw==}
+  '@llamaindex/core@0.1.10':
+    resolution: {integrity: sha512-cHU2pAx2ePVLJXCqV7YmRPVYmd4xyTwAD9P/8F6C9g94anhvWpAWRDII+wjGMTfaevG+ZGLzceOYQx+SPvOEMg==}

   '@llamaindex/env@0.1.9':
     resolution: {integrity: sha512-kO24mtV8gl76bDYtu14x2LvPRdAZa4WiQVEDnJhMr1AJwCPbNlonhfJgxHrolc+JqeDLQEH+hdWNH0SKqpzkQQ==}
@@ -2798,8 +2802,10 @@
     resolution: {integrity: sha512-cOZZOVhDSulgK0meTsTkmNXb1ahVvmTmWmfx9gRBwc6hq98wS9JP35ESIoNq3xqEan+UN+gn8187Z6E4NKhLsw==}
     hasBin: true

-  '@mistralai/mistralai@0.5.0':
-    resolution: {integrity: sha512-56xfoC/0CiT0RFHrRNoJYSKCNc922EyHzEPJYY6ttalQ5KZdrNVgXeOetIGX0lDx7IjbxAJrrae2MQgUIlL9+g==}
+  '@mistralai/mistralai@1.0.4':
+    resolution: {integrity: sha512-fLFBD8r4KvITCkKlKcq2ievnNyLd7Oob4xMY7MkY04BqR4nffkTS49DqapnVkemuldtrmHidwPzwD7UT+yFC4A==}
+    peerDependencies:
+      zod: '>= 3'

   '@mixedbread-ai/sdk@2.2.11':
     resolution: {integrity: sha512-NJiY6BVPR+s/DTzUPQS1Pv418trOmII/8hftmIqxXlYaKbIrgJimQfwCW9M6Y21YPcMA8zTQGYZHm4IWlMjIQw==}
@@ -3168,9 +3174,9 @@
   '@petamoriken/float16@3.8.7':
     resolution: {integrity: sha512-/Ri4xDDpe12NT6Ex/DRgHzLlobiQXEW/hmG08w1wj/YU7hLemk97c+zHQFp0iZQ9r7YqgLEXZR2sls4HxBf9NA==}

-  '@pinecone-database/pinecone@2.2.2':
-    resolution: {integrity: sha512-gbe/4SowHc64pHIm0kBdgY9hVdzsQnnnpcWviwYMB33gOmsL8brvE8fUSpl1dLDvdyXzKcQkzdBsjCDlqgpdMA==}
-    engines: {node: '>=14.0.0'}
+  '@pinecone-database/pinecone@3.0.2':
+    resolution: {integrity: sha512-OarESoYHlAEKh09pAzFs7QglCupd6Cv5QUIe9GHiFuVpyIFnBecklcRwWtLL1Qnd0cCFU7XvaWryFwrE4Pr4gA==}
+    engines: {node: '>=18.0.0'}

   '@pkgjs/parseargs@0.11.0':
     resolution: {integrity: sha512-+1VkjdD0QBLPodGrJUeqarH8VAIvQODIbwh9XpP5Syisf7YoQgsJKPNFoqqLQlu+VQ/tVSshMR6loPMn8U+dPg==}
@@ -3692,9 +3698,6 @@

   '@sideway/pinpoint@2.0.0':
     resolution: {integrity: sha512-RNiOoTPkptFtSVzQevY/yWtZwf/RxyVnPy/OcA9HBM3MlGDnBEYL5B41H0MTn0Uec8Hi+2qUtTfG2WWZBmMejQ==}
-
-  '@sinclair/typebox@0.29.6':
-    resolution: {integrity: sha512-aX5IFYWlMa7tQ8xZr3b2gtVReCvg7f3LEhjir/JAjX2bJCMVJA5tIPv30wTD4KDfcwMd7DDYY3hFDeGmOgtrZQ==}

   '@sindresorhus/merge-streams@1.0.0':
     resolution: {integrity: sha512-rUV5WyJrJLoloD4NDN1V1+LDMDWOa4OTsT4yYJwQNpTU6FWxkxHpL7eu4w+DmiH8x/EAM1otkPE1+LaspIbplw==}
@@ -5132,8 +5135,8 @@
     resolution: {integrity: sha512-PDyvQ5f2PValmqZZIJATimcokDt4JjIev8cKbZgEOoZm+U1IJDYuLeTcxZPQdep99R/X0RIlQ6ReQgPOVnPbNw==}
     engines: {node: '>=14.18.0'}

-  cohere-ai@7.10.6:
-    resolution: {integrity: sha512-J9y5wenl6IMqQUjklseocgusXcym0wnmuSoEdWyaNEQSYrNsHqWrpjeOYbQZ3A8/5edpPkR5Qsdwcc4FOJ5DOA==}
+  cohere-ai@7.13.0:
+    resolution: {integrity: sha512-/VTqq2dW7YkQEfeBwEmckAHorQuw1exnfrO3orsixVXASr71oF3TL0w/xi9ZVN9xsoYpXZyVaiD8GBxLEiGJ7Q==}

   color-convert@1.9.3:
     resolution: {integrity: sha512-QfAUtd+vFdAtFQcC8CCyYt1fYWxSqAiK2cSD6zDB8N3cpsEBAvRxp9zOGg6G/SHHJYAT88/az/IuDGALsNVbGg==}
@@ -6412,8 +6415,8 @@
   grid-index@1.1.0:
     resolution: {integrity: sha512-HZRwumpOGUrHyxO5bqKZL0B0GlUpwtCAzZ42sgxUPniu33R1LSFH5yrIcBCHjkctCAh3mtWKcKd9J4vDDdeVHA==}

-  groq-sdk@0.5.0:
-    resolution: {integrity: sha512-RVmhW7qZ+XZoy5fIuSdx/LGQJONpL8MHgZEW7dFwTdgkzStub2XQx6OKv28CHogijdwH41J+Npj/z2jBPu3vmw==}
+  groq-sdk@0.6.1:
+    resolution: {integrity: sha512-K+fWWcgvKeOEFePq7Z7L3Jm7s5M2oKddgW2l3iFEczaVXU5yfGNKgQXd4LQzzm64qxpfOLzndwDzQdcOwi7gZA==}

   gtoken@7.1.0:
     resolution: {integrity: sha512-pCcEwRi+TKpMlxAQObHDQ56KawURgyAf6jtIY046fJ5tIv3zDe/LEIubckAO8fj6JnAxLdmWkUfNyulQ2iKdEw==}
@@ -7072,8 +7075,8 @@
     resolution: {integrity: sha512-LXe8Xlyh3gnxdv4tSjTjscD1vpr/2PRpzq8YIaMJgyKzRG8wdISlWVWnGThJfHnlJ6hmLt2wq1yeeix0TEbuoA==}
     hasBin: true

-  llamaindex@0.5.20:
-    resolution: {integrity: sha512-V91Sg84CUGNIaxh7ZzDR/k0uKLRNneYHq1NDCF/UV1SIN8UeqcWPhVAv6kTH7yFOztMKCw5JwG66uTd0Oe9u/w==}
+  llamaindex@0.5.21:
+    resolution: {integrity: sha512-HizaE6TbyOpPKPyjGrqhpSYX2H++dKndcKx+8ieSiiqK39Ri2IvcYmg9uwDPQszQKQgOZTk9GE6D0OO5sXT7Sg==}
     engines: {node: '>=18.0.0'}
     peerDependencies:
       '@notionhq/client': ^2.2.15
@@ -9303,8 +9306,8 @@
   through@2.3.8:
     resolution: {integrity: sha512-w89qg7PI8wAdvX60bMDP+bFoD5Dvhm9oLheFp5O4a2QF0cSBGsBX4qZmadPMvVqlLJBBci+WqGGOAPvcDeNSVg==}

-  tiktoken@1.0.16:
-    resolution: {integrity: sha512-hRcORIGF2YlAgWx3nzrGJOrKSJwLoc81HpXmMQk89632XAgURc7IeV2FgQ2iXo9z/J96fCvpsHg2kWoHcbj9fg==}
+  tiktoken@1.0.14:
+    resolution: {integrity: sha512-g5zd5r/DoH8Kw0fiYbYpVhb6WO8BHO1unXqmBBWKwoT17HwSounnDtMDFUKm2Pko8U47sjQarOe+9aUrnqmmTg==}

   tiny-case@1.0.3:
     resolution: {integrity: sha512-Eet/eeMhkO6TX8mnUteS9zgPbUMQa4I6Kkp5ORiBD5476/m+PIRiumP5tmh5ioJpH7k51Kehawy2UDfsnxxY8Q==}
@@ -9820,10 +9823,6 @@
   web-namespaces@2.0.1:
     resolution: {integrity: sha512-bKr1DkiNa2krS7qxNtdrtHAmzuYGFQLiQ13TsorsdT6ULTkPLKuu5+GsFpDlg6JFjUTwX2DyhMPG2be8uPrqsQ==}

-  web-streams-polyfill@3.3.3:
-    resolution: {integrity: sha512-d2JWLCivmZYTSIoge9MsgFCZrt571BikcWGYkjC1khllbTeDlGqZ2D8vD8E/lJa8WGWbb7Plm8/XJYV7IJHZZw==}
-    engines: {node: '>= 8'}
-
   web-streams-polyfill@4.0.0-beta.3:
     resolution: {integrity: sha512-QW95TCTaHmsYfHDybGMwO5IJIM93I/6vTRk+daHTWFPhwh+C8Cg7j7XyKrwrj8Ib6vYXe0ocYNrmzY4xAAN6ug==}
     engines: {node: '>= 14'}
@@ -10068,7 +10067,7 @@
       '@jridgewell/gen-mapping': 0.3.5
       '@jridgewell/trace-mapping': 0.3.25

-  '@anthropic-ai/sdk@0.21.1(encoding@0.1.13)':
+  '@anthropic-ai/sdk@0.27.1(encoding@0.1.13)':
     dependencies:
       '@types/node': 18.19.44
       '@types/node-fetch': 2.6.11
@@ -10077,7 +10076,6 @@
       form-data-encoder: 1.7.2
       formdata-node: 4.4.1
       node-fetch: 2.7.0(encoding@0.1.13)
-      web-streams-polyfill: 3.3.3
     transitivePeerDependencies:
       - encoding

@@ -11548,6 +11546,8 @@

   '@discordjs/util@1.1.0': {}

+  '@discoveryjs/json-ext@0.6.1': {}
+
   '@emotion/babel-plugin@11.11.0':
     dependencies:
       '@babel/helper-module-imports': 7.24.7
@@ -12085,10 +12085,10 @@

   '@llamaindex/cloud@0.2.2': {}

-  '@llamaindex/core@0.1.9(@aws-crypto/sha256-js@5.2.0)(js-tiktoken@1.0.14)(pathe@1.1.2)(tiktoken@1.0.16)':
-    dependencies:
-      '@llamaindex/env': 0.1.9(@aws-crypto/sha256-js@5.2.0)(js-tiktoken@1.0.14)(pathe@1.1.2)(tiktoken@1.0.16)
-      '@types/node': 20.14.12
+  '@llamaindex/core@0.1.10(@aws-crypto/sha256-js@5.2.0)(js-tiktoken@1.0.14)(pathe@1.1.2)(tiktoken@1.0.14)':
+    dependencies:
+      '@llamaindex/env': 0.1.9(@aws-crypto/sha256-js@5.2.0)(js-tiktoken@1.0.14)(pathe@1.1.2)(tiktoken@1.0.14)
+      '@types/node': 22.5.4
       zod: 3.23.8
     transitivePeerDependencies:
       - '@aws-crypto/sha256-js'
@@ -12096,12 +12096,12 @@
       - pathe
       - tiktoken

-  '@llamaindex/env@0.1.9(@aws-crypto/sha256-js@5.2.0)(js-tiktoken@1.0.14)(pathe@1.1.2)(tiktoken@1.0.16)':
+  '@llamaindex/env@0.1.9(@aws-crypto/sha256-js@5.2.0)(js-tiktoken@1.0.14)(pathe@1.1.2)(tiktoken@1.0.14)':
     dependencies:
       '@types/lodash': 4.17.7
       '@types/node': 20.14.12
       js-tiktoken: 1.0.14
-      tiktoken: 1.0.16
+      tiktoken: 1.0.14
     optionalDependencies:
       '@aws-crypto/sha256-js': 5.2.0
       pathe: 1.1.2
@@ -12146,11 +12146,9 @@
       rw: 1.3.3
       sort-object: 3.0.3

-  '@mistralai/mistralai@0.5.0(encoding@0.1.13)':
-    dependencies:
-      node-fetch: 2.7.0(encoding@0.1.13)
-    transitivePeerDependencies:
-      - encoding
+  '@mistralai/mistralai@1.0.4(zod@3.23.8)':
+    dependencies:
+      zod: 3.23.8

   '@mixedbread-ai/sdk@2.2.11(encoding@0.1.13)':
     dependencies:
@@ -12470,11 +12468,8 @@

   '@petamoriken/float16@3.8.7': {}

-  '@pinecone-database/pinecone@2.2.2':
-    dependencies:
-      '@sinclair/typebox': 0.29.6
-      ajv: 8.17.1
-      cross-fetch: 3.1.8(encoding@0.1.13)
+  '@pinecone-database/pinecone@3.0.2':
+    dependencies:
       encoding: 0.1.13

   '@pkgjs/parseargs@0.11.0':
@@ -12938,8 +12933,6 @@

   '@sideway/pinpoint@2.0.0': {}

-  '@sinclair/typebox@0.29.6': {}
-
   '@sindresorhus/merge-streams@1.0.0': {}

   '@smithy/abort-controller@3.1.1':
@@ -14676,13 +14669,13 @@

   chownr@2.0.0: {}

-  chromadb@1.8.1(@google/generative-ai@0.12.0)(cohere-ai@7.10.6(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0))(encoding@0.1.13))(encoding@0.1.13)(openai@4.58.2(encoding@0.1.13)(zod@3.23.8)):
+  chromadb@1.8.1(@google/generative-ai@0.12.0)(cohere-ai@7.13.0(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0))(encoding@0.1.13))(encoding@0.1.13)(openai@4.58.2(encoding@0.1.13)(zod@3.23.8)):
     dependencies:
       cliui: 8.0.1
       isomorphic-fetch: 3.0.0(encoding@0.1.13)
     optionalDependencies:
       '@google/generative-ai': 0.12.0
-      cohere-ai: 7.10.6(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0))(encoding@0.1.13)
+      cohere-ai: 7.13.0(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0))(encoding@0.1.13)
       openai: 4.58.2(encoding@0.1.13)(zod@3.23.8)
     transitivePeerDependencies:
       - encoding
@@ -14746,7 +14739,7 @@
     dependencies:
       rfdc: 1.4.1

-  cohere-ai@7.10.6(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0))(encoding@0.1.13):
+  cohere-ai@7.13.0(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0))(encoding@0.1.13):
     dependencies:
       '@aws-sdk/client-sagemaker': 3.624.0
       '@aws-sdk/credential-providers': 3.624.0(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0))
@@ -16319,7 +16312,7 @@

   grid-index@1.1.0: {}

-  groq-sdk@0.5.0(encoding@0.1.13):
+  groq-sdk@0.6.1(encoding@0.1.13):
     dependencies:
       '@types/node': 18.19.44
       '@types/node-fetch': 2.6.11
@@ -16328,7 +16321,6 @@
       form-data-encoder: 1.7.2
       formdata-node: 4.4.1
       node-fetch: 2.7.0(encoding@0.1.13)
-      web-streams-polyfill: 3.3.3
     transitivePeerDependencies:
       - encoding

@@ -17100,36 +17092,37 @@
     transitivePeerDependencies:
       - uWebSockets.js

-  llamaindex@0.5.20(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0))(@aws-sdk/credential-providers@3.624.0(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0)))(@notionhq/client@2.2.15(encoding@0.1.13))(encoding@0.1.13)(socks@2.8.3)(typescript@5.5.4):
-    dependencies:
-      '@anthropic-ai/sdk': 0.21.1(encoding@0.1.13)
+  llamaindex@0.5.21(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0))(@aws-sdk/credential-providers@3.624.0(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0)))(@notionhq/client@2.2.15(encoding@0.1.13))(encoding@0.1.13)(socks@2.8.3)(typescript@5.5.4):
+    dependencies:
+      '@anthropic-ai/sdk': 0.27.1(encoding@0.1.13)
       '@aws-crypto/sha256-js': 5.2.0
       '@azure/identity': 4.4.1
       '@datastax/astra-db-ts': 1.4.1
       '@discordjs/rest': 2.3.0
+      '@discoveryjs/json-ext': 0.6.1
       '@google-cloud/vertexai': 1.2.0(encoding@0.1.13)
       '@google/generative-ai': 0.12.0
       '@grpc/grpc-js': 1.11.1
       '@huggingface/inference': 2.8.0
       '@llamaindex/cloud': 0.2.2
-      '@llamaindex/core': 0.1.9(@aws-crypto/sha256-js@5.2.0)(js-tiktoken@1.0.14)(pathe@1.1.2)(tiktoken@1.0.16)
-      '@llamaindex/env': 0.1.9(@aws-crypto/sha256-js@5.2.0)(js-tiktoken@1.0.14)(pathe@1.1.2)(tiktoken@1.0.16)
-      '@mistralai/mistralai': 0.5.0(encoding@0.1.13)
+      '@llamaindex/core': 0.1.10(@aws-crypto/sha256-js@5.2.0)(js-tiktoken@1.0.14)(pathe@1.1.2)(tiktoken@1.0.14)
+      '@llamaindex/env': 0.1.9(@aws-crypto/sha256-js@5.2.0)(js-tiktoken@1.0.14)(pathe@1.1.2)(tiktoken@1.0.14)
+      '@mistralai/mistralai': 1.0.4(zod@3.23.8)
       '@mixedbread-ai/sdk': 2.2.11(encoding@0.1.13)
-      '@pinecone-database/pinecone': 2.2.2
+      '@pinecone-database/pinecone': 3.0.2
       '@qdrant/js-client-rest': 1.11.0(typescript@5.5.4)
       '@types/lodash': 4.17.7
-      '@types/node': 20.14.12
+      '@types/node': 22.5.4
       '@types/papaparse': 5.3.14
       '@types/pg': 8.11.8
       '@xenova/transformers': 2.17.2
       '@zilliz/milvus2-sdk-node': 2.4.8
       ajv: 8.17.1
       assemblyai: 4.7.0
-      chromadb: 1.8.1(@google/generative-ai@0.12.0)(cohere-ai@7.10.6(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0))(encoding@0.1.13))(encoding@0.1.13)(openai@4.58.2(encoding@0.1.13)(zod@3.23.8))
-      cohere-ai: 7.10.6(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0))(encoding@0.1.13)
+      chromadb: 1.8.1(@google/generative-ai@0.12.0)(cohere-ai@7.13.0(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0))(encoding@0.1.13))(encoding@0.1.13)(openai@4.58.2(encoding@0.1.13)(zod@3.23.8))
+      cohere-ai: 7.13.0(@aws-sdk/client-sso-oidc@3.624.0(@aws-sdk/client-sts@3.624.0))(encoding@0.1.13)
       discord-api-types: 0.37.100
-      groq-sdk: 0.5.0(encoding@0.1.13)
+      groq-sdk: 0.6.1(encoding@0.1.13)
       js-tiktoken: 1.0.14
       lodash: 4.17.21
       magic-bytes.js: 1.10.0
@@ -17145,7 +17138,7 @@
       portkey-ai: 0.1.16
       rake-modified: 1.0.8
       string-strip-html: 13.4.8
-      tiktoken: 1.0.16
+      tiktoken: 1.0.14
       unpdf: 0.11.0(encoding@0.1.13)
       weaviate-client: 3.1.4(encoding@0.1.13)
       wikipedia: 2.1.2
@@ -19967,7 +19960,7 @@

   through@2.3.8: {}

-  tiktoken@1.0.16: {}
+  tiktoken@1.0.14: {}

   tiny-case@1.0.3:
     optional: true
@@ -20522,8 +20515,6 @@

   web-namespaces@2.0.1: {}

-  web-streams-polyfill@3.3.3: {}
-
   web-streams-polyfill@4.0.0-beta.3: {}

   webidl-conversions@3.0.1: {}

The error comes from this file: apps/web/.next/server/vendor-chunks/@llamaindex.js

import { objectEntries } from '../utils/index.js';
import { z } from 'zod';

function getDefaultExportFromCjs (x) {
    return x && x.__esModule && Object.prototype.hasOwnProperty.call(x, 'default') ? x['default'] : x;
}

Object.defineProperty(String.prototype, "format", {
    value: function(...args_) {
        // Create variables
        let self = this;
        let __patterns__ = self.match(/({.*?})/g);
        const { REF, FILL_CHAR, MASK_NUMBER, ALIGN_OP, CROP_SIZE, DOT, FRACTION, TYPE_VAR } = {
            REF: 1,
            FILL_CHAR: 2,
            MASK_NUMBER: 3,
            ALIGN_OP: 4,
            CROP_SIZE: 5,
            DOT: 6,
            FRACTION: 7,
            TYPE_VAR: 8
        };
        const DEFAULT_PLACE = 6;
        const ALL_REGEXP = /{(\w+)?:([^>\^<\d#]|0)?([#%,])?([>^<\.])?(\d+)?(\.)?(\d+)?([eEfFgGdxXobn#%])?}/g;
        const regExpBasic = /{\[?(\w+)\]?}/; // it's not best solution
        const isObject = typeof args_[0] === "object";
        // types/use logic
        __patterns__?.map((pattern, patt_index)=>{
            const kargs = ALL_REGEXP.exec(pattern) || ALL_REGEXP.exec(pattern);
            const wargs = regExpBasic.exec(pattern);
            // Insert values (one 2 one / array / object)
            const INDEX_VAR = (wargs ? wargs[REF] : kargs ? kargs[REF] : patt_index) || patt_index;
            let NATUAL_VALUE = isObject ? args_[0][INDEX_VAR] : args_[INDEX_VAR];
            let ACTUAL_VALUE = isObject ? args_[0][INDEX_VAR] : args_[INDEX_VAR];
            // Verify sintax/semantic
            if (ACTUAL_VALUE === null || ACTUAL_VALUE === undefined) throw new Error(`Replacement index ${INDEX_VAR} out of range for positional args tuple`);
            if (kargs) {
                // If TYPE_VAR is not defined and the first argument is a number, pad a string should from left, so set TYPE_VAR to "d"
                if (kargs[TYPE_VAR] === undefined && typeof ACTUAL_VALUE === "number") {
                    kargs[TYPE_VAR] = "d";
                }
                const LETTER = (!kargs[FILL_CHAR] ? false : !kargs[ALIGN_OP] && [
                    ..."FfbefoxXn"
                ].includes(kargs[FILL_CHAR].toLowerCase()) ? kargs[FILL_CHAR] : kargs[TYPE_VAR]) || kargs[TYPE_VAR];
                //  padronaze
                if (LETTER) {
                    const floatSize = pattern.includes(".") ? Number(kargs[FRACTION] || kargs[CROP_SIZE]) : DEFAULT_PLACE;
                    switch(LETTER){
                        case "E":
                            ACTUAL_VALUE = ACTUAL_VALUE.toExponential(DEFAULT_PLACE).toUpperCase();
                            break;
                        case "e":
                            ACTUAL_VALUE = ACTUAL_VALUE.toExponential(DEFAULT_PLACE);
                            break;
                        case "X":
                            ACTUAL_VALUE = ACTUAL_VALUE.toString(16).toUpperCase();
                            break;
                        case "x":
                            ACTUAL_VALUE = ACTUAL_VALUE.toString(16); // Hexadecimal
                            break;
                        case "b":
                            ACTUAL_VALUE = ACTUAL_VALUE.toString(2); // Binary
                            break;
                        case "f":
                        case "F":
                            ACTUAL_VALUE = ACTUAL_VALUE.toFixed(floatSize);
                            break;
                        case "o":
                            ACTUAL_VALUE = ACTUAL_VALUE.toString(8); // Octal
                            break;
                    }
                    //  mask
                    switch(kargs[MASK_NUMBER]){
                        case "#":
                            const MASK = {
                                x: "0x",
                                X: "0X",
                                o: "0o",
                                b: "0b"
                            }[LETTER];
                            ACTUAL_VALUE = MASK + ACTUAL_VALUE;
                            break;
                    }
                }
                // signal
                if ([
                    ..." +-,%"
                ].includes(kargs[FILL_CHAR]) && typeof NATUAL_VALUE === "number") {
                    ACTUAL_VALUE = ACTUAL_VALUE.toString().replace("-", "");
                    if (NATUAL_VALUE >= 0) switch(kargs[FILL_CHAR]){
                        case "+":
                            ACTUAL_VALUE = "+" + ACTUAL_VALUE;
                            break;
                        case " ":
                            ACTUAL_VALUE = " " + ACTUAL_VALUE;
                            break;
                        case ",":
                            ACTUAL_VALUE = NATUAL_VALUE.toString().split(/(?=(?:...)*$)/).join(kargs[FILL_CHAR]);
                            break;
                        case "%":
                            ACTUAL_VALUE = (NATUAL_VALUE * 100).toFixed(kargs[FRACTION] || DEFAULT_PLACE) + "%";
                            break;
                    }
                    else ACTUAL_VALUE = "-" + ACTUAL_VALUE;
                }
                // space / order / trim
                if (kargs[CROP_SIZE]) {
                    ACTUAL_VALUE = ACTUAL_VALUE.toString();
                    const FILL_ELEMENT = kargs[FILL_CHAR] || " ";
                    const SIZE_STRING = ACTUAL_VALUE.length;
                    const SIZE_ARG = kargs[CROP_SIZE];
                    const FILL_LENGTH = SIZE_STRING > SIZE_ARG ? SIZE_STRING : SIZE_ARG;
                    const FILL = FILL_ELEMENT.repeat(FILL_LENGTH);
                    switch(kargs[ALIGN_OP] || kargs[FILL_CHAR]){
                        case "<":
                            ACTUAL_VALUE = ACTUAL_VALUE.padEnd(FILL_LENGTH, FILL_ELEMENT);
                            break;
                        case ".":
                            if (!(LETTER && /[fF]/.test(LETTER))) ACTUAL_VALUE = ACTUAL_VALUE.slice(0, SIZE_ARG);
                            break;
                        case ">":
                            ACTUAL_VALUE = ACTUAL_VALUE.padStart(FILL_LENGTH, FILL_ELEMENT);
                            break;
                        case "^":
                            const length_start = Math.floor((FILL_LENGTH - SIZE_STRING) / 2);
                            const string_start = length_start > 0 ? FILL_ELEMENT.repeat(length_start) + ACTUAL_VALUE : ACTUAL_VALUE;
                            ACTUAL_VALUE = FILL.replace(RegExp(`.{${string_start.length}}`), string_start);
                            break;
                        default:
                            ACTUAL_VALUE = LETTER ? ACTUAL_VALUE.padStart(FILL_LENGTH, FILL_ELEMENT) : ACTUAL_VALUE.padEnd(FILL_LENGTH, FILL_ELEMENT);
                            break;
                    }
                }
            }
            // SET Definitive value
            self = self.replace(pattern, ACTUAL_VALUE);
        });
        return self;
    }
});
var pythonFormatJs = (inputString, ...param)=>inputString.format(...param);

var format = /*@__PURE__*/getDefaultExportFromCjs(pythonFormatJs);

const promptType = {
    SUMMARY: "summary",
    TREE_INSERT: "insert",
    TREE_SELECT: "tree_select",
    TREE_SELECT_MULTIPLE: "tree_select_multiple",
    QUESTION_ANSWER: "text_qa",
    REFINE: "refine",
    KEYWORD_EXTRACT: "keyword_extract",
    QUERY_KEYWORD_EXTRACT: "query_keyword_extract",
    SCHEMA_EXTRACT: "schema_extract",
    TEXT_TO_SQL: "text_to_sql",
    TEXT_TO_GRAPH_QUERY: "text_to_graph_query",
    TABLE_CONTEXT: "table_context",
    KNOWLEDGE_TRIPLET_EXTRACT: "knowledge_triplet_extract",
    SIMPLE_INPUT: "simple_input",
    PANDAS: "pandas",
    JSON_PATH: "json_path",
    SINGLE_SELECT: "single_select",
    MULTI_SELECT: "multi_select",
    VECTOR_STORE_QUERY: "vector_store_query",
    SUB_QUESTION: "sub_question",
    SQL_RESPONSE_SYNTHESIS: "sql_response_synthesis",
    SQL_RESPONSE_SYNTHESIS_V2: "sql_response_synthesis_v2",
    CONVERSATION: "conversation",
    DECOMPOSE: "decompose",
    CHOICE_SELECT: "choice_select",
    CUSTOM: "custom",
    RANKGPT_RERANK: "rankgpt_rerank"
};
const promptTypeSchema = z.enum([
    promptType.SUMMARY,
    promptType.TREE_INSERT,
    promptType.TREE_SELECT,
    promptType.TREE_SELECT_MULTIPLE,
    promptType.QUESTION_ANSWER,
    promptType.REFINE,
    promptType.KEYWORD_EXTRACT,
    promptType.QUERY_KEYWORD_EXTRACT,
    promptType.SCHEMA_EXTRACT,
    promptType.TEXT_TO_SQL,
    promptType.TEXT_TO_GRAPH_QUERY,
    promptType.TABLE_CONTEXT,
    promptType.KNOWLEDGE_TRIPLET_EXTRACT,
    promptType.SIMPLE_INPUT,
    promptType.PANDAS,
    promptType.JSON_PATH,
    promptType.SINGLE_SELECT,
    promptType.MULTI_SELECT,
    promptType.VECTOR_STORE_QUERY,
    promptType.SUB_QUESTION,
    promptType.SQL_RESPONSE_SYNTHESIS,
    promptType.SQL_RESPONSE_SYNTHESIS_V2,
    promptType.CONVERSATION,
    promptType.DECOMPOSE,
    promptType.CHOICE_SELECT,
    promptType.CUSTOM,
    promptType.RANKGPT_RERANK
]);
const PromptType = promptTypeSchema.enum;

class BasePromptTemplate {
    constructor(options){
        this.metadata = {};
        this.templateVars = new Set();
        this.options = {};
        this.templateVarMappings = {};
        this.functionMappings = {};
        const { metadata, templateVars, outputParser, templateVarMappings, functionMappings } = options;
        if (metadata) {
            this.metadata = metadata;
        }
        if (templateVars) {
            this.templateVars = new Set(templateVars);
        }
        if (options.options) {
            this.options = options.options;
        }
        this.outputParser = outputParser;
        if (templateVarMappings) {
            this.templateVarMappings = templateVarMappings;
        }
        if (functionMappings) {
            this.functionMappings = functionMappings;
        }
    }
    mapTemplateVars(options) {
        const templateVarMappings = this.templateVarMappings;
        return Object.fromEntries(objectEntries(options).map(([k, v])=>[
                templateVarMappings[k] || k,
                v
            ]));
    }
    mapFunctionVars(options) {
        const functionMappings = this.functionMappings;
        const newOptions = {};
        for (const [k, v] of objectEntries(functionMappings)){
            newOptions[k] = v(options);
        }
        for (const [k, v] of objectEntries(options)){
            if (!(k in newOptions)) {
                newOptions[k] = v;
            }
        }
        return newOptions;
    }
    mapAllVars(options) {
        const newOptions = this.mapFunctionVars(options);
        return this.mapTemplateVars(newOptions);
    }
}
class PromptTemplate extends BasePromptTemplate {
    #template;
    constructor(options){
        const { template, promptType, ...rest } = options;
        super(rest);
        this.#template = template;
        this.promptType = promptType ?? PromptType.custom;
    }
    partialFormat(options) {
        const prompt = new PromptTemplate({
            template: this.template,
            templateVars: [
                ...this.templateVars
            ],
            options: this.options,
            outputParser: this.outputParser,
            templateVarMappings: this.templateVarMappings,
            functionMappings: this.functionMappings,
            metadata: this.metadata,
            promptType: this.promptType
        });
        prompt.options = {
            ...prompt.options,
            ...options
        };
        return prompt;
    }
    format(options) {
        const allOptions = {
            ...this.options,
            ...options
        };
        const mappedAllOptions = this.mapAllVars(allOptions);
        const prompt = format(this.template, mappedAllOptions);
        if (this.outputParser) {
            return this.outputParser.format(prompt);
        }
        return prompt;
    }
    formatMessages(options) {
        const prompt = this.format(options);
        return [
            {
                role: "user",
                content: prompt
            }
        ];
    }
    get template() {
        return this.#template;
    }
}

class PromptMixin {
    validatePrompts(promptsDict, moduleDict) {
        for (const key of Object.keys(promptsDict)){
            if (key.includes(":")) {
                throw new Error(`Prompt key ${key} cannot contain ':'.`);
            }
        }
        for (const key of Object.keys(moduleDict)){
            if (key.includes(":")) {
                throw new Error(`Module key ${key} cannot contain ':'.`);
            }
        }
    }
    getPrompts() {
        const promptsDict = this._getPrompts();
        const moduleDict = this._getPromptModules();
        this.validatePrompts(promptsDict, moduleDict);
        const allPrompts = {
            ...promptsDict
        };
        for (const [module_name, prompt_module] of objectEntries(moduleDict)){
            for (const [key, prompt] of objectEntries(prompt_module.getPrompts())){
                allPrompts[`${module_name}:${key}`] = prompt;
            }
        }
        return allPrompts;
    }
    updatePrompts(prompts) {
        const promptModules = this._getPromptModules();
        this._updatePrompts(prompts);
        const subPrompt = {};
        for(const key in prompts){
            if (key.includes(":")) {
                const [module_name, sub_key] = key.split(":");
                if (!subPrompt[module_name]) {
                    subPrompt[module_name] = {};
                }
                subPrompt[module_name][sub_key] = prompts[key];
            }
        }
        for (const [module_name, subPromptDict] of Object.entries(subPrompt)){
            if (!promptModules[module_name]) {
                throw new Error(`Module ${module_name} not found.`);
            }
            const moduleToUpdate = promptModules[module_name];
            moduleToUpdate.updatePrompts(subPromptDict);
        }
    }
}

const defaultTextQAPrompt = new PromptTemplate({
    templateVars: [
        "context",
        "query"
    ],
    template: `Context information is below.
---------------------
{context}
---------------------
Given the context information and not prior knowledge, answer the query.
Query: {query}
Answer:`
});
const anthropicTextQaPrompt = new PromptTemplate({
    templateVars: [
        "context",
        "query"
    ],
    template: `Context information:
<context>
{context}
</context>
Given the context information and not prior knowledge, answer the query.
Query: {query}`
});
const defaultSummaryPrompt = new PromptTemplate({
    templateVars: [
        "context"
    ],
    template: `Write a summary of the following. Try to use only the information provided. Try to include as many key details as possible.

{context}

SUMMARY:"""
`
});
const anthropicSummaryPrompt = new PromptTemplate({
    templateVars: [
        "context"
    ],
    template: `Summarize the following text. Try to use only the information provided. Try to include as many key details as possible.
<original-text>
{context}
</original-text>

SUMMARY:
`
});
const defaultRefinePrompt = new PromptTemplate({
    templateVars: [
        "query",
        "existingAnswer",
        "context"
    ],
    template: `The original query is as follows: {query}
We have provided an existing answer: {existingAnswer}
We have the opportunity to refine the existing answer (only if needed) with some more context below.
------------
{context}
------------
Given the new context, refine the original answer to better answer the query. If the context isn't useful, return the original answer.
Refined Answer:`
});
const defaultTreeSummarizePrompt = new PromptTemplate({
    templateVars: [
        "context",
        "query"
    ],
    template: `Context information from multiple sources is below.
---------------------
{context}
---------------------
Given the information from multiple sources and not prior knowledge, answer the query.
Query: {query}
Answer:`
});
const defaultChoiceSelectPrompt = new PromptTemplate({
    templateVars: [
        "context",
        "query"
    ],
    template: `A list of documents is shown below. Each document has a number next to it along 
with a summary of the document. A question is also provided.
Respond with the numbers of the documents
you should consult to answer the question, in order of relevance, as well
as the relevance score. The relevance score is a number from 1-10 based on
how relevant you think the document is to the question.
Do not include any documents that are not relevant to the question.
Example format:
Document 1:
<summary of document 1>

Document 2:
<summary of document 2>

...

Document 10:\n<summary of document 10>

Question: <question>
Answer:
Doc: 9, Relevance: 7
Doc: 3, Relevance: 4
Doc: 7, Relevance: 3

Let's try this now:

{context}
Question: {query}
Answer:`
});
function buildToolsText(tools) {
    const toolsObj = tools.reduce((acc, tool)=>{
        acc[tool.name] = tool.description;
        return acc;
    }, {});
    return JSON.stringify(toolsObj, null, 4);
}
const exampleTools = [
    {
        name: "uber_10k",
        description: "Provides information about Uber financials for year 2021"
    },
    {
        name: "lyft_10k",
        description: "Provides information about Lyft financials for year 2021"
    }
];
const exampleQueryStr = `Compare and contrast the revenue growth and EBITDA of Uber and Lyft for year 2021`;
const exampleOutput = [
    {
        subQuestion: "What is the revenue growth of Uber",
        toolName: "uber_10k"
    },
    {
        subQuestion: "What is the EBITDA of Uber",
        toolName: "uber_10k"
    },
    {
        subQuestion: "What is the revenue growth of Lyft",
        toolName: "lyft_10k"
    },
    {
        subQuestion: "What is the EBITDA of Lyft",
        toolName: "lyft_10k"
    }
];
const defaultSubQuestionPrompt = new PromptTemplate({
    templateVars: [
        "toolsStr",
        "queryStr"
    ],
    template: `Given a user question, and a list of tools, output a list of relevant sub-questions that when composed can help answer the full user question:

# Example 1
<Tools>
\`\`\`json
${buildToolsText(exampleTools)}
\`\`\`

<User Question>
${exampleQueryStr}

<Output>
\`\`\`json
${JSON.stringify(exampleOutput, null, 4)}
\`\`\`

# Example 2
<Tools>
\`\`\`json
{toolsStr}
\`\`\`

<User Question>
{queryStr}

<Output>
`
});
const defaultCondenseQuestionPrompt = new PromptTemplate({
    templateVars: [
        "chatHistory",
        "question"
    ],
    template: `Given a conversation (between Human and Assistant) and a follow up message from Human, rewrite the message to be a standalone question that captures all relevant context from the conversation.

<Chat History>
{chatHistory}

<Follow Up Message>
{question}

<Standalone question>
`
});
const defaultContextSystemPrompt = new PromptTemplate({
    templateVars: [
        "context"
    ],
    template: `Context information is below.
---------------------
{context}
---------------------`
});
const defaultKeywordExtractPrompt = new PromptTemplate({
    templateVars: [
        "maxKeywords",
        "context"
    ],
    template: `
Some text is provided below. Given the text, extract up to {maxKeywords} keywords from the text. Avoid stopwords.
---------------------
{context}
---------------------
Provide keywords in the following comma-separated format: 'KEYWORDS: <keywords>'
`
}).partialFormat({
    maxKeywords: "10"
});
const defaultQueryKeywordExtractPrompt = new PromptTemplate({
    templateVars: [
        "maxKeywords",
        "question"
    ],
    template: `(
  "A question is provided below. Given the question, extract up to {maxKeywords} "
  "keywords from the text. Focus on extracting the keywords that we can use "
  "to best lookup answers to the question. Avoid stopwords."
  "---------------------"
  "{question}"
  "---------------------"
  "Provide keywords in the following comma-separated format: 'KEYWORDS: <keywords>'"
)`
}).partialFormat({
    maxKeywords: "10"
});

export { BasePromptTemplate, PromptMixin, PromptTemplate, anthropicSummaryPrompt, anthropicTextQaPrompt, defaultChoiceSelectPrompt, defaultCondenseQuestionPrompt, defaultContextSystemPrompt, defaultKeywordExtractPrompt, defaultQueryKeywordExtractPrompt, defaultRefinePrompt, defaultSubQuestionPrompt, defaultSummaryPrompt, defaultTextQAPrompt, defaultTreeSummarizePrompt };

More context about how this error happens: I'm using llamaindex in app/api/chat/route.ts; in other words, I'm using Route Handlers.
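
Roughly, the handler is shaped like this (a stripped-down, hypothetical sketch; the real imports and llamaindex calls in my app are more involved, and the names shown here are assumptions):

// app/api/chat/route.ts (hypothetical sketch)
import { NextResponse } from "next/server";
// any import from "llamaindex" evaluates @llamaindex/core/dist/prompts (see the stack trace above)
import { PromptTemplate } from "llamaindex";

export async function POST(req: Request) {
  const { query } = (await req.json()) as { query: string };
  const prompt = new PromptTemplate({
    template: "Answer the query.\nQuery: {query}\nAnswer:",
    templateVars: ["query"],
  });
  return NextResponse.json({ answer: prompt.format({ query }) });
}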

When I start my app and open /home, Next.js compiles the app and I can call llamaindex at app/api/chat/route.ts without any error. Then, when I navigate to another page (e.g., /products), Next.js does another compilation. After this, when I call the handler at app/api/chat/route.ts, I get the error above.

I'm assuming that during the first compilation the Object.defineProperty(String.prototype, "format", {/* logic */}) call gets executed without any issue. However, during the second compilation it throws the Cannot redefine property: format error.
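
Here is a minimal standalone reproduction of that failure mode, assuming the descriptor omits configurable (which then defaults to false):

// first module evaluation: the property does not exist yet, so defineProperty succeeds
Object.defineProperty(String.prototype, "format", {
  value: function () { return "first"; },
});

// second evaluation of the same module in the same Node.js process:
// the property is now non-configurable and non-writable, and the new value
// is a different function object, so the redefinition is rejected
Object.defineProperty(String.prototype, "format", {
  value: function () { return "second"; },
});
// -> TypeError: Cannot redefine property: format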

@marcusschiesser @himself65 any idea of what's going on here? My knowledge about this lib is pretty limited.

AndreMaz commented 2 weeks ago

Did a quick test by adding an if guard like so:

if (!String.prototype.format) {
  Object.defineProperty(String.prototype, "format", { /* original descriptor from the dist file */ });
}

and it solves the issue.
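
A slightly more defensive variant of the same guard (just a sketch, not the library's actual fix) would also mark the property configurable, so a later re-evaluation overwrites it instead of throwing:

if (!Object.prototype.hasOwnProperty.call(String.prototype, "format")) {
  Object.defineProperty(String.prototype, "format", {
    // hypothetical placeholder; the real value is the formatting function from the dist file above
    value: function () { return String(this); },
    writable: true,
    configurable: true, // a second compilation can redefine it instead of hitting the TypeError
  });
}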

himself65 commented 2 weeks ago

I will do a patch for this package, thanks for the feedback.

himself65 commented 2 weeks ago

any idea of what's going on here? My knowledge about this lib is pretty limited.

We are using this package to keep the same logic as the Python side, so
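
For context, the package patches String.prototype so prompt templates can be filled Python-style. A quick illustration of the behavior the dist code above implements (a sketch, assuming the prompts module has already been evaluated once):

// named placeholders are resolved from an object argument, mirroring Python's str.format
// (the cast is only to satisfy the TypeScript compiler, since the method is added at runtime)
const filled = ("Context: {context}\nQuery: {query}" as any).format({
  context: "some docs",
  query: "What is X?",
});
console.log(filled); // -> "Context: some docs\nQuery: What is X?"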