Chainlit / literalai-typescript

https://docs.literalai.com
Apache License 2.0

llamaindex import issue #66

Closed · ghislainf closed this 1 month ago

ghislainf commented 1 month ago

Hi there! llamaindex is listed as a peer dependency but the package is currently throwing an error when llamaindex is missing.

Here's how you can see this for yourself:
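
(a minimal reproduction, same as the steps later in this thread: a fresh project with @literalai/client but no llamaindex installed)

import { LiteralClient } from '@literalai/client';

// Instantiating the client is enough to trigger the error, even though
// no llamaindex feature is used:
const client = new LiteralClient();
// => error: Cannot find package "llamaindex"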

A bit annoying because llamaindex has an insane number of dependencies.

Cheers! ✌🏻

willydouhard commented 1 month ago

Thank you for the feedback!

I opened https://github.com/Chainlit/literalai-typescript/pull/67, will test it throughout the day and will release on Monday if everything is OK.

ghislainf commented 1 month ago

Great, thanks for your responsiveness!

ghislainf commented 1 month ago

@willydouhard I see the merge (thank you), but there's no new version available on npm. Is that expected?

willydouhard commented 1 month ago

Just released 0.0.517 @ghislainf, let me know if it fixes the issue on your end.

ghislainf commented 1 month ago

No, it's still not working. Steps to reproduce:

bun add @literalai/client

result:

bun add v1.0.3 (25e69c71)

 installed @literalai/client@0.0.517

index.ts:

import { LiteralClient } from '@literalai/client';

const client = new LiteralClient();

bun run index.ts

result:

error: Cannot find package "llamaindex"

ghislainf commented 1 month ago

Here's the diff to fix the issue (I don't have access to submit a PR):

commit d062f6443994dcc8b4ee1b9ae6dc48343cd9e613
Author: ghislainf <ghislain@payfit.com>
Date:   Tue Sep 17 17:56:26 2024 +0200

    fix peerDependencies loading

diff --git a/src/instrumentation/index.ts b/src/instrumentation/index.ts
index c50da06..b21d80b 100644
--- a/src/instrumentation/index.ts
+++ b/src/instrumentation/index.ts
@@ -2,6 +2,7 @@ import { LiteralClient, Maybe } from '..';
 import { Thread } from '../observability/thread';
 import type { AllVercelFn } from './vercel-sdk';

+
 export type OpenAIGlobalOptions = {
   tags?: Maybe<string[]>;
   metadata?: Maybe<Record<string, any>>;
@@ -19,75 +20,85 @@ export default (client: LiteralClient) => ({
    * @param options.metadata Metadata to attach to all generations.
    * @returns
    */
-  openai: (options?: OpenAIGlobalOptions) => {
-    try {
-      // eslint-disable-next-line @typescript-eslint/no-var-requires
-      const { default: instrumentOpenAI } = require('./openai');
-      return instrumentOpenAI(client, options);
-    } catch (error) {
-      throw new Error(
-        'Failed to load OpenAI. Please ensure openai is installed.'
-      );
-    }
-  },
-
-  langchain: {
-    literalCallback: (params?: {
-      threadId?: string;
-      chainTypesToIgnore?: string[];
-    }) => {
+  get openai() {
+    return (options?: OpenAIGlobalOptions) => {
       try {
         // eslint-disable-next-line @typescript-eslint/no-var-requires
-        const { LiteralCallbackHandler } = require('./langchain');
-        return new LiteralCallbackHandler(
-          client,
-          params?.threadId,
-          params?.chainTypesToIgnore
-        );
+        const { default: instrumentOpenAI } = require('./openai');
+        return instrumentOpenAI(client, options);
       } catch (error) {
         throw new Error(
-          'Failed to load langchain. Please ensure langchain is installed.'
+          'Failed to load OpenAI. Please ensure openai is installed.'
         );
       }
-    }
+    };
   },

-  vercel: {
-    instrument: <TFunction extends AllVercelFn>(fn: TFunction) => {
-      try {
-        // eslint-disable-next-line @typescript-eslint/no-var-requires
-        const { makeInstrumentVercelSDK } = require('./vercel-sdk');
-        return makeInstrumentVercelSDK(client)(fn);
-      } catch (error) {
-        throw new Error(
-          'Failed to load Vercel SDK. Please ensure @vercel/ai is installed.'
-        );
+  get langchain() {
+    return {
+      literalCallback: (params?: {
+        threadId?: string;
+        chainTypesToIgnore?: string[];
+      }) => {
+        try {
+          // eslint-disable-next-line @typescript-eslint/no-var-requires
+          const { LiteralCallbackHandler } = require('./langchain');
+          return new LiteralCallbackHandler(
+            client,
+            params?.threadId,
+            params?.chainTypesToIgnore
+          );
+        } catch (error) {
+          throw new Error(
+            'Failed to load langchain. Please ensure langchain is installed.'
+          );
+        }
       }
-    }
+    };
   },

-  llamaIndex: {
-    instrument: () => {
-      try {
-        // eslint-disable-next-line @typescript-eslint/no-var-requires
-        const { instrumentLlamaIndex } = require('./llamaindex');
-        return instrumentLlamaIndex(client);
-      } catch (error) {
-        throw new Error(
-          'Failed to load LlamaIndex. Please ensure llamaindex is installed.'
-        );
+  get vercel() {
+    return {
+      instrument: <TFunction extends AllVercelFn>(fn: TFunction) => {
+        try {
+          // eslint-disable-next-line @typescript-eslint/no-var-requires
+          const { makeInstrumentVercelSDK } = require('./vercel-sdk');
+          return makeInstrumentVercelSDK(client)(fn);
+        } catch (error) {
+          throw new Error(
+            'Failed to load Vercel SDK. Please ensure @vercel/ai is installed.'
+          );
+        }
       }
+    };
+  },
+
+  llamaIndex: {
+    get instrument() {
+      return () => {
+        try {
+          // eslint-disable-next-line @typescript-eslint/no-var-requires
+          const { instrumentLlamaIndex } = require('./llamaindex');
+          return instrumentLlamaIndex(client);
+        } catch (error) {
+          throw new Error(
+            'Failed to load LlamaIndex. Please ensure llamaindex is installed.'
+          );
+        }
+      };
     },
-    withThread: <R>(thread: Thread, callback: () => R) => {
-      try {
-        // eslint-disable-next-line @typescript-eslint/no-var-requires
-        const { withThread } = require('./llamaindex');
-        return withThread(thread, callback);
-      } catch (error) {
-        throw new Error(
-          'Failed to load LlamaIndex. Please ensure llamaindex is installed.'
-        );
-      }
+    get withThread() {
+      return <R>(thread: Thread, callback: () => R) => {
+        try {
+          // eslint-disable-next-line @typescript-eslint/no-var-requires
+          const { withThread } = require('./llamaindex');
+          return withThread(thread, callback);
+        } catch (error) {
+          throw new Error(
+            'Failed to load LlamaIndex. Please ensure llamaindex is installed.'
+          );
+        }
+      };
     }
   }
-});
+});
\ No newline at end of file
diff --git a/src/prompt-engineering/prompt.ts b/src/prompt-engineering/prompt.ts
index bee2abe..129c9db 100644
--- a/src/prompt-engineering/prompt.ts
+++ b/src/prompt-engineering/prompt.ts
@@ -6,13 +6,13 @@ import {

 import { API } from '../api';
 import { DatasetItem } from '../evaluation/dataset';
-import { CustomChatPromptTemplate } from '../instrumentation/langchain';
 import {
   GenerationType,
   IGenerationMessage
 } from '../observability/generation';
 import { Maybe, OmitUtils, Utils } from '../utils';

+
 export interface IPromptVariableDefinition {
   name: string;
   language: 'json' | 'plaintext';
@@ -133,15 +133,27 @@ export class Prompt extends PromptFields {
    * @returns A custom chat prompt template configured with the prompt's data.
    */
   toLangchainChatPromptTemplate() {
-    const lcMessages: [string, string][] = this.templateMessages.map((m) => [
-      m.role,
-      m.content as string
-    ]);
-    const chatTemplate = CustomChatPromptTemplate.fromMessages(lcMessages);
-    chatTemplate.variablesDefaultValues = this.variablesDefaultValues;
-    chatTemplate.literalTemplateMessages = this.templateMessages;
-    chatTemplate.promptId = this.id;
-
-    return chatTemplate;
+    try {
+      const {
+        CustomChatPromptTemplate
+        // eslint-disable-next-line @typescript-eslint/no-var-requires
+      } = require('../instrumentation/langchain');
+
+      const lcMessages: [string, string][] = this.templateMessages.map((m) => [
+        m.role,
+        m.content as string
+      ]);
+      const chatTemplate = CustomChatPromptTemplate.fromMessages(lcMessages);
+      chatTemplate.variablesDefaultValues = this.variablesDefaultValues;
+      chatTemplate.literalTemplateMessages = this.templateMessages;
+      chatTemplate.promptId = this.id;
+
+      return chatTemplate;
+    } catch (error) {
+      console.error('Error loading LangChain module:', error);
+      throw new Error(
+        'Failed to load LangChain. Please ensure LangChain is installed if you want to use this feature.'
+      );
+    }
   }
-}
+}
\ No newline at end of file
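
The pattern throughout the diff is the same: defer loading of optional peer dependencies until the moment they are used, rather than at import/construction time. The object properties become getters, and the top-level import of '../instrumentation/langchain' in prompt.ts becomes an in-method require(). A minimal sketch of the idea, with hypothetical names (heavyHelper, ./optional-peer-wrapper):

// Before: a top-level import resolves the optional peer dependency as soon
// as this module is imported, so a missing package throws immediately.
// import { heavyHelper } from './optional-peer-wrapper';

export function useOptionalFeature() {
  try {
    // After: the dependency is resolved only when the feature is called.
    // eslint-disable-next-line @typescript-eslint/no-var-requires
    const { heavyHelper } = require('./optional-peer-wrapper');
    return heavyHelper();
  } catch (error) {
    throw new Error(
      'Optional dependency missing. Please install it to use this feature.'
    );
  }
}

With this shape, new LiteralClient() succeeds without llamaindex, and the "Please ensure ... is installed" error only surfaces if the matching instrumentation is actually invoked.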
willydouhard commented 1 month ago

Thank you for the diff! I think https://github.com/Chainlit/literalai-typescript/pull/69 covers it, WDYT?

ghislainf commented 1 month ago

Yes, it works for the import.