slavakurilyak opened this issue 1 year ago
Hey @slavakurilyak - I'd love to hear more about this LangChain integration idea. From our testing with LangChain JS, we've explored using it primarily by wrapping each LangChain call in a step.run, meaning it's mostly LangChain code with a little help from Inngest's step.run, rather than a deep integration (see the basic example below).
What type of integration would you like to see? How might that work for you?
import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";
import { LLMChain } from "langchain/chains";

// `inngest` is the Inngest client, created elsewhere in the app.
export const basicChain = inngest.createFunction(
  { name: "Basic Chain" },
  { event: "ai/basic.chain" },
  async ({ event, step }) => {
    // Get the input data from the event payload
    const product = event.data.product;

    const model = new OpenAI({ temperature: 0 });
    const prompt = PromptTemplate.fromTemplate(
      "What is a good name for a company that makes {product}?"
    );
    const chainA = new LLMChain({ llm: model, prompt });

    // Wrap the LLM call in step.run so it is retried and memoized
    // independently of the rest of the function.
    const result = await step.run("First prompt", async () => {
      return await chainA.call({ product });
    });

    return { message: "success", result };
  }
);
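The reason each LangChain call is wrapped in step.run is that Inngest persists each step's result, so when a function is retried, already-completed steps return their saved output instead of re-running the LLM call. Here's a minimal self-contained sketch of that memoization idea (a stub for illustration, not the real Inngest implementation):

```typescript
// Stubbed stand-in for step.run's memoization behavior (illustrative
// only, not Inngest's actual implementation): completed step results
// are stored by id, so a retried run skips re-executing them.
type StepFn<T> = () => Promise<T>;

class StepMemo {
  private results = new Map<string, unknown>();

  async run<T>(id: string, fn: StepFn<T>): Promise<T> {
    if (this.results.has(id)) {
      // Step already completed in a previous attempt: skip re-execution.
      return this.results.get(id) as T;
    }
    const result = await fn();
    this.results.set(id, result);
    return result;
  }
}

// Usage sketch: an expensive call (e.g. an LLM request) runs once even
// if the surrounding function body is executed again on retry.
async function demo(): Promise<number> {
  const memo = new StepMemo();
  let calls = 0;
  const expensive = async () => {
    calls += 1;
    return "acme-socks-co";
  };
  await memo.run("First prompt", expensive); // executes
  await memo.run("First prompt", expensive); // memoized, skipped
  return calls; // 1
}
```

This is why even a thin wrapper buys durability: the LLM call becomes retry-safe without any LangChain-specific code in Inngest itself.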
Is your feature request related to a problem? Please describe.
There is currently no straightforward way to run LangChain-supported models in serverless environments without dealing with infrastructure or state concerns. This creates a barrier for developers who want to deploy and manage LLM and chat-model applications serverlessly.
Describe the solution you'd like
I propose integrating LangChain with Inngest so that developers can run LangChain-managed language models in a serverless environment, with infrastructure and state concerns handled automatically. This would greatly simplify deploying and managing LLM applications serverlessly.
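One shape such an integration could take (purely hypothetical: `wrapChainInSteps` and the `Step`/`Chain` interfaces below are illustrative stand-ins, not real Inngest or LangChain APIs) is a helper that wraps a chain so every call automatically becomes a durable step, sparing the caller from writing step.run by hand:

```typescript
// Hypothetical sketch of an integration helper; none of these names
// exist in Inngest or LangChain today.
interface Step {
  run<T>(id: string, fn: () => Promise<T>): Promise<T>;
}

interface Chain {
  call(input: Record<string, string>): Promise<string>;
}

// Wrap a chain so each .call() is executed inside a numbered step,
// making every LLM invocation individually retryable and memoizable.
function wrapChainInSteps(step: Step, name: string, chain: Chain): Chain {
  let calls = 0;
  return {
    call: (input) => {
      calls += 1;
      return step.run(`${name}:call:${calls}`, () => chain.call(input));
    },
  };
}
```

Inside an Inngest function, `const chain = wrapChainInSteps(step, "basic", new LLMChain({ llm, prompt }))` would then let existing LangChain code run unchanged while gaining step-level durability.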
Describe alternatives you've considered
Alternatives to LangChain include LlamaIndex (previously known as GPT Index).
Additional context
Inngest's blog post on May 16, 2023, highlighted their interest in integrating with LangChain to allow people to run LangChain models in serverless environments. Given Inngest's mission to simplify and automate serverless workflows, and LangChain's goal to enable developers to build LLM applications, this integration seems like a natural fit.