langchain-ai / langchainjs

πŸ¦œπŸ”— Build context-aware reasoning applications πŸ¦œπŸ”—
https://js.langchain.com/docs/
MIT License

Add support for Cloudflare Workers / Vercel Edge Functions #62

Closed · dqbd closed this issue 1 year ago

dqbd commented 1 year ago

So far there seem to be three culprits that may block us from running LangChain on the Edge.

This is the error output when attempting to deploy the examples/src/chains/llm_chain.ts via Vercel:

[Screenshot: Vercel build error output listing unresolved Node.js modules]

jasongill commented 1 year ago

I'm also investigating this, seeing if I can clean up or pare down the package a bit to make it work in Cloudflare Workers. In theory, node_compat = true in wrangler.toml should fix many of those missing modules by adding polyfills, but I think Node's crypto will also need to be replaced with WebCrypto.
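
On the crypto point, a minimal sketch of the kind of replacement involved (my example, not code from the package): Node's createHash can be swapped for the Web Crypto API, which Workers expose as the global crypto:

// SHA-256 hex digest via the Web Crypto API, available in Cloudflare
// Workers and browsers with no Node polyfill required.
async function sha256Hex(text: string): Promise<string> {
  const data = new TextEncoder().encode(text);
  const digest = await crypto.subtle.digest("SHA-256", data);
  return [...new Uint8Array(digest)]
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}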

nfcampos commented 1 year ago

Another thing to fix here is to add support for at least one docstore that is external, since there is no filesystem on the Edge to load one from a file.
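
As a hypothetical sketch of what an external docstore could look like (the class name and endpoint are made up, not LangChain API):

interface StoredDocument {
  pageContent: string;
  metadata: Record<string, unknown>;
}

// Hypothetical: fetch documents over HTTP instead of from the filesystem,
// which Edge runtimes don't provide.
class HttpDocstore {
  constructor(private baseUrl: string) {}

  async search(id: string): Promise<StoredDocument> {
    const res = await fetch(`${this.baseUrl}/docs/${encodeURIComponent(id)}`);
    if (!res.ok) throw new Error(`Document "${id}" not found`);
    return (await res.json()) as StoredDocument;
  }
}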

nfcampos commented 1 year ago

One option for the OpenAI adapter is mentioned here: https://github.com/openai/openai-node/issues/30#issuecomment-1342350056
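
The gist of that suggestion is replacing the Node-only axios HTTP adapter with fetch, which Edge runtimes provide natively. A minimal sketch of the idea (the model and parameters here are illustrative):

// Call the OpenAI completions API directly with fetch, which works on
// Edge runtimes where axios's Node http adapter does not.
async function complete(prompt: string, apiKey: string): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model: "text-davinci-003", prompt, max_tokens: 256 }),
  });
  const json = (await res.json()) as { choices: { text: string }[] };
  return json.choices[0].text;
}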

nfcampos commented 1 year ago

@dqbd thanks so much for getting this started. I've updated your issue above to keep track of the progress being made.

nfcampos commented 1 year ago

Discussion on the next environments to support is here: https://github.com/hwchase17/langchainjs/discussions/152

jasongill commented 1 year ago

@nfcampos @dqbd I would be happy to pay to encourage Cloudflare Workers support in LangChain; if there's a place/dev I can send $500 to contribute to getting this done, please let me know! LangChain at the edge is very exciting.

hwchase17 commented 1 year ago

@jasongill Trying to think through/understand the use case here. This may sound like a really stupid question, but what exactly is it about LangChain at the edge that excites you so much?

jasongill commented 1 year ago

We run a large "AI writing" tool (think Jasper competitor), and one big issue we have is slow responses from LLMs. With a "server-side" model, these slow responses consume resources: webserver connection slots, open file descriptors, etc., not to mention a small number of CPU cycles that can add up with tens of thousands of simultaneous users.

Being able to build our prompts and chains into a serverless environment allows us to no longer worry about scalability issues - each user request straight from the browser can be handled by a Cloudflare Worker and can take as long as it wants without consuming any of our server time or resources, returning a response back to our frontend interface when it's done.

Basically, having LangChain at the edge will allow us to get rid of servers and worry less about stability and more about just building great prompts & user experiences.
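
For concreteness, the pattern described is roughly a Worker like the following (a hypothetical sketch; the request shape and placeholder completion are mine, not LangChain code):

// Hypothetical sketch: the browser posts a prompt, the Worker handles the
// slow LLM round-trip, and no origin-server resources are tied up meanwhile.
export default {
  async fetch(request: Request): Promise<Response> {
    const { prompt } = (await request.json()) as { prompt: string };
    // ...build the chain and call the LLM here, however long it takes...
    const completion = `echo: ${prompt}`; // placeholder for the real LLM call
    return new Response(JSON.stringify({ completion }), {
      headers: { "Content-Type": "application/json" },
    });
  },
};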

nfcampos commented 1 year ago

Btw, if someone wants to take this on, I started on the remaining issues on this branch: https://github.com/hwchase17/langchainjs/compare/main...nc/test-envs

This:

  1. Adds a way to test against CF Workers and Vercel Edge (to surface all the remaining issues)
  2. Fixes (or starts to fix, I can't remember just now) the crypto issue mentioned at the top of this thread

jasongill commented 1 year ago

I'm still happy to put a bounty on getting Cloudflare Workers support added, or to donate to the charity of the dev's choice, to get this done. Unfortunately, actually doing the coding is above my skill level, but I'm happy to contribute to the project to encourage this feature!

isaiahtaylor commented 1 year ago

FYI: if you are trying to use the Edge runtime because it supports streaming, you can now stream from Serverless: https://twitter.com/isaiah_p_taylor/status/1638216571507331078

tiagoefreitas commented 1 year ago

The Serverless free tier has a low timeout, making it unsuitable for LangChain; the Edge free tier has a longer timeout.

yvesloy commented 1 year ago

Do you already know when you'll get to task 2 (better tree-shaking support)? Can I help with this?

nfcampos commented 1 year ago

@yvesloy task 2 has actually been completed already; I had forgotten to update the list above. The next step is to set up a build specifically for the edge.

treyhuffine commented 1 year ago

Very excited about this and happy to contribute as needed to get it working on Vercel Edge.

nfcampos commented 1 year ago

Hi, edge/browser support has been published as a prerelease. You can install it with npm i langchain@next to try it out before we release it to everyone.

See the upgrade instructions here: https://langchainjs-docs-git-nc-test-exports-cf-langchain.vercel.app/docs/getting-started/install. You'll need to update import paths; see the link for more details.

If you test it, let me know about any issues!
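
For illustration, this is the shape of the import-path change involved (based on the subpath entrypoints used later in this thread; check the linked guide for the exact paths):

// Before: broad entrypoint
// import { OpenAI } from "langchain/llms";
// After: environment-aware subpath entrypoint
import { OpenAI } from "langchain/llms/openai";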

treyhuffine commented 1 year ago

Thanks @nfcampos, I'll start testing!

nfcampos commented 1 year ago

Support for browsers, Cloudflare Workers, Next.js/Vercel (browser/serverless/edge), Deno, and Supabase Edge Functions has been released; see https://blog.langchain.dev/js-envs/ for details. See the docs for install/upgrade instructions, including some breaking changes: https://js.langchain.com/docs/getting-started/install

If you run into any issues, let me know.

abhagsain commented 1 year ago

Support for browsers, Cloudflare Workers, Next.js/Vercel (browser/serverless/edge), Deno, and Supabase Edge Functions has been released; see https://blog.langchain.dev/js-envs/ for details. See the docs for install/upgrade instructions, including some breaking changes: https://js.langchain.com/docs/getting-started/install

If you run into any issues, let me know.

I'm really excited about this update. However, importing langchain significantly increases the bundle size for Cloudflare Workers. I just imported it to try it out in my app and received a warning.

Before

Total Upload: 928.13 KiB / gzip: 187.40 KiB

After

Total Upload: 4178.92 KiB / gzip: 1457.85 KiB
β–² [WARNING] We recommend keeping your script less than 1MiB (1024 KiB) after gzip. 
Exceeding past this can affect cold start time

I haven't used langchain yet, so I followed the docs and only imported these.

import { OpenAI } from 'langchain/llms/openai'
import { PromptTemplate } from 'langchain/prompts'
import { LLMChain } from 'langchain/chains'

Any recommendations, @nfcampos?

Edit: it only affects cold start time, so it won't be that bad after the first request. I think I don't need to worry.

beljand commented 1 year ago

(quoting @abhagsain's bundle-size report above)

Same issue here: you will not be able to deploy this worker, since it exceeds the size limit.

abhagsain commented 1 year ago

(quoting @beljand's reply above)

I haven't tried deploying it yet, but the limit is 5 MB after compression, right? So it shouldn't be a problem.

[Screenshot: Cloudflare Workers size limits by plan]

captainjapeng commented 1 year ago

The 5 MB limit only applies to paid plans; on the Free Tier it's only 1 MB. Also, the warning says it could affect startup time.

It would help if we could reduce this or rely on tree-shaking, but I don't know whether CF Workers is already doing that.

abhagsain commented 1 year ago

(quoting @captainjapeng's reply above)

Got it. I asked about this in the CF Workers Discord; they said it will only affect cold start time, which happens only if your worker has been asleep for a long time. After the first request it will be faster, so they said not to worry about it. But I do agree about the limits on the free and paid tiers: some people would need to upgrade to a paid plan just to use LangChain on Workers.

nfcampos commented 1 year ago

I've opened #809 to investigate bundle size; if anyone wants to help out there, you're very welcome.

jaredpalmer commented 1 year ago

Jared here from Vercel. Can we please re-open this issue? The most basic OpenAI x Next.js call with the Edge Runtime still fails because of tiktoken. We would like to send folks toward LangChain in our documentation, but serverless is not as cost-effective or as performant for our customers.

I have created an example that reproduces this issue: https://github.com/jaredpalmer/nextjs-langchain-bug/tree/main

[Screenshot: Next.js Edge Runtime build failure caused by the WASM tokenizer]

nfcampos commented 1 year ago

cc @dqbd, this one might be best for you. @jaredpalmer thanks for the repro, we'll look into this ASAP.

jaredpalmer commented 1 year ago

Update: I was able to get it working in Next.js and Vercel with the following changes to Next.js's webpack config:

/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    serverActions: true,
  },
  webpack(config) {
    // Enable WebAssembly support, which the @dqbd/tiktoken tokenizer needs
    config.experiments = {
      asyncWebAssembly: true,
      layers: true,
    };

    return config;
  },
};

module.exports = nextConfig;

nfcampos commented 1 year ago

We actually have a similar config in our install instructions in the docs; see https://js.langchain.com/docs/getting-started/install#vercel--nextjs. Would you like us to reword it or change it in any way?

jaredpalmer commented 1 year ago

If you want to use LangChain in frontend pages, you need to add the following to your next.config.js to enable support for WebAssembly modules (which is required by the tokenizer library @dqbd/tiktoken):

should change to something like

To use LangChain with Next.js (either with app/ or pages/), add the following to your next.config.js to enable support for WebAssembly modules (which is required by LangChain's tokenizer library @dqbd/tiktoken):

nfcampos commented 1 year ago

Cool, I've updated it here https://github.com/hwchase17/langchainjs/pull/1165

nfcampos commented 1 year ago

@jaredpalmer hi, we've merged PR #1239, which removes the usage of the WASM tokenizer library and therefore removes the need for any special config for Next.js/Vercel. This will be released today, and should make using LangChain from Next.js "just work".
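
For anyone landing here later, a minimal sketch of what this enables: a hypothetical pages-style Edge API route, assuming OPENAI_API_KEY is set in the project's environment (the file path and prompt are illustrative):

// pages/api/hello.ts -- hypothetical example; no special webpack config
// should be needed once the WASM tokenizer is gone.
import { OpenAI } from "langchain/llms/openai";

export const config = { runtime: "edge" };

export default async function handler(): Promise<Response> {
  // Reads OPENAI_API_KEY from the environment by default
  const model = new OpenAI({ temperature: 0 });
  const text = await model.call("Say hello from the Edge!");
  return new Response(text);
}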