Closed jaredpalmer closed 1 year ago
Related to #62 and the discussion at #152
@nfcampos I have no experience with "Create and test Cloudflare Workers and Vercel Edge templates", but I'd love to tag team on this. This is also a huge need for our organization, similar to the points @jasongill brought up.
I'd also want to add some kind of Deno export to allow Deno edge functions to be run.
I'm still happy to put a bounty on getting Cloudflare Worker support added, or donate to a charity of the devs' choice, to get this done. Unfortunately, actually doing the coding is above my skill level, but I'm happy to contribute to the project to encourage this feature!
You can use @ericlewis/langchain. It is a version of langchain that is supported on Workers with no need for node_compat.
@ericlewis I just sent some $ to you via GitHub sponsorship, thank you for this! Do we think this is something that can get merged upstream? I see it appears to need changes to both the OpenAI packages and langchain itself.
I think the only remaining change is the stuff needed for the crypto dependency; the OpenAI stuff is likely done already. My library is based on my own version of both changes. I have a PR open for the crypto stuff too.
I've merged the crypto PR today, thank you
@nfcampos also happy to send some $ to you and @hwchase17 if you let me know where (don't see your sponsorship button is set up). You can contact me directly if you'd like, my github username @gmail.com
Love this project and the ecosystem that's building around it!
sweet, then that should be most of the big things that needed changing for now.
@jasongill I can confirm that the latest release works at least basically on cloudflare workers.
@ericlewis awesome, yes it's working for me in node_compat mode! it's complaining when node_compat is off but who cares - it works!
node_modules/langchain/dist/callbacks/tracers.js:1:25: ERROR: Could not resolve "process"
node_modules/langchain/dist/callbacks/utils.js:1:25: ERROR: Could not resolve "process"
Alright, I've released a new version (0.0.35) which should remove those process warnings.
@jasongill @nfcampos can confirm that you no longer need node_compat in 0.0.35. Working great!
@ericlewis can you post a sample if you get a moment? Without node_compat, while using 0.0.35, I'm still getting errors about a few places where process.env is called (one example is callbacks/utils.ts, but there are others, such as where it tries to get OPENAI_API_KEY).
You sure you updated? I will publish a project soon as an example.
Yeah, definitely updated. Here's an example from master of a usage of process.env that remains in the code:
https://github.com/hwchase17/langchainjs/blob/main/langchain/src/callbacks/utils.ts#L17
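The failing pattern is an unconditional read of process.env at module scope. A defensive sketch of how such a read could be guarded (a hypothetical helper for illustration, not langchain's actual fix):

```typescript
// Hypothetical helper: read an environment variable without assuming a
// Node-style global `process` exists (it doesn't on Cloudflare Workers
// unless node_compat is enabled).
function getEnvVar(name: string): string | undefined {
  const proc = (globalThis as any).process;
  if (proc && proc.env) {
    return proc.env[name];
  }
  // On Workers, configuration arrives via the `env` binding passed to
  // fetch(), so callers must fall back to passing values explicitly
  // (e.g. `new OpenAI({ openAIApiKey: env.OPENAI_API_KEY })`).
  return undefined;
}

// Under Node this reads the real environment; on Workers it returns undefined
// instead of throwing ReferenceError.
console.log(typeof getEnvVar("PATH"));
```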
This should be fine if using the latest wrangler etc. for CFW, but without more specifics it's hard to know why it's not working.
wrangler.toml
name = "chatty"
main = "src/index.ts"
compatibility_date = "2023-03-01"
usage_model = "unbound"
[durable_objects]
bindings = [{name = "BRAIN", class_name = "Brain"}]
[[migrations]]
tag = "v1"
new_classes = ["Brain"]
[vars]
PROMPT = "You are a large language model trained by OpenAI to behave as an assistant."
VERSION = "v76"
MODEL = "gpt-3.5-turbo"
# The necessary secrets are:
# - OPENAI_API_KEY
# Run `echo <VALUE> | wrangler secret put <NAME>` for each of these
package.json
{
"name": "chatty",
"version": "0.0.0",
"devDependencies": {
"@cloudflare/workers-types": "^4.20230228.0",
"@types/object-hash": "^3.0.2",
"i": "^0.3.7",
"npm": "^9.6.0",
"typescript": "^4.9.5",
"wrangler": "2.12.0"
},
"private": true,
"scripts": {
"start": "wrangler dev",
"deploy": "wrangler publish"
},
"dependencies": {
"langchain": "^0.0.35"
}
}
tsconfig.json
{
"compilerOptions": {
"target": "es2021",
"lib": [
"es2021"
],
"jsx": "react",
"module": "es2022",
"moduleResolution": "node",
"types": [
"@cloudflare/workers-types",
"vitest"
],
"resolveJsonModule": true,
"allowJs": true,
"checkJs": false,
"noEmit": true,
"isolatedModules": true,
"allowSyntheticDefaultImports": true,
"forceConsistentCasingInFileNames": true,
"strict": true,
"skipLibCheck": true
}
}
index.js
import {OpenAI} from "langchain/llms";
export default {
async fetch(request, env) {
const model = new OpenAI({openAIApiKey: env.OPENAI_API_KEY});
const res = await model.call(
"What is the capital of France?"
);
return new Response(res.trim());
},
};
package.json
{
"dependencies": {
"langchain": "^0.0.35"
},
"devDependencies": {
"wrangler": "^2.12.3"
}
}
wrangler.toml
name = "langchainjs-example"
main = "index.js"
compatibility_date = "2023-03-15"
[vars]
OPENAI_API_KEY="...."
npx wrangler dev
output when loading the page; same thing happens after npx wrangler publish
3:33:50 PM GET / 500
✘ [ERROR] Uncaught (in promise) ReferenceError: process is not defined
^
at getInstance
(langchainjs-example/node_modules/langchain/src/callbacks/utils.ts:17:6)
at getCallbackManager
(langchainjs-example/node_modules/langchain/src/callbacks/utils.ts:27:34)
at BaseLanguageModel
(langchainjs-example/node_modules/langchain/src/base_language/index.ts:29:53)
at BaseLLM
(langchainjs-example/node_modules/langchain/src/llms/base.ts:44:4)
at OpenAI
(langchainjs-example/node_modules/langchain/src/llms/openai.ts:148:4)
at fetch (langchainjs-example/index.js:5:22)
I am guessing that since you are using TypeScript, Workers must be doing/injecting something else into your environment, perhaps? It doesn't appear to work in plain JavaScript without node_compat (with node_compat it works great).
I will try with just JS, but that shouldn't be the case.
Hi, edge/browser support has been published as a prerelease; you can install it with npm i langchain@next to try it out before we release to everyone.
See upgrade instructions here: https://langchainjs-docs-git-nc-test-exports-cf-langchain.vercel.app/docs/getting-started/install — you'll need to update import paths; see the link for more details.
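For reference, the import-path update mentioned above looks roughly like this (paths are illustrative, taken from the linked install docs at the time; verify against the docs before relying on them):

```ts
// Before (0.0.x) — single entrypoint:
//   import { OpenAI } from "langchain/llms";
// After (prerelease) — entrypoints are split per module:
//   import { OpenAI } from "langchain/llms/openai";
```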
If you test it let me know any issues!
Support for browsers, Cloudflare Workers, Next.js/Vercel (browser/serverless/edge), Deno, and Supabase Edge Functions has been released; see https://blog.langchain.dev/js-envs/ for details. See the docs for install/upgrade instructions, including some breaking changes: https://js.langchain.com/docs/getting-started/install
Any issues, let me know!
Slow clap. Great work @nfcampos. It works swimmingly on Supabase (v1.49.4 CLI). The latest edge-runtime image seems to be having issues (which is funny because it's Supabase's big release today).
Right now there is a reliance on fs and other native Node modules which do not exist in the Vercel Edge runtime. It would be amazing if we could publish a langchain/edge with the functionality that can run at the edge.