ZhangHanwen96 opened 1 year ago
Also keen on using LMQL on server-side (via node.js)
My current setup is that I'm calling OpenAI from a Vercel serverless function (Next.js) and manually checking that completions are in a certain format.
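The manual format-checking described above can be sketched roughly like this. The expected shape and field names here are purely hypothetical, just to illustrate the kind of validation that LMQL constraints would otherwise handle:

```typescript
// Hypothetical expected shape of a completion; the real schema depends
// on whatever structure your prompt asks the model to produce.
interface ReviewResult {
  sentiment: "positive" | "negative" | "neutral";
  summary: string;
}

// Manually validate that a raw completion string parses into the
// expected structure, returning null on any mismatch.
function parseCompletion(raw: string): ReviewResult | null {
  try {
    const data = JSON.parse(raw);
    const validSentiments = ["positive", "negative", "neutral"];
    if (
      typeof data === "object" &&
      data !== null &&
      validSentiments.includes(data.sentiment) &&
      typeof data.summary === "string"
    ) {
      return data as ReviewResult;
    }
    return null;
  } catch {
    // Model produced something that isn't even valid JSON.
    return null;
  }
}
```

This is exactly the kind of boilerplate that an LMQL-style constraint system could replace with declarative output constraints.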
I think a client side library in TypeScript would be awesome. I'm in to contribute.
I'm looking forward to the TypeScript implementation, using it in edge runtime (Cloudflare workers, Vercel edge functions, Deno, etc.) or Node.js.
I looked into bridging Python/Node.js interoperability, but the space seems quite sparse. Is a full native port the only option here, or is anyone aware of a non-messy solution for calling Python code from Node.js?
Given that the In-Browser playground is already based on WebAssembly/Pyodide (https://pyodide.org/en/stable/usage/index.html#node-js), it may make sense to investigate this further, for local use. In the browser, the tokenizer is unfortunately a little slow.
Just wanted to add: I believe LangChain is doing a 1:1 port of their Python library to JavaScript: https://github.com/hwchase17/langchainjs
As for the questions above:
> Do you want to run LMQL in the browser or server-side, e.g. via Node.js?
I think it's interesting to explore supporting both from the get-go, especially with the rise of edge runtimes like Vercel Edge Functions. LangChain just added support for that recently: https://github.com/hwchase17/langchainjs/pull/632
> Are you looking for a full-on adaptation of the LMQL language to allow inline TypeScript/JavaScript in the prompt clause, or are you just interested in some basic API that gives you a run(query_code: string) function?
To be honest, I just want to run the queries for now.
Thanks for creating this interesting language 🙏
In the meantime, developers looking to use this from other languages can set up a Python API server and execute LMQL queries that way.
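As a sketch of that approach, a Node.js or edge client could POST query code to such a self-hosted server. The `/query` endpoint path and the request/response shapes below are made up for illustration; they depend entirely on how you build the Python side. The `fetch` implementation is passed in explicitly so the client works in any runtime:

```typescript
// Minimal shape of a fetch-like function, so the client can run in
// Node.js, Cloudflare Workers, or the browser alike.
type Fetch = (
  url: string,
  init?: { method?: string; headers?: Record<string, string>; body?: string },
) => Promise<{ json(): Promise<unknown> }>;

// Send LMQL query code to a hypothetical self-hosted Python server
// and return its parsed JSON response.
async function runQuery(
  baseUrl: string,
  queryCode: string,
  fetchImpl: Fetch,
): Promise<unknown> {
  const res = await fetchImpl(`${baseUrl}/query`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ code: queryCode }),
  });
  return res.json();
}
```

This keeps the Python dependency entirely server-side, at the cost of an extra network hop per query.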
Can the playground be used headlessly in the browser?
The Playground uses a WebAssembly Build of LMQL that runs in a Web Worker. You could indeed run the same setup headlessly, relying on JavaScript on the Frontend and communicating with LMQL via Web Worker Messages (this is what the Browser Playground does).
We plan to expose a clean interface that will enable this form of in-browser use of LMQL. Note, however, that this will entail some loading time for LMQL to become available, as it has to set up a proper Pyodide environment in the worker process. If you already want to hack around with this, you can have a look at https://github.com/eth-sri/lmql/blob/main/web/browser-build/src/lmql-worker.js, which implements the current LMQL Web Worker interface. On the JS side, the playground uses https://github.com/eth-sri/lmql/blob/main/src/lmql/ui/playground/src/browser_process.js to communicate with and call the LMQL runtime.
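The usual pattern for talking to such a worker is a request/response envelope over `postMessage`, with an id to correlate replies. The message shapes below are hypothetical; the actual protocol is defined in lmql-worker.js and may differ:

```typescript
// Hypothetical message envelope for an LMQL Web Worker; the real
// protocol in lmql-worker.js may look different.
interface WorkerRequest {
  id: number;
  action: "run";
  code: string;
}

interface WorkerResponse {
  id: number;
  ok: boolean;
  result?: string;
  error?: string;
}

// Correlates responses with pending requests by id -- the standard
// request/response pattern over postMessage.
class WorkerBridge {
  private nextId = 0;
  private pending = new Map<number, (r: WorkerResponse) => void>();

  // `post` would be worker.postMessage.bind(worker) in a real setup.
  constructor(private post: (msg: WorkerRequest) => void) {}

  run(code: string): Promise<WorkerResponse> {
    const id = this.nextId++;
    return new Promise((resolve) => {
      this.pending.set(id, resolve);
      this.post({ id, action: "run", code });
    });
  }

  // Wire this to worker.onmessage in a real setup.
  handleResponse(msg: WorkerResponse): void {
    const resolve = this.pending.get(msg.id);
    if (resolve) {
      this.pending.delete(msg.id);
      resolve(msg);
    }
  }
}
```

A clean JS API for LMQL would essentially wrap this kind of bridge, plus the Pyodide bootstrapping, behind a single async entry point.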
I have created a small experimental demo project, which loads the LMQL WebAssembly Build in JavaScript here: https://github.com/lmql-lang/lmql.js, with some simple client code that demonstrates how to use it from JS.
Contributions are very welcome. It basically needs some adaptations to get a cleaner JS API and a good interface with the Web Worker that actually runs the LMQL runtime. Feel free to also reach out via Discord, if you want to help with this.
> In the meantime, developers looking to use this in their projects in other languages can set up a python API server and execute LMQL queries that way.
Has anyone done this with streaming from Python to JS? It would be great to support the Vercel AI SDK.
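One common way to stream from a Python backend to JS is server-sent events: the server writes `data: ...` lines as tokens are produced, and the client parses them as chunks arrive. The framing below (including the OpenAI-style `[DONE]` sentinel) is an assumption, not something LMQL or the Vercel AI SDK prescribes:

```typescript
// Parse one chunk of a server-sent-event stream from a streaming
// Python backend. Each event carries a "data: ..." line; "[DONE]"
// marks the end of the stream (an OpenAI-style convention).
function parseSSEChunk(chunk: string): { tokens: string[]; done: boolean } {
  const tokens: string[] = [];
  let done = false;
  for (const line of chunk.split("\n")) {
    if (!line.startsWith("data: ")) continue;
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") {
      done = true;
    } else if (payload.length > 0) {
      tokens.push(payload);
    }
  }
  return { tokens, done };
}
```

On top of a parser like this, adapting the token stream to whatever stream format the Vercel AI SDK expects is a small wrapper.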
Hi Zhang, we are indeed thinking about expanding to other platforms and languages.
Could you detail a bit what kind of TypeScript support you are looking for:

- Do you want to run LMQL in the browser or server-side, e.g. via Node.js?
- Are you looking for a full-on adaptation of the LMQL language to allow inline TypeScript/JavaScript in the prompt clause, or are you just interested in some basic API that gives you a run(query_code: string) function?

Please let us know, this is definitely an interesting and important direction.
I may note that the LMQL Playground already runs fully in the browser via Emscripten/Pyodide. So depending on your tolerance for latency and startup time, this may already be an option to run in a JS environment for now.