jupyterlite / jupyterlite

Wasm powered Jupyter running in the browser 💡
https://jupyterlite.rtfd.io/en/stable/try/lab
BSD 3-Clause "New" or "Revised" License

Add jupyter-ai (LLM) Support to JupyterLite #1121

Open theonewolf opened 11 months ago

theonewolf commented 11 months ago

Problem

JupyterLab upstream has a new module to support LLMs while coding: jupyter-ai. It would be great to take advantage of the latest LLM technology also inside JupyterLite.

Proposed Solution

Similar to other upstream packages, test or create a compatibility package for jupyter-ai to integrate into JupyterLite.

Additional context

This would help with using JupyterLite as a teaching tool, a killer application for it, since it requires no installation at all and runs entirely in the user's browser.

theonewolf commented 11 months ago

@jtpio I'm happy to work on this, but I do not know where to begin. Do you have a wiki on how to make things compatible with JupyterLite?

bollwyvl commented 11 months ago

Please see the documentation for emulating a server extension in a standalone extension.

This would likely not be appropriate in jupyterlite core, as it requires interacting with a non-free SaaS which, at best, requires a paid, per-use token that there would be no way to protect in a jupyterlite setting. At worst, as is all too common, the upstream service could break or go away.

jtpio commented 11 months ago

Yes, it will have to be implemented as a third-party extension.

Ideally this could be done in the jupyter-ai repo directly. There is an issue about this: https://github.com/jupyterlab/jupyter-ai/issues/119. Last time I checked some dependencies made it difficult to use in Pyodide, but maybe the situation has improved since.

psychemedia commented 11 months ago

A recent PR to jupyter-ai added support for locally run LLMs as well as hosted LLMs. If the locally hosted LLM can be accessed via an HTTP API, then presumably there would be a route for the jupyter-ai extension to access it without any need to call on a non-free SaaS offering.
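To make the idea concrete: local servers such as llama.cpp's server and Ollama expose OpenAI-compatible chat endpoints, so the extension-side request is just JSON over HTTP. A minimal sketch in Python, where the URL and model name are assumptions (a typical default, not something jupyter-ai is guaranteed to use):

```python
import json

# Assumption: a local LLM server is already running and exposes an
# OpenAI-compatible chat-completions endpoint at this address.
LOCAL_LLM_URL = "http://localhost:8080/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat-completion payload for a local LLM."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# The payload would then be POSTed with any HTTP client;
# here we only serialize it to show the wire format.
payload = json.dumps(build_chat_request("Explain list comprehensions"))
```

No token ever leaves the machine in this setup, which sidesteps the secret-protection problem raised above for hosted SaaS providers.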

In the JupyterLite context, a WASM-based LLM could also presumably be run in the browser alongside JupyterLite, if your browser/machine were up to it. An early demo of WASM-based LLM chat can be found here: https://github.com/mlc-ai/web-llm, though I'm not sure what models it supports.

theonewolf commented 10 months ago

@bollwyvl and @jtpio I was thinking we could take a pure JS approach with new WASM projects like Starcoder in JS:

https://rahuldshetty.github.io/starcoder.js/

I teach Python and Data Science. Having a coding assistant directly available to my students in self-hosted notebooks would greatly enhance their learning experience.

@bollwyvl thanks for the extension documentation, I’ll take a look.