Closed richlysakowski closed 4 months ago
@richlysakowski As long as LangChain has an LLM class for the provider you are interested in, this can be added to Jupyter AI. To get started, here are some existing provider implementations for reference: https://github.com/jupyterlab/jupyter-ai/blob/main/packages/jupyter-ai-magics/jupyter_ai_magics/providers.py#L450
Here are the steps to add a new provider:
1. Add a new provider class in providers.py, which should extend from BaseProvider.
2. For reference, see the existing providers in that file, such as VertexAI.
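The providers in that file follow a consistent pattern: a class inherits from both BaseProvider and a LangChain LLM class, then declares a few class attributes describing its models. A rough, self-contained sketch of that shape (the stand-in base classes and all names here are illustrative, not the actual jupyter-ai API; check providers.py for the real attributes):

```python
# Illustrative sketch only -- the real base classes live in
# jupyter_ai_magics/providers.py (BaseProvider) and in LangChain (the LLM
# class, e.g. VertexAI). Minimal stand-ins are defined here so the sketch
# runs on its own; a real provider would import the real classes instead.

class LangChainLLMStub:
    """Stand-in for a LangChain LLM class such as VertexAI."""
    model: str = ""

class BaseProviderStub:
    """Stand-in for jupyter_ai_magics.providers.BaseProvider."""
    id: str
    name: str
    models: list
    model_id_key: str

class MyPalmProvider(BaseProviderStub, LangChainLLMStub):
    id = "my-palm"            # slug used to select this provider
    name = "My PaLM"          # human-readable name shown in the UI
    models = ["chat-bison"]   # model IDs this provider exposes
    model_id_key = "model"    # constructor kwarg the LangChain class expects
```

The real BaseProvider defines more attributes (authentication strategy, package dependencies, and so on), so treat this as a starting point, not a template.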
Hi @richlysakowski @3coins, I believe we can make this easier. I'm the maintainer of LiteLLM; we allow you to deploy an LLM proxy to call 100+ LLMs in one format (PaLM, Bedrock, OpenAI, Anthropic, etc.): https://github.com/BerriAI/litellm/tree/main/openai-proxy
If this looks useful (we're used in production), please let me know how we can help.
PaLM request
curl http://0.0.0.0:8000/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "palm/chat-bison",
"messages": [{"role": "user", "content": "Say this is a test!"}],
"temperature": 0.7
}'
gpt-3.5-turbo request
curl http://0.0.0.0:8000/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "gpt-3.5-turbo",
"messages": [{"role": "user", "content": "Say this is a test!"}],
"temperature": 0.7
}'
claude-2 request
curl http://0.0.0.0:8000/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "claude-2",
"messages": [{"role": "user", "content": "Say this is a test!"}],
"temperature": 0.7
}'
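Note that the three requests above differ only in the "model" field. As a small sketch, the JSON body each curl command sends could be built in Python like this (build_chat_request is a hypothetical helper for illustration, not part of LiteLLM; the model names are taken from the examples above):

```python
import json

def build_chat_request(model: str, prompt: str, temperature: float = 0.7) -> str:
    """Build the JSON body for POST /v1/chat/completions on the proxy."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    })

# The same body shape works for all three models shown above.
for model in ("palm/chat-bison", "gpt-3.5-turbo", "claude-2"):
    print(build_chat_request(model, "Say this is a test!"))
```

This single-format property is what the proxy provides: the caller never changes anything except the model name.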
The Gemini API is superseding the PaLM API.
I'll follow @3coins's advice to add a new provider, but I can't figure out how to use jupyter-ai from local sources, so I might need some help with testing.
@simonff do you need help setting up jupyter-ai locally for development? Did you encounter any error, or failed at a particular step?
@krassowski : I think various installed versions got mixed up, and the last error I got is below. Yes, it would be great to try some self-contained instructions for running jupyter-ai locally. For now I'm just using github actions to run tests.
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/notebook/traittypes.py", line 235, in _resolve_classes
    klass = self._resolve_string(klass)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/google/home/simonf/.local/lib/python3.11/site-packages/traitlets/traitlets.py", line 2018, in _resolve_string
    return import_item(string)
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/google/home/simonf/.local/lib/python3.11/site-packages/traitlets/utils/importstring.py", line 31, in import_item
    module = __import__(package, fromlist=[obj])
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ModuleNotFoundError: No module named 'jupyter_server.contents'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/bin/jupyter-notebook", line 33, in
Yes, it looks like it stems from a conflict between packages installed at the system level and a local Python installation. I would strongly suggest developing in a self-contained virtual environment.
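A quick, generic way to confirm which interpreter is active (this is standard Python behavior, not anything specific to jupyter-ai): inside a venv, sys.prefix differs from sys.base_prefix.

```python
import sys

def in_isolated_env() -> bool:
    # In a venv/virtualenv, sys.prefix points at the environment while
    # sys.base_prefix still points at the base interpreter.
    return sys.prefix != sys.base_prefix

print("interpreter:", sys.executable)
print("isolated environment:", in_isolated_env())
```

If this prints a system path and False after you thought you activated an environment, pip will install into the system Python and conflicts like the one above can occur.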
@simonff To do a development installation locally, run these in your terminal:
# verify that you are at the root of the repo
pwd
# create & activate a new isolated Python environment named `jupyter-ai-dev`
# I recommend using `micromamba`, but you can use `conda` or `venv`
micromamba create -yn jupyter-ai-dev python=3.11 jupyterlab
micromamba activate jupyter-ai-dev
# install JS dependencies
jlpm
# build JS code & install `jupyter-ai` locally in this environment
jlpm dev-install
I don't have conda or micromamba installed, so I'm trying venv. This worked:
./myenv/bin/pip install jupyterlab
But for the next step I get:
./myenv/bin/pip install jupyter-ai-dev
ERROR: Could not find a version that satisfies the requirement jupyter-ai-dev (from versions: none)
ERROR: No matching distribution found for jupyter-ai-dev
Sorry for beginner questions, I usually just run "pip install --break-system-packages". :)
In the example above, jupyter-ai-dev is the name of the environment, not of an installable package. Once you activate the environment, you should have a script named jlpm, which is installed with the jupyterlab package. Running jlpm && jlpm dev-install should do the trick.
Ok, 'jlpm' by itself runs (and I see it's coming from the myenv environment), but then I get (running from the root of the git repo):
jlpm dev-install
lerna notice cli v6.6.2
Lerna (powered by Nx) The following projects do not have a configuration for any of the provided targets ("dev-install")
@jupyter-ai/monorepo
Lerna (powered by Nx) Running target dev-install for 2 projects:
——————————————————————————————————————————————————————————————————————————————————
@jupyter-ai/magics:dev-install
@jupyter-ai/magics: error: externally-managed-environment
@jupyter-ai/magics: × This environment is externally managed
@jupyter-ai/magics: ╰─> To install Python packages system-wide, try apt install
@jupyter-ai/magics: python3-xyz, where xyz is the package you are trying to
@jupyter-ai/magics: install.
@jupyter-ai/magics:
@jupyter-ai/magics: If you wish to install a non-Debian-packaged Python package,
@jupyter-ai/magics: create a virtual environment using python3 -m venv path/to/venv.
@jupyter-ai/magics: Then use path/to/venv/bin/python and path/to/venv/bin/pip. Make
@jupyter-ai/magics: sure you have python3-full installed.
@jupyter-ai/magics:
@jupyter-ai/magics: If you wish to install a non-Debian packaged Python application,
@jupyter-ai/magics: it may be easiest to use pipx install xyz, which will manage a
@jupyter-ai/magics: virtual environment for you. Make sure you have pipx installed.
@jupyter-ai/magics:
@jupyter-ai/magics: See /usr/share/doc/python3.11/README.venv for more information.
@jupyter-ai/magics: note: If you believe this is a mistake, please contact your Python installation or OS distribution provider. You can override this, at the risk of breaking your Python installation or OS, by passing --break-system-packages.
@jupyter-ai/magics: hint: See PEP 668 for the detailed specification.
——————————————————————————————————————————————————————————————————————————————————
Lerna (powered by Nx) Running target dev-install for 2 projects failed
Tasks not run because their dependencies failed or --nx-bail=true:
@jupyter-ai/core:dev-install
Failed tasks:
@jupyter-ai/magics:dev-install
Meanwhile, this is the new code so far:
https://github.com/jupyterlab/jupyter-ai/compare/main...simonff:jupyter-ai:main
I tried to add the Gemini provider, but could not get Gemini to show up on the model list. Any help will be appreciated. https://github.com/jupyterlab/jupyter-ai/pull/666
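For anyone hitting the same "provider does not show up in the model list" symptom: one common cause (assuming the usual jupyter-ai registration mechanism; please verify against the repo's own packaging files) is a missing entry-point registration, since providers are discovered via an entry point group rather than by being imported directly. The group and names below are illustrative:

```toml
# pyproject.toml of the package defining the provider (names illustrative)
[project.entry-points."jupyter_ai.model_providers"]
gemini = "jupyter_ai_magics:GeminiProvider"
```

After editing entry points, the package must be reinstalled (even with an editable install) for the new entry point to be picked up.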
This issue should be resolved by #666, which adds support for Gemini. Thanks to @giswqs for working on this! Users will have access to this in the next release, which we are planning for sometime this week.
@dlqqq Thank you for your help! I couldn't have done it without your guidance. Thank you for accepting the PR.
Problem
Support / wrapper for Google PaLM and Bard is missing. I don't find any mention of it here, yet PaLM is one of the major LLMs. (@JasonWeill on 2023-12-29: The Gemini API is replacing PaLM — thanks @simonff)
Proposed Solution
Can we start a discussion about the requirements, packages, and tools needed to bring PaLM and Bard functionality into Jupyter AI?
Is there a description of how to wrap new LLM engines?
Are there template wrappers or decorators to make this easy?
What limitations do we need to take into account for this integration? I know that Google has been blocking Python wrappers that reverse-engineer Bard, but what about the PaLM API? What are the most effective workarounds for using a wrapper around Bard Chat?
Please help with this integration so we can get it done soon, because I am getting requests from Jupyter AI users I am training.