Open JasonWeill opened 9 months ago
To expand: I think it would be very powerful if the backend vector store could be used across applications.
I would like to be able to provide this foundation to 30 diverse users. Some users would use notebooks on our JupyterHub, some VS Code or PyCharm, and some our interactive dashboards.
I think having your UI close to the code and data is very powerful, and I'm totally blown away by your /read implementation. I would like to tap into that process so that my users do not have to provide those documents again and again in other applications.
An example of a CLI for interacting with a vector store is https://github.com/simonw/llm.
For context, I've been contributing to the chat components of HoloViz Panel, including https://holoviz-topics.github.io/panel-chat-examples/. I would also love to build applications on top of the jupyter-ai vector store.
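To make the cross-application idea concrete, here is a minimal sketch of a file-backed vector store that several front ends (JupyterLab, VS Code, a Panel dashboard) could share by pointing at the same path. The class name, JSON on-disk format, and API are assumptions for discussion only, not jupyter-ai's actual storage:

```python
# Illustrative sketch only: a tiny file-backed vector store shared across
# applications via a common path. Not jupyter-ai's real implementation.
import json
import math
from pathlib import Path


class SharedVectorStore:
    def __init__(self, path):
        self.path = Path(path)
        self.records = []
        if self.path.exists():
            self.records = json.loads(self.path.read_text())

    def add(self, text, embedding):
        # In practice the embedding would come from the user's chosen
        # embedding model; here it is just a list of floats.
        self.records.append({"text": text, "embedding": embedding})
        self.path.write_text(json.dumps(self.records))

    @staticmethod
    def _cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb) if na and nb else 0.0

    def query(self, embedding, k=1):
        # Return the k stored texts most similar to the query embedding.
        ranked = sorted(
            self.records,
            key=lambda r: self._cosine(r["embedding"], embedding),
            reverse=True,
        )
        return [r["text"] for r in ranked[:k]]
```

Any process that can read the file sees the same documents, which is the property the comment above is asking for; a real implementation would of course want a proper database and locking.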
Since the jupyter-ai chat package already inherits from `JupyterApp`, it could quite easily provide a `jupyter ai` command (in the same way that jupyterlab provides `jupyter lab` and `jupyter labextension`, and jupyter-server provides `jupyter server`).
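As a sketch of what such a `jupyter ai` entry point might expose, the subcommands below mirror the chat UI's `/learn` and `/ask`; the subcommand and flag names are hypothetical, not an existing jupyter-ai interface:

```python
# Hypothetical sketch of a `jupyter ai` command line built with argparse.
# Subcommand and flag names mirror the chat commands but are assumptions.
import argparse


def build_parser():
    parser = argparse.ArgumentParser(prog="jupyter-ai")
    sub = parser.add_subparsers(dest="command", required=True)

    learn = sub.add_parser("learn", help="index local documents for RAG")
    learn.add_argument("path", help="file or directory to embed")
    learn.add_argument("--embedding-model", default=None,
                       help="embedding model of the user's choice")

    ask = sub.add_parser("ask", help="ask a question over indexed documents")
    ask.add_argument("question")
    ask.add_argument("--model", default=None, help="language model id")
    return parser


def main(argv=None):
    args = build_parser().parse_args(argv)
    if args.command == "learn":
        # A real implementation would chunk, embed, and store the documents.
        return f"learning from {args.path}"
    # args.command == "ask"
    # A real implementation would retrieve relevant chunks and call the LLM.
    return f"asking: {args.question}"
```

Wired up as a console script (or a `JupyterApp` subcommand), this would let any shell user run `jupyter ai learn ./docs` and `jupyter ai ask "..."` without opening the chat UI.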
It would be great to have `/generate` available in magic commands as well.
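A magic-command equivalent might look like the sketch below. The magic names (`%ai_learn`, `%ai_ask`) and the `-m` model option are assumptions for illustration; registration uses IPython's standard extension hook, while the argument handling is plain Python:

```python
# Hypothetical sketch of %ai_learn / %ai_ask line magics mirroring the
# chat commands. Names and options are illustrative assumptions only.
import shlex


def parse_ai_line(line):
    """Split a magic line like '-m some-model ./docs' into options + rest."""
    tokens = shlex.split(line)
    opts, rest, i = {}, [], 0
    while i < len(tokens):
        if tokens[i] == "-m" and i + 1 < len(tokens):
            opts["model"] = tokens[i + 1]
            i += 2
        else:
            rest.append(tokens[i])
            i += 1
    return opts, " ".join(rest)


def ai_learn(line):
    """Body of a would-be %ai_learn magic: index documents for RAG."""
    opts, path = parse_ai_line(line)
    # A real implementation would embed `path` with the chosen model
    # and add it to the document index that /ask queries.
    return f"learn {path} using {opts.get('model', 'default-embedder')}"


def ai_ask(line):
    """Body of a would-be %ai_ask magic: query the indexed documents."""
    opts, question = parse_ai_line(line)
    # A real implementation would retrieve relevant chunks and call the LLM.
    return f"ask: {question}"


def load_ipython_extension(ipython):
    # Standard IPython extension hook: after `%load_ext this_module`,
    # the functions above become line magics in the running kernel.
    ipython.register_magic_function(ai_learn, "line", "ai_learn")
    ipython.register_magic_function(ai_ask, "line", "ai_ask")
```

In a notebook this would read as `%ai_learn -m my-embedder ./docs` followed by `%ai_ask What does the documentation say about X?`, matching the chat workflow without the chat UI.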
Problem
Users would like to learn from local documents, and ask questions of files via retrieval augmented generation (RAG), without using the chat UI.
Proposed Solution
Make equivalents to the `/learn` and `/ask` commands available outside the chat UI, such as in magic commands. Allow users to specify their embedding model of choice, similar to how the existing magic commands let users choose a language model.
Additional context
This is adapted from https://github.com/jupyterlab/jupyter-ai/issues/205#issuecomment-1795337608, a comment by @MarcSkovMadsen. Thanks!