jupyterlab / jupyter-ai

A generative AI extension for JupyterLab
https://jupyter-ai.readthedocs.io/
BSD 3-Clause "New" or "Revised" License

Add entire notebook to context, #1037

Open Marchlak opened 5 days ago

Marchlak commented 5 days ago

Hello,

I’m testing your plugin, and it’s very good. I particularly like the /learn and /ask commands, which are useful for my project work. However, I have a few suggestions about the context sent to the AI, among other things.

Problem

I haven’t seen an option to send the entire notebook, or multiple cells, as context. Selecting parts of a single cell is inefficient, especially when the AI starts generating its own variables. Without whole-notebook context, I don’t see why I wouldn’t just use ChatGPT and paste parts of my code instead of Jupyter AI, especially with the new canvas option. (I’m ignoring local models, which, although improving, have never satisfied me as much as OpenAI’s offering.) Working with the entire notebook would be much more convenient.

Proposed Solution

I understand that providing the entire notebook as context raises a problem: the LLM would have to generate each modified cell separately so that its output can be pasted back into the corresponding cell of the notebook. I think many models could handle this with appropriate context if we passed several cells, along with their IDs, in the selection field. Alternatively, cell-level editing could be disabled when multiple cells or the entire notebook is sent.
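To make the idea concrete, here is a minimal sketch of what such a payload could look like. The shape is purely an assumption for illustration, not Jupyter AI's actual format; it only relies on the fact that nbformat-style notebooks carry a stable `id` per cell (since nbformat 4.5):

```python
import json

def build_cell_context(nb: dict, selected_ids: set) -> str:
    """Render the selected cells (looked up by cell ID) as a prompt context
    block, so the model can address its edits to specific cells.
    Hypothetical format for illustration only."""
    parts = []
    for cell in nb["cells"]:
        if cell["id"] in selected_ids:
            source = "".join(cell["source"])
            parts.append(f'[cell id={cell["id"]} type={cell["cell_type"]}]\n{source}')
    return "\n\n".join(parts)

# A minimal notebook in nbformat-style JSON (cells have IDs since nbformat 4.5).
notebook = {
    "cells": [
        {"id": "a1b2", "cell_type": "code", "source": ["x = 1\n"]},
        {"id": "c3d4", "cell_type": "code", "source": ["y = x + 1\n"]},
    ]
}

print(build_cell_context(notebook, {"a1b2", "c3d4"}))
```

With cell IDs in the context, the model's reply could tag each rewritten cell with the ID it belongs to, and the frontend could paste each one back into place.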

Additional context

Perhaps passing the entire notebook as context poses other problems I haven't thought of. Regardless, do you plan to add such functionality to Jupyter AI? Is it possible to become a contributor in this matter? Lastly, a "fix" button for non-working cells would be helpful, as typing /fix is rather cumbersome.

JasonWeill commented 5 days ago

@Marchlak Thanks for your feedback! The common trade-off about context is that larger requests made to hosted LLMs may incur higher costs for the caller. We welcome pull requests that would add functionality to this project. Thanks for your interest in Jupyter AI.
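As a rough illustration of that trade-off, here is a crude estimate of how quickly a full-notebook context grows a request. It uses the common ~4 characters-per-token rule of thumb, which is an approximation, not a real tokenizer:

```python
# Crude cost sketch: estimate how many tokens a full-notebook context adds,
# using the ~4 characters-per-token rule of thumb (an approximation, not an
# actual tokenizer).

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token."""
    return max(1, len(text) // 4)

# A notebook with 50 short code cells adds thousands of tokens per request.
cells = ["df = pd.read_csv('data.csv')\n"] * 50
notebook_text = "".join(cells)
print(estimate_tokens(notebook_text))
```

Since hosted LLMs typically bill per input token, sending the whole notebook on every message multiplies this cost by the number of turns in a conversation.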

dlqqq commented 5 days ago

@Marchlak Thank you for writing such a well-thought-out issue! As of Jupyter AI v2.24.0, there exists an @file command you can call to include a file's content with your prompt. You can use @file to ask questions about notebooks (or any plaintext file) on your local filesystem.

You can also chain multiple @file commands in the same prompt, so you can ask questions about multiple notebooks simultaneously as long as they fit within your configured LLM's token window.
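For illustration, a chained prompt could look something like the following (assuming the `@file:<path>` syntax; the notebook filenames here are placeholders):

```
@file:analysis.ipynb @file:utils.ipynb How does the second notebook use functions defined in the first?
```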

> Lastly, a "fix" button for non-working cells would be helpful, as typing /fix is rather cumbersome.

Great suggestion. I've received similar feature requests from other users privately, so this is clearly something the community is interested in. Could you open a separate issue describing the user experience you'd like to see in Jupyter AI? I would love to get your feedback on this.

> Is it possible to become a contributor in this matter?

We absolutely welcome contributions! We have an open issue for improving the @file experience when called on notebooks: https://github.com/jupyterlab/jupyter-ai/issues/1033

williamstein commented 14 hours ago

I'm listening to the discussion in the JupyterLab meeting now. In CoCalc we just have a "Selection/Cell/All/None" switch:

[image: CoCalc context-scope switch with Selection/Cell/All/None options]

In CoCalc, one of our core design values around LLMs is being very clear about what the LLM sees, rather than making it implicit and silent.

We never include the output.