I tried to update GPT4All to fix the issues with the recent packages. Some old models are not supported any more, and many new models are now available. GPT4All describes the supported local models with a machine-readable JSON list like:
```json
[
  {
    ...
    "description": "- Fast responses\n- Chat based model\n- Trained by Mistral AI\n- Finetuned on OpenOrca dataset curated via Nomic Atlas\n- Licensed for commercial use\n",
    "url": "https://gpt4all.io/models/gguf/mistral-7b-openorca.Q4_0.gguf"
  },
  {
    "order": "b",
    "md5sum": "97463be739b50525df56d33b26b00852",
    "name": "Mistral Instruct",
    "filename": "mistral-7b-instruct-v0.1.Q4_0.gguf",
    "filesize": "4108916384",
    "requires": "2.5.0",
    "ramrequired": "8",
    "parameters": "7 billion",
    "quant": "q4_0",
    "type": "Mistral",
    "systemPrompt": " ",
    "description": "Best overall fast instruction following model\n- Fast responses\n- Trained by Mistral AI\n- Uncensored\n- Licensed for commercial use\n",
    "url": "https://gpt4all.io/models/gguf/mistral-7b-instruct-v0.1.Q4_0.gguf",
    "promptTemplate": "[INST] %1 [/INST]"
  },
  {
    "order": "c",
    "md5sum": "31cb6d527bd3bfb5e73c2e9dfbc75033",
    "name": "GPT4All Falcon",
    "filename": "gpt4all-falcon-q4_0.gguf",
    "filesize": "4210419040",
    "requires": "2.5.0",
    "ramrequired": "8",
    "parameters": "7 billion",
    "quant": "q4_0",
    "type": "Falcon",
    "systemPrompt": " ",
    "description": "Very fast model with good quality\n- Fastest responses\n- Instruction based\n- Trained by TII\n- Finetuned by Nomic AI\n- Licensed for commercial use\n",
    "url": "https://gpt4all.io/models/gguf/gpt4all-falcon-q4_0.gguf",
    "promptTemplate": "### Instruction:\n%1\n### Response:\n"
  },
  ...
]
```
I hoped to expose:

- `filesize`, `ramrequired` and `description` in the user interface in the model selector, to inform the user about the model requirements
- add a `ModelPathField` allowing the user to either:
  - point to a path on disk (which we could check with `md5sum`; see the sketch after this list), or
  - click a download button (which would have a nice progress bar) which would send a download request to the backend, and the backend would then change the model file path to the downloaded location
  - clear the manually set path so that the version on disk downloaded earlier could be used
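The `md5sum` check could be as simple as hashing the user-provided file and comparing it against the value from the model list; a rough sketch (the `verify_model_path` helper is hypothetical, not existing jupyter-ai code):

```python
import hashlib
from pathlib import Path

def verify_model_path(path: str, expected_md5: str, chunk_size: int = 2**20) -> bool:
    """Check that a user-provided model file matches the md5sum from the model list.

    Hypothetical helper sketched for this issue; not part of jupyter-ai today.
    """
    file = Path(path)
    if not file.is_file():
        return False
    digest = hashlib.md5()
    with file.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_md5
```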
Problem

The frontend thinks about the current fields as `ModelField`, defined here:

https://github.com/jupyterlab/jupyter-ai/blob/7c09863b3199d167fe18575194ee81991285d841/packages/jupyter-ai/src/components/settings/model-fields.tsx#L11
but the fields are really defined per provider, not per model, as seen in the `BaseProvider` model:

https://github.com/jupyterlab/jupyter-ai/blob/7c09863b3199d167fe18575194ee81991285d841/packages/jupyter-ai-magics/jupyter_ai_magics/providers.py#L130-L132
and in the handler logic:
https://github.com/jupyterlab/jupyter-ai/blob/7c09863b3199d167fe18575194ee81991285d841/packages/jupyter-ai/src/handler.ts#L162-L171
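To illustrate the mismatch, the current pattern looks roughly like this: fields are declared once per provider class, so every model served by that provider shares them (simplified stand-in classes, not a verbatim copy of the linked code):

```python
from typing import ClassVar, List


class Field:
    """Stand-in for the field model used by jupyter-ai-magics providers."""

    def __init__(self, key: str, label: str):
        self.key = key
        self.label = label


class BaseProvider:
    # Fields are a class variable, i.e. one set per provider...
    fields: ClassVar[List[Field]] = []
    models: ClassVar[List[str]] = []


class LocalModelsProvider(BaseProvider):
    # ...so every model listed here gets exactly the same fields, and there is
    # no place to attach per-model metadata such as filesize or ramrequired.
    models = [
        "mistral-7b-instruct-v0.1.Q4_0.gguf",
        "gpt4all-falcon-q4_0.gguf",
    ]
    fields = [Field(key="model_path", label="Model path")]
```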
Further, there is no generic description field suitable for per-model information display, as the `help` is a class variable with a very narrow scope:

https://github.com/jupyterlab/jupyter-ai/blob/7c09863b3199d167fe18575194ee81991285d841/packages/jupyter-ai-magics/jupyter_ai_magics/providers.py#L110-L112
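In other words, a single class-level string can describe the provider as a whole, but not each of its models (again stand-in code, not the actual provider definitions):

```python
from typing import ClassVar, List


class GPT4AllLikeProvider:
    # One help string for the whole provider class...
    help: ClassVar[str] = "Runs local models; point to a file on disk or download one."
    # ...shared by every model the provider lists, so there is nowhere to put
    # the per-model `description` strings from the GPT4All JSON list.
    models: ClassVar[List[str]] = [
        "mistral-7b-instruct-v0.1.Q4_0.gguf",
        "gpt4all-falcon-q4_0.gguf",
    ]
```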
Proposed Solution