alondmnt / joplin-plugin-jarvis

Joplin (note-taking) assistant running a very intelligent system (OpenAI/GPT, Hugging Face, Gemini, Llama, Universal Sentence Encoder, etc.)
GNU Affero General Public License v3.0

Removing obsolete OpenAI models #23

Closed: jakubjezek001 closed this PR 3 months ago

jakubjezek001 commented 4 months ago

Reason for changes

The following models are deprecated:

Refactor of Edit with Jarvis:

- Enhanced UX (a rough dialog sketch follows below):
    - Added a Re-submit button for re-submitting the request to Jarvis
    - Added a Clear button for clearing changes in the preview text frame
    - Added a Replace button for replacing the original selected text
- Updated CSS for improved UX, with rounded corners and spacing
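
For reference, this is roughly how the new buttons map onto the standard Joplin dialogs API. A minimal sketch only, assuming an illustrative dialog id (`jarvis.editDialog`) rather than the PR's actual code:

```typescript
import joplin from 'api';

// Minimal sketch of the button layout described above, using Joplin's
// standard dialogs API. Identifiers like 'jarvis.editDialog' are
// illustrative, not the plugin's actual ids.
async function showEditDialog(previewText: string) {
  const dialog = await joplin.views.dialogs.create('jarvis.editDialog');

  // Each button id is returned by open(), so the caller can branch on it.
  await joplin.views.dialogs.setButtons(dialog, [
    { id: 'resubmit', title: 'Re-submit' },  // re-submit the request to Jarvis
    { id: 'clear', title: 'Clear' },         // clear changes in the preview frame
    { id: 'replace', title: 'Replace' },     // replace the original selected text
    { id: 'cancel', title: 'Cancel' },
  ]);

  // Fields inside a <form> come back in the result's formData.
  // (A real implementation should HTML-escape previewText.)
  await joplin.views.dialogs.setHtml(dialog, `
    <form name="edit">
      <textarea name="preview">${previewText}</textarea>
    </form>`);

  // Resolves with { id, formData } once the user clicks a button.
  return joplin.views.dialogs.open(dialog);
}
```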
alondmnt commented 3 months ago

Thanks for doing this! It's been in my backlog for some time to switch to gpt-3.5-turbo-instruct.

jakubjezek001 commented 3 months ago

I've also included enhancements to the Edit text tool here. Do you want me to split them into a separate PR?

jakubjezek001 commented 3 months ago

It's a real bummer that the Joplin plugin API is so limited when it comes to user experience. It's super annoying that the dialog popup disappears every time you send a request to Jarvis. I've looked everywhere but can't find a way to keep the box open, or even to show a message asking the user to hang tight and wait.
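
One workaround I can think of, since `dialogs.open()` closes the dialog when it resolves: re-open the same dialog in a loop, so it at least appears persistent across requests. A sketch only, building on the dialog above; `callJarvis` and `renderHtml` are hypothetical stand-ins for the plugin's request and rendering code:

```typescript
import joplin from 'api';

// Hypothetical placeholders, not functions from this repo:
declare function callJarvis(prompt: string): Promise<string>;
declare function renderHtml(preview: string): string;

// dialogs.open() closes the dialog when it resolves, so re-opening it in a
// loop simulates a persistent window. `dialog` is the handle (a string)
// returned by joplin.views.dialogs.create().
async function editLoop(dialog: string, selection: string) {
  let preview = selection;
  while (true) {
    const result = await joplin.views.dialogs.open(dialog);
    if (result.id === 'resubmit') {
      // send the request again, then show the dialog with the new preview
      preview = await callJarvis(result.formData?.edit?.preview ?? '');
      await joplin.views.dialogs.setHtml(dialog, renderHtml(preview));
    } else if (result.id === 'clear') {
      // drop the generated changes and restore the original selection
      preview = selection;
      await joplin.views.dialogs.setHtml(dialog, renderHtml(preview));
    } else {
      // 'replace' writes the preview back over the selection; 'cancel' just exits
      if (result.id === 'replace') {
        await joplin.commands.execute('replaceSelection', preview);
      }
      break;
    }
  }
}
```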

alondmnt commented 3 months ago

I'll have a closer look at the latest commits later this week. For now, there's no need to split the PRs. As for Joplin UX: using panels instead of dialogs keeps all elements in view, at the expense of screen space (see the sketch below). If you have revolutionary UX ideas (beyond updating the current dialogs, which is cool), I'd be happy to hear them, but I'd prefer that we discuss them in a separate PR.
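
For illustration, a panel-based layout would look roughly like this. A generic sketch of the standard Joplin panels API, with illustrative ids and a hypothetical `./webview.js` that posts button clicks back as messages:

```typescript
import joplin from 'api';

async function createJarvisPanel() {
  const panel = await joplin.views.panels.create('jarvis.editPanel');

  await joplin.views.panels.setHtml(panel, `
    <div class="jarvis-panel">
      <div id="preview"></div>
      <button id="resubmit">Re-submit</button>
      <button id="replace">Replace</button>
    </div>`);

  // A webview script bundled with the plugin (hypothetical here) would wire
  // the buttons to webviewApi.postMessage({ name: 'resubmit' }), etc.
  await joplin.views.panels.addScript(panel, './webview.js');

  // Unlike dialogs.open(), the panel stays visible; clicks arrive as messages.
  await joplin.views.panels.onMessage(panel, async (message) => {
    if (message.name === 'resubmit') {
      // call the model, then re-render the preview with setHtml()
    } else if (message.name === 'replace') {
      // replace the original selection in the editor
    }
  });

  await joplin.views.panels.show(panel);
}
```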

alondmnt commented 3 months ago

Thanks @jakubjezek001, this looks great, and it makes the edit feature really useful again. The only thing I'm going to change: since editing is no longer limited to a specific API, I'd like to make this work with any model selected by the user, such as locally hosted LLMs. That means using a model class, as everywhere else in the code (sketched below).
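
In case it helps to picture it, something along these lines. A sketch only, with illustrative names (`TextGenerationModel`, `OpenAIModel`) rather than Jarvis's actual model classes:

```typescript
// The edit feature calls a common interface instead of hitting the OpenAI
// completions endpoint directly, so any backend can serve it.
interface TextGenerationModel {
  complete(prompt: string): Promise<string>;
}

class OpenAIModel implements TextGenerationModel {
  constructor(private apiKey: string, private model = 'gpt-3.5-turbo-instruct') {}

  async complete(prompt: string): Promise<string> {
    const response = await fetch('https://api.openai.com/v1/completions', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${this.apiKey}`,
      },
      body: JSON.stringify({ model: this.model, prompt, max_tokens: 512 }),
    });
    const data = await response.json();
    return data.choices[0].text;
  }
}

// The edit command depends only on the interface, so a locally hosted model
// (e.g. one exposing an OpenAI-compatible endpoint) becomes a drop-in swap.
async function editSelection(
  model: TextGenerationModel,
  selection: string,
  instruction: string,
): Promise<string> {
  return model.complete(`${instruction}\n\n${selection}`);
}
```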