yaroslavyaroslav / OpenAI-sublime-text

First class Sublime Text AI assistant with GPT-4o and llama.cpp support!
MIT License

How to use with ollama #48

Closed · reagle closed this 2 months ago

reagle commented 2 months ago
  1. I am running llama3 in the terminal.

  2. I have openAI.sublime-settings as:

    // Settings in here override those in "OpenAI completion/openAI.sublime-settings"
    {
        "url": "http://localhost:11434",
        "token": "sk-your-token",
        "model": "llama3",
    }
  3. But when I invoke "OpenAI: New Message", whatever I type in the "Question:" bottom panel returns: "OpenAI error model 'gpt-4-0613' not found, try pulling it first".
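
(Aside: a quick sanity check that the local server is reachable and actually has llama3 pulled. This sketch assumes ollama's default REST endpoint, GET /api/tags on port 11434; adjust if your install differs.)

# List the models a local ollama instance is serving (assumed default endpoint).
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    tags = json.load(resp)

print([m["name"] for m in tags.get("models", [])])
# Expect something like ['llama3:latest'] if the model has been pulled.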

yaroslavyaroslav commented 2 months ago

Seems like you're running "OpenAI: New Message" right from the start, instead of running "OpenAI: Chat Model Select" first to select a model from the list.

yaroslavyaroslav commented 2 months ago

Please check this part of the readme for detailed instructions: https://github.com/yaroslavyaroslav/OpenAI-sublime-text#ai-assistance-use-case

reagle commented 2 months ago

I did try that, but didn't see llama3 in the menu. (I can scroll down a bit further, but it's not there.) I thought since I specified "model": "llama3" it would be in the menu or be set in any case.

reagle commented 2 months ago

[Screenshot: SCR-20240502-sepz]

yaroslavyaroslav commented 2 months ago

Yeah, now I see.

You have a few issues in your config. Here's a corrected example:

{
    "url": "http://localhost:11434",
    "token": "sk-your-token",
    "status_hint": [
        "name",
        "prompt_mode",
        "chat_model"
    ],
    "assistants": [
        {
            "name": "Local llama assistant",
            "chat_model": "llama3",
            "assistant_role": "You are a senior code assistant",
            "prompt_mode": "panel"
        }
    ]
}

Please note that the "assistants" array now appears in the settings, and the assistant-related options have moved into it. The second thing to consider is that four settings are required to make things work (i.e. to add another predefined assistant alongside the existing ones): "name", "chat_model", "assistant_role" and "prompt_mode".
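
(Aside: a rough illustration of why stray or missing keys break things. The class below is a simplified stand-in for the plugin's settings object, not its real code; the pattern of merging defaults with each assistant dict and unpacking it matches the traceback reagle posts further down.)

from dataclasses import dataclass

# Simplified stand-in: every key in the merged dict must match a known field.
@dataclass
class AssistantSettingsSketch:
    name: str
    chat_model: str
    assistant_role: str
    prompt_mode: str

DEFAULTS = {"prompt_mode": "panel"}

good = {
    "name": "Local llama assistant",
    "chat_model": "llama3",
    "assistant_role": "You are a senior code assistant",
}
print(AssistantSettingsSketch(**{**DEFAULTS, **good}))  # instantiates fine

bad = {**good, "model": "llama3"}  # leftover top-level key from the old config
# AssistantSettingsSketch(**{**DEFAULTS, **bad})
# -> TypeError: __init__() got an unexpected keyword argument 'model'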

I'd recommend reading the comments in the settings file when you have time, to get a more in-depth understanding of how this works: https://github.com/yaroslavyaroslav/OpenAI-sublime-text/blob/master/openAI.sublime-settings#L47-L73

reagle commented 2 months ago

I tried the above settings but that didn't work. I'm putting this aside for now as too complicated, feel free to close.

reagle commented 2 months ago

BTW: Here was the error:

...
RuntimeError: unable to instantiate 'OpenAI completion.openai_panel.OpenaiPanelCommand'
Traceback (most recent call last):
  File "/Applications/Sublime Text.app/Contents/MacOS/Lib/python38/sublime_plugin.py", line 535, in create_window_commands
    o = cls(window)
  File "/Users/reagle/Library/Application Support/Sublime Text/Installed Packages/OpenAI completion.sublime-package/openai_panel.py", line 20, in __init__
    self.load_assistants()
  File "/Users/reagle/Library/Application Support/Sublime Text/Installed Packages/OpenAI completion.sublime-package/openai_panel.py", line 26, in load_assistants
    self.assistants: List[AssistantSettings] = [
  File "/Users/reagle/Library/Application Support/Sublime Text/Installed Packages/OpenAI completion.sublime-package/openai_panel.py", line 27, in <listcomp>
    AssistantSettings(**{**DEFAULT_ASSISTANT_SETTINGS, **assistant})
TypeError: __init__() got an unexpected keyword argument 'model'

The above exception was the direct cause of the following exception:

RuntimeError: unable to instantiate 'OpenAI completion.openai_panel.OpenaiPanelCommand'
Empty file I belive
Empty file I belive
...
yaroslavyaroslav commented 2 months ago

Yep, there was a mistake in the provided snippet above; I've fixed it.