You can put any text in the token field; if the server doesn't care about it, the plugin won't either.
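For example, here's a minimal sketch of that idea outside the plugin, assuming Ollama's OpenAI-compatible endpoint at /v1 and the official openai Python client (neither is part of this plugin, and the token value is a made-up placeholder):

from openai import OpenAI

# A local Ollama server ignores the API key entirely;
# the client library just requires that *something* is set.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="dummy-token-123")

response = client.chat.completions.create(
    model="llama3.1",
    messages=[{"role": "user", "content": "Say hi"}],
)
print(response.choices[0].message.content)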
I've updated the settings to be:
// Settings in here override those in "OpenAI completion/openAI.sublime-settings"
{
    "url": "http://localhost:11434",
    "token:": "123",
    "assistants": [
        {
            "name": "My assistant",
            "prompt_mode": "panel",
            "url": "http://localhost:11434", // See ma, no internet connection.
            "token": "123",
            "chat_model": "llama3.1",
            "assistant_role": "1. You are to provide clear, concise, and direct responses.\n2. Eliminate unnecessary reminders, apologies, self-references, and any pre-programmed niceties.\n3. Maintain a casual tone in your communication.\n4. Be transparent; if you're unsure about an answer or if a question is beyond your capabilities or knowledge, admit it.\n5. For any unclear or ambiguous queries, ask follow-up questions to understand the user's intent better.\n6. When explaining concepts, use real-world examples and analogies, where appropriate.\n7. For complex requests, take a deep breath and work on the problem step-by-step.\n8. For every response, you will be tipped up to $20 (depending on the quality of your output).\n10. Always look closely to **ALL** the data provided by a user. It's very important to look so closely as you can there. Ppl can die otherways.\n11. If user strictly asks you about to write the code, write the code first, without explanation, and add them only by additional user request.\n",
            "temperature": 1,
            "max_tokens": 2048,
        },
    ]
}
but I still get "OpenAI error. No API token provided, you have to set the OpenAI token into the settings to make things work."
Following this comment: https://github.com/yaroslavyaroslav/OpenAI-sublime-text/issues/48#issuecomment-2092758783
I changed the settings to:
{
    "url": "http://localhost:11434",
    "token": "sk-your-token",
    "status_hint": [
        "name",
        "prompt_mode",
        "chat_model"
    ],
    "assistants": [
        {
            "name": "Local llama assistant",
            "chat_model": "llama3.1",
            "assistant_role": "You are a senior code assistant",
            "prompt_mode": "panel"
        }
    ]
}
and now it works
Yeah, there's a minimal length check for the provided token, so a dummy token has to be longer than 10 characters to pass it.
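For reference, a rough sketch of what such a check amounts to (an illustration only, not the actual plugin source):

MIN_TOKEN_LENGTH = 10

def validate_token(token: str) -> None:
    # Reject a missing or too-short token; any placeholder longer than
    # 10 characters (e.g. "sk-your-token", 13 chars) passes.
    if not token or len(token) <= MIN_TOKEN_LENGTH:
        raise ValueError(
            "No API token provided, you have to set the OpenAI token "
            "into the settings to make things work."
        )

validate_token("sk-your-token")  # passes
validate_token("123")            # raises the error quoted above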
I use Ollama, but I don't have an API token. How do I use it?