Closed: randoentity closed this 5 months ago
Via OpenAI API plugin: https://github.com/oobabooga/text-generation-webui/blob/main/docs/12%20-%20OpenAI%20API.md
Without these changes:
Untested: whether the missing API key warning still fires.
I'm just getting started with Neovim and Lua.
I don't have any cloud-based LLM services set up, so I didn't test against those.
Some configuration changes are necessary to avoid errors. Here's an example for anyone looking to get started with text-generation-webui:
chatgpt.lua:

```lua
return {
  "jackMort/ChatGPT.nvim",
  event = "VeryLazy",
  config = function()
    require("chatgpt").setup({
      api_host_cmd = "echo -n http://127.0.0.1:5000",
      actions_paths = { "~/.config/nvim/lua/plugins/chatgpt-actions.json" },
      openai_params = {
        character = "Example",
        model = "",
        max_tokens = 300,
        frequency_penalty = 0,
        temperature = 0.99,
        stream = true,
        mode = "chat",
      },
      openai_edit_params = {
        model = "",
        max_tokens = 300,
        frequency_penalty = 0,
        temperature = 0.99,
        stream = false,
      },
    })
  end,
  dependencies = {
    "MunifTanjim/nui.nvim",
    "nvim-lua/plenary.nvim",
    "nvim-telescope/telescope.nvim",
  },
}
```
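A note on `api_host_cmd`: as I understand it, the plugin runs this command and uses its stdout as the API base URL, and `-n` matters because it suppresses the trailing newline that would otherwise end up in the URL. You can preview what the plugin will receive:

```shell
# ChatGPT.nvim runs api_host_cmd and takes its stdout as the API host.
# -n suppresses echo's trailing newline so the URL stays clean.
echo -n http://127.0.0.1:5000
```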
chatgpt-actions.json (the original paste had an extra closing brace after "explain_code", which made the JSON invalid; fixed here):

```json
{
  "explain_code": {
    "type": "chat",
    "opts": {
      "title": "Explain Code",
      "template": "Explain the following code:\n\nCode:\n```{{filetype}}\n{{input}}\n```\n\nUse markdown format.\nHere's what the above code is doing:\n```",
      "strategy": "display",
      "params": {
        "mode": "instruct",
        "model": "",
        "stop": ["```"],
        "stream": false
      }
    }
  },
  "summarize": {
    "type": "chat",
    "opts": {
      "template": "Summarize the following text.\n\nText:\n\"\"\"\n{{input}}\n\"\"\"\n\nSummary:",
      "strategy": "edit",
      "params": {
        "mode": "instruct",
        "model": "",
        "stream": false
      }
    }
  }
}
```
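If you tweak the actions file, it's easy to break the JSON (a stray brace, an unescaped quote) and the plugin will then fail to load it. A quick sanity-check sketch, not part of the plugin, just plain Python: it parses an abbreviated copy of the file above and verifies each action has the fields visible in these examples ("type", "opts", a template containing the `{{input}}` placeholder):

```python
import json

# Abbreviated copy of the actions file above; each top-level key is an
# action name mapping to an object with "type" and "opts".
raw = """
{
  "explain_code": {
    "type": "chat",
    "opts": {
      "template": "Explain the following code:\\n```{{filetype}}\\n{{input}}\\n```",
      "strategy": "display",
      "params": {"mode": "instruct", "model": "", "stream": false}
    }
  },
  "summarize": {
    "type": "chat",
    "opts": {
      "template": "Summarize the following text.\\n\\n{{input}}\\n\\nSummary:",
      "strategy": "edit",
      "params": {"mode": "instruct", "model": "", "stream": false}
    }
  }
}
"""

actions = json.loads(raw)  # raises json.JSONDecodeError on malformed JSON

for name, action in actions.items():
    # These are the fields the examples above rely on.
    assert "type" in action and "opts" in action, name
    assert "{{input}}" in action["opts"]["template"], name

print(f"{len(actions)} actions OK")
```

Running it against your real file (`json.load(open(path))`) catches syntax errors before Neovim does.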
I don't see a reason to add this text-generation-webui-specific config to the main repo; you can still override the settings as you mentioned.