Open: nathanbraun opened this issue 1 year ago
Hi, that is a great idea! But I am not sure if it is possible in vim. Do you have any clue how to do that or do you know any plugin that does something similar?
There is a library called denops that can implement vim plugins with deno.
https://github.com/vim-denops/denops.vim
Here is an example of a plugin that just runs the chatgpt API asynchronously using denops (about 130 lines of code).
https://github.com/tani/hey.vim
I'd be happy to get a reference if you'd like.
Wow, vim-denops looks really cool! It seems they manipulate the Vim buffer from a different process, which makes all processing asynchronous. The only problem I can see with this approach is that it would probably mess up the undo tree after completion (the user would need to hit undo many times to undo a single completion).
Anyway, at this point I don't want to introduce any third-party dependencies into vim-ai.
I tried out ChatGPT's web-browsing capabilities by asking GPT-4 how it would make this repository async (I gave it a link). Though it didn't actually browse or do anything with the link, this is what it said:
Prompt: Check out this plugin, which lets you use an openai key to interact with chat-gpt inside (neo)vim:
https://github.com/madox2/vim-ai
It'd be nice if it was asynchronous -- what do you think would be the easiest way to do that without introducing any third party plugin dependencies?
Response: In general, making network requests asynchronous in Vim or NeoVim can be done using their built-in features. Here are a few approaches that can be considered:
Job Control: NeoVim has support for job control, which allows you to run commands asynchronously. You can use this feature to execute the OpenAI API call in a separate job and handle the result when it's done. Here is a very basic example:
function! s:handle_exit(job_id, data, event)
  " process the result (for on_exit, a:data is the exit status)
endfunction

let s:job = jobstart(['command', 'arg1', 'arg2'], {
    \ 'on_exit': function('s:handle_exit')
    \ })
In this example, 'command arg1 arg2' is executed asynchronously. When it finishes, the s:handle_exit function is called with the job's exit status; to capture the job's output itself, you would also register an 'on_stdout' callback.
Timers: Vim 8 and NeoVim have support for timers. You can use this feature to run a function in the background at regular intervals. This won't make the API call itself asynchronous, but it can be used to poll for the result of an asynchronous call.
function! s:poll_for_result(timer)
" check if the result is available
endfunction
let timer = timer_start(1000, function('s:poll_for_result'), {'repeat': -1})
In this example, the s:poll_for_result function is called every 1,000 milliseconds (one second) until the timer is stopped with timer_stop().
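Since vim-ai's logic is implemented in Python, the polling idea above can be sketched in plain Python: a background thread performs the blocking API call and hands the result back through a queue that a timer-style callback checks without blocking. This is only an illustration; slow_request and the timings here are made up:

```python
import queue
import threading
import time

def slow_request():
    """Stand-in for a blocking OpenAI API call (hypothetical)."""
    time.sleep(0.1)
    return "completion text"

result_queue = queue.Queue()

def worker():
    # Run the blocking call off the main thread and hand back the result.
    result_queue.put(slow_request())

threading.Thread(target=worker, daemon=True).start()

def poll_for_result():
    """Non-blocking check, as a Vim timer callback would do each tick."""
    try:
        return result_queue.get_nowait()
    except queue.Empty:
        return None

# The editor would stay responsive between polls; here we just loop.
result = None
while result is None:
    time.sleep(0.02)  # simulates the timer interval
    result = poll_for_result()
print(result)  # -> completion text
```

The key point is that the timer callback never blocks: it either finds a finished result on the queue or returns immediately and tries again on the next tick.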
Channels: Vim 8 introduced the concept of channels, which can be used for communication with external processes. This is similar to the job control in NeoVim but works slightly differently.
function! s:ReadMessage(channel, msg)
  " handle the message
endfunction

let channel = ch_open('localhost:8765', {'callback': function('s:ReadMessage')})
call ch_sendexpr(channel, 'some message')

In this example, a channel is opened to a local server with a callback registered to handle responses, and then a message is sent. (Registering the callback at ch_open() time, rather than after sending, avoids missing an early reply.)
Please note that these are just examples and you would need to replace 'command', 'arg1', 'arg2', localhost:8765, and 'some message' with the actual command, server, and message you want to use, respectively.
In the case of the Vim plugin you linked to, the easiest way to make it asynchronous would probably be to use job control if it's targeted at NeoVim, or channels if it's targeted at Vim 8. This way, the OpenAI API call could be executed in a separate process, and the result could be processed when it's done. This would avoid blocking the editor while waiting for the API response.
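As a rough sketch of the separate-process idea in Python (which vim-ai already uses): launch the external command without waiting on it, and check poll() periodically instead of blocking. The echoed string below is a stand-in for a real API call:

```python
import subprocess
import sys
import time

# Hypothetical stand-in for an external API command (e.g. a curl call);
# here it just prints a canned response.
proc = subprocess.Popen(
    [sys.executable, "-c", "print('api response')"],
    stdout=subprocess.PIPE,
    text=True,
)

# poll() returns None while the process is still running, so an editor
# could keep handling input and check back on each timer tick.
while proc.poll() is None:
    time.sleep(0.01)

output = proc.stdout.read().strip()
print(output)  # -> api response
```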
Maybe this could be useful? https://github.com/prabirshrestha/async.vim
Looks cool, but this plugin does not execute external commands like bash; rather, it uses built-in Python.
Is there a way to get this feature implemented? Although this plugin is really cool, it loses to the web version every day.
Features you have while using the web version:
Features you have while using the plugin:
I would argue that the vim features are really limited in comparison with the web version.
> I would argue that the vim features are really limited in comparison with the web version.
I disagree; I use this (along with the .aichat files) almost 100% of the time. Cost is a non-issue: yeah, OpenAI can subsidize it/offer a free version on their site because they developed the models, but how is @madox2 supposed to make it free in this plugin? Also it's pretty cheap, especially with the new GPT-4 Turbo.
Agree async would be nice/seems like it should be possible though.
> I disagree; I use this (along with the .aichat files) almost 100% of the time.
The web version also allows for that. I do like that this plugin also offers it.
> Cost is a non-issue: yeah, OpenAI can subsidize it/offer a free version on their site because they developed the models, but how is @madox2 supposed to make it free in this plugin? Also it's pretty cheap, especially with the new GPT-4 Turbo.
I'm not saying this plugin should have a workaround for the OpenAI pricing; let's not jump to conclusions... Given that you are stuck waiting for the job to finish, it's no different from using the web version, and the web version is free (for now).
With that, the original quoted statement still remains true.
> I would argue that the vim features are really limited in comparison with the web version.
But I'll rephrase it to better convey what I meant: this is exactly ChatGPT inside Vim, no more, a tad less. Unless being inside Vim holds that much value.
Although it is not an ideal solution, you could chat in a terminal within vim, which would make it async (e.g. spawn a new vim instance in a split with :vert term vim and use that window for AIChat).
Regarding :help channel-demo, which uses Python as example code: isn't there still a chance that the Python code used in their demo could also be used by vim-ai to open a job channel?
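For reference, the server side of :help channel-demo can be approximated in a few lines of Python. This is only a sketch, not vim-ai code: in Vim's JSON channel mode, each message is a [sequence_number, expression] pair, and the reply must echo the same sequence number. The client below simulates what ch_sendexpr would put on the wire:

```python
import json
import socket
import threading

# Minimal stand-in for the server in Vim's :help channel-demo.
srv = socket.socket()
srv.bind(("localhost", 0))  # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

def handle_one_message():
    # Vim sends [sequence_number, expression]; reply with the same number.
    conn, _ = srv.accept()
    seq, msg = json.loads(conn.recv(4096).decode())
    conn.sendall(json.dumps([seq, "got: " + msg]).encode())
    conn.close()

threading.Thread(target=handle_one_message, daemon=True).start()

# Simulate what ch_sendexpr(channel, 'some message') sends over the socket.
client = socket.create_connection(("localhost", port))
client.sendall(json.dumps([1, "some message"]).encode())
reply = json.loads(client.recv(4096).decode())
client.close()
srv.close()
print(reply)  # -> [1, 'got: some message']
```

In a real plugin, the server would run the API call between receiving the message and sending the reply, so Vim never blocks.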
That looks promising and worth experimenting with. I'm just wondering how it would update the chat buffer without interrupting the user doing different actions, and whether it could keep consistent undo sequences.
I don't know if this could be helpful: https://github.com/tpope/vim-dispatch. They seem to have async stuff running in the background. For LLMs it may be more complex, as it affects the buffer.
I'd suppose that the main difficulty is that the output is continuously written into the buffer. Apparently :Start achieves something similar, giving the example of a REPL, but this rather seems to open a terminal inside Vim instead of an editable buffer; the output halts on leaving the buffer.
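To illustrate the continuous-output difficulty in Python terms: a streaming response can be drained incrementally from a queue on each timer tick, appending whatever chunks have arrived since the last tick. The chunk contents and timings here are invented:

```python
import queue
import threading
import time

chunks = queue.Queue()

def stream_worker():
    """Stand-in for a streamed API response arriving in pieces."""
    for piece in ["Hello", ", ", "world"]:
        chunks.put(piece)
        time.sleep(0.01)
    chunks.put(None)  # sentinel: the stream is finished

threading.Thread(target=stream_worker, daemon=True).start()

buffer_text = ""
done = False
while not done:
    # Drain whatever has arrived since the last tick; a Vim timer
    # callback would append this to the chat buffer instead.
    try:
        while True:
            piece = chunks.get_nowait()
            if piece is None:
                done = True
                break
            buffer_text += piece
    except queue.Empty:
        time.sleep(0.01)
print(buffer_text)  # -> Hello, world
```

Each tick appends a batch of chunks in one edit, which is also where something like :undojoin could keep the whole completion as a single undo step.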
Great plugin, thanks. I find myself mostly using it via chat, and sometimes waiting around while it types a long answer. I know neovim/vim 8 have async features -- would this be possible/hard to build in?