dustinblackman / oatmeal

Terminal UI to chat with large language models (LLMs) using different model backends, with integrations for your favourite editors!
https://dustinblackman.com/posts/oatmeal/
MIT License

Submit subsequent blocks of code to the chat. #51

Open · madsamjp opened this issue 6 months ago

madsamjp commented 6 months ago

After submitting a block of code to the model and having a back-and-forth chat, I've found it's not easy to submit another snippet of code. Also, the prompt window doesn't really allow for intuitive vim keybindings: it's just a single line and not easy to edit, which makes it impractical to paste blocks of code directly into the prompt window.

Would it be possible to paste selected blocks of code from the buffer back to the chat model, and/or make the prompt window more interactive to allow for constructing more complex prompts?

dustinblackman commented 5 months ago

Hey there! Sorry for the delay.

You're right, the back-and-forth aspect isn't great at the moment. Right now it expects you to talk about the single code block you send, and to start a new session afterwards. I'll move this issue to the main repo as a feature request. :)

For the single-line input box, that's being tracked in https://github.com/dustinblackman/oatmeal/issues/16. I'm waiting on the authors of the library to work on the feature, as my own attempts to implement it haven't been successful. :(
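
For anyone curious what multi-line prompt input could look like in the meantime, here's a minimal sketch. It is not oatmeal's implementation: it assumes the crossterm crate directly (not necessarily the input library oatmeal uses), and it picks Alt+Enter for newlines since most terminals can't report Shift+Enter. Plain Enter submits, Esc cancels.

```rust
// A rough sketch (not oatmeal's code) of reading a multi-line prompt
// with raw crossterm events: Alt+Enter inserts a newline, Enter submits,
// Esc cancels and returns None.
use std::io;

use crossterm::event::{self, Event, KeyCode, KeyEventKind, KeyModifiers};
use crossterm::terminal;

fn read_multiline_prompt() -> io::Result<Option<String>> {
    terminal::enable_raw_mode()?;
    let mut buffer = String::new();
    let result = loop {
        if let Event::Key(key) = event::read()? {
            // Ignore key-release events (reported on some platforms).
            if key.kind != KeyEventKind::Press {
                continue;
            }
            match key.code {
                // Alt+Enter adds a line break; plain Enter submits the prompt.
                KeyCode::Enter if key.modifiers.contains(KeyModifiers::ALT) => buffer.push('\n'),
                KeyCode::Enter => break Some(buffer),
                KeyCode::Esc => break None,
                KeyCode::Backspace => {
                    buffer.pop();
                }
                KeyCode::Char(c) => buffer.push(c),
                _ => {}
            }
        }
    };
    terminal::disable_raw_mode()?;
    Ok(result)
}

fn main() -> io::Result<()> {
    if let Some(prompt) = read_multiline_prompt()? {
        println!("submitting prompt:\n{prompt}");
    }
    Ok(())
}
```

A real implementation would also need cursor movement, rendering inside the TUI, and bracketed-paste handling, which is why waiting on the upstream text-input widget makes sense here.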