SychAndrii / infusion

A command line tool designed to help you generate documentation for your source code using LLMs.
MIT License

Feature: Allow user to see a response from LLM in real time #33

Closed by SychAndrii 1 month ago

SychAndrii commented 1 month ago

Anyone who has used ChatGPT is familiar with how an LLM responds in chunks, without the user having to wait for the whole answer to load. It would be great if there were an option, -s or --stream, that let the user watch the response being generated in real time.
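
A minimal sketch of what such a --stream path could look like, assuming the OpenAI Node SDK as the provider; the model name, helper function, and CLI wiring here are hypothetical and may differ from what infusion actually uses:

```ts
import OpenAI from "openai";

// Hypothetical helper: when the user passes -s / --stream, print the
// completion chunk by chunk instead of waiting for the full response.
async function streamCompletion(prompt: string): Promise<void> {
  const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

  const stream = await client.chat.completions.create({
    model: "gpt-4o-mini", // assumed model, purely illustrative
    messages: [{ role: "user", content: prompt }],
    stream: true, // ask the API to send the answer in chunks
  });

  // Each chunk carries a small delta of the answer; write it immediately
  // so the user sees the response appear in real time.
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
  process.stdout.write("\n");
}
```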

SychAndrii commented 1 month ago

Closed by https://github.com/SychAndrii/infusion/commit/b01f493a8eb3c86aad00760f41f8adf0b93b231e