Finity-Alpha / OpenVoiceChat

Have a natural voice conversation with an LLM
http://www.finityalpha.com/OpenVoiceChat/
Apache License 2.0

Interruption information to the LLM. #16

Closed fakhirali closed 2 months ago

fakhirali commented 2 months ago

Right now this just sends the interrupt transcript back to main.py, where it is added to the user_prompt. Refactoring is needed. Also, since the audio cuts off, it might be better to send back the audio instead of the interruption transcript, not sure.
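A minimal sketch of what that flow might look like, assuming a helper in main.py that merges the interruption into the next prompt. The function name, arguments, and history format here are illustrative, not the actual OpenVoiceChat API.

```python
def build_user_prompt(user_text: str, interrupt_transcript: str = "") -> str:
    """Combine the new user utterance with whatever the user said while
    interrupting the assistant's previous response (hypothetical helper)."""
    if interrupt_transcript:
        # Put the interruption first so the llm sees it before the new turn.
        return f"{interrupt_transcript} {user_text}".strip()
    return user_text
```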

fakhirali commented 2 months ago

Now it can identify when the interruption occurred. Need to send that back into the llm. Also, the say_multiple_stream function needs to be refactored heavily; it contains lots of repeated code.
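One way the playback loop could record where the interruption happened is to return the index of the last phrase that finished playing alongside an interrupted flag. This is only a sketch under assumed names (play_audio, interrupted), not the actual say_multiple_stream implementation.

```python
def play_phrases(phrases, play_audio, interrupted):
    """Play phrases one by one; stop early if interrupted() becomes True.

    Returns (was_interrupted, last_completed_index).
    """
    last_completed = -1
    for i, phrase in enumerate(phrases):
        if interrupted():          # e.g. VAD detected the user speaking
            return True, last_completed
        play_audio(phrase)         # synthesize and play this phrase
        last_completed = i
    return False, last_completed
```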

fakhirali commented 2 months ago

The llm now gets '...' when an interruption happens. The ellipsis is added after the last phrase that was completely said. For future work we could add it after the exact word where the interruption occurred.
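A rough sketch of how the assistant message stored in the chat history could be truncated to the phrases that were actually spoken, with '...' marking the cutoff. The phrase-splitting and history format are assumptions, not the repo's exact code.

```python
def truncate_on_interrupt(phrases, last_completed_index):
    """Keep only the phrases that were fully spoken and mark the cutoff."""
    spoken = phrases[: last_completed_index + 1]
    return (" ".join(spoken) + " ...") if spoken else "..."

# Example: interrupted after the first phrase was fully spoken
# truncate_on_interrupt(["Hello there.", "How can I help?"], 0)
# -> "Hello there. ..."
```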