Emotibot5 opened 1 month ago
How are you sending the chat message? The client-side application would need to send it.
I am using local STT, LLM, and TTS, and I can talk with the agent.
assistant = VoiceAssistant(
    vad=vad,
    stt=whisper_stt,
    llm=_llm,
    tts=_tts,
    chat_ctx=initial_ctx,
)
assistant.start(ctx.room)

# listen to incoming chat messages, only required if you'd like the agent to
# answer incoming messages from Chat
chat_mng = rtc.ChatManager(ctx.room)

async def answer_from_text(txt: str):
    chat_ctx = assistant.chat_ctx.copy()
    chat_ctx.append(role="user", text=txt)
    stream = _llm.chat(chat_ctx=chat_ctx)
    await assistant.say(stream)

@chat_mng.on("message_received")
def on_chat_received(msg: rtc.ChatMessage):
    if msg.message:
        print("=====", msg.message, " received")
        asyncio.create_task(answer_from_text(msg.message))
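The event handler above is a plain synchronous function, so the async LLM call has to be scheduled onto the running event loop with asyncio.create_task rather than awaited directly. A minimal pure-Python sketch of that dispatch pattern, using a stub emitter and a stub reply function in place of rtc.ChatManager and the real LLM/TTS pipeline (all names here are illustrative, not the LiveKit API):

```python
import asyncio


class StubEmitter:
    """Minimal stand-in for the decorator-style .on() event interface."""

    def __init__(self):
        self._handlers = {}

    def on(self, event):
        def register(fn):
            self._handlers.setdefault(event, []).append(fn)
            return fn
        return register

    def emit(self, event, *args):
        # Handlers run synchronously, like rtc event callbacks.
        for fn in self._handlers.get(event, []):
            fn(*args)


answers = []


async def answer_from_text(txt: str):
    # Stub for the LLM -> TTS round trip.
    answers.append(f"echo: {txt}")


async def main():
    chat_mng = StubEmitter()

    @chat_mng.on("message_received")
    def on_chat_received(msg: str):
        if msg:
            # The handler is sync; schedule the async reply on the loop.
            asyncio.create_task(answer_from_text(msg))

    chat_mng.emit("message_received", "hello")
    await asyncio.sleep(0)  # yield once so the scheduled task can run


asyncio.run(main())
print(answers)  # ['echo: hello']
```

Note that if the handler is invoked while no event loop is running, asyncio.create_task raises a RuntimeError; inside a running agent the callbacks fire on the loop, so this is safe.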
It is weird that I can talk with the agent, but when I set a breakpoint in debug mode, the program does not suspend there.
message_received would not be fired unless you are sending Chat messages from the client. This is expected.
My goal is to use streaming mode to feed LLM chunk data into TTS, to speed up the voice response.
I am running this example:
https://github.com/livekit/agents/blob/main/examples/voice-assistant/minimal_assistant.py
The message_received event is never triggered.
chat = rtc.ChatManager(ctx.room)
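The streaming goal mentioned above can be sketched with stubs: an async generator standing in for the LLM token stream, and a TTS stub that consumes each chunk as soon as it arrives rather than waiting for the full reply. The function and class names below are illustrative, not the LiveKit API; in the real pipeline assistant.say(stream) accepts the LLM stream and handles this piping internally.

```python
import asyncio


async def llm_stream():
    # Stub LLM: yields text chunks as they are generated.
    for chunk in ["Hello", ", ", "world", "!"]:
        await asyncio.sleep(0)  # simulate per-chunk generation latency
        yield chunk


spoken = []


async def tts_speak(chunk: str):
    # Stub TTS: synthesizes one chunk immediately.
    spoken.append(chunk)


async def speak_streamed():
    # Pipe chunks into TTS as they arrive instead of buffering the whole
    # LLM reply first -- this is what cuts the time to first audio.
    async for chunk in llm_stream():
        await tts_speak(chunk)


asyncio.run(speak_streamed())
print("".join(spoken))  # Hello, world!
```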