livekit / agents

Build real-time multimodal AI applications 🤖🎙️📹
https://docs.livekit.io/agents
Apache License 2.0
653 stars 106 forks

ChatMessage doesn't get appended in the ChatContext #408

Open naman-scogo opened 4 days ago

naman-scogo commented 4 days ago

I have made a Voice Assistant based on the kitt example.

It can interact over voice as well as text messages.

Once the user ends the call/room, the assistant is designed to process all the messages and store them in a persistent database.

However, after the call ends, the ChatMessages pushed to the ChatContext via the llm.chat() method while answering the user's text messages are not present in the messages array.

The same happens if I push a system message to the context while sending it to the LLM after the function calls have finished.
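One plausible explanation for the missing messages (an assumption, not confirmed by the source): the assistant snapshots the ChatContext when it starts, so appends made afterwards to the *caller's* reference never reach the copy the assistant actually uses. The minimal sketch below uses hypothetical `ChatContext`/`Assistant` classes (not the real livekit-agents API) purely to illustrate that copy semantics:

```python
import copy
from dataclasses import dataclass, field


@dataclass
class ChatMessage:
    role: str
    text: str


@dataclass
class ChatContext:
    messages: list = field(default_factory=list)


class Assistant:
    """Hypothetical assistant that deep-copies the context it is given,
    as some agent frameworks do internally."""

    def __init__(self, chat_ctx: ChatContext):
        self._chat_ctx = copy.deepcopy(chat_ctx)  # internal snapshot

    @property
    def chat_ctx(self) -> ChatContext:
        return self._chat_ctx


original = ChatContext()
assistant = Assistant(original)

# Appending to the original AFTER construction is invisible to the assistant:
original.messages.append(ChatMessage("user", "hello via text"))
print(len(original.messages))            # 1
print(len(assistant.chat_ctx.messages))  # 0 -- the snapshot never saw it

# Appending through the assistant's own context is what would persist:
assistant.chat_ctx.messages.append(ChatMessage("user", "hello via text"))
print(len(assistant.chat_ctx.messages))  # 1
```

If this is the cause, the fix would be to append through whatever context object the assistant itself holds, rather than a local reference created before the assistant started.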

Reynold97 commented 4 days ago

Hi, friend. I was trying to run the same example but I can't get it to work. On my Windows machine I always get this error:

```
{"job_id": "AJ_juzTLFapA3Pq", "pid": 12844, "timestamp": "2024-07-02T20:29:42.814037+00:00"}
silero stream failed
Traceback (most recent call last):
  File "D:\!!!Trabajo\!!Natasquad\Repos\Live-Agent\env\lib\site-packages\livekit\plugins\silero\vad.py", line 148, in _run
    await asyncio.shield(self._run_inference())
  File "D:\!!!Trabajo\!!Natasquad\Repos\Live-Agent\env\lib\site-packages\livekit\plugins\silero\vad.py", line 171, in _run_inference
    raw_prob = await asyncio.to_thread(
  File "D:\!!!Trabajo\!!Natasquad\Repos\Live-Agent\env\lib\asyncio\threads.py", line 25, in to_thread
    return await loop.run_in_executor(None, func_call)
  File "D:\!!!Trabajo\!!Natasquad\Repos\Live-Agent\env\lib\concurrent\futures\thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "D:\!!!Trabajo\!!Natasquad\Repos\Live-Agent\env\lib\site-packages\livekit\plugins\silero\vad.py", line 172, in <lambda>
    lambda: self._model(tensor, self._sample_rate).item()
  File "C:\Users\Reynold/.cache\torch\hub\snakers4_silero-vad_v4.0\utils_vad.py", line 65, in __call__
    ort_outs = self.session.run(None, ort_inputs)
  File "D:\!!!Trabajo\!!Natasquad\Repos\Live-Agent\env\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 220, in run
    return self._sess.run(output_names, input_feed, run_options)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Unexpected input data type. Actual: (tensor(int32)), expected: (tensor(int64))
```

TL;DR:

```
[ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Unexpected input data type. Actual: (tensor(int32)), expected: (tensor(int64))
```
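This dtype mismatch is consistent with NumPy's platform-dependent default integer: on Windows (with NumPy < 2.0), arrays built from plain Python ints default to int32, while the Silero ONNX model expects int64 for its sample-rate input. A minimal sketch of the workaround, assuming the fix is an explicit cast before handing inputs to ONNX Runtime:

```python
import numpy as np

# On Windows with NumPy < 2.0, this array may come out as int32, which is
# exactly the dtype the ONNX session rejects. On Linux/macOS it is int64.
sr = np.array(16000)

# An explicit cast makes the dtype int64 on every platform:
sr64 = np.asarray(sr, dtype=np.int64)
print(sr64.dtype)  # int64

# With onnxruntime, the same cast would be applied when building the input
# feed (input names here are illustrative, not the model's actual names):
# ort_inputs = {"input": audio.astype(np.float32), "sr": sr64}
# ort_outs = session.run(None, ort_inputs)
```

If patching the plugin locally, the cast belongs wherever the sample rate (or any integer tensor) is passed into `session.run`.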