24samj opened 2 weeks ago
I have the same use case, and I was able to detect those events on the createStartAvatar debug stream.
But I think the SDK should provide a better way for us to detect when the avatar starts and stops talking.
```js
const res = await avatar.value.createStartAvatar(
  {
    newSessionRequest: {
      quality: "low",
      avatarName: "Kristin_public_2_20240108",
      voice: { voiceId: "" },
    },
  },
  (message) => {
    try {
      // Debug messages look like: 'Received event: {"type":"..."}'
      const event = JSON.parse(message.replace("Received event:", "").trim());
      if (!event.type) return;
      if (event.type === "avatar_start_talking") {
        // STARTED TALKING
      }
      if (event.type === "avatar_stop_talking") {
        // STOPPED TALKING
      }
    } catch {
      // Ignore debug messages that aren't JSON events.
    }
  }
);
```
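The parsing in that debug callback can be wrapped in a small state tracker so the rest of the app only deals with a boolean. This is a sketch of my own, not part of the HeyGen SDK; `TalkingStateTracker` and its method names are illustrative:

```typescript
// Tracks whether the avatar is currently talking, driven by the SDK's
// debug messages of the form 'Received event: {"type":"..."}'.
// Illustrative helper only -- not part of the HeyGen SDK.
type TalkingListener = (isTalking: boolean) => void;

class TalkingStateTracker {
  private talking = false;

  constructor(private onChange: TalkingListener) {}

  get isTalking(): boolean {
    return this.talking;
  }

  // Feed each raw debug string from the createStartAvatar callback here.
  handleDebugMessage(raw: string): void {
    let event: { type?: string };
    try {
      event = JSON.parse(raw.replace("Received event:", "").trim());
    } catch {
      return; // Not a JSON event message; ignore it.
    }
    if (event.type === "avatar_start_talking" && !this.talking) {
      this.talking = true;
      this.onChange(true);
    } else if (event.type === "avatar_stop_talking" && this.talking) {
      this.talking = false;
      this.onChange(false);
    }
  }
}
```

You would pass `tracker.handleDebugMessage` as the debug callback and toggle your app's UI in the `onChange` listener.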
I'm using their SDK in a project and would like to disable all app features while the avatar is speaking. I tried to use the video tag's audioStreams, but it looks like there's only one audio stream, and it remains active for as long as the session is open, rather than only while the avatar itself is speaking.
Does HeyGen provide a variable/hook to check whether the avatar is currently speaking?
Original issue: https://github.com/HeyGen-Official/StreamingAvatarSDK/issues/5#issue-2323736404