microsoft / BotFramework-WebChat

A highly-customizable web-based client for Azure Bot Services.
https://www.botframework.com/
MIT License

Adaptive Card speak problem #2243

Closed Monomoy88 closed 5 years ago

Monomoy88 commented 5 years ago

In my web app I'm using the hybrid speech model as per the Web Chat sample code. If an Adaptive Card has a speak field and I'm communicating with the bot using the microphone, the bot does not speak when the card comes back as a response. To work around this, I've specified the same text as the activity's SSML in my bot code, as shown below:

var result = JsonConvert.DeserializeObject<AdaptiveModel>(cardAttachment.Content.ToString());

// Mirror the card's speak property onto the activity's SSML so it gets read out.
if (!string.IsNullOrEmpty(result.speak))
    await turnContext.SendActivityAsync(MessageFactory.Attachment(cardAttachment, "", result.speak, "en"), cancellationToken);

Now the bot is speaking the same text twice, because Web Chat generates the following SSML when sending it to Cognitive Services Speech:

<speak version="1.0" xml:lang="en-US">
  <voice xml:lang="en-US" xml:gender="undefined" name="Microsoft Server Speech Text to Speech Voice (en-US, ZiraRUS)">
    <prosody pitch="+0%" rate="+0%" volume="+0%">
      <s>Adaptive Card Text</s>
      <s>Adaptive Card Text</s>
    </prosody>
  </voice>
</speak>

So either the bot is not speaking at all, or it is speaking the same text twice for an Adaptive Card. How can I get rid of this problem?
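
For reference, the Web Chat side is wired up along the lines of the hybrid speech sample: the browser's Web Speech API handles microphone input and Cognitive Services Speech handles synthesis. This is only a rough sketch; the token endpoints, region, and element ID below are placeholders for my actual values.

(async function () {
  // Placeholder endpoints that return a Direct Line token and a Speech authorization token.
  const { token: directLineToken } = await fetch('https://example.com/api/directline/token', { method: 'POST' }).then(res => res.json());
  const { token: speechToken } = await fetch('https://example.com/api/speech/token', { method: 'POST' }).then(res => res.json());

  // Cognitive Services Speech handles synthesis (the bot speaking back)...
  const speechServicesPonyfillFactory = await window.WebChat.createCognitiveServicesSpeechServicesPonyfillFactory({
    credentials: { authorizationToken: speechToken, region: 'westus' }
  });

  // ...while the browser's Web Speech API handles recognition (microphone input).
  const browserPonyfillFactory = window.WebChat.createBrowserWebSpeechPonyfillFactory();

  window.WebChat.renderWebChat(
    {
      directLine: window.WebChat.createDirectLine({ token: directLineToken }),
      webSpeechPonyfillFactory: options => {
        const speechServices = speechServicesPonyfillFactory(options);
        const browser = browserPonyfillFactory(options);

        // Combine the two ponyfills: recognition from the browser, synthesis from Cognitive Services.
        return {
          SpeechGrammarList: browser.SpeechGrammarList,
          SpeechRecognition: browser.SpeechRecognition,
          speechSynthesis: speechServices.speechSynthesis,
          SpeechSynthesisUtterance: speechServices.SpeechSynthesisUtterance
        };
      }
    },
    document.getElementById('webchat')
  );
})();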

corinagum commented 5 years ago

Could you provide the JSON for your Adaptive Card for testing?

Monomoy88 commented 5 years ago

Here is a sample Adaptive Card:

{
  "type": "AdaptiveCard",
  "speak": "<s>Adaptive Card Text</s>",
  "body": [
    {
      "type": "Container",
      "separator": true,
      "style": "default",
      "items": [
        {
          "type": "TextBlock",
          "horizontalAlignment": "Left",
          "size": "Medium",
          "weight": "Bolder",
          "color": "Accent",
          "text": "Adaptive Card Title",
          "wrap": true
        }
      ]
    },
    {
      "type": "Container",
      "separator": true,
      "style": "default",
      "items": [
        {
          "type": "Image",
          "altText": "",
          "url": "https://cdn.cnn.com/cnnnext/dam/assets/190121090951-04-blood-moon-global-01212019-exlarge-169.jpg"
        },
        {
          "type": "TextBlock",
          "weight": "Lighter",
          "text": "- Sky gazers were treated to a rare lunar eclipse known as a super blood wolf moon on Sunday night, in which sunlight passing through Earth's atmosphere lit the celestial body in a dramatic fashion and turned it red.\n- Watchers in North and South America, parts of Europe and western Africa, who were lucky enough to have clear skies, saw a total lunar eclipse -- but eastern Africa and Asia observed a partial eclipse.\n- Hundreds of people came out late on Sunday night or early Monday morning to witness the event, capturing images of the super blood wolf moon and sharing it on Twitter.",
          "wrap": true,
          "horizontalAlignment": "Left",
          "size": "Medium"
        }
      ]
    }
  ],
  "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
  "version": "1.0"
}

tdurnford commented 5 years ago

@Monomoy88 is correct. Web Chat is not speaking the speak property of an Adaptive Card. I was able to reproduce the issue using the JSON he provided, as well as with an Adaptive Card that only has the speak property set.

{
  "type": "AdaptiveCard",
  "version": "1.2",
  "body": [],
  "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
  "speak": "The bot should speak this."
}