axa-group / nlp.js

An NLP library for building bots, with entity extraction, sentiment analysis, automatic language identification, and more
MIT License

Discussion: Bot orchestration #715

Open jesus-seijas-sp opened 3 years ago

jesus-seijas-sp commented 3 years ago

Hello!

For version 5 we are working on chatbot orchestration. You can see an explanation and an example in this comment: https://github.com/axa-group/nlp.js/issues/713#issuecomment-724072580

There you can see the commands that are already implemented, but we want to start a dialogue to get feedback about which other commands could be implemented.

Also, we developed a connector for building bots using CDD (Conversation Driven Development); you can see an example here: https://github.com/axa-group/nlp.js/blob/master/packages/bot/test/bot.test.js#L54 This unit test runs the bot and tests it against a scenario described in the file scenario01.dlt, with this content:

user> hello
bot> What's your name?
user> John
bot> (Converting name to uppercases...)
bot> Hello JOHN
bot> This is the help
bot> This is the second help
bot> Bye user

So, feel free to comment on the features you would like to see related to conversation orchestration.

Thank you!

intech commented 3 years ago

If I understand correctly, then this is an analogue of stories in Rasa, and the name is more logical.

They have interactive training that creates a story out of a dialogue, which can also be used for tests. It is very convenient.

I prefer the idea of a single syntax (JSON), but for some reason in nlp.js everything is divided between JSON, Markdown, and a kind of pipeline, which is not very convenient.

syntithenai commented 3 years ago

Hi, I've been working on something similar.

I've implemented something like RASA core with rules, stories and forms based on your neural network library so it can run in a browser. https://github.com/syntithenai/voicedialogjs

I'm using it in https://opennludata.org/, which is a web UI for annotating intent data and importing/exporting Jovo/RASA/Mycroft intents/responses/entities. When I saw your neural network library I figured I could use it for the stories part of Rasa-style core routing, and I've added features to the UI to manage stories, forms, rules, actions and APIs (written in JavaScript to run in a browser) that can be exported as a single JSON file containing everything needed to run the bot in the browser.

Published skills are available as a standalone chat application served from GitHub, e.g. https://opennludata.org/static/skills/Syntithenai-music%20player.html

All heavily based on your work with NLP.js, so THANKS A MILLION. Still a work in progress and a bit of a source-code dog's breakfast, but I saw your post and it seemed timely to reply.

I think the combination of forms (slot filling) with machine learning based stories is a winner in building a bot framework.

I've been pondering what other conversational structures might have a place; the idea of goals and goal completion would be a good addition.

Food for thought

Again thanks.

Steve

ericzon commented 3 years ago

From my point of view, one of the most interesting purposes for the orchestration would be something like this, keeping these assumptions in mind:

getWeatherCondition: async code attached to the bot with registerAction or similar, where the getWeatherAnswer value could be accessed, along with other data like the user's coordinates, to perform an async request to a weather API.

getWeatherCard: a function that can receive parameters and access the context to generate the JSON of an adaptive card (a sketch of how these two pieces might be registered follows the scripts below).

plot1.dlg

dialog main
  nlp

dialog askWeatherDialog
  say do you want to know the weather?
  ask getWeatherAnswer
  [if getWeatherCondition is true] run acceptGetWeather
  [else] run declineGetWeather

dialog acceptGetWeather
  read weather.dlg
  run byeDialog

dialog declineGetWeather
  say well, If you have any request, just tell me
  run main

dialog byeDialog
  say bye bye :)

weather.dlg

dialog acceptGetWeather
  say well, the weather in your coordinates is 
  card getWeatherCard
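
Roughly, the two pieces of custom code these assumptions describe could be registered like this. This is only a sketch: fetchWeather is a hypothetical stand-in for a real weather API request, the card layout is a placeholder, and the action signature (session, context, params) and session.sendCard are the ones described later in this thread.

// Hypothetical helper standing in for an async request to a weather API
const fetchWeather = async (latitude, longitude) => ({ description: 'sunny', temperature: 21 });

const getWeatherCondition = async (session, context, params) => {
  // Interpret the stored answer and, if affirmative, fetch the weather for the user coordinates
  context.getWeatherCondition = /^(y|yes|sure|ok)/i.test(context.getWeatherAnswer || '');
  if (context.getWeatherCondition) {
    context.weather = await fetchWeather(context.latitude, context.longitude);
  }
};

const getWeatherCard = (session, context, params) => {
  // Build a minimal adaptive card from the weather stored in the context and send it
  session.sendCard(
    {
      type: 'message',
      attachments: [
        {
          contentType: 'application/vnd.microsoft.card.adaptive',
          content: {
            $schema: 'http://adaptivecards.io/schemas/adaptive-card.json',
            version: '1.0',
            type: 'AdaptiveCard',
            body: [
              {
                type: 'TextBlock',
                text: `The weather in your coordinates is ${context.weather.description}, ${context.weather.temperature}°C`,
              },
            ],
          },
        },
      ],
    },
    context
  );
};

bot.registerAction('getWeatherCondition', getWeatherCondition);
bot.registerAction('getWeatherCard', getWeatherCard);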

jesus-seijas-sp commented 3 years ago

Hello, a little update on the latest version, 4.16, and what it includes:

You have an example here: https://github.com/jesus-seijas-sp/nlpjs-examples/blob/master/04.bot

Comments

If a line starts with # then it is a comment. Example:

# this is a comment

Import another .dlg

If a line starts with import, then it will import other .dlg files. You can provide several, separated by spaces:

import something.dlg ./others/other.dlg

language

If a line starts with language, it sets the locale for the code that follows, until another language command is found.

# this sets the language to english
language en

intents

You can add new intents using .dlg. They will be added to the language of the last language command. For each intent you can define the utterances, tests and answers:

language en
intent agent.acquaintance
  utterances
  - say about you
  - describe yourself
  - tell me about yourself
  - who are you
  - I want to know more about you
  answers
  - I'm a virtual agent
  - Think of me as a virtual agent

entity

In the same way that you can define intents, you can define entities. There are two different kinds of entities that you can define: enum and regex.

entity hero
  - spiderman: spiderman, spider-man
  - ironman: ironman, iron-man
  - thor: thor
entity email 
  regex /\b(\w[-._\w]*\w@\w[-._\w]*\w\.\w{2,3})\b/gi

This code will create two entities: hero and email. Email is a regex entity. Hero is an enum entity with three options: spiderman, ironman and thor. Spiderman is identified when the text "spiderman" or "spider-man" is found.

dialog

This will create a new dialog with a pipeline of commands. Example:

# Script for a simple turn conversation
import corpus.dlg
dialog main
  nlp
dialog hellodialog
  [!user_name] run greet
  run bye
dialog greet
  say Hello user!
  say Tell me your name
  ask user_name
  call uppers user_name
  [user_name !== 'ADMIN'] say Hello {{ user_name }}
  [user_name === 'ADMIN'] say You unblocked admin mode
dialog bye
  say Bye user 

say

This is the command for the chatbot to say something to the user.

say Hello, I'm a chatbot

ask

This waits for input from the user and stores it into a variable. Important: this will not say anything to the user, it just waits for the input and stores it. If you want to say something to the user, use say.

say What's your name?
ask name

This will store the input from the user in the name variable.

run

Executes another dialog by name. Dialog execution uses a stack that also stores the last position of each dialog on the stack. Example:

dialog main
  run greet
  run help
  run bye
dialog greet
  say hello user
dialog help
  say I'm a bot
  say You can ask me questions
dialog bye
  say bye user

nlp

This command passes the user input to the NLP and retrieves the answer. Important thing here: the answer from the NLP can be a message to the user or can start with /. If the answer starts with /, it means a dialog is to be executed, so it acts as "run /dialogname". In the example, if you go to the corpus you'll find this:

https://github.com/jesus-seijas-sp/nlpjs-examples/blob/master/04.bot/corpus.dlg#L502

intent greetings.hello
  utterances
  - hello
  - hi
  - howdy
  answers
  - /hellodialog

That means that when someone says 'Hello' to the bot, the dialog '/hellodialog' is executed:

dialog hellodialog
  [!user_name] run greet
  run bye

inc

It will increment a variable given its name. You can provide the increment or not; by default it is 1. If the variable does not exist, it is initialized to 0 before the inc.

# This will add 1 to count
inc count
# This will add 3 to count
inc count 3

dec

It will decrement a variable given its name. You can provide the decrement or not; by default it is 1. If the variable does not exist, it is initialized to 0 before the dec.

# This will subtract 1 from count
dec count
# This will subtract 3 from count
dec count 3

set

This will set a value to a variable. You can provide an expression:

# this will set count to (count*2+1)
set count count * 2 + 1

conditions

You can add conditions before each command, so the command will be executed only if the condition evaluates to truthy. Just add the condition between square brackets:

# Script for a simple turn conversation
import corpus.dlg
dialog main
  nlp
dialog hellodialog
  [!user_name] run greet
  run bye
dialog greet
  say Hello user!
  say Tell me your name
  ask user_name
  call uppers user_name
  [user_name !== 'ADMIN'] say Hello {{ user_name }}
  [user_name === 'ADMIN'] say You unblocked admin mode
dialog bye
  say Bye user

The [!user_name] run greet means that the dialog greet will be executed only if the variable user_name does not exist. When user_name is ADMIN, the user will receive the message "You unblocked admin mode"; otherwise the user will receive "Hello {{ user_name }}".

String templating

When you see "Hello {{ user_name }}", that means the part {{ user_name }} will be replaced with the variable user_name from the context.

call

This is used to call functions, so you can code actions for the chatbot yourself. In the example you'll find:

  call uppers user_name

That means that the bot will try to find the function uppers and call it with the parameter "user_name". Important: user_name will not be replaced with the user name from the context; it will be provided exactly as is, a string with the value "user_name". The function uppers has this code:

const uppers = (session, context, params) => {
  if (params) {
    const variableName = typeof params === 'string' ? params : params[0];
    if (variableName) {
      context[variableName] = (context[variableName] || '').toUpperCase();
    }
  }
};

The way to register an action is by calling bot.registerAction with the name of the action and the function to be executed:

bot.registerAction('uppers', uppers)

The signature of each action is:

  function action(session, context, params)

So the action will receive the session object, the context object and the parameters.
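
For completeness, a minimal sketch of how this wiring might look, assuming a conf.json that loads the bot plugin and points to the .dlg script, as in the linked 04.bot example (the exact configuration may differ):

const { dockStart } = require('@nlpjs/basic');

// The uppers action shown above, compacted
const uppers = (session, context, params) => {
  const name = typeof params === 'string' ? params : params && params[0];
  if (name) context[name] = (context[name] || '').toUpperCase();
};

(async () => {
  const dock = await dockStart(); // reads conf.json, assumed to include the bot plugin and the .dlg scripts
  const bot = dock.get('bot');
  bot.registerAction('uppers', uppers); // the script can now execute: call uppers user_name
})();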

jesus-seijas-sp commented 3 years ago

New features added: the suggest command (suggesting options to the user) and the card command (sending a card defined in an imported JSON file).

Example:

# Script for a simple turn conversation
import corpus-ner.json
import card01.json card02.json card03.json
dialog main
  nlp
dialog demo01
  suggest Car|Bus|Bicycle
  say Please enter your mode of transport.
dialog demo02
  card card02

card02.json content:

{
  "name": "card02",
  "type": "message",
  "attachments": [
    {
      "contentType": "application/vnd.microsoft.card.adaptive",
      "content": {
        "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
        "version": "1.0",
        "type": "AdaptiveCard",
        "speak": "Your flight is confirmed for you and 3 other passengers from San Francisco to Amsterdam on Friday, October 10 8:30 AM",
        "body": [
          {
            "type": "TextBlock",
            "text": "Passengers",
            "weight": "bolder",
            "isSubtle": false
          },
          {
            "type": "TextBlock",
            "text": "Sarah Hum",
            "separator": true
          },
          {
            "type": "TextBlock",
            "text": "Jeremy Goldberg",
            "spacing": "none"
          },
          {
            "type": "TextBlock",
            "text": "Evan Litvak",
            "spacing": "none"
          },
          {
            "type": "TextBlock",
            "text": "2 Stops",
            "weight": "bolder",
            "spacing": "medium"
          },
          {
            "type": "TextBlock",
            "text": "Fri, October 10 8:30 AM",
            "weight": "bolder",
            "spacing": "none"
          },
          {
            "type": "ColumnSet",
            "separator": true,
            "columns": [
              {
                "type": "Column",
                "width": 1,
                "items": [
                  {
                    "type": "TextBlock",
                    "text": "San Francisco",
                    "isSubtle": true
                  },
                  {
                    "type": "TextBlock",
                    "size": "extraLarge",
                    "color": "accent",
                    "text": "SFO",
                    "spacing": "none"
                  }
                ]
              },
              {
                "type": "Column",
                "width": "auto",
                "items": [
                  {
                    "type": "TextBlock",
                    "text": " "
                  },
                  {
                    "type": "Image",
                    "url": "http://adaptivecards.io/content/airplane.png",
                    "size": "small",
                    "spacing": "none"
                  }
                ]
              },
              {
                "type": "Column",
                "width": 1,
                "items": [
                  {
                    "type": "TextBlock",
                    "horizontalAlignment": "right",
                    "text": "Amsterdam",
                    "isSubtle": true
                  },
                  {
                    "type": "TextBlock",
                    "horizontalAlignment": "right",
                    "size": "extraLarge",
                    "color": "accent",
                    "text": "AMS",
                    "spacing": "none"
                  }
                ]
              }
            ]
          },
          {
            "type": "TextBlock",
            "text": "Non-Stop",
            "weight": "bolder",
            "spacing": "medium"
          },
          {
            "type": "TextBlock",
            "text": "Fri, October 18 9:50 PM",
            "weight": "bolder",
            "spacing": "none"
          },
          {
            "type": "ColumnSet",
            "separator": true,
            "columns": [
              {
                "type": "Column",
                "width": 1,
                "items": [
                  {
                    "type": "TextBlock",
                    "text": "Amsterdam",
                    "isSubtle": true
                  },
                  {
                    "type": "TextBlock",
                    "size": "extraLarge",
                    "color": "accent",
                    "text": "AMS",
                    "spacing": "none"
                  }
                ]
              },
              {
                "type": "Column",
                "width": "auto",
                "items": [
                  {
                    "type": "TextBlock",
                    "text": " "
                  },
                  {
                    "type": "Image",
                    "url": "http://adaptivecards.io/content/airplane.png",
                    "size": "small",
                    "spacing": "none"
                  }
                ]
              },
              {
                "type": "Column",
                "width": 1,
                "items": [
                  {
                    "type": "TextBlock",
                    "horizontalAlignment": "right",
                    "text": "San Francisco",
                    "isSubtle": true
                  },
                  {
                    "type": "TextBlock",
                    "horizontalAlignment": "right",
                    "size": "extraLarge",
                    "color": "accent",
                    "text": "SFO",
                    "spacing": "none"
                  }
                ]
              }
            ]
          },
          {
            "type": "ColumnSet",
            "spacing": "medium",
            "columns": [
              {
                "type": "Column",
                "width": "1",
                "items": [
                  {
                    "type": "TextBlock",
                    "text": "Total",
                    "size": "medium",
                    "isSubtle": true
                  }
                ]
              },
              {
                "type": "Column",
                "width": 1,
                "items": [
                  {
                    "type": "TextBlock",
                    "horizontalAlignment": "right",
                    "text": "$4,032.54",
                    "size": "medium",
                    "weight": "bolder"
                  }
                ]
              }
            ]
          }
        ]
      }
    }
  ]
}

[image: the rendered adaptive card]

torloneg commented 3 years ago

Is it possible to send a card as a template?

jesus-seijas-sp commented 3 years ago

@torloneg Yes, you can include templating inside the card, and it will be replaced with context information. When you're sending a card, all of the card's object nodes are visited and passed through a template, so if you put something like "Hello {{ name }}", the "{{ name }}" part will be replaced by the value of context.name.
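
For example, a minimal hypothetical card (the name and text are just placeholders) whose text is templated from the context, following the structure of card02.json above:

const greetingCard = {
  name: 'cardGreeting', // hypothetical card name
  type: 'message',
  attachments: [
    {
      contentType: 'application/vnd.microsoft.card.adaptive',
      content: {
        $schema: 'http://adaptivecards.io/schemas/adaptive-card.json',
        version: '1.0',
        type: 'AdaptiveCard',
        body: [
          // {{ name }} will be replaced with context.name when the card is sent
          { type: 'TextBlock', text: 'Hello {{ name }}' },
        ],
      },
    },
  ],
};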

torloneg commented 3 years ago

Can I generate a card at runtime in the onIntent function?

jesus-seijas-sp commented 3 years ago

No, not in an onIntent, because onIntent is part of the NLP, not part of the bot orchestration. But you can do it with a "call", which, as explained earlier in this thread, is used to build actions yourself.

But ok, step by step. 1) In your code you can build your card dynamically, but here is an example as a constant (a dynamic variant is sketched after step 6):

const card = {
  "name": "card04",
  "type": "message",
  "attachments": [
    {
      "contentType": "application/vnd.microsoft.card.adaptive",
      "content": {
        "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
        "version": "1.0",
        "type": "AdaptiveCard",
        "speak": "Your flight is confirmed for you and 3 other passengers from San Francisco to Amsterdam on Friday, October 10 8:30 AM",
        "body": [
          {
            "type": "TextBlock",
            "text": "Passengers",
            "weight": "bolder",
            "isSubtle": false
          },
          {
            "type": "TextBlock",
            "text": "Sarah Hum",
            "separator": true
          },
          {
            "type": "TextBlock",
            "text": "Jeremy Goldberg",
            "spacing": "none"
          },
          {
            "type": "TextBlock",
            "text": "Evan Litvak",
            "spacing": "none"
          },
          {
            "type": "TextBlock",
            "text": "2 Stops",
            "weight": "bolder",
            "spacing": "medium"
          },
          {
            "type": "TextBlock",
            "text": "Fri, October 10 8:30 AM",
            "weight": "bolder",
            "spacing": "none"
          },
          {
            "type": "Input.Text",
            "placeholder": "Placeholder text",
            "id": "inputText"
          },
          {
            "type": "Input.Date",
            "id": "inputDate"
          },
          {
            "type": "Input.Time",
            "id": "inputTime"
          },
          {
            "type": "ActionSet",
            "actions": [
              {
                "type": "Action.Submit",
                "title": "Send!"
              }
            ]
          }
        ]
      }
    }
  ]
}

2) Create the code for an action. An action receives the session, the context, and the parameters. The session contains the method "sendCard" that receives the card and the context.

const sendCard = (session, context, params) => {
  session.sendCard(card, context);
};

3) Register the action to the bot with a name. That way you'll be able to call this action from the script of the bot using 'call'.

  bot.registerAction('sendCard', sendCard);

4) Define a dialog that calls this action:

dialog dialogCard
  call sendCard

Ok, now we have the dialogCard dialog, which will call the sendCard action, which sends the card. But how do we call this dialog from an intent?

5) Create an intent where the answer is '/dialogCard'
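
For example, a hypothetical intent (the name and utterances are just placeholders), using the same intent syntax shown earlier in this thread:

intent flight.card
  utterances
  - show me my flight
  - show my booking
  answers
  - /dialogCard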

6) Run the bot, trigger the intent.
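
Since the original question was about generating the card at runtime rather than using a constant, here is a hedged sketch of the same kind of action building a much smaller card dynamically from context values (the field names are hypothetical); it would be registered with bot.registerAction just like sendCard above:

const sendDynamicCard = (session, context, params) => {
  // Build the card at runtime from whatever is currently in the context
  const dynamicCard = {
    type: 'message',
    attachments: [
      {
        contentType: 'application/vnd.microsoft.card.adaptive',
        content: {
          $schema: 'http://adaptivecards.io/schemas/adaptive-card.json',
          version: '1.0',
          type: 'AdaptiveCard',
          body: [
            { type: 'TextBlock', text: `Hello ${context.user_name || 'there'}` },
            { type: 'TextBlock', text: new Date().toLocaleDateString() },
          ],
        },
      },
    ],
  };
  session.sendCard(dynamicCard, context);
};

bot.registerAction('sendDynamicCard', sendDynamicCard);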

rohit-32 commented 3 years ago

Hello, I am using nlp.js in a React Native project. Is it possible to achieve bot orchestration in React Native? If not, what would be the right way to do it?

FYI, I would prefer to do it in code rather than in the script format.

Thank you