MeetMartin opened this issue 3 years ago
Short answer: this library is an NLP library, not a chatbot orchestrator; we recommend using Microsoft Bot Framework for that.
Long answer: I'm preparing a chatbot orchestrator that will be the main feature introduced in version 5, but it can already be used in version 4. You have an example here: https://github.com/jesus-seijas-sp/nlpjs-examples/tree/master/04.bot
Explanation of the example:
There is a script, `script.dlg`, that is the orchestrator of the bot; this is its content:
```
dialog main
  nlp

dialog hellodialog
  run greet
  run help
  run bye

dialog greet
  run ask name
  call uppers user_name
  say Hello {{ user_name }}

dialog help
  say This is the help
  say This is the second help

dialog bye
  say Bye user

dialog ask name
  say What's your name?
  ask user name
```
You can see the different commands used here: `dialog`, `nlp`, `run`, `say`, `call` and `ask`.
You'll see this in the corpus:
```json
{
  "intent": "greetings.hello",
  "utterances": [
    "hello",
    "hi",
    "howdy"
  ],
  "answers": [
    "/hellodialog"
  ]
},
```
The answer `/hellodialog` means that instead of saying something to the user, it will trigger the dialog named "hellodialog".
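In the same way, any dialog defined in script.dlg can be targeted from the corpus. As a purely illustrative entry (not part of the original example), an intent could point at the "bye" dialog:

```json
{
  "intent": "greetings.bye",
  "utterances": [
    "goodbye",
    "see you later"
  ],
  "answers": [
    "/bye"
  ]
}
```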
The index.js is like this:
```javascript
const { dockStart } = require('@nlpjs/basic');

(async () => {
  // Boot the dock: this loads conf.json and the configured plugins,
  // which in turn load the corpus and the script.dlg file.
  const dock = await dockStart();
  // Retrieve the bot from the IoC container.
  const bot = dock.get('bot');
  // Register the "uppers" action invoked from the script ("call uppers user_name").
  // The variable name arrives in params[0]; the action uppercases that context variable.
  bot.registerAction('uppers', (session, context, params) => {
    const variableName = params && params[0] ? params[0] : undefined;
    if (variableName) {
      context[variableName] = (context[variableName] || '').toUpperCase();
    }
  });
})();
```
We retrieve the bot from the IoC container and register an action called `uppers` that uppercases the context variable whose name is passed in the parameters (here `user_name`, from the `call uppers user_name` line in the script).
If you execute this code and go to http://localhost:3000, you will be able to see the bot in action: you have the questions and answers, but the intent "hello" triggers this orchestrated dialog.
Hi Jesus,
Thank you for the super fast reply, and all the best to you in Barcelona. I had friends who studied there on Erasmus and loved the city and the people.
Your work is brilliant and I will certainly try the example to see how it works. I noticed the code among the packages and it's great to get more clarity.
Reading through it and the code, it is not immediately obvious to me how it works with a regular corpus: whether the bot dialog is triggered only for defined cases, or whether the whole corpus has to be mapped within the bot. But I will probably find out by playing with the code.
Also, is it possible to use it with a web build, or only as an extension of the basic package?
Thank you Martin
Hello. During this week I will publish advances on this part, but to answer your question: the bot class gets the fs from the container, so it works on the web if you want. The current work I'm doing is integration with the NER, using a custom NER to be able to do things like "askDate" or "askEmail", and also a webbot package that is based on @nlpjs/basic (so it will not work in the frontend) but automates things by loading everything from a folder structure.
Hi Martin and Jesus, Thank you both for the super work you have been doing with NLP.JS! Martin - I have been playing around with your great nlpjs-web and am keen to incorporate a script.dlg in it, along with the corresponding corpus.dlg and conf.json files. How can I do that? Ian
Hi @archegyral,
In the example provided by Jesus, you can see how the script location is specified in conf.json. The corpus is imported from the script.dlg file itself.
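In case it helps, here is a rough sketch of the equivalent configuration passed directly to dockStart() instead of through a conf.json file. The plugin names and setting keys below are assumptions based on the 04.bot example, so please verify them against the repository:

```javascript
const { dockStart } = require('@nlpjs/basic');

(async () => {
  const dock = await dockStart({
    settings: {
      // Assumed setting keys; check the 04.bot example for the exact names.
      bot: { scripts: ['./script.dlg'] },
      'api-server': { port: 3000, serveBot: true },
    },
    // Assumed plugin names; check the example's conf.json "use" list.
    use: ['Basic', 'LangEn', 'ExpressApiServer', 'DirectlineConnector', 'Bot'],
  });
  const bot = dock.get('bot');
  // ...register actions here, as in the index.js shown above
})();
```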
Thanks @jordi-carbonell, I have done that and have been running the .dlg files using the Directline Connector in the conventional manner. What I would like to do is incorporate that into the work done by @MeetMartin, described in his great article at https://betterprogramming.pub/natural-language-processing-in-the-browser-8ca5fdf2488b, so that I can run nlp.js in the browser in the way Martin has demonstrated while also making use of .dlg files.
Hi @archegyral,
As a hint, if you check bot.js (loadScript) and dialog-parse.js, you'll see that the fs plugin is the one used to load the script, so registering an fs plugin that is able to return the script content when required should get you what you want.
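Something along these lines might work (a minimal sketch, assuming the container exposes register() and that loadScript only needs an object with an async readFile(); the script content here is just a placeholder):

```javascript
// Serve the .dlg content from memory instead of the real filesystem.
// Assumption: the bot's loadScript only calls readFile() on the 'fs' plugin.
const scripts = {
  './script.dlg': 'dialog main\n  nlp\n', // bundle your real script content here
};

const memoryFs = {
  async readFile(fileName) {
    return scripts[fileName];
  },
};

// 'container' is the one obtained from containerBootstrap() in the browser setup.
container.register('fs', memoryFs, true);
```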
Thanks for that hint @aigloss, that is very helpful, although I'm not quite sure how to do that. I will try it out, and any further hints you might have would be most welcome.
Thank you again for the great work with nlp.js; it is huge!
Is there a way of using follow-up intents (a conversation tree) with the library?
Follow-up intents have meaning within the context of their previous intent. For example: User: "Does JavaScript have lambda functions?" Bot: "Yes, it has arrow functions since ES6. Would you like to know more?" User: "Yes" Bot: "Let me tell you all about it..."
The 'yes' answer belongs to the follow-up intent, and 'yes' would have different meanings following different intents.
I have been looking through the source code and it does not seem that there is support for it. If that is the case, I would be happy to give a hand and collaborate on the feature if the core developer(s) would like some help.
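In the meantime, one way to approximate follow-up intents in application code rather than inside the library is to remember the last resolved intent and route short confirmations based on it. A minimal sketch (the intent names and the followUpAnswers table are illustrative assumptions, not part of nlp.js):

```javascript
// Approximate follow-up intents outside the library: track the previous intent
// and answer confirmations such as "yes" from a follow-up table.
const followUpAnswers = {
  'js.lambda': 'Arrow functions were added in ES6, for example (a, b) => a + b.',
};

let lastIntent = null;

async function answer(nlp, text) {
  // Short confirmations are resolved against the previously matched intent.
  if (/^(yes|yeah|sure)\b/i.test(text) && followUpAnswers[lastIntent]) {
    const reply = followUpAnswers[lastIntent];
    lastIntent = null;
    return reply;
  }
  const result = await nlp.process('en', text);
  lastIntent = result.intent;
  return result.answer;
}
```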