Hi, @pmjgeraeds
In order to load the corpus automatically, you need to configure config.json, as pointed out. However, even if you define and configure config.json, you probably still need to actually train the model.
For this reason, I came up with the following flow to solve the problem.
This will automatically load the TSV data at startup.
Sample: https://github.com/okhiroyuki/node-red-contrib-nlp/blob/master/examples/qna_file.json
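For reference, a minimal conf.json along the lines of the nlp.js quickstart looks roughly like this (the corpus path and plugin list are only an illustration, adjust them to your setup):

```json
{
  "settings": {
    "nlp": {
      "corpora": ["./corpus-en.json"]
    }
  },
  "use": ["Basic"]
}
```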
Dear Hiroyuki San, I like your workaround, but in fact that is the part I already had working (loading corpus.json via the inject node into a training node).
If I want to try out most of the examples online, I need to create index.js, pipelines.md and conf.json files to use the more advanced features. However, all the examples forget to mention in which directory to store these files,
e.g. in /node-red-contrib-nlp/, in the Node-RED home directory, or under /node-nlp/,
or whether I should let Node-RED know where I stored them via the settings.json file. I am a newbie to node.js (I have only worked a lot with the Node-RED interface), so I never had to dive into the nodes behind it.
Do you know where to put them?
Thanks, Pascal
@pmjgeraeds node-red-contrib-nlp has been updated. It has been changed to accept data in TSV format instead of JSON format. Please check the samples and README.
Dear Hiroyuki San,
I have successfully implemented your node-red-contrib-nlp nodes in Node-RED. Currently I am trying to understand how to implement intents, entities, etc., and am blocked by something (I guess) very simple: I have the training of the corpus up and running and can connect Telegram to the NLP (as described in https://github.com/okhiroyuki/node-red-contrib-nlp/tree/master/examples).
However, in the example documentation of nlp.js they mention an index.js and a conf.json for implementing the “advanced” features of the NLP nodes (as described in https://github.com/axa-group/nlp.js/blob/master/docs/v4/quickstart.md).
I can make these features work by directly editing the /node-red-contrib-nlp/nodes/nlp.js file and could, for example, load the corpus automatically without using an inject node to feed the corpus JSON data as a message into the nlp_train node. However, I don’t think this is what was intended, so I created an index.js file to take over this task:
```javascript
const { dockStart } = require('@nlpjs/basic');

(async () => {
  const dock = await dockStart({ use: ['Basic'] });
  const nlp = dock.get('nlp');
  await nlp.addCorpus('/data/node_modules/node-red-contrib-nlp/corpus-en.json');
  await nlp.train();
  const response = await nlp.process('en', 'Who are you');
  console.log(response);
})();
```
However, I don’t have a clue where to store this file so that it is executed during a flow restart (or a restart of the Docker container).
Likewise for conf.json (a lot of variables could be preloaded there to simplify index.js): I have no clue where to store it to make it work.
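For reference, this is the kind of simplified index.js I have in mind, taken from the nlp.js quickstart and assuming the corpora are already listed in a conf.json sitting in the same working directory:

```javascript
const { dockStart } = require('@nlpjs/basic');

(async () => {
  // dockStart() without arguments picks up conf.json from the working directory
  const dock = await dockStart();
  const nlp = dock.get('nlp');
  await nlp.train();
  const response = await nlp.process('en', 'Who are you');
  console.log(response);
})();
```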
Could you please spare a little of your time to get me started?
Kind regards, Pascal