Du-danger opened this issue 1 year ago
The `custom.qabot` error below occurs because the custom procedure has not been installed, i.e. the second statement of section 7.1.1 in https://github.com/ongdb-contrib/graph-qabot-demo (the `CALL apoc.custom.asProcedure` statement) was never run against the graph database.
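For orientation, registering a custom procedure with APOC looks roughly like the sketch below. This is a minimal illustration with a placeholder name (`custom.echo`) and body, not the real registration; the actual statement for `custom.qabot` is the one given in section 7.1.1 of the repo and wraps the full 13-step inference query.

```cypher
// Minimal sketch only: exposes a procedure callable as custom.echo.
// Signature: apoc.custom.asProcedure(name, statement, mode, outputs, inputs)
CALL apoc.custom.asProcedure(
  'echo',                       // registered under the custom.* namespace
  'RETURN $query AS result',    // the Cypher body being wrapped
  'read',                       // procedure mode
  [['result', 'string']],       // output signature
  [['query', 'string']]         // input signature
);

// After registration the procedure can be called like any built-in one:
CALL custom.echo('hello') YIELD result RETURN result;
```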
```cypher
CALL custom.qabot('火力发电行业博士学历的男性高管有多少位?') YIELD result RETURN result
```

fails with:

```
Neo.ClientError.Procedure.ProcedureNotFound:
There is no procedure with the name custom.qabot registered for this database instance. Please ensure you've spelled the procedure name correctly and that the procedure is properly deployed.
```
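A quick way to confirm whether the registration took effect is to list the procedures the instance actually knows about; `dbms.procedures()` is the standard listing call in ONgDB/Neo4j 3.x:

```cypher
// List everything registered under the custom.* namespace.
// An empty result means apoc.custom.asProcedure was never run here,
// or the dynamic registration has not been picked up yet
// (re-running the registration or restarting the instance should help).
CALL dbms.procedures() YIELD name
WHERE name STARTS WITH 'custom.'
RETURN name;
```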
The error below occurs because the NLP module has not been installed, i.e. the nlp part of section 6.1; the concrete dependency is the code under the nlp/code/python directory.
This statement also fails:

```
Neo.ClientError.Procedure.ProcedureCallFailed
Failed to invoke function olab.nlp.pagenum.parse: Caused by: java.io.IOException: CreateProcess error=2, 系统找不到指定的文件。
```

(The Windows message translates to "The system cannot find the file specified.")
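Since `CreateProcess error=2` is Windows for "executable not found", the stack trace indicates the olab function fails while launching an external process. It can help to call the failing function on its own, outside `custom.qabot` (the input string here is just an arbitrary example containing a page expression):

```cypher
// If this minimal call alone reproduces the IOException, the olab NLP
// function cannot start its external dependency (the Python code under
// nlp/code/python), independent of the qabot procedure itself.
RETURN olab.nlp.pagenum.parse('第一页') AS page;
```

If the isolated call fails too, check that the Python interpreter and the nlp/code/python scripts are reachable from the environment the ONgDB service process runs in, not only from your interactive shell.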
With the above in place, please try again :)
```cypher
CALL custom.qabot('火力发电行业博士学历的男性高管有多少位?') YIELD result RETURN result
```

fails with:

```
Neo.ClientError.Procedure.ProcedureNotFound: There is no procedure with the name custom.qabot registered for this database instance. Please ensure you've spelled the procedure name correctly and that the procedure is properly deployed.
```

The expanded query behind the procedure:

```cypher
// 1. The search statement
WITH LOWER('火力发电行业博士学历的男性高管有多少位?') AS query
// 2. Custom configuration: graph data model / ontology weights / entity matching rules / intended intents
WITH query,
     custom.inference.search.qabot() AS graphDataSchema,
     custom.inference.weight.qabot() AS weight,
     custom.inference.match.qabot() AS nodeHitsRules,
     // the intended-intent definition supports setting a sort parameter
     custom.inference.intended.qabot() AS intendedIntent,
     custom.inference.operators.parse(query) AS oper
// 3. Custom statement parsing: parse time words / parse page numbers
WITH oper, oper.query AS query, oper.operator AS operator, graphDataSchema, weight, nodeHitsRules, intendedIntent,
     olab.nlp.timeparser(oper.query) AS time, olab.nlp.pagenum.parse(oper.query) AS page
// 4. Filter the time words out of the query
WITH oper, operator, graphDataSchema, weight, nodeHitsRules, intendedIntent, time, page,
     olab.replace(query, REDUCE(l=[], mp IN time.list | l+{raw:mp.text, rep:' '})) AS query
// 5. Segment the query once the time words are removed
WITH oper, operator, graphDataSchema, weight, nodeHitsRules, intendedIntent, time, page,
     olab.hanlp.standard.segment(query) AS words
// 6. Keep only nouns from the segmentation result, excluding pure numbers
WITH oper, operator, graphDataSchema, weight, nodeHitsRules, intendedIntent, time, page,
     EXTRACT(m IN FILTER(mp IN words WHERE (mp.nature STARTS WITH 'n' AND olab.string.matchCnEn(mp.word)<>'') OR mp.nature='uw') | m.word) AS words
// 7. Entity recognition
WITH oper, operator, graphDataSchema, weight, nodeHitsRules, intendedIntent, time, page, words,
     olab.entity.recognition(graphDataSchema, nodeHitsRules, NULL, 'EXACT', words, {isMergeLabelHit:true, labelMergeDis:0.5}) AS entityRecognitionHits
// 8. Build the weighted search queue
WITH oper, operator, graphDataSchema, weight, intendedIntent, time, page, words, entityRecognitionHits
CALL olab.entity.ptmd.queue(graphDataSchema, entityRecognitionHits, weight) YIELD value
WITH oper, operator, graphDataSchema, intendedIntent, time, page, words, value AS entityRecognitionHit LIMIT 1
// 9. Merge the custom parsing results into entityRecognitionHit
WITH oper, operator, graphDataSchema, intendedIntent, words,
     custom.inference.parseadd.qabot(entityRecognitionHit, time, page).entityRecognitionHit AS entityRecognitionHit
// 10. Intent recognition
WITH oper, operator, graphDataSchema, intendedIntent, words, entityRecognitionHit,
     apoc.convert.toJson(olab.intent.schema.parse(graphDataSchema, oper.query, words, intendedIntent)) AS intentSchema
WHERE SIZE(apoc.convert.fromJsonList(intendedIntent)) > SIZE(apoc.convert.fromJsonMap(intentSchema).graph.nodes)
// 11. Graph-context semantic parsing
WITH operator, graphDataSchema, intentSchema, intendedIntent,
     olab.semantic.schema(graphDataSchema, intentSchema, apoc.convert.toJson(entityRecognitionHit)) AS semantic_schema
// 12. Query conversion [no skip parameter is set; each query extracts 100 results]
WITH olab.semantic.cypher(apoc.convert.toJson(semantic_schema), intentSchema, -1, 10, {}, operator) AS cypher
WITH REPLACE(cypher, 'RETURN n', 'RETURN DISTINCT n') AS cypher
// 13. Execute the query [value is a MAP whose key count is at most the number of parsed intent categories]
CALL apoc.cypher.run(cypher, {}) YIELD value
WITH value SKIP 0 LIMIT 10
WITH olab.map.keys(value) AS keys, value
UNWIND keys AS key
WITH apoc.map.get(value, key) AS n
CALL apoc.case([apoc.coll.contains(['NODE'], apoc.meta.cypher.type(n)),
                'WITH $n AS n,LABELS($n) AS lbs WITH lbs[0] AS label,n.value AS value RETURN label+$sml+UPPER(TOSTRING(value)) AS result'],
               'WITH $n AS n RETURN TOSTRING(n) AS result', {n:n, sml:':'}) YIELD value
RETURN value.result AS result
```

This statement also fails:

```
Neo.ClientError.Procedure.ProcedureCallFailed
Failed to invoke function olab.nlp.pagenum.parse: Caused by: java.io.IOException: CreateProcess error=2, 系统找不到指定的文件。
```

(The Windows message translates to "The system cannot find the file specified.") I installed the relevant components following the tutorial:
1.2 Install the APOC and OLAB components
Download APOC
Download OLAB
1.3 Modify the configuration and start ONgDB
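For completeness, plugin procedures and functions usually also have to be allowed in the server configuration before the restart in step 1.3. Assuming ONgDB keeps the Neo4j 3.x-style setting names (a reasonable assumption since ONgDB is a fork of that line, but worth verifying against your conf file), the relevant fragment would look like:

```
# conf/neo4j.conf — setting names assume Neo4j 3.x conventions
dbms.security.procedures.unrestricted=apoc.*,olab.*
dbms.security.procedures.whitelist=apoc.*,olab.*
```

If these lines are missing, the jars can be present in the plugins directory yet their procedures still refuse to load.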
I also tried updating the olab component, with no effect.