Closed CAICCU closed 2 months ago
Hi @CAICCU, if you're running from source: temperature parameter support was added in #390 and #373, so a new build should fix this.
GLM does not support 0.0 or 1.0 for its `temperature` and `top_p`, which are the default values, so you need to change them.
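As a rough sketch, the override would go in the `llm` section of `settings.yaml`. The key names below follow GraphRAG's settings schema; the model name and `api_base` are placeholders for an xinference deployment, and the specific values are only examples of strictly-positive choices GLM accepts:

```yaml
llm:
  type: openai_chat                     # xinference exposes an OpenAI-compatible endpoint
  api_base: http://localhost:9997/v1    # placeholder: your xinference address
  model: glm-4                          # placeholder: your deployed model name
  temperature: 0.5                      # GLM rejects 0.0; use a strictly positive float
  top_p: 0.9                            # likewise, avoid 0.0 and 1.0
```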
This is my settings.yaml
I use the xinference API and run:

```
python -m graphrag.index --root ./ragtest
```

but the `create_base_extracted_entities` step fails in xinference with:

```
ValueError: `temperature` (=0.0) has to be a strictly positive float, otherwise your next token scores will be invalid. If you're looking for greedy decoding strategies, set `do_sample=False`.
```

How can I solve this?