ray-project / ray-llm

RayLLM - LLMs on Ray
https://aviary.anyscale.com
Apache License 2.0

Adding confidence level to the output of requests #20

Open kouroshHakha opened 1 year ago

kouroshHakha commented 1 year ago

One useful feature would be the ability to extract a confidence level and the top-n token probabilities on a per-token basis as generation happens. This could open up a lot of applications; one interesting use case is running a Monte Carlo search over candidate answers to boost the quality of model output. A rough sketch of what this might look like is below.
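As a rough illustration (not an existing RayLLM API), here is a minimal client-side sketch assuming RayLLM keeps its OpenAI-compatible request schema and adds an OpenAI-style `logprobs` field to completions. The endpoint URL, model id, and the `logprobs` request/response fields are all assumptions for the sake of the example.

```python
# Hypothetical sketch: request per-token log-probabilities and top-n
# alternatives from an OpenAI-compatible RayLLM endpoint. The `logprobs`
# field and response layout follow the OpenAI completions convention and
# are NOT (yet) part of RayLLM; this is what the feature request asks for.
import requests

resp = requests.post(
    "http://localhost:8000/v1/completions",        # assumed local RayLLM endpoint
    json={
        "model": "meta-llama/Llama-2-7b-chat-hf",  # any deployed model id
        "prompt": "The capital of France is",
        "max_tokens": 5,
        "logprobs": 5,  # hypothetical: return top-5 alternatives per token
    },
    timeout=60,
)
resp.raise_for_status()
choice = resp.json()["choices"][0]

# With such a field, every generated token would carry its own log-probability
# plus the top-n alternatives, which is what enables confidence estimates and
# Monte Carlo search over candidate answers.
for token, logprob, top in zip(
    choice["logprobs"]["tokens"],
    choice["logprobs"]["token_logprobs"],
    choice["logprobs"]["top_logprobs"],
):
    print(token, logprob, top)
```

Given per-token log-probabilities like these, a caller could, for example, sample several completions and rank them by their summed log-probability, which is the kind of Monte Carlo search over answers mentioned above.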
