Closed shuther closed 7 months ago
I followed the steps from: https://www.nltk.org/data.html
poetry run python
>>> import nltk
>>> nltk.download()
then I selected the "popular" collection; I'm not sure if that's what is needed?
Hi, the reason NLTK is triggered at this step is that your llm_client (llama3) does not return the string in the required format. So I believe that even if you solve this NLTK issue, it will fail again at other steps.
Basically, at each step we require the LLM to return a JSON-loadable string. However, we have observed that most current models do not reliably do this, so you need to ensure it by writing a post-process function in your model definition; please see the documentation here: https://github.com/Libr-AI/OpenFactVerification/blob/dev/docs/development_guide.md#new-llm-support
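As a rough illustration of what such a post-process function might look like (the function name and exact hook point are hypothetical; follow the development guide above for the real interface), one common approach is to strip markdown fences and surrounding chatter, then validate that the remainder is JSON-loadable:

```python
import json
import re

def postprocess_response(response: str) -> str:
    """Extract a JSON-loadable string from a raw LLM response.

    Hypothetical helper sketch: many local models wrap JSON in
    markdown fences or add conversational text around it.
    """
    # Strip a markdown code fence if the model wrapped the JSON in one.
    fenced = re.search(r"```(?:json)?\s*(.*?)```", response, re.DOTALL)
    candidate = fenced.group(1) if fenced else response
    # Fall back to the outermost {...} block if extra text surrounds it.
    start, end = candidate.find("{"), candidate.rfind("}")
    if start != -1 and end != -1:
        candidate = candidate[start:end + 1]
    # Raise early if the result is still not valid JSON.
    json.loads(candidate)
    return candidate.strip()

raw = 'Sure! Here is the result:\n```json\n{"claim": "supported"}\n```'
print(postprocess_response(raw))  # → {"claim": "supported"}
```

This is only a sketch; brittle responses (e.g. truncated JSON) would still need retries or stricter prompting.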
I tried the project with llama3 using:
poetry run python -m factcheck --modal string --input "MBZUAI is the first AI university in the world" --client local_openai --model llama3 --prompt factcheck/config/sample_prompt.yaml
but I ended up with a library that was not set up; I'm not sure if that is expected?