shmsw25 / FActScore

A package to evaluate the factuality of long-form generation. Original implementation of our EMNLP 2023 paper "FActScore: Fine-grained Atomic Evaluation of Factual Precision in Long Form Text Generation"
https://arxiv.org/abs/2305.14251
MIT License

Having issues using the LLAMA-7B model #25

Open mungg opened 1 year ago

mungg commented 1 year ago

I tried to use LLAMA-7B for FActScore, but ran into a problem when running download_data. After the checkpoint shards finish loading, RecursionError: maximum recursion depth exceeded while calling a Python object is raised while running recover_instruct_llama(args.llama_7B_HF_path, os.path.join(args.model_dir, "inst-llama-7B"))
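One way to check whether the recursion is merely deep or genuinely unbounded is to raise Python's recursion limit before the failing call (a minimal sketch; the factscore.download_data import path and the placeholder paths are assumptions based on the call site above):

```python
import sys

# If the call chain is deep but finite, raising the limit lets it complete;
# if the recursion is genuinely unbounded, the error recurs at any limit.
sys.setrecursionlimit(10_000)  # CPython's default limit is 1000

# Assumption: recover_instruct_llama is importable from factscore.download_data,
# matching the call site mentioned above.
from factscore.download_data import recover_instruct_llama

recover_instruct_llama("path/to/llama-7B-HF", "path/to/inst-llama-7B")
```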

martiansideofthemoon commented 1 year ago

Hi @mungg, thanks for reaching out and reporting this issue. Could you share the entire stack trace with us, specifically which line of recover_instruct_llama raised this error, and the command you used to run it?

Also, have you followed all the instructions here to obtain LLAMA-7B? https://huggingface.co/docs/transformers/main/model_doc/llama
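As a quick sanity check, it can help to confirm that the converted Hugging Face checkpoint loads cleanly on its own before running download_data (a minimal sketch; the checkpoint path is a placeholder):

```python
from transformers import LlamaForCausalLM, LlamaTokenizer

# Assumption: "path/to/llama-7B-HF" points at the directory produced by the
# conversion script in the documentation linked above.
tokenizer = LlamaTokenizer.from_pretrained("path/to/llama-7B-HF")
model = LlamaForCausalLM.from_pretrained("path/to/llama-7B-HF")
print(model.config)  # loading without errors suggests the conversion is fine
```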

I've also DM'ed you on Slack the location of inst-llama-7B on the UMass servers.

Loose-Gu commented 9 months ago

same problem