[Closed] tyfeld closed this issue 10 months ago
Is there a script in the evaluation module for calculating the accuracy and F1 score reported in the paper? After running the evaluation script, we can only generate the output JSON file; there seems to be no step that computes accuracy.
Thanks for your interest! An example is in https://github.com/HKUDS/GraphGPT/blob/main/scripts/eval_script/cal_metric_arxiv.py
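In case it helps, below is a rough sketch of the kind of computation such a metric script performs, assuming the evaluation output JSON is a list of records containing a ground-truth label and a predicted label. The file name and the field names ("label", "pred") are placeholders, not the repo's actual keys; adjust them to match the real output format.

```python
# Minimal sketch (not the repo's official script): compute accuracy and
# macro-F1 from an evaluation output JSON.
import json
from sklearn.metrics import accuracy_score, f1_score

with open("arxiv_test_res.json") as f:  # hypothetical output file name
    records = json.load(f)

# Assumed format: each record holds the ground-truth and predicted labels as strings.
y_true = [r["label"] for r in records]
y_pred = [r["pred"] for r in records]

print("Accuracy:", accuracy_score(y_true, y_pred))
print("Macro-F1:", f1_score(y_true, y_pred, average="macro"))
```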
Thank you for your response. However, I can't find the file labelidx2arxivcategeory.csv referenced at line 26 of that script; it doesn't appear in any of the Hugging Face links.