HKUDS / GraphGPT

[SIGIR'2024] "GraphGPT: Graph Instruction Tuning for Large Language Models"
https://arxiv.org/abs/2310.13023
Apache License 2.0

Cannot find the config.json file #14

Closed nulinuli closed 10 months ago

nulinuli commented 10 months ago

```
2023-11-09 16:41:33,050 INFO worker.py:1673 -- Started a local Ray instance.
(eval_model pid=11217) Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
(eval_model pid=11217) start loading
Traceback (most recent call last):
  File "./run_graphgpt.py", line 240, in <module>
    run_eval(args, args.num_gpus)
  File "./run_graphgpt.py", line 94, in run_eval
    ans_jsons.extend(ray.get(ans_handle))
  File "/home/fry/.conda/envs/graphgpt/lib/python3.8/site-packages/ray/_private/auto_init_hook.py", line 24, in auto_init_wrapper
    return fn(*args, **kwargs)
  File "/home/fry/.conda/envs/graphgpt/lib/python3.8/site-packages/ray/_private/client_mode_hook.py", line 103, in wrapper
    return func(*args, **kwargs)
  File "/home/fry/.conda/envs/graphgpt/lib/python3.8/site-packages/ray/_private/worker.py", line 2563, in get
    raise value.as_instanceof_cause()
ray.exceptions.RayTaskError(AssertionError): ray::eval_model() (pid=11217, ip=172.27.37.124)
  File "/home/fry/.conda/envs/graphgpt/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "./run_graphgpt.py", line 116, in eval_model
    model = GraphLlamaForCausalLM.from_pretrained(args.model_name, torch_dtype=torch.float16, use_cache=True, low_cpu_mem_usage=True).cuda()
  File "/home/fry/.conda/envs/graphgpt/lib/python3.8/site-packages/transformers/modeling_utils.py", line 3085, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File "/home/fry/桌面/GraphGPT-main/graphgpt/model/GraphLlama.py", line 284, in __init__
    self.model = GraphLlamaModel(config)
  File "/home/fry/桌面/GraphGPT-main/graphgpt/model/GraphLlama.py", line 104, in __init__
    clip_graph, args = load_model_pretrained(CLIP, config.pretrain_graph_model_path)
  File "/home/fry/桌面/GraphGPT-main/graphgpt/model/GraphLlama.py", line 55, in load_model_pretrained
    assert osp.exists(osp.join(pretrain_model_path, 'config.json')), 'config.json missing'
AssertionError: config.json missing
(eval_model pid=11217) Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
(eval_model pid=11217) finish loading
(eval_model pid=11217) start loading
```

I downloaded your checkpoints.

zhuochunli commented 3 months ago

Hi, did you solve this? I hit the same problem during evaluation.

Longmeix commented 3 months ago

I ran into this problem too. Has anyone solved it?

daixixiwang commented 1 week ago

The author made a small mistake: the name used in the script doesn't match the actual directory structure. Just change it by hand and it works.
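A small diagnostic sketch of that mismatch: if `config.json` is not under the configured path, list the directories next to it so a script-vs-disk naming difference stands out. The helper name and the directory names in the demo are illustrative, not from the repo:

```python
import os
import os.path as osp
import tempfile

def diagnose_pretrained_path(pretrain_model_path):
    # Hypothetical helper: report whether the path the script uses actually
    # holds a config.json; if not, show neighboring directory names so a
    # mismatch between the script and the checkpoint folder is easy to spot.
    if osp.exists(osp.join(pretrain_model_path, 'config.json')):
        return 'ok: config.json found'
    parent = osp.dirname(osp.abspath(pretrain_model_path))
    siblings = sorted(os.listdir(parent)) if osp.isdir(parent) else []
    return ('config.json missing under %s; directories next to it: %s'
            % (pretrain_model_path, siblings))

# Demo: the checkpoint exists on disk under one name, the "script" uses another.
with tempfile.TemporaryDirectory() as root:
    actual = osp.join(root, 'actual_checkpoint_dir')
    os.makedirs(actual)
    open(osp.join(actual, 'config.json'), 'w').close()
    print(diagnose_pretrained_path(osp.join(root, 'name_used_in_script')))
    print(diagnose_pretrained_path(actual))  # prints: ok: config.json found
```

Once the sibling listing shows the real folder name, either rename the folder or update the path in the eval script, per the comment above.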