Closed — Andarkness1 closed this issue 4 months ago
error information is : AttributeError: 'GraphLlamaConfig' object has no attribute 'pretrain_graph_model_path'
I met the same issue before, and I deleted that line. Then it reported:
AttributeError: 'str' object has no attribute 'requires_grad_'
Then I changed the pretrain_gnn parameter to clip_gt_arxiv in the stage_1 scripts, and then your error was reported.
Then I added the following code:
model.config.pretrain_graph_model_path = model_args.graph_tower
Then this error appeared:
AttributeError: 'GraphLlamaConfig' object has no attribute 'graph_hidden_size'
I was not able to find a further solution, so I assigned the graph tower's output feature dimension to this missing attribute:
self.config.graph_hidden_size = self.graph_tower.W_P.out_features
if not hasattr(self, 'graph_projector'):
    self.graph_projector = nn.Linear(self.config.graph_hidden_size, self.config.hidden_size)
And then the error was solved.
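Putting the workaround together, a minimal runnable sketch looks like the following. The DummyConfig and DummyModel classes, the hidden size of 4096, and the 128/768 dimensions are all placeholder assumptions for illustration; the real classes are GraphLlamaConfig and the model in graphgpt/model/graphllama.py.

```python
import torch.nn as nn

class DummyConfig:
    # Stand-in for GraphLlamaConfig; 4096 is an assumed LLaMA hidden size
    hidden_size = 4096

class DummyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.config = DummyConfig()
        # Stand-in for the graph tower; W_P is its output projection layer
        # (128 -> 768 are placeholder dimensions)
        self.graph_tower = nn.Module()
        self.graph_tower.W_P = nn.Linear(128, 768)

    def patch_missing_attrs(self):
        # The workaround from this thread: derive the missing
        # graph_hidden_size from the graph tower's output dimension,
        # then build the projector into the LLM's hidden space
        self.config.graph_hidden_size = self.graph_tower.W_P.out_features
        if not hasattr(self, 'graph_projector'):
            self.graph_projector = nn.Linear(self.config.graph_hidden_size,
                                             self.config.hidden_size)

model = DummyModel()
model.patch_missing_attrs()
print(model.graph_projector.in_features, model.graph_projector.out_features)
```

This only patches the config at runtime; the cleaner fix would be for the config class itself to define these attributes.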
Thanks. You are right.
Thanks for your answer. I encountered another problem caused by flash_attn. What version of flash_attn are you using? Is it the same as in requirements.txt?
The setting "Pretra_gnn=./clip_gt_arxiv" is not included in your project. When I was running the code, I found that this error seems to be related to clip_gt_arxiv_pub.pkl, and I also found that there is a corresponding loading method in the graphgpt-main/graphgpt/model/graphllama.py file. Do I need to call this code to reproduce the paper? And how can I run it?
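For what it's worth, a .pkl checkpoint like clip_gt_arxiv_pub.pkl is typically just a pickled object. The sketch below is a hypothetical illustration of that generic round-trip with a fake checkpoint dict; the actual contents and loading logic are defined by graphllama.py, which I have not verified here.

```python
import os
import pickle
import tempfile

# Fake checkpoint-like dict; the keys and values are invented for this
# example and do NOT reflect the real clip_gt_arxiv_pub.pkl layout
fake_checkpoint = {"gnn_type": "graph_transformer", "in_dim": 128, "out_dim": 768}

# Write it out the way such a file would be produced
with tempfile.NamedTemporaryFile(suffix=".pkl", delete=False) as f:
    pickle.dump(fake_checkpoint, f)
    ckpt_path = f.name

# Read it back, which is the pattern a loading method would follow
with open(ckpt_path, "rb") as f:
    loaded = pickle.load(f)
os.remove(ckpt_path)

print(loaded["gnn_type"], loaded["out_dim"])
```

In practice you would point the stage-1 script at the directory containing the downloaded checkpoint rather than loading it by hand.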