HKUDS / GraphGPT

[SIGIR'2024] "GraphGPT: Graph Instruction Tuning for Large Language Models"
https://arxiv.org/abs/2310.13023
Apache License 2.0

A problem occurred when I tried to run stage1.sh #50

Closed Andarkness1 closed 4 months ago

Andarkness1 commented 5 months ago

The part of the code "Pretra_gnn=./clip_gt_arxiv" refers to a path that is not included in your project. When I ran the code, the error seemed to be related to clip_gt_arxiv_pub.pkl, and I found a corresponding loading method in the graphgpt-main/graphgpt/model/graphllama.py file. Do I need to call this code to reproduce the paper, and how can I run it?

Andarkness1 commented 5 months ago

error information is : AttributeError: 'GraphLlamaConfig' object has no attribute 'pretrain_graph_model_path'
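For anyone hitting the same traceback: this failure mode is simply code reading a field that was never set on the config object. A minimal reproduction with a stand-in class (DummyConfig is hypothetical, not the real GraphLlamaConfig):

```python
# Reading an attribute that was never assigned on a config-like object
# raises AttributeError, which is exactly the error reported above.

class DummyConfig:
    """Stand-in for a transformers-style config with no graph fields set."""
    pass

cfg = DummyConfig()

try:
    cfg.pretrain_graph_model_path
except AttributeError as exc:
    print(f"AttributeError: {exc}")

# Defensive alternative: fall back to a default when the field is absent.
print(getattr(cfg, "pretrain_graph_model_path", None))
```

The fix discussed below is to assign the attribute explicitly before it is read.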

hhy-huang commented 4 months ago

error information is : AttributeError: 'GraphLlamaConfig' object has no attribute 'pretrain_graph_model_path'

I met the same issue before, and I deleted that line. Then it reported:

AttributeError: 'str' object has no attribute 'requires_grad_'

Then I changed the pretrain_gnn parameter to clip_gt_arxiv in the stage_1 scripts, and then your error was reported.

Then I added the following code:

model.config.pretrain_graph_model_path = model_args.graph_tower

Then the error was:

AttributeError: 'GraphLlamaConfig' object has no attribute 'graph_hidden_size'

I was not able to find a further solution, so I assigned the graph tower's output feature dimension to this missing attribute:

self.config.graph_hidden_size = self.graph_tower.W_P.out_features
if not hasattr(self, 'graph_projector'):
    self.graph_projector = nn.Linear(self.config.graph_hidden_size, self.config.hidden_size)

And then the error was solved.
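Putting the three workarounds above together, the patched setup might look like the helper below. This is a sketch, not a verified patch against the repo: the names graph_tower, W_P, and model_args.graph_tower are taken from this thread and graphllama.py as described, and may differ in your checkout.

```python
import torch.nn as nn

# Combined sketch of the workarounds described in this thread.
def patch_graph_config(model, model_args):
    # 1. Record the pretrained GNN path on the config so later loading
    #    code that reads config.pretrain_graph_model_path can find it.
    model.config.pretrain_graph_model_path = model_args.graph_tower

    # 2. Derive the missing graph_hidden_size from the graph tower's
    #    output projection dimension.
    model.config.graph_hidden_size = model.graph_tower.W_P.out_features

    # 3. Create the graph-to-LLM projector if it does not exist yet.
    if not hasattr(model, "graph_projector"):
        model.graph_projector = nn.Linear(
            model.config.graph_hidden_size, model.config.hidden_size
        )
    return model
```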

Andarkness1 commented 4 months ago

Thanks. You are right.

Andarkness1 commented 4 months ago

Thanks for your answer, your fix solved the error for me as well. However, I have now run into another problem caused by flash_attn. What version of flash_attn are you using? Is it the same as the one in requirements.txt?
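For comparing environments, one quick way to print the locally installed flash-attn version using only the standard library ("flash-attn" is the PyPI distribution name; this prints a fallback if the package is absent):

```python
from importlib import metadata

# Report the installed flash-attn version, if any.
try:
    print(f"flash-attn {metadata.version('flash-attn')}")
except metadata.PackageNotFoundError:
    print("flash-attn is not installed")
```

The version reported here can then be checked against the pin in requirements.txt.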