Hello, could you please share the environment (package versions) you used to run the code? I'm not sure whether it's a version conflict on my side, but I keep getting the following error:
traceback.format_exc(): Traceback (most recent call last):
  File "run.py", line 448, in fit
    train_loss = self.run_epoch(epoch, val_mrr)
  File "run.py", line 412, in run_epoch
    pred = self.model.forward(sub, rel, neg_ent)
  File "/home/lab401/app2/hua/RAGAT-main/model/models.py", line 200, in forward
    sub_emb, rel_emb, all_ent = self.forward_base(sub, rel, self.inp_drop, self.hidden_drop_gcn)
  File "/home/lab401/app2/hua/RAGAT-main/model/models.py", line 50, in forward_base
    ent_embed1, rel_embed1 = self.conv1(x=self.init_embed, rel_embed=init_rel)
  File "/home/lab401/anaconda3/envs/ra22/lib/python3.8/site-packages/torch/nn/modules/module.py", line 550, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/lab401/app2/hua/RAGAT-main/model/ragat_conv.py", line 88, in forward
    in_res1 = self.propagate('add', self.in_index, x=x, edge_type=self.in_type, rel_embed=rel_embed,
  File "/home/lab401/app2/hua/RAGAT-main/model/message_passing.py", line 114, in propagate
    out = self.message(message_args)
  File "/home/lab401/app2/hua/RAGAT-main/model/ragat_conv.py", line 191, in message
    out = torch.mm(xj_rel, weight)
RuntimeError: CUDA error: CUBLAS_STATUS_EXECUTION_FAILED when calling `cublasSgemm( handle, opa, opb, m, n, k, &alpha, a, lda, b, ldb, &beta, c, ldc)`
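
In case it helps to narrow this down, below is a minimal sanity check (my own sketch, not code from the RAGAT repo) that prints the relevant library versions and runs a bare torch.mm on the GPU, since that is the call that fails in ragat_conv.py. Rerunning the training with CUDA_LAUNCH_BLOCKING=1 should also point at the actual failing kernel in case the matmul is not the real culprit.

import torch

# Environment info that might reveal a version conflict
print("torch:", torch.__version__)
print("CUDA runtime:", torch.version.cuda)
print("cuDNN:", torch.backends.cudnn.version())
print("GPU:", torch.cuda.get_device_name(0))

# Bare cuBLAS sgemm via torch.mm; the shapes are arbitrary, just to see
# whether a plain GPU matmul already fails outside the model
a = torch.randn(1024, 200, device="cuda")
b = torch.randn(200, 200, device="cuda")
print(torch.mm(a, b).shape)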