Hi, thanks for your wonderful work. While fine-tuning on the NLU tasks, I ran into the following problem when running cola.sh:
Traceback (most recent call last):
File "/mnt/WavCaps/lcq/HRA-master/nlu/DeBERTa/apps/run.py", line 497, in
main(args)
File "/mnt/WavCaps/lcq/HRA-master/nlu/DeBERTa/apps/run.py", line 302, in main
model = create_model(args, len(label_list), model_class_fn)
File "/mnt/WavCaps/lcq/HRA-master/nlu/DeBERTa/apps/run.py", line 46, in create_model
model = model_class_fn(init_model, args.model_config, num_labels=num_labels, \
File "/mnt/WavCaps/lcq/HRA-master/nlu/DeBERTa/deberta/nnmodule.py", line 118, in load_model model = cls(config, *inputs, kwargs) File "/mnt/WavCaps/lcq/HRA-master/nlu/DeBERTa/apps/models/sequence_classification.py", line 28, in init self.deberta = DeBERTa(config, pre_trained=pre_trained) File "/mnt/WavCaps/lcq/HRA-master/nlu/DeBERTa/deberta/deberta.py", line 53, in init self.embeddings = BertEmbeddings(config) File "/mnt/WavCaps/lcq/HRA-master/nlu/DeBERTa/deberta/bert.py", line 247, in init self.word_embeddings = nn.Embedding(config.vocab_size, self.embedding_size, padding_idx = padding_idx) File "/mnt/WavCaps/lcq/envs/nlu/lib/python3.10/site-packages/torch/nn/modules/sparse.py", line 143, in init self.weight = Parameter(torch.empty(( num_embeddings, embedding_dim), factory_kwargs),
RuntimeError: Trying to create tensor with negative dimension -1: [-1, 768]
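From the last two frames, it looks like config.vocab_size is still -1 when the embedding layer is built (768 is the hidden size from the traceback). The tiny snippet below reproduces exactly the same error; the actual cause on my side is only a guess, e.g. the pretrained DeBERTa checkpoint / model_config.json / vocabulary not being found, so the placeholder vocab_size never gets overwritten:

```python
import torch.nn as nn

# Assumption: config.vocab_size was left at a -1 placeholder because the
# pretrained model config / vocab was never loaded.
vocab_size = -1       # should be the real vocabulary size
embedding_size = 768  # hidden size seen in the traceback

try:
    emb = nn.Embedding(vocab_size, embedding_size)
except RuntimeError as e:
    print(e)  # Trying to create tensor with negative dimension -1: [-1, 768]
```

Could this mean the --init_model / model config paths used by cola.sh are not pointing to a valid downloaded checkpoint? Any hint on what I might be missing would be appreciated.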