ucinlp / autoprompt

AutoPrompt: Automatic Prompt Construction for Masked Language Models.
Apache License 2.0

RuntimeError: CUDA error #61

Open · enhaohuang opened this issue 3 months ago

enhaohuang commented 3 months ago

Dear author, I am sure that all the versions of my packages are correct: I used CUDA 10.1 to match Torch 1.4. However, I get the following error during evaluation:

Traceback (most recent call last):
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/home/miracle/llm-prompt/autoprompt/autoprompt/create_trigger.py", line 531, in <module>
    run_model(args)
  File "/home/miracle/llm-prompt/autoprompt/autoprompt/create_trigger.py", line 297, in run_model
    predict_logits = predictor(model_inputs, trigger_ids)
  File "/home/miracle/llm-prompt/autoprompt/autoprompt/create_trigger.py", line 52, in __call__
    logits, *_ = self._model(**model_inputs)
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/site-packages/torch/nn/modules/module.py", line 532, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/site-packages/transformers/modeling_roberta.py", line 232, in forward
    inputs_embeds=inputs_embeds,
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/site-packages/torch/nn/modules/module.py", line 532, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/site-packages/transformers/modeling_bert.py", line 736, in forward
    encoder_attention_mask=encoder_extended_attention_mask,
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/site-packages/torch/nn/modules/module.py", line 532, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/site-packages/transformers/modeling_bert.py", line 407, in forward
    hidden_states, attention_mask, head_mask[i], encoder_hidden_states, encoder_attention_mask
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/site-packages/torch/nn/modules/module.py", line 532, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/site-packages/transformers/modeling_bert.py", line 368, in forward
    self_attention_outputs = self.attention(hidden_states, attention_mask, head_mask)
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/site-packages/torch/nn/modules/module.py", line 532, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/site-packages/transformers/modeling_bert.py", line 314, in forward
    hidden_states, attention_mask, head_mask, encoder_hidden_states, encoder_attention_mask
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/site-packages/torch/nn/modules/module.py", line 532, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/site-packages/transformers/modeling_bert.py", line 216, in forward
    mixed_query_layer = self.query(hidden_states)
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/site-packages/torch/nn/modules/module.py", line 532, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/site-packages/torch/nn/modules/linear.py", line 87, in forward
    return F.linear(input, self.weight, self.bias)
  File "/home/miracle/anaconda3/envs/autoprompt/lib/python3.7/site-packages/torch/nn/functional.py", line 1372, in linear
    output = input.matmul(weight.t())
RuntimeError: CUDA error: CUBLAS_STATUS_EXECUTION_FAILED when calling cublasSgemm( handle, opa, opb, m, n, k, &alpha, a, lda, b, ldb, &beta, c, ldc)

My GPU is an NVIDIA GeForce RTX 4060 Ti (16 GB), so I am sure the problem is not caused by insufficient GPU memory.
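
For reference, a minimal standalone script (independent of the AutoPrompt codebase; matrix sizes are arbitrary) can print the relevant environment details and exercise the same cuBLAS matmul path that fails in the traceback above, which helps separate a PyTorch/CUDA/driver mismatch from a bug in AutoPrompt itself:

# Minimal standalone check, not part of AutoPrompt: prints the environment
# and runs a single float32 matmul on the GPU, which goes through the same
# cuBLAS GEMM call that fails in the traceback above. Sizes are arbitrary.
import torch

print("torch:", torch.__version__)
print("torch built with CUDA:", torch.version.cuda)
print("cuda available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
    print("compute capability:", torch.cuda.get_device_capability(0))
    a = torch.randn(768, 768, device="cuda")
    b = torch.randn(768, 768, device="cuda")
    c = a.matmul(b)           # same GEMM path that F.linear hits
    torch.cuda.synchronize()  # surface any asynchronous CUDA error here
    print("matmul ok:", tuple(c.shape))

If this small script fails with the same CUBLAS_STATUS_EXECUTION_FAILED error, the problem lies in the PyTorch/CUDA/GPU combination rather than in AutoPrompt.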

rloganiv commented 2 months ago

Hi @enhaohuang,

Can you try running it using the Docker container contributed by @ElefHead?
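
For reference, and assuming the Dockerfile sits at the root of the repository and the NVIDIA Container Toolkit is installed on the host, building and launching the container with GPU access might look roughly like this (the image tag and mount path are placeholders):

# Build the image from the repository root (tag name is arbitrary).
docker build -t autoprompt .
# Run it with GPU access, mounting the current checkout into the container.
docker run --gpus all -it -v "$(pwd)":/workspace autoprompt /bin/bash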

Best,

@rloganiv