**Closed** — silveroxides closed this 1 month ago
Colab: error encountered when re-running training after removing the line that imports the non-existent module mentioned in issue #21
```
[<ipython-input-25-8519949f2c4c>] in <cell line: 1>()
----> 1 run_job(job_to_run)

23 frames

/usr/local/lib/python3.10/dist-packages/diffusers/models/attention_processor.py in __call__(self, attn, hidden_states, encoder_hidden_states, attention_mask, temb, scale)
   1257     # the output of sdp = (batch, num_heads, seq_len, head_dim)
   1258     # TODO: add support for attn.scale when we move to Torch 2.1
-> 1259     hidden_states = F.scaled_dot_product_attention(
   1260         query, key, value, attn_mask=attention_mask, dropout_p=0.0, is_causal=False
   1261     )

RuntimeError: cutlassF: no kernel found to launch!
```
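Not part of the original report, but for context: `cutlassF: no kernel found to launch!` typically means the memory-efficient (CUTLASS-based) `scaled_dot_product_attention` backend has no kernel for the current GPU/dtype combination. A possible workaround, sketched here with the Torch 2.0-era `torch.backends.cuda.sdp_kernel` context manager (tensor shapes are illustrative, not from the report), is to force the math fallback:

```python
import torch
import torch.nn.functional as F

# Hypothetical workaround: disable the flash and memory-efficient (CUTLASS)
# SDPA backends so PyTorch uses the plain math implementation, which has no
# hardware-specific kernel requirements.
with torch.backends.cuda.sdp_kernel(
    enable_flash=False, enable_math=True, enable_mem_efficient=False
):
    # Illustrative shapes: (batch, num_heads, seq_len, head_dim)
    q = torch.randn(1, 8, 16, 64)
    k = torch.randn(1, 8, 16, 64)
    v = torch.randn(1, 8, 16, 64)
    out = F.scaled_dot_product_attention(
        q, k, v, attn_mask=None, dropout_p=0.0, is_causal=False
    )
print(out.shape)  # torch.Size([1, 8, 16, 64])
```

Newer PyTorch releases expose the same control as `torch.nn.attention.sdpa_kernel`; whether this resolves the Colab failure depends on where diffusers issues the attention call.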