ToTheBeginning / PuLID

[NeurIPS 2024] Official code for PuLID: Pure and Lightning ID Customization via Contrastive Alignment

Pulid not working - xFormers issue #153

Open bipinpeter opened 3 weeks ago

bipinpeter commented 3 weeks ago

Here is the error message I get:

ApplyPulid

No operator found for `memory_efficient_attention_forward` with inputs:
    query     : shape=(1, 577, 16, 64) (torch.float16)
    key       : shape=(1, 577, 16, 64) (torch.float16)
    value     : shape=(1, 577, 16, 64) (torch.float16)
    attn_bias : <class 'NoneType'>
    p         : 0.0
`decoderF` is not supported because:
    xFormers wasn't build with CUDA support
    attn_bias type is <class 'NoneType'>
    operator wasn't built - see `python -m xformers.info` for more info
`flshattF@0.0.0` is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see `python -m xformers.info` for more info
`cutlassF` is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see `python -m xformers.info` for more info
`smallkF` is not supported because:
    max(query.shape[-1] != value.shape[-1]) > 32
    xFormers wasn't build with CUDA support
    dtype=torch.float16 (supported: {torch.float32})
    has custom scale
    operator wasn't built - see `python -m xformers.info` for more info
    unsupported embed per head: 64
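For reference, here is a minimal standalone check (a sketch of my own, not PuLID code) that calls the same xFormers memory-efficient attention the traceback reports failing, using the same tensor shapes. Running it, together with `python -m xformers.info`, should confirm whether the installed xFormers wheel was actually built with CUDA kernels:

```python
# Sketch: verify whether the installed xFormers build ships CUDA operators.
# The shapes below mirror the (1, 577, 16, 64) float16 tensors in the error above.
import torch
import xformers
import xformers.ops as xops

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("xformers:", xformers.__version__)

q = torch.randn(1, 577, 16, 64, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

try:
    # This is the call that fails with "No operator found" when xFormers
    # was installed without CUDA support (CPU-only wheel or mismatched torch/CUDA).
    out = xops.memory_efficient_attention(q, k, v)
    print("memory_efficient_attention OK, output shape:", tuple(out.shape))
except (NotImplementedError, RuntimeError) as e:
    print("memory_efficient_attention failed:", e)
```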