balazik / ComfyUI-PuLID-Flux

PuLID-Flux ComfyUI implementation
Apache License 2.0

Mac issue #54

Open · aitalus opened this issue 18 hours ago

ApplyPulidFlux

```
No operator found for `memory_efficient_attention_forward` with inputs:
     query       : shape=(1, 577, 16, 64) (torch.bfloat16)
     key         : shape=(1, 577, 16, 64) (torch.bfloat16)
     value       : shape=(1, 577, 16, 64) (torch.bfloat16)
     attn_bias   : <class 'NoneType'>
     p           : 0.0
`ck_decoderF` is not supported because:
    device=mps (supported: {'cuda'})
    bf16 is only supported on A100+ GPUs
    operator wasn't built - see `python -m xformers.info` for more info
`ckF` is not supported because:
    device=mps (supported: {'cuda'})
    bf16 is only supported on A100+ GPUs
    operator wasn't built - see `python -m xformers.info` for more info
```
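For context: the error comes from xformers' memory-efficient attention kernels, which are built for CUDA only, so any call to them on an Apple Silicon (`mps`) device fails regardless of dtype. A common workaround is to fall back to PyTorch's built-in `torch.nn.functional.scaled_dot_product_attention` (available since PyTorch 2.0), which does run on MPS. Below is a minimal sketch of such a fallback; the wrapper name, the tensor layout, and the float16 cast are assumptions for illustration, not this repository's actual code.

```python
import torch
import torch.nn.functional as F

def attention(q, k, v):
    # Hypothetical wrapper, not this repo's code. Inputs follow the
    # xformers layout seen in the error: (batch, seq_len, heads, head_dim).
    if q.device.type == "cuda":
        try:
            import xformers.ops
            # xformers kernels are CUDA-only, which is exactly why they
            # reject device=mps in the traceback above.
            return xformers.ops.memory_efficient_attention(q, k, v)
        except ImportError:
            pass  # no xformers installed: fall through to plain PyTorch
    if q.device.type == "mps":
        # Assumption: some macOS/PyTorch combinations handle bfloat16
        # poorly on MPS, so cast to float16 to be safe. The caller can
        # cast the result back if it needs the original dtype.
        q, k, v = (t.to(torch.float16) for t in (q, k, v))
    # F.scaled_dot_product_attention expects (batch, heads, seq, head_dim),
    # so transpose in and back out.
    q, k, v = (t.transpose(1, 2) for t in (q, k, v))
    out = F.scaled_dot_product_attention(q, k, v)
    return out.transpose(1, 2)
```

With a fallback like this, the `(1, 577, 16, 64)` tensors from the error would go through PyTorch's SDPA on MPS, and nothing is lost on CUDA machines without xformers either, since PyTorch picks an efficient backend per device.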

aitalus commented 18 hours ago

Doesn't it work with mps?