Kosinkadink / ComfyUI-AnimateDiff-Evolved

Improved AnimateDiff for ComfyUI and Advanced Sampling Support
Apache License 2.0
2.78k stars 209 forks

[Mac Studio] Python Crashing when using AnimateDiff in ComfyUI #155

Closed CedricBattah closed 3 months ago

CedricBattah commented 1 year ago

Hello, I'm running into a problem with AnimateDiff + ComfyUI. I run it locally on a Mac Studio (192GB M2 Ultra). However, when I try to generate AnimateDiff outputs even slightly over 512x512px, or with more than 16 frames, Python instantly crashes when it reaches the KSampler node. Here is the message I have in my console:

```
Loading 1 new model
  0%|          | 0/20 [00:00<?, ?it/s]
/AppleInternal/Library/BuildRoots/495c257e-668e-11ee-93ce-926038f30c31/Library/Caches/com.apple.xbs/Sources/MetalPerformanceShaders/MPSCore/Types/MPSNDArray.mm:761: failed assertion `[MPSNDArray initWithDevice:descriptor:] Error: total bytes of NDArray > 2**32'
zsh: abort      python3 main.py
cedricbattah@Cedrics-Mac-Studio ComfyUI % /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/multiprocessing/resource_tracker.py:224: UserWarning: resource_tracker: There appear to be 1 leaked semaphore objects to clean up at shutdown
  warnings.warn('resource_tracker: There appear to be %d '
```
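For context on what that assertion means: MPS aborts when a single Metal buffer would exceed `2**32` bytes (4 GiB). A rough, hypothetical calculation of how resolution and frame count can push a single attention tensor past that limit (the shapes and fp16 dtype below are illustrative assumptions, not ComfyUI's exact allocations):

```python
# Rough arithmetic for the failed assertion above: MPS aborts when a single
# NDArray exceeds 2**32 bytes (4 GiB). Shapes are illustrative assumptions
# (SD1.5-style latents, fp16), not ComfyUI's actual allocation pattern.
MPS_LIMIT = 2 ** 32  # bytes

def tensor_bytes(shape, dtype_bytes):
    # Total bytes of one tensor: product of dimensions times element size.
    n = 1
    for d in shape:
        n *= d
    return n * dtype_bytes

# Un-split attention scores are roughly (frames * heads, tokens, tokens),
# with tokens = (H / 8) * (W / 8) for SD1.5 latents.
frames, heads, fp16 = 16, 8, 2
for side in (512, 768):
    tokens = (side // 8) ** 2
    b = tensor_bytes((frames * heads, tokens, tokens), fp16)
    print(f"{side}x{side}, {frames} frames: {b / 2**30:.2f} GiB "
          f"(over 4 GiB limit: {b > MPS_LIMIT})")
```

Since the tensor grows with the square of the token count and linearly with frame count, going even slightly above 512x512 or 16 frames can cross the limit, which matches the behavior described above.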

If someone has an idea, I'd love some help, as I am stuck.

Kosinkadink commented 1 year ago

I believe to fix that `NDArray > 2**32` thing on Mac, you need to be on the nightly version of pytorch. I think for AD it's also a good idea to use the `--use-split-cross-attention` startup argument, and be sure to do the Mac workaround for a different pytorch bug if you still get black images after that: https://github.com/Kosinkadink/ComfyUI-AnimateDiff-Evolved/issues/48#issuecomment-1750156332

Once I have some features done that are currently in the oven, I'll work on making it easier to use that workaround/add Mac instructions to the README.
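To illustrate why `--use-split-cross-attention` helps: chunked attention produces the same result while capping the size of the largest temporary tensor, which is what matters for the 4 GiB per-tensor MPS limit. A minimal numpy sketch of the general idea (not ComfyUI's actual implementation; the shapes and chunk size here are made up for illustration):

```python
# Minimal sketch of split attention: instead of one (T, T) score tensor,
# process queries in chunks so the largest temporary is only (chunk, T).
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Full attention: materializes the whole (T, T) score matrix at once.
    scores = softmax(q @ k.T / np.sqrt(q.shape[-1]))
    return scores @ v

def split_attention(q, k, v, chunk=64):
    # Same math, but peak temporary memory is (chunk, T) instead of (T, T).
    out = np.empty_like(q)
    for i in range(0, q.shape[0], chunk):
        s = softmax(q[i:i + chunk] @ k.T / np.sqrt(q.shape[-1]))
        out[i:i + chunk] = s @ v
    return out

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((256, 32)) for _ in range(3))
print(np.allclose(attention(q, k, v), split_attention(q, k, v)))  # True
```

The trade-off is a bit more launch overhead per chunk in exchange for a much smaller peak allocation, which is why the flag is recommended on memory-constrained backends like MPS.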

CedricBattah commented 1 year ago

Thank you for your help; I am already on the nightly version of pytorch. What do you mean by "use the `--use-split-cross-attention` startup argument"? At what point should I type this in? I'm sorry if the answer is obvious, I'm still new to these things.

Kosinkadink commented 1 year ago

At some point, you probably use some command to start ComfyUI, like `python main.py`. Whatever command you use, add `--use-split-cross-attention` at the end.

CedricBattah commented 1 year ago

I just tried it but ended up with the same output...

CedricBattah commented 1 year ago

Screenshot 2023-11-05 at 20 43 22

For reference, I checked the Activity Monitor when running a simple node tree like this one. It uses very little CPU but around 85GB of my 192GB of memory. I heard only 16GB of VRAM was necessary to run ComfyUI, so I don't know if this is normal. I hope this information can help you help me figure out my problem. I can run other tests; just tell me what I need to do and I will.

Thank you for your time

Weixuanf commented 11 months ago

Hi, I finally got AnimateDiff working on my MacBook Pro M1. I was using Python 3.9 with no virtual env, and AnimateDiff was either crashing or generating black images.

Then I switched to a virtual env with Python 3.12 (`python3.12 -m venv myenv`), installed PyTorch following https://developer.apple.com/metal/pytorch/, and started ComfyUI with this command: `python main.py --force-fp16 --use-split-cross-attention`

This is the JSON workflow I used to generate the AnimateDiff GIF: https://openart.ai/workflows/neuralunk/basic-prompt-travel---animatediff/h265kfpR1ltnQjYT45iO

or copy the JSON below and paste it into a .json file:

```json
{"last_node_id":41,"last_link_id":57,"nodes":[{"id":36,"type":"ADE_AnimateDiffLoaderWithContext","pos":[900,200],"size":{"0":315,"1":210},"flags":{},"order":6,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":48},{"name":"context_options","type":"CONTEXT_OPTIONS","link":49},{"name":"motion_lora","type":"MOTION_LORA","link":null},{"name":"motion_model_settings","type":"MOTION_MODEL_SETTINGS","link":null},{"name":"sample_settings","type":"sample_settings","link":null}],"outputs":[{"name":"MODEL","type":"MODEL","links":[50],"shape":3,"slot_index":0}],"properties":{"Node name for S&R":"ADE_AnimateDiffLoaderWithContext"},"widgets_values":["mm_sd_v15_v2.ckpt","sqrt_linear (AnimateDiff)",1,false],"color":"#332922","bgcolor":"#593930"},{"id":9,"type":"EmptyLatentImage","pos":[300,550],"size":{"0":315,"1":106},"flags":{},"order":5,"mode":0,"inputs":[{"name":"batch_size","type":"INT","link":56,"widget":{"name":"batch_size"}}],"outputs":[{"name":"LATENT","type":"LATENT","links":[39],"shape":3,"slot_index":0}],"properties":{"Node name for S&R":"EmptyLatentImage"},"widgets_values":[512,512,3],"color":"#432","bgcolor":"#653"},{"id":6,"type":"CLIPTextEncode","pos":[820,1300],"size":{"0":570,"1":76},"flags":{},"order":8,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":3}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[5],"shape":3,"slot_index":0}],"properties":{"Node name for S&R":"CLIPTextEncode"},"widgets_values":["(worst quality, low quality: 1.4)"],"color":"#322","bgcolor":"#533"},{"id":2,"type":"VAELoader","pos":[450,430],"size":{"0":385.8948669433594,"1":58},"flags":{},"order":0,"mode":0,"outputs":[{"name":"VAE","type":"VAE","links":[10],"shape":3,"slot_index":0}],"properties":{"Node name for S&R":"VAELoader"},"widgets_values":["vae-ft-mse-840000-ema-pruned.safetensors"],"color":"#223","bgcolor":"#335"},{"id":10,"type":"VAEDecode","pos":[1500,410],"size":{"0":210,"1":46},"flags":{},"order":11,"mode":0,"inputs":[{"name":"samples","type":"LATENT","link":9},{"name":"vae","type":"VAE","link":10}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[51],"shape":3,"slot_index":0}],"properties":{"Node name for S&R":"VAEDecode"}},{"id":37,"type":"VHS_VideoCombine","pos":[1818,359],"size":[636.6485595703125,860.6485595703125],"flags":{},"order":12,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":51}],"outputs":[],"properties":{"Node name for S&R":"VHS_VideoCombine"},"widgets_values":{"frame_rate":8,"loop_count":0,"filename_prefix":"aaa_readme","format":"image/gif","pingpong":false,"save_image":true,"crf":20,"save_metadata":true,"audio_file":"","videopreview":{"hidden":false,"paused":false,"params":{"filename":"aaa_readme_00004.gif","subfolder":"","type":"output","format":"image/gif"}}},"color":"#223","bgcolor":"#335"},{"id":7,"type":"KSampler","pos":[1460,530],"size":{"0":315,"1":474},"flags":{},"order":10,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":50},{"name":"positive","type":"CONDITIONING","link":55},{"name":"negative","type":"CONDITIONING","link":5},{"name":"latent_image","type":"LATENT","link":39}],"outputs":[{"name":"LATENT","type":"LATENT","links":[9],"shape":3,"slot_index":0}],"properties":{"Node name for S&R":"KSampler"},"widgets_values":[888888889,"fixed",20,8,"euler","normal",1],"color":"#2a363b","bgcolor":"#3f5159"},{"id":33,"type":"ADE_AnimateDiffUniformContextOptions","pos":[900,-10],"size":{"0":315,"1":154},"flags":{},"order":1,"mode":0,"outputs":[{"name":"CONTEXT_OPTIONS","type":"CONTEXT_OPTIONS","links":[49],"shape":3,"slot_index":0}],"properties":{"Node name for S&R":"ADE_AnimateDiffUniformContextOptions"},"widgets_values":[24,1,4,"uniform",false],"color":"#332922","bgcolor":"#593930"},{"id":4,"type":"CLIPSetLastLayer","pos":[530,780],"size":{"0":210,"1":58},"flags":{},"order":7,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":44}],"outputs":[{"name":"CLIP","type":"CLIP","links":[3,54],"shape":3,"slot_index":0}],"properties":{"Node name for S&R":"CLIPSetLastLayer"},"widgets_values":[-1],"color":"#223","bgcolor":"#335"},{"id":41,"type":"Note","pos":[1468,19],"size":{"0":456.2312316894531,"1":269.3837890625},"flags":{},"order":2,"mode":0,"title":"#NeuraLunk info & more FREE workflows","properties":{"text":""},"widgets_values":["Credits:\nComfyUI Workflow by: MrLunk / #NeuraLunk\nFor: OpenArt.ai \n\nMy Facebook page:\nhttps://www.facebook.com/NeuraLunk \n\n--->> FREE downloadable:\n300+ of my Workflows and art-pieces:\nLink: https://openart.ai/profile/neuralunk\n\nCREDITS to...\nAll those who create amazing Models and Lora's, controlnet and so on ....\nAll the fine people working on ComfyUI itself,\nand the amazing creators of Custom-Nodes that make all this possible !\n\nGreetz, Be well and Happy !\nPeter Lunk "],"color":"#148327","bgcolor":"#006f13"},{"id":40,"type":"PrimitiveNode","pos":[40,550],"size":{"0":210,"1":82},"flags":{},"order":3,"mode":0,"outputs":[{"name":"INT","type":"INT","links":[56,57],"slot_index":0,"widget":{"name":"batch_size"}}],"title":"Number of Frames","properties":{"Run widget replace on values":false},"widgets_values":[3,"fixed"],"color":"#432","bgcolor":"#653"},{"id":32,"type":"CheckpointLoaderSimple","pos":[-31,338],"size":{"0":416.72296142578125,"1":115.15579223632812},"flags":{},"order":4,"mode":0,"outputs":[{"name":"MODEL","type":"MODEL","links":[48],"shape":3,"slot_index":0},{"name":"CLIP","type":"CLIP","links":[44],"shape":3,"slot_index":1},{"name":"VAE","type":"VAE","links":null,"shape":3}],"properties":{"Node name for S&R":"CheckpointLoaderSimple"},"widgets_values":["toonyou_beta6.safetensors"],"color":"#223","bgcolor":"#335"},{"id":38,"type":"BatchPromptSchedule","pos":[820,670],"size":{"0":570,"1":570},"flags":{},"order":9,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":54},{"name":"max_frames","type":"INT","link":57,"widget":{"name":"max_frames"}}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[55],"shape":3,"slot_index":0},{"name":"NEG","type":"CONDITIONING","links":null,"shape":3}],"properties":{"Node name for S&R":"BatchPromptSchedule"},"widgets_values":["\"0\" : \"Point of view, walking in the dessert\"\n\n",3,false,"","0",0,0,0,0,0],"color":"#232","bgcolor":"#353"}],"links":[[3,4,0,6,0,"CLIP"],[5,6,0,7,2,"CONDITIONING"],[9,7,0,10,0,"LATENT"],[10,2,0,10,1,"VAE"],[39,9,0,7,3,"LATENT"],[44,32,1,4,0,"CLIP"],[48,32,0,36,0,"MODEL"],[49,33,0,36,1,"CONTEXT_OPTIONS"],[50,36,0,7,0,"MODEL"],[51,10,0,37,0,"IMAGE"],[54,4,0,38,0,"CLIP"],[55,38,0,7,1,"CONDITIONING"],[56,40,0,9,0,"INT"],[57,40,0,38,1,"INT"]],"groups":[],"config":{},"extra":{},"version":0.4}
```

Also, make sure you are generating as small a number of frames as possible; I used 3 and it worked for me.

Screenshot 2024-01-03 at 1 41 29 AM

CedricBattah commented 11 months ago

Thanks for your help. I tried every step you did and still face the exact same problem. I think I'll give up and move to a1111.

Weixuanf commented 11 months ago

> Thanks for your help, I tried every step you did and still face the exact same problem. I think I'll give up and move to a1111

Ohhh, actually, one important step: the Number of Frames should be as small as possible; I'm using 3. How many frames are you trying to generate? Mac can't handle too many frames.

Screenshot 2024-01-03 at 1 41 29 AM

Kosinkadink commented 11 months ago

@CedricBattah what's the current state of your problem? Are you still getting the `NDArray > 2**32` error?

harksha commented 9 months ago

This worked for me too:

```
python main.py --force-fp16 --use-split-cross-attention
```