un1tz3r0 / controlnetvideo

Apply controlnet to video clips

the hardware running your example #1

Open tyeestudio opened 1 year ago

tyeestudio commented 1 year ago

thanks for sharing. wondering what kind of hardware is needed to run your examples? will a macbook pro work?

tyeestudio commented 1 year ago

used a gpu server to run this, and got the following error:

python3 controlnetvideo.py PXL_20230422_013745844.TS.mp4 \
  --controlnet depth21 \
  --prompt 'graffuturism colorful intricate heavy detailed outlines' \
  --prompt-strength 9 \
  --show-input --show-detector --show-motion \
  --dump-frames '{instem}_frames/{n:08d}.png' \
  --init-image-strength 0.4 --color-amount 0.3 --feedthrough-strength 0.001 \
  --show-output --num-inference-steps 15 \
  --duration 60.0 --start-time 10.0 --skip-dumped-frames \
  '{instem}_out.mp4'

/home/tyeestudio/env/lib/python3.8/site-packages/timm/models/_factory.py:114: UserWarning: Mapping deprecated model name vit_base_resnet50_384 to current vit_base_r50_s16_384.orig_in21k_ft_in1k.
  model = create_fn(
Processing frame 0 at time 0/60.0 seconds... 0.00s elapsed, 0.00s estimated time remaining

Traceback (most recent call last):
  File "controlnetvideo.py", line 871, in <module>
    main()
  File "/home/tyeestudio/env/lib/python3.8/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/home/tyeestudio/env/lib/python3.8/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/home/tyeestudio/env/lib/python3.8/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/tyeestudio/env/lib/python3.8/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "controlnetvideo.py", line 868, in main
    process_frames(input_video, output_video, frame_filter, start_time, end_time, duration, max_dimension, min_dimension, round_dims_to, fix_orientation)
  File "controlnetvideo.py", line 544, in process_frames
    video.fl(wrapper, keep_duration=True).write_videofile(output_video)
  File "/home/tyeestudio/env/lib/python3.8/site-packages/moviepy/Clip.py", line 136, in fl
    newclip = self.set_make_frame(lambda t: fun(self.get_frame, t))
  File "", line 2, in set_make_frame
  File "/home/tyeestudio/env/lib/python3.8/site-packages/moviepy/decorators.py", line 14, in outplace
    f(newclip, *a, **k)
  File "/home/tyeestudio/env/lib/python3.8/site-packages/moviepy/video/VideoClip.py", line 644, in set_make_frame
    self.size = self.get_frame(0).shape[:2][::-1]
  File "", line 2, in get_frame
  File "/home/tyeestudio/env/lib/python3.8/site-packages/moviepy/decorators.py", line 89, in wrapper
    return f(*new_a, **new_kw)
  File "/home/tyeestudio/env/lib/python3.8/site-packages/moviepy/Clip.py", line 93, in get_frame
    return self.make_frame(t)
  File "/home/tyeestudio/env/lib/python3.8/site-packages/moviepy/Clip.py", line 136, in <lambda>
    newclip = self.set_make_frame(lambda t: fun(self.get_frame, t))
  File "controlnetvideo.py", line 539, in wrapper
    result = wrapped(framenum, PIL.Image.fromarray(gf(t)).resize((w,h)))
  File "controlnetvideo.py", line 800, in frame_filter
    output_frame = pipe(
  File "/home/tyeestudio/env/lib/python3.8/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/tyeestudio/controlnet/controlnetvideo/stable_diffusion_controlnet_img2img.py", line 796, in __call__
    controlnet_conditioning_image = prepare_controlnet_conditioning_image(
  File "/home/tyeestudio/controlnet/controlnetvideo/stable_diffusion_controlnet_img2img.py", line 96, in prepare_controlnet_conditioning_image
    controlnet_conditioning_image = [
  File "/home/tyeestudio/controlnet/controlnetvideo/stable_diffusion_controlnet_img2img.py", line 97, in <listcomp>
    np.array(i.resize((width, height), resample=PIL_INTERPOLATION["lanczos"]))[None, :]
  File "/home/tyeestudio/env/lib/python3.8/site-packages/PIL/Image.py", line 2193, in resize
    return self._new(self.im.resize(size, resample, box))
ValueError: height and width must be > 0

un1tz3r0 commented 1 year ago

ah okay, it looks like you're trying to run it on the input video file 'PXL_20230422_013745844.TS.mp4', which I was using for testing and which isn't part of the repo. Try replacing that in the command line arguments with the path to your own input video file to process. I'll try to update the docs with a better explanation of the examples.
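If it helps, a quick check like the one below (just a rough sketch, not part of the repo; the file name is a placeholder) will confirm the input path exists and show what moviepy reports for the clip's dimensions, which are the values that eventually feed the resize call that's failing in your traceback:

```python
# rough sanity-check sketch, not part of this repo -- the path is a placeholder
import os
import sys
from moviepy.editor import VideoFileClip

path = "my_input_video.mp4"  # replace with the video you pass to controlnetvideo.py
if not os.path.isfile(path):
    sys.exit(f"input file not found: {path}")

clip = VideoFileClip(path)
# width/height here are what moviepy sees for the source clip; a resize to a
# zero-sized target would fail the same way as in the traceback above
print(f"size={clip.size} fps={clip.fps} duration={clip.duration:.1f}s")
clip.close()
```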

As for what I am running this on during development, it's an AMD chipset PC with an Nvidia RTX 3090 Ti and a decent amount of disk and RAM. It should run just fine on a MacBook, though; check out the diffusers docs page dedicated to running on Apple Silicon hardware.
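The gist of that diffusers page is just moving the pipeline to the "mps" device, roughly like the sketch below (the model id and pipeline class are only illustrative, not what controlnetvideo.py actually loads):

```python
# illustrative sketch based on the diffusers Apple Silicon (MPS) docs;
# the model id and pipeline class are placeholders, not this repo's pipeline
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
pipe = pipe.to("mps")            # run on the Mac's GPU via Metal Performance Shaders
pipe.enable_attention_slicing()  # recommended on MPS to reduce memory pressure

image = pipe("a test prompt").images[0]
```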

tyeestudio commented 1 year ago

> ah okay, it looks like you're trying to run it on the input video file 'PXL_20230422_013745844.TS.mp4', which I was using for testing and which isn't part of the repo. Try replacing that in the command line arguments with the path to your own input video file to process. I'll try to update the docs with a better explanation of the examples.
>
> As for what I am running this on during development, it's an AMD chipset PC with an Nvidia RTX 3090 Ti and a decent amount of disk and RAM. It should run just fine on a MacBook, though; check out the diffusers docs page dedicated to running on Apple Silicon hardware.

i copied your examples/PXL_20230422_013745844.TSb_out.mp4 to PXL_20230422_013745844.TS.mp4 and ran it. this is running on an nvidia gpu, not on apple silicon hardware, and it is not working.

xiaoli4881 commented 1 year ago

I also encountered the same error. Have you resolved it?

tyeestudio commented 1 year ago

no