AbdullahAlfaraj / Auto-Photoshop-StableDiffusion-Plugin

A user-friendly plug-in that makes it easy to generate stable diffusion images inside Photoshop using either Automatic or ComfyUI as a backend.
MIT License

Comfyui workflow features #431

Open drprs opened 7 months ago

drprs commented 7 months ago

Are the original features of the plugin that work with auto1111 (selecting an area on the canvas and using it in img2img) possible with the ComfyUI backend? Is this a planned feature? The only way I can use ComfyUI right now is through the custom workflow option.

AbdullahAlfaraj commented 7 months ago

Yeah, it's possible; make sure you select ComfyUI as the backend. Can you screenshot the settings tab? (image attached)

drprs commented 7 months ago

I have already done that. I can't select the checkpoints from the list either. I get this in ComfyUI:

```
ERROR:root:!!! Exception during processing !!!
ERROR:root:Traceback (most recent call last):
  File "J:\comfy 249\ComfyUI\execution.py", line 153, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "J:\comfy 249\ComfyUI\execution.py", line 83, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "J:\comfy 249\ComfyUI\execution.py", line 76, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "J:\comfy 249\ComfyUI\nodes.py", line 1299, in sample
    return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
  File "J:\comfy 249\ComfyUI\nodes.py", line 1269, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
  File "J:\comfy 249\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 22, in informative_sample
    raise e
  File "J:\comfy 249\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample
    return original_sample(*args, **kwargs)
  File "J:\comfy 249\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\sampling.py", line 116, in animatediff_sample
    return orig_comfy_sample(model, *args, **kwargs)
  File "J:\comfy 249\ComfyUI\comfy\sample.py", line 100, in sample
    samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "J:\comfy 249\ComfyUI\comfy\samplers.py", line 711, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
  File "J:\comfy 249\ComfyUI\comfy\samplers.py", line 617, in sample
    samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
  File "J:\comfy 249\ComfyUI\comfy\samplers.py", line 556, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
  File "J:\comfy 249\python_embeded\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "J:\comfy 249\ComfyUI\comfy\k_diffusion\sampling.py", line 137, in sample_euler
    denoised = model(x, sigma_hat * s_in, **extra_args)
  File "J:\comfy 249\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "J:\comfy 249\ComfyUI\comfy\samplers.py", line 277, in forward
    out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, model_options=model_options, seed=seed)
  File "J:\comfy 249\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "J:\comfy 249\ComfyUI\comfy\samplers.py", line 267, in forward
    return self.apply_model(*args, **kwargs)
  File "J:\comfy 249\ComfyUI\comfy\samplers.py", line 264, in apply_model
    out = sampling_function(self.inner_model, x, timestep, uncond, cond, cond_scale, model_options=model_options, seed=seed)
  File "J:\comfy 249\ComfyUI\comfy\samplers.py", line 252, in sampling_function
    cond, uncond = calc_cond_uncond_batch(model, cond, uncond, x, timestep, model_options)
  File "J:\comfy 249\ComfyUI\comfy\samplers.py", line 230, in calc_cond_uncond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
  File "J:\comfy 249\ComfyUI\comfy\model_base.py", line 73, in apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
  File "J:\comfy 249\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "J:\comfy 249\ComfyUI\comfy\ldm\modules\diffusionmodules\openaimodel.py", line 855, in forward
    h = forward_timestep_embed(module, h, emb, context, transformer_options, time_context=time_context, num_video_frames=num_video_frames, image_only_indicator=image_only_indicator)
  File "J:\comfy 249\ComfyUI\comfy\ldm\modules\diffusionmodules\openaimodel.py", line 52, in forward_timestep_embed
    x = layer(x)
  File "J:\comfy 249\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "J:\comfy 249\python_embeded\lib\site-packages\torch\nn\modules\conv.py", line 463, in forward
    return self._conv_forward(input, self.weight, self.bias)
  File "J:\comfy 249\python_embeded\lib\site-packages\torch\nn\modules\conv.py", line 459, in _conv_forward
    return F.conv2d(input, weight, bias, self.stride,
RuntimeError: Given groups=1, weight of size [320, 5, 3, 3], expected input[2, 9, 64, 72] to have 5 channels, but got 9 channels instead
```
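The final `RuntimeError` means the checkpoint's first convolution expects a 5-channel input, but the workflow fed it a 9-channel latent (9 channels is the layout an SD inpainting pipeline produces: 4 latent + 4 masked-latent + 1 mask). In other words, the selected model and the latent being sampled don't match. A minimal sketch reproducing the same error class; the shapes are taken from the log above, everything else is illustrative:

```python
# Reproduce the channel-mismatch RuntimeError from the traceback.
# A Conv2d with weight shape [320, 5, 3, 3] accepts only 5-channel
# input; feeding a 9-channel latent (inpainting-style) fails the same way.
import torch
import torch.nn as nn

conv = nn.Conv2d(in_channels=5, out_channels=320, kernel_size=3, padding=1)
latent = torch.randn(2, 9, 64, 72)  # 9 channels, model wants 5

try:
    conv(latent)
except RuntimeError as e:
    print("mismatch:", e)
```

The practical takeaway is to check that the checkpoint selected in the workflow matches the kind of latent the plugin sends (a plain txt2img/img2img model for plain latents, an inpainting model for masked latents).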

drprs commented 7 months ago
(two images attached)
drprs commented 7 months ago

I have my model folder symlinked for ComfyUI. The A1111 part of the plugin is working flawlessly.
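Since the checkpoints don't show up in the plugin's list, one quick sanity check is whether the symlinked folder actually resolves and lists model files from ComfyUI's side. A small sketch; the function and path are hypothetical, not part of the plugin:

```python
# Sanity-check a (possibly symlinked) ComfyUI checkpoints folder:
# does it resolve, and which model files are visible through it?
from pathlib import Path

def list_checkpoints(ckpt_dir: Path) -> list[str]:
    """Return checkpoint filenames under ckpt_dir, following symlinks."""
    if not ckpt_dir.exists():  # a broken symlink also fails this check
        return []
    return sorted(p.name for p in ckpt_dir.iterdir()
                  if p.suffix in (".safetensors", ".ckpt"))

# Placeholder path: point this at your ComfyUI checkpoints folder.
print(list_checkpoints(Path(r"J:\comfy 249\ComfyUI\models\checkpoints")))
```

As an alternative to symlinks, ComfyUI can also read models from external locations via its `extra_model_paths.yaml` config, which sidesteps symlink-resolution issues entirely.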