Closed. MysticDaedra closed this issue 6 months ago.
Update: I think this might actually not be an extension error at all, but rather a bug with inpainting. Trying to inpaint with adetailer disabled returned the following:
13:26:27-956949 INFO Applying hypertile: unet=512
13:26:27-972644 TRACE Run mask: fn=init
13:26:27-986379 TRACE Mask args legacy: blur=4 padding=32
13:26:27-994252 TRACE Mask shape=(4096, 3584) opts=namespace(model=None, auto_mask='None', mask_only=False, mask_blur=0.004, mask_erode=0.01, mask_dilate=0.03571428571428571, seg_iou_thresh=0.5,
seg_score_thresh=0.5, seg_nms_thresh=0.5, seg_overlap_ratio=0.3, seg_points_per_batch=64, seg_topK=50, seg_colormap='pink', preview_type='Composite', seg_live=True,
weight_original=0.5, weight_mask=0.5, kernel_iterations=1, invert=False)
13:26:28-001889 TRACE Mask erode=0.010 kernel=(9, 9) mask=(4096, 3584)
13:26:28-008891 TRACE Mask dilate=0.036 kernel=(33, 33) mask=(4096, 3584)
13:26:28-041902 TRACE Mask blur=0.004 x=4 y=4 mask=(4096, 3584)
13:26:28-044904 DEBUG Mask: size=3584x4096 masked=209003px area=0.01 auto=None blur=0.004 erode=0.01 dilate=0.03571428571428571 type=Grayscale time=0.07
13:26:28-102549 TRACE Mask crop: mask=(3584, 4096) region=(1247, 326, 1838, 908) pad=32
13:26:28-104551 TRACE Mask expand: image=(3584, 4096) processing=(1024, 1024) region=(1247, 322, 1838, 913)
13:26:28-221868 INFO Saving: image="D:\Stable Diffusion Files\Outputs\init-images\08764-2bd7c7f3-init-image.png" type=PNG resolution=3584x4096 size=0
13:26:33-182553 ERROR Exception: 'float' object cannot be interpreted as an integer
13:26:33-183553 ERROR Arguments: args=('task(62hq0p859fnlnjd)', 2.0, 'young cla1re, highly detailed, (scared expression:1.2), fFaceDetail-SDXL EyeDetail-SDXL <lora:cla1re3 (20):1.0>', '', [],
<PIL.Image.Image image mode=RGBA size=1792x2048 at 0x22EC66F9A50>, None, {'image': <PIL.Image.Image image mode=RGBA size=3584x4096 at 0x22EC66FBDF0>, 'mask': <PIL.Image.Image image
mode=RGB size=3584x4096 at 0x22EC66FB070>}, None, None, None, None, 10, 3, 4, 1, 1, True, False, False, 1, 1, 1.2, 6, 0.7, 0, 1, 0, 1, 0.4, -1.0, -1.0, 0, 0, 0, 0, 1024, 1024, 1, 1,
'None', 1, 32, 0, None, '', '', '', 0, 0, 0, 0, False, 4, 0.95, False, 0.6, 1, '#000000', 0, [], 0, 1, 'None', 'None', 'None', 'None', 0.5, 0.5, 0.5, 0.5, None, None, None, None, 0, 0,
0, 0, 1, 1, 1, 1, 'None', 16, 'None', 1, True, 'None', 2, True, 1, 0, True, 'none', 3, 4, 0.25, 0.25, False, False, {'ad_model': 'face_yolov8m.pt', 'ad_model_classes': '', 'ad_prompt':
'cla1re, highly detailed, (scared expression:1.2), fFaceDetail-SDXL EyeDetail-SDXL <lora:cla1re3 (20):1.0>', 'ad_negative_prompt': '', 'ad_confidence': 0.75, 'ad_mask_k_largest': 0,
'ad_mask_min_ratio': 0, 'ad_mask_max_ratio': 1, 'ad_x_offset': 0, 'ad_y_offset': 0, 'ad_dilate_erode': 4, 'ad_mask_merge_invert': 'None', 'ad_mask_blur': 4, 'ad_denoising_strength':
0.4, 'ad_inpaint_only_masked': True, 'ad_inpaint_only_masked_padding': 32, 'ad_use_inpaint_width_height': False, 'ad_inpaint_width': 512, 'ad_inpaint_height': 512, 'ad_use_steps':
False, 'ad_steps': 28, 'ad_use_cfg_scale': False, 'ad_cfg_scale': 7, 'ad_use_checkpoint': False, 'ad_checkpoint': 'Use same checkpoint', 'ad_use_vae': False, 'ad_vae': 'Use same VAE',
'ad_use_sampler': False, 'ad_sampler': 'Default', 'ad_use_noise_multiplier': False, 'ad_noise_multiplier': 1, 'ad_use_clip_skip': False, 'ad_clip_skip': 1, 'ad_restore_face': False,
'ad_controlnet_model': 'None', 'ad_controlnet_module': 'None', 'ad_controlnet_weight': 1, 'ad_controlnet_guidance_start': 0, 'ad_controlnet_guidance_end': 1, 'is_api': ()},
{'ad_model': 'mediapipe_face_mesh_eyes_only', 'ad_model_classes': '', 'ad_prompt': 'purple magic eyes, EyeDetail-SDXL <lora:Stunning_eyes_2:1.0>', 'ad_negative_prompt': '',
'ad_confidence': 0.75, 'ad_mask_k_largest': 1, 'ad_mask_min_ratio': 0, 'ad_mask_max_ratio': 1, 'ad_x_offset': 0, 'ad_y_offset': 0, 'ad_dilate_erode': 4, 'ad_mask_merge_invert': 'None',
'ad_mask_blur': 4, 'ad_denoising_strength': 0.4, 'ad_inpaint_only_masked': True, 'ad_inpaint_only_masked_padding': 32, 'ad_use_inpaint_width_height': False, 'ad_inpaint_width': 512,
'ad_inpaint_height': 512, 'ad_use_steps': False, 'ad_steps': 28, 'ad_use_cfg_scale': False, 'ad_cfg_scale': 7, 'ad_use_checkpoint': False, 'ad_checkpoint': 'Use same checkpoint',
'ad_use_vae': False, 'ad_vae': 'Use same VAE', 'ad_use_sampler': False, 'ad_sampler': 'Default', 'ad_use_noise_multiplier': False, 'ad_noise_multiplier': 1, 'ad_use_clip_skip': False,
'ad_clip_skip': 1, 'ad_restore_face': False, 'ad_controlnet_model': 'None', 'ad_controlnet_module': 'None', 'ad_controlnet_weight': 1, 'ad_controlnet_guidance_start': 0,
'ad_controlnet_guidance_end': 1, 'is_api': ()}, {'ad_model': 'hand_yolov8s.pt', 'ad_model_classes': '', 'ad_prompt': 'young girl hand', 'ad_negative_prompt': 'bad-hands-SDXL',
'ad_confidence': 0.6, 'ad_mask_k_largest': 0, 'ad_mask_min_ratio': 0, 'ad_mask_max_ratio': 1, 'ad_x_offset': 0, 'ad_y_offset': 0, 'ad_dilate_erode': 4, 'ad_mask_merge_invert': 'None',
'ad_mask_blur': 4, 'ad_denoising_strength': 0.4, 'ad_inpaint_only_masked': True, 'ad_inpaint_only_masked_padding': 32, 'ad_use_inpaint_width_height': False, 'ad_inpaint_width': 512,
'ad_inpaint_height': 512, 'ad_use_steps': False, 'ad_steps': 28, 'ad_use_cfg_scale': False, 'ad_cfg_scale': 7, 'ad_use_checkpoint': False, 'ad_checkpoint': 'Use same checkpoint',
'ad_use_vae': False, 'ad_vae': 'Use same VAE', 'ad_use_sampler': False, 'ad_sampler': 'Default', 'ad_use_noise_multiplier': False, 'ad_noise_multiplier': 1, 'ad_use_clip_skip': False,
'ad_clip_skip': 1, 'ad_restore_face': False, 'ad_controlnet_model': 'None', 'ad_controlnet_module': 'None', 'ad_controlnet_weight': 1, 'ad_controlnet_guidance_start': 0,
'ad_controlnet_guidance_end': 1, 'is_api': ()}, {'ad_model': 'hand_yolov8s.pt', 'ad_model_classes': '', 'ad_prompt': 'young girl hand', 'ad_negative_prompt': 'bad-hands-SDXL',
'ad_confidence': 0.6, 'ad_mask_k_largest': 0, 'ad_mask_min_ratio': 0, 'ad_mask_max_ratio': 1, 'ad_x_offset': 0, 'ad_y_offset': 0, 'ad_dilate_erode': 4, 'ad_mask_merge_invert': 'None',
'ad_mask_blur': 4, 'ad_denoising_strength': 0.3, 'ad_inpaint_only_masked': True, 'ad_inpaint_only_masked_padding': 32, 'ad_use_inpaint_width_height': False, 'ad_inpaint_width': 512,
'ad_inpaint_height': 512, 'ad_use_steps': False, 'ad_steps': 28, 'ad_use_cfg_scale': False, 'ad_cfg_scale': 7, 'ad_use_checkpoint': False, 'ad_checkpoint': 'Use same checkpoint',
'ad_use_vae': False, 'ad_vae': 'Use same VAE', 'ad_use_sampler': False, 'ad_sampler': 'Default', 'ad_use_noise_multiplier': False, 'ad_noise_multiplier': 1, 'ad_use_clip_skip': False,
'ad_clip_skip': 1, 'ad_restore_face': False, 'ad_controlnet_model': 'None', 'ad_controlnet_module': 'None', 'ad_controlnet_weight': 1, 'ad_controlnet_guidance_start': 0,
'ad_controlnet_guidance_end': 1, 'is_api': ()}, {'ad_model': 'None', 'ad_model_classes': '', 'ad_prompt': '', 'ad_negative_prompt': '', 'ad_confidence': 0.3, 'ad_mask_k_largest': 0,
'ad_mask_min_ratio': 0, 'ad_mask_max_ratio': 1, 'ad_x_offset': 0, 'ad_y_offset': 0, 'ad_dilate_erode': 4, 'ad_mask_merge_invert': 'None', 'ad_mask_blur': 4, 'ad_denoising_strength':
0.4, 'ad_inpaint_only_masked': True, 'ad_inpaint_only_masked_padding': 32, 'ad_use_inpaint_width_height': False, 'ad_inpaint_width': 512, 'ad_inpaint_height': 512, 'ad_use_steps':
False, 'ad_steps': 28, 'ad_use_cfg_scale': False, 'ad_cfg_scale': 7, 'ad_use_checkpoint': False, 'ad_checkpoint': 'Use same checkpoint', 'ad_use_vae': False, 'ad_vae': 'Use same VAE',
'ad_use_sampler': False, 'ad_sampler': 'Default', 'ad_use_noise_multiplier': False, 'ad_noise_multiplier': 1, 'ad_use_clip_skip': False, 'ad_clip_skip': 1, 'ad_restore_face': False,
'ad_controlnet_model': 'None', 'ad_controlnet_module': 'None', 'ad_controlnet_weight': 1, 'ad_controlnet_guidance_start': 0, 'ad_controlnet_guidance_end': 1, 'is_api': ()},
{'ad_model': 'None', 'ad_model_classes': '', 'ad_prompt': '', 'ad_negative_prompt': '', 'ad_confidence': 0.3, 'ad_mask_k_largest': 0, 'ad_mask_min_ratio': 0, 'ad_mask_max_ratio': 1,
'ad_x_offset': 0, 'ad_y_offset': 0, 'ad_dilate_erode': 4, 'ad_mask_merge_invert': 'None', 'ad_mask_blur': 4, 'ad_denoising_strength': 0.4, 'ad_inpaint_only_masked': True,
'ad_inpaint_only_masked_padding': 32, 'ad_use_inpaint_width_height': False, 'ad_inpaint_width': 512, 'ad_inpaint_height': 512, 'ad_use_steps': False, 'ad_steps': 28,
'ad_use_cfg_scale': False, 'ad_cfg_scale': 7, 'ad_use_checkpoint': False, 'ad_checkpoint': 'Use same checkpoint', 'ad_use_vae': False, 'ad_vae': 'Use same VAE', 'ad_use_sampler':
False, 'ad_sampler': 'Default', 'ad_use_noise_multiplier': False, 'ad_noise_multiplier': 1, 'ad_use_clip_skip': False, 'ad_clip_skip': 1, 'ad_restore_face': False,
'ad_controlnet_model': 'None', 'ad_controlnet_module': 'None', 'ad_controlnet_weight': 1, 'ad_controlnet_guidance_start': 0, 'ad_controlnet_guidance_end': 1, 'is_api': ()},
{'ad_model': 'None', 'ad_model_classes': '', 'ad_prompt': '', 'ad_negative_prompt': '', 'ad_confidence': 0.3, 'ad_mask_k_largest': 0, 'ad_mask_min_ratio': 0, 'ad_mask_max_ratio': 1,
'ad_x_offset': 0, 'ad_y_offset': 0, 'ad_dilate_erode': 4, 'ad_mask_merge_invert': 'None', 'ad_mask_blur': 4, 'ad_denoising_strength': 0.4, 'ad_inpaint_only_masked': True,
'ad_inpaint_only_masked_padding': 32, 'ad_use_inpaint_width_height': False, 'ad_inpaint_width': 512, 'ad_inpaint_height': 512, 'ad_use_steps': False, 'ad_steps': 28,
'ad_use_cfg_scale': False, 'ad_cfg_scale': 7, 'ad_use_checkpoint': False, 'ad_checkpoint': 'Use same checkpoint', 'ad_use_vae': False, 'ad_vae': 'Use same VAE', 'ad_use_sampler':
False, 'ad_sampler': 'Default', 'ad_use_noise_multiplier': False, 'ad_noise_multiplier': 1, 'ad_use_clip_skip': False, 'ad_clip_skip': 1, 'ad_restore_face': False,
'ad_controlnet_model': 'None', 'ad_controlnet_module': 'None', 'ad_controlnet_weight': 1, 'ad_controlnet_guidance_start': 0, 'ad_controlnet_guidance_end': 1, 'is_api': ()},
{'ad_model': 'None', 'ad_model_classes': '', 'ad_prompt': '', 'ad_negative_prompt': '', 'ad_confidence': 0.3, 'ad_mask_k_largest': 0, 'ad_mask_min_ratio': 0, 'ad_mask_max_ratio': 1,
'ad_x_offset': 0, 'ad_y_offset': 0, 'ad_dilate_erode': 4, 'ad_mask_merge_invert': 'None', 'ad_mask_blur': 4, 'ad_denoising_strength': 0.4, 'ad_inpaint_only_masked': True,
'ad_inpaint_only_masked_padding': 32, 'ad_use_inpaint_width_height': False, 'ad_inpaint_width': 512, 'ad_inpaint_height': 512, 'ad_use_steps': False, 'ad_steps': 28,
'ad_use_cfg_scale': False, 'ad_cfg_scale': 7, 'ad_use_checkpoint': False, 'ad_checkpoint': 'Use same checkpoint', 'ad_use_vae': False, 'ad_vae': 'Use same VAE', 'ad_use_sampler':
False, 'ad_sampler': 'Default', 'ad_use_noise_multiplier': False, 'ad_noise_multiplier': 1, 'ad_use_clip_skip': False, 'ad_clip_skip': 1, 'ad_restore_face': False,
'ad_controlnet_model': 'None', 'ad_controlnet_module': 'None', 'ad_controlnet_weight': 1, 'ad_controlnet_guidance_start': 0, 'ad_controlnet_guidance_end': 1, 'is_api': ()}, '', '',
0.5, True, 1, False, 'None', None, 'None', 16, 'None', 2, True, 1, 0, 'none', 3, 4, 0.25, 0.25, 0.5, 0.5, 0.1, 1, True, '', 0.5, 0.9, '', 0.5, 0.9, 4, 0.5, 'Linear', 'None',
'<span>  Outpainting</span><br>', 128, 8, ['left', 'right', 'up', 'down'], 1, 0.05, 128, 4, 0, ['left', 'right', 'up', 'down'], False, False, 'positive', 'comma', 0, False, False,
'', '<span>  SD Upscale</span><br>', 128, 31, 2, 'SVD 1.0', 14, True, 1, 3, 6, 0.5, 0.1, 'None', 2, True, 1, 0, 0, '', [], 0, '', [], 0, '', [], False, True, False, False, False,
False, 0, '<p style="margin-bottom:0.75em">Will upscale the image depending on the selected target size type</p>', 512, 0, 8, 32, 64, 0.35, 32, 0, True, 0, False, 8, 0, 0, 2048, 2048,
2, 'None', [], 'FaceID Base', True, True, 1, 1, 1, 0.5, True, 'person', 1, 0.5, True) kwargs={}
13:26:33-206410 ERROR gradio call: TypeError
╭────────────────────────────────────────────────────────────────────────────────────── Traceback (most recent call last) ──────────────────────────────────────────────────────────────────────────────────────╮
│ D:\automatic\modules\call_queue.py:31 in f │
│ │
│ 30 │ │ │ try: │
│ ❱ 31 │ │ │ │ res = func(*args, **kwargs) │
│ 32 │ │ │ │ progress.record_results(id_task, res) │
│ │
│ D:\automatic\modules\img2img.py:264 in img2img │
│ │
│ 263 │ │ if processed is None: │
│ ❱ 264 │ │ │ processed = processing.process_images(p) │
│ 265 │ p.close() │
│ │
│ D:\automatic\modules\processing.py:193 in process_images │
│ │
│ 192 │ │ │ with context_hypertile_vae(p), context_hypertile_unet(p): │
│ ❱ 193 │ │ │ │ processed = process_images_inner(p) │
│ 194 │
│ │
│ D:\automatic\modules\processing.py:264 in process_images_inner │
│ │
│ 263 │ │ │ with devices.autocast(): │
│ ❱ 264 │ │ │ │ p.init(p.all_prompts, p.all_seeds, p.all_subseeds) │
│ 265 │ │ extra_network_data = None │
│ │
│ D:\automatic\modules\processing_class.py:423 in init │
│ │
│ 422 │ │ │ │ if image.width != self.width or image.height != self.height: │
│ ❱ 423 │ │ │ │ │ image = images.resize_image(3, image, self.width, self.height, self.resize_name) │
│ 424 │ │ │ if self.image_mask is not None and self.inpainting_fill != 1: │
│ │
│ D:\automatic\modules\images.py:292 in resize_image │
│ │
│ 291 │ elif resize_mode == 3: # fill │
│ ❱ 292 │ │ res = fill(im) │
│ 293 │ elif resize_mode == 4: # edge │
│ │
│ D:\automatic\modules\images.py:280 in fill │
│ │
│ 279 │ │ ratio = min(width / im.width, height / im.height) │
│ ❱ 280 │ │ im = resize(im, im.width * ratio, im.height * ratio) │
│ 281 │ │ res = Image.new(im.mode, (width, height), color=color) │
│ │
│ D:\automatic\modules\images.py:229 in resize │
│ │
│ 228 │ │ if upscaler_name is None or upscaler_name == "None" or im.mode == 'L': │
│ ❱ 229 │ │ │ return im.resize((w, h), resample=Image.Resampling.LANCZOS) # force for mask │
│ 230 │ │ scale = max(w / im.width, h / im.height) │
│ │
│ D:\automatic\venv\lib\site-packages\PIL\Image.py:2200 in resize │
│ │
│ 2199 │ │ │
│ ❱ 2200 │ │ return self._new(self.im.resize(size, resample, box)) │
│ 2201 │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
TypeError: 'float' object cannot be interpreted as an integer
13:28:00-145295 DEBUG Server: alive=True jobs=1 requests=1346 uptime=2700 memory=8.71/31.9 backend=Backend.DIFFUSERS state=job="upscale batch 1/72" 0/72
As you can see, this is almost exactly the same error that was returned before when adetailer tried to run, only this time it happens with plain inpainting and no extension involved.
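For reference, the failure is easy to reproduce outside the webui. In the traceback, `fill()` in `modules/images.py` computes a float `ratio` and passes `im.width * ratio` and `im.height * ratio` (both floats) straight into PIL's `resize()`, which only accepts integer dimensions. A minimal sketch, using the same image size from the log:

```python
from PIL import Image

# Mimic images.py fill(): the scale ratio is a float, so the scaled
# dimensions are floats too, which PIL's resize() rejects.
im = Image.new("RGB", (3584, 4096))
ratio = min(1024 / im.width, 1024 / im.height)  # 0.25 here, but always a float

try:
    im.resize((im.width * ratio, im.height * ratio))
    raised = False
except TypeError:
    raised = True

print(raised)  # True: float dimensions trigger the TypeError seen in the log
```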
Fixed. It was a simple float-vs-int issue: the scaled dimensions were passed to PIL's `resize()` as floats instead of integers.
Issue Description
EDIT: As per comment below, this actually appears to be a bug with inpainting, not adetailer.
I'm getting a traceback error with today's dev update. I didn't have this problem yesterday, which suggests the update introduced it. Anyway, here's the log:
Version Platform Description
Python 3.10.6
Dev branch d0e35a7a
Windows 11 Professional
Nvidia RTX 3070 8GB
Mozilla Firefox v123.0.1
URL link of the extension
https://github.com/Bing-su/adetailer
URL link of the issue reported in the extension repository
No response