Open den3asphalt opened 3 weeks ago
Looking at the console log, this caught my attention:
*** Error verifying pickled file from C:\Users\username\Desktop\SD\test\stable-diffusion-webui\models\DAT\DAT_x2.pth
*** The file may be malicious, so the program is not going to read it.
*** You can skip this check with --disable-safe-unpickle commandline argument.
Looks like a possible cause, plus a workaround. Have you tried using --disable-safe-unpickle?
This is because the URL A1111 uses to download the model is no longer valid: "https://raw.githubusercontent.com/n0kovo/dat_upscaler_models/main/DAT/DAT_x2.pth"
You can download the models from the original page instead. Just put them in the "models\DAT" folder and it will work.
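For illustration, here is a minimal sketch of that manual install step, assuming the file has already been downloaded by hand. The helper name and the size threshold are made up for this example (they are not part of the webui); the idea is that a Git LFS pointer or HTML error page is only a few hundred bytes, while real DAT weights are tens of megabytes.

```python
# Hypothetical helper: place a manually downloaded DAT checkpoint where the
# webui looks for DAT models (<webui root>/models/DAT), refusing files that
# are obviously not real weights.  Name and threshold are illustrative only.
import os
import shutil

def install_dat_model(downloaded_file, webui_root, min_size=1_000_000):
    """Copy a .pth into models/DAT, rejecting suspiciously small files."""
    size = os.path.getsize(downloaded_file)
    if size < min_size:
        # A Git LFS pointer or an HTML error page is a few hundred bytes;
        # genuine DAT weights are tens of megabytes.
        raise ValueError(
            f"{downloaded_file} is only {size} bytes; "
            "probably not real model weights"
        )
    dest_dir = os.path.join(webui_root, "models", "DAT")
    os.makedirs(dest_dir, exist_ok=True)
    dest = os.path.join(dest_dir, os.path.basename(downloaded_file))
    shutil.copy2(downloaded_file, dest)
    return dest
```

After copying, selecting "DAT x2" again in the UI should pick the file up without re-downloading.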
> Looks like a possible cause, plus a workaround. Have you tried using --disable-safe-unpickle?
I have already tried this, and it did not solve the problem; it only gave me different errors.
*** Error completing request
*** Arguments: ('task(6vxgwoqgi4t2zi2)', <gradio.routes.Request object at 0x00000184281DDC90>, '1girl, mika_\\(Blue_Archive\\), Blue_Archive,', '', [], 1, 1, 7, 512, 512, True, 0.7, 2, 'DAT x2', 10, 0, 0, 'Use same checkpoint', 'Use same sampler', 'Use same scheduler', '', '', [], 0, 18, 'Euler a', 'Automatic', False, '', 0.8, -1, False, -1, 0, 0, 0, False, False, 'positive', 'comma', 0, False, False, 'start', '', 1, '', [], 0, '', [], 0, '', [], True, False, False, False, False, False, False, 0, False) {}
Traceback (most recent call last):
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\modules\call_queue.py", line 57, in f
    res = list(func(*args, **kwargs))
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\modules\call_queue.py", line 36, in f
    res = func(*args, **kwargs)
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\modules\txt2img.py", line 109, in txt2img
    processed = processing.process_images(p)
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\modules\processing.py", line 845, in process_images
    res = process_images_inner(p)
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\modules\processing.py", line 981, in process_images_inner
    samples_ddim = p.sample(conditioning=p.c, unconditional_conditioning=p.uc, seeds=p.seeds, subseeds=p.subseeds, subseed_strength=p.subseed_strength, prompts=p.prompts)
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\modules\processing.py", line 1344, in sample
    return self.sample_hr_pass(samples, decoded_samples, seeds, subseeds, subseed_strength, prompts)
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\modules\processing.py", line 1393, in sample_hr_pass
    image = images.resize_image(0, image, target_width, target_height, upscaler_name=self.hr_upscaler)
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\modules\images.py", line 288, in resize_image
    res = resize(im, width, height)
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\modules\images.py", line 280, in resize
    im = upscaler.scaler.upscale(im, scale, upscaler.data_path)
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\modules\upscaler.py", line 68, in upscale
    img = self.do_upscale(img, selected_model)
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\modules\dat_model.py", line 32, in do_upscale
    model_descriptor = modelloader.load_spandrel_model(
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\modules\modelloader.py", line 150, in load_spandrel_model
    model_descriptor = spandrel.ModelLoader(device=device).load_from_file(str(path))
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\venv\lib\site-packages\spandrel\__helpers\loader.py", line 41, in load_from_file
    state_dict = self.load_state_dict_from_file(path)
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\venv\lib\site-packages\spandrel\__helpers\loader.py", line 60, in load_state_dict_from_file
    state_dict = self._load_pth(path)
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\venv\lib\site-packages\spandrel\__helpers\loader.py", line 82, in _load_pth
    return torch.load(
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\modules\safe.py", line 108, in load
    return load_with_extra(filename, *args, extra_handler=global_extra_handler, **kwargs)
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\modules\safe.py", line 156, in load_with_extra
    return unsafe_torch_load(filename, *args, **kwargs)
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\venv\lib\site-packages\torch\serialization.py", line 1028, in load
    return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\venv\lib\site-packages\torch\serialization.py", line 1246, in _legacy_load
    magic_number = pickle_module.load(f, **pickle_load_args)
  File "C:\Users\username\Desktop\SD\test\stable-diffusion-webui\venv\lib\site-packages\spandrel\__helpers\unpickler.py", line 29, in <lambda>
    load=lambda *args, **kwargs: RestrictedUnpickler(*args, **kwargs).load(),
  File "C:\Users\username\AppData\Local\Programs\Python\Python310\lib\pickle.py", line 1213, in load
    dispatch[key[0]](self)
KeyError: 118
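An aside on that final error: 118 is the ASCII code of the letter "v", which is consistent with the unpickler hitting a Git LFS pointer file (plain text beginning with "version https://git-lfs...") rather than real weights, since that is what raw.githubusercontent.com can serve for LFS-tracked files. A minimal sketch to check what a downloaded .pth actually contains (the function name and its return strings are made up for this example, not webui API):

```python
# Sketch: guess what a downloaded ".pth" actually contains from its first bytes.
LFS_MAGIC = b"version https://git-lfs"  # Git LFS pointer files start with this
ZIP_MAGIC = b"PK\x03\x04"               # zip container written by modern torch.save

def diagnose_pth(path):
    """Return a human-readable guess at the file's real content."""
    with open(path, "rb") as f:
        head = f.read(len(LFS_MAGIC))
    if head.startswith(ZIP_MAGIC):
        return "zip-format torch checkpoint"
    if head.startswith(LFS_MAGIC):
        return "Git LFS pointer file, not real weights"
    if head[:1] == b"<":                # '<' as in '<html>' or '<!DOCTYPE'
        return "HTML error page, not real weights"
    return "unknown format (first bytes: %r)" % head
```

Running this against the broken DAT_x2.pth should make it obvious whether the download produced real weights or a pointer/error page.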
---
> It is because this URL that A1111 uses to download is not valid now. You can download the models from the original page and put them in the "models\DAT" folder.
Thanks for letting me know. I tried the model I downloaded from "pretrained models" at that URL and it worked. It looks like the problem was introduced in #14690.
Hopefully this will be fixed in the next webui release. I will keep this issue open until the code is corrected.
Checklist
What happened?
Simply put, an error occurs when trying to use DAT_x2 with Hires. fix.
Steps to reproduce the problem

1. In txt2img, enable Hires. fix and select "DAT x2" as the upscaler.
2. Generate an image.
What should have happened?
Hires. fix completes and the image is upscaled.
What browsers do you use to access the UI?
Google Chrome
Sysinfo
sysinfo-2024-06-13-21-39.json
Console logs
Additional information
I have recently updated my environment: webui, drivers, Python, etc.