**Closed** · RiepinPavlo closed this 1 year ago
I simply don't use DMDNet because of the transparent gray square alone (which you mentioned), and it tends to upscale the face in a way that looks very artificial. I know it's faster than the others on CPU, but I'd rather use "None" instead. On a side note, for me GFPGAN is the best enhancer, and it's much faster than CodeFormer.
Can't reproduce; it's working here on CPU too.
Describe the bug
```
Launching App
Using provider ['CPUExecutionProvider'] - Device:cpu
Running on local URL: http://127.0.0.1:7860
To create a public link, set share=True in launch().
Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}
find model: C:\Users\Admin/.insightface\models\buffalo_l\1k3d68.onnx landmark_3d_68 ['None', 3, 192, 192] 0.0 1.0
Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}
find model: C:\Users\Admin/.insightface\models\buffalo_l\2d106det.onnx landmark_2d_106 ['None', 3, 192, 192] 0.0 1.0
Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}
find model: C:\Users\Admin/.insightface\models\buffalo_l\det_10g.onnx detection [1, 3, '?', '?'] 127.5 128.0
Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}
find model: C:\Users\Admin/.insightface\models\buffalo_l\genderage.onnx genderage ['None', 3, 96, 96] 0.0 1.0
Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}
find model: C:\Users\Admin/.insightface\models\buffalo_l\w600k_r50.onnx recognition ['None', 3, 112, 112] 127.5 127.5
set det-size: (640, 640)
Error init processor plugin dmdnet...
```
```
Traceback (most recent call last):
  File "E:\nsfw-roop\roop-unleashed\installer\roop-unleashed\chain_img_processor\image.py", line 127, in init_processor
    self.add_processor_to_list(processor_id)
  File "E:\nsfw-roop\roop-unleashed\installer\roop-unleashed\chain_img_processor\image.py", line 113, in add_processor_to_list
    obj.init_plugin()
  File "E:\nsfw-roop\roop-unleashed\installer\roop-unleashed\plugins\plugin_dmdnet.py", line 72, in init_plugin
    create(self.device)
  File "E:\nsfw-roop\roop-unleashed\installer\roop-unleashed\plugins\plugin_dmdnet.py", line 179, in create
    weights = torch.load('./models/DMDNet.pth')
  File "E:\nsfw-roop\roop-unleashed\installer\installer_files\env\lib\site-packages\torch\serialization.py", line 809, in load
    return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
  File "E:\nsfw-roop\roop-unleashed\installer\installer_files\env\lib\site-packages\torch\serialization.py", line 1172, in _load
    result = unpickler.load()
  File "E:\nsfw-roop\roop-unleashed\installer\installer_files\env\lib\site-packages\torch\serialization.py", line 1142, in persistent_load
    typed_storage = load_tensor(dtype, nbytes, key, _maybe_decode_ascii(location))
  File "E:\nsfw-roop\roop-unleashed\installer\installer_files\env\lib\site-packages\torch\serialization.py", line 1116, in load_tensor
    wrap_storage=restore_location(storage, location),
  File "E:\nsfw-roop\roop-unleashed\installer\installer_files\env\lib\site-packages\torch\serialization.py", line 217, in default_restore_location
    result = fn(storage, location)
  File "E:\nsfw-roop\roop-unleashed\installer\installer_files\env\lib\site-packages\torch\serialization.py", line 182, in _cuda_deserialize
    device = validate_cuda_device(location)
  File "E:\nsfw-roop\roop-unleashed\installer\installer_files\env\lib\site-packages\torch\serialization.py", line 166, in validate_cuda_device
    raise RuntimeError('Attempting to deserialize object on a CUDA '
RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.
```

**To Reproduce**
Steps to reproduce the behavior:

**Details**
What OS are you using?
Are you trying to use a GPU? (`--gpu` flag)

**Sanity Check**
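For reference, the traceback shows that the DMDNet checkpoint was saved from a GPU, and `torch.load('./models/DMDNet.pth')` at `plugin_dmdnet.py` line 179 tries to restore its tensors onto CUDA, which fails on a CPU-only machine. A minimal sketch of the usual workaround suggested by the error message itself (the helper name `load_weights` is hypothetical; the actual plugin code may differ):

```python
import torch

def load_weights(path: str):
    """Load a checkpoint safely on machines with or without CUDA.

    map_location remaps every storage in the checkpoint to the chosen
    device, so a CUDA-saved file can be loaded on a CPU-only machine.
    """
    device = "cuda" if torch.cuda.is_available() else "cpu"
    return torch.load(path, map_location=torch.device(device))
```

Applied to the failing line, this would mean changing `torch.load('./models/DMDNet.pth')` to pass `map_location` instead of relying on the device the checkpoint was saved from.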