nod-ai / SHARK-Studio

SHARK Studio -- Web UI for SHARK+IREE High Performance Machine Learning Distribution
Apache License 2.0

Is there a braindead-level guide to installing this GUI? #637

Open ghost opened 1 year ago

ghost commented 1 year ago

I uh... need help. I get this message when I execute this:

(shark.venv) PS C:\Windows\system32\SHARK\web> python index.py

Traceback (most recent call last):
  File "C:\Windows\system32\SHARK\web\index.py", line 3, in <module>
    from models.stable_diffusion.main import stable_diff_inf
  File "C:\Windows\system32\SHARK\web\models\stable_diffusion\main.py", line 4, in <module>
    from models.stable_diffusion.cache_objects import (
  File "C:\Windows\system32\SHARK\web\models\stable_diffusion\cache_objects.py", line 52, in <module>
    ) = (get_vae(args), get_unet(args), get_clip(args))
  File "C:\Windows\system32\SHARK\web\models\stable_diffusion\opt_params.py", line 69, in get_vae
    return get_shark_model(args, bucket, model_name, iree_flags)
  File "C:\Windows\system32\SHARK\web\models\stable_diffusion\utils.py", line 52, in get_shark_model
    return _compile_module(args, shark_module, model_name, extra_args)
  File "C:\Windows\system32\SHARK\web\models\stable_diffusion\utils.py", line 30, in _compile_module
    shark_module.load_module(vmfb_path)
  File "C:\Windows\System32\SHARK\shark\shark_inference.py", line 209, in load_module
    ) = load_flatbuffer(
  File "C:\Windows\System32\SHARK\shark\iree_utils\compile_utils.py", line 308, in load_flatbuffer
    return get_iree_module(flatbuffer_blob, device, func_name)
  File "C:\Windows\System32\SHARK\shark\iree_utils\compile_utils.py", line 281, in get_iree_module
    ctx.add_vm_module(vm_module)
  File "C:\Windows\system32\SHARK\shark.venv\lib\site-packages\iree\runtime\system_api.py", line 255, in add_vm_module
    self.add_vm_modules((vm_module,))
  File "C:\Windows\system32\SHARK\shark.venv\lib\site-packages\iree\runtime\system_api.py", line 252, in add_vm_modules
    self._vm_context.register_modules(vm_modules)
RuntimeError: Error registering modules: D:\a\SHARK-Runtime\SHARK-Runtime\c\runtime\src\iree\hal\drivers\vulkan\native_executable.cc:157: UNKNOWN; VkResult=4294967283; while invoking native function hal.executable.create; while calling import;
[ 1]   native hal.executable.create:0 -
[ 0] bytecode module.__init:3000 <stdin>:667:12
      at <stdin>:21:3

What do I do? I've been digging through the whole internet looking for some way to run this on my 5700 XT, and it's given me so much grief that I'm near giving up.
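For what it's worth, the VkResult in that traceback is just the unsigned print-out of a negative Vulkan error code. A minimal sketch to decode it (plain Python, nothing SHARK-specific):

```python
# Decode the VkResult from the traceback above. Vulkan error codes are
# signed 32-bit values, but the log prints the raw unsigned representation.
import ctypes

raw = 4294967283  # value reported in the traceback
print(ctypes.c_int32(raw).value)  # -13, which the Vulkan spec defines as VK_ERROR_UNKNOWN
```

That -13 / VK_ERROR_UNKNOWN matches the "UNKNOWN" in the message: the Vulkan driver rejected the compiled shader without giving a specific reason, which is consistent with the driver-update attempts later in this thread.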

ghost commented 1 year ago

I'm going to try obliterating the directory and reattempting the install.

DOING SO MADE IT MUCH WORSE

PS C:\Windows\system32> set-executionpolicy remotesigned

Execution Policy Change
The execution policy helps protect you from scripts that you do not trust. Changing the execution policy might expose you to the security risks described in the about_Execution_Policies help topic at https:/go.microsoft.com/fwlink/?LinkID=135170. Do you want to change the execution policy?
[Y] Yes  [A] Yes to All  [N] No  [L] No to All  [S] Suspend  [?] Help (default is "N"): Y

PS C:\Windows\system32> shark_sd_20221214_385.exe
shark_tank local cache is located at C:\Users\Dire\.local/shark_tank/ . You may change this by setting the --local_tank_cache= flag
Using cached models from C:\Users\Dire\.local/shark_tank/...
Loading flatbuffer from C:\Windows\system32\vae_8dec_fp16_vulkan.vmfb
Traceback (most recent call last):
  File "index.py", line 3, in <module>
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "PyInstaller\loader\pyimod02_importers.py", line 499, in exec_module
  File "models\stable_diffusion\main.py", line 4, in <module>
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "PyInstaller\loader\pyimod02_importers.py", line 499, in exec_module
  File "models\stable_diffusion\cache_objects.py", line 76, in <module>
  File "models\stable_diffusion\opt_params.py", line 69, in get_vae
  File "models\stable_diffusion\utils.py", line 48, in get_shark_model
  File "models\stable_diffusion\utils.py", line 30, in _compile_module
  File "shark\shark_inference.py", line 209, in load_module
  File "shark\iree_utils\compile_utils.py", line 308, in load_flatbuffer
  File "shark\iree_utils\compile_utils.py", line 281, in get_iree_module
  File "iree\runtime\system_api.py", line 255, in add_vm_module
  File "iree\runtime\system_api.py", line 252, in add_vm_modules
RuntimeError: Error registering modules: D:\a\SHARK-Runtime\SHARK-Runtime\c\runtime\src\iree\hal\drivers\vulkan\native_executable.cc:157: UNKNOWN; VkResult=4294967283; while invoking native function hal.executable.create; while calling import;
[ 1]   native hal.executable.create:0 -
[ 0] bytecode module.__init:3000 <stdin>:667:12
      at <stdin>:21:3
[16448] Failed to execute script 'index' due to unhandled exception!
PS C:\Windows\system32>
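To rule out a stale cache without deleting anything by hand, the log above mentions a --local_tank_cache= flag. A minimal sketch that relaunches the exe pointed at a fresh directory (the cache path here is just an example):

```python
# Relaunch the bundled exe with the --local_tank_cache= flag its own log
# mentions, pointed at an empty directory so stale cached artifacts are
# ruled out. The cache path is hypothetical; use any empty folder.
import subprocess

subprocess.run(
    [r".\shark_sd_20221214_385.exe", r"--local_tank_cache=C:\shark_tank_fresh"],
    check=False,
)
```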

ghost commented 1 year ago

Okay, getting further: git clone into a directory, then shove the executable inside it and run it.

ghost commented 1 year ago

Vulkan Instance Version: 1.3.204

[two screenshots attached]

Very confused...

ghost commented 1 year ago

Attempting to see if Updating AMD Adrenalin helps at all

ghost commented 1 year ago

I should probably actually print out the whole vulkaninfo. Jank solution, but here it is: vulkaninfo.txt
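For anyone wanting to capture the same dump, a minimal sketch, assuming the vulkaninfo tool from the Vulkan SDK (or the one bundled with AMD's drivers) is on PATH:

```python
# Dump the full vulkaninfo report to a text file so it can be attached
# to the issue, as done above.
import subprocess

with open("vulkaninfo.txt", "w") as f:
    subprocess.run(["vulkaninfo"], stdout=f, stderr=subprocess.STDOUT, check=False)
```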

ghost commented 1 year ago

Yay.

(shark.venv) PS C:\Users\Dire\Desktop\S H A R K Y> python generate_sharktank.py
Some weights of BertForSequenceClassification were not initialized from the model checkpoint at microsoft/MiniLM-L12-H384-uncased and are newly initialized: ['classifier.bias', 'classifier.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Traceback (most recent call last):
  File "C:\Users\Dire\Desktop\S H A R K Y\generate_sharktank.py", line 240, in <module>
    save_torch_model(args.torch_model_csv)
  File "C:\Users\Dire\Desktop\S H A R K Y\generate_sharktank.py", line 80, in save_torch_model
    mlir_importer.import_debug(
  File "C:\Users\Dire\Desktop\S H A R K Y\shark\shark_importer.py", line 169, in import_debug
    imported_mlir = self.import_mlir(
  File "C:\Users\Dire\Desktop\S H A R K Y\shark\shark_importer.py", line 113, in import_mlir
    return self._torch_mlir(is_dynamic, tracing_required), func_name
  File "C:\Users\Dire\Desktop\S H A R K Y\shark\shark_importer.py", line 72, in _torch_mlir
    from shark.torch_mlir_utils import get_torch_mlir_module
  File "C:\Users\Dire\Desktop\S H A R K Y\shark\torch_mlir_utils.py", line 15, in <module>
    from torch_mlir.ir import StringAttr
  File "C:\Users\Dire\Desktop\S H A R K Y\shark.venv\lib\site-packages\torch_mlir\__init__.py", line 15, in <module>
    from torch_mlir.passmanager import PassManager
  File "C:\Users\Dire\Desktop\S H A R K Y\shark.venv\lib\site-packages\torch_mlir\passmanager.py", line 5, in <module>
    from ._mlir_libs._mlir.passmanager import *
  File "C:\Users\Dire\Desktop\S H A R K Y\shark.venv\lib\site-packages\torch_mlir\_mlir_libs\__init__.py", line 107, in <module>
    _site_initialize()
  File "C:\Users\Dire\Desktop\S H A R K Y\shark.venv\lib\site-packages\torch_mlir\_mlir_libs\__init__.py", line 56, in _site_initialize
    from ._mlir import ir
ImportError: DLL load failed while importing _mlir: The specified module could not be found.
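On Windows, a "DLL load failed" on a native extension like _mlir usually means either the binaries are missing from the installed wheel or one of their dependencies (commonly the Microsoft Visual C++ runtime) is not installed. A minimal diagnostic sketch, not part of SHARK, that checks the first possibility without tripping the same failing import:

```python
# List the native libraries shipped inside torch_mlir's _mlir_libs directory.
# find_spec locates the package without running torch_mlir/__init__.py, which
# is what triggers the failing import in the traceback above.
import importlib.util
import os

spec = importlib.util.find_spec("torch_mlir")
libs_dir = os.path.join(os.path.dirname(spec.origin), "_mlir_libs")
print("Looking in:", libs_dir)
for name in sorted(os.listdir(libs_dir)):
    if name.endswith((".pyd", ".dll")):
        print(" ", name)
```

If the _mlir binaries are listed but the import still fails, the missing piece is more likely a dependency DLL such as the MSVC redistributable rather than the wheel itself.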

ghost commented 1 year ago

Something is missing from Hugging Face...

from models.stable_diffusion.main import stable_diff_inf

That line is killing the script.
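If the missing Hugging Face piece is gated Stable Diffusion weights, logging in with an access token is usually what unblocks the download. A minimal sketch, assuming huggingface_hub is available in the venv (it typically comes in with diffusers):

```python
# Authenticate to Hugging Face so gated Stable Diffusion checkpoints can be
# downloaded. Create a token at https://huggingface.co/settings/tokens first.
from huggingface_hub import login

login(token="hf_...")  # replace with your own token
```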

ghost commented 1 year ago

I have come to the conclusion that I am helplessly stranded.

averad commented 1 year ago

@Jamesedward11 try:

https://github.com/nod-ai/SHARK/blob/main/shark/examples/shark_inference/stable_diffusion/stable_diffusion_amd.md

Clear out any files in C:\Users\Dire\.local\shark_tank\ and C:\Users\Dire\AppData\Local\AMD\VkCache\
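A minimal sketch of that cleanup, using the paths from this thread (adjust the user name for your own machine):

```python
# Wipe the SHARK model cache and AMD's Vulkan shader cache so both are
# regenerated on the next run, as suggested above.
import shutil
from pathlib import Path

for cache in (Path(r"C:\Users\Dire\.local\shark_tank"),
              Path(r"C:\Users\Dire\AppData\Local\AMD\VkCache")):
    if cache.exists():
        shutil.rmtree(cache)
        print("removed", cache)
```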