openvinotoolkit / stable-diffusion-webui

Stable Diffusion web UI
GNU Affero General Public License v3.0
247 stars 39 forks

[Bug]: Interface hangs when trying to use openvino webui with OpenSUSE + A770 #108

Open compellingbytes opened 1 month ago

compellingbytes commented 1 month ago

Is there an existing issue for this?

What happened?

I tried installing the OpenVINO Automatic1111 webui on openSUSE Tumbleweed just to see if it would work, and it mostly does: I was able to generate an image with the prompt "Messy Room" using my CPU (an i5-13500). But when I try to switch to my GPU, an Intel Arc A770 LE, the UI hangs. It has been trying to generate the same "messy room" image for about 45 minutes now, with nothing to show for it.

(Screenshot: StableDiffIntelArcOpenVINOHang)

Steps to reproduce the problem

I'm using openSUSE Tumbleweed with kernel 6.8.7-1-default. Setup steps:

- Used pyenv to install Python 3.10.6
- Followed the directions to install the webui
- Pip-installed torch 2.1.0 and torchvision 0.16.0, as per the solution given in issue #81, after running into that issue (a quick version check is sketched after these steps)

Switched to the OpenVINO GPU device and tried to run the prompt "Messy room".
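For reference, here's the minimal version check I can run from inside the webui's virtualenv (sd_env in my install) to confirm the interpreter really picks up the versions from issue #81; nothing here is webui-specific, it just prints what's importable:

```python
# Confirm the versions in use match what issue #81 recommends.
# Run from inside the webui's virtualenv (sd_env in my install).
import torch
import torchvision
from openvino.runtime import get_version

print("torch      :", torch.__version__)        # expecting 2.1.0 (shows up as 2.1.0+cu121 here)
print("torchvision:", torchvision.__version__)  # expecting 0.16.0
print("openvino   :", get_version())            # 2023.2.0 according to the sysinfo dump
```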

What should have happened?

An image should've generated in a few seconds.

Sysinfo

{ "Platform": "Linux-6.8.7-1-default-x86_64-with-glibc2.39", "Python": "3.10.6", "Version": "1.6.0", "Commit": "e5a634da06c62d72dbdc764b16c65ef3408aa588", "Script path": "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui", "Data path": "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui", "Extensions dir": "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/extensions", "Checksum": "516631b0530ed47f6ea3a2f72e7663904cadcace2e116f7e6c5780db3c3927e9", "Commandline": [ "launch.py", "--skip-torch-cuda-test", "--precision", "full", "--no-half" ], "Torch env info": { "torch_version": "2.1.0+cu121", "is_debug_build": "False", "cuda_compiled_version": "12.1", "gcc_version": "(SUSE Linux) 13.2.1 20240206 [revision 67ac78caf31f7cb3202177e6428a46d829b70f23]", "clang_version": null, "cmake_version": "version 3.29.2", "os": "openSUSE Tumbleweed (x86_64)", "libc_version": "glibc-2.39", "python_version": "3.10.6 (main, May 8 2024, 04:11:57) [GCC 13.2.1 20240206 [revision 67ac78caf31f7cb3202177e6428a46d829b70f23]] (64-bit runtime)", "python_platform": "Linux-6.8.7-1-default-x86_64-with-glibc2.39", "is_cuda_available": "False", "cuda_runtime_version": null, "cuda_module_loading": "N/A", "nvidia_driver_version": null, "nvidia_gpu_models": null, "cudnn_version": null, "pip_version": "pip3", "pip_packages": [ "numpy==1.23.5", "open-clip-torch==2.20.0", "pytorch-lightning==1.9.4", "torch==2.1.0", "torchdiffeq==0.2.3", "torchmetrics==1.4.0", "torchsde==0.2.5", "torchvision==0.16.0", "triton==2.1.0" ], "conda_packages": null, "hip_compiled_version": "N/A", "hip_runtime_version": "N/A", "miopen_runtime_version": "N/A", "caching_allocator_config": "", "is_xnnpack_available": "True", "cpu_info": [ "Architecture: x86_64", "CPU op-mode(s): 32-bit, 64-bit", "Address sizes: 46 bits physical, 48 bits virtual", "Byte Order: Little Endian", "CPU(s): 20", "On-line CPU(s) list: 0-19", "Vendor ID: GenuineIntel", "Model name: 13th Gen Intel(R) Core(TM) i5-13500", "CPU family: 6", "Model: 191", "Thread(s) per core: 2", "Core(s) per socket: 14", "Socket(s): 1", "Stepping: 2", "CPU(s) scaling MHz: 82%", "CPU max MHz: 4800.0000", "CPU min MHz: 800.0000", "BogoMIPS: 4993.00", "Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf tsc_known_freq pni pclmulqdq dtes64 monitor ds_cpl vmx smx est tm2 ssse3 sdbg fma cx16 xtpr pdcm sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm 3dnowprefetch cpuid_fault epb ssbd ibrs ibpb stibp ibrs_enhanced tpr_shadow flexpriority ept vpid ept_ad fsgsbase tsc_adjust bmi1 avx2 smep bmi2 erms invpcid rdseed adx smap clflushopt clwb intel_pt sha_ni xsaveopt xsavec xgetbv1 xsaves split_lock_detect avx_vnni dtherm ida arat pln pts hwp hwp_notify hwp_act_window hwp_epp hwp_pkg_req hfi vnmi umip pku ospke waitpkg gfni vaes vpclmulqdq tme rdpid movdiri movdir64b fsrm md_clear serialize pconfig arch_lbr ibt flush_l1d arch_capabilities", "Virtualization: VT-x", "L1d cache: 544 KiB (14 instances)", "L1i cache: 704 KiB (14 instances)", "L2 cache: 11.5 MiB (8 instances)", "L3 cache: 24 MiB (1 instance)", "NUMA node(s): 1", "NUMA node0 CPU(s): 0-19", "Vulnerability Gather data sampling: Not affected", "Vulnerability Itlb multihit: Not affected", "Vulnerability L1tf: Not affected", "Vulnerability 
Mds: Not affected", "Vulnerability Meltdown: Not affected", "Vulnerability Mmio stale data: Not affected", "Vulnerability Reg file data sampling: Mitigation; Clear Register File", "Vulnerability Retbleed: Not affected", "Vulnerability Spec rstack overflow: Not affected", "Vulnerability Spec store bypass: Mitigation; Speculative Store Bypass disabled via prctl", "Vulnerability Spectre v1: Mitigation; usercopy/swapgs barriers and __user pointer sanitization", "Vulnerability Spectre v2: Mitigation; Enhanced / Automatic IBRS; IBPB conditional; RSB filling; PBRSB-eIBRS SW sequence; BHI BHI_DIS_S", "Vulnerability Srbds: Not affected", "Vulnerability Tsx async abort: Not affected" ] }, "Exceptions": [], "CPU": { "model": "x86_64", "count logical": 20, "count physical": 14 }, "RAM": { "total": "31GB", "used": "20GB", "free": "680MB", "active": "4GB", "inactive": "23GB", "buffers": "2MB", "cached": "10GB", "shared": "2GB" }, "Extensions": [], "Inactive extensions": [], "Environment": { "COMMANDLINE_ARGS": "--skip-torch-cuda-test --precision full --no-half", "GIT": "git", "GRADIO_ANALYTICS_ENABLED": "False" }, "Config": { "samples_save": true, "samples_format": "png", "samples_filename_pattern": "", "save_images_add_number": true, "grid_save": true, "grid_format": "png", "grid_extended_filename": false, "grid_only_if_multiple": true, "grid_prevent_empty_spots": false, "grid_zip_filename_pattern": "", "n_rows": -1, "font": "", "grid_text_active_color": "#000000", "grid_text_inactive_color": "#999999", "grid_background_color": "#ffffff", "enable_pnginfo": true, "save_txt": false, "save_images_before_face_restoration": false, "save_images_before_highres_fix": false, "save_images_before_color_correction": false, "save_mask": false, "save_mask_composite": false, "jpeg_quality": 80, "webp_lossless": false, "export_for_4chan": true, "img_downscale_threshold": 4.0, "target_side_length": 4000, "img_max_size_mp": 200, "use_original_name_batch": true, "use_upscaler_name_as_suffix": false, "save_selected_only": true, "save_init_img": false, "temp_dir": "", "clean_temp_dir_at_start": false, "save_incomplete_images": false, "outdir_samples": "", "outdir_txt2img_samples": "outputs/txt2img-images", "outdir_img2img_samples": "outputs/img2img-images", "outdir_extras_samples": "outputs/extras-images", "outdir_grids": "", "outdir_txt2img_grids": "outputs/txt2img-grids", "outdir_img2img_grids": "outputs/img2img-grids", "outdir_save": "log/images", "outdir_init_images": "outputs/init-images", "save_to_dirs": true, "grid_save_to_dirs": true, "use_save_to_dirs_for_ui": false, "directories_filename_pattern": "[date]", "directories_max_prompt_words": 8, "ESRGAN_tile": 192, "ESRGAN_tile_overlap": 8, "realesrgan_enabled_models": [ "R-ESRGAN 4x+", "R-ESRGAN 4x+ Anime6B" ], "upscaler_for_img2img": null, "face_restoration": false, "face_restoration_model": "CodeFormer", "code_former_weight": 0.5, "face_restoration_unload": false, "auto_launch_browser": "Local", "show_warnings": false, "show_gradio_deprecation_warnings": true, "memmon_poll_rate": 8, "samples_log_stdout": false, "multiple_tqdm": true, "print_hypernet_extra": false, "list_hidden_files": true, "disable_mmap_load_safetensors": false, "hide_ldm_prints": true, "api_enable_requests": true, "api_forbid_local_requests": true, "api_useragent": "", "unload_models_when_training": false, "pin_memory": false, "save_optimizer_state": false, "save_training_settings_to_txt": true, "dataset_filename_word_regex": "", "dataset_filename_join_string": " ", 
"training_image_repeats_per_epoch": 1, "training_write_csv_every": 500, "training_xattention_optimizations": false, "training_enable_tensorboard": false, "training_tensorboard_save_images": false, "training_tensorboard_flush_every": 120, "sd_model_checkpoint": "v1-5-pruned-emaonly.safetensors [6ce0161689]", "sd_checkpoints_limit": 1, "sd_checkpoints_keep_in_cpu": true, "sd_checkpoint_cache": 0, "sd_unet": "Automatic", "enable_quantization": false, "enable_emphasis": true, "enable_batch_seeds": true, "comma_padding_backtrack": 20, "CLIP_stop_at_last_layers": 1, "upcast_attn": false, "randn_source": "GPU", "tiling": false, "hires_fix_refiner_pass": "second pass", "sdxl_crop_top": 0, "sdxl_crop_left": 0, "sdxl_refiner_low_aesthetic_score": 2.5, "sdxl_refiner_high_aesthetic_score": 6.0, "sd_vae_explanation": "VAE is a neural network that transforms a standard RGB\nimage into latent space representation and back. Latent space representation is what stable diffusion is working on during sampling\n(i.e. when the progress bar is between empty and full). For txt2img, VAE is used to create a resulting image after the sampling is finished.\nFor img2img, VAE is used to process user's input image before the sampling, and to create an image after sampling.", "sd_vae_checkpoint_cache": 0, "sd_vae": "Automatic", "sd_vae_overrides_per_model_preferences": true, "auto_vae_precision": true, "sd_vae_encode_method": "Full", "sd_vae_decode_method": "Full", "inpainting_mask_weight": 1.0, "initial_noise_multiplier": 1.0, "img2img_extra_noise": 0.0, "img2img_color_correction": false, "img2img_fix_steps": false, "img2img_background_color": "#ffffff", "img2img_editor_height": 720, "img2img_sketch_default_brush_color": "#ffffff", "img2img_inpaint_mask_brush_color": "#ffffff", "img2img_inpaint_sketch_default_brush_color": "#ffffff", "return_mask": false, "return_mask_composite": false, "cross_attention_optimization": "Automatic", "s_min_uncond": 0.0, "token_merging_ratio": 0.0, "token_merging_ratio_img2img": 0.0, "token_merging_ratio_hr": 0.0, "pad_cond_uncond": false, "persistent_cond_cache": true, "batch_cond_uncond": true, "use_old_emphasis_implementation": false, "use_old_karras_scheduler_sigmas": false, "no_dpmpp_sde_batch_determinism": false, "use_old_hires_fix_width_height": false, "dont_fix_second_order_samplers_schedule": false, "hires_fix_use_firstpass_conds": false, "use_old_scheduling": false, "interrogate_keep_models_in_memory": false, "interrogate_return_ranks": false, "interrogate_clip_num_beams": 1, "interrogate_clip_min_length": 24, "interrogate_clip_max_length": 48, "interrogate_clip_dict_limit": 1500, "interrogate_clip_skip_categories": [], "interrogate_deepbooru_score_threshold": 0.5, "deepbooru_sort_alpha": true, "deepbooru_use_spaces": true, "deepbooru_escape": true, "deepbooru_filter_tags": "", "extra_networks_show_hidden_directories": true, "extra_networks_hidden_models": "When searched", "extra_networks_default_multiplier": 1.0, "extra_networks_card_width": 0, "extra_networks_card_height": 0, "extra_networks_card_text_scale": 1.0, "extra_networks_card_show_desc": true, "extra_networks_add_text_separator": " ", "ui_extra_networks_tab_reorder": "", "textual_inversion_print_at_load": false, "textual_inversion_add_hashes_to_infotext": true, "sd_hypernetwork": "None", "localization": "None", "gradio_theme": "Default", "gradio_themes_cache": true, "gallery_height": "", "return_grid": true, "do_not_show_images": false, "send_seed": true, "send_size": true, "js_modal_lightbox": true, 
"js_modal_lightbox_initially_zoomed": true, "js_modal_lightbox_gamepad": false, "js_modal_lightbox_gamepad_repeat": 250, "show_progress_in_title": true, "samplers_in_dropdown": true, "dimensions_and_batch_together": true, "keyedit_precision_attention": 0.1, "keyedit_precision_extra": 0.05, "keyedit_delimiters": ".,\/!?%^*;:{}=`~()", "keyedit_move": true, "quicksettings_list": [ "sd_model_checkpoint" ], "ui_tab_order": [], "hidden_tabs": [], "ui_reorder_list": [], "hires_fix_show_sampler": false, "hires_fix_show_prompts": false, "disable_token_counters": false, "add_model_hash_to_info": true, "add_model_name_to_info": true, "add_user_name_to_info": false, "add_version_to_infotext": true, "disable_weights_auto_swap": true, "infotext_styles": "Apply if any", "show_progressbar": true, "live_previews_enable": true, "live_previews_image_format": "png", "show_progress_grid": true, "show_progress_every_n_steps": 10, "show_progress_type": "Approx NN", "live_preview_allow_lowvram_full": false, "live_preview_content": "Prompt", "live_preview_refresh_period": 1000, "live_preview_fast_interrupt": false, "hide_samplers": [], "eta_ddim": 0.0, "eta_ancestral": 1.0, "ddim_discretize": "uniform", "s_churn": 0.0, "s_tmin": 0.0, "s_tmax": 0.0, "s_noise": 1.0, "k_sched_type": "Automatic", "sigma_min": 0.0, "sigma_max": 0.0, "rho": 0.0, "eta_noise_seed_delta": 0, "always_discard_next_to_last_sigma": false, "sgm_noise_multiplier": false, "uni_pc_variant": "bh1", "uni_pc_skip_type": "time_uniform", "uni_pc_order": 3, "uni_pc_lower_order_final": true, "postprocessing_enable_in_main_ui": [], "postprocessing_operation_order": [], "upscaling_max_images_in_cache": 5, "disabled_extensions": [], "disable_all_extensions": "none", "restore_config_state_file": "", "sd_checkpoint_hash": "6ce0161689b3853acaa03779ec93eafe75a02f4ced659bee03f50797806fa2fa" }, "Startup": { "total": 4.738841533660889, "records": { "initial startup": 0.0004184246063232422, "prepare environment/checks": 2.9802322387695312e-05, "prepare environment/git version info": 0.0025970935821533203, "prepare environment/torch GPU test": 0.00024080276489257812, "prepare environment/clone repositores": 0.012154340744018555, "prepare environment/run extensions installers": 0.0002231597900390625, "prepare environment": 0.04314088821411133, "launcher": 0.0003376007080078125, "import torch": 2.1142807006835938, "import gradio": 0.39220762252807617, "setup paths": 0.5379493236541748, "import ldm": 0.0017533302307128906, "import sgm": 2.6226043701171875e-06, "initialize shared": 0.026759624481201172, "other imports": 0.29731321334838867, "opts onchange": 0.0002048015594482422, "setup SD model": 0.00014400482177734375, "setup codeformer": 0.0439298152923584, "setup gfpgan": 0.00650477409362793, "set samplers": 2.5510787963867188e-05, "list extensions": 5.841255187988281e-05, "restore config state file": 3.814697265625e-06, "list SD models": 0.00019359588623046875, "list localizations": 5.340576171875e-05, "load scripts/custom_code.py": 0.00034546852111816406, "load scripts/img2imgalt.py": 0.00016951560974121094, "load scripts/loopback.py": 7.224082946777344e-05, "load scripts/openvino_accelerate.py": 0.3756701946258545, "load scripts/outpainting_mk_2.py": 0.00016808509826660156, "load scripts/poor_mans_outpainting.py": 8.630752563476562e-05, "load scripts/postprocessing_codeformer.py": 6.413459777832031e-05, "load scripts/postprocessing_gfpgan.py": 5.0067901611328125e-05, "load scripts/postprocessing_upscale.py": 0.00010442733764648438, "load 
scripts/prompt_matrix.py": 8.463859558105469e-05, "load scripts/prompts_from_file.py": 8.20159912109375e-05, "load scripts/refiner.py": 0.00014543533325195312, "load scripts/sd_upscale.py": 7.724761962890625e-05, "load scripts/seed.py": 8.487701416015625e-05, "load scripts/xyz_grid.py": 0.0010390281677246094, "load scripts/ldsr_model.py": 0.15517139434814453, "load scripts/lora_script.py": 0.07253193855285645, "load scripts/scunet_model.py": 0.012982845306396484, "load scripts/swinir_model.py": 0.01287388801574707, "load scripts/hotkey_config.py": 9.72747802734375e-05, "load scripts/extra_options_section.py": 0.00010776519775390625, "load scripts": 0.6320576667785645, "load upscalers": 0.002237081527709961, "refresh VAE": 0.0005240440368652344, "refresh textual inversion templates": 3.123283386230469e-05, "scripts list_optimizers": 0.00016188621520996094, "scripts list_unets": 6.9141387939453125e-06, "reload hypernetworks": 0.0037238597869873047, "initialize extra networks": 0.004498720169067383, "scripts before_ui_callback": 0.00033664703369140625, "create ui": 0.2935323715209961, "gradio launch": 0.358872652053833, "add APIs": 0.005404233932495117, "app_started_callback/lora_script.py": 0.00019073486328125, "app_started_callback": 0.0001919269561767578 } }, "Packages": [ "absl-py==2.1.0", "accelerate==0.21.0", "addict==2.4.0", "aenum==3.1.15", "aiofiles==23.2.1", "aiohttp==3.9.5", "aiosignal==1.3.1", "altair==5.3.0", "antlr4-python3-runtime==4.9.3", "anyio==3.7.1", "async-timeout==4.0.3", "attrs==23.2.0", "basicsr==1.4.2", "beautifulsoup4==4.12.3", "blendmodes==2023", "boltons==24.0.0", "certifi==2024.2.2", "charset-normalizer==3.3.2", "clean-fid==0.1.35", "click==8.1.7", "clip==1.0", "cmake==3.29.2", "colorama==0.4.6", "contourpy==1.2.1", "cycler==0.12.1", "deprecation==2.1.0", "diffusers==0.23.0", "einops==0.4.1", "exceptiongroup==1.2.1", "facexlib==0.3.0", "fastapi==0.94.0", "ffmpy==0.3.2", "filelock==3.14.0", "filterpy==1.4.5", "fonttools==4.51.0", "frozenlist==1.4.1", "fsspec==2024.3.1", "ftfy==6.2.0", "future==1.0.0", "gdown==5.1.0", "gfpgan==1.3.8", "gitdb==4.0.11", "gitpython==3.1.37", "gradio-client==0.5.0", "gradio==3.41.2", "grpcio==1.63.0", "h11==0.12.0", "httpcore==0.15.0", "httpx==0.24.1", "huggingface-hub==0.23.0", "idna==3.7", "imageio==2.34.1", "importlib-metadata==7.1.0", "importlib-resources==6.4.0", "inflection==0.5.1", "invisible-watermark==0.2.0", "jinja2==3.1.4", "jsonmerge==1.8.0", "jsonschema-specifications==2023.12.1", "jsonschema==4.22.0", "kiwisolver==1.4.5", "kornia==0.6.7", "lark==1.1.2", "lazy-loader==0.4", "lightning-utilities==0.11.2", "lit==18.1.4", "llvmlite==0.42.0", "lmdb==1.4.1", "lpips==0.1.4", "markdown==3.6", "markupsafe==2.1.5", "matplotlib==3.8.4", "mpmath==1.3.0", "multidict==6.0.5", "networkx==3.3", "numba==0.59.1", "numpy==1.23.5", "nvidia-cublas-cu12==12.1.3.1", "nvidia-cuda-cupti-cu12==12.1.105", "nvidia-cuda-nvrtc-cu12==12.1.105", "nvidia-cuda-runtime-cu12==12.1.105", "nvidia-cudnn-cu12==8.9.2.26", "nvidia-cufft-cu12==11.0.2.54", "nvidia-curand-cu12==10.3.2.106", "nvidia-cusolver-cu12==11.4.5.107", "nvidia-cusparse-cu12==12.1.0.106", "nvidia-nccl-cu12==2.18.1", "nvidia-nvjitlink-cu12==12.4.127", "nvidia-nvtx-cu12==12.1.105", "omegaconf==2.2.3", "open-clip-torch==2.20.0", "opencv-python==4.9.0.80", "openvino-telemetry==2024.1.0", "openvino==2023.2.0", "orjson==3.10.3", "packaging==24.0", "pandas==2.2.2", "piexif==1.1.3", "pillow==10.0.1", "pip==24.0", "platformdirs==4.2.1", "pretty-errors==1.2.25", "protobuf==3.20.0", "psutil==5.9.5", 
"pydantic==1.10.15", "pydub==0.25.1", "pyparsing==3.1.2", "pysocks==1.7.1", "python-dateutil==2.9.0.post0", "python-multipart==0.0.9", "pytorch-lightning==1.9.4", "pytz==2024.1", "pywavelets==1.6.0", "pyyaml==6.0.1", "realesrgan==0.3.0", "referencing==0.35.1", "regex==2024.4.28", "requests==2.31.0", "resize-right==0.0.2", "rpds-py==0.18.1", "safetensors==0.3.1", "scikit-image==0.21.0", "scipy==1.13.0", "semantic-version==2.10.0", "sentencepiece==0.2.0", "setuptools==63.2.0", "six==1.16.0", "smmap==5.0.1", "sniffio==1.3.1", "soupsieve==2.5", "starlette==0.26.1", "sympy==1.12", "tb-nightly==2.17.0a20240507", "tensorboard-data-server==0.7.2", "tifffile==2024.5.3", "timm==0.9.2", "tokenizers==0.13.3", "tomesd==0.1.3", "tomli==2.0.1", "toolz==0.12.1", "torch==2.1.0", "torchdiffeq==0.2.3", "torchmetrics==1.4.0", "torchsde==0.2.5", "torchvision==0.16.0", "tqdm==4.66.4", "trampoline==0.1.2", "transformers==4.30.2", "triton==2.1.0", "typing-extensions==4.11.0", "tzdata==2024.1", "urllib3==2.2.1", "uvicorn==0.29.0", "wcwidth==0.2.13", "websockets==11.0.3", "werkzeug==3.0.3", "yapf==0.40.2", "yarl==1.9.4", "zipp==3.18.1" ] }

What browsers do you use to access the UI?

Firefox

Console logs

If you can tell me how to generate/where to find logs for this issue, I'll gladly look for them and add them here.

Additional information

Forgive the directory I installed this in; I kind of did it just to see if it would work, and it's almost there. I hope someone can help me get this running on openSUSE Tumbleweed; I imagine it's some tiny tweak.

I'm guessing it's possibly an issue related to the torch/torchvision versions? I'll also try updating them now and running the prompt again when I wake up.
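Before updating anything, it's probably worth checking whether OpenVINO can even enumerate the A770 from inside the virtualenv. A rough sketch, assuming the sd_env venv and the openvino 2023.2.0 that the webui installed ("GPU" is just OpenVINO's device name, not anything webui-specific):

```python
# Can OpenVINO see the Arc A770 as a "GPU" device at all?
from openvino.runtime import Core

core = Core()
print("available devices:", core.available_devices)  # hoping for something like ['CPU', 'GPU']

for dev in core.available_devices:
    # FULL_DEVICE_NAME is a standard read-only device property
    print(dev, "->", core.get_property(dev, "FULL_DEVICE_NAME"))
```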

compellingbytes commented 1 month ago

Here's the readout I got after hitting Ctrl+C:

File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/concurrent/futures/process.py", line 709, in _spawn_process
  p.start()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/process.py", line 121, in start
  self._popen = self._Popen(self)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/context.py", line 281, in _Popen
  return Popen(process_obj)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/popen_fork.py", line 19, in __init__
  self._launch(process_obj)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/popen_fork.py", line 71, in _launch
  code = process_obj._bootstrap(parent_sentinel=child_r)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
  self.run()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/process.py", line 108, in run
  self._target(*self._args, **self._kwargs)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/concurrent/futures/process.py", line 240, in _process_worker
  call_item = call_queue.get(block=True)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/queues.py", line 102, in get
  with self._rlock:
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/synchronize.py", line 95, in __enter__
  return self._semlock.__enter__()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize_util.py", line 156, in sigint_handler
  dumpstacks()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize_util.py", line 143, in dumpstacks
  for filename, lineno, name, line in traceback.extract_stack(stack):

# Thread: Thread-2 (run)(140484993353408)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 973, in _bootstrap
  self._bootstrap_inner()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
  self.run()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 953, in run
  self._target(*self._args, **self._kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_inductor/codecache.py", line 1317, in run
  sleep(1)

# Thread: MainThread(140488675936064)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/launch.py", line 48, in <module>
  main()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/launch.py", line 44, in main
  start()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/launch_utils.py", line 440, in start
  webui.webui()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/webui.py", line 52, in webui
  initialize.initialize()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize.py", line 75, in initialize
  initialize_rest(reload_script_modules=False)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize.py", line 111, in initialize_rest
  scripts.load_scripts()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/scripts.py", line 382, in load_scripts
  script_module = script_loading.load_module(scriptfile.path)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/script_loading.py", line 10, in load_module
  module_spec.loader.exec_module(module)
File: "<frozen importlib._bootstrap_external>", line 883, in exec_module
File: "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/scripts/openvino_accelerate.py", line 34, in <module>
  from openvino.frontend.pytorch.torchdynamo import backend # noqa: F401
File: "<frozen importlib._bootstrap>", line 1078, in _handle_fromlist
File: "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File: "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File: "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File: "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File: "<frozen importlib._bootstrap_external>", line 883, in exec_module
File: "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/openvino/frontend/pytorch/torchdynamo/backend.py", line 15, in <module>
  from torch._inductor.compile_fx import compile_fx
File: "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File: "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File: "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File: "<frozen importlib._bootstrap_external>", line 883, in exec_module
File: "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_inductor/compile_fx.py", line 26, in <module>
  from torch._inductor.codecache import code_hash, CompiledFxGraph
File: "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File: "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File: "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File: "<frozen importlib._bootstrap_external>", line 883, in exec_module
File: "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_inductor/codecache.py", line 1424, in <module>
  AsyncCompile.warm_pool()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_inductor/codecache.py", line 1363, in warm_pool
  pool._adjust_process_count()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/concurrent/futures/process.py", line 692, in _adjust_process_count
  self._spawn_process()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/concurrent/futures/process.py", line 709, in _spawn_process
  p.start()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/process.py", line 121, in start
  self._popen = self._Popen(self)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/context.py", line 281, in _Popen
  return Popen(process_obj)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/popen_fork.py", line 19, in __init__
  self._launch(process_obj)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/popen_fork.py", line 71, in _launch
  code = process_obj._bootstrap(parent_sentinel=child_r)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
  self.run()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/process.py", line 108, in run
  self._target(*self._args, **self._kwargs)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/concurrent/futures/process.py", line 240, in _process_worker
  call_item = call_queue.get(block=True)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/queues.py", line 102, in get
  with self._rlock:
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/synchronize.py", line 95, in __enter__
  return self._semlock.__enter__()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize_util.py", line 156, in sigint_handler
  dumpstacks()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize_util.py", line 143, in dumpstacks
  for filename, lineno, name, line in traceback.extract_stack(stack):

# Thread: Thread-2 (run)(140484993353408)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 973, in _bootstrap
  self._bootstrap_inner()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
  self.run()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 953, in run
  self._target(*self._args, **self._kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_inductor/codecache.py", line 1317, in run
  sleep(1)

# Thread: MainThread(140488675936064)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/launch.py", line 48, in <module>
  main()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/launch.py", line 44, in main
  start()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/launch_utils.py", line 440, in start
  webui.webui()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/webui.py", line 52, in webui
  initialize.initialize()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize.py", line 75, in initialize
  initialize_rest(reload_script_modules=False)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize.py", line 111, in initialize_rest
  scripts.load_scripts()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/scripts.py", line 382, in load_scripts
  script_module = script_loading.load_module(scriptfile.path)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/script_loading.py", line 10, in load_module
  module_spec.loader.exec_module(module)
File: "<frozen importlib._bootstrap_external>", line 883, in exec_module
File: "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/scripts/openvino_accelerate.py", line 34, in <module>
  from openvino.frontend.pytorch.torchdynamo import backend # noqa: F401
File: "<frozen importlib._bootstrap>", line 1078, in _handle_fromlist
File: "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File: "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File: "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File: "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File: "<frozen importlib._bootstrap_external>", line 883, in exec_module
File: "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/openvino/frontend/pytorch/torchdynamo/backend.py", line 15, in <module>
  from torch._inductor.compile_fx import compile_fx
File: "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File: "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File: "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File: "<frozen importlib._bootstrap_external>", line 883, in exec_module
File: "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_inductor/compile_fx.py", line 26, in <module>
  from torch._inductor.codecache import code_hash, CompiledFxGraph
File: "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File: "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File: "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File: "<frozen importlib._bootstrap_external>", line 883, in exec_module
File: "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_inductor/codecache.py", line 1424, in <module>
  AsyncCompile.warm_pool()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_inductor/codecache.py", line 1363, in warm_pool
  pool._adjust_process_count()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/concurrent/futures/process.py", line 692, in _adjust_process_count
  self._spawn_process()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/concurrent/futures/process.py", line 709, in _spawn_process
  p.start()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/process.py", line 121, in start
  self._popen = self._Popen(self)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/context.py", line 281, in _Popen
  return Popen(process_obj)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/popen_fork.py", line 19, in __init__
  self._launch(process_obj)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/popen_fork.py", line 71, in _launch
  code = process_obj._bootstrap(parent_sentinel=child_r)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
  self.run()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/process.py", line 108, in run
  self._target(*self._args, **self._kwargs)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/concurrent/futures/process.py", line 240, in _process_worker
  call_item = call_queue.get(block=True)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/queues.py", line 103, in get
  res = self._recv_bytes()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/connection.py", line 221, in recv_bytes
  buf = self._recv_bytes(maxlength)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/connection.py", line 419, in _recv_bytes
  buf = self._recv(4)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/connection.py", line 384, in _recv
  chunk = read(handle, remaining)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize_util.py", line 156, in sigint_handler
  dumpstacks()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize_util.py", line 143, in dumpstacks
  for filename, lineno, name, line in traceback.extract_stack(stack):
# Thread: Thread-2 (run)(140484993353408)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 973, in _bootstrap
  self._bootstrap_inner()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
  self.run()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 953, in run
  self._target(*self._args, **self._kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_inductor/codecache.py", line 1317, in run
  sleep(1)

# Thread: MainThread(140488675936064)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/launch.py", line 48, in <module>
  main()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/launch.py", line 44, in main
  start()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/launch_utils.py", line 440, in start
  webui.webui()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/webui.py", line 52, in webui
  initialize.initialize()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize.py", line 75, in initialize
  initialize_rest(reload_script_modules=False)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize.py", line 111, in initialize_rest
  scripts.load_scripts()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/scripts.py", line 382, in load_scripts
  script_module = script_loading.load_module(scriptfile.path)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/script_loading.py", line 10, in load_module
  module_spec.loader.exec_module(module)
File: "<frozen importlib._bootstrap_external>", line 883, in exec_module
File: "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/scripts/openvino_accelerate.py", line 34, in <module>
  from openvino.frontend.pytorch.torchdynamo import backend # noqa: F401
File: "<frozen importlib._bootstrap>", line 1078, in _handle_fromlist
File: "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File: "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File: "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File: "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File: "<frozen importlib._bootstrap_external>", line 883, in exec_module
File: "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/openvino/frontend/pytorch/torchdynamo/backend.py", line 15, in <module>
  from torch._inductor.compile_fx import compile_fx
File: "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File: "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File: "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File: "<frozen importlib._bootstrap_external>", line 883, in exec_module
File: "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_inductor/compile_fx.py", line 26, in <module>
  from torch._inductor.codecache import code_hash, CompiledFxGraph
File: "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File: "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File: "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File: "<frozen importlib._bootstrap_external>", line 883, in exec_module
File: "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_inductor/codecache.py", line 1424, in <module>
  AsyncCompile.warm_pool()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_inductor/codecache.py", line 1363, in warm_pool
  pool._adjust_process_count()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/concurrent/futures/process.py", line 692, in _adjust_process_count
  self._spawn_process()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/concurrent/futures/process.py", line 709, in _spawn_process
  p.start()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/process.py", line 121, in start
  self._popen = self._Popen(self)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/context.py", line 281, in _Popen
  return Popen(process_obj)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/popen_fork.py", line 19, in __init__
  self._launch(process_obj)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/popen_fork.py", line 71, in _launch
  code = process_obj._bootstrap(parent_sentinel=child_r)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
  self.run()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/process.py", line 108, in run
  self._target(*self._args, **self._kwargs)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/concurrent/futures/process.py", line 240, in _process_worker
  call_item = call_queue.get(block=True)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/queues.py", line 102, in get
  with self._rlock:
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/synchronize.py", line 95, in __enter__
  return self._semlock.__enter__()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize_util.py", line 156, in sigint_handler
  dumpstacks()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize_util.py", line 143, in dumpstacks
  for filename, lineno, name, line in traceback.extract_stack(stack):

# Thread: AnyIO worker thread(140482520811200)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 973, in _bootstrap
  self._bootstrap_inner()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
  self.run()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 797, in run
  item = self.queue.get()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/queue.py", line 171, in get
  self.not_empty.wait()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 320, in wait
  waiter.acquire()

# Thread: AnyIO worker thread(140482350941888)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 973, in _bootstrap
  self._bootstrap_inner()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
  self.run()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 797, in run
  item = self.queue.get()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/queue.py", line 171, in get
  self.not_empty.wait()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 320, in wait
  waiter.acquire()

# Thread: AnyIO worker thread(140482919270080)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 973, in _bootstrap
  self._bootstrap_inner()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
  self.run()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 807, in run
  result = context.run(func, *args)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/gradio/utils.py", line 707, in wrapper
  response = f(*args, **kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/call_queue.py", line 57, in f
  res = list(func(*args, **kwargs))
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/call_queue.py", line 36, in f
  res = func(*args, **kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/txt2img.py", line 52, in txt2img
  processed = modules.scripts.scripts_txt2img.run(p, *args)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/scripts.py", line 601, in run
  processed = script.run(p, *script_args)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/scripts/openvino_accelerate.py", line 1276, in run
  processed = process_images_openvino(p, model_config, vae_ckpt, p.sampler_name, enable_caching, override_hires, upscaler, hires_steps, d_strength, openvino_device, mode, is_xl_ckpt, refiner_ckpt, refiner_frac)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/scripts/openvino_accelerate.py", line 998, in process_images_openvino
  output = shared.sd_diffusers_model(
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
  return func(*args, **kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py", line 840, in __call__
  noise_pred = self.unet(
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
  return self._call_impl(*args, **kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
  return forward_call(*args, **kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/eval_frame.py", line 328, in _fn
  return fn(*args, **kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
  return self._call_impl(*args, **kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
  return forward_call(*args, **kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/eval_frame.py", line 490, in catch_errors
  return callback(frame, cache_entry, hooks, frame_state)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/convert_frame.py", line 641, in _convert_frame
  result = inner_convert(frame, cache_size, hooks, frame_state)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/convert_frame.py", line 133, in _fn
  return fn(*args, **kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/convert_frame.py", line 389, in _convert_frame_assert
  return _compile(
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/convert_frame.py", line 569, in _compile
  guarded_code = compile_inner(code, one_graph, hooks, transform)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/utils.py", line 189, in time_wrapper
  r = func(*args, **kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/convert_frame.py", line 491, in compile_inner
  out_code = transform_code_object(code, transform)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/bytecode_transformation.py", line 1028, in transform_code_object
  transformations(instructions, code_options)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/convert_frame.py", line 458, in transform
  tracer.run()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/symbolic_convert.py", line 2074, in run
  super().run()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/symbolic_convert.py", line 724, in run
  and self.step()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/symbolic_convert.py", line 688, in step
  getattr(self, inst.opname)(inst)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/symbolic_convert.py", line 2162, in RETURN_VALUE
  self.output.compile_subgraph(
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/output_graph.py", line 857, in compile_subgraph
  self.compile_and_call_fx_graph(tx, pass2.graph_output_vars(), root)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/contextlib.py", line 79, in inner
  return func(*args, **kwds)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/output_graph.py", line 957, in compile_and_call_fx_graph
  compiled_fn = self.call_user_compiler(gm)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/utils.py", line 189, in time_wrapper
  r = func(*args, **kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/output_graph.py", line 1009, in call_user_compiler
  compiled_fn = compiler_fn(gm, self.example_inputs())
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/repro/after_dynamo.py", line 117, in debug_wrapper
  compiled_gm = compiler_fn(gm, example_inputs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/__init__.py", line 1607, in __call__
  return self.compiler_fn(model_, inputs_, **self.kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/torch/_dynamo/backends/common.py", line 95, in wrapper
  return fn(model, inputs, **kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/scripts/openvino_accelerate.py", line 201, in openvino_fx
  compiled_model = openvino_compile_cached_model(maybe_fs_cached_name, *example_inputs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/scripts/openvino_accelerate.py", line 433, in openvino_compile_cached_model
  compiled_model = core.compile_model(om, get_device())
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/openvino/runtime/ie_api.py", line 543, in compile_model
  super().compile_model(model, device_name, {} if config is None else config),

# Thread: Thread-6(140482510325440)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 973, in _bootstrap
  self._bootstrap_inner()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
  self.run()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/tqdm/_monitor.py", line 60, in run
  self.was_killed.wait(self.sleep_interval)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 607, in wait
  signaled = self._cond.wait(timeout)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 324, in wait
  gotit = waiter.acquire(True, timeout)

# Thread: Thread-5 (run)(140485125473984)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 973, in _bootstrap
  self._bootstrap_inner()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
  self.run()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 953, in run
  self._target(*self._args, **self._kwargs)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/sd_env/lib/python3.10/site-packages/uvicorn/server.py", line 65, in run
  return asyncio.run(self.serve(sockets=sockets))
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/asyncio/runners.py", line 44, in run
  return loop.run_until_complete(main)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/asyncio/base_events.py", line 633, in run_until_complete
  self.run_forever()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/asyncio/base_events.py", line 600, in run_forever
  self._run_once()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/asyncio/base_events.py", line 1860, in _run_once
  event_list = self._selector.select(timeout)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/selectors.py", line 469, in select
  fd_event_list = self._selector.poll(timeout, max_ev)

# Thread: Thread-2(140484993353408)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 973, in _bootstrap
  self._bootstrap_inner()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
  self.run()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/concurrent/futures/process.py", line 320, in run
  result_item, is_broken, cause = self.wait_result_broken_or_wakeup()
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/concurrent/futures/process.py", line 380, in wait_result_broken_or_wakeup
  ready = mp.connection.wait(readers + worker_sentinels)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/multiprocessing/connection.py", line 936, in wait
  ready = selector.select(timeout)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/selectors.py", line 416, in select
  fd_event_list = self._selector.poll(timeout)

# Thread: MainThread(140488675936064)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/launch.py", line 48, in <module>
  main()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/launch.py", line 44, in main
  start()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/launch_utils.py", line 440, in start
  webui.webui()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/webui.py", line 126, in webui
  server_command = shared.state.wait_for_server_command(timeout=5)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/shared_state.py", line 62, in wait_for_server_command
  if self._server_command_signal.wait(timeout):
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 607, in wait
  signaled = self._cond.wait(timeout)
File: "/home/cbytes/.pyenv/versions/3.10.6/lib/python3.10/threading.py", line 324, in wait
  gotit = waiter.acquire(True, timeout)
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize_util.py", line 156, in sigint_handler
  dumpstacks()
File: "/home/cbytes/Downloads/benchmark-launcher-cli-3.1.0-linux/stable-diffusion-webui/modules/initialize_util.py", line 143, in dumpstacks
  for filename, lineno, name, line in traceback.extract_stack(stack):
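
For what it's worth, the last full dump above shows where the generation actually sits: the AnyIO worker thread is parked inside `core.compile_model(om, get_device())` in `scripts/openvino_accelerate.py`, i.e. OpenVINO compiling the cached UNet for the selected device, while every other thread is just waiting. A minimal standalone check like the one below (a sketch only — the model path is a placeholder, not something from this repo) can show whether the Arc GPU is even visible to OpenVINO on this Tumbleweed install and whether `compile_model` stalls the same way outside the webui:

```python
# Standalone OpenVINO device check; run it inside the same sd_env virtualenv.
from openvino.runtime import Core

core = Core()
print("Available devices:", core.available_devices)  # expect something like ['CPU', 'GPU']
for device in core.available_devices:
    # FULL_DEVICE_NAME should list the Arc A770 if the compute runtime is set up
    print(device, core.get_property(device, "FULL_DEVICE_NAME"))

# If "GPU" is present, try compiling any small IR model to see whether the
# compile step itself hangs the way the traceback above shows:
# model = core.read_model("model.xml")          # placeholder path to a small IR model
# compiled = core.compile_model(model, "GPU")   # the call the worker thread is stuck in
```

If `GPU` doesn't show up at all, the hang is more likely a driver/permissions problem on the host (intel-compute-runtime / Level Zero packages, and the user being in the `video`/`render` groups) than something in the webui itself.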