nod-ai / SHARK-Studio

SHARK Studio -- Web UI for SHARK+IREE High Performance Machine Learning Distribution
Apache License 2.0
1.41k stars 170 forks

Crash/error on 'stock' test gen #2021

Open Acrivec opened 10 months ago

Acrivec commented 10 months ago

I didn't change anything; I just launched the app and pressed Generate Image.

```bat
Henlooo
python index.py --no-use_tuned --import_mlir --device_allocator=caching

(shark.venv) G:\Downloads\Shark\SHARK\apps\stable_diffusion\web>python index.py
shark_tank local cache is located at C:\Users\Acrivec\.local/shark_tank/ . You may change this by setting the --local_tank_cache= flag
Clearing .mlir temporary files from a prior run. This may take some time...
Clearing .mlir temporary files took 0.0000 seconds.
gradio temporary image cache located at G:\Downloads\Shark\SHARK\apps\stable_diffusion\web\shark_tmp/gradio. You may change this by setting the GRADIO_TEMP_DIR environment variable.
No temporary images files to clear.
G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\components\dropdown.py:163: UserWarning: The value passed into gr.Dropdown() is not in the list of choices. Please update the list of choices to include: EulerAncestralDiscrete or set allow_custom_value=True.
  warnings.warn(
vulkan devices are available.
metal devices are not available.
cuda devices are not available.
rocm devices are available.
local-sync devices are available.
local-task devices are available.
G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\components\dropdown.py:163: UserWarning: The value passed into gr.Dropdown() is not in the list of choices. Please update the list of choices to include: SharkEulerDiscrete or set allow_custom_value=True.
  warnings.warn(
{'cpu': ['Intel(R) Core(TM) i7-6700K CPU @ 4.00GHz => cpu-task'], 'cuda': [], 'vulkan': ['AMD Radeon RX 7900 XTX => vulkan://0'], 'rocm': ['AMD Radeon RX 7900 XTX => rocm']}
Running on local URL: http://0.0.0.0:8080
To create a public link, set `share=True` in `launch()`.
shark_tank local cache is located at C:\Users\Acrivec\.local/shark_tank/ . You may change this by setting the --local_tank_cache= flag
Found device AMD Radeon RX 7900 XTX. Using target triple rdna3-7900-windows.
Tuned models are currently not supported for this setting.
Traceback (most recent call last):
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\queueing.py", line 456, in call_prediction
    output = await route_utils.call_process_api(
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\route_utils.py", line 232, in call_process_api
    output = await app.get_blocks().process_api(
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\blocks.py", line 1522, in process_api
    result = await self.call_function(
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\blocks.py", line 1156, in call_function
    prediction = await utils.async_iteration(iterator)
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\utils.py", line 515, in async_iteration
    return await iterator.__anext__()
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\utils.py", line 508, in __anext__
    return await anyio.to_thread.run_sync(
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\anyio\to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\anyio\_backends\_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\anyio\_backends\_asyncio.py", line 807, in run
    result = context.run(func, *args)
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\utils.py", line 491, in run_sync_iterator_async
    return next(iterator)
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\utils.py", line 662, in gen_wrapper
    yield from f(*args, **kwargs)
  File "G:\Downloads\Shark\SHARK\apps\stable_diffusion\web\ui\txt2img_ui.py", line 161, in txt2img_inf
    global_obj.set_schedulers(get_schedulers(model_id))
  File "G:\Downloads\Shark\SHARK\apps\stable_diffusion\src\schedulers\sd_schedulers.py", line 100, in get_schedulers
    ] = SharkEulerDiscreteScheduler.from_pretrained(
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\diffusers\schedulers\scheduling_utils.py", line 147, in from_pretrained
    return cls.from_config(config, return_unused_kwargs=return_unused_kwargs, **kwargs)
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\diffusers\configuration_utils.py", line 254, in from_config
    model = cls(**init_dict)
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\diffusers\configuration_utils.py", line 644, in inner_init
    init(self, *args, **init_kwargs)
  File "G:\Downloads\Shark\SHARK\apps\stable_diffusion\src\schedulers\shark_eulerdiscrete.py", line 35, in __init__
    super().__init__(
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\diffusers\configuration_utils.py", line 644, in inner_init
    init(self, *args, **init_kwargs)
TypeError: EulerDiscreteScheduler.__init__() takes from 1 to 11 positional arguments but 14 were given

[... the identical traceback is printed a second time ...]

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\queueing.py", line 501, in process_events
    response = await self.call_prediction(awake_events, batch)
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\queueing.py", line 465, in call_prediction
    raise Exception(str(error) if show_error else None) from error
Exception: None
```
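For anyone triaging: the crash happens before any image is generated. `get_schedulers()` constructs a `SharkEulerDiscreteScheduler`, whose `__init__` forwards the scheduler config to `EulerDiscreteScheduler.__init__` positionally, and the installed diffusers build accepts fewer positional parameters than are being passed (11 slots counting `self`, versus 14 given). A minimal, self-contained sketch of this failure mode (hypothetical class names, not SHARK's or diffusers' actual code):

```python
# Sketch (assumed/hypothetical classes) of the failure mode behind the
# TypeError above: a subclass forwards its config *positionally* to a
# parent __init__, and a library upgrade shrinks the parent's parameter list.

class ParentSchedulerOld:
    # Older-style signature: 13 optional parameters (14 positional slots
    # counting self), so forwarding 13 values positionally works.
    def __init__(self, num_train_timesteps=1000, beta_start=0.0001,
                 beta_end=0.02, beta_schedule="linear", trained_betas=None,
                 prediction_type="epsilon", interpolation_type="linear",
                 use_karras_sigmas=False, timestep_spacing="linspace",
                 steps_offset=0, opt_a=None, opt_b=None, opt_c=None):
        pass

class ParentSchedulerNew:
    # Newer-style signature: only 10 optional parameters (11 positional
    # slots counting self); the same positional call now overflows.
    def __init__(self, num_train_timesteps=1000, beta_start=0.0001,
                 beta_end=0.02, beta_schedule="linear", trained_betas=None,
                 prediction_type="epsilon", interpolation_type="linear",
                 use_karras_sigmas=False, timestep_spacing="linspace",
                 steps_offset=0):
        pass

def build_subclass(parent):
    class SharkLikeScheduler(parent):
        def __init__(self):
            # 13 positional arguments, as an out-of-date subclass might pass.
            super().__init__(1000, 0.0001, 0.02, "linear", None, "epsilon",
                             "linear", False, "linspace", 0, None, None, None)
    return SharkLikeScheduler

build_subclass(ParentSchedulerOld)()  # fine against the old signature
try:
    build_subclass(ParentSchedulerNew)()
except TypeError as err:
    # Message ends like the one in the log:
    # "...takes from 1 to 11 positional arguments but 14 were given"
    print(err)
```

If that reading is right, this points at a diffusers version newer than the one `shark_eulerdiscrete.py` was written against, rather than anything about the selected model or scheduler.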
Acrivec commented 10 months ago

Here's the output with `--clear-all` and `--debug`:

```cmd
(shark.venv) G:\Downloads\Shark\SHARK\apps\stable_diffusion\web>python index.py --clear-all --debug
shark_tank local cache is located at C:\Users\Acrivec\.local/shark_tank/ . You may change this by setting the --local_tank_cache= flag
Clearing .mlir temporary files from a prior run. This may take some time...
Clearing .mlir temporary files took 0.0000 seconds.
gradio temporary image cache located at G:\Downloads\Shark\SHARK\apps\stable_diffusion\web\shark_tmp/gradio. You may change this by setting the GRADIO_TEMP_DIR environment variable.
Clearing gradio UI temporary image files from a prior run. This may take some time...
Clearing gradio UI temporary image files took 0.0000 seconds.
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:matplotlib:CACHEDIR=C:\Users\Acrivec\.matplotlib
DEBUG:matplotlib.font_manager:Using fontManager instance from C:\Users\Acrivec\.matplotlib\fontlist-v330.json
DEBUG:httpx:load_ssl_context verify=True cert=None trust_env=True http2=False
DEBUG:httpx:load_verify_locations cafile='G:\\Downloads\\Shark\\SHARK\\shark.venv\\Lib\\site-packages\\certifi\\cacert.pem'
DEBUG:PIL.Image:Importing BlpImagePlugin
DEBUG:PIL.Image:Importing BmpImagePlugin
DEBUG:PIL.Image:Importing BufrStubImagePlugin
DEBUG:PIL.Image:Importing CurImagePlugin
DEBUG:PIL.Image:Importing DcxImagePlugin
DEBUG:PIL.Image:Importing DdsImagePlugin
DEBUG:PIL.Image:Importing EpsImagePlugin
DEBUG:PIL.Image:Importing FitsImagePlugin
DEBUG:PIL.Image:Importing FliImagePlugin
DEBUG:PIL.Image:Importing FpxImagePlugin
DEBUG:PIL.Image:Image: failed to import FpxImagePlugin: No module named 'olefile'
DEBUG:PIL.Image:Importing FtexImagePlugin
DEBUG:PIL.Image:Importing GbrImagePlugin
DEBUG:PIL.Image:Importing GifImagePlugin
DEBUG:PIL.Image:Importing GribStubImagePlugin
DEBUG:PIL.Image:Importing Hdf5StubImagePlugin
DEBUG:PIL.Image:Importing IcnsImagePlugin
DEBUG:PIL.Image:Importing IcoImagePlugin
DEBUG:PIL.Image:Importing ImImagePlugin
DEBUG:PIL.Image:Importing ImtImagePlugin
DEBUG:PIL.Image:Importing IptcImagePlugin
DEBUG:PIL.Image:Importing JpegImagePlugin
DEBUG:PIL.Image:Importing Jpeg2KImagePlugin
DEBUG:PIL.Image:Importing McIdasImagePlugin
DEBUG:PIL.Image:Importing MicImagePlugin
DEBUG:PIL.Image:Image: failed to import MicImagePlugin: No module named 'olefile'
DEBUG:PIL.Image:Importing MpegImagePlugin
DEBUG:PIL.Image:Importing MpoImagePlugin
DEBUG:PIL.Image:Importing MspImagePlugin
DEBUG:PIL.Image:Importing PalmImagePlugin
DEBUG:PIL.Image:Importing PcdImagePlugin
DEBUG:PIL.Image:Importing PcxImagePlugin
DEBUG:PIL.Image:Importing PdfImagePlugin
DEBUG:PIL.Image:Importing PixarImagePlugin
DEBUG:PIL.Image:Importing PngImagePlugin
DEBUG:PIL.Image:Importing PpmImagePlugin
DEBUG:PIL.Image:Importing PsdImagePlugin
DEBUG:PIL.Image:Importing QoiImagePlugin
DEBUG:PIL.Image:Importing SgiImagePlugin
DEBUG:PIL.Image:Importing SpiderImagePlugin
DEBUG:PIL.Image:Importing SunImagePlugin
DEBUG:PIL.Image:Importing TgaImagePlugin
DEBUG:PIL.Image:Importing TiffImagePlugin
DEBUG:PIL.Image:Importing WebPImagePlugin
DEBUG:PIL.Image:Importing WmfImagePlugin
DEBUG:PIL.Image:Importing XbmImagePlugin
DEBUG:PIL.Image:Importing XpmImagePlugin
DEBUG:PIL.Image:Importing XVThumbImagePlugin
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "GET /gradio-messaging/en HTTP/1.1" 200 3
G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\components\dropdown.py:163: UserWarning: The value passed into gr.Dropdown() is not in the list of choices. Please update the list of choices to include: EulerAncestralDiscrete or set allow_custom_value=True.
  warnings.warn(
vulkan devices are available.
metal devices are not available.
cuda devices are not available.
rocm devices are available.
local-sync devices are available.
local-task devices are available.
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:asyncio:Using proactor: IocpProactor
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): checkip.amazonaws.com:443
DEBUG:PIL.PngImagePlugin:STREAM b'IHDR' 16 13
DEBUG:PIL.PngImagePlugin:STREAM b'gAMA' 41 4
DEBUG:PIL.PngImagePlugin:STREAM b'sRGB' 57 1
DEBUG:PIL.PngImagePlugin:STREAM b'PLTE' 70 768
DEBUG:PIL.PngImagePlugin:STREAM b'tRNS' 850 256
DEBUG:PIL.PngImagePlugin:STREAM b'IDAT' 1118 8192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): checkip.amazonaws.com:443
DEBUG:PIL.PngImagePlugin:STREAM b'IHDR' 16 13
DEBUG:PIL.PngImagePlugin:STREAM b'gAMA' 41 4
DEBUG:PIL.PngImagePlugin:STREAM b'sRGB' 57 1
DEBUG:PIL.PngImagePlugin:STREAM b'PLTE' 70 768
DEBUG:PIL.PngImagePlugin:STREAM b'tRNS' 850 256
DEBUG:PIL.PngImagePlugin:STREAM b'IDAT' 1118 8192
G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\components\dropdown.py:163: UserWarning: The value passed into gr.Dropdown() is not in the list of choices. Please update the list of choices to include: SharkEulerDiscrete or set allow_custom_value=True.
  warnings.warn(
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): checkip.amazonaws.com:443
DEBUG:PIL.PngImagePlugin:STREAM b'IHDR' 16 13
DEBUG:PIL.PngImagePlugin:STREAM b'gAMA' 41 4
DEBUG:PIL.PngImagePlugin:STREAM b'sRGB' 57 1
DEBUG:PIL.PngImagePlugin:STREAM b'PLTE' 70 768
DEBUG:PIL.PngImagePlugin:STREAM b'tRNS' 850 256
DEBUG:PIL.PngImagePlugin:STREAM b'IDAT' 1118 8192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): checkip.amazonaws.com:443
DEBUG:PIL.PngImagePlugin:STREAM b'IHDR' 16 13
DEBUG:PIL.PngImagePlugin:STREAM b'gAMA' 41 4
DEBUG:PIL.PngImagePlugin:STREAM b'sRGB' 57 1
DEBUG:PIL.PngImagePlugin:STREAM b'PLTE' 70 768
DEBUG:PIL.PngImagePlugin:STREAM b'tRNS' 850 256
DEBUG:PIL.PngImagePlugin:STREAM b'IDAT' 1118 8192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): checkip.amazonaws.com:443
DEBUG:urllib3.connectionpool:https://checkip.amazonaws.com:443 "GET / HTTP/1.1" 200 14
DEBUG:PIL.PngImagePlugin:STREAM b'IHDR' 16 13
DEBUG:PIL.PngImagePlugin:STREAM b'gAMA' 41 4
DEBUG:PIL.PngImagePlugin:STREAM b'sRGB' 57 1
DEBUG:charset_normalizer:Encoding detection: ascii is most likely the one.
DEBUG:PIL.PngImagePlugin:STREAM b'PLTE' 70 768
DEBUG:PIL.PngImagePlugin:STREAM b'tRNS' 850 256
DEBUG:PIL.PngImagePlugin:STREAM b'IDAT' 1118 8192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:https://checkip.amazonaws.com:443 "GET / HTTP/1.1" 200 14
DEBUG:charset_normalizer:Encoding detection: ascii is most likely the one.
DEBUG:PIL.PngImagePlugin:STREAM b'IHDR' 16 13
DEBUG:PIL.PngImagePlugin:STREAM b'gAMA' 41 4
DEBUG:PIL.PngImagePlugin:STREAM b'sRGB' 57 1
DEBUG:PIL.PngImagePlugin:STREAM b'PLTE' 70 768
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:PIL.PngImagePlugin:STREAM b'tRNS' 850 256
DEBUG:PIL.PngImagePlugin:STREAM b'IDAT' 1118 8192
DEBUG:urllib3.connectionpool:https://checkip.amazonaws.com:443 "GET / HTTP/1.1" 200 14
DEBUG:charset_normalizer:Encoding detection: ascii is most likely the one.
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:torch_mlir._mlir_libs:Initializing MLIR with module: _mlirRegisterEverything
DEBUG:torch_mlir._mlir_libs:Registering dialects from initializer
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:PIL.PngImagePlugin:STREAM b'IHDR' 16 13
DEBUG:PIL.PngImagePlugin:STREAM b'gAMA' 41 4
DEBUG:PIL.PngImagePlugin:STREAM b'sRGB' 57 1
DEBUG:PIL.PngImagePlugin:STREAM b'PLTE' 70 768
DEBUG:PIL.PngImagePlugin:STREAM b'tRNS' 850 256
DEBUG:PIL.PngImagePlugin:STREAM b'IDAT' 1118 8192
DEBUG:urllib3.connectionpool:https://checkip.amazonaws.com:443 "GET / HTTP/1.1" 200 14
DEBUG:charset_normalizer:Encoding detection: ascii is most likely the one.
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "GET /pkg-version HTTP/1.1" 200 21
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "GET /pkg-version HTTP/1.1" 200 21
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:https://checkip.amazonaws.com:443 "GET / HTTP/1.1" 200 14
{'cpu': ['Intel(R) Core(TM) i7-6700K CPU @ 4.00GHz => cpu-task'], 'cuda': [], 'vulkan': ['AMD Radeon RX 7900 XTX => vulkan://0'], 'rocm': ['AMD Radeon RX 7900 XTX => rocm']}
DEBUG:charset_normalizer:Encoding detection: ascii is most likely the one.
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "GET /pkg-version HTTP/1.1" 200 21
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "GET /pkg-version HTTP/1.1" 200 21
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): api.gradio.app:443
DEBUG:PIL.PngImagePlugin:STREAM b'IHDR' 16 13
DEBUG:PIL.PngImagePlugin:STREAM b'gAMA' 41 4
DEBUG:PIL.PngImagePlugin:STREAM b'sRGB' 57 1
DEBUG:PIL.PngImagePlugin:STREAM b'PLTE' 70 768
DEBUG:PIL.PngImagePlugin:STREAM b'tRNS' 850 256
DEBUG:PIL.PngImagePlugin:STREAM b'IDAT' 1118 8192
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "GET /pkg-version HTTP/1.1" 200 21
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "GET /pkg-version HTTP/1.1" 200 21
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "POST /gradio-initiated-analytics/ HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "POST /gradio-initiated-analytics/ HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "GET /pkg-version HTTP/1.1" 200 21
DEBUG:asyncio:Using proactor: IocpProactor
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "POST /gradio-initiated-analytics/ HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "POST /gradio-initiated-analytics/ HTTP/1.1" 200 None
Running on local URL: http://0.0.0.0:8080
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): localhost:8080
DEBUG:urllib3.connectionpool:http://localhost:8080 "GET /startup-events HTTP/1.1" 200 5
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): localhost:8080
DEBUG:urllib3.connectionpool:http://localhost:8080 "HEAD / HTTP/1.1" 200 0
To create a public link, set `share=True` in `launch()`.
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "POST /gradio-initiated-analytics/ HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "GET /pkg-version HTTP/1.1" 200 21
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "POST /gradio-initiated-analytics/ HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "GET /pkg-version HTTP/1.1" 200 21
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "GET /pkg-version HTTP/1.1" 200 21
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "POST /gradio-initiated-analytics/ HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "POST /gradio-initiated-analytics/ HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "POST /gradio-initiated-analytics/ HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "GET /pkg-version HTTP/1.1" 200 21
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "GET /pkg-version HTTP/1.1" 200 21
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "POST /gradio-initiated-analytics/ HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "POST /gradio-initiated-analytics/ HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://api.gradio.app:443 "POST /gradio-initiated-analytics/ HTTP/1.1" 200 None
shark_tank local cache is located at C:\Users\Acrivec\.local/shark_tank/ . You may change this by setting the --local_tank_cache= flag
DEBUG:matplotlib.pyplot:Loaded backend tkagg version 8.6.
DEBUG:matplotlib.pyplot:Loaded backend agg version v2.2.
DEBUG:matplotlib.pyplot:Loaded backend TkAgg version 8.6.
DEBUG:matplotlib.pyplot:Loaded backend agg version v2.2.
DEBUG:matplotlib.pyplot:Loaded backend TkAgg version 8.6.
DEBUG:matplotlib.pyplot:Loaded backend agg version v2.2.
Found device AMD Radeon RX 7900 XTX. Using target triple rdna3-7900-windows.
Tuned models are currently not supported for this setting.
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): huggingface.co:443
DEBUG:urllib3.connectionpool:https://huggingface.co:443 "HEAD /stabilityai/stable-diffusion-2-1-base/resolve/main/scheduler/scheduler_config.json HTTP/1.1" 200 0
[... the same HEAD request/response line is repeated 13 more times ...]
DEBUG:matplotlib.pyplot:Loaded backend TkAgg version 8.6.
Traceback (most recent call last):
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\queueing.py", line 456, in call_prediction
    output = await route_utils.call_process_api(
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\route_utils.py", line 232, in call_process_api
    output = await app.get_blocks().process_api(
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\blocks.py", line 1522, in process_api
    result = await self.call_function(
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\blocks.py", line 1156, in call_function
    prediction = await utils.async_iteration(iterator)
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\utils.py", line 515, in async_iteration
    return await iterator.__anext__()
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\utils.py", line 508, in __anext__
    return await anyio.to_thread.run_sync(
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\anyio\to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\anyio\_backends\_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\anyio\_backends\_asyncio.py", line 807, in run
    result = context.run(func, *args)
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\utils.py", line 491, in run_sync_iterator_async
    return next(iterator)
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\utils.py", line 662, in gen_wrapper
    yield from f(*args, **kwargs)
  File "G:\Downloads\Shark\SHARK\apps\stable_diffusion\web\ui\txt2img_ui.py", line 161, in txt2img_inf
    global_obj.set_schedulers(get_schedulers(model_id))
  File "G:\Downloads\Shark\SHARK\apps\stable_diffusion\src\schedulers\sd_schedulers.py", line 100, in get_schedulers
    ] = SharkEulerDiscreteScheduler.from_pretrained(
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\diffusers\schedulers\scheduling_utils.py", line 147, in from_pretrained
    return cls.from_config(config, return_unused_kwargs=return_unused_kwargs, **kwargs)
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\diffusers\configuration_utils.py", line 254, in from_config
    model = cls(**init_dict)
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\diffusers\configuration_utils.py", line 644, in inner_init
    init(self, *args, **init_kwargs)
  File "G:\Downloads\Shark\SHARK\apps\stable_diffusion\src\schedulers\shark_eulerdiscrete.py", line 35, in __init__
    super().__init__(
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\diffusers\configuration_utils.py", line 644, in inner_init
    init(self, *args, **init_kwargs)
TypeError: EulerDiscreteScheduler.__init__() takes from 1 to 11 positional arguments but 14 were given

[... the identical traceback is printed a second time ...]

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\queueing.py", line 501, in process_events
    response = await self.call_prediction(awake_events, batch)
  File "G:\Downloads\Shark\SHARK\shark.venv\Lib\site-packages\gradio\queueing.py", line 465, in call_prediction
    raise Exception(str(error) if show_error else None) from error
Exception: None
```
Acrivec commented 10 months ago

Interestingly, switching the scheduler to DDIM still produces the same error: `TypeError: EulerDiscreteScheduler.__init__() takes from 1 to 11 positional arguments but 14 were given`.
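That matches the traceback: `get_schedulers()` appears to construct `SharkEulerDiscreteScheduler` regardless of which scheduler is selected, so changing the dropdown can't avoid the failing `__init__`. Pinning diffusers back to the version SHARK's requirements expect is one workaround; another kind of fix (sketched below with hypothetical class names, not the actual SHARK patch) is to forward the config by keyword and filter it against the parent's current signature, so upstream signature changes stop raising TypeError:

```python
# Hypothetical sketch of a version-tolerant subclass __init__: forward the
# configuration by keyword and keep only keys the installed parent accepts.
import inspect

class Parent:
    # Stand-in for a library base class whose options change across versions.
    def __init__(self, a=1, b=2, c=3):
        self.a, self.b, self.c = a, b, c

class VersionTolerantChild(Parent):
    def __init__(self, **config):
        # Keep only the keys the installed Parent.__init__ understands.
        accepted = inspect.signature(Parent.__init__).parameters
        super().__init__(**{k: v for k, v in config.items() if k in accepted})

# An option the installed parent no longer knows is silently dropped
# instead of crashing the constructor.
s = VersionTolerantChild(a=10, b=20, removed_or_new_option=True)
print(s.a, s.b, s.c)  # → 10 20 3
```

Silently dropping unknown options trades an immediate crash for a possible behavior difference, so logging the dropped keys would be wise in a real patch.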