microsoft / ai-powered-notes-winui3-sample


Challenges with "Downloading Whisper" Instructions #1

Open JohnCoffinAtTripSpark opened 1 month ago

JohnCoffinAtTripSpark commented 1 month ago

First, I enjoyed your Microsoft Build BRK248 "Use AI for 'real things' in your Windows Apps". It spoke to me directly.

Second, I was trying to run the AI sample and encountered a few challenges in the "Downloading Whisper" section. Until I resolve these, I cannot run the code (and I'm one of those "see how it works in sample code" types). See https://github.com/microsoft/ai-powered-notes-winui3-sample.

The key challenge is that the "generate" command in step 4 throws an exception (see the log below; a small diagnostic sketch follows it). The result is that no optimized model is produced, so I'm stuck troubleshooting. It may be certificate-related, but any insights would be greatly appreciated.

Some observations which may or may not impact the primary challenge:

Kind regards,

John


[screenshot attached]

C:\dev\Olive\examples\whisper>olive run --config whisper_cpu_int8.json
[2024-05-22 13:22:23,815] [INFO] [run.py:279:run] Loading Olive module configuration from: C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\olive_config.json
[2024-05-22 13:22:23,818] [INFO] [run.py:285:run] Loading run configuration from: whisper_cpu_int8.json
[2024-05-22 13:22:23,938] [DEBUG] [run.py:173:run_engine] Registering pass OnnxConversion
[2024-05-22 13:22:23,947] [DEBUG] [run.py:173:run_engine] Registering pass OrtTransformersOptimization
[2024-05-22 13:22:23,949] [DEBUG] [run.py:173:run_engine] Registering pass OnnxDynamicQuantization
[2024-05-22 13:22:23,953] [DEBUG] [run.py:173:run_engine] Registering pass InsertBeamSearch
[2024-05-22 13:22:23,957] [DEBUG] [run.py:173:run_engine] Registering pass AppendPrePostProcessingOps
[2024-05-22 13:22:23,967] [DEBUG] [accelerator_creator.py:130:_fill_accelerators] The accelerator device and execution providers are specified, skipping deduce.
[2024-05-22 13:22:23,968] [DEBUG] [accelerator_creator.py:169:_check_execution_providers] Supported execution providers for device cpu: ['CPUExecutionProvider']
[2024-05-22 13:22:23,971] [DEBUG] [accelerator_creator.py:199:create_accelerators] Initial accelerators and execution providers: {'cpu': ['CPUExecutionProvider']}
[2024-05-22 13:22:23,972] [INFO] [accelerator_creator.py:224:create_accelerators] Running workflow on accelerator specs: cpu-cpu
[2024-05-22 13:22:23,973] [DEBUG] [run.py:229:run_engine] Pass OnnxConversion already registered
[2024-05-22 13:22:23,974] [DEBUG] [run.py:229:run_engine] Pass OrtTransformersOptimization already registered
[2024-05-22 13:22:23,974] [DEBUG] [run.py:229:run_engine] Pass OnnxDynamicQuantization already registered
[2024-05-22 13:22:23,975] [DEBUG] [run.py:229:run_engine] Pass InsertBeamSearch already registered
[2024-05-22 13:22:23,976] [DEBUG] [run.py:229:run_engine] Pass AppendPrePostProcessingOps already registered
[2024-05-22 13:22:23,976] [INFO] [engine.py:107:initialize] Using cache directory: cache
[2024-05-22 13:22:23,980] [INFO] [engine.py:263:run] Running Olive on accelerator: cpu-cpu
[2024-05-22 13:22:23,982] [INFO] [engine.py:1075:_create_system] Creating target system ...
[2024-05-22 13:22:23,988] [DEBUG] [engine.py:1071:create_system] create native OliveSystem SystemType.Local
[2024-05-22 13:22:23,988] [INFO] [engine.py:1078:_create_system] Target system created in 0.000511 seconds
[2024-05-22 13:22:23,989] [INFO] [engine.py:1087:_create_system] Creating host system ...
[2024-05-22 13:22:23,990] [DEBUG] [engine.py:1071:create_system] create native OliveSystem SystemType.Local
[2024-05-22 13:22:23,991] [INFO] [engine.py:1090:_create_system] Host system created in 0.002005 seconds
[2024-05-22 13:22:24,134] [DEBUG] [engine.py:709:_cache_model] Cached model 359edf01 to cache\models\359edf01.json
[2024-05-22 13:22:24,134] [DEBUG] [engine.py:336:run_accelerator] Running Olive in no-search mode ...
[2024-05-22 13:22:24,139] [DEBUG] [engine.py:428:run_no_search] Running ['conversion', 'transformers_optimization', 'onnx_dynamic_quantization', 'insert_beam_search', 'prepost'] with no search ...
[2024-05-22 13:22:24,145] [INFO] [engine.py:865:_run_pass] Running pass conversion:OnnxConversion
[2024-05-22 13:22:24,149] [DEBUG] [resource_path.py:156:create_resource_path] Resource path code/user_script.py is inferred to be of type file.
[2024-05-22 13:22:24,156] [DEBUG] [resource_path.py:156:create_resource_path] Resource path code is inferred to be of type folder.
[2024-05-22 13:22:24,264] [DEBUG] [resource_path.py:156:create_resource_path] Resource path code is inferred to be of type folder.
[2024-05-22 13:22:24,269] [DEBUG] [resource_path.py:156:create_resource_path] Resource path code/user_script.py is inferred to be of type file.
C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(
[2024-05-22 13:22:24,474] [ERROR] [engine.py:947:_run_pass] Pass run failed.
Traceback (most recent call last):
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\connectionpool.py", line 467, in _make_request
    self._validate_conn(conn)
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\connectionpool.py", line 1099, in _validate_conn
    conn.connect()
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\connection.py", line 653, in connect
    sock_and_verified = _ssl_wrap_socket_and_match_hostname(
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\connection.py", line 806, in _ssl_wrap_socket_and_match_hostname
    ssl_sock = ssl_wrap_socket(
               ^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\util\ssl_.py", line 465, in ssl_wrap_socket
    ssl_sock = _ssl_wrap_socket_impl(sock, context, tls_in_tls, server_hostname)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\util\ssl_.py", line 509, in _ssl_wrap_socket_impl
    return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\ssl.py", line 455, in wrap_socket
    return self.sslsocket_class._create(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\ssl.py", line 1042, in _create
    self.do_handshake()
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\ssl.py", line 1320, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\connectionpool.py", line 793, in urlopen
    response = self._make_request(
               ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\connectionpool.py", line 491, in _make_request
    raise new_e
urllib3.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\requests\adapters.py", line 589, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\connectionpool.py", line 847, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\util\retry.py", line 515, in increment
    raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /openai/whisper-small/resolve/main/config.json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\engine\engine.py", line 935, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, data_root, output_model_path, pass_search_point)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\systems\local.py", line 31, in run_pass
    model = model_config.create_model()
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\model\config\model_config.py", line 109, in create_model
    return cls(**self.config)
           ^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\model\handler\pytorch.py", line 92, in __init__
    hf_model_config = self.get_hf_model_config().to_dict()
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\model\handler\mixin\hf_config.py", line 51, in get_hf_model_config
    return get_hf_model_config(self.get_model_path_or_name(), **self.hf_config.get_loading_args_from_pretrained())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\model\utils\hf_utils.py", line 83, in get_hf_model_config
    return AutoConfig.from_pretrained(model_name, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 934, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\transformers\configuration_utils.py", line 632, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\transformers\configuration_utils.py", line 689, in _get_config_dict
    resolved_config_file = cached_file(
                           ^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\transformers\utils\hub.py", line 399, in cached_file
    resolved_file = hf_hub_download(
                    ^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\file_download.py", line 1221, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\file_download.py", line 1282, in _hf_hub_download_to_cache_dir
    (url_to_download, etag, commit_hash, expected_size, head_call_error) = _get_metadata_or_catch_error(
                                                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\file_download.py", line 1722, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(url=url, proxies=proxies, timeout=etag_timeout, headers=headers)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\file_download.py", line 1645, in get_hf_file_metadata
    r = _request_wrapper(
        ^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\file_download.py", line 372, in _request_wrapper
    response = _request_wrapper(
               ^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\file_download.py", line 395, in _request_wrapper
    response = get_session().request(method=method, url=url, **params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\requests\sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\requests\sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\utils\_http.py", line 66, in send
    return super().send(request, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\requests\adapters.py", line 620, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /openai/whisper-small/resolve/main/config.json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)')))"), '(Request ID: ab1b465d-0436-4989-abf9-f87759832672)')
[2024-05-22 13:22:24,511] [WARNING] [engine.py:358:run_accelerator] Failed to run Olive on cpu-cpu.
Traceback (most recent call last):
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\connectionpool.py", line 467, in _make_request
    self._validate_conn(conn)
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\connectionpool.py", line 1099, in _validate_conn
    conn.connect()
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\connection.py", line 653, in connect
    sock_and_verified = _ssl_wrap_socket_and_match_hostname(
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\connection.py", line 806, in _ssl_wrap_socket_and_match_hostname
    ssl_sock = ssl_wrap_socket(
               ^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\util\ssl_.py", line 465, in ssl_wrap_socket
    ssl_sock = _ssl_wrap_socket_impl(sock, context, tls_in_tls, server_hostname)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\util\ssl_.py", line 509, in _ssl_wrap_socket_impl
    return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\ssl.py", line 455, in wrap_socket
    return self.sslsocket_class._create(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\ssl.py", line 1042, in _create
    self.do_handshake()
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\ssl.py", line 1320, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\connectionpool.py", line 793, in urlopen
    response = self._make_request(
               ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\connectionpool.py", line 491, in _make_request
    raise new_e
urllib3.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\requests\adapters.py", line 589, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\connectionpool.py", line 847, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\urllib3\util\retry.py", line 515, in increment
    raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /openai/whisper-small/resolve/main/config.json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\engine\engine.py", line 337, in run_accelerator
    output_footprint = self.run_no_search(
                       ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\engine\engine.py", line 429, in run_no_search
    should_prune, signal, model_ids = self._run_passes(
                                      ^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\engine\engine.py", line 827, in _run_passes
    model_config, model_id = self._run_pass(
                             ^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\engine\engine.py", line 935, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, data_root, output_model_path, pass_search_point)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\systems\local.py", line 31, in run_pass
    model = model_config.create_model()
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\model\config\model_config.py", line 109, in create_model
    return cls(**self.config)
           ^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\model\handler\pytorch.py", line 92, in __init__
    hf_model_config = self.get_hf_model_config().to_dict()
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\model\handler\mixin\hf_config.py", line 51, in get_hf_model_config
    return get_hf_model_config(self.get_model_path_or_name(), **self.hf_config.get_loading_args_from_pretrained())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\olive\model\utils\hf_utils.py", line 83, in get_hf_model_config
    return AutoConfig.from_pretrained(model_name, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 934, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\transformers\configuration_utils.py", line 632, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\transformers\configuration_utils.py", line 689, in _get_config_dict
    resolved_config_file = cached_file(
                           ^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\transformers\utils\hub.py", line 399, in cached_file
    resolved_file = hf_hub_download(
                    ^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\file_download.py", line 1221, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\file_download.py", line 1282, in _hf_hub_download_to_cache_dir
    (url_to_download, etag, commit_hash, expected_size, head_call_error) = _get_metadata_or_catch_error(
                                                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\file_download.py", line 1722, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(url=url, proxies=proxies, timeout=etag_timeout, headers=headers)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\file_download.py", line 1645, in get_hf_file_metadata
    r = _request_wrapper(
        ^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\file_download.py", line 372, in _request_wrapper
    response = _request_wrapper(
               ^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\file_download.py", line 395, in _request_wrapper
    response = get_session().request(method=method, url=url, **params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\requests\sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\requests\sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\huggingface_hub\utils\_http.py", line 66, in send
    return super().send(request, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\john.coffin\AppData\Local\Programs\Python\Python312\Lib\site-packages\requests\adapters.py", line 620, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /openai/whisper-small/resolve/main/config.json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)')))"), '(Request ID: ab1b465d-0436-4989-abf9-f87759832672)')
[2024-05-22 13:22:24,539] [INFO] [engine.py:280:run] Run history for cpu-cpu:
[2024-05-22 13:22:24,574] [INFO] [engine.py:568:dump_run_history] run history:
+------------+-------------------+-------------+----------------+-----------+
| model_id   | parent_model_id   | from_pass   | duration_sec   | metrics   |
+============+===================+=============+================+===========+
| 359edf01   |                   |             |                |           |
+------------+-------------------+-------------+----------------+-----------+
[2024-05-22 13:22:24,577] [INFO] [engine.py:295:run] No packaging config provided, skip packaging artifacts

C:\dev\Olive\examples\whisper>
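
Based on the traceback, the HEAD request to huggingface.co is failing TLS verification ("unable to get local issuer certificate"). A minimal diagnostic sketch for this class of failure, assuming a corporate proxy or VPN is re-signing TLS traffic (the CA bundle path below is a placeholder, not something from this repo):

import os
import certifi
import requests

URL = "https://huggingface.co/openai/whisper-small/resolve/main/config.json"

print("default CA bundle:", certifi.where())
try:
    # This is essentially the request huggingface_hub makes before downloading.
    requests.head(URL, timeout=10)
    print("certifi's bundle verifies huggingface.co; the problem is elsewhere")
except requests.exceptions.SSLError as err:
    print("verification failed with certifi's bundle:", err)
    # If a corporate root CA is intercepting TLS, export it to a .pem file and
    # point requests (and anything built on it, including huggingface_hub) at it.
    os.environ["REQUESTS_CA_BUNDLE"] = r"C:\certs\corporate-root-ca.pem"  # placeholder path
    print("with corporate bundle:", requests.head(URL, timeout=10).status_code)

If the second request succeeds, setting the same variable in the shell (set REQUESTS_CA_BUNDLE=C:\certs\corporate-root-ca.pem) before re-running olive run --config whisper_cpu_int8.json should let the download inside Olive verify as well.
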
nmetulev commented 1 month ago

Sorry about the issue you're running into. Optimizing your own model isn't ideal, and I'm not really sure what the problem is based on this (the folks on the Olive repo might know). Let me take an action item to look into ways we can avoid requiring you to generate the model yourself in the first place.
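
Until then, one possible stopgap (a sketch, not an official recommendation; it assumes the openai/whisper-small files can be fetched once from a machine or network where TLS verification succeeds) is to pre-populate the local Hugging Face cache and then run Olive with offline mode enabled so the failing download is skipped:

from huggingface_hub import snapshot_download

# Run this once where the download works; it fills the shared Hugging Face
# cache (by default under the user's .cache\huggingface directory).
snapshot_download(repo_id="openai/whisper-small")

# Then, before re-running `olive run --config whisper_cpu_int8.json`, set
# HF_HUB_OFFLINE=1 (and/or TRANSFORMERS_OFFLINE=1) in the shell so
# transformers reads the cached files instead of contacting huggingface.co.
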