Canner / WrenAI

🚀 An open-source SQL AI (Text-to-SQL) Agent that empowers data, product teams to chat with their data. 🤘
https://getwren.ai/oss
GNU Affero General Public License v3.0

Deploying my database model fails with "Failed to deploy. Please check the log for more details." #624

Open Rishav-11 opened 3 months ago

Rishav-11 commented 3 months ago

Describe the bug I am unable to deploy my database model. Every time I click the Deploy button, the message "Failed to deploy. Please check the log for more details." pops up. I am sharing my logs below; please take a look. Because of this, it also cannot generate any answers, and asking a question fails with "Failed to create asking task".

Logs:


2024-08-27 14:00:00 wren-engine-1      | 2024-08-27T08:30:00.984Z INFO main io.wren.main.server.Server ======== SERVER STARTED ========
2024-08-27 14:05:58 wren-engine-1      | 2024-08-27T08:35:58.666Z INFO pool-1-thread-1 io.wren.main.PreviewService Planned SQL: WITH
2024-08-27 14:05:58 wren-engine-1      |   "adventureworks_addresstype" AS (
2024-08-27 14:05:58 wren-engine-1      |    SELECT
2024-08-27 14:05:58 wren-engine-1      |      "adventureworks_addresstype"."AddressTypeID" "AddressTypeID"
2024-08-27 14:05:58 wren-engine-1      |    , "adventureworks_addresstype"."ModifiedDate" "ModifiedDate"
2024-08-27 14:05:58 wren-engine-1      |    , "adventureworks_addresstype"."Name" "Name"
2024-08-27 14:05:58 wren-engine-1      |    , "adventureworks_addresstype"."rowguid" "rowguid"
2024-08-27 14:05:58 wren-engine-1      |    FROM
2024-08-27 14:05:58 wren-engine-1      |      (
2024-08-27 14:05:58 wren-engine-1      |       SELECT
2024-08-27 14:05:58 wren-engine-1      |         "adventureworks_addresstype"."AddressTypeID" "AddressTypeID"
2024-08-27 14:05:58 wren-engine-1      |       , "adventureworks_addresstype"."ModifiedDate" "ModifiedDate"
2024-08-27 14:05:58 wren-engine-1      |       , "adventureworks_addresstype"."Name" "Name"
2024-08-27 14:05:58 wren-engine-1      |       , "adventureworks_addresstype"."rowguid" "rowguid"
2024-08-27 14:05:58 wren-engine-1      |       FROM
2024-08-27 14:05:58 wren-engine-1      |         (
2024-08-27 14:05:58 wren-engine-1      |          SELECT
2024-08-27 14:05:58 wren-engine-1      |            "AddressTypeID" "AddressTypeID"
2024-08-27 14:05:58 wren-engine-1      |          , "ModifiedDate" "ModifiedDate"
2024-08-27 14:05:58 wren-engine-1      |          , "Name" "Name"
2024-08-27 14:05:58 wren-engine-1      |          , "rowguid" "rowguid"
2024-08-27 14:05:58 wren-engine-1      |          FROM
2024-08-27 14:05:58 wren-engine-1      |            "adventureworks"."addresstype" "adventureworks_addresstype"
2024-08-27 14:05:58 wren-engine-1      |       )  "adventureworks_addresstype"
2024-08-27 14:05:58 wren-engine-1      |    )  "adventureworks_addresstype"
2024-08-27 14:05:58 wren-engine-1      | ) 
2024-08-27 14:05:58 wren-engine-1      | SELECT *
2024-08-27 14:05:58 wren-engine-1      | FROM
2024-08-27 14:05:58 wren-engine-1      |   "adventureworks_addresstype"
2024-08-27 14:05:58 wren-engine-1      | 
exited with code 0
bootstrap-1        | 2024-08-22T05:35:03.751699743Z init config.properties
2024-08-22 11:05:03 bootstrap-1        | wren.experimental-enable-dynamic-fields is not set, set it to true
2024-08-22 11:05:03 bootstrap-1        | create mdl folder
2024-08-22 11:05:03 bootstrap-1        | init mdl/sample.json
2024-08-27 14:21:52 wren-ui-1          | [2024-08-27T08:51:52.911] [DEBUG] DeployService - Deploying model, hash: ffbdcf3b8d92e501408591a70718bd5f9d66d3f4
2024-08-27 14:21:52 wren-ui-1          | [2024-08-27T08:51:52.936] [DEBUG] WrenAIAdaptor - Got error when deploying to wren AI, hash: ffbdcf3b8d92e501408591a70718bd5f9d66d3f4. Error: connect ECONNREFUSED 172.18.0.3:5555
2024-08-27 14:28:04 wren-ui-1          | [2024-08-27T08:58:04.960] [DEBUG] DeployService - Deploying model, hash: ffbdcf3b8d92e501408591a70718bd5f9d66d3f4
2024-08-27 14:28:04 wren-ui-1          | [2024-08-27T08:58:04.985] [DEBUG] WrenAIAdaptor - Got error when deploying to wren AI, hash: ffbdcf3b8d92e501408591a70718bd5f9d66d3f4. Error: connect ECONNREFUSED 172.18.0.3:5555
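
The connect ECONNREFUSED 172.18.0.3:5555 entries above mean wren-ui cannot reach the wren AI service on port 5555, which suggests that container is not up. A quick way to confirm (container names assumed from the default docker compose setup):

# is the AI service container running at all?
docker ps --filter "name=wren-ai-service"
# and what did it log before it stopped?
docker logs wrenai-wren-ai-service-1 --tail 100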

To Reproduce Steps to reproduce the behavior:

  1. Go to 'modeling'
  2. Click on 'deploy'
  3. See error

Expected behavior The models should deploy successfully, and after that the app should be able to generate responses.

Desktop (please complete the following information):

Wren AI Information

cyyeh commented 3 months ago

@Rishav-11

Thanks for reaching out! Could you provide us with the service logs? You can collect them with the commands below.

Thank you

docker logs wrenai-wren-ui-1 >& wrenai-wren-ui.log && \
docker logs wrenai-wren-ai-service-1 >& wrenai-wren-ai-service.log && \
docker logs wrenai-wren-engine-1 >& wrenai-wren-engine.log && \
docker logs wrenai-ibis-server-1 >& wrenai-ibis-server.log
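
If the compose project was started under a different name, the container names will differ; the actual names can be listed with:

docker ps --format '{{.Names}}'

and substituted into the commands above.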
Rishav-11 commented 3 months ago

2024-08-22 11:05:05 Waiting for wren-ai-service to start...
2024-08-22 11:05:16 This module is deprecated and will be removed in Hamilton 2.0 Please use `hamilton.async_driver` instead. 
2024-08-22 11:05:21 INFO:     Started server process [7]
2024-08-22 11:05:21 INFO:     Waiting for application startup.
2024-08-22 11:05:21 2024-08-22 05:35:21,039 - wren-ai-service - INFO - Initializing providers... (utils.py:64)
2024-08-22 11:05:24 2024-08-22 05:35:24,863 - wren-ai-service - INFO - Registering provider: openai_embedder (loader.py:66)
2024-08-22 11:05:24 2024-08-22 05:35:24,863 - wren-ai-service - INFO - Registering provider: qdrant (loader.py:66)
2024-08-22 11:05:24 2024-08-22 05:35:24,868 - wren-ai-service - INFO - Registering provider: azure_openai_embedder (loader.py:66)
2024-08-22 11:05:24 2024-08-22 05:35:24,879 - wren-ai-service - INFO - Registering provider: ollama_embedder (loader.py:66)
2024-08-22 11:05:24 2024-08-22 05:35:24,884 - wren-ai-service - INFO - Registering provider: wren_ui (loader.py:66)
2024-08-22 11:05:24 2024-08-22 05:35:24,884 - wren-ai-service - INFO - Registering provider: wren_ibis (loader.py:66)
2024-08-22 11:05:24 2024-08-22 05:35:24,884 - wren-ai-service - INFO - Registering provider: wren_engine (loader.py:66)
2024-08-22 11:05:24 2024-08-22 05:35:24,915 - wren-ai-service - INFO - Registering provider: azure_openai_llm (loader.py:66)
2024-08-22 11:05:24 2024-08-22 05:35:24,928 - wren-ai-service - INFO - Registering provider: ollama_llm (loader.py:66)
2024-08-22 11:05:24 2024-08-22 05:35:24,932 - wren-ai-service - INFO - Registering provider: openai_llm (loader.py:66)
2024-08-22 11:05:24 2024-08-22 05:35:24,932 - wren-ai-service - INFO - Using OpenAILLM provider with API base: https://api.openai.com/v1 (openai.py:135)
2024-08-22 11:05:28 ERROR:    Traceback (most recent call last):
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
2024-08-22 11:05:28     yield
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 233, in handle_request
2024-08-22 11:05:28     resp = self._pool.handle_request(req)
2024-08-22 11:05:28            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
2024-08-22 11:05:28     raise exc from None
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
2024-08-22 11:05:28     response = connection.handle_request(
2024-08-22 11:05:28                ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
2024-08-22 11:05:28     raise exc
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
2024-08-22 11:05:28     stream = self._connect(request)
2024-08-22 11:05:28              ^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 154, in _connect
2024-08-22 11:05:28     stream = stream.start_tls(**kwargs)
2024-08-22 11:05:28              ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/httpcore/_backends/sync.py", line 152, in start_tls
2024-08-22 11:05:28     with map_exceptions(exc_map):
2024-08-22 11:05:28   File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-08-22 11:05:28     self.gen.throw(value)
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
2024-08-22 11:05:28     raise to_exc(exc) from exc
2024-08-22 11:05:28 httpcore.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1000)
2024-08-22 11:05:28 
2024-08-22 11:05:28 The above exception was the direct cause of the following exception:
2024-08-22 11:05:28 
2024-08-22 11:05:28 Traceback (most recent call last):
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 952, in _request
2024-08-22 11:05:28     response = self._client.send(
2024-08-22 11:05:28                ^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 914, in send
2024-08-22 11:05:28     response = self._send_handling_auth(
2024-08-22 11:05:28                ^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 942, in _send_handling_auth
2024-08-22 11:05:28     response = self._send_handling_redirects(
2024-08-22 11:05:28                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
2024-08-22 11:05:28     response = self._send_single_request(request)
2024-08-22 11:05:28                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 1015, in _send_single_request
2024-08-22 11:05:28     response = transport.handle_request(request)
2024-08-22 11:05:28                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 232, in handle_request
2024-08-22 11:05:28     with map_httpcore_exceptions():
2024-08-22 11:05:28   File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-08-22 11:05:28     self.gen.throw(value)
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
2024-08-22 11:05:28     raise mapped_exc(message) from exc
2024-08-22 11:05:28 httpx.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1000)
2024-08-22 11:05:28 
2024-08-22 11:05:28 The above exception was the direct cause of the following exception:
2024-08-22 11:05:28 
2024-08-22 11:05:28 Traceback (most recent call last):
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 734, in lifespan
2024-08-22 11:05:28     async with self.lifespan_context(app) as maybe_state:
2024-08-22 11:05:28   File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
2024-08-22 11:05:28     return await anext(self.gen)
2024-08-22 11:05:28            ^^^^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/src/__main__.py", line 28, in lifespan
2024-08-22 11:05:28     container.init_globals()
2024-08-22 11:05:28   File "/src/globals.py", line 53, in init_globals
2024-08-22 11:05:28     llm_provider, embedder_provider, document_store_provider, engine = init_providers(
2024-08-22 11:05:28                                                                        ^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/src/utils.py", line 67, in init_providers
2024-08-22 11:05:28     llm_provider = loader.get_provider(os.getenv("LLM_PROVIDER", "openai_llm"))()
2024-08-22 11:05:28                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/src/providers/llm/openai.py", line 138, in __init__
2024-08-22 11:05:28     _verify_api_key(self._api_key.resolve_value(), self._api_base)
2024-08-22 11:05:28   File "/src/providers/llm/openai.py", line 129, in _verify_api_key
2024-08-22 11:05:28     OpenAI(api_key=api_key, base_url=api_base).models.list()
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/openai/resources/models.py", line 80, in list
2024-08-22 11:05:28     return self._get_api_list(
2024-08-22 11:05:28            ^^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1289, in get_api_list
2024-08-22 11:05:28     return self._request_api_list(model, page, opts)
2024-08-22 11:05:28            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1134, in _request_api_list
2024-08-22 11:05:28     return self.request(page, options, stream=False)
2024-08-22 11:05:28            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 921, in request
2024-08-22 11:05:28     return self._request(
2024-08-22 11:05:28            ^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 976, in _request
2024-08-22 11:05:28     return self._retry_request(
2024-08-22 11:05:28            ^^^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1053, in _retry_request
2024-08-22 11:05:28     return self._request(
2024-08-22 11:05:28            ^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 976, in _request
2024-08-22 11:05:28     return self._retry_request(
2024-08-22 11:05:28            ^^^^^^^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1053, in _retry_request
2024-08-22 11:05:28     return self._request(
2024-08-22 11:05:28            ^^^^^^^^^^^^^^
2024-08-22 11:05:28   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 986, in _request
2024-08-22 11:05:28     raise APIConnectionError(request=request) from err
2024-08-22 11:05:28 openai.APIConnectionError: Connection error.
2024-08-22 11:05:28 
2024-08-22 11:05:28 ERROR:    Application startup failed. Exiting.
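
The traceback above is the actual root cause of the failed deploys: wren-ai-service exits during startup because certificate verification against api.openai.com fails with a self-signed certificate in the chain, which usually points to an intercepting proxy, VPN, or antivirus re-signing outbound HTTPS with a CA the container does not trust. A minimal probe to confirm this, assuming openssl is available on the host (which normally shares the same outbound network path as the containers):

# "Verify return code: 19 (self-signed certificate in certificate chain)"
# indicates that something on the network path is re-signing TLS traffic
openssl s_client -connect api.openai.com:443 -servername api.openai.com </dev/null | grep "Verify return code"

If that is the case, the usual remedy is either to run from a network that does not intercept TLS, or to make the proxy's root CA available inside the wren-ai-service container (for example by mounting the CA file and pointing a trust-store variable such as SSL_CERT_FILE or REQUESTS_CA_BUNDLE at it, depending on the HTTP stack).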
2024-08-22 13:33:52 This module is deprecated and will be removed in Hamilton 2.0 Please use `hamilton.async_driver` instead. 
2024-08-22 13:33:53 INFO:     Started server process [7]
2024-08-22 13:33:53 INFO:     Waiting for application startup.
2024-08-22 13:33:53 2024-08-22 08:03:53,542 - wren-ai-service - INFO - Initializing providers... (utils.py:64)
2024-08-22 13:33:55 2024-08-22 08:03:55,008 - wren-ai-service - INFO - Registering provider: openai_embedder (loader.py:66)
2024-08-22 13:33:55 2024-08-22 08:03:55,008 - wren-ai-service - INFO - Registering provider: qdrant (loader.py:66)
2024-08-22 13:33:55 2024-08-22 08:03:55,010 - wren-ai-service - INFO - Registering provider: azure_openai_embedder (loader.py:66)
2024-08-22 13:33:55 2024-08-22 08:03:55,014 - wren-ai-service - INFO - Registering provider: ollama_embedder (loader.py:66)
2024-08-22 13:33:55 2024-08-22 08:03:55,016 - wren-ai-service - INFO - Registering provider: wren_ui (loader.py:66)
2024-08-22 13:33:55 2024-08-22 08:03:55,016 - wren-ai-service - INFO - Registering provider: wren_ibis (loader.py:66)
2024-08-22 13:33:55 2024-08-22 08:03:55,016 - wren-ai-service - INFO - Registering provider: wren_engine (loader.py:66)
2024-08-22 13:33:55 2024-08-22 08:03:55,026 - wren-ai-service - INFO - Registering provider: azure_openai_llm (loader.py:66)
2024-08-22 13:33:55 2024-08-22 08:03:55,031 - wren-ai-service - INFO - Registering provider: ollama_llm (loader.py:66)
2024-08-22 13:33:55 2024-08-22 08:03:55,032 - wren-ai-service - INFO - Registering provider: openai_llm (loader.py:66)
2024-08-22 13:33:55 2024-08-22 08:03:55,033 - wren-ai-service - INFO - Using OpenAILLM provider with API base: https://api.openai.com/v1 (openai.py:135)
2024-08-22 13:33:58 ERROR:    Traceback (most recent call last):
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
2024-08-22 13:33:58     yield
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 233, in handle_request
2024-08-22 13:33:58     resp = self._pool.handle_request(req)
2024-08-22 13:33:58            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
2024-08-22 13:33:58     raise exc from None
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
2024-08-22 13:33:58     response = connection.handle_request(
2024-08-22 13:33:58                ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
2024-08-22 13:33:58     raise exc
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
2024-08-22 13:33:58     stream = self._connect(request)
2024-08-22 13:33:58              ^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 154, in _connect
2024-08-22 13:33:58     stream = stream.start_tls(**kwargs)
2024-08-22 13:33:58              ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/httpcore/_backends/sync.py", line 152, in start_tls
2024-08-22 13:33:58     with map_exceptions(exc_map):
2024-08-22 13:33:58   File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-08-22 13:33:58     self.gen.throw(value)
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
2024-08-22 13:33:58     raise to_exc(exc) from exc
2024-08-22 13:33:58 httpcore.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1000)
2024-08-22 13:33:58 
2024-08-22 13:33:58 The above exception was the direct cause of the following exception:
2024-08-22 13:33:58 
2024-08-22 13:33:58 Traceback (most recent call last):
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 952, in _request
2024-08-22 13:33:58     response = self._client.send(
2024-08-22 13:33:58                ^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 914, in send
2024-08-22 13:33:58     response = self._send_handling_auth(
2024-08-22 13:33:58                ^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 942, in _send_handling_auth
2024-08-22 13:33:58     response = self._send_handling_redirects(
2024-08-22 13:33:58                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
2024-08-22 13:33:58     response = self._send_single_request(request)
2024-08-22 13:33:58                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 1015, in _send_single_request
2024-08-22 13:33:58     response = transport.handle_request(request)
2024-08-22 13:33:58                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 232, in handle_request
2024-08-22 13:33:58     with map_httpcore_exceptions():
2024-08-22 13:33:58   File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-08-22 13:33:58     self.gen.throw(value)
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
2024-08-22 13:33:58     raise mapped_exc(message) from exc
2024-08-22 13:33:58 httpx.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1000)
2024-08-22 13:33:58 
2024-08-22 13:33:58 The above exception was the direct cause of the following exception:
2024-08-22 13:33:58 
2024-08-22 13:33:58 Traceback (most recent call last):
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 734, in lifespan
2024-08-22 13:33:58     async with self.lifespan_context(app) as maybe_state:
2024-08-22 13:33:58   File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
2024-08-22 13:33:58     return await anext(self.gen)
2024-08-22 13:33:58            ^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/src/__main__.py", line 28, in lifespan
2024-08-22 13:33:58     container.init_globals()
2024-08-22 13:33:58   File "/src/globals.py", line 53, in init_globals
2024-08-22 13:33:58     llm_provider, embedder_provider, document_store_provider, engine = init_providers(
2024-08-22 13:33:58                                                                        ^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/src/utils.py", line 67, in init_providers
2024-08-22 13:33:58     llm_provider = loader.get_provider(os.getenv("LLM_PROVIDER", "openai_llm"))()
2024-08-22 13:33:58                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/src/providers/llm/openai.py", line 138, in __init__
2024-08-22 13:33:58     _verify_api_key(self._api_key.resolve_value(), self._api_base)
2024-08-22 13:33:58   File "/src/providers/llm/openai.py", line 129, in _verify_api_key
2024-08-22 13:33:58     OpenAI(api_key=api_key, base_url=api_base).models.list()
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/openai/resources/models.py", line 80, in list
2024-08-22 13:33:58     return self._get_api_list(
2024-08-22 13:33:58            ^^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1289, in get_api_list
2024-08-22 13:33:58     return self._request_api_list(model, page, opts)
2024-08-22 13:33:58            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1134, in _request_api_list
2024-08-22 13:33:58     return self.request(page, options, stream=False)
2024-08-22 13:33:58            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 921, in request
2024-08-22 13:33:58     return self._request(
2024-08-22 13:33:58            ^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 976, in _request
2024-08-22 13:33:58     return self._retry_request(
2024-08-22 13:33:58            ^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1053, in _retry_request
2024-08-22 13:33:58     return self._request(
2024-08-22 13:33:58            ^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 976, in _request
2024-08-22 13:33:58     return self._retry_request(
2024-08-22 13:33:58            ^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1053, in _retry_request
2024-08-22 13:33:58     return self._request(
2024-08-22 13:33:58            ^^^^^^^^^^^^^^
2024-08-22 13:33:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 986, in _request
2024-08-22 13:33:58     raise APIConnectionError(request=request) from err
2024-08-22 13:33:58 openai.APIConnectionError: Connection error.
2024-08-22 13:33:58 
2024-08-22 13:33:58 ERROR:    Application startup failed. Exiting.
2024-08-22 13:34:53 This module is deprecated and will be removed in Hamilton 2.0 Please use `hamilton.async_driver` instead. 
2024-08-22 13:34:54 INFO:     Started server process [7]
2024-08-22 13:34:54 INFO:     Waiting for application startup.
2024-08-22 13:34:54 2024-08-22 08:04:54,413 - wren-ai-service - INFO - Initializing providers... (utils.py:64)
2024-08-22 13:34:55 2024-08-22 08:04:55,621 - wren-ai-service - INFO - Registering provider: openai_embedder (loader.py:66)
2024-08-22 13:34:55 2024-08-22 08:04:55,621 - wren-ai-service - INFO - Registering provider: qdrant (loader.py:66)
2024-08-22 13:34:55 2024-08-22 08:04:55,622 - wren-ai-service - INFO - Registering provider: azure_openai_embedder (loader.py:66)
2024-08-22 13:34:55 2024-08-22 08:04:55,623 - wren-ai-service - INFO - Registering provider: ollama_embedder (loader.py:66)
2024-08-22 13:34:55 2024-08-22 08:04:55,623 - wren-ai-service - INFO - Registering provider: wren_ui (loader.py:66)
2024-08-22 13:34:55 2024-08-22 08:04:55,624 - wren-ai-service - INFO - Registering provider: wren_ibis (loader.py:66)
2024-08-22 13:34:55 2024-08-22 08:04:55,624 - wren-ai-service - INFO - Registering provider: wren_engine (loader.py:66)
2024-08-22 13:34:55 2024-08-22 08:04:55,628 - wren-ai-service - INFO - Registering provider: azure_openai_llm (loader.py:66)
2024-08-22 13:34:55 2024-08-22 08:04:55,629 - wren-ai-service - INFO - Registering provider: ollama_llm (loader.py:66)
2024-08-22 13:34:55 2024-08-22 08:04:55,630 - wren-ai-service - INFO - Registering provider: openai_llm (loader.py:66)
2024-08-22 13:34:55 2024-08-22 08:04:55,630 - wren-ai-service - INFO - Using OpenAILLM provider with API base: https://api.openai.com/v1 (openai.py:135)
2024-08-22 13:34:58 ERROR:    Traceback (most recent call last):
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
2024-08-22 13:34:58     yield
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 233, in handle_request
2024-08-22 13:34:58     resp = self._pool.handle_request(req)
2024-08-22 13:34:58            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
2024-08-22 13:34:58     raise exc from None
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
2024-08-22 13:34:58     response = connection.handle_request(
2024-08-22 13:34:58                ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
2024-08-22 13:34:58     raise exc
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
2024-08-22 13:34:58     stream = self._connect(request)
2024-08-22 13:34:58              ^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 154, in _connect
2024-08-22 13:34:58     stream = stream.start_tls(**kwargs)
2024-08-22 13:34:58              ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/httpcore/_backends/sync.py", line 152, in start_tls
2024-08-22 13:34:58     with map_exceptions(exc_map):
2024-08-22 13:34:58   File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-08-22 13:34:58     self.gen.throw(value)
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
2024-08-22 13:34:58     raise to_exc(exc) from exc
2024-08-22 13:34:58 httpcore.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1000)
2024-08-22 13:34:58 
2024-08-22 13:34:58 The above exception was the direct cause of the following exception:
2024-08-22 13:34:58 
2024-08-22 13:34:58 Traceback (most recent call last):
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 952, in _request
2024-08-22 13:34:58     response = self._client.send(
2024-08-22 13:34:58                ^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 914, in send
2024-08-22 13:34:58     response = self._send_handling_auth(
2024-08-22 13:34:58                ^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 942, in _send_handling_auth
2024-08-22 13:34:58     response = self._send_handling_redirects(
2024-08-22 13:34:58                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
2024-08-22 13:34:58     response = self._send_single_request(request)
2024-08-22 13:34:58                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 1015, in _send_single_request
2024-08-22 13:34:58     response = transport.handle_request(request)
2024-08-22 13:34:58                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 232, in handle_request
2024-08-22 13:34:58     with map_httpcore_exceptions():
2024-08-22 13:34:58   File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-08-22 13:34:58     self.gen.throw(value)
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
2024-08-22 13:34:58     raise mapped_exc(message) from exc
2024-08-22 13:34:58 httpx.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1000)
2024-08-22 13:34:58 
2024-08-22 13:34:58 The above exception was the direct cause of the following exception:
2024-08-22 13:34:58 
2024-08-22 13:34:58 Traceback (most recent call last):
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 734, in lifespan
2024-08-22 13:34:58     async with self.lifespan_context(app) as maybe_state:
2024-08-22 13:34:58   File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
2024-08-22 13:34:58     return await anext(self.gen)
2024-08-22 13:34:58            ^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/src/__main__.py", line 28, in lifespan
2024-08-22 13:34:58     container.init_globals()
2024-08-22 13:34:58   File "/src/globals.py", line 53, in init_globals
2024-08-22 13:34:58     llm_provider, embedder_provider, document_store_provider, engine = init_providers(
2024-08-22 13:34:58                                                                        ^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/src/utils.py", line 67, in init_providers
2024-08-22 13:34:58     llm_provider = loader.get_provider(os.getenv("LLM_PROVIDER", "openai_llm"))()
2024-08-22 13:34:58                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/src/providers/llm/openai.py", line 138, in __init__
2024-08-22 13:34:58     _verify_api_key(self._api_key.resolve_value(), self._api_base)
2024-08-22 13:34:58   File "/src/providers/llm/openai.py", line 129, in _verify_api_key
2024-08-22 13:34:58     OpenAI(api_key=api_key, base_url=api_base).models.list()
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/openai/resources/models.py", line 80, in list
2024-08-22 13:34:58     return self._get_api_list(
2024-08-22 13:34:58            ^^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1289, in get_api_list
2024-08-22 13:34:58     return self._request_api_list(model, page, opts)
2024-08-22 13:34:58            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1134, in _request_api_list
2024-08-22 13:34:58     return self.request(page, options, stream=False)
2024-08-22 13:34:58            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 921, in request
2024-08-22 13:34:58     return self._request(
2024-08-22 13:34:58            ^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 976, in _request
2024-08-22 13:34:58     return self._retry_request(
2024-08-22 13:34:58            ^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1053, in _retry_request
2024-08-22 13:34:58     return self._request(
2024-08-22 13:34:58            ^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 976, in _request
2024-08-22 13:34:58     return self._retry_request(
2024-08-22 13:34:58            ^^^^^^^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1053, in _retry_request
2024-08-22 13:34:58     return self._request(
2024-08-22 13:34:58            ^^^^^^^^^^^^^^
2024-08-22 13:34:58   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 986, in _request
2024-08-22 13:34:58     raise APIConnectionError(request=request) from err
2024-08-22 13:34:58 openai.APIConnectionError: Connection error.
2024-08-22 13:34:58 
2024-08-22 13:34:58 ERROR:    Application startup failed. Exiting.
2024-08-27 09:40:18 This module is deprecated and will be removed in Hamilton 2.0 Please use `hamilton.async_driver` instead. 
2024-08-27 09:40:20 INFO:     Started server process [7]
2024-08-27 09:40:20 INFO:     Waiting for application startup.
2024-08-27 09:40:20 2024-08-27 04:10:20,187 - wren-ai-service - INFO - Initializing providers... (utils.py:64)
2024-08-27 09:40:22 2024-08-27 04:10:22,777 - wren-ai-service - INFO - Registering provider: openai_embedder (loader.py:66)
2024-08-27 09:40:22 2024-08-27 04:10:22,778 - wren-ai-service - INFO - Registering provider: qdrant (loader.py:66)
2024-08-27 09:40:22 2024-08-27 04:10:22,779 - wren-ai-service - INFO - Registering provider: azure_openai_embedder (loader.py:66)
2024-08-27 09:40:22 2024-08-27 04:10:22,784 - wren-ai-service - INFO - Registering provider: ollama_embedder (loader.py:66)
2024-08-27 09:40:22 2024-08-27 04:10:22,786 - wren-ai-service - INFO - Registering provider: wren_ui (loader.py:66)
2024-08-27 09:40:22 2024-08-27 04:10:22,786 - wren-ai-service - INFO - Registering provider: wren_ibis (loader.py:66)
2024-08-27 09:40:22 2024-08-27 04:10:22,786 - wren-ai-service - INFO - Registering provider: wren_engine (loader.py:66)
2024-08-27 09:40:22 2024-08-27 04:10:22,795 - wren-ai-service - INFO - Registering provider: azure_openai_llm (loader.py:66)
2024-08-27 09:40:22 2024-08-27 04:10:22,799 - wren-ai-service - INFO - Registering provider: ollama_llm (loader.py:66)
2024-08-27 09:40:22 2024-08-27 04:10:22,800 - wren-ai-service - INFO - Registering provider: openai_llm (loader.py:66)
2024-08-27 09:40:22 2024-08-27 04:10:22,800 - wren-ai-service - INFO - Using OpenAILLM provider with API base: https://api.openai.com/v1 (openai.py:135)
2024-08-27 09:40:25 ERROR:    Traceback (most recent call last):
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
2024-08-27 09:40:25     yield
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 233, in handle_request
2024-08-27 09:40:25     resp = self._pool.handle_request(req)
2024-08-27 09:40:25            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
2024-08-27 09:40:25     raise exc from None
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
2024-08-27 09:40:25     response = connection.handle_request(
2024-08-27 09:40:25                ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
2024-08-27 09:40:25     raise exc
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
2024-08-27 09:40:25     stream = self._connect(request)
2024-08-27 09:40:25              ^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 154, in _connect
2024-08-27 09:40:25     stream = stream.start_tls(**kwargs)
2024-08-27 09:40:25              ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/httpcore/_backends/sync.py", line 152, in start_tls
2024-08-27 09:40:25     with map_exceptions(exc_map):
2024-08-27 09:40:25   File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-08-27 09:40:25     self.gen.throw(value)
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
2024-08-27 09:40:25     raise to_exc(exc) from exc
2024-08-27 09:40:25 httpcore.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1000)
2024-08-27 09:40:25 
2024-08-27 09:40:25 The above exception was the direct cause of the following exception:
2024-08-27 09:40:25 
2024-08-27 09:40:25 Traceback (most recent call last):
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 952, in _request
2024-08-27 09:40:25     response = self._client.send(
2024-08-27 09:40:25                ^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 914, in send
2024-08-27 09:40:25     response = self._send_handling_auth(
2024-08-27 09:40:25                ^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 942, in _send_handling_auth
2024-08-27 09:40:25     response = self._send_handling_redirects(
2024-08-27 09:40:25                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
2024-08-27 09:40:25     response = self._send_single_request(request)
2024-08-27 09:40:25                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 1015, in _send_single_request
2024-08-27 09:40:25     response = transport.handle_request(request)
2024-08-27 09:40:25                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 232, in handle_request
2024-08-27 09:40:25     with map_httpcore_exceptions():
2024-08-27 09:40:25   File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-08-27 09:40:25     self.gen.throw(value)
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
2024-08-27 09:40:25     raise mapped_exc(message) from exc
2024-08-27 09:40:25 httpx.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1000)
2024-08-27 09:40:25 
2024-08-27 09:40:25 The above exception was the direct cause of the following exception:
2024-08-27 09:40:25 
2024-08-27 09:40:25 Traceback (most recent call last):
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 734, in lifespan
2024-08-27 09:40:25     async with self.lifespan_context(app) as maybe_state:
2024-08-27 09:40:25   File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
2024-08-27 09:40:25     return await anext(self.gen)
2024-08-27 09:40:25            ^^^^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/src/__main__.py", line 28, in lifespan
2024-08-27 09:40:25     container.init_globals()
2024-08-27 09:40:25   File "/src/globals.py", line 53, in init_globals
2024-08-27 09:40:25     llm_provider, embedder_provider, document_store_provider, engine = init_providers(
2024-08-27 09:40:25                                                                        ^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/src/utils.py", line 67, in init_providers
2024-08-27 09:40:25     llm_provider = loader.get_provider(os.getenv("LLM_PROVIDER", "openai_llm"))()
2024-08-27 09:40:25                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/src/providers/llm/openai.py", line 138, in __init__
2024-08-27 09:40:25     _verify_api_key(self._api_key.resolve_value(), self._api_base)
2024-08-27 09:40:25   File "/src/providers/llm/openai.py", line 129, in _verify_api_key
2024-08-27 09:40:25     OpenAI(api_key=api_key, base_url=api_base).models.list()
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/openai/resources/models.py", line 80, in list
2024-08-27 09:40:25     return self._get_api_list(
2024-08-27 09:40:25            ^^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1289, in get_api_list
2024-08-27 09:40:25     return self._request_api_list(model, page, opts)
2024-08-27 09:40:25            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1134, in _request_api_list
2024-08-27 09:40:25     return self.request(page, options, stream=False)
2024-08-27 09:40:25            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 921, in request
2024-08-27 09:40:25     return self._request(
2024-08-27 09:40:25            ^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 976, in _request
2024-08-27 09:40:25     return self._retry_request(
2024-08-27 09:40:25            ^^^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1053, in _retry_request
2024-08-27 09:40:25     return self._request(
2024-08-27 09:40:25            ^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 976, in _request
2024-08-27 09:40:25     return self._retry_request(
2024-08-27 09:40:25            ^^^^^^^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1053, in _retry_request
2024-08-27 09:40:25     return self._request(
2024-08-27 09:40:25            ^^^^^^^^^^^^^^
2024-08-27 09:40:25   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 986, in _request
2024-08-27 09:40:25     raise APIConnectionError(request=request) from err
2024-08-27 09:40:25 openai.APIConnectionError: Connection error.
2024-08-27 09:40:25 
2024-08-27 09:40:25 ERROR:    Application startup failed. Exiting.
2024-08-27 13:06:27 This module is deprecated and will be removed in Hamilton 2.0 Please use `hamilton.async_driver` instead. 
2024-08-27 13:06:28 INFO:     Started server process [7]
2024-08-27 13:06:28 INFO:     Waiting for application startup.
2024-08-27 13:06:28 2024-08-27 07:36:28,829 - wren-ai-service - INFO - Initializing providers... (utils.py:64)
2024-08-27 13:06:30 2024-08-27 07:36:30,549 - wren-ai-service - INFO - Registering provider: openai_embedder (loader.py:66)
2024-08-27 13:06:30 2024-08-27 07:36:30,549 - wren-ai-service - INFO - Registering provider: qdrant (loader.py:66)
2024-08-27 13:06:30 2024-08-27 07:36:30,551 - wren-ai-service - INFO - Registering provider: azure_openai_embedder (loader.py:66)
2024-08-27 13:06:30 2024-08-27 07:36:30,563 - wren-ai-service - INFO - Registering provider: ollama_embedder (loader.py:66)
2024-08-27 13:06:30 2024-08-27 07:36:30,567 - wren-ai-service - INFO - Registering provider: wren_ui (loader.py:66)
2024-08-27 13:06:30 2024-08-27 07:36:30,567 - wren-ai-service - INFO - Registering provider: wren_ibis (loader.py:66)
2024-08-27 13:06:30 2024-08-27 07:36:30,567 - wren-ai-service - INFO - Registering provider: wren_engine (loader.py:66)
2024-08-27 13:06:30 2024-08-27 07:36:30,581 - wren-ai-service - INFO - Registering provider: azure_openai_llm (loader.py:66)
2024-08-27 13:06:30 2024-08-27 07:36:30,589 - wren-ai-service - INFO - Registering provider: ollama_llm (loader.py:66)
2024-08-27 13:06:30 2024-08-27 07:36:30,590 - wren-ai-service - INFO - Registering provider: openai_llm (loader.py:66)
2024-08-27 13:06:30 2024-08-27 07:36:30,590 - wren-ai-service - INFO - Using OpenAILLM provider with API base: https://api.openai.com/v1 (openai.py:135)
2024-08-27 13:06:33 ERROR:    Traceback (most recent call last):
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
2024-08-27 13:06:33     yield
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 233, in handle_request
2024-08-27 13:06:33     resp = self._pool.handle_request(req)
2024-08-27 13:06:33            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
2024-08-27 13:06:33     raise exc from None
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
2024-08-27 13:06:33     response = connection.handle_request(
2024-08-27 13:06:33                ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
2024-08-27 13:06:33     raise exc
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
2024-08-27 13:06:33     stream = self._connect(request)
2024-08-27 13:06:33              ^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 154, in _connect
2024-08-27 13:06:33     stream = stream.start_tls(**kwargs)
2024-08-27 13:06:33              ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/httpcore/_backends/sync.py", line 152, in start_tls
2024-08-27 13:06:33     with map_exceptions(exc_map):
2024-08-27 13:06:33   File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-08-27 13:06:33     self.gen.throw(value)
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
2024-08-27 13:06:33     raise to_exc(exc) from exc
2024-08-27 13:06:33 httpcore.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1000)
2024-08-27 13:06:33 
2024-08-27 13:06:33 The above exception was the direct cause of the following exception:
2024-08-27 13:06:33 
2024-08-27 13:06:33 Traceback (most recent call last):
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 952, in _request
2024-08-27 13:06:33     response = self._client.send(
2024-08-27 13:06:33                ^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 914, in send
2024-08-27 13:06:33     response = self._send_handling_auth(
2024-08-27 13:06:33                ^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 942, in _send_handling_auth
2024-08-27 13:06:33     response = self._send_handling_redirects(
2024-08-27 13:06:33                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
2024-08-27 13:06:33     response = self._send_single_request(request)
2024-08-27 13:06:33                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 1015, in _send_single_request
2024-08-27 13:06:33     response = transport.handle_request(request)
2024-08-27 13:06:33                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 232, in handle_request
2024-08-27 13:06:33     with map_httpcore_exceptions():
2024-08-27 13:06:33   File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-08-27 13:06:33     self.gen.throw(value)
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
2024-08-27 13:06:33     raise mapped_exc(message) from exc
2024-08-27 13:06:33 httpx.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1000)
2024-08-27 13:06:33 
2024-08-27 13:06:33 The above exception was the direct cause of the following exception:
2024-08-27 13:06:33 
2024-08-27 13:06:33 Traceback (most recent call last):
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 734, in lifespan
2024-08-27 13:06:33     async with self.lifespan_context(app) as maybe_state:
2024-08-27 13:06:33   File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
2024-08-27 13:06:33     return await anext(self.gen)
2024-08-27 13:06:33            ^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/src/__main__.py", line 28, in lifespan
2024-08-27 13:06:33     container.init_globals()
2024-08-27 13:06:33   File "/src/globals.py", line 53, in init_globals
2024-08-27 13:06:33     llm_provider, embedder_provider, document_store_provider, engine = init_providers(
2024-08-27 13:06:33                                                                        ^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/src/utils.py", line 67, in init_providers
2024-08-27 13:06:33     llm_provider = loader.get_provider(os.getenv("LLM_PROVIDER", "openai_llm"))()
2024-08-27 13:06:33                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/src/providers/llm/openai.py", line 138, in __init__
2024-08-27 13:06:33     _verify_api_key(self._api_key.resolve_value(), self._api_base)
2024-08-27 13:06:33   File "/src/providers/llm/openai.py", line 129, in _verify_api_key
2024-08-27 13:06:33     OpenAI(api_key=api_key, base_url=api_base).models.list()
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/openai/resources/models.py", line 80, in list
2024-08-27 13:06:33     return self._get_api_list(
2024-08-27 13:06:33            ^^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1289, in get_api_list
2024-08-27 13:06:33     return self._request_api_list(model, page, opts)
2024-08-27 13:06:33            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1134, in _request_api_list
2024-08-27 13:06:33     return self.request(page, options, stream=False)
2024-08-27 13:06:33            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 921, in request
2024-08-27 13:06:33     return self._request(
2024-08-27 13:06:33            ^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 976, in _request
2024-08-27 13:06:33     return self._retry_request(
2024-08-27 13:06:33            ^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1053, in _retry_request
2024-08-27 13:06:33     return self._request(
2024-08-27 13:06:33            ^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 976, in _request
2024-08-27 13:06:33     return self._retry_request(
2024-08-27 13:06:33            ^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1053, in _retry_request
2024-08-27 13:06:33     return self._request(
2024-08-27 13:06:33            ^^^^^^^^^^^^^^
2024-08-27 13:06:33   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 986, in _request
2024-08-27 13:06:33     raise APIConnectionError(request=request) from err
2024-08-27 13:06:33 openai.APIConnectionError: Connection error.
2024-08-27 13:06:33 
2024-08-27 13:06:33 ERROR:    Application startup failed. Exiting.
2024-08-27 13:59:36 This module is deprecated and will be removed in Hamilton 2.0 Please use `hamilton.async_driver` instead. 
2024-08-27 13:59:37 INFO:     Started server process [7]
2024-08-27 13:59:37 INFO:     Waiting for application startup.
2024-08-27 13:59:37 2024-08-27 08:29:37,492 - wren-ai-service - INFO - Initializing providers... (utils.py:64)
2024-08-27 13:59:39 2024-08-27 08:29:39,103 - wren-ai-service - INFO - Registering provider: openai_embedder (loader.py:66)
2024-08-27 13:59:39 2024-08-27 08:29:39,104 - wren-ai-service - INFO - Registering provider: qdrant (loader.py:66)
2024-08-27 13:59:39 2024-08-27 08:29:39,105 - wren-ai-service - INFO - Registering provider: azure_openai_embedder (loader.py:66)
2024-08-27 13:59:39 2024-08-27 08:29:39,112 - wren-ai-service - INFO - Registering provider: ollama_embedder (loader.py:66)
2024-08-27 13:59:39 2024-08-27 08:29:39,116 - wren-ai-service - INFO - Registering provider: wren_ui (loader.py:66)
2024-08-27 13:59:39 2024-08-27 08:29:39,116 - wren-ai-service - INFO - Registering provider: wren_ibis (loader.py:66)
2024-08-27 13:59:39 2024-08-27 08:29:39,116 - wren-ai-service - INFO - Registering provider: wren_engine (loader.py:66)
2024-08-27 13:59:39 2024-08-27 08:29:39,128 - wren-ai-service - INFO - Registering provider: azure_openai_llm (loader.py:66)
2024-08-27 13:59:39 2024-08-27 08:29:39,136 - wren-ai-service - INFO - Registering provider: ollama_llm (loader.py:66)
2024-08-27 13:59:39 2024-08-27 08:29:39,137 - wren-ai-service - INFO - Registering provider: openai_llm (loader.py:66)
2024-08-27 13:59:39 2024-08-27 08:29:39,137 - wren-ai-service - INFO - Using OpenAILLM provider with API base: https://api.openai.com/v1 (openai.py:135)
2024-08-27 13:59:42 ERROR:    Traceback (most recent call last):
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
2024-08-27 13:59:42     yield
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 233, in handle_request
2024-08-27 13:59:42     resp = self._pool.handle_request(req)
2024-08-27 13:59:42            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
2024-08-27 13:59:42     raise exc from None
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
2024-08-27 13:59:42     response = connection.handle_request(
2024-08-27 13:59:42                ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
2024-08-27 13:59:42     raise exc
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
2024-08-27 13:59:42     stream = self._connect(request)
2024-08-27 13:59:42              ^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 154, in _connect
2024-08-27 13:59:42     stream = stream.start_tls(**kwargs)
2024-08-27 13:59:42              ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/httpcore/_backends/sync.py", line 152, in start_tls
2024-08-27 13:59:42     with map_exceptions(exc_map):
2024-08-27 13:59:42   File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-08-27 13:59:42     self.gen.throw(value)
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
2024-08-27 13:59:42     raise to_exc(exc) from exc
2024-08-27 13:59:42 httpcore.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1000)
2024-08-27 13:59:42 
2024-08-27 13:59:42 The above exception was the direct cause of the following exception:
2024-08-27 13:59:42 
2024-08-27 13:59:42 Traceback (most recent call last):
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 952, in _request
2024-08-27 13:59:42     response = self._client.send(
2024-08-27 13:59:42                ^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 914, in send
2024-08-27 13:59:42     response = self._send_handling_auth(
2024-08-27 13:59:42                ^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 942, in _send_handling_auth
2024-08-27 13:59:42     response = self._send_handling_redirects(
2024-08-27 13:59:42                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
2024-08-27 13:59:42     response = self._send_single_request(request)
2024-08-27 13:59:42                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 1015, in _send_single_request
2024-08-27 13:59:42     response = transport.handle_request(request)
2024-08-27 13:59:42                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 232, in handle_request
2024-08-27 13:59:42     with map_httpcore_exceptions():
2024-08-27 13:59:42   File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-08-27 13:59:42     self.gen.throw(value)
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
2024-08-27 13:59:42     raise mapped_exc(message) from exc
2024-08-27 13:59:42 httpx.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1000)
2024-08-27 13:59:42 
2024-08-27 13:59:42 The above exception was the direct cause of the following exception:
2024-08-27 13:59:42 
2024-08-27 13:59:42 Traceback (most recent call last):
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 734, in lifespan
2024-08-27 13:59:42     async with self.lifespan_context(app) as maybe_state:
2024-08-27 13:59:42   File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
2024-08-27 13:59:42     return await anext(self.gen)
2024-08-27 13:59:42            ^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/src/__main__.py", line 28, in lifespan
2024-08-27 13:59:42     container.init_globals()
2024-08-27 13:59:42   File "/src/globals.py", line 53, in init_globals
2024-08-27 13:59:42     llm_provider, embedder_provider, document_store_provider, engine = init_providers(
2024-08-27 13:59:42                                                                        ^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/src/utils.py", line 67, in init_providers
2024-08-27 13:59:42     llm_provider = loader.get_provider(os.getenv("LLM_PROVIDER", "openai_llm"))()
2024-08-27 13:59:42                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/src/providers/llm/openai.py", line 138, in __init__
2024-08-27 13:59:42     _verify_api_key(self._api_key.resolve_value(), self._api_base)
2024-08-27 13:59:42   File "/src/providers/llm/openai.py", line 129, in _verify_api_key
2024-08-27 13:59:42     OpenAI(api_key=api_key, base_url=api_base).models.list()
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/openai/resources/models.py", line 80, in list
2024-08-27 13:59:42     return self._get_api_list(
2024-08-27 13:59:42            ^^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1289, in get_api_list
2024-08-27 13:59:42     return self._request_api_list(model, page, opts)
2024-08-27 13:59:42            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1134, in _request_api_list
2024-08-27 13:59:42     return self.request(page, options, stream=False)
2024-08-27 13:59:42            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 921, in request
2024-08-27 13:59:42     return self._request(
2024-08-27 13:59:42            ^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 976, in _request
2024-08-27 13:59:42     return self._retry_request(
2024-08-27 13:59:42            ^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1053, in _retry_request
2024-08-27 13:59:42     return self._request(
2024-08-27 13:59:42            ^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 976, in _request
2024-08-27 13:59:42     return self._retry_request(
2024-08-27 13:59:42            ^^^^^^^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1053, in _retry_request
2024-08-27 13:59:42     return self._request(
2024-08-27 13:59:42            ^^^^^^^^^^^^^^
2024-08-27 13:59:42   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 986, in _request
2024-08-27 13:59:42     raise APIConnectionError(request=request) from err
2024-08-27 13:59:42 openai.APIConnectionError: Connection error.
2024-08-27 13:59:42 
2024-08-27 13:59:42 ERROR:    Application startup failed. Exiting.

This is the service log

cyyeh commented 3 months ago

@Rishav-11

Please try the following steps to check whether this fixes the issue, and let me know the testing results (the .env change from step 6 is shown right after the list). Thank you

  1. turn off all of Wren AI's related services
  2. open PowerShell
  3. in PowerShell: cd ~/.wrenai
  4. in PowerShell: mv .env .env.txt
  5. in PowerShell: notepad .env.txt
  6. in notepad: replace WREN_AI_SERVICE_VERSION=0.8.0 with WREN_AI_SERVICE_VERSION=hotfix-openai-v1, then save and close the file
  7. in PowerShell: mv .env.txt .env
  8. in PowerShell: docker-compose --env-file .env up -d
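
For clarity, the only edit step 6 asks for is this single line in `~/.wrenai/.env` (shown before and after; the rest of the file stays unchanged):

```
# before
WREN_AI_SERVICE_VERSION=0.8.0

# after
WREN_AI_SERVICE_VERSION=hotfix-openai-v1
```
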
Rishav-11 commented 3 months ago

service log:-

2024-08-28 11:04:53 Waiting for wren-ai-service to start...
2024-08-28 11:05:00 This module is deprecated and will be removed in Hamilton 2.0 Please use `hamilton.async_driver` instead. 
2024-08-28 11:05:04 INFO:     Started server process [7]
2024-08-28 11:05:04 INFO:     Waiting for application startup.
2024-08-28 11:05:04 2024-08-28 05:35:04,428 - wren-ai-service - INFO - Initializing providers... (utils.py:64)
2024-08-28 11:05:07 2024-08-28 05:35:07,996 - wren-ai-service - INFO - Registering provider: openai_embedder (loader.py:66)
2024-08-28 11:05:07 2024-08-28 05:35:07,997 - wren-ai-service - INFO - Registering provider: qdrant (loader.py:66)
2024-08-28 11:05:08 2024-08-28 05:35:08,000 - wren-ai-service - INFO - Registering provider: azure_openai_embedder (loader.py:66)
2024-08-28 11:05:08 2024-08-28 05:35:08,009 - wren-ai-service - INFO - Registering provider: ollama_embedder (loader.py:66)
2024-08-28 11:05:08 2024-08-28 05:35:08,013 - wren-ai-service - INFO - Registering provider: wren_ui (loader.py:66)
2024-08-28 11:05:08 2024-08-28 05:35:08,013 - wren-ai-service - INFO - Registering provider: wren_ibis (loader.py:66)
2024-08-28 11:05:08 2024-08-28 05:35:08,013 - wren-ai-service - INFO - Registering provider: wren_engine (loader.py:66)
2024-08-28 11:05:08 2024-08-28 05:35:08,036 - wren-ai-service - INFO - Registering provider: azure_openai_llm (loader.py:66)
2024-08-28 11:05:08 2024-08-28 05:35:08,044 - wren-ai-service - INFO - Registering provider: ollama_llm (loader.py:66)
2024-08-28 11:05:08 2024-08-28 05:35:08,047 - wren-ai-service - INFO - Registering provider: openai_llm (loader.py:66)
2024-08-28 11:05:08 2024-08-28 05:35:08,047 - wren-ai-service - INFO - Using OpenAILLM provider with API base: https://api.openai.com/v1 (openai.py:135)
2024-08-28 11:05:11 ERROR:    Traceback (most recent call last):
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
2024-08-28 11:05:11     yield
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 233, in handle_request
2024-08-28 11:05:11     resp = self._pool.handle_request(req)
2024-08-28 11:05:11            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 216, in handle_request
2024-08-28 11:05:11     raise exc from None
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 196, in handle_request
2024-08-28 11:05:11     response = connection.handle_request(
2024-08-28 11:05:11                ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
2024-08-28 11:05:11     raise exc
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
2024-08-28 11:05:11     stream = self._connect(request)
2024-08-28 11:05:11              ^^^^^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 154, in _connect
2024-08-28 11:05:11     stream = stream.start_tls(**kwargs)
2024-08-28 11:05:11              ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/httpcore/_backends/sync.py", line 152, in start_tls
2024-08-28 11:05:11     with map_exceptions(exc_map):
2024-08-28 11:05:11   File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-08-28 11:05:11     self.gen.throw(value)
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
2024-08-28 11:05:11     raise to_exc(exc) from exc
2024-08-28 11:05:11 httpcore.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1000)
2024-08-28 11:05:11 
2024-08-28 11:05:11 The above exception was the direct cause of the following exception:
2024-08-28 11:05:11 
2024-08-28 11:05:11 Traceback (most recent call last):
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 973, in _request
2024-08-28 11:05:11     response = self._client.send(
2024-08-28 11:05:11                ^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 914, in send
2024-08-28 11:05:11     response = self._send_handling_auth(
2024-08-28 11:05:11                ^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 942, in _send_handling_auth
2024-08-28 11:05:11     response = self._send_handling_redirects(
2024-08-28 11:05:11                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
2024-08-28 11:05:11     response = self._send_single_request(request)
2024-08-28 11:05:11                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 1015, in _send_single_request
2024-08-28 11:05:11     response = transport.handle_request(request)
2024-08-28 11:05:11                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 232, in handle_request
2024-08-28 11:05:11     with map_httpcore_exceptions():
2024-08-28 11:05:11   File "/usr/local/lib/python3.12/contextlib.py", line 155, in __exit__
2024-08-28 11:05:11     self.gen.throw(value)
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
2024-08-28 11:05:11     raise mapped_exc(message) from exc
2024-08-28 11:05:11 httpx.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1000)
2024-08-28 11:05:11 
2024-08-28 11:05:11 The above exception was the direct cause of the following exception:
2024-08-28 11:05:11 
2024-08-28 11:05:11 Traceback (most recent call last):
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 734, in lifespan
2024-08-28 11:05:11     async with self.lifespan_context(app) as maybe_state:
2024-08-28 11:05:11   File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
2024-08-28 11:05:11     return await anext(self.gen)
2024-08-28 11:05:11            ^^^^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/src/__main__.py", line 28, in lifespan
2024-08-28 11:05:11     container.init_globals()
2024-08-28 11:05:11   File "/src/globals.py", line 53, in init_globals
2024-08-28 11:05:11     llm_provider, embedder_provider, document_store_provider, engine = init_providers(
2024-08-28 11:05:11                                                                        ^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/src/utils.py", line 67, in init_providers
2024-08-28 11:05:11     llm_provider = loader.get_provider(os.getenv("LLM_PROVIDER", "openai_llm"))()
2024-08-28 11:05:11                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/src/providers/llm/openai.py", line 138, in __init__
2024-08-28 11:05:11     _verify_api_key(self._api_key.resolve_value(), self._api_base)
2024-08-28 11:05:11   File "/src/providers/llm/openai.py", line 129, in _verify_api_key
2024-08-28 11:05:11     OpenAI(api_key=api_key, base_url=api_base).models.list()
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/openai/resources/models.py", line 80, in list
2024-08-28 11:05:11     return self._get_api_list(
2024-08-28 11:05:11            ^^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1309, in get_api_list
2024-08-28 11:05:11     return self._request_api_list(model, page, opts)
2024-08-28 11:05:11            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1160, in _request_api_list
2024-08-28 11:05:11     return self.request(page, options, stream=False)
2024-08-28 11:05:11            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 937, in request
2024-08-28 11:05:11     return self._request(
2024-08-28 11:05:11            ^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 997, in _request
2024-08-28 11:05:11     return self._retry_request(
2024-08-28 11:05:11            ^^^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1075, in _retry_request
2024-08-28 11:05:11     return self._request(
2024-08-28 11:05:11            ^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 997, in _request
2024-08-28 11:05:11     return self._retry_request(
2024-08-28 11:05:11            ^^^^^^^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1075, in _retry_request
2024-08-28 11:05:11     return self._request(
2024-08-28 11:05:11            ^^^^^^^^^^^^^^
2024-08-28 11:05:11   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1007, in _request
2024-08-28 11:05:11     raise APIConnectionError(request=request) from err
2024-08-28 11:05:11 openai.APIConnectionError: Connection error.
2024-08-28 11:05:11 
2024-08-28 11:05:11 ERROR:    Application startup failed. Exiting.

powershell logs:-


PS C:\Users\Rishav.Raj> cd ~/.wrenai
PS C:\Users\Rishav.Raj\.wrenai> mv .env .env.txt
PS C:\Users\Rishav.Raj\.wrenai> notepad .env.txt
PS C:\Users\Rishav.Raj\.wrenai> mv .env.txt .env
PS C:\Users\Rishav.Raj\.wrenai> docker-compose --env-file .env up -d
time="2024-08-28T11:00:13+05:30" level=warning msg="The \"LLM_AZURE_OPENAI_API_KEY\" variable is not set. Defaulting to a blank string."
time="2024-08-28T11:00:13+05:30" level=warning msg="The \"EMBEDDER_AZURE_OPENAI_API_KEY\" variable is not set. Defaulting to a blank string."
time="2024-08-28T11:00:13+05:30" level=warning msg="The \"EMBEDDING_MODEL\" variable is not set. Defaulting to a blank string."
time="2024-08-28T11:00:13+05:30" level=warning msg="The \"EMBEDDING_MODEL_DIMENSION\" variable is not set. Defaulting to a blank string."
time="2024-08-28T11:00:13+05:30" level=warning msg="C:\\Users\\Rishav.Raj\\.wrenai\\docker-compose.yaml: `version` is obsolete"
[+] Running 11/11
 ✔ wren-ai-service Pulled                                                                                         23.2s
   ✔ 1f7ce2fa46ab Already exists                                                                                   0.0s
   ✔ 442c5d63eafd Already exists                                                                                   0.0s
   ✔ 887bfde68788 Already exists                                                                                   0.0s
   ✔ 78be9143582f Already exists                                                                                   0.0s
   ✔ ac7ada22b54e Already exists                                                                                   0.0s
   ✔ 3ea683cbf268 Pull complete                                                                                    5.7s
   ✔ 72f3a8c2c1b8 Pull complete                                                                                   18.6s
   ✔ bc5726ba0877 Pull complete                                                                                   18.8s
   ✔ d78323d0d2ba Pull complete                                                                                   18.9s
   ✔ 201f83fd28b5 Pull complete                                                                                   19.0s
[+] Running 8/9
 ✔ Network wren_wren                 Created                                                                       0.1s
 ✔ Network wren_default              Created                                                                       0.1s
 ✔ Volume "wren_data"                Created                                                                       0.0s
 ✔ Container wren-bootstrap-1        Started                                                                       2.2s
 ✔ Container wren-ibis-server-1      Started                                                                       2.2s
 ✔ Container wren-qdrant-1           Started                                                                       2.2s
 ✔ Container wren-wren-engine-1      Started                                                                       2.1s
 - Container wren-wren-ai-service-1  Starting                                                                      2.4s
 ✔ Container wren-wren-ui-1          Created                                                                       0.1s
Error response from daemon: driver failed programming external connectivity on endpoint wren-wren-ai-service-1 (066e94d6ffc1c43b1eb659cb8d9888d9ce32c8a3db6c1f5145dd36763b0467e0): Bind for 0.0.0.0:5555 failed: port is already allocated
PS C:\Users\Rishav.Raj\.wrenai> docker-compose --env-file .env up -d
time="2024-08-28T11:04:51+05:30" level=warning msg="The \"LLM_AZURE_OPENAI_API_KEY\" variable is not set. Defaulting to a blank string."
time="2024-08-28T11:04:51+05:30" level=warning msg="The \"EMBEDDER_AZURE_OPENAI_API_KEY\" variable is not set. Defaulting to a blank string."
time="2024-08-28T11:04:51+05:30" level=warning msg="The \"EMBEDDING_MODEL\" variable is not set. Defaulting to a blank string."
time="2024-08-28T11:04:51+05:30" level=warning msg="The \"EMBEDDING_MODEL_DIMENSION\" variable is not set. Defaulting to a blank string."
time="2024-08-28T11:04:51+05:30" level=warning msg="C:\\Users\\Rishav.Raj\\.wrenai\\docker-compose.yaml: `version` is obsolete"
[+] Running 6/6
 ✔ Container wren-bootstrap-1        Started                                                                       0.9s
 ✔ Container wren-qdrant-1           Started                                                                       0.9s
 ✔ Container wren-ibis-server-1      Started                                                                       0.9s
 ✔ Container wren-wren-engine-1      Started                                                                       1.5s
 ✔ Container wren-wren-ai-service-1  Started                                                                       0.6s
 ✔ Container wren-wren-ui-1          Started                                                                       0.6s
PS C:\Users\Rishav.Raj\.wrenai>

I am still getting the same result. Please help

cyyeh commented 3 months ago

@Rishav-11 it seems Wren AI wasn't shut down completely, which is why PowerShell reported that port 5555 was already allocated on the first `docker-compose up`. Please make sure Wren AI is fully shut down from Docker Desktop before starting it again. Thank you
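
As a quick way to confirm nothing from the previous run is still holding the port, here is a tiny check (a sketch only; it assumes the default wren-ai-service port 5555 from the error above and any local Python 3; `netstat -ano | findstr :5555` in PowerShell answers the same question):

```python
# port_check.py - a minimal sketch: see whether anything is still listening on the
# port wren-ai-service publishes (5555) before running docker-compose up again.
import socket

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.settimeout(2)
    # connect_ex returns 0 if something accepted the connection, i.e. the port is taken
    in_use = s.connect_ex(("127.0.0.1", 5555)) == 0

print("port 5555 is", "still in use - stop the old containers first" if in_use else "free")
```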

Rishav-11 commented 3 months ago

@cyyeh I have already done that. As you can see from the PowerShell logs, the second attempt started cleanly and the suggested wren-ai-service version was installed properly. But even on this version, after uploading the database it still gives me the same response as before; I have already shared the service log along with the PowerShell log in the previous comment. Please look into it, as I need this fixed as soon as possible.

cyyeh commented 3 months ago

> @cyyeh I have already done that. As you can see from the PowerShell logs, the second attempt started cleanly and the suggested wren-ai-service version was installed properly. But even on this version, after uploading the database it still gives me the same response as before; I have already shared the service log along with the PowerShell log in the previous comment. Please look into it, as I need this fixed as soon as possible.

Sure, I will get back to you if there is any news. The reason you cannot use Wren AI comes down to a certificate issue while connecting to OpenAI: the "self-signed certificate in certificate chain" error in your log means the service is not being shown OpenAI's real certificate, which typically happens when a corporate proxy, VPN, or security product intercepts TLS traffic. So you may try other LLM providers first if possible.
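
To help narrow down where the interception happens, here is a minimal diagnostic sketch (not part of Wren AI; the host and port come straight from the API base in the service log, and the script name is just illustrative). It attempts the same verified TLS handshake to api.openai.com and, when verification fails, prints the certificate actually being presented; its Issuer field usually names the proxy or security product injecting it. Run it on the Windows host or any machine on the same network/VPN:

```python
# tls_check.py - a minimal diagnostic sketch: reproduce the verified TLS handshake that
# wren-ai-service attempts on startup and, on failure, dump the presented certificate.
import socket
import ssl

HOST, PORT = "api.openai.com", 443

try:
    ctx = ssl.create_default_context()  # default trust store, like a normal HTTPS client
    with socket.create_connection((HOST, PORT), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            print("Handshake OK:", tls.version())
except ssl.SSLCertVerificationError as err:
    print("Certificate verification failed:", err.verify_message)
    # Fetch the certificate again without verifying it; on an intercepted network this
    # PEM is the proxy-forged certificate, and its Issuer identifies the corporate CA.
    print(ssl.get_server_certificate((HOST, PORT)))
```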

Rishav-11 commented 3 months ago

@cyyeh I don't have access to any other LLM providers. I can only use OpenAI, and I have tried all of their models (GPT-4o, GPT-4 Turbo, GPT-3.5 Turbo), but I still get the same response. Can you give me a timeline for when this certificate issue will be resolved, and please let me know once it is fixed.

Thanks

cyyeh commented 2 months ago

Sorry, I haven't been able to reproduce the issue so far.