DataDog / datadogpy

The Datadog Python library
https://datadoghq.com/

urllib3 2.0+ breaks tests? #800

Open pbsds opened 11 months ago

pbsds commented 11 months ago

With vcrpy 5, I think the tests break.

pbsds commented 11 months ago

Here is the output of our test run, clipped after the output of the first failing test.

```
============================= test session starts ==============================
platform linux -- Python 3.11.5, pytest-7.4.2, pluggy-1.2.0
rootdir: /build/datadog-0.47.0
plugins: vcr-1.0.2
collected 284 items / 12 deselected / 272 selected

tests/integration/test_freezer.py . [ 0%]
tests/integration/api/test_api.py FFFFFFFF.FFF.FFFFFF.FFFFF. [ 9%]
tests/integration/api/test_aws_integration.py FFFFFF [ 12%]
tests/integration/api/test_aws_logs.py EEEE [ 13%]
tests/integration/api/test_azure_integration.py F [ 13%]
tests/integration/api/test_gcp_integration.py F [ 14%]
tests/integration/api/test_synthetics.py EEEEE [ 16%]
tests/integration/dogstatsd/test_statsd_sender.py ................. [ 22%]
tests/unit/api/test_api.py ........................................... [ 38%]
tests/unit/dogstatsd/test_container.py ...... [ 40%]
tests/unit/dogstatsd/test_statsd.py .................................... [ 53%]
...........s.......................................... [ 73%]
tests/unit/dogwrap/test_dogwrap.py ............ [ 77%]
tests/unit/threadstats/test_threadstats.py ........................ [ 86%]
tests/unit/util/test_cli.py ......... [ 90%]
tests/unit/util/test_compat.py ... [ 91%]
tests/unit/util/test_format.py ........................ [100%]

==================================== ERRORS ====================================
_______ ERROR at setup of TestAwsLogsIntegration.test_list_log_services ________

cls = , method = 'POST'
path = 'integration/aws', api_version = 'v1'
body = '{"account_id": "123456789101", "role_name": "DatadogApiTestRole"}'
attach_host_name = False, response_formatter = None, error_formatter = None
suppress_response_errors_on_codes = None, compress_payload = False, params = {}
_api_key = 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa'
_application_key = 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa'
_api_host = 'https://api.datadoghq.com', _mute = True, _host_name = 'test.host'

    @classmethod
    def submit(
        cls,
        method,
        path,
        api_version=None,
        body=None,
        attach_host_name=False,
        response_formatter=None,
        error_formatter=None,
        suppress_response_errors_on_codes=None,
        compress_payload=False,
        **params
    ):
        """
        Make an HTTP API request
        :param method: HTTP method to use to contact API endpoint
        :type method: HTTP method string
        :param path: API endpoint url
        :type path: url
        :param api_version: The API version used
        :param body: dictionary to be sent in the body of the request
        :type body: dictionary
        :param response_formatter: function to format JSON response from HTTP API request
        :type response_formatter: JSON input function
        :param error_formatter: function to format JSON error response from HTTP API request
        :type error_formatter: JSON input function
        :param attach_host_name: link the new resource object to the host name
        :type attach_host_name: bool
        :param suppress_response_errors_on_codes: suppress ApiError on `errors` key in the response
        for the given HTTP status codes
        :type suppress_response_errors_on_codes: None|list(int)
        :param compress_payload: compress the payload using zlib
        :type compress_payload: bool
        :param params: dictionary to be sent in the query string of the request
        :type params: dictionary
        :returns: JSON or formatted response from HTTP API request
        """
        try:
            # Check if it's ok to submit
            if not cls._should_submit():
                _, backoff_time_left = cls._backoff_status()
                raise HttpBackoff(backoff_time_left)
            # Import API, User and HTTP settings
            from datadog.api import (
                _api_key,
                _application_key,
                _api_host,
                _mute,
                _host_name,
                _proxies,
                _max_retries,
                _timeout,
                _cacert,
                _return_raw_response,
            )
            # Check keys and add then to params
            if _api_key is None:
                raise ApiNotInitialized("API key is not set." " Please run 'initialize' method first.")
            # Set api and app keys in headers
            headers = {}
            headers["DD-API-KEY"] = _api_key
            if _application_key:
                headers["DD-APPLICATION-KEY"] = _application_key
            # Check if the api_version is provided
            if not api_version:
                api_version = _api_version
            # Attach host name to body
            if attach_host_name and body:
                # Is it a 'series' list of objects ?
                if "series" in body:
                    # Adding the host name to all objects
                    for obj_params in body["series"]:
                        if obj_params.get("host", "") == "":
                            obj_params["host"] = _host_name
                else:
                    if body.get("host", "") == "":
                        body["host"] = _host_name
            # If defined, make sure tags are defined as a comma-separated string
            if "tags" in params and isinstance(params["tags"], list):
                tag_list = normalize_tags(params["tags"])
                params["tags"] = ",".join(tag_list)
            # If defined, make sure monitor_ids are defined as a comma-separated string
            if "monitor_ids" in params and isinstance(params["monitor_ids"], list):
                params["monitor_ids"] = ",".join(str(i) for i in params["monitor_ids"])
            # Process the body, if necessary
            if isinstance(body, dict):
                body = json.dumps(body, sort_keys=cls._sort_keys)
                headers["Content-Type"] = "application/json"
            if compress_payload:
                body = zlib.compress(body.encode("utf-8"))
                headers["Content-Encoding"] = "deflate"
            # Construct the URL
            url = construct_url(_api_host, api_version, path)
            # Process requesting
            start_time = time.time()
            result = cls._get_http_client().request(
                method=method,
                url=url,
                headers=headers,
                params=params,
                data=body,
                timeout=_timeout,
                max_retries=_max_retries,
                proxies=_proxies,
                verify=_cacert,
            )
            # Request succeeded: log it and reset the timeout counter
            duration = round((time.time() - start_time) * 1000.0, 4)
            log.info("%s %s %s (%sms)" % (result.status_code, method, url, duration))
            cls._timeout_counter = 0
            # Format response content
            content = result.content
            if content:
                try:
                    if is_p3k():
>                       response_obj = json.loads(content.decode("utf-8"))
E                       UnicodeDecodeError: 'utf-8' codec can't decode byte 0x8b in position 1: invalid start byte

datadog/api/api_client.py:189: UnicodeDecodeError

During handling of the above exception, another exception occurred:

self =
dog =

    @pytest.fixture(autouse=True)  # TODO , scope="class"
    def aws_integration(self, dog):
        """Prepare AWS Integration."""
>       dog.AwsIntegration.create(
            account_id=TEST_ACCOUNT_ID, role_name=TEST_ROLE_NAME
        )

tests/integration/api/test_aws_logs.py:18:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
datadog/api/aws_integration.py:97: in create
    return super(AwsIntegration, cls).create(id=cls._resource_id, **params)
datadog/api/resources.py:50: in create
    return APIClient.submit("POST", path, api_version, body, attach_host_name=attach_host_name, **params)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

cls = , method = 'POST'
path = 'integration/aws', api_version = 'v1'
body = '{"account_id": "123456789101", "role_name": "DatadogApiTestRole"}'
attach_host_name = False, response_formatter = None, error_formatter = None
suppress_response_errors_on_codes = None, compress_payload = False, params = {}
_api_key = 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa'
_application_key = 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa'
_api_host = 'https://api.datadoghq.com', _mute = True, _host_name = 'test.host'

    @classmethod
    def submit(
        cls,
        method,
        path,
        api_version=None,
        body=None,
        attach_host_name=False,
        response_formatter=None,
        error_formatter=None,
        suppress_response_errors_on_codes=None,
        compress_payload=False,
        **params
    ):
        """
        Make an HTTP API request
        :param method: HTTP method to use to contact API endpoint
        :type method: HTTP method string
        :param path: API endpoint url
        :type path: url
        :param api_version: The API version used
        :param body: dictionary to be sent in the body of the request
        :type body: dictionary
        :param response_formatter: function to format JSON response from HTTP API request
        :type response_formatter: JSON input function
        :param error_formatter: function to format JSON error response from HTTP API request
        :type error_formatter: JSON input function
        :param attach_host_name: link the new resource object to the host name
        :type attach_host_name: bool
        :param suppress_response_errors_on_codes: suppress ApiError on `errors` key in the response
        for the given HTTP status codes
        :type suppress_response_errors_on_codes: None|list(int)
        :param compress_payload: compress the payload using zlib
        :type compress_payload: bool
        :param params: dictionary to be sent in the query string of the request
        :type params: dictionary
        :returns: JSON or formatted response from HTTP API request
        """
        try:
            # Check if it's ok to submit
            if not cls._should_submit():
                _, backoff_time_left = cls._backoff_status()
                raise HttpBackoff(backoff_time_left)
            # Import API, User and HTTP settings
            from datadog.api import (
                _api_key,
                _application_key,
                _api_host,
                _mute,
                _host_name,
                _proxies,
                _max_retries,
                _timeout,
                _cacert,
                _return_raw_response,
            )
            # Check keys and add then to params
            if _api_key is None:
                raise ApiNotInitialized("API key is not set." " Please run 'initialize' method first.")
            # Set api and app keys in headers
            headers = {}
            headers["DD-API-KEY"] = _api_key
            if _application_key:
                headers["DD-APPLICATION-KEY"] = _application_key
            # Check if the api_version is provided
            if not api_version:
                api_version = _api_version
            # Attach host name to body
            if attach_host_name and body:
                # Is it a 'series' list of objects ?
                if "series" in body:
                    # Adding the host name to all objects
                    for obj_params in body["series"]:
                        if obj_params.get("host", "") == "":
                            obj_params["host"] = _host_name
                else:
                    if body.get("host", "") == "":
                        body["host"] = _host_name
            # If defined, make sure tags are defined as a comma-separated string
            if "tags" in params and isinstance(params["tags"], list):
                tag_list = normalize_tags(params["tags"])
                params["tags"] = ",".join(tag_list)
            # If defined, make sure monitor_ids are defined as a comma-separated string
            if "monitor_ids" in params and isinstance(params["monitor_ids"], list):
                params["monitor_ids"] = ",".join(str(i) for i in params["monitor_ids"])
            # Process the body, if necessary
            if isinstance(body, dict):
                body = json.dumps(body, sort_keys=cls._sort_keys)
                headers["Content-Type"] = "application/json"
            if compress_payload:
                body = zlib.compress(body.encode("utf-8"))
                headers["Content-Encoding"] = "deflate"
            # Construct the URL
            url = construct_url(_api_host, api_version, path)
            # Process requesting
            start_time = time.time()
            result = cls._get_http_client().request(
                method=method,
                url=url,
                headers=headers,
                params=params,
                data=body,
                timeout=_timeout,
                max_retries=_max_retries,
                proxies=_proxies,
                verify=_cacert,
            )
            # Request succeeded: log it and reset the timeout counter
            duration = round((time.time() - start_time) * 1000.0, 4)
            log.info("%s %s %s (%sms)" % (result.status_code, method, url, duration))
            cls._timeout_counter = 0
            # Format response content
            content = result.content
            if content:
                try:
                    if is_p3k():
                        response_obj = json.loads(content.decode("utf-8"))
                    else:
                        response_obj = json.loads(content)
                except ValueError:
>                       raise ValueError("Invalid JSON response: {0}".format(content))
E                       ValueError: Invalid JSON response: b'\x1f\x8b\x08\x00\x00\x00\x00\x00\x00\x03\xabVJ\xad(I-\xcaK\xcc\x89\xcfLQ\xb2R22352M6LLNLM41K1\xb5H\xb242H\xb3077LM\xb4\xb44\xb3T\xaa\x05\x00E\xc6\x0e\xe62\x00\x00\x00'

datadog/api/api_client.py:193: ValueError
------------------------------ Captured log setup ------------------------------
WARNING datadog.api:hostname.py:34 Hostname: localhost is local
WARNING datadog.api:hostname.py:120 Unable to reliably determine host name. You can define one in your `hosts` file, or in `datadog.conf` file if you have Datadog Agent installed.
INFO vcr.cassette:cassette.py:200 .before_record_request at 0xfffff4f38a40>
INFO vcr.cassette:cassette.py:236 Appending request and response {'body': {'string': b'\x1f\x8b\x08\x00\x00\x00\x00\x00\x00\x03\xabVJ\xad(I-\xcaK\xcc\x89\xcfLQ\xb2R22352M6LLNLM41K1\xb5H\xb242H\xb3077LM\xb4\xb44\xb3T\xaa\x05\x00E\xc6\x0e\xe62\x00\x00\x00'}, 'headers': {'Cache-Control': ['no-cache'], 'Connection': ['keep-alive'], 'Content-Encoding': ['gzip'], 'Content-Security-Policy': ["frame-ancestors 'self'; report-uri https://api.datadoghq.com/csp-report"], 'Content-Type': ['application/json'], 'DD-POOL': ['dogweb'], 'Date': ['Mon, 03 Feb 2020 16:22:13 GMT'], 'Pragma': ['no-cache'], 'Set-Cookie': ['DD-PSHARD=233; Max-Age=604800; Path=/; expires=Mon, 10-Feb-2020 16:22:12 GMT; secure; HttpOnly'], 'Strict-Transport-Security': ['max-age=15724800;'], 'Transfer-Encoding': ['chunked'], 'Vary': ['Accept-Encoding'], 'X-Content-Type-Options': ['nosniff'], 'X-DD-Debug': ['J5PL0LnJukdy69mckjXi3cjye/YJX2hkoCBkqKQi+tYjrsXYELx6DfDD11fhyjYF'], 'X-DD-VERSION': ['35.2134903'], 'X-Frame-Options': ['SAMEORIGIN']}, 'status': {'code': 200, 'message': 'OK'}}
INFO vcr.cassette:cassette.py:236 Appending request and response {'body': {'string': b"\x1f\x8b\x08\x00\x00\x00\x00\x00\x00\x03\x8b\xaeV\xcaLQ\xb2R*6V\xd2Q\xcaILJ\xcd\x01r\x82\x8d\x15\x1c\x93\x93S\x8b\x8b\x15|\xf2\xd3\x8b\x95ju\xa0\x8aRs\x92\x90T9\xe7$\x16\x17g&+\xb8\xfa8\xe1T^f\x84\xa4\xc1\xb1\xa0 '39\xb1$3?\x0f\xb7\xa6\xe4\x9c\xfc\xd2\x94\xb4\xa2\xfc\xbc\x12\x14\xab\x80\x82n A\xec6\x15\xa5\xa6\x14gd\xa6!k\t\x82\n\xa1y!'17)%\x11\xc9h\x1f\xb0\x80\x823\xc8\xda\xf2\xc4\x92\xe4\x0c\xa8\x86X\x00\xdb\xc07O\x1a\x01\x00\x00"}, 'headers': {'Cache-Control': ['no-cache'], 'Connection': ['keep-alive'], 'Content-Encoding': ['gzip'], 'Content-Security-Policy': ["frame-ancestors 'self'; report-uri https://api.datadoghq.com/csp-report"], 'Content-Type': ['application/json'], 'DD-POOL': ['dogweb'], 'Date': ['Mon, 03 Feb 2020 16:22:13 GMT'], 'Pragma': ['no-cache'], 'Set-Cookie': ['DD-PSHARD=233; Max-Age=604800; Path=/; expires=Mon, 10-Feb-2020 16:22:13 GMT; secure; HttpOnly'], 'Strict-Transport-Security': ['max-age=15724800;'], 'Transfer-Encoding': ['chunked'], 'Vary': ['Accept-Encoding'], 'X-Content-Type-Options': ['nosniff'], 'X-DD-Debug': ['OxP+mFpjAbASiVhNf+t4MttAs95ZlMiGosIRnYJJGFoApNgv2oxtdzpnmNlMOki6'], 'X-DD-VERSION': ['35.2134903'], 'X-Frame-Options': ['SAMEORIGIN']}, 'status': {'code': 200, 'message': 'OK'}}
INFO vcr.cassette:cassette.py:236 Appending request and response {'body': {'string': b'{}'}, 'headers': {'Cache-Control': ['no-cache'], 'Connection': ['keep-alive'], 'Content-Length': ['2'], 'Content-Security-Policy': ["frame-ancestors 'self'; report-uri https://api.datadoghq.com/csp-report"], 'Content-Type': ['application/json'], 'DD-POOL': ['dogweb'], 'Date': ['Mon, 03 Feb 2020 16:22:15 GMT'], 'Pragma': ['no-cache'], 'Set-Cookie': ['DD-PSHARD=233; Max-Age=604800; Path=/; expires=Mon, 10-Feb-2020 16:22:14 GMT; secure; HttpOnly'], 'Strict-Transport-Security': ['max-age=15724800;'], 'X-Content-Type-Options': ['nosniff'], 'X-DD-Debug': ['Lo9psmCk9egobltaxBGqrQFhgCcgUTQoFZpr2xiSR+6tucB/owychJvFjr9YMWzu'], 'X-DD-VERSION': ['35.2134903'], 'X-Frame-Options': ['SAMEORIGIN']}, 'status': {'code': 200, 'message': 'OK'}}
INFO vcr.stubs:__init__.py:258 Playing response for from cassette
```
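
For reference, the body handed to `json.loads` is still gzip-compressed: the cassette records `'Content-Encoding': ['gzip']`, and the payload starts with the `\x1f\x8b` gzip magic bytes. Here is a minimal sketch (plain Python, nothing datadogpy-specific) that reproduces both errors from the traceback and shows the body parses fine once decompressed, assuming the cassette bytes above are intact:

```python
import gzip
import json

# First response body recorded in the cassette above; the leading \x1f\x8b
# bytes are the gzip magic number, so this is still compressed data.
raw = (
    b"\x1f\x8b\x08\x00\x00\x00\x00\x00\x00\x03\xabVJ\xad(I-\xcaK\xcc\x89\xcfLQ"
    b"\xb2R22352M6LLNLM41K1\xb5H\xb242H\xb3077LM\xb4\xb44\xb3T"
    b"\xaa\x05\x00E\xc6\x0e\xe62\x00\x00\x00"
)

# This is what datadog/api/api_client.py attempts; it raises the same
# UnicodeDecodeError seen in the log (0x8b is not valid UTF-8).
try:
    json.loads(raw.decode("utf-8"))
except UnicodeDecodeError as exc:
    print("decode fails as in the log:", exc)

# Decompressing first should yield valid JSON, i.e. the decoding that the
# HTTP stack apparently no longer performs when replaying this cassette.
print(json.loads(gzip.decompress(raw)))
```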

It seems this happens because vcrpy is not fully compatible with urllib3 2.0, so the gzip-encoded response body is never decompressed before the client tries to parse it as JSON. I'm looking into this further.
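
In case it helps anyone in the meantime, a possible local workaround is to fall back to gunzipping the body before JSON parsing when it still carries the gzip magic bytes. This is purely a sketch of the idea; the helper name is made up and this is not something datadogpy or vcrpy does today:

```python
import gzip
import json

def parse_response_content(content: bytes):
    """Hypothetical helper: tolerate bodies that the HTTP layer did not
    decompress (e.g. gzip cassette bodies replayed under urllib3 2.x)."""
    if content[:2] == b"\x1f\x8b":  # gzip magic number
        content = gzip.decompress(content)
    return json.loads(content.decode("utf-8"))
```

The other obvious options are pinning `urllib3 < 2` in the test environment, or re-recording the cassettes with vcrpy's `decode_compressed_response=True` so the stored bodies are plain JSON rather than gzip.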

github-actions[bot] commented 10 months ago

Thanks for your contribution!

This issue has been automatically marked as stale because it has not had activity in the last 30 days. Note that the issue will not be automatically closed, but this notification will remind us to investigate why there's been inactivity. Thank you for participating in the Datadog open source community.

If you would like this issue to remain open:

  1. Verify that you can still reproduce the issue in the latest version of this project.

  2. Comment that the issue is still reproducible and include updated details requested in the issue template.