astropy / pytest-remotedata

Pytest plugin to control whether tests are run that have remote data
BSD 3-Clause "New" or "Revised" License

0.3.3: pytest is failing in two units #66

Closed kloczek closed 1 year ago

kloczek commented 1 year ago

I'm packaging your module as an rpm package, so I'm using the typical PEP 517 based build, install, and test cycle used when building packages from a non-root account.
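For reference, that cycle is roughly the one sketched below. This is only a minimal illustration, not my exact spec file: the build root path is an example and the exact install tool may differ.

```console
# Build a wheel via the PEP 517 interface, without creating an isolated env
python3 -m build --wheel --no-isolation

# Install the wheel into a throwaway build root (example path)
python3 -m installer --destdir=/tmp/buildroot dist/*.whl

# Run the test suite against the installed copy rather than the source tree
PYTHONPATH=/tmp/buildroot/usr/lib/python3.8/site-packages \
    python3 -m pytest -ra -m 'not network'
```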

Here is the pytest output:

```console
+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-pytest-remotedata-0.3.3-5.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-pytest-remotedata-0.3.3-5.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/pytest -ra -m 'not network'
=========================================================== test session starts ============================================================
platform linux -- Python 3.8.16, pytest-7.2.0, pluggy-1.0.0
rootdir: /home/tkloczko/rpmbuild/BUILD/pytest-remotedata-0.3.3, configfile: setup.cfg, testpaths: tests
plugins: remotedata-0.3.3
collected 15 items

tests/test_skip_remote_data.py sss...                                                                                                [ 40%]
tests/test_socketblocker.py ....                                                                                                     [ 66%]
tests/test_strict_check.py .F..F                                                                                                     [100%]

================================================================= FAILURES ==================================================================
__________________________________________________________ test_default_behavior ___________________________________________________________

    def test_default_behavior(testdir):
        _write_config_file(testdir, '')
        testdir.makepyfile(PYFILE_CONTENTS.format('False', ''))
        result = testdir.runpytest_subprocess()
>       result.assert_outcomes(passed=2)
E       AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E         Omitting 4 identical items, use -vv to show
E         Differing items:
E         {'failed': 1} != {'failed': 0}
E         {'passed': 1} != {'passed': 2}
E         Use -v to get more diff

/home/tkloczko/rpmbuild/BUILD/pytest-remotedata-0.3.3/tests/test_strict_check.py:39: AssertionError
----------------------------------------------------------- Captured stdout call ------------------------------------------------------------
running: /usr/bin/python3 -mpytest --basetemp=/tmp/pytest-of-tkloczko/pytest-5/test_default_behavior0/runpytest-0
     in: /tmp/pytest-of-tkloczko/pytest-5/test_default_behavior0
============================= test session starts ==============================
platform linux -- Python 3.8.16, pytest-7.2.0, pluggy-1.0.0
rootdir: /tmp/pytest-of-tkloczko/pytest-5/test_default_behavior0, configfile: setup.cfg
plugins: remotedata-0.3.3
collected 2 items

test_default_behavior.py .F                                              [100%]

=================================== FAILURES ===================================
_____________________________ test_internet_access _____________________________

/usr/lib64/python3.8/urllib/request.py:1354: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib64/python3.8/http/client.py:1256: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib64/python3.8/http/client.py:1302: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib64/python3.8/http/client.py:1251: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib64/python3.8/http/client.py:1011: in _send_output
    self.send(msg)
/usr/lib64/python3.8/http/client.py:951: in send
    self.connect()
/usr/lib64/python3.8/http/client.py:922: in connect
    self.sock = self._create_connection(
        (self.host,self.port), self.timeout, self.source_address)
/usr/lib64/python3.8/socket.py:787: in create_connection
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
/usr/lib64/python3.8/socket.py:918: in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E       socket.gaierror: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

    def test_internet_access():
>       urlopen('http://astropy.org')

test_default_behavior.py:9:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.8/urllib/request.py:222: in urlopen
    return opener.open(url, data, timeout)
/usr/lib64/python3.8/urllib/request.py:525: in open
    response = self._open(req, data)
/usr/lib64/python3.8/urllib/request.py:542: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib64/python3.8/urllib/request.py:502: in _call_chain
    result = func(*args)
/usr/lib64/python3.8/urllib/request.py:1383: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib64/python3.8/urllib/request.py:1357: in do_open
    raise URLError(err)
E       urllib.error.URLError:
=========================== short test summary info ============================
FAILED test_default_behavior.py::test_internet_access - urllib.error.URLError...
=================== 1 failed, 1 passed in 120.34s (0:02:00) ====================

______________________________________________________ test_strict_with_decorator[any] _______________________________________________________

    @pytest.mark.parametrize('source', ['none', 'any'])
    def test_strict_with_decorator(testdir, source):
        _write_config_file(testdir, 'remote_data_strict = true')
        decorator = '@pytest.mark.remote_data'
        testdir.makepyfile(PYFILE_CONTENTS.format('True', decorator))
        clarg = '--remote-data=' + source
        result = testdir.runpytest_subprocess(clarg)
        if source == 'none':
            outcomes = dict(passed=1, skipped=1)
        else:
            outcomes = dict(passed=2)
>       result.assert_outcomes(**outcomes)
E       AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E         Omitting 4 identical items, use -vv to show
E         Differing items:
E         {'failed': 1} != {'failed': 0}
E         {'passed': 1} != {'passed': 2}
E         Use -v to get more diff

/home/tkloczko/rpmbuild/BUILD/pytest-remotedata-0.3.3/tests/test_strict_check.py:65: AssertionError
----------------------------------------------------------- Captured stdout call ------------------------------------------------------------
running: /usr/bin/python3 -mpytest --basetemp=/tmp/pytest-of-tkloczko/pytest-5/test_strict_with_decorator1/runpytest-0 --remote-data=any
     in: /tmp/pytest-of-tkloczko/pytest-5/test_strict_with_decorator1
============================= test session starts ==============================
platform linux -- Python 3.8.16, pytest-7.2.0, pluggy-1.0.0
rootdir: /tmp/pytest-of-tkloczko/pytest-5/test_strict_with_decorator1, configfile: setup.cfg
plugins: remotedata-0.3.3
collected 2 items

test_strict_with_decorator.py .F                                         [100%]

=================================== FAILURES ===================================
_____________________________ test_internet_access _____________________________

/usr/lib64/python3.8/urllib/request.py:1354: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib64/python3.8/http/client.py:1256: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib64/python3.8/http/client.py:1302: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib64/python3.8/http/client.py:1251: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib64/python3.8/http/client.py:1011: in _send_output
    self.send(msg)
/usr/lib64/python3.8/http/client.py:951: in send
    self.connect()
/usr/lib64/python3.8/http/client.py:922: in connect
    self.sock = self._create_connection(
        (self.host,self.port), self.timeout, self.source_address)
/usr/lib64/python3.8/socket.py:787: in create_connection
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
/usr/lib64/python3.8/socket.py:918: in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
E       socket.gaierror: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

    @pytest.mark.remote_data
    def test_internet_access():
>       urlopen('http://astropy.org')

test_strict_with_decorator.py:9:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib64/python3.8/urllib/request.py:222: in urlopen
    return opener.open(url, data, timeout)
/usr/lib64/python3.8/urllib/request.py:525: in open
    response = self._open(req, data)
/usr/lib64/python3.8/urllib/request.py:542: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib64/python3.8/urllib/request.py:502: in _call_chain
    result = func(*args)
/usr/lib64/python3.8/urllib/request.py:1383: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib64/python3.8/urllib/request.py:1357: in do_open
    raise URLError(err)
E       urllib.error.URLError:
=========================== short test summary info ============================
FAILED test_strict_with_decorator.py::test_internet_access - urllib.error.URL...
=================== 1 failed, 1 passed in 120.39s (0:02:00) ====================

================================================================ short test summary info =================================================================
SKIPPED [3] ../../BUILDROOT/python-pytest-remotedata-0.3.3-5.fc35.x86_64/usr/lib/python3.8/site-packages/pytest_remotedata/plugin.py:90: need --remote-data option to run
FAILED tests/test_strict_check.py::test_default_behavior - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
FAILED tests/test_strict_check.py::test_strict_with_decorator[any] - AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
================================================== 2 failed, 10 passed, 3 skipped in 763.45s (0:12:43) ===================================================
```
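Both failures come down to the same thing: the inner test file written by the suite calls `urlopen('http://astropy.org')` and expects it to succeed, but the build environment has no network access, so name resolution fails. A hypothetical one-liner (not part of the test suite) that mirrors the failing check:

```console
# In an offline build root this fails with the same
# "[Errno -3] Temporary failure in name resolution" error seen above.
python3 -c "from urllib.request import urlopen; urlopen('http://astropy.org', timeout=10)"
```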

Here is the list of modules installed in the build environment:

```console
Package           Version
----------------- --------------
appdirs           1.4.4
asn1crypto        1.5.1
attrs             22.2.0
bcrypt            3.2.2
Brlapi            0.8.3
build             0.9.0
cffi              1.15.1
contourpy         1.0.6
cryptography      38.0.4
cssselect         1.1.0
cycler            0.11.0
distro            1.8.0
dnspython         2.2.1
exceptiongroup    1.0.0
extras            1.0.0
fixtures          4.0.0
fonttools         4.38.0
gpg               1.18.0-unknown
iniconfig         1.1.1
kiwisolver        1.4.4
libcomps          0.1.19
louis             3.24.0
lxml              4.9.1
matplotlib        3.6.2
numpy             1.23.1
olefile           0.46
packaging         21.3
pbr               5.9.0
pep517            0.13.0
Pillow            9.3.0
pip               22.3.1
pluggy            1.0.0
ply               3.11
pyasn1            0.4.8
pyasn1-modules    0.2.8
pycparser         2.21
PyGObject         3.42.2
pyparsing         3.0.9
pytest            7.2.0
python-dateutil   2.8.2
PyYAML            6.0
rpm               4.17.0
scour             0.38.2
setuptools        65.6.3
setuptools-scm    7.0.5
six               1.16.0
testtools         2.5.0
tomli             2.0.1
tpm2-pkcs11-tools 1.33.7
tpm2-pytss        1.1.0
typing_extensions 4.4.0
wheel             0.38.4
```
pllim commented 1 year ago

@kloczek, thank you for reporting this. I think this is a duplicate of #41, so I am closing it as a duplicate. Please note that the original issue is a known bug, and unfortunately we do not currently have the resources to investigate it. I would propose you skip that test for now.
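If it helps on the packaging side, the affected tests can be deselected when invoking pytest; a sketch (adjust the invocation to your build):

```console
python3 -m pytest -ra -m 'not network' \
    --deselect tests/test_strict_check.py::test_default_behavior \
    --deselect 'tests/test_strict_check.py::test_strict_with_decorator[any]'
```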

In addition, it turns out v0.4.0 was tagged and released to PyPI but was not reflected in the GitHub Releases; I have corrected that.

kloczek commented 1 year ago

Thank you. I've subscribed to that ticket 👍