Closed pjbreaux closed 6 years ago
@dflanigan: as yet, I'm uncertain if this is a code issue or test issue. If I discover a code issue, I'll file it in the appropriate repo and link to this bug.
It appears, at first glance, that the previously failing tests, specifically these:
../../../../../../../testlab/f5-openstack-lbaasv2-driver/::TLSListenersTestJSON::test_list_tls_listeners_two <- ../../buildbot/neutron-lbaas/neutron_lbaas/tests/tempest/v2/api/test_listeners_tls.py FAILED
../../../../../../../testlab/f5-openstack-lbaasv2-driver/::TLSListenersTestJSON::test_update_listener_empty_tls_container <- ../../buildbot/neutron-lbaas/.tox/apiv2/local/lib/python2.7/site-packages/tempest/lib/decorators.py SKIPPED
../../../../../../../testlab/f5-openstack-lbaasv2-driver/::TLSListenersTestJSON::test_update_listener_empty_uuid_tls_container <- ../../buildbot/neutron-lbaas/neutron_lbaas/tests/tempest/v2/api/test_listeners_tls.py FAILED
../../../../../../../testlab/f5-openstack-lbaasv2-driver/::TLSListenersTestJSON::test_update_listener_invalid_tls_container <- ../../buildbot/neutron-lbaas/neutron_lbaas/tests/tempest/v2/api/test_listeners_tls.py FAILED
../../../../../../../testlab/f5-openstack-lbaasv2-driver/::TLSListenersTestJSON::test_update_listener_none_tls_container <- ../../buildbot/neutron-lbaas/.tox/apiv2/local/lib/python2.7/site-packages/tempest/lib/decorators.py SKIPPED
../../../../../../../testlab/f5-openstack-lbaasv2-driver/::TLSListenersTestJSON::test_update_listener_nonexistent_tls_container <- ../../buildbot/neutron-lbaas/neutron_lbaas/tests/tempest/v2/api/test_listeners_tls.py FAILED
../../../../../../../testlab/f5-openstack-lbaasv2-driver/::TLSListenersTestJSON::test_update_listener_tls_port <- ../../buildbot/neutron-lbaas/neutron_lbaas/tests/tempest/v2/api/test_listeners_tls.py FAILED
../../../../../../../testlab/f5-openstack-lbaasv2-driver/::TLSListenersTestJSON::test_update_listener_tls_protocol <- ../../buildbot/neutron-lbaas/neutron_lbaas/tests/tempest/v2/api/test_listeners_tls.py FAILED
../../../../../../../testlab/f5-openstack-lbaasv2-driver/::TLSListenersTestJSON::test_update_tls_listener <- ../../buildbot/neutron-lbaas/neutron_lbaas/tests/tempest/v2/api/test_listeners_tls.py FAILED
../../../../../../../testlab/f5-openstack-lbaasv2-driver/::TLSListenersTestJSON::test_update_tls_listener <- ../../buildbot/neutron-lbaas/neutron_lbaas/tests/tempest/v2/api/test_listeners_tls.py ERROR
../../../../../../../testlab/f5-openstack-lbaasv2-driver/::TLSSNIListenersTestJSON::test_create_tls_sni_listener <- ../../buildbot/neutron-lbaas/neutron_lbaas/tests/tempest/v2/api/test_listeners_tls_sni.py ERROR
../../../../../../../testlab/f5-openstack-lbaasv2-driver/::TLSSNIListenersTestJSON::test_delete_tls_sni_listener <- ../../buildbot/neutron-lbaas/neutron_lbaas/tests/tempest/v2/api/test_listeners_tls_sni.py ERROR
These tests cause a virtual address to be left on the BIG-IP with the same address as the one used by the members and pool tests. That results in a virtual server creation error, as shown below:
2017-04-14 09:40:10.528 23582 DEBUG root [req-4a5a7028-fe3d-484e-b604-66d369b90cdd 987c7ef374c44a17ad352da60716c761 2e15ec1e122449938682585c79e6f1e9 - - -] post WITH uri: https://10.190.3.62:443/mgmt/tm/ltm/virtual-address/ AND suffix: AND kwargs: {'json': {'description': u'', 'autoDelete': False, 'partition': u'Project_e447e058dcb440fd
abcfa553ac11dd0d', 'trafficGroup': u'traffic-group-1', 'address': u'10.2.4.3', 'name': u'Project_66ad3ead-1eb3-4728-8fdb-a96da95cd0d4'}} wrapper /usr/lib/python2.7/site-packages/icontrol/session.py:257
2017-04-14 09:40:10.544 23582 DEBUG root [req-4a5a7028-fe3d-484e-b604-66d369b90cdd 987c7ef374c44a17ad352da60716c761 2e15ec1e122449938682585c79e6f1e9 - - -] RESPONSE::STATUS: 400 Content-Type: application/json Content-Encoding: None
Text: u'{"code":400,"message":"0107176c:3: Invalid Virtual Address, the IP address 10.2.4.3 already exists.","errorStack":[]}' wrapper /usr/lib/python2.7/site-packages/icontrol/session.py:265
2017-04-14 09:40:10.545 23582 DEBUG root [req-4a5a7028-fe3d-484e-b604-66d369b90cdd 987c7ef374c44a17ad352da60716c761 2e15ec1e122449938682585c79e6f1e9 - - -] post WITH uri: https://10.190.3.62:443/mgmt/tm/ltm/virtual/ AND suffix: AND kwargs: {'json': {'partition': u'Project_e447e058dcb440fdabcfa553ac11dd0d', 'name': u'Project_4d861ab1-919f-425c-b11f-663321af2f64', 'destination': u'10.2.4.3:80', 'enabled': True, 'profiles': ['/Common/http', '/Common/oneconnect'], 'fallbackPersistence': '', 'sourceAddressTranslation': {'type': 'automap'}, 'connectionLimit': 0, 'persist': [], 'ipProtocol': 'tcp', 'vlans': [], 'vlansDisabled': True, 'description': u''}} wrapper /usr/lib/python2.7/site-packages/icontrol/session.py:257
2017-04-14 09:40:10.578 23582 DEBUG root [req-4a5a7028-fe3d-484e-b604-66d369b90cdd 987c7ef374c44a17ad352da60716c761 2e15ec1e122449938682585c79e6f1e9 - - -] RESPONSE::STATUS: 400 Content-Type: application/json Content-Encoding: None
Text: u'{"code":400,"message":"01070726:3: Virtual Server /Project_e447e058dcb440fdabcfa553ac11dd0d/Project_4d861ab1-919f-425c-b11f-663321af2f64 in partition Project_e447e058dcb440fdabcfa553ac11dd0d cannot reference Virtual Address /Project_94a88d406df64a88bf8ba249961baadf/Project_c4f16dd8-8b42-46fb-aaf3-6fbb4bf596ad in partition Project_94a88d406df64a88bf8ba249961baadf","errorStack":[]}' wrapper /usr/lib/python2.7/site-packages/icontrol/session.py:265
2017-04-14 09:40:10.578 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.listener_service [req-4a5a7028-fe3d-484e-b604-66d369b90cdd 987c7ef374c44a17ad352da60716c761 2e15ec1e122449938682585c79e6f1e9 - - -] Virtual server creation error: 400 Unexpected Error: Bad Request for uri: https://10.190.3.62:443/mgmt/tm/ltm/virtual/
Text: u'{"code":400,"message":"01070726:3: Virtual Server /Project_e447e058dcb440fdabcfa553ac11dd0d/Project_4d861ab1-919f-425c-b11f-663321af2f64 in partition Project_e447e058dcb440fdabcfa553ac11dd0d cannot reference Virtual Address /Project_94a88d406df64a88bf8ba249961baadf/Project_c4f16dd8-8b42-46fb-aaf3-6fbb4bf596ad in partition Project_94a88d406df64a88bf8ba249961baadf","errorStack":[]}'
2017-04-14 09:40:10.578 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.listener_service Traceback (most recent call last):
2017-04-14 09:40:10.578 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.listener_service File "/usr/lib/python2.7/site-packages/f5_openstack_agent/lbaasv2/drivers/bigip/listener_service.py", line 66, in create_listener
2017-04-14 09:40:10.578 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.listener_service self.vs_helper.create(bigip, vip)
2017-04-14 09:40:10.578 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.listener_service File "/usr/lib/python2.7/site-packages/f5_openstack_agent/lbaasv2/drivers/bigip/resource_helper.py", line 94, in create
2017-04-14 09:40:10.578 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.listener_service obj = resource.create(**model)
2017-04-14 09:40:10.578 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.listener_service File "/usr/lib/python2.7/site-packages/f5/bigip/resource.py", line 974, in create
2017-04-14 09:40:10.578 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.listener_service return self._create(**kwargs)
2017-04-14 09:40:10.578 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.listener_service File "/usr/lib/python2.7/site-packages/f5/bigip/resource.py", line 941, in _create
2017-04-14 09:40:10.578 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.listener_service response = session.post(_create_uri, json=kwargs, **requests_params)
2017-04-14 09:40:10.578 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.listener_service File "/usr/lib/python2.7/site-packages/icontrol/session.py", line 272, in wrapper
2017-04-14 09:40:10.578 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.listener_service raise iControlUnexpectedHTTPError(error_message, response=response)
2017-04-14 09:40:10.578 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.listener_service iControlUnexpectedHTTPError: 400 Unexpected Error: Bad Request for uri: https://10.190.3.62:443/mgmt/tm/ltm/virtual/
2017-04-14 09:40:10.578 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.listener_service Text: u'{"code":400,"message":"01070726:3: Virtual Server /Project_e447e058dcb440fdabcfa553ac11dd0d/Project_4d861ab1-919f-425c-b11f-663321af2f64 in partition Project_e447e058dcb440fdabcfa553ac11dd0d cannot reference Virtual Address /Project_94a88d406df64a88bf8ba249961baadf/Project_c4f16dd8-8b42-46fb-aaf3-6fbb4bf596ad in partition Project_94a88d406df64a88bf8ba249961baadf","errorStack":[]}'
2017-04-14 09:40:10.578 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.listener_service
2017-04-14 09:40:10.580 23582 ERROR f5_openstack_agent.lbaasv2.drivers.bigip.icontrol_driver [req-4a5a7028-fe3d-484e-b604-66d369b90cdd 987c7ef374c44a17ad352da60716c761 2e15ec1e122449938682585c79e6f1e9 - - -] 400 Unexpected Error: Bad Request for uri: https://10.190.3.62:443/mgmt/tm/ltm/virtual/
Text: u'{"code":400,"message":"01070726:3: Virtual Server /Project_e447e058dcb440fdabcfa553ac11dd0d/Project_4d861ab1-919f-425c-b11f-663321af2f64 in partition Project_e447e058dcb440fdabcfa553ac11dd0d cannot reference Virtual Address /Project_94a88d406df64a88bf8ba249961baadf/Project_c4f16dd8-8b42-46fb-aaf3-6fbb4bf596ad in partition Project_94a88d406df64a88bf8ba249961baadf","errorStack":[]}'
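The 01070726 error above means the VIP address 10.2.4.3 already exists as a virtual address in a different partition, so the new virtual server cannot reference it. A minimal sketch of a pre-flight check for this kind of collision, assuming records shaped like the JSON payloads in the log above (the helper name and record structure are illustrative, not part of the driver):

```python
def find_cross_partition_collisions(candidate, existing):
    """Return virtual-address records whose IP matches the candidate's
    but which live in a different partition.

    `candidate` and each record in `existing` are dicts with at least
    'address' and 'partition' keys, mirroring the JSON payloads posted
    to /mgmt/tm/ltm/virtual-address/ in the agent log.
    """
    return [
        rec for rec in existing
        if rec.get('address') == candidate.get('address')
        and rec.get('partition') != candidate.get('partition')
    ]


# Example mirroring the failure: 10.2.4.3 was left behind in another
# tenant's partition by an earlier test run.
leftover = [{'address': '10.2.4.3',
             'partition': 'Project_94a88d406df64a88bf8ba249961baadf'}]
new_vip = {'address': '10.2.4.3',
           'partition': 'Project_e447e058dcb440fdabcfa553ac11dd0d'}
collisions = find_cross_partition_collisions(new_vip, leftover)
```

If `collisions` is non-empty, the create would fail exactly as the log shows, so a test-suite teardown could flag (or delete) the stale record before the next run.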
@dflanigan: this issue can be used to ensure all neutron-lbaas API tests start with a clean slate and clean up properly after they are done.
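One way to get that clean slate is a class-level teardown that deletes the load balancer and then polls until it is actually gone, the same idea as the delete=True path of _wait_for_load_balancer_status shown below. A standalone sketch of that polling logic (the get_lb callable and the NotFound class here are stand-ins for the tempest client and its exception, not the real test base class):

```python
import time


class NotFound(Exception):
    """Stand-in for the tempest NotFound exception."""


def wait_for_delete(get_lb, lb_id, timeout=60, interval=0.01):
    """Poll get_lb(lb_id) until it raises NotFound (resource gone).

    Mirrors the delete=True branch of _wait_for_load_balancer_status:
    NotFound during a delete wait means success; running out of time
    without seeing NotFound means the resource leaked.
    """
    end_time = time.time() + timeout
    while time.time() < end_time:
        try:
            get_lb(lb_id)
        except NotFound:
            return True  # resource is gone; cleanup succeeded
        time.sleep(interval)
    raise RuntimeError(
        "Waited {0} seconds for load balancer {1} to be deleted "
        "but it still exists.".format(timeout, lb_id))


# Fake client: the load balancer disappears after two polls.
class FakeClient(object):
    def __init__(self):
        self.calls = 0

    def get(self, lb_id):
        self.calls += 1
        if self.calls > 2:
            raise NotFound(lb_id)
        return {'id': lb_id, 'provisioning_status': 'PENDING_DELETE'}


client = FakeClient()
assert wait_for_delete(client.get, 'lb-1') is True
```

Wired into resource_cleanup for each test class, this would prevent one class's leftover VIP from colliding with the next class's address allocation.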
ERROR at setup of TestHealthMonitors.test_create_health_monitor_empty_tenant_id_field
cls = <class 'neutron_lbaas.tests.tempest.v2.api.test_health_monitor_admin.TestHealthMonitors'>
    @classmethod
    def resource_setup(cls):
        super(TestHealthMonitors, cls).resource_setup()
        if not test.is_extension_enabled('lbaas', 'network'):
            msg = "lbaas extension not enabled."
            raise cls.skipException(msg)
        network_name = data_utils.rand_name('network-')
        cls.network = cls.create_network(network_name)
        cls.subnet = cls.create_subnet(cls.network)
        cls.load_balancer = cls._create_load_balancer(
            tenant_id=cls.subnet.get('tenant_id'),
            vip_subnet_id=cls.subnet.get('id'))
        cls.listener = cls._create_listener(
            loadbalancer_id=cls.load_balancer.get('id'),
>           protocol='HTTP', protocol_port=80)
cls = <class 'neutron_lbaas.tests.tempest.v2.api.test_health_monitor_admin.TestHealthMonitors'>
network_name = 'network--2075181212'
test_health_monitor_admin.py:53:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
base.py:314: in _create_listener
cls._wait_for_load_balancer_status(cls.load_balancer.get('id'))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
cls = <class 'neutron_lbaas.tests.tempest.v2.api.test_health_monitor_admin.TestHealthMonitors'>
load_balancer_id = '8898fb3e-62cd-4dac-b1be-f0a7347d2439'
provisioning_status = 'ACTIVE', operating_status = 'ONLINE', delete = False
ignore_operating_status = False
    @classmethod
    def _wait_for_load_balancer_status(cls, load_balancer_id,
                                       provisioning_status='ACTIVE',
                                       operating_status='ONLINE',
                                       delete=False,
                                       ignore_operating_status=False):
        interval_time = 1
        timeout = 60
        end_time = time.time() + timeout
        lb = {}
        while time.time() < end_time:
            try:
                lb = cls.load_balancers_client.get_load_balancer(
                    load_balancer_id)
                if not lb:
                    # loadbalancer not found
                    if delete:
                        break
                    else:
                        raise Exception(
                            ("loadbalancer {lb_id} not"
                             " found").format(
                                lb_id=load_balancer_id))
                if lb.get('provisioning_status') == provisioning_status:
                    if ignore_operating_status:
                        break
                    if lb.get('operating_status') == operating_status:
                        break
                time.sleep(interval_time)
            except exceptions.NotFound as e:
                # if wait is for delete operation do break
                if delete:
                    break
                else:
                    # raise original exception
                    raise e
        else:
            if delete:
                raise exceptions.TimeoutException(
                    ("Waited for load balancer {lb_id} to be deleted for "
                     "{timeout} seconds but can still observe that it "
                     "exists.").format(
                        lb_id=load_balancer_id,
                        timeout=timeout))
            else:
                raise exceptions.TimeoutException(
                    ("Wait for load balancer ran for {timeout} seconds and "
                     "did not observe {lb_id} reach {provisioning_status} "
                     "provisioning status and {operating_status} "
                     "operating status.").format(
                        timeout=timeout,
                        lb_id=load_balancer_id,
                        provisioning_status=provisioning_status,
>                       operating_status=operating_status))
E TimeoutException: Request timed out
E Details: Wait for load balancer ran for 60 seconds and did not observe 8898fb3e-62cd-4dac-b1be-f0a7347d2439 reach ACTIVE provisioning status and ONLINE operating status.
cls = <class 'neutron_lbaas.tests.tempest.v2.api.test_health_monitor_admin.TestHealthMonitors'>
delete = False
end_time = 1502960006.280383
ignore_operating_status = False
interval_time = 1
lb = {u'operating_status': u'OFFLINE', u'vip_address': u'10.2.4.3', u'admin_state_u...43d38fa78e6a736efbaa', u'vip_port_id': u'5b407418-b7d7-4283-b1c9-96224e9a4bee'}
load_balancer_id = '8898fb3e-62cd-4dac-b1be-f0a7347d2439'
operating_status = 'ONLINE'
provisioning_status = 'ACTIVE'
timeout = 60
base.py:307: TimeoutException
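Note how the TimeoutException gets raised: _wait_for_load_balancer_status relies on Python's while/else construct, where the else clause runs only when the loop condition becomes false (the 60-second budget is exhausted), not when the loop exits via break (the target status was observed). A small self-contained illustration of that pattern, with a counter standing in for the wall-clock deadline:

```python
def poll(statuses, wanted, max_polls):
    """Return the number of polls it took to observe `wanted`.

    `statuses` is a sequence of observed statuses; the while/else shape
    mirrors _wait_for_load_balancer_status: `break` on success skips the
    else clause, while exhausting the loop condition reaches it.
    """
    it = iter(statuses)
    polls = 0
    while polls < max_polls:
        polls += 1
        if next(it) == wanted:
            break  # success: the else clause below is skipped
    else:
        # Runs only when the condition went false, i.e. we timed out.
        raise TimeoutError(
            "never observed %r in %d polls" % (wanted, max_polls))
    return polls


# The failing run above is the else branch: the load balancer stayed
# OFFLINE (see the `lb` locals) for the whole 60-second window.
assert poll(['OFFLINE', 'OFFLINE', 'ONLINE'], 'ONLINE', 5) == 3
```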
openstack_driver_mitaka_11.6.1-undercloud-vxlan.82.consoleText
TestHealthMonitors.test_create_health_monitor_empty_tenant_id_field
openstack_driver_mitaka_11.6.1-undercloud-vxlan.82.consoleText
TestHealthMonitors.test_create_health_monitor_for_another_tenant_id_field
openstack_driver_mitaka_11.6.1-undercloud-vxlan.82.consoleText
TestHealthMonitors.test_create_health_monitor_missing_tenant_id_field
Another instance:
cls = <class 'neutron_lbaas.tests.tempest.v2.api.test_load_balancers_admin.LoadBalancersTestAdmin'>
    @classmethod
    def resource_setup(cls):
        super(LoadBalancersTestAdmin, cls).resource_setup()
        if not test.is_extension_enabled('lbaas', 'network'):
            msg = "lbaas extension not enabled."
            raise cls.skipException(msg)
        network_name = data_utils.rand_name('network')
        cls.network = cls.create_network(network_name)
        cls.subnet = cls.create_subnet(cls.network)
        cls.load_balancer = cls.load_balancers_client.create_load_balancer(
            vip_subnet_id=cls.subnet['id'])
>       cls._wait_for_load_balancer_status(cls.load_balancer['id'])
cls = <class 'neutron_lbaas.tests.tempest.v2.api.test_load_balancers_admin.LoadBalancersTestAdmin'>
network_name = 'network-570852410'
test_load_balancers_admin.py:49:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
cls = <class 'neutron_lbaas.tests.tempest.v2.api.test_load_balancers_admin.LoadBalancersTestAdmin'>
load_balancer_id = '74274360-26b9-4045-8a83-d6337571e31c'
provisioning_status = 'ACTIVE', operating_status = 'ONLINE', delete = False
ignore_operating_status = False
    @classmethod
    def _wait_for_load_balancer_status(cls, load_balancer_id,
                                       provisioning_status='ACTIVE',
                                       operating_status='ONLINE',
                                       delete=False,
                                       ignore_operating_status=False):
        interval_time = 1
        timeout = 60
        end_time = time.time() + timeout
        lb = {}
        while time.time() < end_time:
            try:
                lb = cls.load_balancers_client.get_load_balancer(
                    load_balancer_id)
                if not lb:
                    # loadbalancer not found
                    if delete:
                        break
                    else:
                        raise Exception(
                            ("loadbalancer {lb_id} not"
                             " found").format(
                                lb_id=load_balancer_id))
                if lb.get('provisioning_status') == provisioning_status:
                    if ignore_operating_status:
                        break
                    if lb.get('operating_status') == operating_status:
                        break
                time.sleep(interval_time)
            except exceptions.NotFound:
                # if wait is for delete operation do break
                if delete:
                    break
                else:
                    # raise original exception
                    raise
        else:
            if delete:
                raise exceptions.TimeoutException(
                    ("Waited for load balancer {lb_id} to be deleted for "
                     "{timeout} seconds but can still observe that it "
                     "exists.").format(
                        lb_id=load_balancer_id,
                        timeout=timeout))
            else:
                raise exceptions.TimeoutException(
                    ("Wait for load balancer ran for {timeout} seconds and "
                     "did not observe {lb_id} reach {provisioning_status} "
                     "provisioning status and {operating_status} "
                     "operating status.").format(
                        timeout=timeout,
                        lb_id=load_balancer_id,
                        provisioning_status=provisioning_status,
>                       operating_status=operating_status))
E TimeoutException: Request timed out
Tests:
openstack_driver_newton_12.1.2-undercloud-vxlan.2.consoleText
LoadBalancersTestAdmin.test_create_load_balancer_empty_tenant_id_field
openstack_driver_newton_12.1.2-undercloud-vxlan.2.consoleText
LoadBalancersTestAdmin.test_create_load_balancer_missing_tenant_id_for_tenant
openstack_driver_newton_12.1.2-undercloud-vxlan.2.consoleText
LoadBalancersTestAdmin.test_delete_load_balancer_for_tenant
openstack_driver_newton_12.1.2-undercloud-vxlan.2.consoleText
LoadBalancersTestAdmin.test_update_load_balancer_description
These mass failures seem to occur roughly half the time the nightly tests run. This particular output was pulled from the mitaka 11.6.1 overcloud tests, but the failures may appear in other deployments as well. The tests are as follows:
The pattern of errors is the same every time this happens. The tempest output is similar for all errors: