Loadbalancer doesn't get recreated when deleted

Bug #1824332 reported by Michal Dulko
Affects: kuryr-kubernetes
Status: New
Importance: Undecided
Assigned to: Unassigned

Bug Description

Looks like we never recreate a loadbalancer if it gets deleted. I understand that it shouldn't happen™, but that's not really the K8s way of thinking. In K8s we should observe the actual state of the system and, if it differs from the desired state, try to fix it.
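
That reconcile pattern is cheap to express. A minimal sketch (plain Python on top of openstacksdk; the ensure_loadbalancer name and the desired_spec shape are hypothetical illustrations, not actual kuryr-kubernetes code):

    import openstack

    def ensure_loadbalancer(conn, lb_id, desired_spec):
        # Observe: does the LB we recorded in the annotation still exist?
        lb = conn.load_balancer.find_load_balancer(lb_id)
        if lb is None:
            # Drift: the actual state lost the LB, so converge back to
            # the desired state by recreating it instead of assuming
            # it is still there.
            lb = conn.load_balancer.create_load_balancer(**desired_spec)
        return lb

find_load_balancer() returns None on a missing LB instead of raising, which is what makes the "observe, then converge" step a one-liner here.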

Moreover, if an LB gets deleted and the underlying deployment gets scaled, we end up with this in the logs and kuryr-controller restarting constantly:

2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging Traceback (most recent call last):
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python2.7/site-packages/kuryr_kubernetes/handlers/logging.py", line 37, in __call__
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging self._handler(event)
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python2.7/site-packages/kuryr_kubernetes/handlers/retry.py", line 56, in __call__
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging self._handler(event)
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python2.7/site-packages/kuryr_kubernetes/handlers/k8s_base.py", line 75, in __call__
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging self.on_present(obj)
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python2.7/site-packages/kuryr_kubernetes/controller/handlers/lbaas.py", line 184, in on_present
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging if self._sync_lbaas_members(endpoints, lbaas_state, lbaas_spec):
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python2.7/site-packages/kuryr_kubernetes/controller/handlers/lbaas.py", line 275, in _sync_lbaas_members
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging self._add_new_members(endpoints, lbaas_state, lbaas_spec)):
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python2.7/site-packages/kuryr_kubernetes/controller/handlers/lbaas.py", line 377, in _add_new_members
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging listener_port=listener_port)
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python2.7/site-packages/kuryr_kubernetes/controller/drivers/lbaasv2.py", line 482, in ensure_member
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging self._find_member)
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python2.7/site-packages/kuryr_kubernetes/controller/drivers/lbaasv2.py", line 697, in _ensure_provisioned
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging self._wait_for_provisioning(loadbalancer, remaining, interval)
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python2.7/site-packages/kuryr_kubernetes/controller/drivers/lbaasv2.py", line 725, in _wait_for_provisioning
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging response = lbaas.get_load_balancer(loadbalancer.id)
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python2.7/site-packages/openstack/load_balancer/v2/_proxy.py", line 55, in get_load_balancer
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging return self._get(_lb.LoadBalancer, *attrs)
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python2.7/site-packages/openstack/proxy.py", line 37, in check
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging return method(self, expected, actual, *args, **kwargs)
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python2.7/site-packages/openstack/proxy.py", line 254, in _get
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging resource_type=resource_type.__name__, value=value))
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python2.7/site-packages/openstack/resource.py", line 1151, in fetch
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging self._translate_response(response, **kwargs)
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python2.7/site-packages/openstack/resource.py", line 962, in _translate_response
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging exceptions.raise_from_response(response, error_message=error_message)
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging File "/usr/lib/python2.7/site-packages/openstack/exceptions.py", line 229, in raise_from_response
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging http_status=http_status, request_id=request_id
2019-04-11 11:01:06.346 1 ERROR kuryr_kubernetes.handlers.logging ResourceNotFound: No LoadBalancer found for 7ebf0e47-28a9-4264-b265-0e466806b37b: Client Error for url: http://192.168.0.14/load-balancer/v2/lbaas/loadbalancers/7ebf0e47-28a9-4264-b265-0e466806b37b, Not Found

I believe this is not acceptable.
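
Even before full recreation support lands, the controller shouldn't crash-loop on the 404. A rough sketch of guarding the polling call from the traceback (a simplified stand-in, not the real _wait_for_provisioning):

    from openstack import exceptions as os_exc

    def get_lb_or_none(lbaas, lb_id):
        # lbaas is the openstacksdk load_balancer proxy seen in the
        # traceback; treat a 404 as "the LB is gone" rather than fatal.
        try:
            return lbaas.get_load_balancer(lb_id)
        except os_exc.ResourceNotFound:
            return None  # caller can then trigger recreation

ResourceNotFound is exactly the exception raised at the bottom of the traceback, so catching it at this level would let on_present fall back to recreating the LB instead of killing kuryr-controller.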
