karsten42 opened 4 months ago
/area cluster-autoscaler
/area provider/hetzner
Certain server types might be restricted for periods of time. During that time, your previous orders are taken into account to evaluate if you can create more servers of these types. Depending on the accounts you used and the precise timing, this can always happen.
This is not a bug in the Hetzner Provider in cluster-autoscaler, but inherent behavior of our platform.
I see. Thanks for the explanation. Is there anything one could do to become unrestricted? This issue persisted for multiple hours which is very problematic if you have pods stuck in pending.
Hi, I want to ask about this issue. Is there any documentation about the restriction? The restriction really hurts the stability of the cluster.
There is a status message about the limited availability of cx plans.
I would recommend using multiple node groups with different types/locations and the priority expander, so the autoscaler tries your preferred node group first but falls back to other types/locations if the preferred one is not available.
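For reference, a minimal sketch of such a priority expander setup, assuming the autoscaler is started with `--expander=priority`: it reads a ConfigMap named `cluster-autoscaler-priority-expander` in its own namespace, mapping priorities (higher wins) to regular expressions matched against node group names. The node group names below are hypothetical placeholders:

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: cluster-autoscaler-priority-expander
  namespace: kube-system
data:
  priorities: |-
    # Highest priority: the preferred server type/location (hypothetical name).
    20:
      - pool-ccx23-nbg1.*
    # Fallback pools in another location or with a different type.
    10:
      - pool-ccx23-fsn1.*
      - pool-cpx41-nbg1.*
    # Catch-all so scale-up never gets stuck if nothing else matches.
    1:
      - .*
```

With this in place, an unavailable server type in one location only delays scale-up until the autoscaler falls back to the next-priority node group, instead of leaving pods pending.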
The Kubernetes project currently lacks enough contributors to adequately respond to all issues.
This bot triages un-triaged issues according to the following rules:
- After 90d of inactivity, lifecycle/stale is applied
- After 30d of inactivity since lifecycle/stale was applied, lifecycle/rotten is applied
- After 30d of inactivity since lifecycle/rotten was applied, the issue is closed

You can:
- Mark this issue as fresh with /remove-lifecycle stale
- Close this issue with /close
Please send feedback to sig-contributor-experience at kubernetes/community.
/lifecycle stale
Which component are you using?: cluster-autoscaler
What version of the component are you using?:
Component version: v1.30.1
What environment is this in?: Hetzner
What did you expect to happen?: A new node being created and added to the cluster
What happened instead?: An error saying that the node cannot be provisioned, even though it is possible to create an instance with the same specs in the same location via the UI. Error:
hetzner_node_group.go:120] failed to create error: could not create server type ccx23 in region nbg1: we are unable to provision servers for this location, try with a different location or try later (resource_unavailable)