Name:                 cilium-operator-869df985b8-z9f56
Namespace:            kube-system
Priority:             2000000000
Priority Class Name:  system-cluster-critical
Service Account:      cilium-operator
Node:                 instance/162.253.55.110
Start Time:           Wed, 04 Mar 2026 12:53:30 +0000
Labels:               app.kubernetes.io/name=cilium-operator
                      app.kubernetes.io/part-of=cilium
                      io.cilium/app=operator
                      name=cilium-operator
                      pod-template-hash=869df985b8
Annotations:          <none>
Status:               Running
IP:                   162.253.55.110
IPs:
  IP:  162.253.55.110
Controlled By:  ReplicaSet/cilium-operator-869df985b8
Containers:
  cilium-operator:
    Container ID:  containerd://b2a4b48a0b0e97e356db30de8560ee97cdfa71584092df1f7f0fb91e2fbd39c9
    Image:         harbor.atmosphere.dev/quay.io/cilium/operator-generic:v1.14.8
    Image ID:      harbor.atmosphere.dev/quay.io/cilium/operator-generic@sha256:56d373c12483c09964a00a29246595917603a077a298aa90a98e4de32c86b7dc
    Port:          <none>
    Host Port:     <none>
    Command:
      cilium-operator-generic
    Args:
      --config-dir=/tmp/cilium/config-map
      --debug=$(CILIUM_DEBUG)
    State:          Running
      Started:      Wed, 04 Mar 2026 13:45:43 +0000
    Last State:     Terminated
      Reason:       Error
      Message:      fo msg=Stopping subsys=hive
                    level=info msg="Stop hook executed" duration="66.422µs" function="*api.server.Stop" subsys=hive
                    level=info msg="Stop hook executed" duration="48.582µs" function="identitygc.registerGC.func2 (gc.go:122)" subsys=hive
                    level=info msg="Stop hook executed" duration="647.524µs" function="cmd.(*legacyOnLeader).onStop" subsys=hive
                    level=info msg="Stop hook executed" duration="95.822µs" function="*resource.resource[*github.com/cilium/cilium/pkg/k8s/slim/k8s/api/networking/v1.IngressClass].Stop" subsys=hive
                    level=info msg="Stop hook executed" duration="5.491µs" function="*resource.resource[*github.com/cilium/cilium/pkg/k8s/apis/cilium.io/v2alpha1.CiliumPodIPPool].Stop" subsys=hive
                    level=info msg="Stop hook executed" duration="2.77µs" function="*resource.resource[*github.com/cilium/cilium/pkg/k8s.Endpoints].Stop" subsys=hive
                    level=info msg="Stop hook executed" duration="95.803µs" function="*resource.resource[*github.com/cilium/cilium/pkg/k8s/apis/cilium.io/v2.CiliumIdentity].Stop" subsys=hive
                    level=info msg="LB-IPAM done initializing" subsys=lbipam
                    level=info msg="Stop hook executed" duration="187.454µs" function="*job.group.Stop" subsys=hive
                    level=info msg="Stop hook executed" duration="98.242µs" function="*resource.resource[*github.com/cilium/cilium/pkg/k8s/slim/k8s/api/core/v1.Service].Stop" subsys=hive
                    level=info msg="Stop hook executed" duration="79.432µs" function="*resource.resource[*github.com/cilium/cilium/pkg/k8s/apis/cilium.io/v2alpha1.CiliumLoadBalancerIPPool].Stop" subsys=hive
                    level=info msg="Stop hook executed" duration=1.884233ms function="cmd.registerOperatorHooks.func2 (root.go:166)" subsys=hive
                    level=info msg="Stop hook executed" duration="18.921µs" function="client.(*compositeClientset).onStop" subsys=hive
                    level=info msg="Stopped gops server" address="127.0.0.1:9891" subsys=gops
                    level=info msg="Stop hook executed" duration=42.798589ms function="gops.registerGopsHooks.func2 (cell.go:50)" subsys=hive
                    level=fatal msg="Leader election lost" subsys=cilium-operator-generic

      Exit Code:    1
      Started:      Wed, 04 Mar 2026 12:53:47 +0000
      Finished:     Wed, 04 Mar 2026 13:45:04 +0000
    Ready:          True
    Restart Count:  1
    Liveness:       http-get http://127.0.0.1:9234/healthz delay=60s timeout=3s period=10s #success=1 #failure=3
    Readiness:      http-get http://127.0.0.1:9234/healthz delay=0s timeout=3s period=5s #success=1 #failure=5
    Environment:
      K8S_NODE_NAME:          (v1:spec.nodeName)
      CILIUM_K8S_NAMESPACE:   kube-system (v1:metadata.namespace)
      CILIUM_DEBUG:           Optional: true
    Mounts:
      /tmp/cilium/config-map from cilium-config-path (ro)
      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-lhg77 (ro)
Conditions:
  Type              Status
  Initialized       True
  Ready             True
  ContainersReady   True
  PodScheduled      True
Volumes:
  cilium-config-path:
    Type:      ConfigMap (a volume populated by a ConfigMap)
    Name:      cilium-config
    Optional:  false
  kube-api-access-lhg77:
    Type:                    Projected (a volume that contains injected data from multiple sources)
    TokenExpirationSeconds:  3607
    ConfigMapName:           kube-root-ca.crt
    ConfigMapOptional:       <nil>
    DownwardAPI:             true
QoS Class:                   BestEffort
Node-Selectors:              kubernetes.io/os=linux
                             node-role.kubernetes.io/control-plane=
Tolerations:                 op=Exists
Events:
  Type     Reason     Age                 From     Message
  ----     ------     ----                ----     -------
  Warning  Unhealthy  28m (x3 over 28m)   kubelet  Liveness probe failed: Get "http://127.0.0.1:9234/healthz": dial tcp 127.0.0.1:9234: connect: connection refused
  Normal   Killing    28m                 kubelet  Container cilium-operator failed liveness probe, will be restarted
  Normal   Pulled     28m                 kubelet  Container image "harbor.atmosphere.dev/quay.io/cilium/operator-generic:v1.14.8" already present on machine
  Warning  Unhealthy  28m (x10 over 28m)  kubelet  Readiness probe failed: Get "http://127.0.0.1:9234/healthz": dial tcp 127.0.0.1:9234: connect: connection refused
  Normal   Created    28m (x2 over 80m)   kubelet  Created container cilium-operator
  Normal   Started    28m (x2 over 80m)   kubelet  Started container cilium-operator
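The Last State message tells the story: the previous operator instance lost its Kubernetes leader-election lease, ran its shutdown hooks, and exited fatally with code 1; the probe failures in Events are the kubelet hitting :9234 before the replacement process was listening. If the lease loss was caused by a slow or briefly unreachable API server (rather than an operator bug), the operator's leader-election timings can be relaxed. Since the pod runs with `--config-dir=/tmp/cilium/config-map` (backed by the `cilium-config` ConfigMap), one hedged sketch is to add the matching keys to that ConfigMap — the key names mirror the cilium-operator flags, and the durations below are illustrative, not recommended defaults:

```yaml
# Sketch only: longer lease/renew windows make leader election more tolerant
# of API-server latency. Verify the flag names against your Cilium version
# (cilium-operator-generic --help) before applying.
apiVersion: v1
kind: ConfigMap
metadata:
  name: cilium-config
  namespace: kube-system
data:
  leader-election-lease-duration: "30s"   # default is shorter; example value
  leader-election-renew-deadline: "20s"   # must be < lease-duration
  leader-election-retry-period: "5s"      # must be < renew-deadline
```

After patching, the operator pods need a restart to pick up the new config; comparing `kubectl describe lease -n kube-system` timestamps against API-server latency during the incident window would confirm whether lease renewal was actually the bottleneck.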