The CNCF survey finds that 82% of container users run Kubernetes in production, and that 66% of organizations hosting GenAI workloads use it for inference.
Kubernetes now stitches together data processing, distributed training, LLM inference, and autonomous agents through projects such as Spark, Kubeflow, Kueue, KServe, and Armada.
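As a sketch of how the serving layer looks in practice, the fragment below is a minimal KServe InferenceService; the resource name, model URI, and model format are illustrative assumptions, not taken from the text.

```yaml
# Hypothetical KServe InferenceService for LLM inference.
# Name, storageUri, and runtime details are examples, not prescriptive.
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: llm-demo                      # example name
spec:
  predictor:
    model:
      modelFormat:
        name: huggingface             # assumes the Hugging Face serving runtime
      storageUri: "hf://org/model"    # placeholder model reference
      resources:
        limits:
          nvidia.com/gpu: "1"         # request one GPU for the predictor
```

KServe reconciles this object into a deployment, autoscaling, and routing, which is what lets inference sit alongside batch and training workloads on the same cluster.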
GPU sharing and scheduling have advanced through MIG, time-slicing, MPS, and Dynamic Resource Allocation (DRA), while Karpenter cuts node costs and SOCI shrinks image startup time.
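To illustrate MIG-based sharing concretely, a pod can request a single MIG slice instead of a whole GPU; the sketch below assumes the NVIDIA device plugin in its mixed MIG strategy, which exposes slice-sized resources such as `nvidia.com/mig-1g.5gb`. Pod name and image are illustrative.

```yaml
# Hypothetical pod requesting one MIG slice of an A100
# (assumes NVIDIA GPU Operator / device plugin, mixed MIG strategy).
apiVersion: v1
kind: Pod
metadata:
  name: mig-worker                    # example name
spec:
  containers:
  - name: worker
    image: nvcr.io/nvidia/pytorch:24.01-py3   # example image
    resources:
      limits:
        nvidia.com/mig-1g.5gb: 1     # one 1g.5gb slice, not the full GPU
```

Because each slice is a distinct schedulable resource, several such pods can share one physical GPU with hardware-level isolation, which is the cost lever the sharing techniques above aim at.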










