People, stop misusing Kubernetes!
Unless you have a viable use case like edge computing
Author: Jari Timonen, Lead Cloud Architect
No matter what color of gift wrap you put around it, Kubernetes is complex and costly.
Initially, Google developed the predecessor of Kubernetes and named it Borg. Since then, Google has open-sourced the technology to benefit the broader community and to advance the state-of-the-art in container cluster management. And just like Google envisioned, Kubernetes has become a crucial part of modern container orchestration. Virtually everything in Google’s own environments, for example, runs as a container, managed with Kubernetes. To me, this seems like solid proof of Kubernetes’ reliability and scalability for huge corporations like Google.
But seriously, how many companies in the Nordics are Google, or even come close?
The fact remains that moving from virtual machines to containers and Kubernetes is a big investment, and that step just isn’t for most companies and organizations. I am concerned that countless companies whose business comes closer to “man and dog” than to the global cloud giants are spending their time playing with Kubernetes.
However, one viable use case for Kubernetes is emerging: edge computing. As our CTO Markku Tuomala wrote in his recent blog, edge computing – processing data closer to its source – offers big benefits in terms of latency, bandwidth, and efficiency for large industrial companies, telecom operators, and electricity providers.
Justifying Kubernetes: Edge computing
In the otherwise fast-changing world of industrial technology, edge computing has been annoyingly “just around the corner” for years. Things are about to change, however, since a number of very handy technologies from Google are making the orchestration of Kubernetes clusters more achievable. In this blog, I will share my experience of how GKE, GKE Enterprise, and Anthos can revolutionize edge computing for industries that need very low-latency online services.
Google Kubernetes Engine (GKE) is Google’s managed Kubernetes service. It’s a robust solution for building and managing the capabilities needed for edge computing. Anthos, in turn, extends GKE to manage Kubernetes clusters across multi-cloud and hybrid environments. GKE Enterprise, the newest addition to the mix, allows Kubernetes clusters to be managed in a multitenant architecture across clouds and on-premises environments, eliminating the need for extra servers. Google Distributed Cloud, finally, combines software and hardware into a fully integrated system that supports edge computing scenarios, among others.
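To make the idea of centralized, multi-environment visibility a bit more concrete, here is a minimal sketch of what listing a project’s GKE clusters across all locations can look like with Google’s Python client library (google-cloud-container). The project ID is a placeholder of mine, and the sketch assumes application-default credentials are already set up – treat it as an illustration, not as the exact tooling described above.

```python
# A minimal sketch, assuming google-cloud-container is installed
# (pip install google-cloud-container) and credentials are configured.
from google.cloud import container_v1


def list_gke_clusters(project_id: str) -> None:
    """Print every GKE cluster in the project, across all regions and zones."""
    client = container_v1.ClusterManagerClient()
    # "locations/-" is the wildcard for "all locations".
    request = container_v1.ListClustersRequest(
        parent=f"projects/{project_id}/locations/-"
    )
    response = client.list_clusters(request=request)
    for cluster in response.clusters:
        print(f"{cluster.name:30} {cluster.location:15} {cluster.status.name}")


if __name__ == "__main__":
    list_gke_clusters("my-edge-project")  # hypothetical project ID
```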
A standout feature of GKE is its team management capabilities. GKE allows clusters to be distributed and specific teams to be assigned to manage them. For example, team members in different locations – Pertti in Seinäjoki and Petra in Stockholm – can be given access and control from the cloud, eliminating manual interventions. This centralized control ensures that all the necessary tools and permissions come included in the package, simplifying operations significantly.
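As a rough illustration of what “assigning a team to its own clusters” can boil down to at the Kubernetes level, the sketch below binds the built-in edit role to a team’s group in one namespace of one edge cluster. The context, group, and namespace names are made up, and in a real GKE Enterprise setup this kind of access is usually granted through fleet and IAM team features rather than raw RBAC, so this is purely a sketch of the underlying idea.

```python
# A minimal sketch, assuming the official Kubernetes Python client
# (pip install kubernetes) and a kubeconfig with one context per edge cluster.
# Context, group, and namespace names below are hypothetical.
from kubernetes import client, config


def grant_team_access(context: str, team_group: str, namespace: str) -> None:
    """Bind the built-in 'edit' ClusterRole to a team's group in one namespace."""
    api_client = config.new_client_from_config(context=context)
    rbac = client.RbacAuthorizationV1Api(api_client=api_client)
    body = {
        "apiVersion": "rbac.authorization.k8s.io/v1",
        "kind": "RoleBinding",
        "metadata": {"name": f"{team_group}-edit"},
        "subjects": [
            {
                "kind": "Group",
                "name": team_group,
                "apiGroup": "rbac.authorization.k8s.io",
            }
        ],
        "roleRef": {
            "kind": "ClusterRole",
            "name": "edit",
            "apiGroup": "rbac.authorization.k8s.io",
        },
    }
    rbac.create_namespaced_role_binding(namespace=namespace, body=body)


# e.g. give the Seinäjoki team edit rights in its own namespace:
grant_team_access("seinajoki-edge", "team-seinajoki", "factory-apps")
```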
In edge computing scenarios, GKE offers unmatched ease of management. Updating a cluster, for example, can be as simple as making one change and deploying it across the network. This ease of operation is crucial in environments where Kubernetes management and updates are otherwise difficult. Major League Baseball in North America, for instance, uses Anthos to host applications like real-time game analytics, which need to run locally at the ballpark for performance reasons.
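To illustrate the “one change, every cluster” idea, here is a sketch that pushes the same image tag change to a deployment in each edge cluster found in a kubeconfig. All names are hypothetical, and in an actual Anthos or GKE Enterprise fleet this kind of rollout would normally be handled declaratively by the platform’s config management rather than by an imperative script like this.

```python
# A minimal sketch, assuming the official Kubernetes Python client and a
# kubeconfig with one context per edge cluster. Deployment name, namespace,
# image, and context names are hypothetical.
from kubernetes import client, config

EDGE_CONTEXTS = ["seinajoki-edge", "stockholm-edge"]  # hypothetical contexts


def roll_out_image(image: str, name: str = "analytics", namespace: str = "edge") -> None:
    """Apply the same image change to the deployment in every edge cluster."""
    # Strategic-merge patch: only the container image changes.
    patch = {
        "spec": {
            "template": {
                "spec": {"containers": [{"name": name, "image": image}]}
            }
        }
    }
    for context in EDGE_CONTEXTS:
        api_client = config.new_client_from_config(context=context)
        apps = client.AppsV1Api(api_client=api_client)
        apps.patch_namespaced_deployment(name=name, namespace=namespace, body=patch)
        print(f"{context}: rolled {name} to {image}")


if __name__ == "__main__":
    roll_out_image("europe-north1-docker.pkg.dev/my-edge-project/apps/analytics:1.2.3")
```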
Telia and Codento lead the way to Edge as a Service
Over the past two years, Codento’s team has pioneered using GKE and GKE Enterprise for edge computing. I am proud to say that we have achieved something no one else in the world has yet.
Our journey with the Nordic telecom giant Telia began 2.5 years ago. Telia wanted to maximize the return on their 5G network investments beyond speed alone. They also wanted to test Anthos’s capabilities in multi-cluster management.
Our joint efforts have been successful. Significant improvements in multi-cluster management have reduced the time needed to run system upgrades from weeks to the minute it takes to change a single configuration value. The first pilot customer is already using Telia’s platform.
Eye on the ball – Kubernetes can add or dilute value
Despite its advantages, Kubernetes is still complex and costly, and often rightfully seen as a last resort. Managing multiple Kubernetes clusters is labor-intensive and expensive, so it usually makes sense only for organizations with strong technology know-how and advanced cloud environments.
Today, however, the burden of management and monitoring is much lower, allowing teams to focus on innovation and growth. GKE Enterprise, with its robust features and ease of multitenant environment management, will in my opinion be a game-changer for large industrial companies and service providers looking to harness the power of edge computing. By simplifying cluster operations and offering centralized control, GKE Enterprise enables businesses – specifically those with the maturity needed to lead modern cloud teams – to deploy and manage edge computing capabilities efficiently.
When all these prerequisites are fulfilled, Kubernetes will stop being a value destroyer that sucks time and energy and become a driver of innovation and operational excellence.
Key takeaways:
- Google Kubernetes Engine (GKE), GKE Enterprise, Anthos, and Google Distributed Cloud offer a comprehensive solution for managing Kubernetes clusters across different environments.
- Kubernetes has traditionally been seen as costly and complex, but these technologies make it more accessible, enabling advanced solutions like edge computing.
- With GKE Enterprise, telecom players like Telia already offer their customers multitenant edge computing services based on Kubernetes clusters.
About the author:
Jari Timonen is an experienced software professional with more than 20 years of experience in the IT field. Jari’s passion is to build bridges between business and technical teams, something he also did in his previous position at Cargotec. At Codento, he is in his element piloting customers towards future-proof cloud and hybrid cloud environments.
Stay tuned for more detailed information and examples of the use cases! If you need more information about specific scenarios or want to schedule a free workshop to explore the opportunities in your organization, feel free to reach out to us.