
The CIO’s Guide to Kubernetes and Eventual Transition to Cloud-Native Development


Cloud-native has emerged as the gold standard of software development, revolutionizing the way we develop, deploy, and operate software applications at scale. For the uninitiated, cloud-native development goes beyond simply signing up with a cloud vendor: it shapes the design, implementation, deployment, and operation of your application.

It’s an approach to building software applications as microservices and running them on a containerized, dynamically orchestrated platform to harness the full potential of the cloud computing model.

While cloud-native development brings immense benefits to the engineers building the application, it is equally valuable to organizations looking to cut costs and time to market.

Cloud-native development or business-as-usual?

Time to market is the key differentiator standing between your innovative sprint and the closing-in competition. As an organization, you want to conceive, build, and deliver additional value to your customers without causing disruption. Modern DevOps practices focus on automating the software delivery pipeline: build, test, and deployment. As a result, those pipelines are faster and more predictable than ever.

DevOps processes are at the center of today’s cloud-native development efforts. They enable end-to-end automation of delivery pipelines and open new avenues of collaboration, removing the restrictions of the local-development and manual-pipeline era.

Delivering a consistent experience to your customers regardless of their device is another contribution of the cloud. Today that is more of a need than a luxury, and in a cloud-native world, cross-device compatibility comes as a given. Much of the credit goes to API-based integration, which connects vast enterprise data stores with light-footed front-end apps running on all sorts of devices and browsers.

Additionally, you don’t have to abandon decades of investment in legacy platforms. Rather, you can extend their application lifecycles by exposing them to mobile and web applications. If your traditional business models are being threatened and disrupted by smaller, more agile startups, cloud-native development can give you the much-needed breathing space. But is cloud-native development all upside and no downside?
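To make that API-based integration pattern concrete, here is a minimal sketch of a thin HTTP facade, written in Python with Flask, that exposes a record from an existing system so any web or mobile front-end can consume it. The legacy lookup, endpoint path, and field names are hypothetical placeholders, not a prescription.

```python
# A minimal sketch of an API facade over a legacy data store, using Flask.
# The legacy lookup below is a stand-in for whatever system of record you already run.
from flask import Flask, jsonify

app = Flask(__name__)

def fetch_account_from_legacy(account_id: str) -> dict:
    # Placeholder: in practice this would query the existing mainframe, ERP,
    # or relational database that holds the authoritative record.
    return {"id": account_id, "status": "active", "balance": 1240.50}

@app.route("/api/v1/accounts/<account_id>")
def get_account(account_id: str):
    # Expose the legacy record as JSON so web and mobile front-ends
    # can consume it over HTTP, regardless of device or browser.
    return jsonify(fetch_account_from_legacy(account_id))

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

The legacy system stays where it is; only this thin layer needs to change as new front-ends appear.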

Is cloud-native development for your organization?

All the excitement around adopting cloud-native development at the enterprise level may fade away if you overlook the challenges of decomposing your legacy, monolithic applications into smaller microservices as part of a cloud-native strategy. If these challenges are not addressed in your cloud-native migration strategy, they may disrupt your users’ experience down the line.

While microservices remove some complexity from the individual services and make your application more scalable as a whole, they come with all the limitations of a distributed software system. To give you a hint, microservices are not always as independent as they are often sold to be: they may share dependencies between them.

Interdependent microservices can create a software system of higher complexity that is harder to manage, monitor, and debug. You should insist on microservices that are independent wherever possible, and discoverable wherever they are not.
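As an illustration of the “discoverable” part, here is a sketch of one service calling another through the stable DNS name Kubernetes assigns to every Service, instead of hard-coding instance addresses. The orders service and shop namespace are hypothetical names used only for this example.

```python
# A sketch of keeping microservices loosely coupled by discovering peers
# through Kubernetes' built-in Service DNS instead of hard-coded addresses.
import os
import requests

# Resolve the peer from configuration, falling back to the cluster DNS name
# that Kubernetes assigns to every Service: <service>.<namespace>.svc.cluster.local
ORDERS_URL = os.environ.get(
    "ORDERS_SERVICE_URL",
    "http://orders.shop.svc.cluster.local:8080",
)

def get_order(order_id: str) -> dict:
    # The caller never needs to know which pod (instance) answers;
    # the Service load-balances across healthy replicas.
    response = requests.get(f"{ORDERS_URL}/orders/{order_id}", timeout=2)
    response.raise_for_status()
    return response.json()
```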

Intercommunication between microservices becomes critical the moment a slow or unavailable service starts affecting overall application performance. The distributed nature of microservices doesn’t make monitoring your application’s performance any easier, either. Remember, you now need to monitor a system of microservices, and for each service there may be several instances running in parallel.
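One common way to keep a handle on this is to have every instance expose its own metrics for a monitoring system to scrape. Below is a minimal sketch using the prometheus_client Python library; the metric names and the simulated workload are made up for illustration.

```python
# A sketch of making each microservice instance observable with the
# prometheus_client library, so a monitoring system can scrape every replica.
import time
from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("orders_requests_total", "Requests handled by this instance")
LATENCY = Histogram("orders_request_seconds", "Request latency in seconds")

def handle_request():
    REQUESTS.inc()
    with LATENCY.time():
        time.sleep(0.05)  # stand-in for real work

if __name__ == "__main__":
    start_http_server(8000)  # exposes /metrics for Prometheus to scrape
    while True:
        handle_request()
```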

In the case of cloud-native applications and architectures, the threat of becoming too dependent on a particular cloud provider or service is especially great, because workloads can so easily be deployed in a way that ties them to a service only one cloud offers.

Fortunately, mitigating this cloud lock-in risk is manageable if, as noted, you prioritize it in your cloud-native migration strategy.

To begin, stick to community-based standards (some of them promoted by the Open Container Initiative, or OCI). They will ensure that, when the time comes, your workloads are easy to move between cloud vendors. Similarly, as you decide which cloud services to use when going cloud-native, check whether any of them rely on features that are unavailable from other vendors, and avoid those features, since they can lock you into that vendor.
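In code, the same principle often takes the form of a thin, provider-neutral interface that the application depends on, with vendor-specific implementations hidden behind it. The sketch below is a simplified Python illustration with made-up class names, not a complete storage layer.

```python
# A sketch of isolating provider-specific services behind a thin interface,
# so workloads stay portable between clouds. Names here are illustrative.
from abc import ABC, abstractmethod
from pathlib import Path

class BlobStore(ABC):
    """Provider-neutral contract the application codes against."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class LocalBlobStore(BlobStore):
    """Filesystem-backed implementation for development and tests.
    An S3- or GCS-backed class would implement the same two methods,
    keeping the rest of the application unaware of the vendor."""

    def __init__(self, root: str = "/tmp/blobs"):
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)

    def put(self, key: str, data: bytes) -> None:
        (self.root / key).write_bytes(data)

    def get(self, key: str) -> bytes:
        return (self.root / key).read_bytes()
```

Swapping clouds then means writing one new implementation of the interface, not rewriting every service that stores data.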

Cloud-native development with Kubernetes

Kubernetes is at the center of development efforts in the cloud-native world. Strictly speaking, cloud-native computing could take off without Kubernetes, yet for many organizations, Kubernetes is where their cloud-native strategy begins. Tellingly, a majority of CNCF projects include Kubernetes compatibility as a first-order feature.

Cloud-native development is often defined as the practice of building applications designed to leverage the cloud computing delivery model. In practice, this implies microservices architectures, “new” ways of packaging services (such as containers or “serverless” functions, as in AWS Lambda), and an application lifecycle driven by continuous testing, monitoring, and delivery.
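To show what that “serverless function” packaging looks like in practice, here is a minimal Python function following the AWS Lambda handler convention (an event dict plus a context object). The greeting logic is a placeholder; only the handler shape matters here.

```python
# A sketch of the "serverless" packaging style: the unit of deployment is a
# single function, following the AWS Lambda Python handler convention.
import json

def lambda_handler(event, context):
    # The platform invokes this function on demand and scales it for you;
    # there is no server process for the team to build, patch, or operate.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"greeting": f"Hello, {name}!"}),
    }
```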

Depending on your preferences, the tools of cloud-native development can run in the cloud or on-premises and can be open source or proprietary.

Nevertheless, a responsible cloud-native development strategy requires complementing a container-orchestration platform such as Kubernetes with a collection of other tools: application management, CI/CD, security, monitoring, observability, networking, and service discovery. This is particularly relevant when using pure open source tools that don’t include essential components and capabilities out of the box.

Kubernetes Stories from Netflix, Airbnb, and Spotify

The rapid rise of Netflix as the world's leading Internet entertainment service would simply not have been possible with monolithic databases and applications. Microservices were central to that success: as of April 2018, its streaming service was operating three million containers per week.

Airbnb’s transition from a monolithic to a microservices architecture is an inspiring one. The company needed to scale continuous delivery horizontally, with the objective of making continuous delivery available to its more than 1,000 engineers so they could add new services and move to cloud-native development. Adopting Kubernetes allows Airbnb to support those 1,000-plus engineers configuring and deploying over 250 critical services to Kubernetes in parallel. Today, Airbnb averages over 500 deploys per day.

Spotify tackled the issues of monolithic applications when microservices and containerization were still buzzwords. The company saw the potential early and began breaking its web services into microservices and packaging them in Docker containers. In 2017, Kubernetes was still the new kid on the block, and with its home-grown orchestrator Helios already in place, Spotify could have given the new container orchestration system a pass. Sure, Kubernetes had a growing community, but it was nowhere near the shining standard for managing containers that it is today. Even so, Kubernetes offered enough to convince Spotify to adopt it for its microservices architecture.

Read more about other Kubernetes success stories.

Kubernetes and a seamless transition to cloud-native, but...

Kubernetes brings a “microservices” approach to application building. To go cloud-native, you can split your main development team into many smaller teams, each focused on a single, smaller microservice. These teams are more agile because they have a more concentrated function. APIs between these microservices reduce the amount of cross-team communication needed to build and deploy. Eventually, you can scale numerous small teams of focused experts who each help maintain a fleet of thousands of machines.

In addition, Kubernetes enables your IT teams to run large applications across many containers more efficiently by handling much of the essential work of maintaining container-based apps.
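As a small illustration of that management work, the sketch below uses the official Kubernetes Python client to report whether each Deployment in a namespace has the number of ready replicas it asked for. It assumes a local kubeconfig and uses the default namespace purely as an example.

```python
# A sketch, using the official Kubernetes Python client, of how an IT team
# might check that the orchestrator is keeping each app at its desired scale.
from kubernetes import client, config

def report_deployments(namespace: str = "default") -> None:
    config.load_kube_config()  # or config.load_incluster_config() inside a pod
    apps = client.AppsV1Api()
    for dep in apps.list_namespaced_deployment(namespace).items:
        desired = dep.spec.replicas or 0
        ready = dep.status.ready_replicas or 0
        print(f"{dep.metadata.name}: {ready}/{desired} replicas ready")

if __name__ == "__main__":
    report_deployments()
```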

However, the paradigm shift that comes with moving your team to Kubernetes can add complexity on top of your existing functional and product complexity, especially if you are developing microservices:

More moving pieces equals more touchpoints equals more room for failures and more entropy.

Read more about Kubernetes Failure Stories.

Kubernetes is not complicated, but it is complex. And as we can observe across the distributed-computing landscape, that complexity is steadily being pushed out of sight.

Several companies, from giants to startups, are working on taming this complexity. The general idea is to create management overlays that automate and standardize common tasks and act as simplification levers.

We find this drive to simplify the implementation and management of Kubernetes at Red Hat with OpenShift, at Rancher and Mirantis, and at Nutanix with Karbon. Even Google has leaned into the same trend with Anthos.

If you're a fan of serverless, a quick search for "Serverless Kubernetes" will show you that serverless on Kubernetes is not only trendy but a whole new universe to explore. Solutions such as Kubeless, Fission, Knative, OpenFaaS, and OpenShift Serverless are currently available on the market.
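To give a feel for how little code the developer ends up owning with these frameworks, here is a sketch of a function in roughly the shape used by OpenFaaS's Python template (a single handle function receiving the request body); other frameworks in this space expose similarly minimal entry points. The echo logic is purely illustrative.

```python
# A sketch of a functions-on-Kubernetes handler: the framework builds the
# container image, deploys it to the cluster, and wires up routing and
# scaling; the developer writes only this function.
import json

def handle(req: str) -> str:
    payload = json.loads(req) if req else {}
    return json.dumps({"echo": payload, "processed": True})
```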

In all cases, these solutions increasingly seek to hide the complexity of the Kubernetes platform, making it invisible to developers and as simple as possible for IT teams to administer.

This leads to a simple conclusion: it is not necessarily worthwhile for companies to invest, especially over the long term, in deep in-house Kubernetes skills. The platform is disappearing behind layers of abstraction, service providers, and IaaS, PaaS, and SaaS solutions, and there will be no shortage of turnkey offerings from specialist cloud and infrastructure players.



Aymen El Amri

Founder, FAUN

@eon01
Founder of FAUN, author, maker, trainer, and polymath software engineer (DevOps, CloudNative, CloudComputing, Python, NLP)