There was a time when containers weren’t considered an industry-changing effort, but over time, container platforms like Kubernetes have become the new operating system, with cloud-native tech giants Google, AWS and Microsoft putting their might behind them.
In the computing world, containers are defined as a form of operating-system virtualization that enables developers to run an application and its dependencies in resource-isolated processes. As the AWS technical documentation sums it up, containers package an application’s code, configurations, and dependencies into easy-to-use building blocks that deliver environmental consistency, operational efficiency, developer productivity, and version control.
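As a minimal illustration of that packaging idea, a hypothetical Dockerfile bundles an application’s code, configuration and dependencies into one image (the file names and base image below are assumptions for the sketch, not from any specific project):

```dockerfile
# Base image supplies the operating-system layer
FROM python:3.11-slim

# Copy the application's code and configuration into the image
WORKDIR /app
COPY requirements.txt config.yaml app.py ./

# Install dependencies inside the image, isolated from the host
RUN pip install --no-cache-dir -r requirements.txt

# The same command runs identically on any host with a container runtime
CMD ["python", "app.py"]
```

Because everything the application needs travels inside the image, the resulting container behaves the same on a laptop, an on-premises server or any cloud-native platform.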
Seema Kumar, Country Leader, Developer Ecosystem & Startups, IBM India/South Asia, in her address at Cypher 2018, defined containers as self-contained units in the cloud world, bundled with the required software, that work like an operating system. Developers can run an application within that unit, making it easy to plug and play that application from one environment to another, ruling out dependency on the underlying hardware; such a unit can run on any cloud-native platform.
How tech giants are backing containers — pegged as the future of IaaS & deep learning jobs
Containers are important because they give developers more control over resources. Running containers in the cloud, on platforms like Azure or AWS, lets developers build robust, scalable applications while paying only for the resources used (memory, disk space or CPU). Containers also help developers get the most out of computing resources by enabling them to run multiple applications on the same instance. By embracing container platforms, cloud-native providers are helping enterprise customers move their workloads into the cloud and embrace a multi-cloud hybrid architecture.
Today, every cloud provider is stepping up the game around containers, which has become the dominant way of packaging applications and delivering a cloud-native management experience.
Containers are believed to be the start of something bigger, with tech heavyweights like Microsoft and Google selling and running enterprise software as pre-packaged Kubernetes applications, and this is backed by a number of reasons. Kumar spoke about the importance of containers in the information architecture and how a unified architecture can enable AI applications.
In her address at Cypher 2018, she outlined why the popular container standards – Kubernetes and Docker – can pave the way for a unified and simplified platform for: a) collecting data (database on demand, federated query, fast data ingest); b) organizing data (data integration, data curation, governance & data asset lifecycle management); and c) analysing data (data visualization & exploration, machine learning & deep learning, model management & deployment).
In her talk, Kumar emphasised that since data comes from multiple sources and databases, containers are a primary use case. As more and more enterprises move towards AI and cloud, not all of them want to jump onto the public cloud to build cloud-native applications. Citing an example, Kumar shared that there are cases where enterprises are building 10 applications and do not want to put them on an externally managed third-party cloud. Building on a cloud-native containerized architecture instead allows enterprises to work in a multi-cloud or hybrid environment. Containers and microservices have become the basic foundation for building an information architecture – a cloud-native platform built on microservices.
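To make the microservices foundation concrete, a hypothetical Kubernetes Deployment manifest shows how one such containerized service is declared; the service name, image and resource figures here are illustrative assumptions:

```yaml
# Hypothetical manifest: declares a microservice as a set of container replicas
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ingest-service          # assumed service name, for illustration
spec:
  replicas: 3                   # Kubernetes keeps three copies running
  selector:
    matchLabels:
      app: ingest-service
  template:
    metadata:
      labels:
        app: ingest-service
    spec:
      containers:
      - name: ingest
        image: registry.example.com/ingest:1.0   # assumed image reference
        resources:
          requests:
            cpu: "250m"         # explicit CPU and memory requests support
            memory: "256Mi"     # pay-for-what-you-use resource accounting
```

The same manifest can be applied to any conformant Kubernetes cluster, whether on a public cloud or on-premises, which is what makes the multi-cloud and hybrid story possible.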
Some of the key reasons for adopting containers are highlighted below:
1) Enterprises want to manage their third-party applications the same way they manage the applications they build themselves.
2) Building and packaging this way makes it easier for vendors to ship, control and continuously deliver their applications.
3) Many enterprise buyers demand control of managing their own application infrastructure.
Google announced Asylo – an open source framework for confidential computing
To add more firepower to containers and allow enterprises and developers to own future workloads, Google announced Asylo, an open source framework for developing containerized applications that run in hardware-based trusted execution environments, along with a point release of Kubeflow, an open source project for running TensorFlow deep learning jobs on top of Kubernetes. Meanwhile, Microsoft too is rallying behind Kubernetes by building an open source, controllable, composable platform for artificial intelligence and even serverless computing.
Today, every large software vendor is making money off container platforms by building out Kubernetes product lines and businesses, because there is money to be made in the microservices and developer-tool space even if the enterprise is not a cloud-native thought leader. Key companies driving container platforms, and earning a chunk of their revenue from them, include Google, Microsoft, Red Hat and IBM.
But Google has gained considerable leadership in the Kubernetes space by drawing developers to the container platform and encouraging them to embrace and develop other open source technologies in the software-development arena. It is a longer path to the top: Google attracts developers interested in things like open source, portability and composability, and eats away at AWS’s market share from the bottom up.
Serverless vs Containers
In many cases, serverless is still pegged as the preferred method and architecture for developers, but IT operations and engineering teams will decide how serverless is consumed. For enterprises that prefer a ready-made service, AWS Lambda is the definitive market leader in this space. However, where businesses want some level of control at the application level, a serverless platform built atop Kubernetes is going to be the preferred option.
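As a rough sketch of the ready-made-service model, an AWS Lambda handler in Python is just a function that receives an event and a context and returns a response; the platform, not the developer, manages the underlying containers. The event shape and response fields below follow the common API-style pattern, but the field names are assumptions for illustration:

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda-style handler: the platform supplies the event
    payload and invocation context; the function returns a response dict."""
    name = event.get("name", "world")  # "name" is an assumed event field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }

if __name__ == "__main__":
    # Locally, the handler can be invoked like any ordinary Python function
    print(lambda_handler({"name": "Cypher"}, None))
```

Nothing in the function concerns servers or scaling; that operational surface is exactly what a Kubernetes-based serverless platform exposes back to teams that want more control.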