Google Unleashes Container Engine For Docker Workloads - InformationWeek



Google Unleashes Container Engine For Docker Workloads

The Google Compute Cloud has gained a service for launching Docker containers and managing their lifecycles.


Google's strength in running large groups of containers will give its Compute Engine Infrastructure-as-a-Service (IaaS) offering added appeal to developers now that the company has taken its Google Container Engine out of its alpha phase and made it generally available.

Google Container Engine, which goes by the acronym GKE to avoid confusion with Google Compute Engine (GCE), orchestrates the launch and management of Docker containers on a cluster of Google Compute Engine virtual machines.

It's broadly similar to the Amazon EC2 Container Service (announced last November at Amazon Web Services' re:Invent conference), which orchestrates the launch of Docker containers on EC2.

Either can build and deploy containers to a cluster and monitor their lifecycle there.

However, Google has built its Kubernetes orchestration and management system into its container engine, and with it the concept of pods. Containers in a pod are deployed together on the same host, which enables them to share data and network resources.
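As a sketch of the pod idea (the names, images, and command here are hypothetical, not from Google's announcement), a Kubernetes pod manifest of the era might co-locate two containers that share a volume:

```yaml
# Hypothetical pod: a web server and a sidecar that share an emptyDir volume.
apiVersion: v1
kind: Pod
metadata:
  name: web-with-sidecar
  labels:
    app: web
spec:
  containers:
  - name: webserver
    image: nginx:1.9               # illustrative image and tag
    volumeMounts:
    - name: shared-data
      mountPath: /usr/share/nginx/html
  - name: content-refresher
    image: busybox                 # illustrative sidecar
    command: ["sh", "-c", "while true; do date > /data/index.html; sleep 60; done"]
    volumeMounts:
    - name: shared-data
      mountPath: /data
  volumes:
  - name: shared-data
    emptyDir: {}
```

Because both containers land on the same host, the sidecar's writes to the shared volume are immediately visible to the web server.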

Google launches 2 billion containers a week and has deep experience in managing homogeneous containers as a set, as opposed to the simpler task of distributing thousands of dissimilar containers across different servers and clusters.

(Image: 4x6/iStockphoto)


Pods, when deployed correctly, can help scale out microservices by allowing a set of containers to access a shared caching system or a shared pool of persistent storage for speed of operation. The efficient use of containerized services leads to the rapid building and fast execution of microservice applications, sometimes referred to as next-generation or "cloud native" applications.

"Everything at Google, from Search to Gmail, is packaged and run in a Linux container ... Container Engine represents the best of our experience," said Craig McLuckie, senior product manager for Google Compute Engine, in a blog announcing the change Wednesday, Aug. 26.

Google didn't start out using Docker containers, having come up with its own approach to Linux containers 10 years before Docker became popular. But McLuckie has previously been clear that Docker's format represents a de facto standard for the rest of the industry, and that Google will standardize on how it works rather than try to convert the world to its own approach.

At the Linux Collaboration Summit last February, McLuckie said, "Docker captured lightning in a bottle."

Google Container Engine also includes Replication Controllers, which manage the lifecycle of pods and ensure there are enough pods and containers to deliver a given application service. It includes Services, load balancers that abstract a set of related pods and route traffic to the right ones in the set; and Labels, identifiers that Kubernetes uses to select groups of like pods for a common task.
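These pieces map onto Kubernetes API objects. A sketch (all names, images, and counts hypothetical) of a Replication Controller that keeps three label-selected replicas running, paired with a Service that load-balances across them:

```yaml
# Hypothetical ReplicationController: maintains three identical pods,
# selected by the app=web label.
apiVersion: v1
kind: ReplicationController
metadata:
  name: web-rc
spec:
  replicas: 3
  selector:
    app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
      - name: webserver
        image: nginx:1.9           # illustrative image and tag
        ports:
        - containerPort: 80
---
# Hypothetical Service: routes traffic to whichever pods carry app=web.
apiVersion: v1
kind: Service
metadata:
  name: web-svc
spec:
  selector:
    app: web
  ports:
  - port: 80
    targetPort: 80
```

The label (`app: web`) is the glue: the controller uses it to count and replace pods, and the Service uses it to find the pods behind the load balancer.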

"Many applications take advantage of multiple containers; for example, a Web application might have separate containers for the webserver, cache, and database. Container Engine ... makes it easy for your containers to work together as a single system," McLuckie wrote in his blog.

Google reliability engineers manage Container Engine, providing infrastructure updates and continuous availability. A user defines the amount of CPU and memory to reserve, the number of replicas, networking, and the keep-alive policy, and the engine does the rest.
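The CPU and memory reservation step translates into a resources stanza on each container spec. A fragment (values hypothetical) might look like:

```yaml
# Hypothetical container spec reserving CPU and memory for the scheduler.
spec:
  containers:
  - name: webserver
    image: nginx:1.9
    resources:
      requests:
        cpu: 250m          # reserve a quarter of a core
        memory: 256Mi      # reserve 256 MiB
      limits:
        cpu: 500m          # cap at half a core
        memory: 512Mi      # cap at 512 MiB
```

The scheduler uses the requests to pick a node with enough free capacity, and the limits to keep one container from starving its neighbors.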

Container Engine includes a scheduler, which launches commissioned containers into a virtual machine cluster, manages them based on the declared requirements, and kills them off at the end of their lifecycle.

Container Engine institutes server logging in a container cluster and container health monitoring for feedback on how well the application is running. It can also commission additional memory or CPU capacity for a given cluster to help it meet the traffic demand on an application.

[Want to learn about a recent Google cloud mishap? See Google Loses Data: Who Says Lightning Never Strikes Twice?]

Docker containers make it easier to move software between clusters, data centers, or clouds. As Docker has emerged as a de facto standard, its formatting approach has served largely as the model for the appc specification, now feeding into the work of the Open Container Initiative. CoreOS, a supplier of a container-host Linux, drew up that specification.

If CoreOS, Docker, and other container technology suppliers adhere to the spec, it will be a step toward making container operation on clouds more interoperable.

For that matter, a Docker container that is ready to be deployed by the Amazon Container Service today is also theoretically ready for operation on Google Compute Engine. Each container contains the elements needed in its deployment environment along with instructions for operating system services that it needs from the host.

At the OpenStack Silicon Valley gathering of OpenStack implementers in Mountain View, California, Wednesday, Aug. 26, McLuckie went a step further in his description of Docker. "Docker recognized the value of the stackable file system. You can just deploy it and it's great. It's good for the developer experience," he told the assembly at the Computer History Museum.

Docker creates "a really amazing first five hours" for developers as they finish writing code and get it ready for deployment. But McLuckie added, "I'm worried about its operation for the next five years." To improve on that front, Google and other cloud suppliers will keep working on their container management services.

Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld, and former technology editor of Interactive ...

Charlie Babcock,
User Rank: Author
8/30/2015 | 9:45:12 PM
It's a Google-branded service
Tzubair, it's a Google-originated and branded cloud service, not a third-party offering on the Google Cloud Platform. And I think Amazon, Microsoft and Google all have the same 99.95% SLAs.  
User Rank: Ninja
8/30/2015 | 6:16:31 PM
So if an application is hosted Google Container, does the end consumer get to know that Google is behind it? Or, is it only something that's a B2B concept and Google is not interested in having it part of their consumer portfolio?
User Rank: Ninja
8/30/2015 | 6:15:13 PM
Re: Container Management Engine SLA is 99.95%
@Charles: From what I have read, this SLA up-time seems to be the highest and I don't think anyone can beat it. Further, it has Google's strong brand name attached to it which makes it seem even more reliable.
User Rank: Strategist
8/27/2015 | 5:00:36 PM
Container Management Engine SLA is 99.95%
Google Container Engine has a guaranteed availability of 99.95%, like other Google cloud services. Is that worse than, as good as, or better than an on-premises container management system?