Deploying Containerized Workloads Using Google Cloud Kubernetes Engine
This course introduces Google Kubernetes Engine (GKE), Google Cloud's managed service for running containerized workloads on GCP. It covers cluster creation, volume storage abstractions, and ingress and service objects.
What you'll learn
Running Kubernetes clusters in the cloud involves working with a variety of technologies, including Docker, Kubernetes, and Google Compute Engine (GCE) virtual machine instances, and this can quickly become quite involved.
In this course, Deploying Containerized Workloads Using Google Cloud Kubernetes Engine, you will learn how to deploy and configure clusters of VM instances running your Docker containers on the Google Cloud Platform using Google Kubernetes Engine (GKE).
First, you will learn where GKE fits relative to other GCP compute options such as GCE VMs, App Engine, and Cloud Functions. You will understand the fundamental building blocks of Kubernetes, such as pods, nodes, and node pools, and how these relate to Docker's fundamental building block, the container. Pods, ReplicaSets, and Deployments are core Kubernetes concepts, and you will understand each of these in detail.
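To make these relationships concrete, here is a minimal sketch using kubectl's imperative commands; the deployment name and image below are hypothetical placeholders, not taken from the course.

```
# Create a Deployment running three replicas of a containerized web server.
# "hello-web" and the nginx image are placeholder choices.
kubectl create deployment hello-web --image=nginx:1.25 --replicas=3

# The Deployment creates a ReplicaSet, which in turn creates and manages the Pods.
kubectl get deployments,replicasets,pods
```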
Next, you will discover how to create, manage, and resize clusters, and how to scale workloads using the Horizontal Pod Autoscaler (HPA). You will also learn about StatefulSets and DaemonSets on GKE.
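As a rough sketch of what cluster creation and autoscaling look like on the command line (the cluster, zone, and deployment names here are placeholders; check the flags against your gcloud and kubectl versions):

```
# Create a zonal GKE cluster with three nodes.
gcloud container clusters create demo-cluster \
    --zone us-central1-a \
    --num-nodes 3

# Point kubectl at the new cluster.
gcloud container clusters get-credentials demo-cluster --zone us-central1-a

# Let the HPA scale an existing deployment between 1 and 10 replicas,
# targeting 60% average CPU utilization.
kubectl autoscale deployment hello-web --min=1 --max=10 --cpu-percent=60
```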
Finally, you will explore how to share state using volume abstractions and field user requests using service and ingress objects. You will see how custom Docker images are built and pushed to the Google Container Registry, and learn about an advanced feature, Binary Authorization.
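The image-to-service workflow looks roughly like the sketch below; PROJECT_ID, the image and deployment names, and the ports are illustrative assumptions, not values from the course.

```
# Allow Docker to push to Google Container Registry using your gcloud credentials.
gcloud auth configure-docker

# Build a custom image and push it to the registry.
docker build -t gcr.io/PROJECT_ID/hello-app:v1 .
docker push gcr.io/PROJECT_ID/hello-app:v1

# Deploy the image and expose it behind a load-balanced service.
kubectl create deployment hello-app --image=gcr.io/PROJECT_ID/hello-app:v1
kubectl expose deployment hello-app --type=LoadBalancer --port=80 --target-port=8080
```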
When you’re finished with this course, you will have the skills and knowledge of Google Kubernetes Engine needed to construct scalable clusters running Docker containers on GCP.
Table of contents
- Module Overview 2m
- Prerequisites and Course Outline 3m
- Introducing Containers 5m
- Introducing Kubernetes 7m
- Clusters, Nodes, Node Pools, and Node Images 4m
- Pods 7m
- Kubernetes as an Orchestrator 4m
- Replication and Deployment 4m
- Services 2m
- Volume Abstractions 4m
- Load Balancers 3m
- Ingress 2m
- StatefulSets and DaemonSets 2m
- Horizontal Pod Autoscaler 4m
- Pricing 2m
- Module Overview 1m
- Web Console: Creating a GKE Cluster 7m
- Web Console: Creating Deployments 5m
- Web Console: Exposing a Service 2m
- Behind the Scenes: Firewall Rules and the Load Balancer 3m
- Web Console: Deleting the Service, Deployment, and Cluster 2m
- gcloud: Creating a Regional Cluster 5m
- gcloud: Creating and Updating Zonal Clusters 4m
- Configuring Custom Node Pools and Resizing Clusters 4m
- Using kubectl to Manage Deployments and Expose Services 5m
- Using kubectl to Update Clusters, Delete Services and Deployments 2m
- Configure a Horizontal Pod Autoscaler Using kubectl 4m
- Scaling Clusters Using the Horizontal Pod Autoscaler 3m
- Module Overview 1m
- Creating a Custom Docker Image 3m
- Registering a Custom Image with the Google Container Registry 2m
- Deploying a Custom Image and Exposing a Service 3m
- Creating and Configuring Ingress Objects 3m
- Working with Ingress Objects 3m
- Persistent Volume Claims 4m
- Deploying and Exposing Multiple Services 6m
- Exploring Services and Pods Using kubectl 2m
- Binary Authorization 6m
- Create and Deploy a Custom Test Container 3m
- Enable Binary Authorization and Secure Cluster with a Policy 2m
- Create a Note and Associate It with an Attestor 4m
- Generate and Associate a Public Key with the Attestor 4m
- Deploy Signed Containers 3m