How to use GitHub Actions to automate Terraform
Use GitHub Actions to automate Terraform for a quick, easy CI/CD solution. Follow along as we create an AWS S3 website with Terraform and GitHub Actions.
In this blog post, we'll take a look at an example of using GitHub Actions to automate Terraform and give you a quick and easy CI/CD solution.
Most admins would agree that Terraform is a powerful, efficient tool for deploying and managing infrastructure in your environment. But did you know you can make it even better by automating it with GitHub Actions?
GitHub Actions adds continuous integration to GitHub repositories to automate your software builds, tests, and deployments.
Automating Terraform with CI/CD enforces configuration best practices, promotes collaboration, and streamlines the Terraform workflow. In this blog, I'll walk you through creating an AWS S3 website using both Terraform and GitHub Actions.
Prerequisites
First off, you will need these three things set up and configured before you can use GitHub Actions to automate Terraform:
- Terraform CLI downloaded and installed.
- An AWS account with an AWS Access Key and Secret Key for a user granted the AmazonS3FullAccess IAM policy. You will also need to create an S3 bucket to store Terraform state remotely.
- Finally, you will need to set up a GitHub repository with the following structure (sketched below):
- A directory called src, in which you will store your website code.
- A .github directory containing a workflows directory, in which you will store your GitHub Actions configuration files.
- A terraform directory, in which you will store your Terraform configuration files.
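If you're starting from an empty clone, that layout can be scaffolded in a few commands. A minimal sketch, run from the root of your repository:

mkdir -p src                 # website code
mkdir -p .github/workflows   # GitHub Actions configuration files
mkdir -p terraform           # Terraform configuration files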
Looking for a handy list of all the basic commands you need to get the most from Terraform? Check out our Terraform cheat sheet.
Configure your AWS Provider and Remote State
Assuming that you have all the prerequisites configured, the next thing you must do is configure Terraform to reference your AWS account.
Create a file called main.tf in the terraform directory of your GitHub repo with the following configuration:
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 3.27"
    }
  }
  required_version = ">= 0.14.9"
}

provider "aws" {
  region = "us-east-1"
}
This code block tells Terraform that we want to provision AWS resources and that we're defaulting resource creation to the us-east-1 region within AWS. (You can change this to whichever region works for you.)
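Note that the provider block doesn't contain any credentials: when run locally, Terraform picks them up from your environment (for example, from aws configure or the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables). Assuming you have the AWS CLI installed, a quick sanity check that authentication is in place:

aws configure                  # prompts for your Access Key, Secret Key, and default region
aws sts get-caller-identity    # confirms which AWS identity your credentials resolve to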
Next, you will want to add the remote state to the configuration. Add the following code block to the main.tf file right after the required_version value, as shown below, swapping out the placeholder values with your bucket name and bucket key:
  required_version = ">= 0.14.9"

  backend "s3" {
    bucket = "[Remote_State_S3_Bucket_Name]"
    key    = "[Remote_State_S3_Bucket_Key]"
    region = "us-east-1"
  }
}
State is what Terraform uses to compare the current state of your infrastructure against the desired state. By default, Terraform stores state locally.
This is fine for testing, but for production-level or large infrastructures you should store state remotely for safekeeping and collaboration purposes.
Also, since we are using GitHub Actions, the state has to live remotely. Each workflow run starts on a fresh runner, so without a remote backend Terraform would generate a local state file that never gets committed to GitHub, and the state data would be lost between runs. That could cause all sorts of issues with automation, since Terraform relies so heavily on state.
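If you haven't already created the state bucket from the prerequisites, here's a minimal sketch using the AWS CLI (the bucket name is a placeholder; S3 bucket names must be globally unique):

# Create the bucket that will hold remote state
aws s3api create-bucket --bucket my-terraform-state-bucket --region us-east-1

# Optional but recommended: enable versioning so earlier state files are recoverable
aws s3api put-bucket-versioning --bucket my-terraform-state-bucket \
  --versioning-configuration Status=Enabled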
Finishing Up Your Terraform Configuration
Now that your remote backend is in place, you can add the resource block that defines what Terraform should deploy to your infrastructure. Add the following code block to create the S3 resources necessary for your website:
resource "aws_s3_bucket" "s3Bucket" {
bucket = "[BUCKET_NAME_HERE]"
acl = "public-read"
policy = <<EOF
{
"id" : "MakePublic",
"version" : "2012-10-17",
"statement" : [
{
"action" : [
"s3:GetObject"
],
"effect" : "Allow",
"resource" : "arn:aws:s3:::[BUCKET_NAME_HERE]/*",
"principal" : "*"
}
]
}
EOF
website {
index_document = "index.html"
}
}
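Before handing the configuration off to a pipeline, it's worth sanity-checking it locally. A sketch of the usual sequence, run from the terraform directory:

cd terraform
terraform init       # connects to the S3 backend and downloads the AWS provider
terraform validate   # checks the configuration for syntax and internal consistency
terraform plan       # previews the resources Terraform would create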
Setting Up GitHub Actions
With that squared away, we can move on to setting up our GitHub Actions.
You might be asking, "Why should I add my Terraform configuration to GitHub Actions or another CI/CD pipeline?" Here are some reasons:
- Pipelines create more visibility. When managing infrastructure with Terraform on a team, everyone can easily see what is running and when it ran. When running locally, only you have that visibility.
- Pipelines create traceability. Pipelines usually store logs of every run, which allows you to review old builds and their outputs at your convenience.
- Pipelines create repeatability. When you configure a pipeline, it should do the same action every time. This will make debugging and troubleshooting much easier.
- And finally, they create simplicity. A pipeline can essentially take the place of your local instance. This way you don’t have to set up local dependencies and whatnot.
The first thing you'll need to do before your GitHub Actions can run is to add your AWS credentials to the repository. To do this you will need to follow these steps:
- Navigate to your repository and select the Settings tab.
- In the left sidebar, find and click the Secrets section.
- Click on the New repository secret button.
- Add your AWS_SECRET_ACCESS_KEY and click the Add secret button.
- Repeat step 3 and add your AWS_ACCESS_KEY_ID and click the Add secret button.
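Alternatively, if you prefer the command line, the GitHub CLI can set the same secrets. A sketch, assuming gh is installed and authenticated against your repository:

# Each command prompts for the secret value so it never lands in your shell history
gh secret set AWS_ACCESS_KEY_ID
gh secret set AWS_SECRET_ACCESS_KEY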
Next, create the actions.yaml file in the .github/workflows directory with the following code:
name: Deploy Infrastructure

on:
  push:
    branches:
      - master

jobs:
  tf_fmt:
    name: Deploy Site
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Repo
        uses: actions/checkout@v1

      - name: Terraform Init
        uses: hashicorp/terraform-github-actions/init@v0.4.0
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          TF_ACTION_WORKING_DIR: 'terraform'
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}

      - name: Terraform Validate
        uses: hashicorp/terraform-github-actions/validate@v0.3.7

      - name: Terraform Apply
        uses: hashicorp/terraform-github-actions/apply@v0.4.0
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          TF_ACTION_WORKING_DIR: 'terraform'
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}

      - name: Sync S3
        uses: jakejarvis/s3-sync-action@master
        env:
          SOURCE_DIR: './src'
          AWS_REGION: 'us-east-1'
          AWS_S3_BUCKET: '[BUCKET_NAME_HERE]'
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
Here we are declaring that any push to the master branch, including changes to the src directory, kicks off the workflow, which has Terraform deploy your configuration and then syncs the src directory to your bucket so changes to your website go live. You'll need to make sure you update the YAML file with your bucket name. You'll also need to make sure you've created the src directory and added an index.html file to it.
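If you don't have website code yet, a bare-bones placeholder page is enough to verify the pipeline end to end. For example:

# Create a minimal index page so the S3 sync step has something to upload
cat > src/index.html <<'HTML'
<!DOCTYPE html>
<html>
  <head><title>My Terraform Site</title></head>
  <body><h1>Deployed with Terraform and GitHub Actions!</h1></body>
</html>
HTML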
Success!
That’s it! You have successfully created a CI/CD pipeline using Terraform and GitHub Actions. Just think of the possibilities and how much time a process like this can save you, especially as your site or infrastructure grows. Using Terraform and GitHub Actions can really help create a streamlined, easy-to-manage, repeatable process that saves time and headaches. Until next time, gurus!
Learn more about using Terraform to manage applications and infrastructure
Check out my new course, Using Terraform to Manage Applications and Infrastructure, for an exciting journey into the wonderful world of Terraform!
We'll explore how admins can use Terraform to easily deploy infrastructure to a variety of providers. Whether it’s a single, simple configuration or a more complex configuration with multiple providers, this course will demonstrate how simple it is to manage infrastructure from one place.