Thursday, December 23, 2021

Jenkins Terraform Integration | How do you integrate Terraform with Jenkins | Automate Infrastructure setup using Terraform and Jenkins | Remote Store in S3 Bucket

We will learn how to provision resources in the AWS cloud using Terraform and Jenkins. We will also learn how to store the Terraform state info remotely in an AWS S3 bucket.

We will create an S3 bucket for storing the Terraform state info and a DynamoDB table for state locking.

We will create an EC2 instance and an S3 bucket using Terraform and Jenkins in the AWS cloud. Look at the diagram that describes the whole flow.
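For reference, the Terraform files that create these two resources can be as simple as the following sketch (the region, AMI ID, and bucket name are placeholders you should change):

```hcl
provider "aws" {
  region = "us-east-1" # example region
}

# EC2 instance provisioned by the pipeline
resource "aws_instance" "demo" {
  ami           = "ami-0abcdef1234567890" # placeholder AMI ID
  instance_type = "t2.micro"
}

# S3 bucket provisioned by the pipeline
resource "aws_s3_bucket" "demo" {
  bucket = "my-demo-bucket-12345" # example; bucket names must be globally unique
}
```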

Watch these steps in action on the YouTube channel:



Pre-requisites:
  • Create an S3 bucket for storing the Terraform state
  • Create a DynamoDB table for providing the lock capability
  • Jenkins is up and running
  • Terraform is installed on the Jenkins instance
  • Terraform files are already created in your SCM
  • Make sure you have the necessary IAM role created with the right policies and attached to the Jenkins EC2 instance. See below for the steps to create the IAM role.
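The Terraform files in your SCM need to point the backend at the S3 bucket and DynamoDB table; a minimal backend block looks like this sketch (the bucket, key, region, and table names are examples you should adjust to yours):

```hcl
terraform {
  backend "s3" {
    bucket         = "my-terraform-state-demo-12345" # your state bucket
    key            = "terraform.tfstate"             # path of the state file in the bucket
    region         = "us-east-1"
    dynamodb_table = "terraform-lock"                # your lock table
    encrypt        = true
  }
}
```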
I have provided my public repo as an example, which you can use. You can fork my repo and start making changes in your own copy. How do you fork my repo?


Click on Fork and create a new fork



Step # 1 - Create S3 Bucket for storing Terraform state info
Log in to the AWS console and go to S3. Click on Create bucket.

Give the bucket a name; bucket names need to be globally unique.

Block all public access and enable bucket versioning as well.

Enable default encryption.
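The same bucket settings can also be expressed in Terraform; here is a sketch using AWS provider v4-style resources (the bucket name is an example):

```hcl
resource "aws_s3_bucket" "tf_state" {
  bucket = "my-terraform-state-demo-12345" # example; must be globally unique
}

# Keep old state versions so you can recover from mistakes
resource "aws_s3_bucket_versioning" "tf_state" {
  bucket = aws_s3_bucket.tf_state.id
  versioning_configuration {
    status = "Enabled"
  }
}

# Encrypt the state at rest
resource "aws_s3_bucket_server_side_encryption_configuration" "tf_state" {
  bucket = aws_s3_bucket.tf_state.id
  rule {
    apply_server_side_encryption_by_default {
      sse_algorithm = "AES256"
    }
  }
}

# Block all public access to the state bucket
resource "aws_s3_bucket_public_access_block" "tf_state" {
  bucket                  = aws_s3_bucket.tf_state.id
  block_public_acls       = true
  block_public_policy     = true
  ignore_public_acls      = true
  restrict_public_buckets = true
}
```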


Step # 2 - Create DynamoDB Table
Create a new table with LockID as the partition key.
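If you prefer, the same table can be defined in Terraform; a sketch (the table name is an example):

```hcl
resource "aws_dynamodb_table" "terraform_lock" {
  name         = "terraform-lock"    # example name; must match your backend config
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "LockID"            # Terraform expects exactly this key name

  attribute {
    name = "LockID"
    type = "S"
  }
}
```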



Step 3 - Create IAM role to provision EC2 instance in AWS 



Select AWS service as the trusted entity, choose EC2, and click Next: Permissions.


Search for EC2 and choose AmazonEC2FullAccess as the policy, search for S3 and add AmazonS3FullAccess, then search for DynamoDB and add AmazonDynamoDBFullAccess.

Attach these three policies.

 

Click Next: Tags, then Next: Review.
Give the role a name and click on Create role.
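For reference, the same role could be sketched in Terraform (the role name is the one used later in this post; the instance profile is what actually gets attached to the EC2 instance):

```hcl
resource "aws_iam_role" "jenkins" {
  name = "my-ec2-terraform-role"

  # Allow EC2 instances to assume this role
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { Service = "ec2.amazonaws.com" }
      Action    = "sts:AssumeRole"
    }]
  })
}

# One attachment per AWS managed policy
resource "aws_iam_role_policy_attachment" "jenkins" {
  for_each = toset([
    "arn:aws:iam::aws:policy/AmazonEC2FullAccess",
    "arn:aws:iam::aws:policy/AmazonS3FullAccess",
    "arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess",
  ])
  role       = aws_iam_role.jenkins.name
  policy_arn = each.value
}

# The role is attached to the EC2 instance via an instance profile
resource "aws_iam_instance_profile" "jenkins" {
  name = "my-ec2-terraform-role"
  role = aws_iam_role.jenkins.name
}
```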



Step 4 - Assign IAM role to EC2 instance

Go to the EC2 console, select the Jenkins EC2 instance, and choose Actions > Security > Modify IAM role.


Select your IAM role (my-ec2-terraform-role in this example) and click Save to attach the role to the EC2 instance.




Step 5 - Create a new Jenkins Pipeline

Give a name to the pipeline you are creating.



Step 6 - Add parameters to the pipeline

Check the box "This project is parameterized" and choose Choice Parameter.


Enter the name as action, and enter apply and destroy as the choices, each on its own line, as shown below.


Go to Pipeline section

Add the pipeline code below and modify it per your GitHub repo configuration.

pipeline {
    agent any

    stages {
        stage('Checkout') {
            steps {
                // Checks out the repo configured for this job; if you paste
                // this script inline instead of using "Pipeline script from
                // SCM", replace this with a git step pointing at your repo
                checkout scm
            }
        }

        stage('Terraform Init') {
            steps {
                sh 'terraform init -reconfigure'
            }
        }

        stage('Terraform Plan') {
            steps {
                sh 'terraform plan'
            }
        }

        stage('Terraform Action') {
            steps {
                echo "Terraform action is --> ${action}"
                // Single quotes are intentional here: the shell expands
                // ${action} from the build parameter's environment variable
                sh 'terraform ${action} --auto-approve'
            }
        }
    }
}
Click on Build with Parameters and choose apply to build the infrastructure, or choose destroy if you would like to tear down the infrastructure you have built. 



Click on Build With Parameters and choose apply from the dropdown. You should then see the console output for the apply run.



The pipeline will look like below:


Log in to the AWS console.


Go to the S3 bucket; you should see that the Terraform state info has been added as well.


How to Destroy all the resources created using Terraform?

Run the Jenkins pipeline with the destroy option.

