Sunday, December 26, 2021

Azure DevOps Terraform Integration | How do you integrate Terraform with Azure DevOps | Automate Infrastructure setup using Terraform and Azure DevOps | Remote Store in S3 Bucket

We will walk through an interesting use case: provisioning resources in the AWS cloud using Terraform and Azure DevOps. We will also learn how to store the Terraform state remotely in an AWS S3 bucket.

We will create an S3 bucket for storing the Terraform state and a DynamoDB table for providing state-lock capability.

We will create an EC2 instance and an S3 bucket in the AWS cloud using Terraform and Azure DevOps. Look at the diagram that describes the whole flow.
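To give an idea of what the Terraform code itself looks like, here is a minimal sketch of those two resources (the region, AMI ID, and bucket name are illustrative placeholders, not the exact values from my repo):

```hcl
# main.tf - illustrative sketch; replace region, AMI ID, and bucket name
provider "aws" {
  region = "us-east-1"
}

# EC2 instance that Terraform will create
resource "aws_instance" "demo" {
  ami           = "ami-0abcdef1234567890" # use a valid AMI ID for your region
  instance_type = "t2.micro"

  tags = {
    Name = "terraform-demo"
  }
}

# Application S3 bucket (separate from the bucket that stores the state)
resource "aws_s3_bucket" "demo" {
  bucket = "my-terraform-demo-bucket-12345" # bucket names must be globally unique
}
```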


Watch the steps on the YouTube channel:
    Pre-requisites:
    • Azure DevOps organization
    • Add the Terraform tasks extension to Azure Pipelines
    • Create an AWS service connection in Azure DevOps for Terraform to use
    • Create a service connection for connecting to GitHub
    • Create an S3 bucket for storing the Terraform state
    • Create a DynamoDB table for providing lock capability
    • I have provided my public repo as an example, which you can use.
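    For the remote state to work, the Terraform code in the repo needs an S3 backend block along these lines (the bucket, key, table, and region values are placeholders; use the names you create in Steps 1 and 2):

    ```hcl
    # backend.tf - placeholder values; match your own bucket, table, and region
    terraform {
      backend "s3" {
        bucket         = "my-terraform-state-bucket" # S3 bucket from Step 1
        key            = "dev/terraform.tfstate"     # path of the state file inside the bucket
        region         = "us-east-1"
        dynamodb_table = "terraform-lock-table"      # DynamoDB lock table from Step 2
        encrypt        = true
      }
    }
    ```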

    Step # 1 - Create S3 Bucket:
    Log in to AWS and go to S3. Click on Create bucket.

    Give the bucket a globally unique name.

    Block all public access and enable bucket versioning as well.

    Enable encryption.


    Step # 2 - Create DynamoDB Table
    Create a new table with LockID as the partition key.
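    If you prefer to bootstrap these two resources with Terraform itself rather than the console, a minimal sketch could look like this (run it with a local backend, since the remote backend does not exist yet; all names are placeholders):

    ```hcl
    # bootstrap.tf - creates the state bucket and the lock table
    resource "aws_s3_bucket" "tf_state" {
      bucket = "my-terraform-state-bucket" # must be globally unique
    }

    resource "aws_s3_bucket_versioning" "tf_state" {
      bucket = aws_s3_bucket.tf_state.id
      versioning_configuration {
        status = "Enabled"
      }
    }

    resource "aws_dynamodb_table" "tf_lock" {
      name         = "terraform-lock-table"
      billing_mode = "PAY_PER_REQUEST"
      hash_key     = "LockID" # the key name Terraform's S3 backend expects

      attribute {
        name = "LockID"
        type = "S"
      }
    }
    ```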



    Step # 3 - Create Service connection to connect to AWS from Azure DevOps
    Go to Azure DevOps, select your project, and open Project Settings.

    Click Service Connections




    Select AWS for Terraform as the connection type.
    Enter the Access Key, Secret Key, and region code, enter a name for the service connection, and check Grant access permission to all pipelines.


    Click Save.

    Create a service connection for connecting to GitHub.

    Save

    Step # 4 - Create a new Release Pipeline
    Click on Releases, then New, and choose New release pipeline.


    Select empty job. 

    Click on Add Artifacts


    Choose GitHub, select the GitHub service connection, and select the repo.


    Click on Add tasks, type terraform, and choose the task.


    Add a task, search for "install terraform", and select the Terraform installer task.

    It should show something like this:
    Add another Terraform task, this time for init.

    Search for terraform and add the task.

    Select AWS from the provider drop-down, choose init as the command, and add -reconfigure as an additional command argument. Select the AWS service connection and enter the bucket name.


    Add another task for plan and select the right values.
    Enter -out dev-plan as an additional argument so the plan is written to a file named dev-plan.

    Add another task, by cloning the plan task, for apply.
    Enter dev-plan as the additional argument so that apply uses the saved plan.


    Click on Save.
    Create a release; the job should now be running.





    Log in to AWS --> S3 bucket; you should see the Terraform state info.


    How to destroy all the resources created using Terraform?

    Clone the current infra-setup release pipeline.

    Modify the pipeline name.

    Modify the apply task as shown in the diagram:
    enter -destroy as an additional argument.


    Click on Create Release and verify that all the resources are destroyed.


    Thursday, December 23, 2021

    Jenkins Terraform Integration | How do you integrate Terraform with Jenkins | Automate Infrastructure setup using Terraform and Jenkins | Remote Store in S3 Bucket

    We will learn how to provision resources in the AWS cloud using Terraform and Jenkins. We will also learn how to store the Terraform state remotely in an AWS S3 bucket.

    We will create an S3 bucket for storing the Terraform state and a DynamoDB table for locking capability.

    We will create an EC2 instance and an S3 bucket in the AWS cloud using Terraform and Jenkins. Look at the diagram that describes the whole flow.

    Watch these steps in action on the YouTube channel:



    Pre-requisites:
    • Create an S3 bucket for storing the Terraform state
    • Create a DynamoDB table for providing lock capability
    • Jenkins is up and running
    • Terraform is installed on the Jenkins server
    • Terraform files are already created in your SCM
    • Make sure you have the necessary IAM role created with the right policies and attached to the Jenkins EC2 instance. See below for the steps to create the IAM role.
    I have provided my public repo as an example, which you can use. You can fork my repo and start making changes in your copy. How do you fork my repo?


    Click on Fork and create a new fork



    Step # 1 - Create S3 Bucket for storing Terraform state info
    Log in to AWS and go to S3. Click on Create bucket.

    Give the bucket a globally unique name.

    Block all public access and enable bucket versioning as well.

    Enable encryption.


    Step # 2 - Create DynamoDB Table
    Create a new table with LockID as the partition key.



    Step # 3 - Create IAM role to provision EC2 instances in AWS



    Select AWS service and EC2, then click Next: Permissions.


    Type EC2 and choose AmazonEC2FullAccess as the policy, type S3 and add AmazonS3FullAccess, then type Dynamo and add AmazonDynamoDBFullAccess.

    Attach the three policies.

     

    Click Next: Tags, then Next: Review.
    Give the role a name and click on Create role.
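    For reference, the same role can also be expressed in Terraform. Here is a sketch using the three managed policies above and the role name from this tutorial (illustrative; the console steps achieve the same thing):

    ```hcl
    # Illustrative Terraform equivalent of the console steps above
    resource "aws_iam_role" "jenkins_terraform" {
      name = "my-ec2-terraform-role"

      # Trust policy: allow EC2 instances to assume this role
      assume_role_policy = jsonencode({
        Version = "2012-10-17"
        Statement = [{
          Effect    = "Allow"
          Principal = { Service = "ec2.amazonaws.com" }
          Action    = "sts:AssumeRole"
        }]
      })
    }

    resource "aws_iam_role_policy_attachment" "ec2" {
      role       = aws_iam_role.jenkins_terraform.name
      policy_arn = "arn:aws:iam::aws:policy/AmazonEC2FullAccess"
    }

    resource "aws_iam_role_policy_attachment" "s3" {
      role       = aws_iam_role.jenkins_terraform.name
      policy_arn = "arn:aws:iam::aws:policy/AmazonS3FullAccess"
    }

    resource "aws_iam_role_policy_attachment" "dynamodb" {
      role       = aws_iam_role.jenkins_terraform.name
      policy_arn = "arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess"
    }
    ```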



    Step # 4 - Assign IAM role to EC2 instance

    Go back to the Jenkins EC2 instance: click on the instance, then Security, then Modify IAM role.

    Type your IAM role name, my-ec2-terraform-role, and click Save to attach that role to the EC2 instance.




    Step # 5 - Create a new Jenkins Pipeline

    Give a name to the pipeline you are creating.



    Step # 6 - Add parameters to the pipeline

    Check the box This project is parameterized and choose Choice Parameter.

    Enter the name as action.
    Type apply and destroy as the choices, one per line, as shown below.


    Go to the Pipeline section.

    Add the pipeline code below and modify it per your GitHub repo configuration.

    pipeline {
        agent any

        stages {
            stage('Checkout') {
                steps {
                    // Pull the Terraform files from the repo configured in the job
                    checkout scm
                }
            }

            stage('terraform init') {
                steps {
                    sh 'terraform init -reconfigure'
                }
            }

            stage('terraform plan') {
                steps {
                    sh 'terraform plan'
                }
            }

            stage('terraform action') {
                steps {
                    echo "Terraform action is --> ${params.action}"
                    // action is the choice parameter: apply or destroy
                    sh "terraform ${params.action} --auto-approve"
                }
            }
        }
    }
    Click on Build with Parameters and choose apply to build the infrastructure, or choose destroy if you would like to destroy the infrastructure you have built.



    Click on Build With Parameters and choose apply from the dropdown.
    Now you should see the console output for the apply run.



    Pipeline will look like below:


    Log in to the AWS console.

    Go to the S3 bucket; you should see that the Terraform state info has also been added.


    How to destroy all the resources created using Terraform?

    Run the Jenkins pipeline with the destroy option.
