Saturday, March 19, 2022

Install Jenkins on Ubuntu 22.04 using Docker Compose | Setup Jenkins on AWS EC2 Ubuntu instance | How to setup Jenkins in Ubuntu EC2 instance using Docker?

Please follow the steps below to install Jenkins using Docker Compose on an Ubuntu 22.04 instance.

What is Docker Compose?
Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a YAML file to configure your application’s services. Then, with a single command, you create and start all the services from your configuration.
 
The purpose of docker-compose is to do what the docker CLI does, but to issue multiple commands much more quickly. To use docker-compose, you encode the commands you were previously running into a docker-compose.yml file.
 
Run docker-compose up and Compose starts and runs your entire app.
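As an illustration (using the same image, port, and volume mounts as the compose file later in this post), the Jenkins container could also be started by hand with one long docker run command; the compose file simply records those options declaratively so you don't have to retype them:

docker run -d --name jenkins \
  --restart unless-stopped \
  --privileged \
  -u root \
  -p 8080:8080 \
  -v ~/jenkins:/var/jenkins_home \
  -v /var/run/docker.sock:/var/run/docker.sock \
  jenkins/jenkins:lts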

Change Host Name to Jenkins
sudo hostname Jenkins
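Note that the hostname command only changes the name for the current session. If you want the change to survive a reboot, you can use hostnamectl instead:

# Persist the hostname across reboots (optional)
sudo hostnamectl set-hostname Jenkins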

Update the package index first
sudo apt update

Now let's start with the Docker Compose installation:

Install Docker-Compose
sudo apt-get install docker-compose -y

Add current user to docker group
sudo usermod -aG docker $USER
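The group change only takes effect on your next login. To pick it up in the current shell (or simply log out and back in) and verify that Docker works without sudo, you can run:

# Start a new shell with the docker group applied
newgrp docker
# Verify docker commands work without sudo
docker ps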

Create a directory for the Jenkins home volume
mkdir ~/jenkins

Jenkins Setup

Create docker-compose.yml
This YAML file has all the configuration for installing Jenkins.
sudo vi docker-compose.yml 

version: '3.3'
services:
  jenkins:
    image: jenkins/jenkins:lts
    restart: unless-stopped
    privileged: true
    user: root
    ports:
      - 8080:8080
    container_name: jenkins
    volumes:
      - ~/jenkins:/var/jenkins_home
      - /var/run/docker.sock:/var/run/docker.sock
      - /usr/local/bin/docker:/usr/local/bin/docker

Now execute the compose file using the docker-compose command:
sudo docker-compose up -d 
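You can quickly check that the container was created and is running with docker-compose as well:

# Show the status of the services defined in docker-compose.yml
sudo docker-compose ps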



Make sure Jenkins is up and running
sudo docker-compose logs --follow
You can also find the initial admin password in these logs.


How to get the Jenkins admin password in another way?
Identify the Docker container name:

sudo docker ps


Get the admin password by executing the command below:
sudo docker exec -it jenkins cat /var/jenkins_home/secrets/initialAdminPassword


Access Jenkins in a web browser

Now go to the AWS console. Click on EC2, then click on the Running Instances link. Select the checkbox of the EC2 instance on which you installed Jenkins, click Actions --> Connect, and copy the value from step 4 that says "Connect to your instance using its Public DNS".

Now go to the browser and enter the public DNS name or the public IP address, followed by port 8080.
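If the page does not load, make sure port 8080 is allowed in the instance's security group. You can open it from the EC2 console (Security Groups > Inbound rules) or, roughly, with the AWS CLI as sketched below (the security group ID here is just a placeholder for your own):

# Allow inbound TCP 8080; replace sg-0123456789abcdef0 with your security group ID
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 8080 \
  --cidr 0.0.0.0/0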

Unlock Jenkins
You should see the Unlock Jenkins screen. Run the initialAdminPassword command shown above in your Git Bash (Ubuntu console) session.

Copy the password and paste it into the browser.
Then click on Install suggested plugins.
Next, create the first admin user; for a quick demo you can simply use admin for both the username and the password.
Click on Save and Finish, then click on Start using Jenkins. Now you should see a screen like below:



That's it. You have successfully set up Jenkins using Docker Compose.
Please watch the steps on our YouTube channel.

Thursday, March 3, 2022

How to store Terraform state file in Azure Storage | How to manage Terraform state in Azure Blob Storage | Terraform Remote state in Azure Blob storage | Terraform backend

One of the amazing features of Terraform is that it tracks the infrastructure you provision. It does this by means of state. By default, Terraform stores state information locally in a file named terraform.tfstate. This does not work well in a team environment: if any developer wants to make a change, they need to make sure nobody else is updating Terraform at the same time. You need to use remote storage to store the state file.

With remote state, Terraform writes the state data to a remote data store, which can then be shared between all members of a team. Terraform supports storing state in many backends, including the following:

  • Terraform Cloud
  • HashiCorp Consul
  • Amazon S3
  • Azure Blob Storage
  • Google Cloud Storage
  • Alibaba Cloud OSS
  • Artifactory or Nexus 

We will learn how to store the state file in Azure Blob Storage. We will be creating an Azure storage account and a container.

Watch the steps on our YouTube channel:

Pre-requisites:
Azure CLI and Terraform installed, and an active Azure subscription.

Steps:

Logging into the Azure Cloud

Log in to Azure using the Azure CLI:

az login

Enter your Microsoft username and password to log in to Azure.
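You can confirm which subscription you are logged in to with:

# Show the currently active Azure account/subscription
az account show --output table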

Create main.tf

terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "=2.63.0"
    }
  }
}

provider "azurerm" {
  features {}
}

resource "azurerm_resource_group" "demo-rg" {
  name     = "demo-resource-group"
  location = "eastus"
}



terraform init 
terraform plan 
This will show that one resource will be created.


terraform apply 
This will create a local Terraform state file (terraform.tfstate) on your machine.
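You can confirm what Terraform is tracking in that local state with:

# List the resources recorded in the state
terraform state list
# Or inspect the full state in a human-readable form
terraform show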

How to store Terraform state file remotely?

Step # 1 - Configure Azure storage account

Before you use Azure Storage as a backend, you must create a storage account. We will create it using a shell script:

#!/bin/bash
RESOURCE_GROUP_NAME=tfstate
STORAGE_ACCOUNT_NAME=tfstate$RANDOM
CONTAINER_NAME=tfstate
# Create resource group
az group create --name $RESOURCE_GROUP_NAME --location eastus
# Create storage account
az storage account create --resource-group $RESOURCE_GROUP_NAME --name $STORAGE_ACCOUNT_NAME --sku Standard_LRS --encryption-services blob
# Create blob container
az storage container create --name $CONTAINER_NAME --account-name $STORAGE_ACCOUNT_NAME


This should have created the resource group, storage account, and container, which you can see in the Azure portal.
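You can also verify from the CLI, for example (replace $STORAGE_ACCOUNT_NAME with the generated account name if the variable is no longer set in your shell):

# List the storage accounts in the tfstate resource group
az storage account list --resource-group tfstate --output table
# List the containers in the new storage account
az storage container list --account-name $STORAGE_ACCOUNT_NAME --output table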


Step # 2 - Configure terraform backend state 

To configure the backend state, you need the following Azure storage information which we created above:

  • resource_group_name: The name of the resource group under which all resources will be created.
  • storage_account_name: The name of the Azure Storage account.
  • container_name: The name of the blob container.
  • key: The name of the state store file to be created.

Create backend.tf file
We need to create a backend file.

terraform {
  backend "azurerm" {
    resource_group_name  = "tfstate"
    storage_account_name = "<storage_acct_name>"
    container_name       = "tfstate"
    key                  = "terraform.tfstate"
  }
}

terraform init --reconfigure

Type yes when Terraform prompts you.

This should have created a state file called terraform.tfstate in the container inside the Azure storage account.

You can view the remote state file info in the Azure portal:
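You can also list the blob from the CLI. Use the storage account name you created earlier; depending on your setup you may need to add --auth-mode login or an account key:

# List the blobs in the tfstate container; terraform.tfstate should appear
az storage blob list --container-name tfstate --account-name <storage_acct_name> --output table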


This is how you can store Terraform state information remotely in Azure Storage.

Now let's make changes to main.tf to create more resources.

Edit main.tf and add the following:

    resource "azurerm_container_registry" "acr" {
      name                = "myacr563123"
      resource_group_name = azurerm_resource_group.demo-rg.name
      location            = azurerm_resource_group.demo-rg.location
      sku                 = "Standard"
      admin_enabled       = false
    }

terraform plan

terraform apply --auto-approve

This will update the Terraform state file remotely in the Azure blob container.
