Thursday, February 25, 2021

Automate Docker builds using Jenkins Pipelines | Dockerize Python App | Upload Images into AWS ECR

We will learn how to automate Docker builds using Jenkins. We will use a Python-based application for which I have already created a repo with the source code and a Dockerfile. We will see how to build the Docker image and upload it into AWS ECR. We will not be using AWS access keys to upload the image into ECR; instead, we will create an IAM role and attach it to the Jenkins instance so it can access ECR.


- Automating builds
- Automating Docker image builds
- Automating Docker image upload into AWS ECR
- Automating Docker container provisioning
 
Watch the steps here on our YouTube channel:
 
Pre-requisites:
1. Jenkins is up and running
2. Docker is installed on the Jenkins instance. Click here for steps to integrate Docker and Jenkins
3. Docker and Docker Pipeline plug-ins are installed
4. A repo is created in ECR. Click here to know how to do that.
5. Port 8096 is opened up in the firewall rules.
6. An IAM role with the AmazonEC2ContainerRegistryFullAccess policy is created and attached to the Jenkins EC2 instance (see the CLI sketch below)
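If you prefer the AWS CLI to the console for pre-requisites 4 and 6, the sketch below creates the ECR repo, creates the IAM role and instance profile, and attaches the profile to the Jenkins instance. This is only a minimal sketch; the role, profile, and repo names and the instance ID are placeholders, so replace them with your own values.

# create the ECR repo (pre-requisite 4)
aws ecr create-repository --repository-name your_ecr_repo --region us-east-2

# create an IAM role that EC2 instances can assume (role/profile names are hypothetical)
cat > trust.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "Service": "ec2.amazonaws.com" },
    "Action": "sts:AssumeRole"
  }]
}
EOF
aws iam create-role --role-name JenkinsECRRole --assume-role-policy-document file://trust.json
aws iam attach-role-policy --role-name JenkinsECRRole --policy-arn arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryFullAccess

# wrap the role in an instance profile and attach it to the Jenkins EC2 instance
aws iam create-instance-profile --instance-profile-name JenkinsECRProfile
aws iam add-role-to-instance-profile --instance-profile-name JenkinsECRProfile --role-name JenkinsECRRole
aws ec2 associate-iam-instance-profile --instance-id i-0123456789abcdef0 --iam-instance-profile Name=JenkinsECRProfile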

Step # 1 - Create a pipeline in Jenkins, name can be anything

Step # 2 - Copy the pipeline code from below
Make sure you change the placeholder values below:
Replace acct_id with your AWS account ID and your_ecr_repo with your ECR repo name.

pipeline {
    agent any
    environment {
        registry = "acct_id.dkr.ecr.us-east-2.amazonaws.com/your_ecr_repo"
    }
   
    stages {
        stage('Cloning Git') {
            steps {
                checkout([$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [], submoduleCfg: [], userRemoteConfigs: [[credentialsId: '', url: 'https://github.com/akannan1087/myPythonDockerRepo']]])     
            }
        }
  
    // Building Docker images
    stage('Building image') {
      steps{
        script {
          dockerImage = docker.build registry
        }
      }
    }
   
    // Uploading Docker images into AWS ECR
    stage('Pushing to ECR') {
      steps {
        script {
          sh 'aws ecr get-login-password --region us-east-2 | docker login --username AWS --password-stdin acct_id.dkr.ecr.us-east-2.amazonaws.com'
          sh 'docker push acct_id.dkr.ecr.us-east-2.amazonaws.com/your_ecr_repo:latest'
        }
      }
    }
   
         // Stopping Docker containers for cleaner Docker run
     stage('stop previous containers') {
         steps {
            sh 'docker ps -f name=mypythonContainer -q | xargs --no-run-if-empty docker container stop'
            sh 'docker container ls -a -f name=mypythonContainer -q | xargs -r docker container rm'
         }
       }
      
    stage('Docker Run') {
      steps {
        script {
          sh 'docker run -d -p 8096:5000 --rm --name mypythonContainer acct_id.dkr.ecr.us-east-2.amazonaws.com/your_ecr_repo:latest'
        }
      }
    }
    }
}

Step # 3 - Build the pipeline
Once you create the pipeline and change the values per your ECR account ID and repo, click on Build now.
Step # 4 - Check that the Docker image is uploaded into ECR
Login to ECR, click on your repo, and you should see the image that got uploaded.
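You can also verify from the command line on the Jenkins instance; this is an optional check, assuming the AWS CLI is installed and the IAM role is attached:

aws ecr list-images --repository-name your_ecr_repo --region us-east-2
# the output should include an image with imageTag "latest"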



Step # 5 - Access the Python app running inside the Docker container in the browser
Once the build is successful, go to the browser and enter http://public_dns_name:8096
You should see a page like below:
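In addition to the browser check, you can confirm from the Jenkins instance itself that the container is serving traffic; a quick sanity check, assuming curl is available:

# the pipeline maps host port 8096 to the app's port 5000 inside the container
curl -s http://localhost:8096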



Monday, February 22, 2021

Automate Docker builds using Jenkins Pipelines | Dockerize PHP App | Upload Docker Image from Jenkins into Nexus Docker Registry

We will learn how to automate Docker builds using Jenkins Pipelines. We will use a PHP-based application for which I have already created a repo with the source code and a Dockerfile. We will see how to build the Docker image and upload it into the Nexus Docker registry.

- Automating builds
- Automating Docker image builds
- Automating Docker image upload into Nexus docker registry
- Automating Docker container provisioning
 
Watch the steps here on our YouTube channel:

Pre-requisites:
1. Jenkins is up and running
2. Docker is installed on the Jenkins instance. Click here for steps to integrate Docker and Jenkins
3. Docker and Docker Pipeline plug-ins are installed
4. Nexus is up and running and the Docker registry is configured. Click here to know how to do that (see also the insecure-registry note after this list).
5. Port 80 is opened up in the firewall rules to access the PHP app running inside the Docker container
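A note on pre-requisite 4: because the Nexus Docker registry in this example is plain HTTP on port 8085, the Docker daemon on the Jenkins instance must trust it as an insecure registry, otherwise the image push in the pipeline will fail with an HTTPS error. A minimal sketch, assuming there is no existing /etc/docker/daemon.json to merge with and that the host and port below match your Nexus setup:

# register the Nexus host as an insecure registry and restart Docker
sudo tee /etc/docker/daemon.json <<'EOF'
{
  "insecure-registries": ["ec2-13-58-223-172.us-east-2.compute.amazonaws.com:8085"]
}
EOF
sudo systemctl restart docker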

Create an entry in Manage Credentials for connecting to Nexus
Go to Jenkins --> Manage Jenkins --> Manage Credentials.

Enter the Nexus user name and password with the ID set to nexus
Click on Save.

Step # 1 - Create a pipeline in Jenkins, name can be anything



Step # 2 - Copy the pipeline code from below
Make sure you change the placeholder values below:
The registry value should point to your Nexus host and Docker connector port, and the image name should match your app.

pipeline {
    
    agent any
    
    environment {
        imageName = "myphpapp"
        registryCredentials = "nexus"
        registry = "ec2-13-58-223-172.us-east-2.compute.amazonaws.com:8085"
        dockerImage = ''
    }
    
    stages {
        stage('Code checkout') {
            steps {
                checkout([$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [], submoduleCfg: [], userRemoteConfigs: [[credentialsId: '', url: 'https://bitbucket.org/ananthkannan/phprepo/']]])
            }
        }
    
    // Building Docker images
    stage('Building image') {
      steps{
        script {
          dockerImage = docker.build imageName
        }
      }
    }

    // Uploading Docker images into Nexus Registry
    stage('Uploading to Nexus') {
     steps{  
         script {
             docker.withRegistry( 'http://'+registry, registryCredentials ) {
             dockerImage.push('latest')
          }
        }
      }
    }
    
    // Stopping Docker containers for cleaner Docker run
    stage('stop previous containers') {
         steps {
            sh 'docker ps -f name=myphpcontainer -q | xargs --no-run-if-empty docker container stop'
            sh 'docker container ls -a -f name=myphpcontainer -q | xargs -r docker container rm'
         }
       }
      
    stage('Docker Run') {
       steps{
         script {
                sh 'docker run -d -p 80:80 --rm --name myphpcontainer ' + registry + '/' + imageName
            }
         }
      }    
    }
}

Step # 3 - Build the pipeline
Once you create the pipeline, click on Build now.


Step # 4 - Check that the Docker image is uploaded into the Nexus Registry
Login to Nexus, click on your repo, and you should see the image that got uploaded.
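As a command-line alternative, you can pull the image back from Nexus on any Docker host that can reach the registry (you may need to docker login first with the same Nexus credentials):

docker pull ec2-13-58-223-172.us-east-2.compute.amazonaws.com:8085/myphpapp:latest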


Step # 5 - Access the PHP app running inside the Docker container in the browser
Once the build is successful, go to the browser and enter http://public_dns_name
You should see a page like below:
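If the page does not come up, a quick way to troubleshoot from the Jenkins instance is to confirm the container is running and look at its logs (the container name comes from the pipeline's Docker Run stage):

# check that the container started and inspect its output
docker ps --filter name=myphpcontainer
docker logs myphpcontainer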




Tuesday, February 16, 2021

How to create Ubuntu 18.04 Virtual Machine (VM) in Azure? | Create Ubuntu 18.04 VM in Azure

How do you create Virtual Machines (VMs) in the Azure portal? Creating a Virtual Machine is easy and straightforward. Let us see how to do that.

Steps to create Virtual Machine in Azure

 
1. Login to Azure portal, go to https://portal.azure.com/
2. Click on Virtual Machines.

3. Click on Add virtual machine.

 


4. Now enter the details as below, or give values per your subscription and requirements. Select the Ubuntu 18.04 image.


5. Choose the authentication type as SSH public key, enter azureuser as the user name, and enter a key pair name.
This step will eventually create SSH keys and allow you to download them to your machine.



6. Under Networking


Go with Allow selected ports - SSH (22)
Also select the Delete public IP and NIC when VM is deleted option


7. Click on Review + create; it may take a few minutes to finish the validations. If all is good, it should pass validation. Click on Create.

8. Now download the SSH keys and save them locally.
 

9. Once created, click on Virtual machines.

10. You should see the new VM is running like below:


How to connect to Azure VM?

11. Now select the instance and click on Connect.


Then choose SSH 


12. Copy the value shown and run it in your local terminal (iTerm on a Mac) or Git Bash on a Windows laptop.

13. Make sure your SSH key is not accessible by others by executing the below command:

chmod 400 myUbuntuVM_key.pem
14. Now SSH into the VM from your local machine using the key:

ssh -i myUbuntuVM_key.pem azureuser@your_ip_address


Now it should show that you are connected to the Azure VM.
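If you prefer the command line over the portal, the same kind of VM can be created with the Azure CLI. A minimal sketch, assuming the Azure CLI is installed and you are logged in with az login; the resource group and VM names below are placeholders:

# create a resource group and an Ubuntu 18.04 VM with generated SSH keys
az group create --name myUbuntuRG --location eastus2
az vm create --resource-group myUbuntuRG --name myUbuntuVM --image Canonical:UbuntuServer:18.04-LTS:latest --admin-username azureuser --generate-ssh-keys

# print the public IP address to SSH into
az vm show -d -g myUbuntuRG -n myUbuntuVM --query publicIps -o tsv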

Tuesday, February 2, 2021

How to install Puppet 7 on Ubuntu 18.04 | Puppet Master install on Ubuntu 18.04 | Setup Puppet 7 on Ubuntu

Puppet is a configuration management tool, similar to Ansible, Chef, and SaltStack. Puppet can also be used for automating infrastructure.

  • Configuration management tool
  • Used for infrastructure automation as well
  • Open source, written in Ruby
  • Uses a Domain Specific Language (DSL) to describe system configuration
  • Comes in two versions - open source & Puppet Enterprise
  • Puppet is based on a client/server model.
The server does all the automation of tasks on nodes/servers that have a client (agent) installed. The work of the Puppet agent is to send facts to the Puppet Master and request a catalog at a regular interval (the default is 30 minutes). Once it receives a catalog, the Puppet agent applies it to the node by checking each resource the catalog describes, making the relevant changes to attain the desired state. The work of the Puppet Master is to control configuration information; each managed agent node requests its own configuration catalog from the master.
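The 30-minute default mentioned above is the agent's runinterval setting. On an agent node you can inspect or change it with the puppet config command; a small illustration (1800 seconds = 30 minutes):

puppet config print runinterval --section agent
sudo puppet config set runinterval 1800 --section agent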

We will see how to set up the Puppet Master on Ubuntu 18.04.



Watch the steps for setting up Puppet Master:

Pre-requisites:
Install the Puppet Master on a new Ubuntu medium instance
Port 8140 needs to be opened.

Steps:
First let us see how to install Puppet 7.x on Ubuntu 18.04.

Steps for Puppet Master Installation
Modify the Puppet Master hosts file to add the hostname of the Puppet Master
sudo vi /etc/hosts
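The hosts file entry simply maps the Puppet Master's private IP address to the hostname puppet. For example (the IP below is a placeholder; use your own instance's private IP):

echo "172.31.10.5   puppet" | sudo tee -a /etc/hosts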

Download puppet installables

sudo curl -O https://apt.puppetlabs.com/puppet7-release-bionic.deb
sudo dpkg -i puppet7-release-bionic.deb
sudo apt-get update

Install Puppet

sudo apt-get install puppetserver -y

sudo systemctl enable puppetserver.service

(The above command enables the service so that it starts automatically when the Ubuntu instance boots.)

Start Puppet Server

sudo systemctl start puppetserver.service  

(The above command starts the server; this may take some time.)

sudo systemctl status puppetserver.service


Now press q to come out of the status view.
Make sure /etc/hosts has the Puppet Master's IP address with puppet next to it, as in the example shown earlier.

This confirms that Puppet Master is installed successfully.

Verify which version of Puppet is installed by executing the below command:
apt policy puppetserver

Add Puppet to the PATH
sudo cp /opt/puppetlabs/bin/puppet /usr/bin/ -v
sudo cp /opt/puppetlabs/puppet/bin/gem  /usr/bin/ -v

Check Puppet version
puppet --version
It should display something like below:
7.3.0

Install AWS SDK Gems
You need to install the aws-sdk-core and retries gems as root (or superuser):
sudo gem install aws-sdk-core retries
Done installing documentation for retries after 0 seconds
6 gems installed
Also install AWS SDK for accessing resources in AWS
sudo gem install aws-sdk -v 2.0.42

Done installing documentation for retries after 0 seconds
4 gems installed
Install the puppetlabs-aws module
sudo puppet module install puppetlabs-aws
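To confirm the module was installed, list the installed modules:

puppet module list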

That's it. The Puppet Master is set up successfully! You can watch the above steps on YouTube as well.

Ansible Vs Terraform - What is the difference between Ansible and Terraform - Ansible Vs Terraform

This is one of the common DevOps interview questions. What is the difference between Ansible and Terraform? When will you choose Ansible over Terraform?

Ansible handles configuration management and application deployment, while Terraform is responsible for provisioning and managing the underlying infrastructure. The choice between Ansible and Terraform depends on the specific needs of the task or project at hand.


Factor            | Ansible                            | Terraform
Type              | Configuration management           | Provisioning
Infrastructure    | Mutable                            | Immutable
Language          | Procedural                         | Declarative
Written in        | Python                             | Go
Architecture      | Client only                        | Client only
State management  | No                                 | Yes
Cloud             | All                                | All
Syntax            | YAML                               | HCL (JSON also supported)
UI/CLI            | Both UI (Ansible Tower) and CLI    | CLI only

What is GitHub Advanced Security for Azure DevOps | Configure GitHub Advanced Security for Azure DevOps

GitHub Advanced Security for Azure DevOps brings the secret scanning, dependency scanning and CodeQL code scanning solutions already ava...