Saturday, December 31, 2022

How to run Ansible playbook from Jenkins pipeline job | Automate EC2 provisioning in AWS using Jenkins and Ansible Playbook | Create new EC2 instance in AWS cloud using Ansible Playbook and Jenkins Pipeline

We will learn how to create a new EC2 instance using an Ansible playbook and automate it using a Jenkins pipeline.

Watch Steps in YouTube Channel:

Pre-requisites:

  • Ansible and Boto are installed on the Jenkins instance.
  • The Ansible plug-in is installed in Jenkins.
  • Make sure you create an IAM role with the AmazonEC2FullAccess policy and attach the role to the Jenkins EC2 instance.
  • A playbook for creating the new EC2 instance needs to be created; you can refer to my GitHub repo for one.
Steps:

Create Ansible playbook for provisioning EC2 instance

(Sample playbook is available in my GitHub Repo, you can use that as a reference)
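If you just need a starting point, below is a minimal sketch of what a create-EC2.yml playbook can look like. It is an illustration rather than the exact playbook from my repo: it assumes the amazon.aws collection (the ec2_instance module, which requires boto3), and the key pair, AMI ID, subnet, and region are placeholders you must replace with values from your AWS account.

---
- hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - name: Provision a new EC2 instance
      amazon.aws.ec2_instance:
        name: ansible-demo-instance              # Name tag for the new instance
        key_name: my-key-pair                    # placeholder: your EC2 key pair
        instance_type: t2.micro
        image_id: ami-0abcdef1234567890          # placeholder: an AMI ID valid in your region
        region: us-east-1
        vpc_subnet_id: subnet-0123456789abcdef0  # placeholder: a subnet in your VPC
        security_group: default
        network:
          assign_public_ip: true
        wait: true
        state: running
        tags:
          Environment: dev

The IAM role attached to the Jenkins EC2 instance (see the pre-requisites) is what gives this playbook permission to call the EC2 APIs, so no access keys are needed in the playbook itself.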

Create Jenkins Pipeline 
pipeline {
    agent any

    stages {
        
        stage ("checkout") {
            steps {
                checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [], userRemoteConfigs: [[url: 'https://github.com/akannan1087/myAnsibleInfraRepo']]])
            }
        }
        stage('execute') {
            steps {
                //to suppress warnings when you execute playbook    
                sh "pip install --upgrade requests==2.20.1"
                // execute ansible playbook
                ansiblePlaybook playbook: 'create-EC2.yml'
            }
        }
    }
}

Execute Pipeline


Pipeline Console output


Tuesday, December 27, 2022

Ansible playbook for LAMP Installation on Ubuntu | How to Install LAMP stack using Ansible on Ubuntu 22.04

LAMP Stack comprises the following open-source software applications.

    • Linux – the operating system hosting the applications.
    • Apache – the Apache HTTP Server, a free and open-source cross-platform web server.
    • MySQL – an open-source relational database management system.
    • PHP – a scripting language used for developing web applications.

    Watch the steps in YouTube Channel:

    Pre-requisites:
    Steps to set up SSH keys:
    1. Log in to the Ansible management server/machine. Create SSH keys on the Ansible host machine by executing the below command (if you already have keys created, please skip this step):
    ssh-keygen 

    Press Enter three times; you will see that the keys were created successfully.
    2. Execute the below command on the Ansible management node and copy the public key content:
    sudo cat ~/.ssh/id_rsa.pub

    Copy the above output.
    3. Now log in to the target node where you want to install the LAMP stack and execute the below command to open the file:
    sudo vi /home/ubuntu/.ssh/authorized_keys
    Press Shift+A to go into append mode, press Enter to start a new line,
        and paste the public key into the above file. Please do not delete any existing values in this file.

    4. Now go back to the Ansible management node and edit the /etc/ansible/hosts file to include the node on which you will be installing the software. Make sure you add the public or private IP address of the target node as shown below:
    sudo vi /etc/ansible/hosts
    [LAMP_Group]  
    xx.xx.xx.xx ansible_ssh_user=ubuntu ansible_ssh_private_key_file=~/.ssh/id_rsa  ansible_python_interpreter=/usr/bin/python3

    Ansible playbook for installing the LAMP (Linux, Apache, MySQL, PHP) stack on Ubuntu

    sudo vi installLAMP.yml
    ---
    - hosts: LAMP_Group
      tasks:
        - name: Task 1 - Update APT package manager repositories cache
          become: true
          apt:
            update_cache: yes
        - name: Task 2 - Install LAMP stack using Ansible
          become: yes
          apt:
            name: "{{ packages }}"
            state: present
          vars:
            packages:
               - apache2
               - mysql-server
               - php
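    The apt packages normally start Apache and MySQL automatically. If you also want the playbook to guarantee that both services are running and enabled at boot, you can append a task like the sketch below, which uses the built-in service module:

        - name: Task 3 - Ensure Apache and MySQL services are running and enabled
          become: true
          service:
            name: "{{ item }}"
            state: started
            enabled: true
          loop:
            - apache2
            - mysql

    Run the playbook: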

    ansible-playbook installLAMP.yml


    This is the execution result of the playbook.

    Now go to the browser and use the target node's DNS name to confirm that Apache is installed. Make sure port 80 is open in the security group/firewall rules.


    Now log in to the target EC2 instance and type the below commands to verify the PHP and MySQL versions:

    php --version

    mysql --version

    Ansible playbook for Tomcat Installation on Ubuntu | Ansible Tomcat Playbook on Ubuntu 18.04/20.04

    Ansible Playbook for installing Tomcat on Ubuntu 18.04

    Pre-requisites:
    Steps to set up SSH keys:
    1. Log in to the Ansible management server/machine. Create SSH keys on the Ansible host machine by executing the below command (if you already have keys created, please skip this step):
    ssh-keygen 

    Press Enter three times; you will see that the keys were created successfully.
    2. Execute the below command on the Ansible management node and copy the public key content:
    sudo cat ~/.ssh/id_rsa.pub

    Copy the above output.
    3. Now log in to the target node where you want to install Tomcat and execute the below command to open the file:
    sudo vi /home/ubuntu/.ssh/authorized_keys
    Press Shift+A to go into append mode, press Enter to start a new line,
        and paste the public key into the above file. Please do not delete any existing values in this file.

    4. Now go back to the Ansible management node and edit the /etc/ansible/hosts file to include the node on which you will be installing the software. Make sure you add the public or private IP address of the target node as shown below:
    sudo vi /etc/ansible/hosts
    [My_Group]  
    xx.xx.xx.xx ansible_ssh_user=ubuntu ansible_ssh_private_key_file=~/.ssh/id_rsa  ansible_python_interpreter=/usr/bin/python3

    5. Create a Playbook for setting up Tomcat 9

    sudo vi installTomcat.yml

    ---
    - hosts: My_Group
      tasks:
        - name: Task 1 - Update APT package manager repositories cache
          become: true
          apt:
            update_cache: yes
        - name: Task 2 - Install Tomcat using Ansible
          become: yes
          apt:
            name: "{{ packages }}"
            state: present
          vars:
            packages:
               - tomcat9
               - tomcat9-examples
               - tomcat9-docs
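    Optionally, you can append one more task to verify that Tomcat actually responds on port 8080 after installation; a small sketch using the built-in uri module:

        - name: Task 3 - Verify Tomcat is responding on port 8080
          uri:
            url: http://localhost:8080
            status_code: 200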

    6. Execute Playbook:

    sudo ansible-playbook installTomcat.yml
    This is the execution result of the Ansible playbook.


    Now access Tomcat on port 8080 on the target machine where you installed it.



    Thursday, November 17, 2022

    How to Deploy Springboot App into AKS cluster using Jenkins Pipeline and Kubectl CLI Plug-in | Deploy Microservices into AKS cluster using Jenkins Pipeline

    We are going to learn how to automate the build and deployment of a Spring Boot microservices app into an Azure Kubernetes Service (AKS) cluster using a Jenkins pipeline.

    Sample Spring Boot app code:

    I have created a sample Spring Boot app setup in GitHub. Click here to access the code base in GitHub.

    Jenkins pipeline will:

    - Automate the Maven build (JAR) using Jenkins
    - Automate Docker image creation
    - Automate Docker image upload into Azure Container Registry
    - Automate deployments to the Azure Kubernetes cluster

    Watch Steps in YouTube Channel:

    Pre-requisites:

    1. AKS cluster needs to be up and running. You can create the AKS cluster from your local machine using the Azure CLI (see prerequisite 9 below).

    2. Jenkins instance is set up and running.
    3. Make sure the Docker, Docker Pipeline, and Kubectl CLI plug-ins are installed in Jenkins.

    4. Install Docker on the Jenkins instance and make sure Jenkins has the proper permissions to perform Docker builds.
    5. Install kubectl on the Jenkins instance.
    6. ACR (Azure Container Registry) is also set up in Azure cloud.
    7. Dockerfile is created along with the application source code for the Spring Boot app.
    8. Modify the K8S manifest file with your ACR and image name for the AKS deployment.
    9. Install the Azure CLI on your local machine. (We will be creating the AKS cluster from our local machine.)

    The code for this video is here;
    make the necessary changes in the jenkins-aks-deploy-from-acr.yaml file after you fork it into your account.

    Step # 1 - Create Credentials to connect to ACR from Jenkins

    Go to the Azure portal and open your container registry.
    Settings --> Access keys
    Get the username and password.
    Go to Jenkins --> Manage Jenkins and create a username/password credential with those values.


    Enter the ID as ACR, enter some text for the description, and save.

    Step #2 - Create Credentials for connecting to AKS cluster using Kubeconfig

    Go to Jenkins UI, click on Credentials -->


    Click on Global credentials
    Click on Add Credentials

    Choose Secret file from the Kind drop-down.

    You should see the nodes running in the AKS cluster when you run the below command on the machine where kubectl is configured for the cluster:

    kubectl get nodes


    Execute the below command to get kubeconfig info, copy the entire content of the file:
    cat ~/.kube/config
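    For reference, the kubeconfig content has roughly the YAML structure shown below. Every value here is a placeholder; copy your real file exactly as it is, without editing it:

    apiVersion: v1
    kind: Config
    clusters:
      - name: my-aks-cluster                          # placeholder cluster name
        cluster:
          server: https://<aks-api-server>:443
          certificate-authority-data: <base64-ca-cert>
    users:
      - name: clusterUser_myResourceGroup_my-aks-cluster
        user:
          client-certificate-data: <base64-client-cert>
          client-key-data: <base64-client-key>
          token: <token>
    contexts:
      - name: my-aks-cluster
        context:
          cluster: my-aks-cluster
          user: clusterUser_myResourceGroup_my-aks-cluster
    current-context: my-aks-cluster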




    Open your text editor or Notepad, paste the entire content, and save it as a file.
    We will upload this file.

    Enter the ID as K8S, choose the file to upload, and save.


    Step # 3 - Create a pipeline in Jenkins
    Create a new pipeline job.

    Step # 4 - Copy the pipeline code from below
    Make sure you change the values below as per your settings (registry name, registry URL, credentials IDs, and repo URL):

    pipeline {
        agent any
        tools {
            maven 'Maven3'
        }
        environment {
            // once you create ACR in Azure cloud, use that here
            registryName = "myacrrepo3210"
            // update your credentials ID after creating credentials for connecting to ACR
            registryCredential = 'ACR'
            dockerImage = ''
            registryUrl = 'myacrrepo3210.azurecr.io'
        }

        stages {
            stage('checkout') {
                steps {
                    checkout([$class: 'GitSCM', branches: [[name: '*/main']], extensions: [], userRemoteConfigs: [[url: 'check_out_from_your_repo_after_forking_my_repo']]])
                }
            }

            stage('Build') {
                steps {
                    sh 'mvn clean install'
                }
            }

            stage('Build Docker image') {
                steps {
                    script {
                        dockerImage = docker.build registryName
                    }
                }
            }

            // Uploading Docker images into ACR
            stage('Upload Image to ACR') {
                steps {
                    script {
                        docker.withRegistry("http://${registryUrl}", registryCredential) {
                            dockerImage.push()
                        }
                    }
                }
            }

            stage('K8S Deploy') {
                steps {
                    script {
                        withKubeConfig([credentialsId: 'K8S', serverUrl: '']) {
                            sh 'kubectl apply -f jenkins-aks-deploy-from-acr.yaml'
                        }
                    }
                }
            }
        }
    }
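
    The K8S Deploy stage applies the jenkins-aks-deploy-from-acr.yaml manifest from the repo. For reference, such a manifest typically looks like the sketch below; the deployment/service names, image path, and ports here are assumptions for illustration, so use the actual file from the repo and only change the ACR/image values:

    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: springboot-app
    spec:
      replicas: 1
      selector:
        matchLabels:
          app: springboot-app
      template:
        metadata:
          labels:
            app: springboot-app
        spec:
          containers:
            - name: springboot-app
              # placeholder: replace with your ACR login server and image name
              image: myacrrepo3210.azurecr.io/myacrrepo3210:latest
              ports:
                - containerPort: 8080
    ---
    apiVersion: v1
    kind: Service
    metadata:
      name: springboot-app-service
    spec:
      type: LoadBalancer
      selector:
        app: springboot-app
      ports:
        - port: 80
          targetPort: 8080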

    Step # 5 - Build the pipeline


    Step # 6 - Verify deployments to AKS

    kubectl get pods

    kubectl get services

    Step # 7 - Access the Spring Boot app deployed in the AKS cluster
    Once the deployment is successful, go to the browser and enter the load balancer URL shown in the kubectl get services output above.

    You should see a page like below:


    Clean up the Cluster:

    To avoid charges from Azure, you should clean up unneeded resources. When the cluster is no longer needed, use the az group delete command to remove the resource group, container service, and all related resources. 

    az group delete --name myResourceGroup --yes --no-wait

    Saturday, November 5, 2022

    How to Enable Webhooks in an Azure Pipeline in Azure DevOps | Enable Automated Builds in ADO

    Webhooks allow developers to trigger jobs in a CI server (such as Jenkins or Azure DevOps) for every code change in SCM. In this article, we will learn how to trigger Azure Pipeline build jobs instantly for every code change in SCM.




    Pre-requisites:
    1. An Azure build pipeline is already configured. If you don't know how to create an Azure build pipeline, click on this link.
    2. An SCM repo has been set up, either in GitHub, Bitbucket, or any other SCM.

    Watch Steps in YouTube

    Steps to Enable Webhooks in Azure Build Pipeline

    Go to the Azure DevOps project dashboard.

    Go to Pipelines


    Click on Pipelines

    Click on Edit


    Click on the Triggers tab, then check the Continuous integration checkbox to enable webhooks.


    Click on Save for the job. You don't have to queue the job.

    Now go to your SCM and make a code change; you will see the pipeline job trigger immediately.
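
    Note: the Triggers tab applies to classic (UI-defined) pipelines. If your pipeline is defined in YAML instead, the equivalent is a trigger block in azure-pipelines.yml; a minimal sketch, assuming your default branch is main:

    trigger:
      branches:
        include:
          - main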

    Friday, November 4, 2022

    How to solve "No hosted parallelism has been purchased or granted" in Azure DevOps Pipeline | Azure DevOps Pipeline Error Resolution

     

    Root cause and Fix:

    Microsoft has temporarily disabled the free grant of parallel jobs for public projects and for certain private projects in new organizations. However, you can request this grant by submitting a ticket using the below URL to request increased parallelism in Azure DevOps.

    Monday, October 10, 2022

    Manual Process vs Automation | CICD Process Flow Diagram | How to Implement CICD using Jenkins and Other DevOps tools

    Agile Development - Manual Process with No Automation


    CICD Process Flow Diagram - How to Implement CICD (Automation) in Agile Development?




    What is Continuous Integration?

    Continuous integration is a DevOps software development practice where developers regularly merge their code changes into a central repository, after which automated builds and tests are run.

    The key goals of continuous integration are to find and address bugs quicker, improve software quality, and reduce the time it takes to validate and release new software updates.

    Jenkins is a popular continuous integration tool. Jenkins can integrate with other tools using plug-ins.

    How does Continuous Integration Work?

    Developers frequently commit to a shared repository using a version control system such as Git. Prior to each commit, developers may choose to run local unit tests on their code as an extra verification layer before integrating. A continuous integration service automatically builds and runs unit tests on the new code changes to immediately surface any errors.

    Benefits of Continuous Integration
    • Improve developer productivity
    • Find bugs early in the software development lifecycle
    • Deliver products to market sooner
    • Improve the feedback loop

    What is Continuous Delivery?

    Continuous delivery is a software development practice where code changes are automatically prepared for a release to production. Continuous delivery is an extension of continuous integration. The delivery phase is responsible for packaging an artifact to be delivered to end users. This phase runs automated build tools to generate this artifact.

    Benefits of Continuous Delivery
    • Automate the software release process
    • Improve developer productivity
    • Find bugs early in the software development lifecycle
    • Deliver updates faster

    Tuesday, October 4, 2022

    How to Recover SonarQube Admin password | How to unlock SonarQube admin password in PostgreSQL

    Let's say you have set up SonarQube using Docker or Docker Compose and you have forgotten the admin password for SonarQube. This article helps you reset/recover the admin password. If you changed and then lost the admin password, you can reset it using the following steps.

    Watch Steps in YouTube channel:


    Pre-requisites:

    Since we have configured SonarQube using Docker Compose, we need to log in to PostgreSQL running inside the Postgres Docker container and execute an update command to reset the admin password to the default.
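
    For context, a docker-compose.yml for this kind of SonarQube + PostgreSQL setup typically looks like the sketch below. The service names, image tags, and the admin123 password are assumptions based on my lab setup, so adjust them to match your own compose file:

    version: "3"
    services:
      sonarqube:
        image: sonarqube:lts-community
        depends_on:
          - db
        environment:
          SONAR_JDBC_URL: jdbc:postgresql://db:5432/sonar
          SONAR_JDBC_USERNAME: sonar
          SONAR_JDBC_PASSWORD: admin123   # same password used with psql below
        ports:
          - "9000:9000"
      db:
        image: postgres:13
        environment:
          POSTGRES_USER: sonar
          POSTGRES_PASSWORD: admin123
          POSTGRES_DB: sonar

    The POSTGRES_PASSWORD value is the database password you will use when connecting with psql in Step 2.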

    Step 1: Log in to the PostgreSQL Docker container

    Type the below command to see the list of containers running on your SonarQube instance:

    sudo docker ps

    Copy the PostgreSQL container ID from the output of the above command.

    Now log in to the PostgreSQL Docker container:

    docker exec -it <container_id> /bin/bash

    Step 2: Connect to the PostgreSQL database by executing the below command:

    psql -p 5432 -d sonar -U sonar -h <container_id>

    Now enter the password for the sonar database user:

    In my lab exercise, the password for the sonar user is admin123.

    Make sure the prompt shows the sonar database, which is the SonarQube schema inside the PostgreSQL DB.

    Step 3: Execute the below query to reset the admin password to the default password, which is admin:

    update users set crypted_password='100000$t2h8AtNs1AlCHuLobDjHQTn9XppwTIx88UjqUm4s8RsfTuXQHSd/fpFexAnewwPsO6jGFQUv/24DnO55hY6Xew==', salt='k9x9eN127/3e/hf38iNiKwVfaVk=', hash_method='PBKDF2', reset_password='true', user_local='true' where login='admin';

    Step 4: Log in to the SonarQube UI as admin/admin

    Log in as admin/admin.

    Now it will immediately ask you to change the default admin password to something else:

    That's it! That is how you recover the SonarQube admin password.

    References:

    https://docs.sonarqube.org/latest/instance-administration/security/

    Thursday, September 22, 2022

    Create Freestyle job in Jenkins | How to create build job in Jenkins to automate Java build and deployment of WAR into Tomcat | Bitbucket Jenkins Integration

    Jenkins is a popular open-source continuous integration tool, written entirely in Java. Jenkins is a self-contained automation server used for automating builds, tests, and deployments.


    See below the steps for configuring Jenkins to automate the build and deployment of the Java web project we already set up in Bitbucket into Tomcat.



    Pre-requisites:
    • Also install the Deploy to container and JaCoCo plug-ins under Jenkins --> Manage Jenkins --> Manage Plugins

    Click on Available, type Deploy to container and select it, then enter JaCoCo and select it. Click on Install without restart.

    Deploy to container


    JaCoCo

    Click on Install without restart.


    Steps to automate the MyWebApp project in Jenkins:

    1. Log in to Jenkins. Click on New Item.

    2. Enter an item name and select Freestyle project.
    Enter the name as myFirstAutomateJob and click OK.
    3. Under Source Code Management, click Git and enter the Bitbucket URL.
    Click on your repo, copy the URL from the browser, and paste it as the Repository URL below.

    Under Credentials, click Add --> Jenkins --> enter your Bitbucket username and App password. DO NOT use your Bitbucket account password, as it was removed from March 1st, 2022. Click here to learn how to generate an App Password in Bitbucket. Add the description as "my SCM credentials".



    4. Select the credentials you just added from the drop-down.


    5. Enter main as the branch specifier, or whichever branch you want to check out.

    6. Under Build Triggers, click on Poll SCM and enter this value to poll

    every 2 minutes --> H/02 * * * *

    7. Build --> Add build step --> Invoke top-level Maven targets.

    Select Maven3 from the drop-down and enter the goal as clean install.


    8. Click on Advanced and enter the path of the POM file as --> MyWebApp/pom.xml


    9. Click on Add post-build action and select Record JaCoCo coverage report.
      
    10. Click on Add post-build action and select Deploy war/ear to a container.

          For WAR/EAR files, enter
              **/*.war


    Leave the context path empty.

       11. Click on Add Container and select Tomcat 9.x.

       12. Click on Add credentials, enter tomcat as the username and password as the password.
          Then select it from the drop-down.
     


    13. The Tomcat URL should be --> http://your_public_dns_name:8080


    Click Apply, then click Save.
    Click on Build Now; it should build.
    If there is any error, please check the console output. The most common errors are an incorrect path to the Maven installation, invalid credentials for Bitbucket or Tomcat, or missing plug-ins.

    After a successful deployment, please make sure you check the output in Tomcat by going to the browser and entering the below URL:



    This is how you automate the builds and deployments using Jenkins. 

    The code coverage report can be seen in Jenkins as well; click on the job name.


    Please watch the steps in YouTube channel:

    How to Configure GitHub Advanced Security for Azure DevOps | How to Perform Security scan for Azure Repos using GitHub Advanced Security

    GitHub Advanced Security for Azure DevOps brings the secret scanning, dependency scanning, and CodeQL code scanning solutions already ava...