Sunday, November 17, 2024

How to Configure GitHub Advanced Security for Azure DevOps | How to Perform Security scan for Azure Repos using GitHub Advanced Security

GitHub Advanced Security for Azure DevOps brings the secret scanning, dependency scanning and CodeQL code scanning solutions already available for GitHub users and natively integrates them into Azure DevOps to protect your Azure Repos and Pipelines. 


How to Set up dependency scanning?

Dependency scanning is a pipeline-based scanning tool. Results are aggregated per repository. It's recommended that you add the dependency scanning task to all the pipelines you'd like to be scanned.

Add the Advanced Security Dependency Scanning task (AdvancedSecurity-Dependency-Scanning@1) directly to your YAML pipeline file, or select it from the task assistant.
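For reference, a minimal pipeline sketch with the dependency scanning task added after the build steps might look like the following (the trigger and agent pool values here are placeholders, not from the original post):

trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
# Your existing restore/build steps go here, followed by the dependency scanning task
- task: AdvancedSecurity-Dependency-Scanning@1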

How to Set up code scanning

Code scanning is also a pipeline-based scanning tool where results are aggregated per repository.

Add the tasks in the following order:

  1. Advanced Security Initialize CodeQL (AdvancedSecurity-Codeql-Init@1)
  2. Your custom build steps
  3. Advanced Security Perform CodeQL Analysis (AdvancedSecurity-Codeql-Analyze@1)

Pipeline YAML Code for scanning Java code using GitHub Advanced Security


trigger:
- main

resources:
- repo: self

variables:
  tag: '$(Build.BuildId)'

stages:
- stage: Build
  displayName: Build image
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: ubuntu-latest
    steps:
    - task: AdvancedSecurity-Codeql-Init@1
      inputs:
        languages: 'java'
    - task: Maven@4
      inputs:
        mavenPomFile: 'pom.xml'
        goals: 'install'
        publishJUnitResults: true
        testResultsFiles: '**/surefire-reports/TEST-*.xml'
        javaHomeOption: 'JDKVersion'
        mavenVersionOption: 'Default'
        mavenAuthenticateFeed: false
        effectivePomSkip: false
        sonarQubeRunAnalysis: false
    - task: AdvancedSecurity-Dependency-Scanning@1
    - task: AdvancedSecurity-Codeql-Analyze@1

Where to see the scan results in Azure DevOps?

Scan results appear per repository under Repos > Advanced Security in Azure DevOps, where alerts from secret scanning, dependency scanning, and code scanning are listed.


Monday, November 4, 2024

What is GitHub Advanced Security for Azure DevOps | How to Enable GitHub Advanced Security for Azure DevOps ?

GitHub Advanced Security for Azure DevOps brings the secret scanning, dependency scanning and CodeQL code scanning solutions already available for GitHub users and natively integrates them into Azure DevOps to protect your Azure Repos and Pipelines.


These scanning tools will natively embed automated security checks into the Azure DevOps platform, allowing developers to secure their code, secrets and supply chain without leaving their workflow.

Azure DevOps Advanced Security provides the following security features to help organizations identify and address security vulnerabilities in their development processes.

  • Secret Scanning push protection: check whether code pushes include commits that expose secrets such as credentials
  • Secret Scanning repo scanning: scan your repository and look for exposed secrets that were committed accidentally
  • Dependency Scanning: search for known vulnerabilities in open-source dependencies (direct and transitive)
  • Code Scanning: use the CodeQL static analysis engine to identify code-level application vulnerabilities such as SQL injection and authentication bypass.
Scope of GitHub Advanced Security for Azure DevOps
  • only available for Git repositories
  • only available for Azure DevOps Services, not available in Azure DevOps Server (formerly TFS)
Enable GitHub Advanced Security
You can enable Advanced Security at the organization, project, or repository level.

Repository-level onboarding
  1. Go to your Project settings for your Azure DevOps project.
  2. Select Repos > Repositories.
  3. Select the repository you want to enable Advanced Security for.
  4. Select Enable and Begin billing to activate Advanced Security. A shield icon now appears in the repository view for any repository with Advanced Security enabled.


Project-level onboarding

  1. Go to your Project settings for your Azure DevOps project.
  2. Select Repos.
  3. Select the Settings tab.
  4. Select Enable all; an estimate of the number of active committers for your project appears.
  5. Select Begin billing to activate Advanced Security for every existing repository in your project.
  6. Optionally, select Automatically enable Advanced Security for new repositories so that any newly created repositories have Advanced Security enabled upon creation.

Organization-level onboarding

  1. Go to your Organization settings for your Azure DevOps organization.
  2. Select Repositories.
  3. Select Enable all; an estimate of the number of active committers for your organization appears.
  4. Select Begin billing to activate Advanced Security for every existing repository in each project in your organization.
  5. Optionally, select Automatically enable Advanced Security for new repositories so that any newly created repositories have Advanced Security enabled upon creation.

Set up Secret Scanning

Secret scanning push protection and repository scanning are automatically enabled when you turn on Advanced Security. You can enable or disable secret push protection from the repository settings page.

Screenshot of enabling push protection.

As mentioned, a secret scanning repository scan is automatically kicked off when Advanced Security is enabled for a selected repository.





Friday, November 1, 2024

DevOps Bootcamp Nov 2024 Schedule | DevOps & AWS Azure Cloud Coaching by Coach AK | DevOps and Cloud Computing Online Classes

(Lots of new topics covered, like GitHub Actions, Helm, Prometheus, and Grafana.)

Demand for DevOps skills in the IT market is expected to grow by 35% by 2024. Getting a DevOps education now is a great investment in your future, and it will pay off very fast!

You are in the right place to kick-start your career in DevOps. DevOps is one of the hottest IT skills right now. Almost all employers are struggling to find the right people who can do DevOps and automation work. You could be that person by attending this coaching program.

DevOps Coaching schedule - Nov 2024 (promotions are on, please contact Coach AK)

Date: Nov 16th | Time: 11:35 AM CST - 01:30 PM CST on Saturdays, 01:30 PM CST - 03:30 PM CST on Sundays | Type: Weekends | When: Sat/Sundays
Date: Nov 21st | Time: 6:00 to 8:00 PM CST | Type: Weekdays | When: Tuesdays/Thursdays

DevOps Coaching Highlights:
Comprehensive hands-on knowledge of Git, GitHub, Jenkins, Maven, SonarQube, Nexus, Terraform, Ansible, Docker, Kubernetes, Helm, Prometheus, Docker registry, AWS and Azure cloud platform.

To join DevOps Coaching classes, please contact Coach AK below:
Contact no# : +1 (469)733-5248
WhatsApp #: +1 (469)733-5248

Email id: contact.devopscoaching@gmail.com
Contact Name: Coach AK


Thursday, October 31, 2024

Deploy Python App into Kubernetes Cluster using kubectl Jenkins Pipeline | Containerize Python App and Deploy into EKS Cluster | Kubectl Deployment using Jenkins

We will learn how to automate Docker builds using Jenkins and deploy them into a Kubernetes cluster in AWS Cloud. We will use the kubectl command to deploy Docker images into an EKS cluster. We will use a Python-based application. I have already created a repo with the source code and a Dockerfile. The repo also has a Jenkinsfile for automating the following:

- Automating builds using Jenkins
- Automating Docker image creation
- Automating Docker image upload into Elastic container registry
- Automating Deployments to Kubernetes Cluster using kubectl CLI plug-in



Pre-requisites:
1. EKS cluster is set up and running. Click here to learn how to create an EKS cluster.
2. Jenkins Master is up and running.
3. Install Docker in Jenkins.
4. Docker, Docker pipeline and Kubectl CLI plug-ins are installed in Jenkins





5. ECR repo created to store docker images.
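If the ECR repo does not exist yet, you can create it with the AWS CLI; a minimal sketch, assuming the repository name and region used in the pipeline below (adjust them for your account):

# Create a private ECR repository to hold the application images
aws ecr create-repository --repository-name coachak/my-docker-repo --region us-east-1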

The code for this video is here:
Fork the repo into your account and make the necessary changes in the eks-deploy-from-ecr.yaml file.

Step #1 - Create Credentials for connecting to EKS cluster using Kubeconfig
Go to the Jenkins UI and click on Credentials.


Click on Global credentials, then click on Add Credentials.

Choose Secret file from the Kind drop-down.

Execute the command below to log in as the jenkins user.
sudo su - jenkins

You should see the nodes running in the EKS cluster.

kubectl get nodes
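If kubectl cannot reach the cluster as the jenkins user, you may first need to generate a kubeconfig for the EKS cluster; a minimal sketch, where the cluster name and region are assumptions (use your own):

# Writes/updates ~/.kube/config with an entry for the EKS cluster
aws eks update-kubeconfig --region us-east-1 --name my-eks-cluster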


Create a namespace to deploy the containers:
kubectl create namespace python-app-ns
kubectl get ns

Execute the command below to get the kubeconfig info and copy the entire content of the file:
cat /var/lib/jenkins/.kube/config




Open a text editor, paste the entire content, and save it to a file. We will upload this file.

Enter K8S as the ID, choose the saved file to upload, and save.


Step # 2 - Create a pipeline in Jenkins
Create a new pipeline job.


Step # 3 - Copy the pipeline code from below
Make sure you change the values below as per your settings, such as the ECR registry URL (AWS account ID and region) and the repository URL:

pipeline {
    agent any

    environment {
        registry = "account_id.dkr.ecr.us-east-1.amazonaws.com/coachak/my-docker-repo"
    }

    stages {
        stage('checkout') {
            steps {
                checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [], userRemoteConfigs: [[url: 'https://github.com/akannan1087/myPythonDockerRepo']]])
            }
        }

        // Build the Docker image and tag it with the Jenkins build number
        stage('build image') {
            steps {
                script {
                    dockerImage = docker.build registry
                    dockerImage.tag("$BUILD_NUMBER")
                }
            }
        }

        // Log in to ECR and push the image (the region must match your registry URL)
        stage('upload ECR') {
            steps {
                script {
                    sh 'aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin account_id.dkr.ecr.us-east-1.amazonaws.com'
                    sh 'docker push account_id.dkr.ecr.us-east-1.amazonaws.com/coachak/my-docker-repo:$BUILD_NUMBER'
                }
            }
        }

        // Avoid the latest tag and pass the build ID dynamically from the Jenkins pipeline
        stage('K8S Deploy') {
            steps {
                script {
                    withKubeConfig([credentialsId: 'K8S', serverUrl: '']) {
                        echo "Current build number is: ${env.BUILD_ID}"
                        // Replace the placeholder in the deployment manifest with the build number
                        sh """
                        sed -i 's/\${BUILD_NUMBER}/${env.BUILD_ID}/g' k8s-deployment.yaml
                        """
                        sh 'kubectl apply -f k8s-deployment.yaml -n python-app-ns'
                    }
                }
            }
        }
    }
}
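For reference, the k8s-deployment.yaml applied above is expected to contain the ${BUILD_NUMBER} placeholder that the sed step rewrites with the Jenkins build ID. A minimal sketch of such a manifest, where the names, labels, and container port are assumptions (only the image URL pattern comes from the pipeline):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: python-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: python-app
  template:
    metadata:
      labels:
        app: python-app
    spec:
      containers:
      - name: python-app
        # ${BUILD_NUMBER} is replaced with the Jenkins build ID by the sed step in the pipeline
        image: account_id.dkr.ecr.us-east-1.amazonaws.com/coachak/my-docker-repo:${BUILD_NUMBER}
        ports:
        - containerPort: 5000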

Step # 4 - Build the pipeline



Step # 5 - Verify deployments to EKS

kubectl get pods -n python-app-ns


kubectl get deployments -n python-app-ns
kubectl get services -n python-app-ns


Step # 6 - Access Python App in K8S cluster
Once the deployment is successful, fetch the load balancer URL exposed by the service and open it in your browser (see the command below).
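One way to read the load balancer hostname is straight from the service object; the service name python-app-svc below is an assumption, so check your kubectl get services output for the actual name:

kubectl get service python-app-svc -n python-app-ns -o jsonpath='{.status.loadBalancer.ingress[0].hostname}'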

You should see a page like the one below:



Sunday, October 27, 2024

Deploy Springboot Microservices App into Amazon EKS Cluster using Jenkins Pipeline and Kubectl CLI Plug-in | Containerize Springboot App and Deploy into EKS Cluster using Jenkins Pipeline

We will learn how to create a CI/CD pipeline that deploys Spring Boot microservices into an EKS cluster using a Jenkins pipeline and the Kubernetes CLI plug-in.

We will use a Spring Boot microservices-based Java application. I have already created a repo with the source code and a Dockerfile. The repo also has a Jenkinsfile for automating the following:

- Automating builds using Jenkins Pipeline
- Automating Docker image creation and tagging
- Automating Docker image upload into AWS ECR
- Automating Docker Containers Deployments to Kubernetes Cluster
 




Watch the steps on the YouTube channel:


The code for this video is here:

Pre-requisites:
1. Amazon EKS cluster is set up and running. Click here to learn how to create an Amazon EKS cluster.
5. Docker, Docker pipeline and Kubernetes CLI plug-ins are installed in Jenkins

Install Kubernetes CLI plug-in:

6. Install kubectl on your instance

Step # 1 - Create Maven3 variable under Global tool configuration in Jenkins
Make sure you create a Maven installation named Maven3 under Global tool configuration; the pipeline below references it in its tools block.


Step #2 - Create Credentials for connecting to Kubernetes Cluster using kubeconfig
Click on Add Credentials and choose either Secret file or Kubernetes configuration (kubeconfig) from the Kind drop-down.


Execute the command below to log in as the jenkins user.
sudo su - jenkins

You should see the nodes running in the EKS cluster.

kubectl get nodes


Create a namespace to deploy the containers:
kubectl create namespace springboot-app-ns
kubectl get ns


Execute the command below to get the kubeconfig info and copy the entire content of the file:
cat /var/lib/jenkins/.kube/config


Open a text editor, paste the entire content, and save it to a file. We will upload this file.

If you chose Secret file, enter K8S as the ID, choose the saved file to upload, and save.

Alternatively, if you chose Kubernetes configuration, enter K8S as the ID, select Enter directly, paste the file content, and save.

Step # 3 - Create a pipeline in Jenkins
Create a new pipeline job.


Step # 4 - Copy the pipeline code from below
Make sure you change the values below as per your settings:
Update the ECR registry URL with your AWS account ID and region.
Make sure the kubeconfig credentials ID matches the one you created in step # 2 (K8S).

pipeline {
    tools {
        maven 'Maven3'
    }
    agent any

    environment {
        registry = "account_id.dkr.ecr.us-east-1.amazonaws.com/coachak/my-docker-repo"
    }

    stages {
        stage('Cloning Git') {
            steps {
                checkout([$class: 'GitSCM', branches: [[name: '*/main']], doGenerateSubmoduleConfigurations: false, extensions: [], submoduleCfg: [], userRemoteConfigs: [[credentialsId: '', url: 'https://github.com/akannan1087/springboot-app']]])
            }
        }

        stage('Build') {
            steps {
                sh 'mvn clean install'
            }
        }

        // Building the Docker image and tagging it with the Jenkins build number
        stage('Building image') {
            steps {
                script {
                    dockerImage = docker.build registry
                    dockerImage.tag("$BUILD_NUMBER")
                }
            }
        }

        // Uploading the Docker image into AWS ECR
        stage('Pushing to ECR') {
            steps {
                script {
                    sh 'aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin account_id.dkr.ecr.us-east-1.amazonaws.com'
                    sh 'docker push account_id.dkr.ecr.us-east-1.amazonaws.com/coachak/my-docker-repo:$BUILD_NUMBER'
                }
            }
        }

        // Avoid the latest tag and pass the build ID dynamically from the Jenkins pipeline
        stage('K8S Deploy') {
            steps {
                script {
                    withKubeConfig([credentialsId: 'K8S', serverUrl: '']) {
                        echo "Current build number is: ${env.BUILD_ID}"
                        // Replace the placeholder in the deployment manifest with the build number
                        sh """
                        sed -i 's/\${BUILD_NUMBER}/${env.BUILD_ID}/g' eks-deploy-k8s.yaml
                        """
                        sh 'kubectl apply -f eks-deploy-k8s.yaml -n springboot-app-ns'
                    }
                }
            }
        }
    }
}

Step # 5 - Build the pipeline
Once you create the pipeline and change the values per your configuration, click on Build Now:


Step # 6 - Verify deployments to K8S

kubectl get deployments -n springboot-app-ns


kubectl get pods -n springboot-app-ns


kubectl get services -n springboot-app-ns


If you see any errors after deploying the pods, you can check the pod logs.
kubectl logs <pod_name> -n springboot-app-ns

Step # 7 - Access SpringBoot App in K8S cluster
Once the build is successful, go to your browser and enter the load balancer URL from the kubectl get services output above:
http://loadbalancer_ip_address

You should see a page like the one below:



Note:

Make the necessary changes in eks-deploy-k8s.yaml to pull the Docker image from your AWS ECR repo (see the sketch below).
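For reference, a minimal sketch of what eks-deploy-k8s.yaml might contain: a Deployment whose image points at your ECR repo (with the ${BUILD_NUMBER} placeholder the pipeline rewrites) and a Service of type LoadBalancer, which is what provides the load balancer URL used in Step # 7. The names and container port here are assumptions:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: springboot-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: springboot-app
  template:
    metadata:
      labels:
        app: springboot-app
    spec:
      containers:
      - name: springboot-app
        # Pull the image tagged with the Jenkins build number from your ECR repo
        image: account_id.dkr.ecr.us-east-1.amazonaws.com/coachak/my-docker-repo:${BUILD_NUMBER}
        ports:
        - containerPort: 8080
---
apiVersion: v1
kind: Service
metadata:
  name: springboot-app-svc
spec:
  type: LoadBalancer
  selector:
    app: springboot-app
  ports:
  - port: 80
    targetPort: 8080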
