Table of contents
- Prerequisites
- Setting Up the Infrastructure
- Step-by-Step Process:
- Writing YAML Pipeline
- Step-01: Create a project in Azure DevOps
- Step-02: Clone the Repo in Azure DevOps
- Step-03: To configure Self-hosted Linux agents in Azure DevOps
- Step-04: Setup AKS Cluster
- Step-05: Install SonarQube Extension.
- Step-06: Setup SonarQube.
- Step-07: Create Service Principal Account
- Step-08: Configure the Service Connection for "Azure Resource Manager"
- Step-09: Configure the Service Connections for SonarQube, Docker Registry, and AKS Cluster
- Step-10: Step-by-Step YAML configuration
- Step-11: Clean up the images and container registry using the pipeline.
- Step-12: Environment Cleanup:
- Conclusion
- Additional Resources
This document provides a step-by-step guide to setting up a multi-stage YAML CI/CD pipeline in Azure DevOps. It covers setting up an Azure account, creating an organization, setting up a project, writing the YAML pipeline, and deploying an application to an AKS cluster.
Prerequisites
Before diving into this project, here are some skills and tools you should be familiar with:
[x] Setting Up Azure Account
Go to Azure Portal.
Create a new Microsoft account if you don't have one.
Provide necessary details like name, mobile number, and payment information.
[x] Creating Azure DevOps Organization
Go to Azure DevOps.
Create a new organization.
Set up a new project within the organization.
[x] Clone the repository for the Terraform code.
Note: Replace resource names and variables in the Terraform code as per your requirements - update terraform.tfvars.
[x] Azure Account: You'll need an Azure account to create resources like virtual machines, AKS clusters, and manage pipelines.
[x] Terraform Knowledge: Familiarity with Terraform for provisioning, managing, and cleaning up infrastructure.
[x] Basic Kubernetes (AKS): A basic understanding of Kubernetes, especially Azure AKS, to deploy and manage containers.
[x] Docker Knowledge: Basic knowledge of Docker for containerizing applications.
[x] GitHub: Experience with GitHub for version control and managing repositories.
[x] Command-Line Tools: Basic comfort with using the command line for managing infrastructure and services.
[x] Basic CI/CD Knowledge: Some understanding of Continuous Integration and Deployment is recommended.
[x] Azure Container Registry (ACR): Set up ACR to store your Docker images.
[x] Linux VM: Docker must be installed on a Linux virtual machine to run containers.
Setting Up the Infrastructure
I have written Terraform code that sets up the entire infrastructure automatically, including the installation of the required applications and tools and the creation of the AKS cluster.
Note:
⇒ AKS cluster creation will take approx. 10 to 15 minutes.
⇒ A virtual machine will be created, named "devopsdemovm"
⇒ Docker install
⇒ Azure CLI install
⇒ kubectl install
⇒ AKS cluster setup
⇒ SonarQube install
⇒ Trivy install
⇒ Maven install
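For reference, here is a rough sketch of what such a provisioning (user_data) script does. This is not the exact code from the repo's scripts folder, and package sources and versions may differ:

#!/bin/bash
# Rough sketch of the VM provisioning performed by the Terraform user_data/scripts.
sudo apt-get update -y

# Docker
sudo apt-get install -y docker.io
sudo usermod -aG docker azureuser

# Azure CLI (Microsoft's documented one-liner for Debian/Ubuntu)
curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash

# kubectl
sudo snap install kubectl --classic

# Maven (also pulls in a JDK)
sudo apt-get install -y maven

# Trivy is installed from its apt repository; SonarQube is typically run as a
# container listening on port 9000, for example:
sudo docker run -d --name sonarqube -p 9000:9000 sonarqube:lts-community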
Virtual Machine creation
First, we'll create the necessary virtual machine using the Terraform code.
Once you clone the repo, you will see the following files:
$ ls -l
Mode LastWriteTime Length Name
---- ------------- ------ ----
dar--l 26/12/24 7:16 PM pipeline
dar--l 23/12/24 3:38 PM scripts
-a---l 25/12/24 2:31 PM 600 .gitignore
-a---l 26/12/24 9:29 PM 6571 EC2.tf
-a---l 26/12/24 9:29 PM 892 main.tf
-a---l 26/12/24 9:29 PM 567 output.tf
-a---l 26/12/24 9:29 PM 269 provider.tf
-a---l 26/12/24 9:30 PM 223 terraform.tfvars
-a---l 26/12/24 9:30 PM 615 variable.tf
Now, run the following Terraform commands.
terraform init
terraform fmt
terraform validate
terraform plan
terraform apply
# Optional <terraform apply --auto-approve>
Once the Terraform run completes, verify the following to confirm everything was set up correctly.
Inspect the cloud-init logs:
Once connected to the VM, you can check the status of the user_data script by inspecting the log files.
# Primary log file for cloud-init
sudo tail -f /var/log/cloud-init-output.log
or
sudo cat /var/log/cloud-init-output.log | more
If the user_data script runs successfully, you will see output logs and any errors encountered during execution.
If there’s an error, this log will provide clues about what failed.
Verify the Installation
- [x] Docker version
ubuntu@ip-172-31-95-197:~$ docker --version
Docker version 24.0.7, build 24.0.7-0ubuntu4.1
docker ps -a
ubuntu@ip-172-31-94-25:~$ docker ps
- [x] kubectl version
ubuntu@ip-172-31-89-97:~$ kubectl version
Client Version: v1.31.1
Kustomize Version: v5.4.2
- [x] Azure CLI version
azureuser@devopsdemovm:~$ az version
{
"azure-cli": "2.67.0",
"azure-cli-core": "2.67.0",
"azure-cli-telemetry": "1.1.0",
"extensions": {}
}
- [x] Trivy version
azureuser@devopsdemovm:~$ trivy --version
Version: 0.58.1
Vulnerability DB:
Version: 2
UpdatedAt: 2024-12-31 18:16:49.801727569 +0000 UTC
NextUpdate: 2025-01-01 18:16:49.801727188 +0000 UTC
DownloadedAt: 2024-12-31 23:32:58.310973832 +0000 UTC
Java DB:
Version: 1
UpdatedAt: 2024-12-30 05:06:53.947339247 +0000 UTC
NextUpdate: 2025-01-02 05:06:53.947339087 +0000 UTC
DownloadedAt: 2025-01-01 00:25:50.06486541 +0000 UTC
- [x] Maven version
azureuser@devopsdemovm:~$ mvn --version
Apache Maven 3.6.3
Maven home: /usr/share/maven
Java version: 17.0.13, vendor: Ubuntu, runtime: /usr/lib/jvm/java-17-openjdk-amd64
Default locale: en, platform encoding: UTF-8
OS name: "linux", version: "6.5.0-1025-azure", arch: "amd64", family: "unix"
azureuser@devopsdemovm:~$
Step-by-Step Process:
1. Setting up Azure DevOps Pipeline:
Create an Azure DevOps Organization: If you don’t have one already, create a new Azure DevOps organization where your repositories and pipelines will reside.
Set up Agent Pool: Define an agent pool in Azure DevOps, which will handle the execution of your pipelines.
Install Docker on the Agent: Docker needs to be installed on the machine that will run the pipeline. Without Docker, you won’t be able to build or push Docker images.
2. Creating the Pipeline for Build and Push:
Define Pipeline YAML: Write a YAML configuration file to define the pipeline. This file specifies stages like build and push, and Docker is used to automate the creation of images.
Configure Docker Build: In the build stage, Docker is used to build images from your Dockerfile. The push stage uploads these images to the Azure Container Registry (ACR).
3. Testing the Pipeline:
Make Changes and Test: After setting up the pipeline, make minor changes to test whether the pipeline triggers as expected. For instance, adding a space to a Dockerfile or modifying a JavaScript file in the results directory should trigger a build for that microservice only.
Verify Docker Image Creation: Ensure the Docker images are being built and pushed to ACR without issues.
Handle Platform-Specific Builds: If the pipeline fails due to architecture issues, ensure the correct platform (Linux/ARM64) is specified in the Dockerfile.
Writing YAML Pipeline
Trigger Configuration: Define the trigger for the pipeline.
Pool Definition: Specify the agent pool and agent.
Stages:
Compile Stage: Compile the application using Maven.
Test Stage: Run unit tests using Maven.
Trivy File System Scan: Scan the file system using Trivy.
SonarQube Analysis: Perform code analysis using SonarQube.
Publish Artifacts: Publish build artifacts to Azure Artifacts.
Docker Build: Build Docker image.
Docker Publish: Publish Docker image to ACR.
Deploy to AKS: Deploy the application to AKS.
Step-01: Create a project in Azure DevOps
Open the Azure DevOps portal: https://dev.azure.com/<name>
Create a new project
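Optionally, the project can also be created from the CLI. A minimal sketch, assuming you are already logged in with az login and have the azure-devops extension (the same extension used later in Step-11); the project name is a placeholder:

# Install the Azure DevOps CLI extension (once per machine)
az extension add --name azure-devops

# Create the project in your organization
az devops project create \
  --name "devops-demo" \
  --organization "https://dev.azure.com/<name>" \
  --visibility private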
Step-02: Clone the Repo in Azure DevOps
Clone the repo: click on "Import a repository".
Create/configure a pipeline in Azure DevOps: click on Pipelines and follow the instructions below to build it.
It will ask you to log in with your Azure account. Please use the same login credentials that you have set up for the Azure portal.
Select the container registry
Note: Key concept overview of pipeline.
You will see the following pipeline YAML, which we will modify accordingly.
First, we will create a folder in the repo called 'scripts' and add a shell script that generates an updated image tag whenever a new image is built, as shown below.
Don't forget to update the container registry name in the script file.
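The exact script lives in the repo's scripts folder; below is a hypothetical sketch of such a tag script, with placeholder registry and repository names. The pipeline stages shown later tag images as latest, so treat this purely as an illustration of the idea:

#!/bin/bash
# Hypothetical sketch: derive the next numeric image tag from the tags already in ACR.
# ACR_NAME and REPOSITORY are placeholders - replace them with your own values.

ACR_NAME="<your-acr-name>"   # container registry name (without .azurecr.io)
REPOSITORY="dev"             # image repository used by the pipeline

# Highest numeric tag currently in the registry (empty if none exist yet)
latest=$(az acr repository show-tags \
  --name "$ACR_NAME" \
  --repository "$REPOSITORY" \
  --output tsv 2>/dev/null | grep -E '^[0-9]+$' | sort -n | tail -1)

next=$(( ${latest:-0} + 1 ))
echo "New image tag: $next"

# Expose the tag to later pipeline steps as the variable IMAGE_TAG
echo "##vso[task.setvariable variable=IMAGE_TAG]$next"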
Step-03: To configure Self-hosted Linux agents in Azure DevOps
Integrate a self-hosted Linux agent with Azure DevOps.
Select the project and go to Project Settings.
Select Agent pools and give the pool a name; you can choose any name, e.g.
devops-demo_vm
Run the following commands as part of setting up the agent on the VM. The agent package has already been downloaded and extracted:
azureuser@devopsdemovm:~/myagent$ ls -l
total 144072
drwxrwxr-x 26 azureuser azureuser 20480 Nov 13 10:54 bin
-rwxrwxr-x 1 azureuser azureuser 3173 Nov 13 10:45 config.sh
-rwxrwxr-x 1 azureuser azureuser 726 Nov 13 10:45 env.sh
drwxrwxr-x 7 azureuser azureuser 4096 Nov 13 10:46 externals
-rw-rw-r-- 1 azureuser azureuser 9465 Nov 13 10:45 license.html
-rw-rw-r-- 1 azureuser azureuser 3170 Nov 13 10:45 reauth.sh
-rw-rw-r-- 1 azureuser azureuser 2753 Nov 13 10:45 run-docker.sh
-rwxrwxr-x 1 azureuser azureuser 2014 Nov 13 10:45 run.sh
-rw-r--r-- 1 root root 147471638 Nov 13 12:22 vsts-agent-linux-x64-4.248.0.tar.gz
- Configure the agent:
~/myagent$ ./config.sh
Type 'Y' to accept the license agreement.
# Server URL
Azure Pipelines: https://dev.azure.com/{your-organization}
# e.g. https://dev.azure.com/mrbalraj
You need to create a PAT (Personal Access Token):
go to Azure DevOps User settings and click on Personal access tokens.
Give the agent a name, e.g.
agent-1 # I used this in the pipeline.
or
devops-demo_vm
The agent is still offline.
- Optionally, run the agent interactively (if you didn't configure it to run as a service):
~/myagent$ ./run.sh &
Now the agent is online ;-)
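Alternatively, you can register the agent as a systemd service so it stays online across reboots - a short sketch using the service scripts bundled with the agent:

# Run from the agent directory after ./config.sh has completed
cd ~/myagent

# Install and start the agent as a systemd service (running as user 'azureuser')
sudo ./svc.sh install azureuser
sudo ./svc.sh start

# Check the service status
sudo ./svc.sh status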
We also need to update the agent pool and agent name referenced by the pipeline accordingly.
Step-04: Setup AKS Cluster
Go to the Azure portal and select the AKS cluster.
Open an SSH (PuTTY) session to the Azure VM and follow these instructions to log in to Azure and Kubernetes.
Azure login: there is no browser on headless servers, so use the --use-device-code flag to authenticate:
az login --use-device-code
To list all accounts:
az account list --output table
- To get resource details
az aks list --resource-group "resourceGroupName" --output table
[x] Verify the AKS cluster.
To get credentials for AKS
az aks get-credentials --name "Clustername" --resource-group "ResourceGroupName" --overwrite-existing
kubectl config current-context
kubectl get nodes
kubectl get nodes -o wide
Step-05: Install SonarQube Extension.
Now, we will integrate SonarQube into the same pipeline. We need to search for it in the marketplace and select SonarQube as shown below.
Select the organization
Step-06: Setup SonarQube.
Note the agent's public IP address and access SonarQube on port 9000.
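If the page does not load, a quick sanity check on the agent VM (a sketch; the container name and port mapping may differ in your setup):

# Confirm the SonarQube container is running and listening on port 9000
docker ps --filter "publish=9000"
curl -I http://localhost:9000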
Step-06.1: Generate the SonarQube token
Step-07: Create Service Principal Account
We will create a service principal account.
Open a PuTTY session to the agent VM and do the following:
az login --use-device-code
To create an SP account:
az ad sp create-for-rbac --name <name_of_SP> --role="Contributor" --scopes="/subscriptions/<SUBSCRIPTION_ID>"
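For reference, here is the shape of the output; the appId, password, and tenant values are what the service connections in the next steps will ask for (all values below are placeholders, not real credentials):

az ad sp create-for-rbac --name "devops-demo-sp" \
  --role "Contributor" \
  --scopes "/subscriptions/<SUBSCRIPTION_ID>"
# Typical output:
# {
#   "appId": "<client-id>",
#   "displayName": "devops-demo-sp",
#   "password": "<client-secret>",
#   "tenant": "<tenant-id>"
# }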
Step-08: Configure the Service Connection for "Azure Resource Manager"
- Steps to configure the connection for Azure Resource Manager:
Step-09: Configure the Service Connections for SonarQube, Docker Registry, and AKS Cluster
Select the project and go to Project Settings:
Steps to configure connection for SonarQube:
Steps to configure connection for Docker registry:
Steps to configure connection for Kubernetes:
A popup will appear asking for login credentials. Use the same credentials you used for the UI portal login.
Step-10: Step-by-Step YAML configuration
Configure the Maven (package) stages as follows.
Stage: Maven Compile
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
trigger:
- none

pool:
  name: devops-demo_vm
  demands: agent.name -equals agent-1

stages:
- stage: CompileJob
  displayName: 'Maven Compile'
  jobs:
  - job: maven_compile
    displayName: 'maven_compile'
    steps:
    - task: Maven@4
      inputs:
        azureSubscription: 'azure-conn'
        mavenPomFile: 'pom.xml'
        goals: 'compile'
        publishJUnitResults: true
        testResultsFiles: '**/surefire-reports/TEST-*.xml'
        javaHomeOption: 'JDKVersion'
        mavenVersionOption: 'Default'
        mavenAuthenticateFeed: false
        effectivePomSkip: false
        sonarQubeRunAnalysis: false
Stage: Maven Test
- stage: Test
  displayName: 'Maven Test'
  jobs:
  - job: maven_test
    displayName: 'Unit_Test'
    steps:
    - task: Maven@4
      inputs:
        azureSubscription: 'azure-conn'
        mavenPomFile: 'pom.xml'
        goals: 'test'
        publishJUnitResults: true
        testResultsFiles: '**/surefire-reports/TEST-*.xml'
        javaHomeOption: 'JDKVersion'
        mavenVersionOption: 'Default'
        mavenAuthenticateFeed: false
        effectivePomSkip: false
        sonarQubeRunAnalysis: false
Run the pipeline to validate it.
Stage: Trivy FS Scan
# Add Trivy FS scan stage.
- stage: Trivy_FS_Scan
  displayName: 'Trivy_FS_Scan'
  jobs:
  - job: Trivy_FS_Scan
    displayName: 'Trivy FS Scan'
    steps:
    - task: CmdLine@2
      inputs:
        script: 'trivy fs --format table -o trivy-fs-report.html .'
Stage: SonarQube Analysis
Use the helper task assistant, as shown below, to complete the SonarQube stage.
# Add SonarQube
- stage: SonarQube
  displayName: 'SonarAnalysis'
  jobs:
  - job: SonarQube_analysis
    steps:
    - task: SonarQubePrepare@7
      inputs:
        SonarQube: 'sonar-conn'
        scannerMode: 'cli'
        configMode: 'manual'
        cliProjectKey: 'bankapp'
        cliProjectName: 'bankapp'
        cliSources: '.'
        extraProperties: 'sonar.java.binaries=.'
    - task: SonarQubeAnalyze@7
      inputs:
        jdkversion: 'JAVA_HOME_17_X64'
Run the pipeline to validate it.
Stage: Build Package and Publish Artifacts
Add a build/package stage and publish the artifacts to Azure Artifacts.
Click on the feed settings and change the permissions as shown below.
Build the project and then add Maven authentication.
# To publish the artifacts.
- stage: Publish_Artifact
  displayName: 'Publish_Build_Artifacts'
  jobs:
  - job: publish_artifacts
    displayName: 'Publish_build_Artifacts'
    steps:
    - task: MavenAuthenticate@0
      inputs:
        artifactsFeeds: 'store_artifact_maven'
    - task: Maven@4
      inputs:
        azureSubscription: 'azure-conn'
        mavenPomFile: 'pom.xml'
        goals: 'deploy'
        publishJUnitResults: true
        testResultsFiles: '**/surefire-reports/TEST-*.xml'
        javaHomeOption: 'JDKVersion'
        mavenVersionOption: 'Default'
        mavenAuthenticateFeed: false
        effectivePomSkip: false
        sonarQubeRunAnalysis: false
The next step is to click on Artifacts.
Run the pipeline; the pipeline status is shown below.
I got the error message below.
Solution:
- Add the Azure Artifacts feed to both the repositories and distributionManagement sections of your pom.xml: replace them with the snippet below and commit the change.
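A minimal sketch of what these pom.xml sections typically look like for an Azure Artifacts feed, assuming the feed is named store_artifact_maven (matching the MavenAuthenticate task); ORGANIZATION and PROJECT are placeholders, and the PROJECT segment is present only for project-scoped feeds:

<!-- repositories: where Maven resolves dependencies from -->
<repositories>
  <repository>
    <id>store_artifact_maven</id>
    <url>https://pkgs.dev.azure.com/ORGANIZATION/PROJECT/_packaging/store_artifact_maven/maven/v1</url>
    <releases><enabled>true</enabled></releases>
    <snapshots><enabled>true</enabled></snapshots>
  </repository>
</repositories>

<!-- distributionManagement: where 'mvn deploy' publishes the artifact -->
<distributionManagement>
  <repository>
    <id>store_artifact_maven</id>
    <url>https://pkgs.dev.azure.com/ORGANIZATION/PROJECT/_packaging/store_artifact_maven/maven/v1</url>
  </repository>
</distributionManagement>

The id should match the feed name used by the MavenAuthenticate@0 task so the generated credentials are picked up.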
Re-run the pipeline; the pipeline status is shown below.
Validate the artifacts:
Stage: Docker Build Image
# Docker image build
- stage: Docker_Build
  displayName: 'Docker_Build'
  jobs:
  - job: docker_build
    displayName: 'docker_build'
    steps:
    - task: CmdLine@2
      inputs:
        script: 'mvn package'
    - task: Docker@2
      inputs:
        containerRegistry: 'docker-conn'
        repository: 'dev'
        command: 'build'
        Dockerfile: '**/Dockerfile'
        tags: 'latest'
Stage: Trivy Image Scan
# Add Trivy image scan.
- stage: Trivy_image_Scan
  displayName: 'Trivy_image_Scan'
  jobs:
  - job: Trivy_image_Scan
    displayName: 'Trivy image Scan'
    steps:
    - task: CmdLine@2
      inputs:
        script: 'trivy image --format table -o trivy-image-report.html aconreg6700da08.azurecr.io/dev:latest'
Stage: Docker Push Image
# To push the Docker image
- stage: Docker_Publish
  displayName: 'Docker_Publish'
  jobs:
  - job: docker_Publish
    displayName: 'docker_Publish'
    steps:
    - task: Docker@2
      inputs:
        containerRegistry: 'docker-conn'
        repository: 'dev'
        command: 'push'
        tags: 'latest'
Stage: Deploy on AKS
# To Deploy on K8s
- stage: deploy_to_k8s
  displayName: 'Deploy to AKS'
  jobs:
  - job: deploy_to_k8s
    displayName: 'Deploy to AKS'
    steps:
    - task: KubernetesManifest@1
      inputs:
        action: 'deploy'
        connectionType: 'kubernetesServiceConnection'
        kubernetesServiceConnection: 'k8s-conn'
        namespace: 'default'
        manifests: 'ds.yml'
- Complete YAML pipeline
Pipeline status:
Update the image name in the manifest file.
View status in Azure UI: AKS Cluster
az login
azureuser@devopsdemovm:~$ kubectl get all
NAME READY STATUS RESTARTS AGE
pod/bankapp-7ddf494bdd-2ljrl 1/1 Running 2 (17m ago) 18m
pod/mysql-5dcf64c95c-xqhdn 1/1 Running 0 18m
NAME TYPE CLUSTER-IP EXTERNAL-IP PORT(S) AGE
service/bankapp-service LoadBalancer 10.0.95.5 <pending> 80:31781/TCP 18m
service/kubernetes ClusterIP 10.0.0.1 <none> 443/TCP 3h36m
service/mysql-service ClusterIP 10.0.151.134 <none> 3306/TCP 18m
NAME READY UP-TO-DATE AVAILABLE AGE
deployment.apps/bankapp 1/1 1 1 18m
deployment.apps/mysql 1/1 1 1 18m
NAME DESIRED CURRENT READY AGE
replicaset.apps/bankapp-7ddf494bdd 1 1 1 18m
replicaset.apps/mysql-5dcf64c95c 1 1 1 18m
azureuser@devopsdemovm:~$ kubectl get nodes -o wide
NAME STATUS ROLES AGE VERSION INTERNAL-IP EXTERNAL-IP OS-IMAGE KERNEL-VERSION CONTAINER-RUNTIME
aks-system-18075837-vmss000000 Ready <none> 3h35m v1.31.2 10.224.0.4 52.183.119.235 Ubuntu 22.04.5 LTS 5.15.0-1075-azure containerd://1.7.23-1
azureuser@devopsdemovm:~$
Now, try to get all resources, and you will notice there is an error related to "ImagePullBackOff".
kubectl get all
I am getting the error message below.
Solution: since we are using a private registry, we need to use imagePullSecrets.
Go to the Azure container registry and get the password, which will be used in the command below.
- Command to create the ACRImagePullSecret:
kubectl create secret docker-registry <secret-name> \
--namespace <namespace> \
--docker-server=<container-registry-name>.azurecr.io \
--docker-username=<service-principal-ID> \
--docker-password=<service-principal-password>
Explanation of the Command:
kubectl create secret docker-registry:
- This creates a new Kubernetes secret of type docker-registry.
<secret-name>:
- The name of the secret being created. For example, acr-credentials.
--namespace <namespace>:
- Specifies the namespace in which the secret will be created. If omitted, it defaults to the default namespace.
Replace <namespace> with the desired namespace name.
--docker-server=<container-registry-name>.azurecr.io:
- The URL of your container registry. For Azure Container Registry (ACR), the format is <container-registry-name>.azurecr.io.
Replace <container-registry-name> with your ACR name.
--docker-username=<service-principal-ID>:
- The username to authenticate with the container registry. For Azure, this is typically a service principal's application (client) ID.
--docker-password=<service-principal-password>:
- The password (or secret) associated with the service principal used for authentication.
To get a token, click on the container registry.
- To get secret details
kubectl get secret
- To create secret
kubectl create secret docker-registry acr-credentials \
--namespace default \
--docker-server=aconregee7b05ba.azurecr.io \
--docker-username=aconregee7b05ba \
--docker-password=<token>
- Command to delete the secret:
kubectl delete secret acr-credentials --namespace default
Now, we will update the service-deployment.yaml as shown below:
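A minimal sketch of the relevant part of such a manifest, assuming the deployment is named bankapp and the image comes from the private ACR used above; the container port is an assumption:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: bankapp
spec:
  replicas: 1
  selector:
    matchLabels:
      app: bankapp
  template:
    metadata:
      labels:
        app: bankapp
    spec:
      imagePullSecrets:
      - name: acr-credentials                        # the secret created above
      containers:
      - name: bankapp
        image: aconregee7b05ba.azurecr.io/dev:latest # private ACR image
        ports:
        - containerPort: 8080                        # assumption: app listens on 8080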
here is the updated service status
- To check the deployment:
kubectl get deploy bankapp -o yaml
- Verify services and try to access the application
kubectl get svc
kubectl get node -o wide
kubectl describe node aks-system-18075837-vmss000000
Then access it at http://52.183.119.235:31781.
If the page does not open, we have to open the port in the NSG.
On the Azure portal, navigate to the VMSS (virtual machine scale set) for the cluster nodes and select the VMSS.
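If you prefer the CLI, a hedged equivalent is shown below; the NSG for the AKS nodes lives in the managed node resource group (usually named MC_...), and all names here are placeholders:

# Find the NSG in the AKS managed (node) resource group
az network nsg list --resource-group <MC_node_resource_group> --output table

# Allow inbound traffic to the NodePort used by the service (31781 in this example)
az network nsg rule create \
  --resource-group <MC_node_resource_group> \
  --nsg-name <nsg-name> \
  --name AllowNodePort31781 \
  --priority 1100 \
  --direction Inbound \
  --access Allow \
  --protocol Tcp \
  --destination-port-ranges 31781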
Now, try to access it again at http://PublicIPAddress:31781.
Congratulations :-) the application is working and accessible.
Note: I have updated the fully automated CI_pipeline: Bank_App
Step-11: Clean up the images and container registry using the pipeline.
First, create a Service Connection in Azure DevOps.
Once you create the connection, make sure to note down the connection ID, as it will be used in the pipeline. On the agent machine, ensure you are logged in to Azure and that the connection is active. If not, log in using the following steps.
az login --use-device-code
Using Azure DevOps CLI
- Install the Azure DevOps CLI extension if not already installed:
az extension add --name azure-devops
Sign in and configure the Azure DevOps organization:
az devops configure --defaults organization=https://dev.azure.com/{organization} project={project}
- Replace {organization} and {project} with your details.
List service connections:
az devops service-endpoint list --query "[].{Name:name, ID:id}"
This command will return a list of service connections with their names and IDs.
Delete all the images along with the repository.
- Delete all images from the container registry repository:
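A hedged sketch of doing this from the CLI, with <acr-name> as a placeholder for your registry:

# List repositories in the registry
az acr repository list --name <acr-name> --output table

# Delete a single image tag
az acr repository delete --name <acr-name> --image dev:latest --yes

# Or delete the whole 'dev' repository (all tags and manifests)
az acr repository delete --name <acr-name> --repository dev --yes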
Step-12: Environment Cleanup:
As we are using Terraform, we will use the following commands to clean up.
Delete all deployments and services first:
kubectl delete service/bankapp-service
kubectl delete service/mysql-service
kubectl delete deployment.apps/bankapp
kubectl delete deployment.apps/mysql
Now it's time to delete the AKS cluster and virtual machine:
terraform destroy --auto-approve
Conclusion
By following these steps, you can set up a complete CI/CD pipeline in Azure DevOps, from provisioning the infrastructure to deploying an application to an AKS cluster.
Ref Link: