DevOps Record
EXERCISE-1
Get an understanding of the stages in the software development lifecycle, the process models, the values and principles of agility, and the need for agile software development. This will enable you to work in projects following an agile approach to software development.
Introduction
DevOps is not a new tool or technology in the market. It is a new philosophy, culture, or process for developing, releasing, and maintaining software, applications, or products with higher quality and at a much faster pace.
Development Group
Ex:
Business Analyst
System Analyst
Design Architect
Operations Group
The people who are involved in
Release
Deployment
Operate
Monitoring are considered the Operations Group
Ex:
Release Engineers
Configuration Engineers
System Admin
Fig: DevOps
To understand this new DevOps culture, we have to be aware of the already existing Software Development Life Cycle (SDLC) models.
Waterfall Model
Prototype Model
Incremental or Iterative Model
Waterfall Model
Waterfall refers to the linear or sequential approach of developing software.
Under the Waterfall model, the Software Development Life Cycle (SDLC) is divided into different phases such as requirements gathering, analysis, coding, testing, and delivery.
Agile Model
Co-location of the team and the client is required for efficient communication, which is not always possible.
DevOps
DevOps Vs Agile:
DevOps and Agile are different models.
Similarities:
Differences:
QUIZ-1
Q1. Which of the following benefits does Agile NOT offer in comparison to Waterfall
approach?
B) Final product is visible at the end of the project only in Agile method of software
Development.
C) Testing after each iteration ensures that bugs are caught early in Agile method of
Software development.
development.
B and D
A and B
B, C and D
A, B and D
Q2. In Scrum, I am responsible for the return on investment, goals and the vision of the
project. I am responsible for the product backlog and the release date. Who am I?
Product owner
Scrum master
Project manager
Q3. The MOST efficient and effective method of conveying information to and within a
development team is .
Phone call
Q4. Which core practice of Kanban helps understand the activities being done and the
various stages that lead to completion?
Set WIP
Q5. Consider the following scenario and choose the statement which you think are
TRUE.
The IT team management at Pura Vida company has decided to adopt DevOps and has drawn a roadmap for the journey. This information is communicated to all the team members (developers, testers, architects, and the operations team, which includes infrastructure, system administration and deployment). Choose the option(s) which is/are TRUE.
The implementation is the responsibility of the managers who should drive this.
Managers express that the buy-in should be there from the business(customers) and the top
management in Pura Vida.
IT Management team members feel that additional roles will be required (other than Scrum
master, Product owner and Dev team) to execute this
The development and test teams feel that this is the responsibility of Operations and they
have no role to play here
Q6. Which of the following statement(s) are CORRECT about Continuous Integration
(CI)?
Q7. Tom, a Dev team member, has mentioned in a daily scrum meeting that he is unable
to proceed with his work due to unavailability of a software library. He also stated that
the library is available in another project team within the company. What would be the
most appropriate corrective action in this scenario?
Product Owner should speak to peer project team to make the software library available
Team should use internet and try to download a free version of the software library
Scrum Master should get the issue resolved by speaking to the other project team and
Q8. Which of the following XP practices ensures 100% code coverage, review, and that no extra line of code is written?
Refactoring
Continuous integration
Q9. Which of the following statements are TRUE with respect to Kanban?
B) WIP limit ensures that the team does not commit beyond capacity
B) Unit testing
C) Code coverage
D) Deployment to production
EXERCISE-2
QUIZ-2
Q1. What are some good use cases for extreme programming?
D) When teams need to get away from meetings and “just code”
A) Exploration
B) Promotion
C) Steering
D) Release
B) It reduces risk
Question 4
Which pair programming strategy involves one developer creating a test and the other
developer creating code to satisfy the test?
A) Unstructured pairing
B) Driver/navigator pairing
C) Distributed pairing
D) Ping-pong pairing
Question 5
What are some benefits of test-driven development?
Question 6
What is the second phase of the test-driven development cycle?
Confirm test fails
Refactor
Question 7
Which statement best describes the importance of the customer role in XP?
Only the customer knows the budget
Question 8
Which statement best describes the difference between source control and version
control?
A) Source control specifically manages code, while version control applies versioning
B) Version control specifically manages code, while source control includes other types
of files like binaries
C) Version control specifically manages code, while source control applies versioning
D) Source control specifically manages code, while version control includes other types
of files like binaries
Question 9
Which operation is used to merge code from one branch to another?
a. Push
b. Clone
c. Branch
d. Pull
Question 10
Which statement best describes the difference between continuous integration (CI) and
continuous deployment (CD)?
distribution
CI manages automated builds and deployments, while CD is a process for customer
feedback
Question 11
Which operations are used to implement continuous integration in GitHub?
A) Actions
B) Releases
C) Builds
D) Pull requests
Question 12
Which category of coding standards can be left in violation if there are agreed upon
reasons to do so?
A) Recommendation
B) Guideline
C) Optional
D) Mandatory
Question 13
What are some potential negatives to collective code ownership?
Slower learning
Reliance on team expertise
Decreased motivation
Increased costs
Question 14
What are some benefits of code refactoring?
A) Increases performance
B) Increases maintainability
C) Reduces costs
D) Increases extensibility
Question 15
What are some effective refactoring strategies?
a. Reduce duplication
b. Reduce method length
Question 16
What is the maximum amount of time recommended between small releases in Agile
software development?
One day
Two weeks
There is no maximum
One week
Question 17
What are some benefits of system metaphors?
Question 18
What should be the first step when implementing a 40-hour work week?
A) Collect metrics
B) Experiment
D) Offer overtime pay for anyone that works more than 40 hours
EXERCISE-3
Code
The development teams use some tools and plugins like Git to streamline the
development process
Build
In this stage, once developers finish their tasks, they commit the code to the shared code repository, and the code is built using tools like Maven and Gradle.
Test
Once the build is ready, it is deployed to the test environment first to perform several types of testing, such as user acceptance testing, security testing, integration testing, and performance testing, using tools like JUnit, Selenium, etc., to ensure software quality.
Release:
Once the build passes all tests, the operations team schedules the releases or deploys
multiple releases to production, depending on the organizational needs.
Deploy
In this stage, Infrastructure-as-Code helps build the production environment and then releases the build with the help of different tools like Ansible, Puppet, Chef, Docker, Kubernetes, etc.
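As a rough sketch (not taken from a specific project), a containerized release could be built and started with Docker commands such as the following; the image name sample-app and the port 8080 are placeholders:
$ docker build -t sample-app:1.0 .              # build an image from the project's Dockerfile
$ docker run -d -p 8080:8080 sample-app:1.0     # run the release as a container on the target host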
Operate
The release is now live and in use by customers.
The operations team at this stage takes care of server configuring and provisioning
using tools like Chef.
Monitor
In this stage, the DevOps pipeline is monitored based on data collected from customer
behavior, application performance, etc.
Continuous Development
🞂 This phase focuses on project planning and coding.
🞂 Project requirements are gathered and discussed with stakeholders.
Tools
🞂 There are no specific tools for planning
🞂 The development team requires tools for maintaining and versioning the code; GitLab, Git, TFS, SVN, Mercurial, Jira, Bitbucket, Confluence, and Subversion are a few tools used for version control and collaboration.
Continuous Integration
🞂 In this phase, updated code or add-on functionalities and features are developed and
integrated into existing code.
🞂 Bugs are detected and identified in the code during this phase at every step
through unit testing, and then the source code is modified accordingly.
Tools:
🞂 Jenkins, Bamboo, GitLab CI, Buddy, TeamCity, Travis CI, and Circle CI are a few DevOps tools used to make the project workflow smooth and more productive.
Continuous Testing
🞂 Quality analysts continuously test the software for bugs and issues during this stage
using Docker containers.
🞂 In case of a bug or an error, the code is sent back to the integration phase for
modification.
Tools
🞂 JUnit, Selenium, TestNG, and TestSigma are a few DevOps tools for continuous
testing.
🞂 Selenium is the most popular open-source automation testing tool that supports
multiple platforms and browsers.
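As a hedged sketch, if the project's JUnit/Selenium suites run under Maven's Surefire plugin, the pipeline could invoke them as shown below; the test class name LoginPageTest is a placeholder:
$ mvn test                          # run the complete automated test suite
$ mvn test -Dtest=LoginPageTest     # run a single test class, e.g. while investigating a failure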
Continuous deployment
🞂 The final code is deployed on production servers.
QUIZ-3
Part A
Q1 of 4
a) People
b) Process
c) Technology
1) a, b and c
2) only a
3) only b
4) b and c only
Q2 of 4
Choose the business drivers for adoption of DevOps (multiple response question)
Q3 of 4
Match the scenarios with the feature/capability that can be applied.
Capability:
a. Feature toggle
b. Micro services
c. Big room planning
d. Service virtualization
e. Infrastructure as code
Scenarios
A bank is introducing the online fixed deposit scheme. If this feature has to be deployed
in production, the accounts service module which provides the customer account details
online would need to be used and also updated. The updating would disrupt the account
service module. This cannot be afforded by the bank. However, the new feature needs to be
tested
An online audio and video streaming company receives a million calls every day from different types of devices for different services. They need an architectural style which consists of lightweight components
A support team receives a ticket from the customer that a specific server is not reachable. The support staff try out quick fixes, but they do not work and the server crashes. It now needs to be reconfigured. The support staff face this situation very often and are wasting a lot of time doing the reconfiguration manually every time
Team A has completed working on a feature. However, they are waiting for some
related features from B and C so that deployment can be done together. Customer is keen on
having feature A urgently
A software services company brings all its stakeholders, right from developers to support teams, together for effective execution of projects
Q4 of 4
Match the stakeholder and what capabilities they need to build while embarking on the
DevOps implementation journey.
a. Business
b. Dev Team
c. Testing Team
d. Infra team
e. Ops team
f. Organization
1. Continuous integration
Part-B
Q1 of 4
The customer insists that the Dev team use Jenkins and construct an automated continuous integration pipeline. The team accepts this request and constructs a CI pipeline orchestrated by Jenkins. They schedule daily integration. After a month of implementation, the customer finds that the bugs that are released to production are increasing. When they inspect the pipeline stages they find the following stages -
A) The automated pipeline should have in-built quality with static code analysis included
with a good number of quality rules and gating conditions for quality
B) Unit tests should be automated and included so that they can be repeatedly invoked
C) Team should have constructed the pipeline with a proprietary orchestration tool
Q2 of 4
A development team which is implementing CI using an orchestration tool is doing the following activities. Choose the ones which may not be good practices.
A) If the QA tests fail, the developers make the changes in the server where the QA tests run,
compile and run the tests again
B) The team auto-trigger the CI pipeline whenever a team member completes the work and
push code to the central version control repository
C) If the CI pipeline is broken, the teams continue with the features they planned during that
day instead of fixing the pipeline as it might take a long time to do it
D) The development team run the code quality tests and unit tests locally before pushing
them to the central CI pipeline
Q3 of 4
Choose the statement(s) that are TRUE with respect to choosing tool stack for
automating the CICD pipeline
Q4 of 4
Choose the statement(s) that are TRUE with respect to choosing tool stack for
automating the CICD pipeline.
EXERCISE-4
Configure the web application and version control using Git, applying Git commands and version control operations.
🞂 Storing Versions
🞂 Version control system is used to maintain the changes made to an artifact over time.
Working Directory
🞂 Where developers are required to create/modify files.
🞂 Here the version control system is not applicable; we won't use terms like version-1, version-2, etc.
Repository:
🞂 Where we have to store files and metadata.
Commit
Checkout
🞂 With every version/commit we can maintain metadata such as the commit message, who made the changes, when the changes were made, and what was changed.
Benefits of CVCS
🞂 Easy to learn and manage
🞂 More control over users and their access.
Examples:
🞂 CVS
🞂 SVN
🞂 TFS etc.
Drawbacks of CVCS
🞂 It is not locally available, which means we must connect to the network to perform operations.
🞂 During the operations, if the central server crashes, there is a high chance of losing the data.
Distributed Version Control System (DVCS)
🞂 In DVCS, every user has a complete copy of the repository data on their local system.
🞂 The user needs to update (pull) for changes made by others to be reflected in the local repository.
Benefits of DVCS
🞂 Except for pushing and pulling the code, the user can work offline in DVCS
🞂 DVCS is fast compared to CVCS because you don't have to contact the central server
for every command
GIT
Introduction
🞂 Git is a DevOps tool used for source code management.
🞂 It is a free and open-source version control system used to handle small to very large
projects efficiently.
Before Git
🞂 Developers used to submit their codes to the central server without having copies of
their own.
🞂 There was no communication between any of the developers.
After Git
🞂 Every developer has an entire copy of the code on their local systems.
🞂 Any changes made to the source code can be tracked by others.
🞂 There is regular communication between the developers.
🞂 config
🞂 init
🞂 clone
🞂 add
🞂 commit
🞂 status
🞂 push
🞂 pull
🞂 branch
🞂 merge
🞂 log
🞂 remote
🞂 The Git config command is the first and necessary command used on the Git
command line.
🞂 This command sets the author’s name and email address to be used with your
commits.
Syntax
$ git config --global user.name "Abcde"
$ git config --global user.email "abcde@example.com"
Syntax
🞂 To add one file
$ git add Filename
🞂 To add more than one file
$ git add * (or) $ git add .
Syntax
$ git commit -m "Commit Message"
Git commit -a
This command commits any files added in the repository with git add and also
commits any files you've changed since then.
Syntax
$ git commit -a
Git status command
🞂 It is used to display the state of the working directory and the staging area.
Syntax
$ git status
Git push Command
Syntax
$ git push [variable name] master
Syntax
$ git pull URL
Syntax
$ git branch
This command is used to merge the specified branch's history into the current branch.
Syntax
$ git merge [branch name]
Syntax
$ git log
Syntax
$ git clone URL
Syntax
$ git remote add origin URL
Ex:
Syntax:
git checkout [branch name]
This command creates a new branch and also switches to it.
Syntax:
git checkout -b [branch name]
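The commands above can be chained into a typical local workflow. The sketch below is only an illustration; the branch name feature-login and the repository URL are placeholders:
$ git init                                    # create a new local repository
$ git add .                                   # stage all files in the working directory
$ git commit -m "Initial commit"              # record the first version
$ git checkout -b feature-login               # create a new branch and switch to it
$ git commit -am "Add login page"             # commit further changes made on the branch
$ git checkout master                         # switch back to the main branch
$ git merge feature-login                     # merge the branch history into master
$ git remote add origin https://github.com/user/repo.git
$ git push origin master                      # publish the commits to the remote repository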
EXERCISE-5
Module Name: Implementation of CICD with Java and open-source stack
Configure a static code analyzer which will perform static analysis of the web application
code and identify the coding practices that are not appropriate. Configure the profiles and
dashboard of the static code analysis tool.
Technical debt
● Is a metaphor developed by Ward Cunningham (similar to financial debt)
● Would need extra effort to fix the “dirty” parts in future (similar to interest payments)
● Team can choose to continue putting in extra effort due to the dirty pieces or refactor
SonarQube features
o comments
o coding rules
o code complexity
o duplication in code
Working of SonarQube
1. SonarQube has a list of built-in rules for different languages
o Build script
3. When these profiles are applied to a project, analysis is performed and a dashboard is
created
o Bugs in code
o Code smells
5. Quality gates can be applied to ensure that code that does not pass the quality conditions does not move forward to the next stage.
● The original number of lines of code is multiplied by the original effort. Sonar considers the original effort to be 30 minutes for each line of code to flow through the entire SDLC.
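As a minimal sketch of how an analysis is typically triggered (assuming the standalone SonarQube scanner is installed; the project key, source path and server URL below are placeholders):
$ sonar-scanner \
    -Dsonar.projectKey=sample-project \
    -Dsonar.sources=src \
    -Dsonar.host.url=http://localhost:9000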
Practical tips
● Create profiles with an increasing number of rules so that teams are not overwhelmed with too many issues at once.
The development team at "Pura Vida" will have their challenges mitigated with SonarQube for
the following reasons:
● Code quality will be ensured from design and clean coding perspectives which will go a
long way in ensuring that code is maintainable and able to adapt to changes quickly
EXERCISE-6
Maven tool:
Maven is a popular open-source build tool developed by the Apache Group to build,
publish, and deploy several projects at once for better project management.
Maven Commands
1. maven clean
This command cleans the maven project by deleting the target directory. The command output with relevant messages is shown below.
$ mvn clean
...
...
2. maven compiler:compile
This command compiles the java source classes of the maven project.
$ mvn compiler:compile
...
...
3. maven compiler:testCompile
This command compiles the test classes of the maven project.
$ mvn compiler:testCompile
...
...
4. maven package
This command builds the maven project and packages it into a JAR, WAR, etc.
$ mvn package
...
...
[INFO]
TESTS
Running com.journaldev.maven.classes.AppTest
Results :
[INFO]
[INFO] -
5. maven install
This command builds the maven project and installs the project files (JAR, WAR, pom.xml,
etc) to the local repository.
$ mvn install
...
...
...
...
...
...
...
...
6. maven deploy
This command is used to deploy the artifact to the remote repository. The remote repository
should be configured properly in the project pom.xml file distributionManagement tag
7. maven validate
This command validates the maven project, checking that everything is correct and all the necessary information is available.
8. maven dependency:tree
This command displays the dependency tree of the maven project.
$ mvn dependency:tree
...
[INFO] com.journaldev.mockito:Mockito-Examples:jar:1.0-SNAPSHOT
[INFO] +- org.junit.platform:junit-platform-runner:jar:1.2.0:test
[INFO] | +- org.apiguardian:apiguardian-api:jar:1.0.0:test
[INFO] | +- org.junit.platform:junit-platform-launcher:jar:1.2.0:test
[INFO] | \- org.junit.platform:junit-platform-suite-api:jar:1.2.0:test
[INFO] | \- org.junit.platform:junit-platform-commons:jar:1.2.0:test
[INFO] +- org.junit.jupiter:junit-jupiter-engine:jar:5.2.0:test
[INFO] | +- org.junit.platform:junit-platform-engine:jar:1.2.0:test
[INFO] | | \- org.opentest4j:opentest4j:jar:1.1.0:test
[INFO] | \- org.junit.jupiter:junit-jupiter-api:jar:5.2.0:test
[INFO] +- org.mockito:mockito-junit-jupiter:jar:2.19.0:test
[INFO] | \- org.mockito:mockito-core:jar:2.19.0:test
[INFO] | +- net.bytebuddy:byte-buddy:jar:1.8.10:test
[INFO] | +- net.bytebuddy:byte-buddy-agent:jar:1.8.10:test
[INFO] | \- org.objenesis:objenesis:jar:2.6:test
[INFO] \- org.testng:testng:jar:6.14.3:test
[INFO] +- com.beust:jcommander:jar:1.72:test
[INFO]    \- org.apache-extras.beanshell:bsh:jar:2.0b6:test
9. maven dependency:analyze
This command analyses the maven project to identify the unused declared and used undeclared dependencies. It's useful in reducing the build size by identifying unused dependencies and then removing them from the pom.xml file.
$ mvn dependency:analyze
...
[WARNING] org.junit.jupiter:junit-jupiter-api:jar:5.2.0:test
[WARNING] org.mockito:mockito-core:jar:2.19.0:test
[WARNING] org.junit.platform:junit-platform-runner:jar:1.2.0:test
[WARNING] org.junit.jupiter:junit-jupiter-engine:jar:5.2.0:test
[WARNING] org.mockito:mockito-junit-jupiter:jar:2.19.0:test
...
$
10. maven archetype:generate
Maven archetypes is a maven project templating toolkit. We can use this command to generate a skeleton maven project of different types, such as JAR, web application, maven site, etc. Recommended Reading: Creating a Java Project using Maven Archetypes
11. maven site:site
This command generates a site for the project. You will notice a "site" directory in the target after executing this command. There will be multiple HTML files inside the site directory that provide information related to the project.
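The individual goals above are usually chained through the build lifecycle. A typical sequence (a sketch; these are standard Maven invocations) looks like:
$ mvn clean install          # clean, compile, test, package and install the artifact to the local repository
$ mvn clean deploy           # additionally publish the artifact to the configured remote repository
$ mvn dependency:analyze     # report unused declared and used undeclared dependencies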
EXERCISE-7
What is Jenkins?
Jenkins is an open-source automation tool written in the Java programming language that allows continuous integration.
Jenkins builds and tests our software projects continuously, making it easier for developers to integrate changes to the project and making it easier for users to obtain a fresh build.
It also allows us to continuously deliver our software by integrating with a large number of
testing and deployment technologies.
Jenkins workflow
1. A build script containing the various targets for executing the build cycle activities is
available (pl. refer earlier section on build automation)
2. These targets are used by Jenkins for orchestration
3. Jenkins is configured -
4. Paths to the executables of tools are provided
o Users are created with permissions
o Environment variables are set (ex. JAVA_HOME, MVN_HOME)
o Plugins for the required tools are uploaded
o Email configurations are done
5. The frequency interval for integration (i.e. start of orchestration) is configured
6. The repository from which the code and test cases are to be pulled is configured
7. The jobs (upstream and downstream) (invoker and invoked respectively) are configured
as per the build lifecycle
8. Gating conditions are configured
9. Mailer configuration (to list, mail body and when) is done so that notifications can be
made (ex. when build is broken)
History of Jenkins
Kohsuke Kawaguchi, a Java developer working at Sun Microsystems, was tired of building the code and fixing errors repetitively. In 2004, he created an automation server called Hudson that automates build and test tasks.
Let’s consider a scenario where the complete source code of the application was built
and then deployed on test server for testing. It sounds like a perfect way to develop
software, but this process has many problems.
o Developer teams have to wait till the complete software is developed for the test
results.
o There is a high prospect that the test results might show multiple bugs. It was tough
for developers to locate those bugs because they have to check the entire source code
of the application.
o It slows the software delivery process.
o Continuous feedback pertaining to things like architectural or coding issues, build
failures, test status and file release uploads was missing due to which the quality of
software can go down.
o The whole process was manual which increases the threat of frequent failure.
Advantages of Jenkins
o It is an open-source tool.
o It is free of cost.
o It does not require additional installations or components, which means it is easy to install.
o Easily configurable.
o It supports 1000 or more plugins to ease your work. If a plugin does not exist, you can write the script for it and share it with the community.
Disadvantages of Jenkins
o Its interface is outdated and not user-friendly compared to current user interface trends.
o It is not easy to maintain because it runs on a server and requires some server administrator skills to monitor its activity.
o CI regularly breaks due to some small setting changes. CI will be paused and therefore requires some attention from the development team.
Hardware Requirements
Disk Space: We need at least 1 GB of space on our hard drive for Jenkins.
Software Requirements
JDK: We need either the Java Development Kit (JDK) or the Java Runtime Environment (JRE).
Java Container: The WAR (Web Application Resource) file can be run in any container that supports Servlet 2.4/JSP 2.0 or later (for example, Tomcat 5).
To download Java, go to the official download page and select the file according to your platform.
When you click the given link, you will get the home page of the Jenkins official website as
given below:
Starting Jenkins
Open the command prompt and go to the directory where the jenkins.war file is located, and then run the following command:
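Assuming the downloaded file is named jenkins.war, the standard launch command is:
$ java -jar jenkins.war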
When you run this command, various tasks will run, one of which is the extraction of the war file, which is done by an embedded web server called Winstone.
Accessing Jenkins
Now you can access Jenkins. Open your browser and enter the following URL:
1. http://localhost:8080
EXERCISE-8
A textbox will appear with a hook URL. This is the Hook URL at which Jenkins will listen
for POST requests. Copy this URL and go to the next step.
We now have to provide the Hook URL we got from Jenkins in the previous step.
Click ‘Settings’ on the navigation bar on the right-hand side of the screen.
Click ‘Webhooks & services’ on the navigation bar on the left-hand side of the screen.
Paste the URL you copied in the previous step as the ‘Payload URL’.
You can select the events for which you want the Jenkins build to be triggered. We will select
‘Just the push event’ because we want to run the build when we push our code to the
repository.
Alternatively, you can click on ‘Let me select individual events’ to get a list of all the events
that you can select to trigger your Jenkins build.
You should now see the webhook you just added in the list of Webhooks for that repository
like this.
We now have Jenkins configured to run builds automatically when code is pushed to central
repositories. However, Jenkins doesn't run all builds for all projects. To specify which project builds need to run, we have to modify the project configuration.
In Jenkins, go to the project configuration of the project for which you want to run an
automated build.
In the ‘Build Triggers’ section, select ‘Build when a change is pushed to GitHub’.
Jenkins will now run the build when you push your code to the GitHub repository.
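To sanity-check that Jenkins is reachable at the hook URL, you can send a test POST from the command line. The endpoint below assumes the default GitHub plugin path on a local Jenkins; GitHub itself sends a richer JSON payload, so Jenkins may reject this empty one, but any response confirms the endpoint is reachable:
$ curl -X POST http://localhost:8080/github-webhook/ \
       -H "Content-Type: application/json" \
       -d '{}'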
<template>
  <div>
    <b-row class="header-row">
      <b-col cols="6"><h1>logo</h1></b-col>
      <b-col>
        <b-btn class="login-btn" variant="primary">login</b-btn>
      </b-col>
    </b-row>
    <br/>
    <b-container>
      <b-carousel
        id="carousel-1"
        v-model="slide"
        :interval="4000"
        controls
        indicators
        background="#ababab"
        img-width="1024"
        img-height="480">
        <b-carousel-slide
          caption="first slide"
          img-src="https://picsum.photos/1024/480/?image=52">
        </b-carousel-slide>
        <b-carousel-slide img-src="https://picsum.photos/1024/480/?image=54">
          <h1>hello world!</h1>
        </b-carousel-slide>
        <b-carousel-slide img-src="https://picsum.photos/1024/480/?image=58">
        </b-carousel-slide>
        <b-carousel-slide>
          <img
            slot="img"
            class="d-block img-fluid w-100"
            width="1024"
            height="480"
            src="https://picsum.photos/1024/480/?image=55"
            alt="image slot">
        </b-carousel-slide>
      </b-carousel>
      <br/>
      <b-row>
        <b-col>
          <b-card
            title="card title"
            img-src="https://picsum.photos/680/300/?image=35"
            img-alt="image"
            img-top
            tag="article"
            style="max-width: 20rem;"
            class="mb-2">
            <b-card-text>
              some quick example text to build on the card title and make up the bulk of the card's content
            </b-card-text>
          </b-card>
        </b-col>
        <b-col>
          <b-card
            title="card title"
            img-src="https://picsum.photos/680/300/?image=25"
            img-alt="image"
            img-top
            tag="article"
            style="max-width: 20rem;"
            class="mb-2">
            <b-card-text>
              some quick example text to build on the card title and make up the bulk of the card's content
            </b-card-text>
          </b-card>
        </b-col>
        <b-col>
          <b-card
            title="card title"
            img-src="https://picsum.photos/600/300/?image=25"
            img-alt="image"
            img-top
            tag="article"
            style="max-width: 20rem;"
            class="mb-2">
            <b-card-text>
              some quick example text to build on the card title and make up the bulk of the card's content
            </b-card-text>
          </b-card>
        </b-col>
      </b-row>
    </b-container>
  </div>
</template>
EXERCISE-9
Create a pipeline view of the Jenkins pipeline used in Exercise 8. Configure it with user
defined messages
Step 1: In your Terminal or CLI, Start and enable Jenkins and docker
Step 2: In your Jenkins console click on New Item from where you will create your first job.
Step 3: After you click on New Item, you need to choose the option Freestyle project, provide the item name, and click on OK.
Step 4: In the configuration section select SCM and you will add the git repo link and save it.
Step 5: Then you will select Build option and choose to Execute shell
Step 6: Provide the shell commands. Here it will build the archive to produce a WAR file. After that, it will take the code that was already pulled and use the build tool to install the package. So, it simply installs the dependencies and compiles the application.
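As a hedged sketch (assuming the sample application is a Maven project), the Execute shell step for this job might contain something like:
cd "$WORKSPACE"           # Jenkins checks the code out here via the SCM configuration
mvn clean package         # resolve dependencies, compile the application and package it as a WAR
ls target/*.war           # confirm the archive was produced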
Step 8: Click on the freestyle project and save it with the proper name.
Step 9: Again, repeat step 4. In the configuration section select SCM and you will add the Git repo link and save it.
Step 11: You will now write the shell commands for the integration phase and build the container.
Step 12: Again, you will create a new job as before in previous steps.
Step 13: Select freestyle project and provide the item name (here I have given Job3) and click
on OK.
Step 14: Again, repeat step 4. In the configuration section select SCM and you will add the Git repo link and save it.
Step 15: Repeat step 10, You will select Build option and choose to Execute shell.
Step 16: Write the shell commands, now it will verify the container files and the deployment
Step 17: Now, you will choose job 1 and click to configure.
Step 18: From the build actions, you will choose post-build and click on build other projects
Step 19: You will need to provide the name of the project to build after the job 1 and then
click save
Step 20: Now, you will choose job 2 and click to configure.
Step 21: From the build actions, you will choose post-build and click on build other projects
Step 22: You will need to provide the name of the project to build after the job 2 and then
click save
Step 24: Now, you will choose and select a build Pipeline view and add the name.
Step 26: let's RUN it and start the CICD process now
Step 27: After you build the job, verify it by opening the link localhost:8180/sample in your browser. This is the port where your app is running.
EXERCISE-10
SonarQube is an excellent tool for measuring code quality, using static analysis to find code smells, bugs, vulnerabilities, and poor test coverage. Rather than manually analyzing the reports, why not automate the process by integrating SonarQube with your Jenkins continuous integration pipeline? This way, you can configure a quality gate based on your own requirements, ensuring bad code always fails the build.
You'll learn exactly how to do that in this article, through a full worked example where we add SonarQube analysis and SonarQube quality gate stages to a Jenkins pipeline.
SonarQube refresher
SonarQube works by running a local process to scan your project, called the SonarQube
scanner. This sends reports to a central server, known as the SonarQube server.
The SonarQube server also has a UI where you can browse these reports. They look like
this:
Quality gates
In SonarQube a quality gate is a set of conditions that must be met in order for a project
to be marked as passed. In the above example the project met all the conditions.
Here’s an example where things didn’t go so well.
Here you can see that a condition failed because the maintainability rating was a D rather than an A.
Running a SonarQube scan from a build on your local workstation is fine, but a robust
solution needs to include SonarQube as part of the continuous integration process. If you
add SonarQube analysis into a Jenkins pipeline, you can ensure that if the quality gate fails
then the pipeline won’t continue to further stages such as publish or release. After all, nobody
wants to release crappy code into production.
To do this, we can use the SonarQube Scanner plugin for Jenkins. It includes two features
that we’re going to make use of today:
1. SonarQube server configuration – the plugin lets you set your SonarQube server
location and credentials. This information is then used in a SonarQube analysis
pipeline stage to send code analysis reports to that SonarQube server.
2. SonarQube Quality Gate webhook – when a code analysis report is submitted to
SonarQube, unfortunately it doesn’t respond synchronously with the result of whether
the report passed the quality gate or not. To do this, a webhook call must be
configured in SonarQube to call back into Jenkins to allow our pipeline to continue (or
fail). The SonarQube Scanner Jenkins plugin makes this webhook available.
2. the SonarQube scanner is run against a code project, and the analysis report is sent to
SonarQube server
3. SonarQube finishes analysis and checking the project meets the configured Quality
Gate
4. SonarQube sends a pass or failure result back to the Jenkins webhook exposed by the
plugin
5. the Jenkins pipeline will continue if the analysis result is a pass or optionally
otherwise fail
one that runs against a codebase with zero issues (I wish all my code was like this)
one that runs against a codebase with bad code issues
You’ll need to make sure you have Docker installed before carrying on.
Fast track: to get up and running quickly check out this GitHub repository. Everything is
setup through configuration-as-code, except the steps under Configure SonarQube below.
What better way to start these two services than with Docker Compose? Create the following
file docker-compose.yml:
version: "3"
services:
sonarqube:
image: sonarqube:lts
ports:
- 9000:9000
networks:
- mynetwork
environment:
- SONAR_FORCEAUTHENTICATION=false
jenkins:
image: jenkins/jenkins:2.319.1-jdk11
ports:
- 8080:8080
networks:
- myn network
networks:
myn network:
Running docker-compose up in the directory containing the file will start Jenkins
on http://localhost:8080 and SonarQube on http://localhost:9000. Awesomeness!
Grab the Jenkins administrator password from the Jenkins logs in the console output of the
Docker Compose command you just ran.
On the next page choose Select plugins to install and install only the pipeline and git plugins.
The SonarQube Scanner plugin we’ll have to install afterwards since this Getting Started page
doesn’t give us the full choice of plugins.
In the final steps you’ll have to create a user and confirm the Jenkins URL
of http://localhost:8080.
Once complete head over to Manage Jenkins > Manage Plugins > Available and search
for sonar. Select the SonarQube Scanner plugin and click Install without restart.
Go to Manage Jenkins > Configure System and scroll down to the SonarQube servers section. This is where we'll add details of our SonarQube server so Jenkins can pass its details to our project's build when we run it.
Click the Add SonarQube button. Now add a Name for the server, such as SonarQube.
The Server URL will be http://sonarqube:9000. Remember to click Save.
Configuring SonarQube
Let’s jump over to SonarQube. Click Log in at the top-right of the page, and log in with the
default credentials of admin/admin. You’ll then have to set a new password.
Now go to Administration > Configuration > Web hooks. This is where we can add web
hooks that get called when project analysis is completed. In our case we need to configure
SonarQube to call Jenkins to let it know the results of the analysis.
Click Create, and in the popup that appears give the web hook a name of Jenkins, set the
URL to http://jenkins:8080/sonarqube-webhook and click Create.
In this case, the URL has the path /sonarqube-webhook, which is exposed by the SonarQube Scanner plugin we installed in Jenkins.
SonarQube comes with its own Sonar way quality gate enabled by default. If you click into it, you'll see that it's all about making sure that new code is of a high quality. In this example we want to check the quality of existing code, so we need to create a new quality gate.
Click Create, then give the quality gate a name. I’ve called mine Tom Way
Click Save then on the next screen click Add Condition. Select On Overall Code. Search
for the metric Maintainability Rating and choose worse than A. This means that if existing
code is not maintainable then the quality gate will fail. Click Add Condition to save the
condition.
Finally click Set as Default at the top of the page to make sure that this quality gate will
apply to any new code analysis.
Back in Jenkins click New Item and give it a name of SonarQube-good-code, select
the Pipeline job type, then click OK.
Scroll down to the Pipeline section of the configuration page and enter the following
declarative pipeline script in the Script textbox:
pipeline {
    agent any
    stages {
        stage('Clone sources') {
            steps {
                // placeholder URL: substitute the GitHub repository mentioned earlier
                git url: 'https://github.com/your-account/your-repo.git'
            }
        }
        stage('SonarQube analysis') {
            steps {
                withSonarQubeEnv('SonarQube') {
                    sh "./gradlew sonarqube"
                }
            }
        }
        stage("Quality gate") {
            steps {
                waitForQualityGate abortPipeline: true
            }
        }
    }
}
1. in the Clone sources stage code is cloned from the GitHub repository mentioned
earlier
2. in the SonarQube analysis stage we use the withSonarQubeEnv('SonarQube') method exposed by the plugin to wrap the Gradle build of the code repository. This provides all the configuration required for the build to know where to find SonarQube. Note that the project build itself must have a way of running SonarQube analysis, which in this case is done by running ./gradlew sonarqube. For
more information about running SonarQube analysis in a Gradle build see this
article
3. in the Quality gate stage we use the waitForQualityGate method exposed by the
plugin to wait until the SonarQube server has called the Jenkins webhook.
The abortPipeline flag means if the SonarQube analysis result is a failure, we abort
the pipeline.
Click Save to save the pipeline.
SonarQube magic: all the withSonarQubeEnv method does is export some environment
variables that the project’s build understands. By adding a pipeline step which runs the
command printenv wrapped in withSonarQubeEnv, you’ll be able to see environment
variables such as SONAR_HOST_URL being set. These get picked up by the Gradle build of the
code project to tell it which SonarQube server to connect to.
Create another pipeline in the same way, but name it SonarQube-bad-code. The pipeline
script is almost exactly the same, except this time we need to check out the bad-code branch
of the same repository.
pipeline {
    agent any
    stages {
        stage('Clone sources') {
            steps {
                // placeholder URL: same repository as before, checking out the bad-code branch
                git branch: 'bad-code', url: 'https://github.com/your-account/your-repo.git'
            }
        }
        stage('SonarQube analysis') {
            steps {
                withSonarQubeEnv('SonarQube') {
                    sh "./gradlew sonarqube"
                }
            }
        }
        stage("Quality gate") {
            steps {
                waitForQualityGate abortPipeline: true
            }
        }
    }
}
In the Clone sources stage, we’re now also specifying the branch attribute to point to
the bad-code branch
If we head over to SonarQube we can see that indeed our project has passed the quality gate.
Now let’s run the SonarQube-bad-code pipeline. Remember this is running against some
really bad code!
You’ll be able to see that the Quality gate stage of the pipeline has failed. Exactly what we
wanted, blocking any future progress of this pipeline.
In the build's Console Output, you'll see the message ERROR: Pipeline aborted due to quality gate failure: ERROR, which shows that the pipeline failed for the right reason.
Over in SonarQube you’ll see that this time it’s reporting a Quality Gate failure.
Final thoughts
You’ve seen that integrating SonarQube quality gates into Jenkins is straightforward using
the SonarQube Scanner Jenkins plugin. To apply this to a production setup, I suggest also to:
For full details about setting up SonarQube analysis in a Gradle code project, see How To
Measure Code Coverage Using SonarQube and Jacoco. If you’re using Maven, check out
this documentation from SonarQube.
EXERCISE-11
In the configured Jenkins pipeline created in Exercises 8 and 9, implement quality gates for unit testing.
Jenkins provides out-of-the-box functionality for JUnit, and provides a host of plugins for unit testing for other technologies, an example being MSTest for .NET unit tests.
If you go to the link https://wiki.jenkins-ci.org/display/JENKINS/xUnit+Plugin it will give
the list of Unit Testing plugins available.
Step 1 − Go to the Jenkins dashboard and Click on the existing HelloWorld project and
choose the Configure option
Step 2 − Browse to the section to Add a Build step and choose the option to Invoke Ant.
Step 4 − In the build file section, enter the location of the build.xml file.
Step 5 − Next click the option to Add post-build action and choose the option "Publish JUnit test result report".
Step 6 − In the Test report XMLs field, enter the location as shown below. Ensure that Reports is a folder which is created in the HelloWorld project workspace. The "*.xml" pattern basically tells Jenkins to pick up the result XML files which are produced by the running of the JUnit test cases.
These XML files will then be converted into reports which can be viewed later.
Step 7 − Once saved, you can click on the Build Now option.
Once the build is completed, a status of the build will show if the build was successful or not.
In the Build output information, you will now notice an additional section called Test Result.
In our case, we entered a negative Test case so that the result would fail just as an example.
One can go to the Console output to see further information. But what's more interesting is that if you click on Test Result, you will now see a drill-down of the test results.
EXERCISE-12
Module name: Course end assessment.
In the configured Jenkins pipeline created in Exercise 8 and 9, implement quality gates for
code coverage.
Code analysis in the agile product development cycle is one of the important and necessary items to avoid possible failures and defects arising out of continuous changes in the source code. There are a few good reasons to include this in our development lifecycle.
It can help to find vulnerabilities in the distant corners of your application which are not even used; static analysis has a higher probability of finding those vulnerabilities.
You can define your project-specific rules, and they will be enforced without any manual intervention.
It can help to find the bug early in the development cycle, which means less cost to fix
them.
More importantly, you can include this in your build process once and use it always, without having to do any manual steps.
Challenge
Now let's talk about the actual challenge. SonarQube does help us to gain visibility into our code base. However, soon you will realize that having visibility into code isn't enough, and in order to take actual advantage of code analysis, we need to make use of the different data insights that we get with SonarQube.
One way was to enforce the standards and regulate them across all teams within the organization. Quality Gates are exactly what we needed here and are the best way to ensure that standards are met and regulated across all the projects in your organization.
Quality Gates can be defined as a set of threshold measures set on your project, such as Code Coverage, Technical Debt Measure, Number of Blocker/Critical issues, Security Rating, Unit Test Pass Rate, and more.
Failing your build jobs when the code doesn't meet the criteria set in Quality Gates should be the way to go. We were using Jenkins as our CI tool and therefore we wanted to set up the Jenkins job to fail if the code doesn't meet the quality gates.
Here is the snapshot of the job that is currently passing the build before the Quality Gates setup.
Let's set up the quality gate metrics in the SonarQube server. We are going to create a quality gate only for the "Code coverage" metric for demo purposes. But there are more metrics available that you should be selecting while creating quality gates.
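For the coverage metric to have data, the analysis job must run the tests with a coverage tool before scanning. A minimal sketch, assuming a Maven project with the JaCoCo and SonarQube scanner plugins configured in its pom.xml (the server URL and token are placeholders):
$ mvn clean verify sonar:sonar \
    -Dsonar.host.url=http://localhost:9000 \
    -Dsonar.login=<your-token>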
Select the project from the available list to which you want to associate this quality gate. We have selected the sample miqp project for which we have set up the Jenkins job.
Now go to the Jenkins job and configure the quality gate validation. Click on the job, go to Post-build Actions, and provide the project details you associated with the Quality Gate created in the earlier steps.
Run the Jenkins job again and verify the build status after the quality check is enabled.
As we can see, the code passed the build; however, it did not pass the quality gate check, so the build fails in the end. We can verify the same with the project status in the SonarQube server.