Software Verification and Validation



Lecture-02
Software Quality
 The degree to which a system, component, or process meets specified requirements.

OR

 The degree to which a system, component, or process meets customer or user needs or expectations.
Quality Assurance

 Product and software quality does not happen by accident, and is not something that can be added on after the fact.
 To achieve quality, we must plan for it from the beginning, and continuously monitor it day to day
 This requires discipline
 Methods and disciplines for achieving quality results are the study of Quality Assurance, or QA
Quality Assurance

 Three General Principles of QA
  Know what you are doing
  Know what you should be doing
  Know how to measure the difference
Software Quality Assurance
QA Principle 1: Know What You Are Doing
 In the context of software quality, this means
continuously understanding what it is you are
building, how you are building it and what it currently
does
 This requires organization, including having a
management structure, reporting policies, regular
meetings and reviews, frequent test runs, and so on
 We normally address this by following a software
process with regular milestones, planning,
scheduling, reporting and tracking procedures
Software Quality Assurance
QA Principle 2: Know What You Should be Doing
 In the context of software quality, this means having
explicit requirements and specifications
 These must be continuously updated and tracked as
part of the software development and evolution cycle
 We normally address this by requirements and use-case analysis, explicit acceptance tests with expected results, explicit prototypes, and frequent user feedback
 Particular procedures and methods for this are
usually part of our software process
Software Quality Assurance
QA Principle 3: Know How to Measure the Difference

 In the context of software quality, this means having explicit measures comparing what we are doing to what we should be doing.
 Achieved using four complementary methods:
• Formal Methods - consists of using mathematical models or methods to verify mathematically specified properties
• Testing - consists of creating explicit inputs or environments to exercise the software, and measuring its success
• Inspection - consists of regular human reviews of requirements, design, architecture, schedules and code
• Metrics - consists of instrumenting code or execution to measure a known set of simple properties related to quality
Formal Methods

 Formal methods include formal verification (proofs of correctness), abstract interpretation (simulated execution in a different semantic domain, e.g., data type rather than value), state modelling (simulated execution using a mathematical model to keep track of state transitions), and other mathematical methods
 Traditionally, the use of formal methods requires mathematically sophisticated programmers, is necessarily a slow and careful process, and is very expensive
 In the past, formal methods have been used directly in software quality assurance in only a small (but important) fraction of systems
 Primarily safety-critical systems such as onboard flight control systems and nuclear reactor control systems
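State modelling, for instance, amounts to exhaustively exploring every reachable state of a mathematical model and checking a safety property in each one. The sketch below is purely illustrative: the two-process mutual-exclusion protocol, its transition rule, and the `check_mutual_exclusion` helper are all assumptions made for this example, not output of any real verification tool.

```python
# Minimal sketch of state modelling: explore all reachable states of a
# tiny (assumed) mutual-exclusion protocol and assert a safety property.

# Each process is 'idle', 'waiting', or 'critical'. Transition rule
# (an assumption for illustration): a process may enter 'critical'
# only when no other process is already in it.
def successors(state):
    for i, s in enumerate(state):
        if s == "idle":
            yield state[:i] + ("waiting",) + state[i + 1:]
        elif s == "waiting" and "critical" not in state:
            yield state[:i] + ("critical",) + state[i + 1:]
        elif s == "critical":
            yield state[:i] + ("idle",) + state[i + 1:]

def check_mutual_exclusion(n_procs=2):
    start = ("idle",) * n_procs
    seen, frontier = {start}, [start]
    while frontier:  # explore every reachable state exactly once
        state = frontier.pop()
        # Safety property: never two processes in the critical section.
        assert state.count("critical") <= 1, f"violation in {state}"
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return len(seen)

print(check_mutual_exclusion())  # → 8 reachable states, all safe
```

Real tools (model checkers) do the same exploration over vastly larger state spaces, which is part of why formal methods are slow and expensive in practice.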
Testing
Focus of the Course
 The vast majority (over 99%) of software quality assurance
uses testing, inspection and metrics instead of formal
methods
 Example: at the Bank of Nova Scotia, over 80% of the total
software development effort is involved in testing!
Testing
 Testing includes a wide range of methods based on the idea
of running the software through a set of example inputs or
situations and validating the results
 Includes methods based on requirements (acceptance
testing), specification and design (functionality and interface
testing), history (regression testing), code structure (path
testing), and many more
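A minimal sketch of how these testing styles look in practice, using an assumed business rule ("orders of 100 or more get 10% off"); the `discount` function and all test names are illustrative, not from the lecture:

```python
# Hypothetical sketch: acceptance, path, and regression tests for an
# assumed discount rule, using Python's standard unittest framework.
import unittest

def discount(total):
    """Assumed rule: orders of 100 or more get 10% off."""
    return total * 0.9 if total >= 100 else total

class DiscountTests(unittest.TestCase):
    def test_acceptance(self):
        # Acceptance: expected result agreed with the user for a typical input.
        self.assertEqual(discount(200), 180.0)

    def test_paths(self):
        # Path testing: exercise both branches, including the boundary.
        self.assertEqual(discount(100), 90.0)
        self.assertEqual(discount(99), 99)

    def test_regression(self):
        # Regression: behaviour pinned down after an earlier (assumed) bug report.
        self.assertEqual(discount(0), 0)

# Run the suite explicitly and report overall success.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("all passed:", result.wasSuccessful())
```

Each test embodies one of the methods above: the expected results come from requirements (acceptance), code structure (path), and history (regression).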
Inspection
 Inspection includes methods based on a human review of the software artifacts
 Includes methods based on requirements reviews, design reviews, scheduling and planning reviews, code walkthroughs, and so on
 Helps discover potential problems before they arise in practice
Metrics
 Software metrics includes methods based on using tools
to count the use of features or structures in the code or
other software artifacts, and compare them to standards
 Includes methods based on code size (number of source
lines), code complexity (number of parameters,
decisions, function points, modules or methods),
structural complexity (number or depth of calls or
transactions), design complexity, and so on.

 Helps expose anomalous or undesirable properties that may reduce reliability and maintainability
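Two of the simplest such metrics can be sketched in a few lines; the `source_metrics` helper below is an illustrative assumption (a crude source-line count and a decision count as a complexity proxy), not a real metrics tool:

```python
# Hypothetical sketch: two simple size/complexity metrics for Python source.
import ast

def source_metrics(source):
    """Return (source lines of code, decision count) for a source string."""
    tree = ast.parse(source)
    # Code size: non-blank, non-comment lines.
    sloc = len([line for line in source.splitlines()
                if line.strip() and not line.strip().startswith("#")])
    # Code complexity: count decision points as a crude proxy.
    decisions = sum(isinstance(node, (ast.If, ast.For, ast.While, ast.Try))
                    for node in ast.walk(tree))
    return sloc, decisions

example = """
def classify(x):
    if x < 0:
        return "negative"
    return "non-negative"
"""
print(source_metrics(example))  # → (4, 1)
```

Production tools measure many more properties (function points, call depth, coupling), but the principle is the same: count, then compare against a standard.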
Achieving Software Quality

Software Process
 Software quality is achieved by
applying these techniques in the
framework of a software process.
 There are many proposed software processes, of which Extreme Programming is one of the more recent.
SQA in SDLC
 Requirements
 Architectural design
 Detailed design
 Implementation
 Testing
Requirements Phase
 Senior QA/Manager ensures that the user/client requirements are captured correctly
 Identify the risks in the requirements and decide how the system will be tested.
 Ensure requirements are properly expressed as functional, performance and interface requirements.
 Review the requirements document and other deliverables for conformance to standards
 Prepare the formal test plan, including the test tools to be used in the project.
Architectural Design Phase
 Ensure that architectural design meets standards as
designated in the Project Plan
 Verify that all captured requirements are allocated to software components
 Verify that all design documents are completed on time according to the project plan and kept in the project repository (ER diagram, process diagram, use cases, etc.).
 Prepare the design test report and submit it to the project manager.
Detailed Design Phase
 Prepare the test objectives from the requirements and design documents created.
 Design a verification matrix or checklist and update it on a regular basis
 Send the test documents to the project manager for approval and keep them in the repository
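A verification matrix simply maps each requirement to the test cases that verify it, making gaps visible. The requirement and test-case IDs below are invented for illustration:

```python
# Hypothetical sketch: a verification matrix as a requirement -> tests map.
matrix = {
    "REQ-01: user can log in":       ["TC-01", "TC-02"],
    "REQ-02: password is validated": ["TC-03"],
    "REQ-03: session times out":     [],  # not yet covered by any test
}

def uncovered(matrix):
    """Requirements with no test case yet -- flag these for review."""
    return [req for req, tests in matrix.items() if not tests]

print(uncovered(matrix))  # → ['REQ-03: session times out']
```

Updating this regularly (as the slide suggests) keeps requirement coverage honest as both requirements and tests evolve.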
Implementation Phase
 Verify the results of coding and design activities against the schedule available in the project plan
 Check the status of all deliverable items and verify that all meet the standard.
 Keep up to date with the tools and technologies used in the project, and provide feedback to the team if a better solution is available.
 Complete writing the checklists/test cases needed to start testing.
 Verify whether the components are ready for testing.
Testing Phase
 Start testing individual modules and start reporting bugs
 Verify that all tests are run according to test plans
 Verify that all bugs recorded in the bug tracking system are resolved.
 Compile the test reports and verify that each report is complete and correct
 Certify that testing is complete according to the plan
 Start creating the documentation and verify that all documents are ready for delivery
Verification & Validation (V&V)
 Verification:
"Are we building the product right?"
The software should conform to its specification.
 Validation:
"Are we building the right product?"
The software should do what the user really
requires.
V&V Goals
 Verification and validation should establish
confidence that the software is fit for its
purpose.
• This does NOT mean completely free of
defects.
• Rather, it must be good enough for its
intended use. The type of use will
determine the degree of confidence that
is needed.
Summary
 SQA activities in different SDLC phases
• Requirements phase
• Architectural design phase
• Detailed design phase
• Implementation phase
• Testing phase
