63af3e6337665S0WE02 LectureSlides
DAY 02
What you will learn Today
SQA
Table of Contents
►Testing Levels Types and Methodologies
►Software Verification and Validation
►Testing Throughout SDLC
►Skills required to be a good Software Tester
►Perform Functional Testing and Exploratory Testing
►Intro of Test Design Techniques
►Lab Exercise
Testing Levels, Types and Approaches
Software Testing (mind map recovered as a list):
► Component
► Integration: Big Bang, Top Down, Bottom Up
► System
• Functional: Smoke, Sanity, Regression, …
• Non-Functional: Performance, Usability, Security, Recovery, Installation & Upgrade
► UAT: Alpha, Beta, Regulatory, Contract, Operation, …
► Top Down: uses a STUB, a dummy piece of code that is called in place of a lower-level module that is not yet ready
► Bottom-Up: uses a DRIVER, a dummy piece of code written to call the actual module
► Both modules ready: just test the integration
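The stub and driver roles can be sketched in Python; every module and function name below is illustrative, not taken from the slides:

```python
# Top-down integration: the high-level module is real, and the
# lower-level module it depends on is replaced by a STUB.
def tax_stub(amount):
    # Dummy piece of code standing in for the unfinished tax module
    return 0.0

def checkout(amount, tax_fn):
    """High-level module under test; tax_fn is its lower-level dependency."""
    return amount + tax_fn(amount)

# Bottom-up integration: the low-level module is real, and a
# DRIVER is written just to call it.
def tax(amount):
    """Real low-level module."""
    return round(amount * 0.05, 2)

def driver():
    # Dummy piece of code that exercises the real low-level module
    assert tax(100) == 5.0

driver()                               # bottom-up: driver calls the real module
print(checkout(100, tax_stub))         # top-down: real module calls the stub
```

Once both modules are ready, `checkout(100, tax)` tests the real integration with no dummy code at all.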
[Word cloud of testing types] Acceptance, Black Box, Conformance, Security, Grey Box, Internationalization, Performance, Load, Usability, System, GUI, Beta, Stress, Functional, White Box, Ad-hoc, Compatibility, Sanity, Exploratory, Integration, Unit, Capacity, Recovery, Smoke, Error-Handling, Regression, Alpha, Install/uninstall, Localization, Negative, Confirmation
Testing Levels
Test levels, from bottom to top: Component (Unit), Integration, System, Acceptance.
Testing at the Component (Unit) test level is called Component (Unit) testing.
Component (Unit) Testing
Typical Defects or Failures
• Incorrect functionality (e.g., not as described in design specifications)
• Data flow problems
• Incorrect code and logic
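A component (unit) test targets exactly these defect classes, e.g., incorrect code and logic. A minimal sketch using Python's built-in unittest; the `apply_discount` function is a made-up unit under test:

```python
import unittest

def apply_discount(price, percent):
    """Unit under test (hypothetical): price after a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_normal_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_zero_discount_changes_nothing(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_percent_rejected(self):
        # Guards against "incorrect code and logic" at the unit level
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

# Run the suite explicitly (unittest.main() would call sys.exit())
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ApplyDiscountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```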
Integration Testing
Objective • Same as Component Testing
Test Basis • Software and system design, Sequence diagrams, Interface and communication protocol specifications, Use cases, Architecture at component or system level, Workflows, External interface definitions
Typical Defects or Failures • Typical defects and failures for Component Integration (CI) and System Integration (SI) testing include:
• Incorrect sequencing or timing of interface calls (CI)
• Incorrect data, missing data, or incorrect data encoding (CI and SI)
• Incorrect assumptions about the meaning, units, or boundaries of the data being passed between components or systems (CI and SI)
• Interface mismatches; unhandled or improperly handled communication failures between components or systems (CI and SI)
• Failure to comply with mandatory security regulations (SI)
• Inconsistent message structures between systems (SI)
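Several of these defect classes (data encoding, units, field names) live on the interface between two components. A minimal integration-test sketch in Python; both components and the payload format are invented for illustration:

```python
import json

# Hypothetical producer: serializes an order for another component.
def produce_order(order_id, amount_cents):
    return json.dumps({"id": order_id, "amount_cents": amount_cents})

# Hypothetical consumer: parses the payload produced above.
def consume_order(payload):
    data = json.loads(payload)
    # Consumer must agree with the producer on units (cents, not dollars)
    return data["id"], data["amount_cents"] / 100

def test_producer_consumer_interface():
    # Integration test: pass real output of one component into the other
    order_id, amount = consume_order(produce_order(42, 1999))
    assert order_id == 42
    assert amount == 19.99  # fails on a units or encoding mismatch

test_producer_consumer_interface()
```

Each component could pass its own unit tests and still fail this check, which is precisely why integration testing is a separate level.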
System Testing
Objective • Same as Component Testing or Integration Testing
Test Basis • System and software requirement specifications (functional and non-functional), Risk analysis reports, Use cases, Epics and user stories, Models of system behavior, State diagrams, System and user manuals
Test Objects • Applications, Hardware/software systems, Operating systems, System under test (SUT), System configuration and configuration data
Typical Defects or Failures • Incorrect calculations; incorrect or unexpected system functional or non-functional behavior; incorrect control and/or data flows within the system; failure to properly and completely carry out end-to-end functional tasks; failure of the system to work properly in the production environment(s); failure of the system to work as described in system and user manuals
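A system test exercises an end-to-end functional task through the whole system rather than one unit in isolation. A toy sketch; the in-memory TodoApp stands in for a real system and is invented for illustration:

```python
# Toy "system": a complete, self-contained to-do application.
class TodoApp:
    def __init__(self):
        self._items = {}
        self._next_id = 1

    def add(self, text):
        item_id = self._next_id
        self._items[item_id] = {"text": text, "done": False}
        self._next_id += 1
        return item_id

    def complete(self, item_id):
        self._items[item_id]["done"] = True

    def open_items(self):
        return [i["text"] for i in self._items.values() if not i["done"]]

def test_end_to_end_workflow():
    # Drive a whole user workflow: add two items, complete one,
    # then verify what the user would see at the end.
    app = TodoApp()
    first = app.add("write tests")
    app.add("run tests")
    app.complete(first)
    assert app.open_items() == ["run tests"]

test_end_to_end_workflow()
```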
Acceptance Testing (UAT)
Objective • Establishing confidence in the quality of the system as a whole
• Validating that the system is complete and will work as expected
• Verifying that functional and non-functional behaviors of the system are as specified
• Performing user acceptance testing, operational acceptance testing, contractual and regulatory acceptance testing, and alpha and beta testing
Test Basis • Business processes, User or business requirements, Regulations, legal contracts and standards, Use cases, System requirements, System or user documentation, Installation procedures, Risk analysis reports, Backup and restore procedures, DR procedures, Non-functional requirements, Operations documentation, Deployment and installation instructions, Performance targets, Database packages, Security standards or regulations
Test Objects • System under test, System configuration and configuration data, Business processes for a fully integrated system, Recovery systems and hot sites (for business continuity and disaster recovery testing), Operational and maintenance processes, Forms, Reports, Existing and converted production data
Typical Defects or Failures • System workflows do not meet business or user requirements; business rules are not implemented correctly; system does not satisfy contractual or regulatory requirements; security vulnerabilities; insufficient performance efficiency under high load; improper operation on a supported platform
Testing Types
Testing of function (Functional testing): Testing based on an analysis of the specification of the functionality of a component or system.
Testing Types
Smoke • Executed every time a new build is ready
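A smoke test is a fast, shallow pass over the most basic behavior, run to decide whether the build is worth deeper testing. A self-contained sketch; FakeApp and its endpoints are invented stand-ins for a freshly deployed build:

```python
from types import SimpleNamespace

# Stand-in for a deployed application build (illustrative only).
class FakeApp:
    def is_running(self):
        return True

    def get(self, path):
        # The build knows only these two hypothetical routes.
        known = {"/login", "/health"}
        return SimpleNamespace(status=200 if path in known else 404)

def smoke_test(app):
    """Broad, shallow checks: if any fail, reject the build immediately."""
    assert app.is_running()
    assert app.get("/login").status == 200
    assert app.get("/health").status == 200

smoke_test(FakeApp())
```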
Functional Testing
Task: Test “Save” feature of Notepad application.
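A functional test derives its checks from the specified behavior of the feature. Notepad itself cannot be driven from a slide, so here is a minimal sketch against a stand-in save function; the function and file names are illustrative:

```python
import os
import tempfile

def save(path, text):
    """Stand-in for the application's Save feature."""
    with open(path, "w", encoding="utf-8") as f:
        f.write(text)

def test_save_persists_exact_content():
    with tempfile.TemporaryDirectory() as d:
        path = os.path.join(d, "note.txt")
        save(path, "hello\nworld")
        with open(path, encoding="utf-8") as f:
            assert f.read() == "hello\nworld"

def test_save_overwrites_existing_file():
    # Saving twice must leave only the latest content
    with tempfile.TemporaryDirectory() as d:
        path = os.path.join(d, "note.txt")
        save(path, "first")
        save(path, "second")
        with open(path, encoding="utf-8") as f:
            assert f.read() == "second"

test_save_persists_exact_content()
test_save_overwrites_existing_file()
```

Note that both checks come from the feature's specified behavior (content is persisted; an existing file is replaced), not from the code's internals, which is what makes this functional, black-box testing.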
Functional Testing
Task: Test new version of a Notepad application.
Test Approaches
• Proactive and Reactive
• Manual and Automated
• Black-box, White-box and Grey-box
• Scripted and Unscripted
• Static and Dynamic
Test Approaches
Proactive (Prevention of Failure): Inspections, Reviews, Training of Staff, Quality Audits
Reactive (Defect Detection after Failure): Rework, Scrap, Warranty Repair, Retest, Reject, Replacement
Test Approaches
► Static Testing
► Testing of a software development artifact, e.g., requirements, design or code, without execution of these artifacts, e.g., reviews or static analysis
► Dynamic Testing
► Testing that involves the execution of the software of a component or system.
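The static vs. dynamic distinction can be shown on a single snippet in Python: a static check inspects the source without running it, while a dynamic check executes it. The `double` function is a made-up example:

```python
import ast

source = "def double(x):\n    return x * 2\n"

# Static testing: parse and inspect the artifact; nothing is executed.
tree = ast.parse(source)
function_names = [n.name for n in ast.walk(tree)
                  if isinstance(n, ast.FunctionDef)]
assert function_names == ["double"]  # structural check, like a review tool

# Dynamic testing: actually execute the code and check its behavior.
namespace = {}
exec(compile(source, "<lecture>", "exec"), namespace)
assert namespace["double"](21) == 42
```

Real static analysis tools (linters, type checkers) work on the same principle as the `ast` inspection above: they reason about the code without ever running it.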
Activity
Open either website and perform functional testing:
https://opensource-demo.orangehrmlive.com/web/index.php/auth/login
Or
https://adactinhotelapp.com/
Verification & Validation
►Verification
►Confirmation by examination and through provision of objective evidence that the specified requirements for each phase of the SDLC have been fulfilled
►Validation
►Confirmation by examination and through provision of objective evidence that the requirements for a specific intended use or application have been fulfilled.
Verification
Definition: The process of evaluating the work products (not the actual final product) of a development phase to determine whether they meet the specified requirements for that phase.
Objective: To ensure that the product is being built according to the requirements and design specifications; in other words, to ensure that work products meet their specified requirements.
Evaluation Items: Plans, Requirement Specs, Design Specs, Code, Test Cases
Validation
Definition: The process of evaluating the software or final product during or at the end of the development process to determine whether it satisfies the specified business requirements.
Objective: To ensure that the product meets the user's needs and that the specifications were correct in the first place.
Exploratory Testing
Session-based Exploratory Testing
Lab Exercise
Reading Exercises
Exercises of different Testing Types
Perform Functional & Exploratory Testing
Q&A
Instructor Notes