Lecture 9 & 10



SOFTWARE TESTING

By Sabeen Amjad
Outline
 Formal Definition of Software Testing
 What is a Bug?
 Test Case
 Software Testing Life Cycle
 Waterfall Model
 V Model
 Modified V Model
 Spiral Model
 Agile Model
Software Testing

 Testing is the process of executing a program with the intention of finding errors.
Formal Definition of Software Testing

 Software testing is a formal process carried out by a specialized testing team, in which a software unit, several integrated software units, or an entire software package are examined by running the programs on a computer. All the associated tests are performed according to approved test procedures on approved test cases.
Formal Definition of Software Testing

 Formal:
 Software test plans are part of the project’s development and quality plans, scheduled in advance
 The test plan is often signed between the developer and the customer
 Ad hoc examination by a colleague and/or regular checks by the programming team leader cannot be considered software tests
Formal Definition of Software Testing

 Specialized testing team:


 An independent team or external consultants who specialize in testing
are assigned to perform these tasks to
 Eliminate bias
 Guarantee effective testing
 Tests performed by the developers themselves will yield poor results
 Unit tests continue to be performed by developers in many
organizations
Formal Definition of Software Testing

 Running the programs:


 Any form of quality assurance activity that does not involve running the software, for example code inspection, cannot be considered a test.
 Approved test procedures:
 The testing process performed according to a test plan and testing
procedures
 These are approved SQA procedures adopted by the developing
organizations
 Approved test cases:
 The test cases to be examined are defined in full by the test plan.
 No omissions or additions are expected to occur during testing.
What is a Bug?
 Informally, it is “what happens when software fails”, whether the failure
was
o Inconvenient
o Catastrophic
 Terms for software failure
o Fault, Failure, Defect, Error, Bug, Anomaly, Incident, Problem, Inconsistency, Variance
What is a Bug?
 Formally, we say that a software bug occurs when one or more of the
following five rules is true: when the software
o doesn't do something that the product specification says it should do.
o does something that the product specification says it shouldn't do.
o does something that the product specification doesn't mention.
o doesn't do something that the product specification doesn't mention
but should.
o is difficult to understand, hard to use, slow, or will be viewed by the end user as just plain not right.
Test Case

 Test Case
 A set of
 Input values
 Execution preconditions
 Expected results
 Execution post conditions
 Developed for a particular objective or test condition, such as
 To exercise a particular program path
 To verify compliance with a specific requirement.
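To make the four elements above concrete, here is a minimal sketch (not part of the original slides) of a test case written as an automated pytest-style test; the calculator function add and the 1 + 1 values are assumptions used only for illustration.

# Minimal sketch of a test case as an automated test (pytest style).
# The unit under test, `add`, is a hypothetical stand-in for a calculator function.
def add(a, b):
    return a + b

def test_add_one_plus_one():
    # Execution precondition: the calculator unit is available (defined above).
    a, b = 1, 1            # input values
    expected = 2           # expected result
    actual = add(a, b)     # exercise the unit under test
    assert actual == expected
    # Execution post-condition: nothing to clean up for this pure function.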
Test Case Template
 Below are the standard fields of a sample test case template
 Test case ID:
 Unique ID for each test case.
 Follow some convention to indicate types of test. E.g. ‘TC_UI_1’
indicating ‘user interface test case #1’.
 Product / Ver./ Module:
 Mention product name.
 Mention name of main module or sub module.
 Mention version information of the product.
 Test case Version: (Optional)
 Mention the test case version number.
 Use Case Reference(s):
 Mention the use case reference for which the test case is written.
Test Case Template
 GUI Reference(s) :(Optional)
 Mention the GUI reference for which the test case is written.
 QA Test Engineer / Test Designed By:
 Name of the tester
 Test Designed Date:
 Date when test case is written.
 Test Executed By:
 Name of tester who executed this test.
 To be filled after test execution.
 Test Execution Date:
 Date when test case is executed.
Test Case Template
 Test Title/Name:
 Test case title. E.g. verify login page with valid username and password.
 Test Case Summary/Description:
 Describe test objective.
 Pre-Requisite/Pre-condition:
 Any prerequisite that must be fulfilled before execution of this test case.
 List all pre-conditions in order to successfully execute this test case.
 Dependencies: (Optional)
 Mention any dependencies on other test cases or test requirement.
 Test Steps:
 List all test execution steps in detail.
 Write test steps in the order in which these should be executed.
 Make sure to provide as much detail as you can.
Test Case Template
 Test Data/Input Specification:
 Use of test data as an input for the test case.
 You can provide different data sets with exact values to be used as input (see the data-driven sketch after this slide).
 Examples: If you’re testing Calculator, this may be as simple as 1+1.
 If you’re testing cellular telephone switching software, there could be
hundreds or thousands of input conditions.
 If you’re testing a file-based product, it would be the name of the file
and a description of its contents.
 Expected Result/ Output Specification:
 What should be the system output after test execution?
 Describe the expected result in detail including message/error that
should be displayed on screen.
 Examples: Did 1+1 equal 2?
 Were the thousands of output variables set correctly in the cell phone
software?
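As a hedged illustration of providing several data sets with exact values (the 1 + 1 calculator case from the slide plus two invented ones), the sketch below uses pytest's parametrize feature; the addition under test is an assumption made for the example.

import pytest

# Data-driven sketch: each tuple is one input data set together with its expected result.
@pytest.mark.parametrize("a, b, expected", [
    (1, 1, 2),    # the calculator example from the slide
    (0, 5, 5),    # invented data set
    (-2, 2, 0),   # invented data set
])
def test_addition(a, b, expected):
    assert a + b == expected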
Test Case Template

 Actual result:
 Actual test result should be filled after test execution.
 Describe system behaviour after test execution.
 Status (Pass/Fail):
 If the actual result does not match the expected result, mark this test as failed.
 Otherwise mark it as passed.
 Notes/Comments/Questions:
 If there are special conditions that cannot be described in any of the fields above, or there are questions related to the expected or actual results, mention them here.
 Post-condition:
 What should be the state of the system after executing this test case?
Test Case Template
 Environmental needs: (Optional)
 Environmental needs that are necessary to run the test case include:
 Hardware
 Software
 Test tools
 Facilities
 Staff and so on.
 Special procedural requirements: (Optional)
 This section describes anything unusual that must be done to perform
the test.
 Example: Testing WordPad probably doesn’t need anything special, but
testing nuclear power plant software might.
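Pulling the fields above together, the following is a hypothetical filled-in test case, expressed here as a Python dictionary only for compactness; every concrete value, including the login scenario reused from the Test Title example, is invented for illustration.

# Hypothetical example of the template above, recorded as a Python dictionary.
# All concrete values (module, steps, credentials) are invented.
test_case = {
    "test_case_id": "TC_UI_1",        # user interface test case #1
    "module": "Login",
    "title": "Verify login page with valid username and password",
    "pre_condition": "A registered user account exists; the application is reachable.",
    "test_steps": [
        "Open the login page",
        "Enter a valid username and password",
        "Click the Login button",
    ],
    "test_data": {"username": "demo_user", "password": "demo_pass"},
    "expected_result": "The user is redirected to the home page without errors.",
    "actual_result": None,            # filled in after execution
    "status": None,                   # Pass/Fail, filled in after execution
    "post_condition": "The user session is active.",
}

A tester would fill in actual_result and status after executing the steps, as described in the Actual result and Status fields above.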
Level of Detail For Test Case

 If you follow this level of documentation, you could be writing at least a page of descriptive text for each test case you identify.
 Thousands of test cases could take thousands of pages of
documentation.
 The project could be outdated by the time you finish writing.
 Many government projects and industries are required to document
their test cases to this level.
 In other cases, you can take some shortcuts.
 Taking a shortcut doesn’t mean dismissing or neglecting important
information.
Level of Detail For Test Case

 For example, a printer compatibility matrix can use a condensed, tabular test case format
 All the other information that goes with a test case is most likely common to all these cases and could be written once and attached to the table
Testing Life Cycle
[Diagram] Testing life cycle: Project Initiation → System Study → Test Plan → Design Test Cases → Test Environment Setup → Execute Test Cases (manual/automated) → Report Defects → Regression Test → Analysis → Summary Reports
Testing Life Cycle
 Project Initiation
 All the necessary analysis is undertaken to allow the project to be
planned.
 System Study
 To test, we need to know the product functionality (understanding the product)
 Test Plan
 It is a systematic approach to testing a system or software.
 Contains a detailed understanding of what the eventual testing
workflow will be or should be.
 Design Test Cases:
 Test case is the specific procedure of testing a particular requirement
by giving specific input to the system and defining the expected results.
 Execute Test Cases (manual/automated):
 The designed test cases are executed in the prepared test environment.
Testing Life Cycle
 Report defects:
 Report defects in an issue tracker (e.g., JIRA, HP QC)
 Issues will be fixed by the DEV team.
 Regression testing:
 Verify whether the new functionality or a bug correction has affected the previous behaviour (see the sketch after this slide)
 Analysis:
 Analysis of the testing performed is done here.
 Test Summary Report
 An important deliverable which is prepared after testing is completed.
 This document explains the details and activities of the testing performed for the project to the respective stakeholders, such as senior management and the client.
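As a small illustration of the regression step referenced above (not part of the slides), the sketch below keeps a test that reproduces a previously fixed defect in the suite, so that any later change which re-breaks the behaviour is caught immediately; the parse_amount function and the defect number are hypothetical.

# Hypothetical regression test: it reproduces a defect that was reported and fixed
# earlier, and is re-run after every change to detect the defect re-appearing.
def parse_amount(text):
    # Stand-in for the fixed production code: accepts "1,234.50"-style input.
    return float(text.replace(",", ""))

def test_defect_1234_comma_separated_amounts():
    # Defect 1234 (hypothetical): amounts containing commas used to crash the parser.
    assert parse_amount("1,234.50") == 1234.50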
Product development activity represented as the Waterfall Model
[Diagram] Waterfall phases: Overall Business Requirements → Software Requirements → High Level Design → Low Level Design → Coding → Testing
Phases of testing for different development phases

[Diagram] Each development phase has a corresponding testing phase:
Overall Business Requirements → Acceptance Testing
Software Requirements → System Testing
High Level Design → Integration Testing
Low Level Design → Component Testing
Coding → Unit Testing
The V Model

 Overall Business Requirement


 These requirements cover hardware, software, and operational
requirements
 Software Requirements
 The next step is moving from the overall requirements to software requirements.
 High Level Design
 The software system is imagined as a set of subsystems that work together
 Low Level Design
 The high level design gets translated to a more detailed or low level design. In this step, data structures, algorithm choices, table layouts, processing logic, exception conditions, etc. are decided
 Coding
 Program code is written in appropriate languages
The V Model

 Unit Testing
 Coding produces several program units, each of these units have to be
tested independently before combining them to form components. The
testing of program units form the unit testing.
 Component Testing
 The components that are the outputs of low level design have to be
tested independently before being integrated. This type of testing is
component level testing.
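As the example promised above (an illustration, not from the slides), the sketch below unit-tests one hypothetical program unit, is_leap_year, in isolation, before it would be combined with other units into a component.

# Hypothetical unit test: the single program unit `is_leap_year` is tested on its own,
# independently of any component or subsystem it will later be integrated into.
def is_leap_year(year):
    # Unit under test (stand-in implementation).
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def test_is_leap_year():
    assert is_leap_year(2024) is True    # divisible by 4
    assert is_leap_year(1900) is False   # century year not divisible by 400
    assert is_leap_year(2000) is True    # divisible by 400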
The V Model

 Integration Testing
 High level design views the system as being made up of interoperating
and integrated subsystems. The individual subsystems should be
integrated and tested. This type of testing corresponds to integration
testing.
 System Testing
 Before product deployment, the product is tested as an entire unit to make sure that all the software requirements are satisfied by the product. This testing of the entire software system is system testing.
 Acceptance Testing
 For the overall business requirements, eventually whatever software is developed should fit into and work in the overall context and should be accepted by the end user. This testing is acceptance testing.
The V Model
 Planning of testing for different development phases
 Planning phase is not shown as a separate entity since it is common for
all testing phases.
 It is still not possible to execute any of these tests until the product is
actually built.
 In other words, the step called "testing" is now broken down into
different sub-steps.
 It is still the case that all the testing execution related activities are done
only at the end of the life cycle.
The V Model
 Who should design tests
 Execution of the tests cannot be done till the product is built, but the design
of tests can be carried out much earlier.
 The skill sets required for designing each type of test are those of the people who actually perform the function of creating the corresponding artifact.
 For example,
 Acceptance tests should be designed by those who formulate the overall
business requirements (the customers, where possible).
 The integration tests should be designed by those who know how the system is broken into subsystems, i.e., those who perform the high level design.
 Again, the people doing development know the innards of the program
code and thus are best equipped to design the unit tests.
The V Model
 Benefits of early design
 We achieve more parallelism and reduce the end-of-cycle time taken for
testing.
 By designing tests for each activity upfront, we are building in better
upfront validation, thus again reducing last-minute surprises.
 Tests are designed by people with appropriate skill sets.
V-Model

[Diagram] The V-Model: development phases (verification) feed test design, with the corresponding test execution phases (validation) on the opposite arm:
Overall Business Requirements → Acceptance Test Design → Acceptance Testing
Software Requirements → System Test Design → System Testing
High Level Design → Integration Test Design → Integration Testing
Low Level Design → Component Test Design → Component Testing
Coding → Unit Test Design → Unit Testing
V-Model
 Advantages of the V-Model
 Testing activities like planning and test design happen well before coding.
 This saves a lot of time and hence gives a higher chance of success than the waterfall model.
 Proactive defect tracking, that is, defects are found at an early stage
 It avoids the downward flow of defects.
 Disadvantages of the V-Model
 It is very rigid and the least flexible.
 No early prototypes of the software are produced.
 If any changes happen midway, then the test documents along with the requirement documents have to be updated.
Modified V-Model
 In the V-Model there is an assumption:
 Even though the activity of test execution is split into execution of tests of different types, the execution cannot happen until the entire product is built.
 For a given product, the different units and components can be in
different stages of evolution
 For example one unit may be in development and thus subject to unit
testing whereas another unit may be ready for component testing
 The V model does not explicitly address this parallelism commonly
found in the product development
Modified V-Model

 In the modified V Model,


 Each unit or component or module is given explicit exit criteria to pass
on to the subsequent stage
 The units or components or modules that satisfy a given phase of
testing move to the next phase of testing where possible.
 They do not wait for all the units or components or modules to move
from one phase of testing to another.
The Spiral Model
 It represents a risk-driven approach, i.e., the assessment of risks
determines the next project phase.
 The spiral model combines aspects of the waterfall model, the iterative
enhancement model, and prototyping.
 Each spiral cycle starts with the identification of the objectives of the
product part being elaborated (e.g., performance or functionality), the
different alternatives of implementing the product part (e.g., different
designs or reusing existing components), and the constraints for each of
the identified alternatives (e.g., cost or schedule).
 The next step evaluates the identified alternatives and identifies and
resolves risks that come with the different alternatives.
 During the third step, the development approach that best fits the risks is
chosen.
 Finally, the next phases are planned, and the complete cycle is reviewed by
the stakeholders.
The Spiral Model
 Advantages
 The third step accommodates features of other process models as needed.
The spiral model is therefore very flexible.
 The explicit consideration of risks avoids many of the difficulties of other
process models.
 Unattractive alternatives are identified and eliminated early.
 Challenges
 It relies heavily on the organization’s expertise with respect to risk
assessment – therefore, a bad risk assessment may lead to the selection of
bad alternatives or development approaches
Agile Testing
 A software testing practice that follows the principles of agile software development is called Agile Testing.
 Agile is an iterative methodology
 Requirements evolve through collaboration between the customer and self-
organizing teams.
 Agile aligns development with customer needs.
Agile Testing
 Principles of Agile Testing
 Testing is not a phase:
 The agile team tests continuously, and continuous testing is the only way to ensure continuous progress (a small sketch follows this slide).
 Testing moves the project forward:
 When following the conventional method, testing is considered a quality gate, but agile testing provides feedback on an ongoing basis so that the product meets the business demand.
 Everyone Tests:
 In the conventional SDLC only the test team tests, while in agile everyone, including developers and business analysts, tests the application.
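To illustrate the continuous-testing principle above, here is a small sketch that assumes pytest is installed and the project's automated tests live alongside the code; it runs the whole suite on every change, for example from a pre-commit hook or a CI job, instead of deferring testing to a final phase.

import subprocess
import sys

# Run the entire automated test suite; a non-zero exit code means the change
# under test has broken existing behaviour and should not be merged.
def run_test_suite() -> bool:
    result = subprocess.run([sys.executable, "-m", "pytest", "-q"])
    return result.returncode == 0

if __name__ == "__main__":
    sys.exit(0 if run_test_suite() else 1)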
Agile Testing
 Shortening feedback response time:
 In the conventional SDLC, the business team gets to know about the product only during acceptance testing, while in agile they are involved in each and every iteration.
 Continuous feedback shortens the feedback response time, and the cost involved in fixing defects is also lower.
