Introduction To Software Testing and Quality Assurance

Introduction to Software
Testing and Quality Assurance
Presented by
Danny R. Faught
Tejas Software Consulting
817-294-3998
© 2001 Danny R. Faught. All Rights Reserved.
Slide 2
Learning objectives
The fundamental skills needed by software
testing and software quality assurance staff.
How testing and QA processes contribute to the
software development process.
Specific techniques for finding and reporting
bugs, and managing the quality of a software
product using industry-proven methods.
How to tailor testing and QA techniques to their
organization's unique needs.
Slide 3
Course outline
1. The software life cycle
2. The role of testing and QA
3. You can't test everything
4. Risk management
5. Exploratory testing
6. Test design techniques
7. System testing
(continued)
Slide 4
Course outline
8. Test documentation
9. Bug isolation and reporting
10. Static testing
11. Metrics
12. Process improvement
13. Overview of automated testing
14. Resources
Slide 5
1
The software life cycle
Slide 6
The classic waterfall
System Requirements
Software Requirements
Analysis
Program Design
Coding
Testing
Operations
Slide 7
The much-maligned waterfall
Many people complain that the waterfall model doesn't work well because it assumes that you can't change the result of an earlier phase
Many iterative and agile models have been proposed since then
But even Winston Royce's seminal paper on the waterfall model suggests several improvements over the classic waterfall
Slide 8
Iterative models
[Diagram: the waterfall phases again, redrawn with iteration between phases]
E.g., spiral, incremental, rapid application development, evolutionary development
The main difference is where they iterate
Slide 9
The spiral model
From A Spiral Model of
Software Development
and Enhancement, Barry
Boehm, IEEE Computer,
May, 1988
Slide 10
Agile methods
Extreme Programming
Adaptive Software Development
Scrum
and several others
We're still learning how to integrate independent testers into agile processes
Slide 11
12 XP practices
The Planning Process, sometimes called the Planning Game.
Small Releases.
Metaphor. (common system description)
Simple Design.
Testing. (test before code)
Refactoring.
Pair Programming.
Collective Ownership.
Continuous Integration.
40-hour Week.
On-site Customer.
Coding Standard. (from http://xprogramming.com/)
Slide 12
Manifesto for Agile Software Development
We are uncovering better ways of developing software by doing it and
helping others do it. Through this work we have come to value:
Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan
That is, while there is value in the items on the right,
we value the items on the left more.
Kent Beck, James Grenning, Robert C. Martin, Mike Beedle, Jim Highsmith, Steve Mellor, Arie van Bennekum, Andrew
Hunt, Ken Schwaber, Alistair Cockburn, Ron Jeffries, Jeff Sutherland, Ward Cunningham, Jon Kern, Dave Thomas,
Martin Fowler, Brian Marick
© 2001, the above signatories. This declaration may be freely copied in any form, but only in its entirety through this notice.
Slide 13
Test levels
Traditional dynamic testing generally starts
at the end of the life cycle and works its
way back up:
Unit testing
Integration testing
System testing
Acceptance testing
Slide 14
V model
Requirements <-> Acceptance Test
Architecture <-> System Test
Design <-> Integration Test
Code <-> Unit Test
Slide 15
Unit testing
Unit tests exercise the
components of the system
in isolation
Drivers replace the code
that would normally call
the component
Stubs replace the code
that the component would
be calling
[Diagram: a driver stands in for the higher-level component that would call the component under test; a stub stands in for the lower-level component it calls]
Slide 16
What is a unit?
Synonyms are component and module.
The IEEE glossary says (for module):
(1) A program unit that is discrete and
identifiable with respect to compiling,
combining with other units, and loading;
(2) A logically separable part of a program.
There's no solid consensus on this, but a single source file plus its include files is a good rule of thumb.
Slide 17
Benefits of unit testing
When you find a bug, you don't have to look far to find the cause
Bugs are found shortly after they're coded
When the team uses unit testing, it's usually conducted by the developers, who are best equipped to put together the scaffolding for it
Slide 18
Why most teams don't unit test
Drivers and stubs are often expensive; sometimes they're no easier to create than the caller/callee components themselves
Developers often have no training on (or appreciation of) testing techniques
Developers delegate all testing to the test group (that's what they're there for, right?)
Slide 19
Integration testing
When you start testing more than one component together, you're doing integration testing.
Integration is not the same as integration testing. Integration involves putting the components together.
If your organization doesn't do unit testing, doing integration testing with small groups of integrated components is a good alternative.
Slide 20
Bottom-up integration testing
[Diagram: component hierarchy with A on top; the lower-level components are integrated and tested first, with drivers standing in for the components above them]
Use drivers, no stubs required
Slide 21
Top-down integration testing
[Diagram: the same hierarchy integrated from the top down, with stubs standing in for the lower-level components]
Use stubs, no drivers required
Slide 22
Hybrid integration testing
[Diagram: the same hierarchy with a mix of drivers and stubs]
The reality is usually a hybrid of bottom-up and top-down
Slide 23
Test the interfaces
While unit-level bugs aren't too difficult to ferret out when you do unit testing, bugs in the interfaces between components cause major heartache
Expect to find some nasty bugs during integration testing!
Slide 24
System testing
Make sure the system does what the user
expects it to
Functional testing
Hopefully, just a sanity check or a verification
against requirements, because detailed
functional testing was done at a lower level
Non-functional testing
Emergent properties of the system: reliability, performance, and many other "ilities"
Slide 25
Acceptance testing
Testing conducted by or on behalf of the
end user to determine whether they accept
delivery of the software
Varies widely
Usually not done for shrink-wrap software
May be the only type of testing for IT software
Could be a large effort for government
software, capping off a very arduous test
process
Slide 26
Section 1 references (1 of 3)
Original waterfall paper - Managing the Development of Large Software Systems, Dr. Winston W. Royce, Proceedings, IEEE WESCON, August 1970
Spiral - A Spiral Model of Software Development and Enhancement, Barry W. Boehm, IEEE Computer, May 1988
EVO - Principles of Software Engineering Management, Tom Gilb, Addison-Wesley, 1988, ISBN 0-201-19246-2
Slide 27
Section 1 references (2 of 3)
Agile Alliance - http://agilealliance.org/
Extreme Programming Explained, Kent Beck, Addison Wesley, 1999,
ISBN: 0201616416
Agile Software Development, Alistair Cockburn, Addison Wesley
Professional, 2001, ISBN: 0201699699
Agile Software Development with Scrum, Mike Beedle & Ken
Schwaber, Prentice Hall PTR, 2001, ISBN: 0130676349
Adaptive Software Development, Jim Highsmith, Dorset House
Publishing, 1999, ISBN: 0932633404, and Retiring Lifecycle
Dinosaurs, http://www.stickyminds.com/
Agile testing - http://www.testing.com/agile/
Slide 28
Section 1 references (3 of 3)
V model alternative - New Models for Test Development, Brian Marick, Quality Week 99 (http://testing.com/writings.html)
Unit testing - Software Testing Techniques, second edition, Boris Beizer, 1990, International Thomson Computer Press, ISBN 1850328803; and Black Box Testing, Boris Beizer, John Wiley & Sons, Inc., 1995, ISBN 0-471-12094-4
Integration testing - The Craft of Software Testing, Brian Marick, Prentice Hall PTR, 1995, ISBN 0-13-177411-5, and The Art of Software Testing, Glenford J. Myers, John Wiley & Sons, Inc., 1979, ISBN 0-471-04328-1
Acceptance testing - IEEE Std 1012-1998, IEEE Standard for Software Verification and Validation
Terminology - IEEE Std 610.12-1990, IEEE Standard Glossary of Software Engineering Terminology
Slide 29
2
The role of testing and QA
Slide 30
Test group objectives
Find bugs
Especially:
Bugs that would have severe customer impact
Bugs we didn't already know about
Find them sooner rather than later
And use your resources wisely
Slide 31
Test group tasks
Typical:
Plan the test project
Design tests (but probably not unit tests)
Automate tests, as appropriate
Execute tests
Report and help monitor bugs
Slide 32
Alternative: quality control
A QC group does all of the testing tasks
mentioned, and also serves as a gatekeeper
for the release.
The QC group can hold up the release if they say it doesn't meet the release criteria.
Note: whether there is a QC group or not,
management really makes the release
decision.
Slide 33
Alternative: No unit tests
If:
Developers don't do any unit testing,
Developers won't report their unit test activities, or
Unit testing isn't consistent from one developer to another
Then:
The independent test group must do more
thorough functional testing to compensate
Slide 34
Quality assurance objectives
Take proactive measures to design quality
into the product and prevent defects
May also be responsible for testing, or
testing could be a separate group
May also include services such as
supporting the configuration management
infrastructure, and supporting other tools.
Slide 35
Quality assurance tasks
A few examples:
Documenting and improving the organization's processes
Developing and enforcing standards
Implementing a metrics program
Establishing a review/inspection process
Note: these tasks require advanced skills.
Slide 36
QA as a consulting role
QA staff often take on a consulting role in
the organization, with no direct control over
the people you serve.
This means that instead of enforcing, you
must influence.
Again, it all comes down to management
support.
Slide 37
Who is responsible for quality?
Test group?
QA group?
Everyone?
Slide 38
My answer
Management is ultimately responsible for quality. Decisions involving tradeoffs between scope, schedule, and resources are what set the quality bar.
And, management needs everyone to provide the information they need to make the right decisions.
Test groups and QA groups should never accept final responsibility for quality.
Slide 39
Terminology
Out in the wild, all of the types of groups we've discussed are likely to be called "QA."
Do what you can to help people understand that testing is not assurance; testing comes too late in the process for that.
But when you can't change the culture, use the terms that the local staff will understand.
Slide 40
Group charter exercise
Write down:
The name of your group/team
A list of the responsibilities of the group
What the group would be called using the terminology we've just discussed
Slide 41
What is a bug?
A deviation from the documented specifications.
Or, since the specifications are usually incomplete or missing: a deviation from how the end user will likely expect the program to behave.
This must be established early in a project, and not changed as the pressure of the release builds up.
Slide 42
Bug terminology
Some prefer the word "defect", because they think "bug" is too much of a euphemism that shifts responsibility away from the perpetrators.
Others prefer "issue", because "bug" and "defect" are too harsh.
There are dozens of other variations; use what makes sense in your organization.
Slide 43
Section 2 references
Testing Computer Software, Cem Kaner, Jack Falk, Hung Quoc
Nguyen, Wiley Computer Publishing, 1999, ISBN 0-471-35846-0
Managing the Testing Process, Rex Black, Microsoft Press, 1999,
ISBN 0-7356-0584-X
Slide 44
3
You can't test everything
Slide 45
The grim reality
Even if you could test everything, you won't have the resources to do it.
Possible inputs
Feature interactions
Configurations
Code paths
Slide 46
Inputs
Slide 47
Inputs for a word processor
Think of all the different things you read that
were created with a word processor
All of them together barely scratch the
surface of the total possible different inputs
At each step, you could type a printable
character, edit, move the cursor, select a
special feature, etc., etc.
Multiply by the number of steps to create a complete document: the possibilities are infinite!
Slide 48
Feature interactions
Windows NT clock: simple, right?
Slide 49
Maybe not so simple?
Slide 50
Enumerate the possibilities
Count   Option
2       Always on top: yes/no
2       Time: am/pm
3       Size: tiny, small, maximized
2       Display date: yes/no
2       Display seconds: yes/no
2       Title / no title
2       GMT / local
2       Font: truetype / non-truetype
2       Analog / digital
Total: 768 possible combinations of these options
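A quick way to sanity-check the 768 figure (a sketch added here, not part of the original deck) is to enumerate the option values with Python's itertools; the lists simply mirror the table above:

    from itertools import product

    options = [
        ["on top", "not on top"], ["am", "pm"],
        ["tiny", "small", "maximized"], ["date", "no date"],
        ["seconds", "no seconds"], ["title", "no title"],
        ["GMT", "local"], ["truetype", "non-truetype"],
        ["analog", "digital"],
    ]
    combinations = list(product(*options))
    print(len(combinations))  # 768 - and running them all still wouldn't catch every bug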
Slide 51
Should we run 768 tests?
Probably not!
Even if we did, we still would probably
miss bugs.
Slide 52
Font size jumps
Choose digital mode and select the Fixedsys font (non-
Truetype). Slowly increase the size of the window. The
font size for the date occasionally jumps smaller rather
than larger.
This one involves a transition between two of the
combinations.
Slide 53
Disappearing sample font
Make the window very small, open the set font
dialog. The Sample pane is blank.
This one has a good
chance of being found by
a test using a small
window and a non-default
font.
Slide 54
Disappearing clock
There is a minimum horizontal size, but not a minimum
vertical size. Making the window very short causes the
clock to become an unreadable dot or disappear
completely.
Might be caught by any of the tests using a
small window size, if the tester experiments a
bit.
Slide 55
Extra space
Set GMT mode after noon, in digital mode. Some
non-truetype fonts put extra space after the hour
when it's double-digit (e.g. Whimsy ICG, Trajan,
Tekton).
We might catch this one, if we happen to test in the
afternoon and if we choose one of the affected fonts.
Slide 56
Font inconsistency bug
In digital mode, make the window small (a few inches wide). The font for the time looks normal: filled-in black. Maximize the window, and the font changes to an outline font.
This bug involves a transition between two of the combinations we've defined. Whether we'd find it depends on how we implement the tests.
Slide 57
Combinatorial exercise
Let's describe a very simple application or
feature.
How many tests would we need to test all
feature combinations?
Slide 58
Configurations
Is your software supported on more than
one configuration? Probably!
Software: different operating systems, system configuration options, other programs installed
Hardware: manufacturer, form factor, CPU, RAM, disks, CD-ROM, network, ...
Slide 59
Web test configuration example
Operating systems: Windows 98 SE, Windows 2000, Red Hat Linux 7.2, Mac OS 9
Browsers: IE 5.5, Netscape 4.78, Opera 5
Display: 1280x1024, 1024x768, 640x480
Colors: 8 bit, 24 bit
Internet: 28.8 Kbps, ADSL, T1
This is 216 different configurations, and you
might have to run all of your functional tests
on all of them
Slide 60
Configuration exercise
How many different categories of
configuration options can we think of?
What are some of the different options in
those categories?
Slide 61
Code paths
[Flow graph from Start to Finish with nodes A-I and two nested loops: max 12 and max 7 iterations]
There are 15 different paths in one iteration of the outer loop
How many total paths?
Slide 62
How many total paths?
For 7 possible iterations of the outer loop:
15 + 15^2 + 15^3 + 15^4 + 15^5 + 15^6 + 15^7 = 183,063,615 possible paths
If we increase the outer loop to 10 possible iterations, the total is more than 600 billion!
If the maximum for either loop is indefinite, the number of paths is infinite
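The arithmetic is easy to reproduce; a small sketch (mine, not from the slides) that sums 15^i over the possible outer-loop iterations:

    def total_paths(paths_per_iteration=15, max_iterations=7):
        # Sum the paths for 1..max_iterations passes through the outer loop.
        return sum(paths_per_iteration ** i for i in range(1, max_iterations + 1))

    print(total_paths(15, 7))   # 183,063,615
    print(total_paths(15, 10))  # 617,839,704,240 - more than 600 billion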
Slide 63
Summary
Complete testing is not feasible.
Even if you exhaustively test using one of the approaches we discussed, you can still miss bugs
You should aim for "good enough" testing, based on risks
Slide 64
Section 3 references
A Framework for Good Enough Testing, James Bach, Computer,
October 1998
The Context-Driven School - http://www.context-driven-testing.com/
Testing Computer Software, Cem Kaner, Jack Falk, Hung Quoc
Nguyen, Wiley Computer Publishing, 1999, ISBN 0-471-35846-0
Software Testing Techniques, second edition, Boris Beizer, 1990,
International Thomson Computer Press, ISBN 1850328803, pp 24-26
Slide 65
4
Risk management
Slide 66
What is a risk? (SEI's view)
Risk is the possibility of suffering loss.
"Risk in itself is not bad; risk is essential
to progress, and failure is often a key
part of learning. But we must learn to
balance the possible negative
consequences of risk against the
potential benefits of its associated
opportunity." [Software Development Risk: Opportunity,
Not Problem, Roger L. Van Scoy, Software Engineering
Institute, CMU/SEI-92-TR-30, ADA 258743, September 1992]
Slide 67
Anatomy of a risk
Cause - fact
Risk - might happen
Impact - damage
Trigger - oops!
Mitigation - do this now!
Contingency - if the risk happens
Slide 68
Example process risk
Cause: The project is not using a bug tracking tool.
Risk: Known bugs may fall through the cracks and
be released to the field.
Impact: Lower customer satisfaction
Trigger: A build is produced that contains a known
bug that should have been fixed.
Mitigation: Set up an email list for bug reports that
archives all submissions.
Contingency: Evaluate and propose purchasing a bug
tracking tool (see if other departments already
have one)
Slide 69
Example technical risk
Cause: The error-handling code isn't being tested because we can't simulate the errors.
Risk: Error-handling code may fail in the field when it's needed most.
Impact: Customers may lose data if they hit an error condition.
Trigger: If system tests uncover error-handling problems.
Mitigation: Initiate code inspections for all error-handling code.
Contingency: Improve testability so error conditions can be artificially triggered.
Slide 70
Risk quantification
Risk = likelihood X impact
This is an expected value calculation
Example
likelihood = 20%, impact is $1 million
If you ran the project 100 times, the risk would
manifest 20 times, with $20 million damage
Given that we just run the project once, we
expect $200,000 damage from this risk
Slide 71
Difficulties in quantification
Sometimes the likelihood, impact, or both
are difficult to estimate.
Some people prefer a simpler scale
Likelihood: low, medium, high
Impact: low, medium, high
And a scoring mechanism to generate the
overall risk factor from these two (e.g., low=1,
medium=2, high=3)
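A tiny sketch (my own illustration, using the example risks from the earlier slides) of the simpler scoring scale:

    SCALE = {"low": 1, "medium": 2, "high": 3}

    def risk_score(likelihood, impact):
        # Overall risk factor on a 1-9 scale; higher scores get attention first.
        return SCALE[likelihood] * SCALE[impact]

    risks = [
        ("No bug tracking tool", "medium", "high"),
        ("Error-handling code untested", "low", "high"),
    ]
    for name, likelihood, impact in sorted(risks, key=lambda r: -risk_score(r[1], r[2])):
        print(risk_score(likelihood, impact), name)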
Slide 72
Risk management process
Identify
Analyze
Plan
Mitigate
Track
Slide 73
Process: identify
Identify as many of your risks as possible
Team brainstorming
Individual interviews
Risk catalogs
Collate all the risks into one list
Slide 74
Process: analyze
Prioritize the risks
For the top risks, take the raw brainstorming
ideas and analyze the component parts
(remember the anatomy of a risk)
Quantify the top risks
Slide 75
Process: plan
For the top risks, plan
Mitigation activities to do now (or soon)
Contingency activities to start when the risk is
actually triggered
Make sure resources are allocated and
scheduled for mitigation activities!
Slide 76
Process: mitigate
Take specific actions in the near term to
reduce either
the likelihood of the risk
the impact of the risk
or both
Slide 77
Process: track
Track risk triggers and start contingency
activities if necessary
Track the risk management plan itself
Add new risks as you discover them
Modify the priorities as the top risks are
mitigated
Iterate the whole process to actively seek out
new risks
Slide 78
Risk brainstorming
Brainstorm a list of risks related to finishing the training
Prioritize
Analyze the top few
Make mitigation and contingency plans
Slide 79
Scopes of risk management
Organization
Project
Test/QA project
QA involvement
Risk-based testing
Slide 80
Organizational scope
This is the world of Risk & Insurance,
dominated by financial risk management
This scope rarely has any visibility into
software development, except perhaps as
operational risk
The idea of Enterprise Risk Management
has been gaining steam, tying together
disparate risk management efforts,
sometimes with a Chief Risk Officer
Slide 81
Project risk management
Primarily focused on scope, schedule, and resources, especially schedule
Sometimes covers technical risks, but not
usually in enough detail
Slide 82
Test/QA project scope
Same as project management
The test plan and QA plan are often separate
from the overall development project plan/risk
management plan
So these plans should list their own risks that affect their specific parts of the project
Also, the risks listed in the project plan
should influence testing and QA activities
Slide 83
QA involvement
Because the scope of a QA group usually covers all project activities, the QA group could get involved in many of the project risks
The QA group could facilitate all phases of
risk management
The QA group should clearly define their
scope, because some risks involve sensitive
management issues
Slide 84
Risk-based testing
Use any existing risk data at any scope
But even with good risk management at the
project level, you won't have all the
technical information you need to guide
testing decisions
Look for technical risks that could impact
quality for any part of the product, and plan
more testing effort for riskier areas
Slide 85
Varying techniques
You can mitigate risks by modifying the
mix of quality techniques that you use
The most common example is the interplay
between testing and technical review
If the organization doesn't do good technical
reviews, plan to do lots of testing
If there are areas that are difficult to test, do
code reviews instead
We also discussed unit testing implications
earlier
Slide 86
Sidebar on bug tracking
Bug tracking is risk management
The cause is the bug, the risk is how the
bug might affect the user
The priority should be based on
impact (bug severity)
likelihood (what are the odds that a typical user
will run into the bug?)
Ultimately, this feeds into a risk-based
release decision
Slide 87
Context-driven testing
Split into two groups. For two different sample projects with the same software but different risks, plan the testing effort.
Compare the results.
Slide 88
Section 4 references
Software Engineering Institute -
http://www.sei.cmu.edu/organization/programs/sepm/risk/
Project Management Institute RiskSig - http://www.risksig.com/
Heuristic Risk-Based Testing, James Bach, STQE magazine,
November 1999, http://www.satisfice.com/articles/hrbt.pdf
Risk and Insurance Management Society http://www.rims.org/
Enterprise risk management information http://erisk.com/
Slide 89
5
Exploratory testing
Slide 90
Testing in the Dark
A project manager strides purposefully
into your office. "This disk has the latest
and greatest release of our software.
Please test it. Today." You say, "Okay,
sure...what does it do?" The manager
stops in his tracks and says, "Uh, the
usual stuff..."
[from Testing in the Dark, by Johanna Rothman and Brian
Lawrence]
Slide 91
What do we do now?
Determine objectives
Discover requirements
Plan and conduct testing
For Testing in the Dark situations, sometimes
exploratory testing is the best option
Slide 92
Determine objectives
Determine the high-level objectives for the
project. Some possibilities
Deliver a highly reliable and robust application
Get the product to market quickly
Deliver as many new whiz-bang features as
possible
Get a prototype out so customers can give us
feedback on the interface
The testing for each of these will be very
different!
Slide 93
Discover requirements
Do some detective work to learn as much as
you can about the requirements
Look for requirements documents (they may be out of date, but still good to start with)
Talk to people on the project
Read any available documentation or help text
Run the software
Slide 94
Document & validate
Document the requirements you found
This probably is not a formal or lengthy
document
Ask people on the team, "These are the requirements, right?" and then make modifications based on their feedback
Tell people that you intend to file bugs if the software doesn't meet these requirements
Slide 95
Plan and conduct testing
You may need to do formal testing, depending on the project's objectives
There are serious risks with trying to build a
highly reliable product on such a shaky
foundation
But often, your main testing approach will
be exploratory testing
Slide 96
What is exploratory testing?
Test design and test execution at the same
time
Non-scripted testing
"Scripted" here means planned step-by-step, and shouldn't be confused with using a scripting language for test automation
(James Bach is the industry's champion for exploratory testing)
Slide 97
When to use it
Any time you don't need to repeat the same tests over and over, and when you don't need detailed test reports about every test case you run
Even when you do formal scripted testing, you can use exploratory testing early, before your formal tests are planned; you'll be reporting bugs much sooner!
Slide 98
Exploratory isn't ad-hoc
Exploratory testing is not just random
banging on the keyboard
First, decide what general area you want to test
for a session, about 90 minutes
(test lead may do this)
Spend a few minutes exploring and forming a
game plan
Do the testing
Prepare a brief report and make sure bugs get
filed
Slide 99
Doing the testing
Exploratory testers should be able to
intuitively sniff out where the big bugs are
Have the discipline to cover the most commonly used features; don't spend all your time with corner cases
It's breadth-first coverage; you can't get bogged down in any one area
But if you run into a large nest of bugs, you may need to reduce your scope for the session
Slide 100
Exploratory exercise
Test a virtual software system you've never
seen before using exploratory testing
techniques.
Slide 101
Section 5 references
Testing in the Dark, Johanna Rothman & Brian Lawrence, STQE
magazine, March/April 1999, http://www.stickyminds.com
James Bach's series of exploratory testing columns on
http://www.stickyminds.com Where Does Exploratory Testing Fit?
Exploratory Testing and the Planning Myth, and What Is
Exploratory Testing?
Session-Based Test Management, Jonathan Bach, STQE magazine,
November 2000, http://satisfice.com/articles/sbtm.pdf
Slide 102
6
Test design techniques
Slide 103
Approaches for this section
Boxes of many colors
Techniques smorgasbord
Slide 104
Boxen
Black-box testing
White-box testing
Gray-box testing
Slide 105
Black-box testing
Treat the software as a black box that you
can't see inside. Focus on the user-visible
behavior.
Some recommend not allowing testers to
know anything about the implementation of
the system, but this idea is out of favor
Also called: behavioral testing, functional
testing
Slide 106
White-box testing
Look inside the box and exercise what's there. Focus on covering the code.
Also called: structural testing, glass-box
testing, translucent-box testing
Not necessarily equivalent to unit testing
Slide 107
Gray-box testing
Test user-visible functionality, but learn the
architecture of the product first.
This can help the tester focus the testing and
be much more effective. (Black-box testing
has been called ignorance-based testing.)
Testers need to be wary when a developer says "you don't need to test that"; draw your own conclusions
Slide 108
Conventional wisdom on boxes
Start with gray-box testing
focus on functionality
Get white-box feedback
gather metrics to determine how well you
covered the code
Revisit the test design to increase coverage
Resist the temptation to add a test specifically to cover another line of code; this is not likely to find additional bugs
Slide 109
Afterword on boxes
You choose a box when you decide how
to approach test design.
After the tests are designed, you can't tell
by looking at them which approach you
used.
Some experts want us to stop talking about
boxes altogether, but the terms are already
well-entrenched.
Slide 110
Techniques
Requirements-based testing
Functional testing
Syntax testing
Domain testing
User scenarios
All pairs
Slide 111
Requirements-based testing
Verify that the requirements were all
satisfied (often a subjective process)
Use a traceability matrix to map
requirements to design to tests
This technique is more common in large software projects under custom contracts; it's a fundamental type of testing in cases where you have good requirements
Slide 112
Requirements - gotchas
Requirements-based testing is difficult if the requirements aren't in good shape; they should each be unambiguous and atomic
Some requirements aren't testable
Sometimes you have no documented
requirements at all
Slide 113
Feature testing
For each feature, verify that it works the
way it should
Can be either at the unit or system level
Usually based on a functional specification
or user documentation
A very straightforward way to verify that
the software matches the documentation
Slide 114
Negative feature testing
Also test with invalid inputs
20-80% of functional testing should be
negative testing, depending on how robust
the application needs to be
Only one data point at a time should be invalid; all other data should be valid
Make the data just barely invalid; make it look as much like valid data as possible
Slide 115
Feature testing gotchas
Can be tedious
Usually tests each feature in isolation; doesn't catch feature interactions
Requires a lot of maintenance if the tests are thorough
Doesn't necessarily exercise the software the way the user will
Slide 116
Domain testing
Analyze the domain of possible inputs
Choose one data point for each equivalence
class
Also called boundary testing
A technique for finding fencepost errors
Slide 117
Domain testing example
Credit card transactions must be at least $20
but less than $1000
20 <= transaction < 1000
Test data (3 equivalence classes):
19.99 (rejected)
20 (accepted)
999.99 (accepted)
1000 (rejected)
No further cases are required to satisfy
domain testing
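As an illustration of those boundary values (the validation function below is hypothetical, not from the course), a minimal sketch:

    def transaction_accepted(amount):
        # Assumed rule from the example: at least $20 but less than $1000.
        return 20 <= amount < 1000

    # The boundary data points from the example, with expected outcomes.
    cases = [(19.99, False), (20, True), (999.99, True), (1000, False)]
    for amount, expected in cases:
        assert transaction_accepted(amount) == expected, amount
    print("domain tests passed")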
Slide 118
Domain testing gotchas
If your domain analysis has flaws, you'll miss bugs
Some bugs don't lurk at boundaries, so they're immune to this approach
Slide 119
Syntax testing
Specify the syntax that the program accepts
Define positive and negative tests based on
the syntax specification
With this formal analysis, you'll find cases that you otherwise wouldn't have tried
Slide 120
Syntax testing example
BNF diagram for a real number input field:
[ + | - ] digit{1-6} [ . digit{1-5} ]
(an optional sign, one to six digits, then an optional decimal point followed by up to five more digits)
"What do you do when you see a graph? Cover it!" - Boris Beizer
Slide 121
Syntax testing exercise
What test cases would you design, based on
the syntax diagram?
Slide 122
Sample syntax test cases
Positive cases:
0
0.
123456.78901
+1
1
999999
+999999.99999
Negative cases:
+
-
.
A
.1
1 2
--1
{null string}
1234567
1.123456
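A sketch of how these cases might be automated; the regular expression is my own reading of the reconstructed syntax diagram (optional sign, one to six digits, optional decimal point with up to five more digits), so treat it as an assumption:

    import re

    NUMBER = re.compile(r"^[+-]?\d{1,6}(\.\d{0,5})?$")

    positive = ["0", "0.", "123456.78901", "+1", "1", "999999", "+999999.99999"]
    negative = ["+", "-", ".", "A", ".1", "1 2", "--1", "", "1234567", "1.123456"]

    for case in positive:
        assert NUMBER.match(case), case
    for case in negative:
        assert not NUMBER.match(case), case
    print("syntax tests passed")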
Slide 123
Syntax testing gotchas
It's hard to know when to stop adding more tests
It won't work well for very complex syntax diagrams; you have to break them into smaller chunks
Slide 124
User scenarios
Think from the user's point of view: devise scenarios based on the way the user will use the software
If someone already developed use cases,
then test them
For a more extreme approach, use Soap
Opera Testing
Slide 125
User scenario gotchas
Will likely miss many lesser-used features
in the software
It's difficult to account for all of the
possible variations in a scenario
A bug in a commonly-used part of the
software can derail user scenario testing
Slide 126
Random testing
Let a random number generator create your
test data for you
Dumb monkeys: very few constraints on the test data or smarts in generating meaningful data. Good at testing exception handling
Smart monkeys: design in some intelligence to make the test do something interesting. Better than dumb monkeys at finding bugs in mainstream functionality
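A toy "dumb monkey" (my own sketch; the function under test is hypothetical), just to show the shape of the idea:

    import random
    import string

    def parse_price(text):
        # Hypothetical function under test.
        return float(text.strip().lstrip("$"))

    random.seed(1)  # a fixed seed helps reproduce whatever the monkey finds
    for _ in range(10000):
        junk = "".join(random.choice(string.printable)
                       for _ in range(random.randint(0, 20)))
        try:
            parse_price(junk)
        except ValueError:
            pass  # cleanly rejecting bad input is fine
        except Exception as bug:
            print("possible bug:", repr(junk), bug)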
Slide 127
Monkey hierarchy example
Operating system reliability tests
Dumb monkey - crashme
Fairly dumb monkey - random system calls
Smarter monkey - system call scenarios
Smartest monkey - shell stress
Slide 128
crashme
Dumb monkey. An open source test
program that executes random data.
Mostly exercises low-level exception-handling in the operating system, but it's very good at doing this.
Rarely makes a legitimate call into the
operating system.
Slide 129
Random system calls
Slightly less dumb. Targets randomly-selected system calls with randomly selected data.
Does a better job of exercising the operating
system, but still mostly finds error cases.
Good at exercising rarely-used system calls
(even undocumented system calls)
Slide 130
System call scenarios
I never got to implement this one, but it would have gone like this:
Randomly choose system calls, but give them mostly legitimate data.
Or, craft small blocks of system calls that do something useful and can be put together in any order.
This gets us mostly positive test cases.
Slide 131
Shell stress
Smartest monkey, but possibly also the least
random.
Uses a database consisting of blocks of shell
commands that represent typical user
activities (list files, edit and compile, etc.).
Randomly pulls blocks from this database
and executes them.
Good at generating a load on the system
that is representative of a real user load.
Slide 132
How to use the hierarchy
Which level of the monkey hierarchy is
best?
They all tend to find different bugs, so all
levels are useful
Slide 133
Random testing gotchas
May have to run for a very long time in
order to get good test coverage
For complex random test scenarios, test
results may be hard to reproduce
The oracle problem: how do you verify test results when the tests are random?
Slide 134
All pairs
Let's take a subset of the Windows clock example
Let's consider these features:
size - tiny, small, maximized
analog/digital
am/pm
GMT/local
title/no title
There are 48 possible combinations
(3 X 2 X 2 X 2 X 2)
Slide 135
Reducing the test set
With the all-pairs technique, we only need 8
tests
The savings increases as the number of
variables increases
We have to evaluate the risk of a bug based
on particular values in three or more
variables
It's best explained by mapping out the pairs
by hand
Slide 136
One all-pairs solution
title   GMT/local   am/pm   analog/digital   size (t/s/m)   test
N       G           A       D                S              8
T       L           P       A                T              7
T       L           A       D                M              6
N       L           A       A                S              5
T       L           P       D                T              4
N       G           P       A                M              3
T       G           P       D                S              2
N       G           A       A                A              1
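One way to convince yourself that a candidate test set really covers every pair is a small coverage check like the sketch below (my own addition; it verifies pair coverage rather than generating an all-pairs set):

    from itertools import combinations, product

    factors = {
        "size": ["tiny", "small", "maximized"],
        "mode": ["analog", "digital"],
        "meridiem": ["am", "pm"],
        "zone": ["GMT", "local"],
        "title": ["title", "no title"],
    }

    def covers_all_pairs(tests):
        # Every value pair from every two factors must appear in some test.
        for f1, f2 in combinations(factors, 2):
            needed = set(product(factors[f1], factors[f2]))
            seen = {(t[f1], t[f2]) for t in tests}
            if needed - seen:
                return False
        return True

    # A test is a dict such as {"size": "tiny", "mode": "analog", ...}.
    exhaustive = [dict(zip(factors, values)) for values in product(*factors.values())]
    print(covers_all_pairs(exhaustive))  # True - 48 tests; all-pairs needs only 8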
Slide 137
All pairs gotchas
An advanced technique that requires
expertise and special tools in non-trivial
cases
Will probably miss bugs if the bugs are
based on combinations of three or more
variables
Slide 138
Others
State testing
Transaction flow
Data flow
Finite state
Loop testing
Slide 139
Which technique to use?
Never use just one
All test design techniques have weaknesses; use more than one of them to help overcome the weaknesses
Slide 140
Test techniques exercise
Discuss the appropriate combination of
techniques to use for a few different testing
projects.
Slide 141
Section 6 references
Test Design: Developing test cases from use cases, Ross Collard,
STQE magazine, July/August 1999, http://StickyMinds.com
Using Monkey Test Tools, Noel Nyman, STQE magazine,
January/February 2000, http://www.stickyminds.com
Black-Box Testing, Boris Beizer, John Wiley & Sons, 1995, ISBN 0-
471-12094-4
AETG Web Service (all-pairs tool)
http://aetgweb.argreenhouse.com/
Planning Efficient Software Tests, Madhav S. Phadke,
http://www.stsc.hill.af.mil/CrossTalk/1997/oct/planning.html
(orthogonal arrays - similar to all-pairs)
Slide 142
7
System testing
Slide 143
Two system test points of view
Functional testing
Do all the features do what they're supposed to do from the user's point of view?
Non-functional testing
An unfortunate term!
There are emergent properties of the system that aren't directly related to the features: reliability, usability, maintainability; the "ilities"
Slide 144
Functional system testing
If you need to have requirements
traceability, this may involve a formal
process of verifying the requirements
You have probably done much of the
functional testing at lower levels
Evaluate which tests need to be repeated at the
system level
User scenario testing falls under system testing
Slide 145
Non-functional testing
Usability Testing
Load Testing
Performance Testing
Stress & Hot Spot Testing
Spike & Bounce Testing
Reliability Testing
Configuration Testing
Slide 146
Usability testing
Verify that the user interface is intuitive
Can be done informally using staff on hand,
especially people outside the development
team (If our division manager can use it,
anyone can.)
Formal usability testing involves
representatives of the user base and special
labs to monitor their use of the product
Slide 147
Load testing
Everyone seems to use this term differently.
May involve:
Running a typical user load for a short period of
time
Running a high user load for a short period of
time
Stress testing, Performance testing, or
Reliability testing (all described later)
Slide 148
Performance testing
Make sure the user will get their results
when they need them
Focuses on
Throughput: how much total work can the system do per unit time
Time-to-solution: how long do I have to wait before I get my results?
Has some overlap with load testing
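A bare-bones sketch of measuring both numbers (mine, with a placeholder workload standing in for the real operation):

    import time

    def do_one_request():
        sum(range(100000))  # placeholder for the operation being measured

    times = []
    start = time.perf_counter()
    for _ in range(50):
        t0 = time.perf_counter()
        do_one_request()
        times.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start

    print("throughput: %.1f requests/sec" % (len(times) / elapsed))
    print("worst time-to-solution: %.4f sec" % max(times))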
Slide 149
Stress & hot spot testing
Stress testing involves pushing the system
to its limits and beyond
You verify that the system keeps running
and gives a reasonable error when you use
up resources and then ask for more
Hot spot testing is stress testing focused on
a particular part of the system
Slide 150
Spike & bounce testing
A variation on load testing
Instead of keeping the load constant, spike
it up, then drop it back down (lather, rinse,
repeat :-)
Tests how well the system can recycle its
resources
Slide 151
Reliability testing
How long will the system keep running
under a specified set of conditions?
For informal reliability testing, you might
just run a medium-level load test over a
long period of time
For more formal reliability testing, use
Musa's Software Reliability Engineering
(high level of expertise required)
Slide 152
Configuration testing
Examine all the different configurations your system supports: operating systems, hardware, peripherals, software configuration options, versions of third-party programs it works with, etc.
Find a reasonable way to test the more
common configurations (hint: try the all pairs
technique)
Use outsourced test labs or beta testing when
necessary
Slide 153
Beta Testing
If you've done all the testing you can do,
but you want a few users to try the system
before general release, run a beta test
Beta tests must be managed carefully to
make sure users actually install the software
and that bug reports get back to you
Make sure there's time in the schedule to
address the problems you find in the beta
test before the general release
Slide 154
Section 7 references
Performance Testing Terms - the Big Picture, Danny Faught, Tejas
Software Consulting Newsletter, April 2001,
http://tejasconsulting.com/newsletter/2001April.html (based on a
training session given by Steve Splaine)
Experience with OS Reliability Testing on the Exemplar System,
Danny Faught, Quality Week 97 conference,
http://tejasconsulting.com/papers/cho_qw97/cho_qw97.pdf
The Web Testing Handbook, Steven Splaine, STQE Publishing, 2001,
ISBN 0-9704363-0-0
Software Reliability Engineering, John Musa, McGraw Hill, 1998,
ISBN: 0079132715
John Musa's home page - http://members.aol.com/JohnDMusa/
Software System Testing and Quality Assurance, Boris Beizer, Van
Nostrand Reinhold, 1984, ISBN 0-442-21306-9 (out of print, but often
available in used book stores; note somewhat outdated)
Slide 155
8
Test documentation
Slide 156
What to document?
IEEE Std 829 includes:
Test plan
Test design specification
Test case specification
Test procedure specification
Test item transmittal report
Test log
Test incident report
Test summary report
Slide 157
Wow!
That's a lot of documentation!
IEEE 829, IEEE Standard for Software Test Documentation, is a popular resource, especially for the test plan standard
But, most people customize it and aren't compliant with all parts of the standard
Note that the standard isn't free, but you can find many free templates that are based on it
Slide 158
Test plans
The test plan is a project plan for the test
project
IEEE 829 is a good template; pick and choose what sections are needed based on the size of your project
(Many people use this term differently; always ask what people mean when they say "test plan")
Slide 159
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Sections from the IEEE test plan
A) Test plan identifier
B) Introduction
C) Test items
D) Features to be tested
E) Features not to be tested
F) Approach
G) Item pass/fail criteria
H) Suspension criteria and
resumption requirements
I) Test deliverables
J) Testing tasks
K) Environmental needs
L) Responsibilities
M) Staffing and training
needs
N) Schedule
O) Risks and
contingencies
P) Approvals
54
Slide 160
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Test designs
The test plan shouldn't go into detail on
what each test suite or test case will do
(combining the IEEE "Test design
specification" and "Test case specification")
Write a test design for each suite of tests
Tests may be:
Exploratory
Manual (scripted step-by-step)
Automated
Slide 161
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Designing exploratory tests
There isn't much literature on this - on how
far ahead of time you should design
exploratory tests
You'll need charters for each exploratory
test session, but not step-by-step details
Slide 162
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Designing manual tests
You want to give the manual tester
instructions for running the test
But you have a lot of leeway in how much
detail you put in the instructions
You could document the test in great
amounts of detail
But many experts prefer to give the tester
some leeway in their approach so you run
the test a little bit differently each time
55
Slide 163
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Designing automated tests
Creating automated tests is a software
development activity - design automated
tests like you would design any software
You may write tests to be manual for now
and automated later
Sometimes you might use a rapid
prototyping approach, e.g., for running
stress tests early in the project
Slide 164
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Anatomy of a test case
1. Setup, environmental requirements
2. What the test does
3. Cleanup
4. Expected results
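As a sketch, a pytest-style test makes the four parts explicit; parse_config below is just a stand-in for the real code under test.

import pytest

def parse_config(path):
    """Stand-in for the feature being tested."""
    key, value = path.read_text().split("=")
    return {key.strip(): int(value)}

@pytest.fixture
def config_file(tmp_path):
    # 1. Setup, environmental requirements: create the input the test needs.
    path = tmp_path / "app.cfg"
    path.write_text("timeout = 30\n")
    yield path
    # 3. Cleanup: pytest removes tmp_path for us, so nothing extra is needed.

def test_parse_config_reads_timeout(config_file):
    # 2. What the test does.
    config = parse_config(config_file)
    # 4. Expected results.
    assert config["timeout"] == 30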
Slide 165
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Test reporting
You may need detailed test reports, showing
exactly what tests were run and what the
results were
You may need summary test reports,
showing how many tests passed or failed
Bug reports - use whatever bug tracking
system the rest of the project uses
56
Slide 166
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
How much documentation?
Military standards would be too much for a
small low-risk software project
Examine how your documentation will be
used, and create only what will be used
Slide 167
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
The quality plan
The project plan for the QA team
Define the high-level quality standards of
the project
Plan the necessary QA activities for the
project
See IEEE Std 730, but don't be scared away
by the excessive focus on auditing
Slide 168
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Section 8 references
IEEE Std 829-1998, IEEE Standard for Software Test Documentation,
ISBN 0-7381-1443-X
IEEE Std 730-1998, IEEE Standard for Software Quality Assurance
Plans, ISBN 0-7381-0328-4, plus the guide, IEEE Std 730.1-1995
Testing Computer Software, Cem Kaner, Jack Falk, Hung Quoc
Nguyen, Wiley Computer Publishing, 1999, ISBN 0-471-35846-0
(chapter 12)
Samples and templates - click "Templates" at
http://www.stickyminds.com/swtest.asp?tt=Software+Testing
57
Slide 169
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
9
Bug isolation and reporting
Slide 170
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Premise
"A problem well-stated is half-solved."
- Carl Stimson
Getting a failure isolated and properly
described is a large part of the debugging
process
Testers will have more success in getting
bugs fixed if they engage in this process
Slide 171
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
A successful test!
You have a test failure. What do you do
next?
Do it again the same way
Is it easily reproducible?
Do you know all the steps required to
reproduce it?
Does it fail the same way each time?
58
Slide 172
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
What, there's more?
You could file a bug now, but you'll be
more successful if you take a few more
steps first
(Reproduce)
Simplify
Generalize
Slide 173
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
What if you can't reproduce it?
"There are no intermittent software errors."
(Testing Computer Software, p. 82)
The problem is still there, but the failure is
intermittent
Gather as much data as you can about the
failure before you lose it (capture error
dialogs, log files, etc.)
If you can't reproduce it, file a bug report
anyway and say it's hard to reproduce
Slide 174
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Simplify
Ideal: push one button, and there's the bug
When you first encounter a bug, you will
often have done more than the minimum
necessary to reproduce it
Remove irrelevant steps that will just
complicate the debugging process
59
Slide 175
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
The critical data point
You want to find one set of conditions that
reproduces the bug, where a tiny change in
those conditions does not reproduce it
Needs improvement: "Software crashes
with a 10-megabyte data file"
Better: "Software crashes with a 32,768-
byte data file, but is fine at 32,767 bytes
or smaller"
Slide 176
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Binary search
If you need to zero in on simplified test
data, use a binary search
E.g., large data files, numeric or alphanumeric
parameters, program code
Remove half of the possibilities and see
which half still reproduces the bug
Take that half and cut it in half again
Continue until you find the precise
conditions that reproduce the bug
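A sketch of the binary search on test data size; crashes_with is a hypothetical oracle standing in for actually running the product on a file of the given size.

def crashes_with(size_bytes):
    """Hypothetical oracle: run the product on a file of this size and
    report whether the failure reproduces."""
    return size_bytes >= 32_768   # stand-in for the real check

def smallest_failing_size(low, high):
    """Binary search for the smallest size that still reproduces the bug.
    Assumes `low` passes and `high` fails."""
    while high - low > 1:
        mid = (low + high) // 2
        if crashes_with(mid):
            high = mid    # mid still fails, so the boundary is at or below mid
        else:
            low = mid     # mid passes, so the boundary is above mid
    return high

print(smallest_failing_size(0, 10_000_000))   # -> 32768 with this stand-in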
Slide 177
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Bottom-up simplification
If you have a hunch about one action out of
many that hits the bug, strip everything else
away and try it in isolation
You can sometimes quickly hit upon the
trigger for the bug
But more often, you need a non-obvious
combination of actions, so don't struggle
with bottom-up for long
60
Slide 178
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Top-down simplification
Start with the full set of actions that
reproduce the bug
Remove a few of those actions at a time and
check to see if you can still trigger the bug
Keep careful records about what reproduced
the bug and what didn't
You'll be trying many subtly different sets of
actions, and it's easy to lose track
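A sketch of top-down simplification that drops a few recorded actions at a time, keeps a log of every attempt, and shrinks the step list whenever the bug still reproduces; reproduces is a hypothetical oracle and the step names are made up.

def reproduces(steps):
    """Hypothetical oracle: replay the steps and report whether the bug appears."""
    return {"open file", "resize window"} <= set(steps)   # stand-in trigger

def simplify(steps):
    """Drop chunks of steps, re-check, and keep careful records of each try."""
    steps = list(steps)
    attempts = []                      # what reproduced the bug and what didn't
    chunk = max(1, len(steps) // 2)
    while chunk >= 1:
        i = 0
        while i < len(steps):
            candidate = steps[:i] + steps[i + chunk:]
            ok = reproduces(candidate)
            attempts.append((tuple(candidate), ok))
            if ok:
                steps = candidate      # those steps were irrelevant - drop them
            else:
                i += chunk             # keep them and move on
        chunk //= 2
    return steps, attempts

recorded = ["open file", "edit cell", "save",
            "resize window", "print preview", "undo"]
minimal, log = simplify(recorded)
print(minimal)   # -> the two steps that trigger the stand-in bug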
Slide 179
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Generalize
Ideal: push the first button you see and the
whole system crashes
Are the symptoms you see just the tip of the
iceberg?
Is there a way to reproduce the bug in a way
that users are more likely to do?
The main point: find the worst possible
consequence of the bug
Slide 180
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
How to generalize
Think of other shorter or more common
paths you can follow to get to the same bug
Try to set up interactions that make the
consequences of the bug worse
Keep using the software for a while after
you hit the bug - it's amazing how often
this leads to a crash
61
Slide 181
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Bug reporting
"The point of writing problem reports
is to get bugs fixed." (Testing Computer Software, p. 65)
Bug reporting is an art, and a critical skill
A poorly written bug report is going to decrease
the chances that a bug gets fixed
A well-written bug report can have a vastly
different result
Slide 182
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Bug reporting tips
The most important element of a bug report
is the summary line - make your point in
the first 5 words
Only 1 bug per report
Report severity and likelihood separately,
and don't confuse these with priority
Use a bug tracking tool so bugs don't fall
through the cracks
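A sketch of the fields such a report might carry, with severity, likelihood, and priority kept separate; the field names are illustrative, not any particular tracker's schema.

from dataclasses import dataclass, field
from typing import List

@dataclass
class BugReport:
    """Illustrative fields only; real trackers define their own schemas."""
    summary: str                      # make the point in the first few words
    steps_to_reproduce: List[str] = field(default_factory=list)
    expected_result: str = ""
    actual_result: str = ""
    severity: str = "major"           # how bad the failure is
    likelihood: str = "often"         # how likely users are to hit it
    priority: str = ""                # set by the project, not by the tester

report = BugReport(
    summary="Crash when saving a 32,768-byte data file",
    steps_to_reproduce=["Create a file of exactly 32,768 bytes",
                        "Choose File > Save"],
    expected_result="File is saved",
    actual_result="Application crashes with an unhandled exception",
)
print(report.summary)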
Slide 183
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
The bug description
Explain step-by-step what to do to
reproduce the bug (remember that someone
on a different system, while you're not
standing there, will try to follow the steps)
Point out the indicators that led you to
believe that there is a bug
Explain why you think this is a problem,
and what behavior you expected
62
Slide 184
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Bug reporting exercise
Write a bug report and look for
opportunities to improve it
Slide 185
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Section 9 references
Software Defect Isolation, Prathibha Tammana & Danny Faught,
High-Performance Computing Users Group, 1998, InterWorks 1998,
http://tejasconsulting.com/papers/iworks98/defect_isol.pdf
Testing Computer Software, Cem Kaner, Jack Falk, Hung Quoc
Nguyen, Wiley Computer Publishing, 1999, ISBN 0-471-35846-0
(chapters 4-6)
Cem Kaner's "Bug Advocacy" slides, http://stickyminds.com/
Slide 186
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
10
Static testing
63
Slide 187
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Static testing benefits
Static testing can find defects before your
code is ready to run and before the tests are
ready.
Static testing gets 100% code coverage
without having to traverse each path
separately.
Some people don't consider this "testing" -
in any case, it must be used in conjunction
with dynamic testing
Slide 188
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Static testing overview
In this section, we'll cover:
Reviews and inspections
Static analysis tools
Slide 189
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Reviews and inspections
A visual review of some work product,
usually for the purpose of finding potential
defects (issues)
Can also have a secondary purpose of
educating the reviewers
Like testing, if the author knows their work
will be reviewed, they're more likely to do a
good job in the first place
64
Slide 190
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
What to review?
project plans
requirements
designs
test plans
test designs
code
documentation
even proposals and
contracts
Any work product!
Slide 191
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
A typical review
From: [email protected]
To: dozens of people
Subject: product design is ready
I have mostly finished the product design
document, which is attached. Please take
a look some time and let me know if you
see any problems.
Slide 192
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Another typical review
Invite a few dozen people to a meeting to
review the document.
At the meeting, discuss issues that people
bring up, and debate possible solutions.
Run 45 minutes over the scheduled end
time, and get a third of the document
covered.
65
Slide 193
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
We can do better!
In the first example, we won't get
comments back from the key people who
need to review the document, and we won't
get comments back in a timely fashion.
The second example is better, but the
meeting is chaos. Meeting time is not
utilized efficiently.
Slide 194
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Types of review
Inspection
Walkthrough
Buddy check
Warning: You'll find tons of variation in
how these and other terms are used!
Slide 195
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
What do they have in common?
Inspections are the most formal type of
review, but all review types can benefit
from a few formalities.
Get the right people involved
Run meetings wisely
Get closure
Track the process
66
Slide 196
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Get the right people involved
Don't let your review team be self-selected.
Think about the roles you need to fill - you
need one and only one person for each role.
Give them advance warning: "I plan to be
ready to review this by X date; can you set
aside time to review it?"
It's okay to involve managers, if people
aren't afraid to critique the work of their
peers in front of management.
Slide 197
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Run meetings wisely
Decide ahead of time what is fair game to
discuss in the meeting.
How much time to spend discussing
solutions? If many solutions are still up in
the air, you're not ready for a review. Never
let a few issues dominate the meeting.
Learn how long it takes to do a review, and
schedule enough time for it.
Slide 198
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Get closure
If you aren't holding a review meeting, set a
deadline.
Decide on a disposition as a group:
1) Exit, no change required (very rare!)
2) Exit, with minor changes
3) Re-review required, after major changes
4) Not ready to review (meeting cancelled)
67
Slide 199
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Track the process
Like any process, it takes a while for
reviews to become a consistent part of the
culture.
Set standards and check up on them.
Check to see if you're still getting a good
return on your investment in reviews - are
they still finding defects efficiently?
Slide 200
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Inspections
Also called Fagan inspections, formal
inspections, and formal reviews
Led by a trained moderator
Advance preparation is required
The logging meeting is for logging issues
found and identifying new issues -
discussions about solutions are restricted
The best review method for finding defects
Slide 201
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Inspection process
(Diagram) The inspection flow runs through Planning, Kickoff, Checking,
Logging, and Editing, bounded by entry and exit criteria. The work product
and the source documents and checklists feed in; the outputs are a better
work product, source document defects, and process improvements.
68
Slide 202
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Walkthroughs
Usually don't involve a facilitator
Advance preparation is sometimes required,
sometimes optional, and sometimes not
possible
Useful for training the participants on new
technical material
Not as effective as inspections for finding
defects
Slide 203
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Walkthrough process
The author walks through the work
product page by page, or uses a prepared
presentation
Discussion may be allowed, if it contributes
to the goal of the walkthrough, and if the
time is managed properly
Slide 204
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Buddy checks
Also called desk checks
One or two people familiar with the
technology independently review a work
product and give feedback directly to the
author
Can be done via email, or with face-to-face
meetings
If feedback from more than two people is
needed, use an inspection instead
69
Slide 205
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Variations
There are many other types of reviews, and
there is no good standard set of terminology
for them
The books you read often give a rigid
definition of a type of review, but in
practice, everyone implements them a bit
differently
Use the best ideas from all the techniques
you learn about
Slide 206
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Choosing a review type
If it's important to remove as many defects
as possible, use an inspection
If you want to familiarize the reviewers
with the work product, or get consensus, use
a walkthrough
For a low-risk work product, use a buddy
check
Slide 207
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Combining review techniques
Even inspections only find about 50% of the
defects upon each pass
Often you should use two or more reviews
during the life cycle of a work product
70
Slide 208
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Buddy check prerequisite
(Diagram) A work product goes through a buddy check, which removes some
defects, and then through an inspection, yielding a better work product.
The buddy check finds the more obvious problems before the inspection.
Slide 209
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Maintenance work
(Diagram) A work product is inspected; after a new feature is added, the
resulting 2.0 work product is inspected again, focusing on what changed.
Slide 210
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Inspection, then walkthrough
(Diagram) A work product is inspected, producing a better work product;
then a walkthrough is used to introduce it to a broader audience.
71
Slide 211
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Integrating with testing
The review process feeds into risk-based
testing
If the work products for a particular feature
are well-reviewed, it may need less testing
than one that didn't get a good review
If a feature is difficult to test, you can
compensate with a more thorough review
Slide 212
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Static Analysis Tools
Some types of static analysis
Complexity analysis
Defect detection
Coding standards enforcement
Module mapping
(some tools cover more than one of these)
Slide 213
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Complexity analysis
Most common:
McCabe's cyclomatic complexity
The question is: what do you do with the
results?
Do complexity analysis only on mission-
critical code
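A rough sketch that approximates McCabe's metric for Python code by counting decision points with the standard ast module; a dedicated complexity tool will be more precise.

import ast

BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source):
    """Rough approximation: 1 + the number of decision points in the code."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))

code = """
def classify(n):
    if n < 0:
        return "negative"
    for d in (2, 3, 5):
        if n % d == 0:
            return "divisible"
    return "other"
"""
print(cyclomatic_complexity(code))   # 1 + if + for + if = 4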
72
Slide 214
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Defect detection
A tool can point out questionable coding
constructs
First, crank up the compiler warnings - the
tool you already have probably has features
that you're not using
You must usually tune a static analysis tool
to filter out hundreds of warnings about
things you don't care about
Slide 215
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Coding standards
Many static analysis tools can enforce
coding standards
Make sure your development organization
agrees on coding standards before trying to
use a tool to enforce them
Slide 216
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Module mapping
Tools can map out the tree of module/
library dependencies for your programs
This can be useful in tracking down
dynamic library problems and for verifying
the set of libraries that you need to install
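For Python programs, a sketch using the standard library's modulefinder; point run_script at your own entry script (here it just analyzes itself).

from modulefinder import ModuleFinder

finder = ModuleFinder()
finder.run_script(__file__)   # replace with the path to your program's entry script

# Map of every module the script pulls in, and where it was found.
for name, module in sorted(finder.modules.items()):
    print(name, getattr(module, "__file__", None))

print("Modules that could not be imported:", list(finder.badmodules))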
73
Slide 217
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Static analysis services
A few companies offer to scan your code
for you and send you a report
This can be useful if you don't want to
expend the effort to install and tune a static
analysis tool
You have to be willing to give your source
code to an outside company
Slide 218
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Section 10 references
Software Inspection, Tom Gilb & Dorothy Graham, Addison-Wesley,
1993, ISBN 0-201-63181-4
Handbook of Walkthroughs, Inspections, and Technical Reviews, third
edition, Daniel F. Freedman & Gerald M. Weinberg, Dorset House
Publishing, 1990, ISBN 0-932633-19-6
Design and code inspections to reduce errors in program
development, Michael E. Fagan, IBM Systems Journal, Vol 15, No. 3,
1976
(http://researchweb.watson.ibm.com/journal/sj/382/fagan.pdf)
Slide 219
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
11
Metrics
74
Slide 220
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Why metrics?
You have to understand how things are
working now in order to improve
You have to know how much things have
changed to know whether you actually
improved
You need to take measurements in order to
manage your products and processes
Slide 221
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Metrics pitfalls
Don't use metrics to evaluate individuals -
this is an incentive for them to skew the
numbers
It must be easy to collect the metrics
The harder it is to do, the less complete and
accurate the data will be
The best metrics are collected automatically
with no additional effort by the staff
Only record metrics that you will actually
use
Slide 222
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Code Coverage
Many different tools exist to measure
control flow coverage in the code
Which paths remain untested?
This is one good way to measure the
completeness of your tests
Note that coverage tools are more difficult
to use in embedded environments and tools
may not exist for less popular environments
75
Slide 223
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
How to use code coverage
Get good functional coverage first
Measure code coverage (preferably branch
coverage or better)
This is best done at the unit level
For integration or system tests, look at which
functions you hit, and function call coverage
Then revisit the test design to improve
coverage
Don't just add a test to cover a line of code - look
for holes in the test design
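A sketch using the third-party coverage.py package programmatically, with branch measurement turned on; the grade function stands in for real code under test, and the single call stands in for a test suite.

import coverage

def grade(score):                     # stand-in for the code under test
    if score >= 90:
        return "A"
    elif score >= 60:
        return "pass"
    return "fail"

cov = coverage.Coverage(branch=True)  # measure branches, not just lines
cov.start()
grade(95)                             # the "test suite": only one case so far
cov.stop()
cov.save()
cov.report(show_missing=True)         # shows lines and branches never reached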
Slide 224
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Functional Coverage
More important than code coverage, but
harder to measure
Measure how well you covered the
functionality from the user's point of view
Use a requirements traceability matrix
Count function points if you can get the
expertise
If all else fails, use the user documentation as a
checklist
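A sketch of a requirements traceability matrix: map each requirement to the tests that exercise it and report the uncovered ones. The requirement IDs and test names are made up.

# Hypothetical requirements and test names, for illustration only.
requirements = ["REQ-1 login", "REQ-2 password reset", "REQ-3 audit log"]

traceability = {
    "test_login_ok":     ["REQ-1 login"],
    "test_login_bad_pw": ["REQ-1 login"],
    "test_reset_email":  ["REQ-2 password reset"],
}

covered = {req for reqs in traceability.values() for req in reqs}
for req in requirements:
    tests = [t for t, reqs in traceability.items() if req in reqs]
    print(f"{req:25s} {len(tests)} test(s): {', '.join(tests) or '-'}")

print("Uncovered requirements:", [r for r in requirements if r not in covered])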
Slide 225
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Bug Metrics
Your bug tracking database is a rich source
of metrics
But it shouldn't be your only type of metric
Track close rate vs. incoming rate
Track open bugs by severity and length of
time open
Chart which modules have the most bugs
and take corrective action
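A sketch that computes incoming vs. closed counts per week and open bugs by severity from a few illustrative records; a real report would query the bug tracking system.

from collections import Counter
from datetime import date

# Illustrative bug records only.
bugs = [
    {"id": 1, "severity": "critical", "opened": date(2024, 1, 2), "closed": date(2024, 1, 9)},
    {"id": 2, "severity": "major",    "opened": date(2024, 1, 3), "closed": None},
    {"id": 3, "severity": "minor",    "opened": date(2024, 1, 8), "closed": None},
]

incoming = Counter(b["opened"].isocalendar()[1] for b in bugs)
closed   = Counter(b["closed"].isocalendar()[1] for b in bugs if b["closed"])
print("Incoming per ISO week:", dict(incoming))
print("Closed per ISO week:  ", dict(closed))

open_by_severity = Counter(b["severity"] for b in bugs if b["closed"] is None)
print("Open bugs by severity:", dict(open_by_severity))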
76
Slide 226
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Section 11 references
Software Metrics, Robert B. Grady & Deborah L. Caswell, Prentice
Hall PTR, 1987, ISBN 0-13-821844-7
Practical Software Metrics for Project Management and Process
Improvement, Robert B. Grady, Prentice Hall PTR, 1992, ISBN 0-13-
720384-5
How to Misuse Code Coverage, Brian Marick, Testing Computer
Software conference, 1999, http://www.testing.com/writings.html
A Buyer's Guide to Code Coverage Terminology, Brian Marick,
http://www.testing.com/writings.html
International Function Point Users Group - http://www.ifpug.org/
Slide 227
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
12
Process improvement
Slide 228
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Genius mode
Oh, no - we found a problem!
Look - here's a solution.
Let's implement that solution, quick!
...What's wrong with this?
77
Slide 229
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Maybe we're not such geniuses
Is this the most important problem that we
need to be addressing?
Did we choose the best solution?
Can we finish implementing the solution
and maintain it?
Slide 230
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
1) Identify the problems
This is the step that the literature offers the
least help with
Build a database of problems, er, areas for
potential improvement
Brainstorming sessions
Look at reports from retrospectives
Have a suggestion box to capture ideas
whenever they come up
Slide 231
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
2) Prioritize
This is especially difficult if you do a good
job of building a long list of improvement
areas!
Choose one or a few improvements that will
have the largest impact
Don't let large projects eclipse all ideas that
would be quick to implement and show
immediate benefits
78
Slide 232
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
3) Identify solutions
Gather further data about the area to be
improved.
Skipping this step leads you back to genius
mode.
Identify a handful of possible solutions.
Don't stop at just one.
Slide 233
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
4) Choose a solution
Once you have done the analysis and have
options before you, choose one of them.
Sometimes, you can choose more than one if
they work well together
Plan the improvement project and commit
resources to it
It's not a project if you don't have resources,
yet many people skip this step!
Slide 234
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
5) Manage the improvement
Countless improvement projects run out of
steam before they are completed
Track the project, preferably at a high level
Put it on the overall project schedule as a
first-class activity
Often you want to start with a pilot project
and then spread the improvement more
broadly
Recruit a champion to keep it going
79
Slide 235
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
On incremental improvement
Many large improvement projects have
failed
Smaller projects are more likely to be
approved and completed
Even for large projects, try to make
incremental steps that show progress
Slide 236
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Section 12 references
Test Process Improvement: A Practical Step-By-Step Guide to
Structured Testing, Martin Pol/Tim Koomen, 1999, Addison Wesley,
ISBN: 0201596245
The PIT Crew: A Grass-Roots Process Improvement Effort, Danny
Faught, STAR 1998 conference,
http://tejasconsulting.com/papers/star98/pit.pdf
Successful Software Process Improvement, Robert B. Grady, Prentice
Hall PTR, 1997, ISBN 0-13-626623-1
Out of the Crisis, W. Edwards Deming, MIT Press, 2000 (1986), ISBN
0262541157
Quality is Free, Philip Crosby, New American, 1982, ISBN
0451625854
Quality is Still Free, Philip Crosby, McGraw Hill, 1996, ISBN
0070145326
American Society for Quality http://www.asq.org/
Slide 237
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
13
Overview of automated testing
80
Slide 238
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
What to automate?
Test execution is the most popular task to
automate
There are many other opportunities
Test environment setup
Test reporting
Creating test data
Test design
Handling the bug database
QA tasks like configuration management
Slide 239
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Why automate test execution?
If you need to run tests more than once, you
might be able to save effort with automation
Rule of thumb: if you'll run the test at least 10
times, automate it
Manual testers may miss bugs that an
automated test wouldn't
Manual testers don't enjoy the drudgery of
repeating the same tests over and over
Slide 240
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Why not automate?
Manual testers can find bugs that automated
tests can't
If it will take lots of effort to automate the
tests, you might not get a good return on the
investment
The product itself should be designed with
automated testing in mind - "testability"
If the interfaces you're testing aren't stable,
test maintenance will be a big burden
81
Slide 241
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
GUI test automation
Many teams have tried a simplistic
capture/replay GUI test tool and then had to
throw the tests away when the GUI changes
GUI test automation is a programming task,
even if you use capture/replay to help
Try to test below the GUI
Ask developers to design the GUI as a thin
layer on top of an API - then call into that API
Slide 242
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Who does the automation?
You will need testers with programming
experience
If you set up a data-driven testing
infrastructure, then non-programmer
domain experts can create automated tests
You might be able to use technicians to run
the tests
Analyzing failures takes expertise
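A sketch of a data-driven setup: domain experts maintain rows of inputs and expected results, and a small driver runs them. The discount function and the table contents are made up.

import csv
import io

# Domain experts maintain rows like these, normally in a spreadsheet or CSV file.
TABLE = """order_total,coupon,expected_discount
100,NONE,0
100,SAVE10,10
250,SAVE10,25
"""

def discount(order_total, coupon):
    """Stand-in for the code under test."""
    return order_total * 0.10 if coupon == "SAVE10" else 0.0

def run_table(table_text):
    failures = 0
    for row in csv.DictReader(io.StringIO(table_text)):
        got = discount(float(row["order_total"]), row["coupon"])
        want = float(row["expected_discount"])
        status = "PASS" if abs(got - want) < 1e-9 else "FAIL"
        if status == "FAIL":
            failures += 1
        print(status, row)
    return failures

print("failures:", run_table(TABLE))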
Slide 243
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
How to automate?
Hundreds of commercial tools are available
for many different types of testing tasks
You can create your own tools if necessary
Beware that creating your own tools is usually
more expensive than you think
You will probably use a hybrid of
commercial tools with hand-crafted glue
where necessary
82
Slide 244
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
The sandwich model
The bottom and top of the sandwich often use
customized code, when it's difficult to add features to
the general-purpose harness in the middle
(Diagram) A custom front end on top, a reusable test
harness in the middle, and custom test code on the
bottom
Slide 245
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Section 13 references
Software Test Automation, Dorothy Graham & Mark Fewster, Addison
Wesley, 1999, ISBN 0201331403
Automated Software Testing, Elfriede Dustin, Jeff Rashka, & John
Paul, Addison Wesley, 1999, ISBN 0201432870
Event-Driven Scripting, Danny Faught, presentation for the July
2001 meeting of the DFW Unix Users Group,
http://tejasconsulting.com/papers/event-driven/Event-Driven.htm
"Scripts on My Tool Belt," Danny Faught, presentation for the Fall
2001 Software Test Automation Conference,
http://tejasconsulting.com/papers/toolbelt/toolbelt.ppt
Articles by Elisabeth Hendrickson
http://www.qualitytree.com/feature/feature.htm
Software Test Automation Conference
http://www.sqe.com/testautomation/
Slide 246
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
14
Resources
83
Slide 247
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Tools
Testing Tools Supplier List (includes
pointers to other resources)
http://www.testingfaqs.org
StickyMinds Tools Guide
http://www.stickyminds.com
Slide 248
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Printed references
Best-seller: Testing Computer Software, Cem Kaner, Jack
Falk, Hung Quoc Nguyen, Wiley Computer Publishing,
1999, ISBN 0-471-35846-0
Gentle (maybe over-simplified) introduction: Software
Testing, Ron Patton, Sams Publishing, 2001, ISBN 0-672-
31983-7
STQE magazine (Software Testing and Quality
Engineering) http://www.stqemagazine.com
IEEE Standards, Software Engineering (all software
standards together in 4 volumes)
Slide 249
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Web Sites
My writings - http://www.tejasconsulting.com/
StickyMinds - http://www.StickyMinds.com/
Comp.software.testing FAQ
http://www.crim.ca/ctl/cst.FAQ.html
Software Testing Hotlist
http://www.testinghotlist.com/
SRM Hotlist
http://www.cigitallabs.com/resources/hotlist
84
Slide 250
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Wrap-up
Please feel free to follow up with any
questions: [email protected],
817-294-3998
Consider subscribing to my free monthly
email newsletter -
http://tejasconsulting.com/#news
Please fill out your evaluations!
Slide 251
Introduction to Software
Testing and Quality
Assurance
2001 Danny R. Faught
Tejas Software Consulting
Thanks!
Thanks for your
participation!