Test Plan (A Real Sample) : /11/2014 My Company Name Abhishek Kulkarni
© www.mycompanyweb.com
Version: 1.0
Created: 08/11/2014
Last Updated: 08/17/2014
Status: DRAFT (The status will change to FINALIZED once the BA, PM and dev team have reviewed and signed off)
Sample Test Plan – OrangeHRM Live Project Training © www.mycompanyweb.com
Approvers List - to track who has reviewed and signed off on the test plan
Name | Role | Approver / Reviewer | Approval / Review Date
Reference Documents - clearly mark the documents used as input to create the test plan
Document Name | Version | Date
Table of Contents
1. INTRODUCTION
   1.1. Purpose
   1.2. Project Overview
   1.3. Audience
2. TEST STRATEGY
   2.1. Test Objectives
   2.2. Test Assumptions
   2.3. Test Principles
   2.4. Data Approach
   2.5. Scope and Levels of Testing
      2.5.1. Exploratory
      2.5.2. Functional Test
         TEST ACCEPTANCE CRITERIA
         TEST DELIVERABLES
         MILESTONE LIST
      2.5.3. User Acceptance Test (UAT)
         TEST DELIVERABLES
   2.6. Test Effort Estimate
3. TESTING SCHEDULE
4. EXECUTION STRATEGY
   4.1. Entry and Exit Criteria
   4.2. Test Cycles
   4.3. Validation and Defect Management
   4.4. Test Metrics
   4.5. Defect Tracking & Reporting
5. TEST MANAGEMENT PROCESS
   5.1. Test Management Tool
   5.2. Test Design Process
   5.3. Test Execution Process
   5.4. Test Risks and Mitigation Factors
   5.5. Communications Plan and Team Roster
6. TEST ENVIRONMENT
7. APPROVALS
1. INTRODUCTION
1.1. Purpose
This test plan describes the testing approach and overall framework that will drive the testing of the OrangeHRM Version 3.0 – My Info Module. The document introduces:
- Test Strategy: the rules the test will be based on, including the givens of the project (e.g., start/end dates, objectives, assumptions), and a description of the process to set up a valid test (e.g., entry/exit criteria, creation of test cases, specific tasks to perform, scheduling, data strategy).
- Execution Strategy: how the test will be performed, and the process to identify and report defects and to fix and implement fixes.
- Test Management: the process to handle the logistics of the test and all the events that come up during execution (e.g., communications, escalation procedures, risk and mitigation, team roster).
1.2. Project Overview
The My Info Module is a powerful tool that gives employees of the company the ability to view and update relevant personal information from an internet-enabled PC, without having to involve the HR department.
The functionality of this module spans the entire system, making information available anywhere, anytime. All information is subject to the company's defined security policy: employees can only view the information they are authorized to see. An ESS user can edit only certain fields in the ESS Module, maintaining the security and confidentiality of employee information.
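The field-level edit rule above can be sketched as a small access check. This is an illustrative sketch only: the role names and field lists are assumptions, since the actual policy is defined by the company's security configuration.

```python
# Hypothetical fields an ESS user may edit; the real list comes from the
# company's security policy, not from this sketch.
EDITABLE_BY_ESS = {"phone", "address", "emergency_contact"}

def can_edit(role: str, field: str) -> bool:
    """Admins may edit everything; ESS users only the whitelisted fields."""
    if role == "Admin":
        return True
    return role == "ESS" and field in EDITABLE_BY_ESS

assert can_edit("ESS", "phone")
assert not can_edit("ESS", "salary_grade")
```

Any view/edit request outside the whitelist would simply be refused, which is the behavior the security policy describes.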
1.3. Audience
- Project team members: perform the tasks specified in this document and provide input and recommendations on it.
- Project Manager: plans the testing activities in the overall project schedule, reviews the document, tracks the performance of the test according to the tasks herein specified, approves the document and is accountable for the results.
- Stakeholders' representatives and participants (individuals identified by the PMO Leads): may take part in the UAT to ensure the business is aligned with the results of the test.
- Technical Team: ensures that the test plan and deliverables are in line with the design, provides the environment for testing, and follows the procedures related to defect fixes.
- Business Analysts: provide input on functional changes.
2. TEST STRATEGY
2.1. Test Objectives
The objective of the test is to verify that the functionality of ORANGEHRM VERSION 3.0 – MY INFO MODULE works according to the specifications.
The test will execute and verify the test scripts; identify, fix and retest all high- and medium-severity defects per the entrance criteria; and prioritize lower-severity defects for future fixing via CR.
The final deliverables of the test are:
- Production-ready software
- A set of stable test scripts that can be reused for Functional and UAT test execution
2.2. Test Assumptions
Key Assumptions
- Production-like data is required and must be available in the system prior to the start of Functional Testing.
- In each testing phase, Cycle 3 will be initiated only if the defect rate in Cycle 2 is high.
General
Exploratory Testing would be carried out once the build is ready for testing
Performance testing is not considered for this estimation.
All defects will be reported with a screenshot in JPEG format
The Test Team will be provided with access to Test environment via VPN connectivity
The Test Team assumes that all necessary inputs required during test design and execution will be
supported appropriately by the Development team and Business Analysts.
Test case design activities will be performed by QA Group
Test environment and preparation activities will be owned by Dev Team
The Dev team will provide defect-fix plans, based on the defect meetings held during each cycle, and
will inform the Test team before the defect-fix cycles start
The BUSINESS ANALYST will review and sign off all test cases prepared by the Test Team prior to the
start of test execution
Defects will be tracked through HP ALM only. Any planned defect fixes will be shared with the
Test Team before they are applied to the test environment
Project Manager/BUSINESS ANALYST will review and sign-off all test deliverables
The project will provide test planning, test design and test execution support
The Test team will manage the testing effort in close coordination with the project PM and BUSINESS
ANALYST
Project team has the knowledge and experience necessary, or has received adequate training in
the system, the project and the testing processes.
There will be no environment downtime during the test due to outages or defect fixes.
The system will be treated as a black box; if the information shows correctly online and in the
reports, it will be assumed that the database is working properly.
Functional Testing
During Functional testing, the testing team will use preloaded data available in the system
at the time of execution
The Test Team will perform Functional testing only on ORANGEHRM VERSION 3.0 – MY INFO
MODULE
UAT
UAT test execution will be performed by the end users (L1, L2 and L3); the QA Group will support
the creation of the UAT scripts.
2.3. Test Principles
Testing will be focused on meeting the business objectives, cost efficiency, and quality.
There will be common, consistent procedures for all teams supporting testing activities.
Testing processes will be well defined, yet flexible, with the ability to change as needed.
Testing activities will build upon previous stages to avoid redundancy or duplication of effort.
Testing environment and data will emulate a production environment as much as possible.
Testing will be a repeatable, quantifiable, and measurable activity.
Testing will be divided into distinct phases, each with clearly defined objectives and goals.
There will be entrance and exit criteria.
2.4. Data Approach
In functional testing, ORANGEHRM VERSION 3.0 – MY INFO MODULE will contain pre-loaded test
data, which will be used for the testing activities.
2.5. Scope and Levels of Testing
2.5.1. Exploratory
PURPOSE: the purpose of this test is to make sure critical defects are removed before the
next levels of testing can start.
METHOD: exploratory testing is carried out on the application without test scripts or
documentation
2.5.2. Functional Test
PURPOSE: Functional testing will be performed to verify the functions of the application. It is
carried out by feeding inputs and validating the outputs from the application.
SCOPE: The embedded Excel sheet below details the scope of the Functional test. Note: the scope
is high level due to changes in the requirements.
To keep the document easy to fragment and categorize, the scope has been embedded as a
separate document. If you prefer, you can insert a table here instead. The scope is created
based on the test scenarios that were identified in the previous article.
Functional Testing
Scope.xlsx
METHOD: The test will be performed according to Functional scripts, which are stored in HP
ALM.
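The "feed the input, validate the output" approach can be illustrated with a small data-driven check. This is a sketch only: the `validate_phone` function and its cases are hypothetical stand-ins for a My Info field validation; the real scripts live in HP ALM.

```python
# Illustrative data-driven functional check: each case pairs an input with its
# expected result, mirroring the input/expected-output columns of a script.
import re

def validate_phone(value: str) -> bool:
    """Hypothetical validation for a My Info contact field:
    accepts 10-digit numbers with optional dashes."""
    return re.fullmatch(r"\d{3}-?\d{3}-?\d{4}", value) is not None

CASES = [
    ("555-123-4567", True),
    ("5551234567", True),
    ("555-12-4567", False),  # too few digits
    ("phone", False),        # non-numeric
]

def run_cases():
    failures = [inp for inp, expected in CASES
                if validate_phone(inp) != expected]
    return failures  # an empty list means all cases passed

if __name__ == "__main__":
    assert run_cases() == []
```

A real functional script would record each case's Pass/Fail status in HP ALM rather than asserting locally.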
TEST ACCEPTANCE CRITERIA
Sign-off Readiness
TEST DELIVERABLES
MILESTONE LIST
The milestone list is tentative and may change for the reasons below.
Testing is generally not carried out in one cycle. Based on the testing scope, we can
estimate how long it will take and establish the timelines, as shown in the embedded Excel
sheet below.
DFRT Execution
Cycle.xlsx
2.5.3. User Acceptance Test (UAT)
PURPOSE: this test focuses on validating the business logic. It allows the end users to
complete one final review of the system prior to deployment.
TESTERS: the UAT is performed by the end users (L1, L2 and L3).
METHOD: Since the business users are best positioned to provide input on business
needs and how the system adapts to them, users may perform some validation
not contained in the scripts. The Test team writes the UAT test cases based on inputs from the
end users (L1, L2 and L3) and the Business Analysts.
TIMING: After all other levels of testing (Exploratory and Functional) are done. Only after this
test is completed can the product be released to production.
TEST DELIVERABLES
2.6. Test Effort Estimate
This section lists all the activities that have to be performed by the QA team and estimates
how many man-hours each activity is going to take.
New_Detailed DRFT
Test estimate v1.xlsx
Note: this estimate is for the TCOE team only.
3. TESTING SCHEDULE
4. EXECUTION STRATEGY
4.1. Entry and Exit Criteria
The entry criteria refer to the desirable conditions required to start test execution; only the
migration of the code and fixes needs to be assessed at the end of each cycle.
The exit criteria are the desirable conditions that need to be met in order to proceed with the
implementation.
Entry and exit criteria are flexible benchmarks. If they are not met, the test team will assess the
risk, identify mitigation actions and provide a recommendation. All this is input to the project
manager for a final “go-no go” decision.
Entry criteria to start the execution phase of the test: the activities listed in the Test Planning
section of the schedule are 100% completed.
Entry criteria to start each cycle: the activities listed in the Test Execution section of the schedule
are 100% completed at each cycle.
Exit Criteria (tracked per Test Team and Technical Team, with notes):
- 100% Test Scripts executed
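Because entry and exit criteria are flexible benchmarks feeding a go/no-go decision, they can be modeled as a simple checklist evaluation. This is a sketch under assumptions: the criterion names and statuses below are illustrative, not the project's actual checklist.

```python
# Hedged sketch: evaluate criteria and surface unmet items for the project
# manager's go/no-go decision, rather than blocking automatically.
def assess_criteria(criteria: dict) -> tuple:
    """Return a recommendation plus the list of unmet criteria."""
    unmet = [name for name, met in criteria.items() if not met]
    recommendation = "go" if not unmet else "assess risk / mitigate"
    return recommendation, unmet

exit_criteria = {
    "100% test scripts executed": True,
    "no open critical defects": False,  # hypothetical status
}
rec, unmet = assess_criteria(exit_criteria)
# rec is the recommendation; unmet lists the failed criteria
```

The point of the design is that unmet criteria produce a recommendation and a risk list, matching the plan's statement that criteria are benchmarks, not hard gates.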
4.2. Test Cycles
o There will be two cycles for functional testing. Each cycle will execute all the scripts.
o The objective of the first cycle is to identify any blocking or critical defects and most of the
high-severity defects. Some workarounds are expected in order to get through all the scripts.
o The objective of the second cycle is to identify the remaining high- and medium-severity defects,
remove the workarounds from the first cycle, correct gaps in the scripts and obtain performance
results.
UAT test will consist of one cycle.
It is expected that the testers execute all the scripts in each of the cycles described above.
However, it is recognized that the testers may also do additional testing if they identify a
possible gap in the scripts. This is especially relevant in the second cycle, when the Business
Analysts join the TCOE in the execution of the test, since the Business Analysts have deeper
knowledge of the business processes. If a gap is identified, the scripts and the traceability matrix
will be updated, and a defect will then be logged against the scripts.
4.3. Validation and Defect Management
Defects will be tracked through HP ALM only. The technical team will gather information on a
daily basis from HP ALM, request additional details from the Defect Coordinator, and work on
fixes.
Responsibilities are divided as follows:
- Tester: open defects, link them to the corresponding script, assign an initial severity and status, retest and close the defect.
- Defect Manager: review the severity of defects, facilitate the fix and its implementation with the technical team, communicate to testers when the test can continue or should be halted, request retests, and modify status as the defect progresses through the cycle.
- Technical team: review HP ALM on a daily basis, ask for details if necessary, fix the defect, communicate to the Defect Manager when the fix is done, and implement the solution per the Defect Manager's request.
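The hand-offs above imply a defect lifecycle with a fixed set of legal transitions. The following is an illustrative state-machine sketch, not an HP ALM API: the status names and transitions are assumptions modeled on the responsibilities described here.

```python
# Assumed lifecycle: tester opens -> technical team fixes -> Defect Manager
# requests retest -> tester closes or reopens. Status names are illustrative.
ALLOWED = {
    "New": {"Open"},               # tester opens and links the defect
    "Open": {"Fixed"},             # technical team implements the fix
    "Fixed": {"Retest"},           # Defect Manager requests a retest
    "Retest": {"Closed", "Open"},  # tester closes, or reopens on failure
}

class Defect:
    def __init__(self, summary: str, severity: int):
        self.summary, self.severity, self.status = summary, severity, "New"

    def move(self, new_status: str) -> None:
        """Reject any transition the workflow does not allow."""
        if new_status not in ALLOWED.get(self.status, set()):
            raise ValueError(f"illegal transition {self.status} -> {new_status}")
        self.status = new_status

d = Defect("Save fails on My Info page", severity=2)
for step in ("Open", "Fixed", "Retest", "Closed"):
    d.move(step)
```

Encoding the allowed transitions as data makes it easy to see, at a glance, which role may act on a defect at each stage.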
Defects found during the Testing will be categorized according to the bug-reporting tool “Mercury HP
ALM” and the categories are:
Severity Impact
1 (Critical): This bug is critical enough to crash the system, cause file corruption or cause potential data loss. It causes an abnormal return to the operating system (a crash or a system-failure message appears), or causes the application to hang and requires rebooting the system.
2 (High): It causes a lack of vital program functionality, with a workaround.
3 (Medium): This bug degrades the quality of the system; however, there is an intelligent workaround for achieving the desired functionality, for example through another screen. Or the bug prevents other areas of the product from being tested, although other areas can be tested independently.
4 (Low): There is an insufficient or unclear error message, which has minimal impact on product use.
5 (Cosmetic): There is an insufficient or unclear error message that has no impact on product use.
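The severity table can be expressed as a small classification helper. This is a sketch only: the boolean impact flags are assumptions introduced to illustrate the five categories, not fields from HP ALM.

```python
# Map assumed impact flags to the severity categories in the table above.
def severity(crashes=False, vital_loss_with_workaround=False,
             degraded_with_workaround=False, unclear_message=False,
             affects_use=False) -> int:
    if crashes:
        return 1  # Critical: crash, corruption or data loss
    if vital_loss_with_workaround:
        return 2  # High: vital functionality lost, workaround exists
    if degraded_with_workaround:
        return 3  # Medium: degraded quality, intelligent workaround
    if unclear_message and affects_use:
        return 4  # Low: unclear message, minimal impact on use
    return 5      # Cosmetic: no impact on product use

assert severity(crashes=True) == 1
assert severity(unclear_message=True) == 5
```

In practice the tester assigns severity manually in HP ALM; the helper just makes the category boundaries explicit.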
4.4. Test Metrics
Test metrics to measure the progress and level of success of the test will be developed and shared
with the project manager for approval. Examples include the number of test cases executed, passed
and failed, and the number of open defects by severity.
4.5. Defect Tracking & Reporting
[Defect tracking workflow diagram] Start → Tester: reports defects → Test Lead: validates defects → Dev Lead: assigns defects → Developer: fixes the product defects → Tester: retests → Approved? Yes: close defect (Stop); No: reassign to the developer.
During defect-fix testing, defects are reassigned to the tester to verify the defect fix.
The tester verifies the fix and updates the status directly in HP ALM.
Various reports can be generated from HP ALM to provide the status of test execution: for
example, the number of test cases executed, passed and failed, the number of open defects,
severity-wise defect counts, etc.
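The execution-status figures those reports provide can be computed from raw run and defect records. The sketch below uses invented sample data; the record shapes are assumptions, not an HP ALM export format.

```python
# Hedged sketch: compute executed/passed counts, pass rate and open defects
# by severity from (invented) execution and defect records.
from collections import Counter

runs = [  # (test case, status)
    ("TC-01", "Passed"), ("TC-02", "Failed"), ("TC-03", "Passed"),
    ("TC-04", "No Run"),
]
defects = [  # (defect id, severity, open?)
    ("D-1", 2, True), ("D-2", 3, True), ("D-3", 4, False),
]

executed = [s for _, s in runs if s != "No Run"]
passed = executed.count("Passed")
pass_rate = round(100 * passed / len(executed), 1)
open_by_severity = Counter(sev for _, sev, is_open in defects if is_open)

print(f"executed={len(executed)} passed={passed} pass rate={pass_rate}%")
print(dict(open_by_severity))
```

The same aggregation, run daily, would feed the daily status report mentioned above.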
5. TEST MANAGEMENT PROCESS
5.2. Test Design Process
[Test design workflow] Understanding Requirements → Establishing Traceability Matrix in HP ALM → Preparation of Test cases → SME / Peer Review of Test cases → Incorporating Review comments in test cases
The tester will understand each requirement and prepare corresponding test cases to
ensure all requirements are covered.
Each test case will be mapped to use cases and requirements as part of the traceability
matrix.
Each test case will undergo review by the BUSINESS ANALYST; the review defects are
captured and shared with the Test team. The testers rework the review defects and
finally obtain approval and sign-off.
During the preparation phase, the tester will use the prototype, use cases and functional
specification to write step-by-step test cases.
Testers will maintain a clarification tracker sheet, which will be shared periodically
with the Requirements team, and the test cases will be updated accordingly. The
clarifications may lead to change requests, be ruled out of scope, or result in detailing
implicit requirements.
Sign-off for the test cases will be communicated by the Business Analysts through email.
Any subsequent changes to the test case if any will be directly updated in HP ALM.
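The traceability-matrix idea above amounts to checking that every requirement maps to at least one test case. The sketch below uses hypothetical requirement and test-case IDs; a real matrix would be maintained in HP ALM.

```python
# Sketch: verify requirement coverage from a test-case -> requirements map.
requirements = {"REQ-1", "REQ-2", "REQ-3"}  # hypothetical IDs
trace = {
    "TC-01": {"REQ-1"},
    "TC-02": {"REQ-2", "REQ-3"},
}

covered = set().union(*trace.values())
uncovered = requirements - covered
assert uncovered == set()  # every requirement is covered by some test case
```

A non-empty `uncovered` set is exactly the gap the review step is meant to catch before sign-off.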
5.3. Test Execution Process
[Test execution workflow] Execute each test step in the test case → Mark status as Pass/Fail in HP ALM → Raise defects for the failed test cases in HP ALM → Send the daily status report to the Test Lead → Participate in the Defect Triage cycle and explain the defects → Complete the execution of all test cases
Once all test cases are approved and the test environment is ready for testing, the tester
will start an exploratory test of the application to ensure the application is stable for
testing.
Each Tester is assigned Test cases directly in HP ALM.
Testers must ensure they have the necessary access to the testing environment and to HP
ALM for updating test status and raising defects. Any issues will be escalated to the Test
Lead and, in turn, to the Project Manager.
Any showstopper found during exploratory testing will be escalated to the respective
development SPOCs for fixes.
Each tester performs step-by-step execution and updates the execution status, entering a
Pass or Fail status for each step directly in HP ALM.
Tester will prepare a Run chart with day-wise execution details
For any failure, a defect will be raised in HP ALM per the severity guidelines, detailing
the steps to reproduce along with screenshots where appropriate.
Daily Test execution status as well as Defect status will be reported to all stakeholders.
Testing team will participate in defect triage meetings in order to ensure all test cases
are executed with either pass/fail category.
Defects found outside the documented test steps must also be captured in HP ALM and
mapped at the test-case level, or at the specific step where the issue was encountered,
after confirming with the Test Lead.
This process is repeated until all test cases are executed fully with Pass/Fail status.
During the subsequent cycle, any defect fixes applied will be retested and the results will be
updated in HP ALM during the cycle.
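The day-wise run chart a tester maintains can be produced by aggregating execution records per day. The dates and records below are sample data invented for illustration.

```python
# Sketch: aggregate (invented) execution records into day-wise pass/fail
# counts, the shape of a simple run chart.
from collections import defaultdict

records = [  # (date, test case, status) as logged during execution
    ("2014-08-11", "TC-01", "Passed"),
    ("2014-08-11", "TC-02", "Failed"),
    ("2014-08-12", "TC-02", "Passed"),  # retest after the defect fix
]

chart = defaultdict(lambda: {"Passed": 0, "Failed": 0})
for day, _, status in records:
    chart[day][status] += 1

for day in sorted(chart):
    print(day, chart[day])
```

Plotting these daily counts against the planned execution rate gives the run chart used to track progress across the cycle.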
5.5. Communications Plan and Team Roster
The following list defines, in general terms, the expectations of the roles directly involved in
the management, planning or execution of the test for the project.
1. Project Manager
2. Test Lead
3. Business Analyst
4. Development Lead
5. Testing Team
6. Development Team
7. Technical Lead
Project Manager: reviews the content of the Test Plan, Test Strategy and Test Estimates
and signs off on them.
Ensure entrance criteria are used as input before starting the execution.
Develop test plan and the guidelines to create test conditions, test cases, expected
results and execution scripts.
Provide guidelines on how to manage defects.
Attend status meetings in person or via the conference call line.
Communicate to the test team any changes that need to be made to the test
deliverables or application and when they will be completed.
Provide on premise or telecommute support.
Provide functional (Business Analyst) and technical personnel to support the test team (if
needed).
Develop test conditions, test cases, expected results, and execution scripts.
Perform execution and validation.
Identify, document and prioritize defects according to the guidance provided by the Test
lead.
Re-test after software modifications have been made according to the schedule.
Prepare testing metrics and provide regular status.
Review testing deliverables (test plan, cases, scripts, expected results, etc.) and provide
timely feedback.
Assist in the validation of results (if requested).
Support the development and testing processes being used to support the project.
Certify correct components have been delivered to the test environment at the points
specified in the testing schedule.
Keep project team and leadership informed of potential software delivery date slips
based on the current schedule.
Define processes/tools to facilitate the initial and ongoing migration of components.
Conduct first line investigation into execution discrepancies and assist test executors in
creation of accurate defects.
Implement fixes to defects according to schedule.
6. TEST ENVIRONMENT
ORANGEHRM VERSION 3.0 – MY INFO MODULE’s servers will be hosted at X company’s site.
ORANGEHRM VERSION 3.0 – MY INFO MODULE will be hosted on two servers: one to host the
actual website and (language) code, and the other to host the (database name) database.
A windows environment with Internet Explorer 8, 9 and 10, and with Firefox 27.0, as well as Google
Chrome 32.0 and later should be available to each tester.
7. APPROVALS
The Names and Titles of all persons who must approve this plan.
Signature:
Name:
Role:
Date:
Signature:
Name:
Role:
Date:
Note: This is a sample test plan created for the real-time software testing live
project training conducted by softwaretestinghelp.com.