
Clinical Dashboard Test Plan

Programme:        Clinical Dashboard       Document Record ID Key:
Sub-Prog /
Project:
Prog. Director:   Steve Gray               Status: Revision
Owner:            Don Payne                Version: 1.0
Author:           Sabrina Brown            Version Date: 27/07/09

Common Assurance Process


Test Plan
Clinical Dashboard Release 2009.2
Bolton PCT Implementation

© Crown Copyright 2024



Amendment History:
Version   Date         Amendment History
1.0       27/07/2009   Re-drafted for Bolton

Reviewers:
This document must be reviewed by the following:
Name Signature Title / Responsibility Date Version

Approvals:
This document must be approved by the following:
Name Signature Title / Responsibility Date Version

Distribution:
Clinical Dashboard Programme Manager
Clinical Dashboard Test Co-ordinator
Document Status:
This is a controlled document.
Whilst this document may be printed, the electronic version maintained in FileCM is the
controlled copy. Any printed copies of the document are not controlled.

Related Documents:
These documents will provide additional information.
Ref no   Doc Reference Number       Title                                Version
1        NPFIT-FNT-TO-RM-GEN-0172   CAP Glossary of Terms                0.2
2        NPFIT-FNT-TO-TIN-1070      Common Assurance Process Procedure   Latest

Glossary of Terms:
List any terms used in this document.
Term                       Acronym   Definition
Common Assurance Process   CAP       CAP is a single end-to-end process for assuring
                                     development and delivery of high quality and
                                     clinically safe IT. It provides assurance to the
                                     NHS, patients and other key stakeholders that a
                                     Service meets a given set of requirements.


Contents
1. Introduction
2. Features to be tested
3. Features not to be tested
4. Test Items
5. Item Pass/Fail Criteria
6. Approach
7. Suspension Criteria and Resumption Requirements
8. Test Deliverables
9. Testing Tasks
   9.1. Test Planning
   9.2. Test Estimating and Scheduling
   9.3. Allocation and Assignment of Personnel to the Testing Sub-Project
   9.4. Test Analysis and Design
   9.5. Test Specification
   9.6. Test Data Preparation
   9.7. Test Execution
      9.7.1. Resources
      9.7.2. Re-Test Procedure
   9.8. Test Management
   9.9. Test Environment Planning and Co-ordination
   9.10. Test Incident Management
   9.11. Test Automation
   9.12. Test Witnessing
      9.12.1. Inspection of Test Results, Scripts and Procedures
   9.13. Reporting
   9.14. Ensuring Quality, Completeness and Relevance
10. Test Issue Management
11. Environmental Needs
12. Responsibilities
13. Staffing and Training Needs
14. Schedule
15. Risks and Contingencies


1. Introduction
This document describes how the Test Strategy for release 2009.2 of the Clinical Dashboard
Product will be implemented by detailing the processes by which testing is conducted and
the means by which it is recorded and managed.
It also details the scope of the tests to be conducted, together with their expected outcomes.
This feeds into the Schedule, which details the tests required to cover the areas scoped in
this document.

This release of the Clinical Dashboard software incorporates the changes required for the
Bolton PCT implementation, including rendering gauges within Reporting Services reports,
due to the practice-level data filtering specified by the Trust.

2. Features to be tested
2.1. Functionality
Data Load and localisation
Reporting Services functionality
Sigma Business Intelligence functionality
SharePoint functionality
2.2. Interfaces
(External and Internal if necessary)
2.3. Report Production
Reports include “Urgent Contacts” sheets for each Practice
2.4. Screen Functionality
All screens to be usable, legible and to display the correct data.
2.5. Simulated Business Running
Test data will be loaded to check end-to-end reconciliation and screen functionality.
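
As a hedged illustration of what such a reconciliation check might look like, the sketch below compares totals from the loaded extract with the figure shown on the dashboard. The file layout and column names are assumptions for illustration, not the actual Medway Sigma schema.

```python
# Illustrative sketch of an end-to-end reconciliation check. The extract
# layout and column names are assumptions, not the actual dashboard schema.
import csv

def source_total(extract_path, value_column):
    # Sum a numeric column from the source extract used for the data load.
    with open(extract_path, newline="") as f:
        return sum(float(row[value_column]) for row in csv.DictReader(f))

def reconciles(extract_path, value_column, dashboard_total, tolerance=0.0):
    # Pass when the figure displayed on the dashboard matches the extract.
    return abs(source_total(extract_path, value_column) - dashboard_total) <= tolerance
```
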
2.6. Backup/Restore
Trust IT staff to have backed up data as instructed by the Technical team.
2.7. Failure/Recovery
Centrally tested.
2.8. Performance / Capacity / Volume / Limit / Stress
Centrally tested.
2.9. Operational / Housekeeping / Database administration
All terminals in Practice offices to be functional and set to correct resolution. Infrastructure to
be in place and data connections made.


2.10. Security and Access Control (e.g. Database Integrity Testing, Online Help
Testing, Context Sensitive Help Testing, Usability and Accessibility)
This will be tested as part of User Acceptance Testing.
2.11. Audit
This will be tested as part of User Acceptance Testing.

3. Features not to be tested
Identify all functionality, features and combinations of features that will not, or cannot, be
tested, together with some supporting explanation as to why they are out of scope. If anything
is not being tested it must be declared here to provide a balanced picture.
Use this as an opportunity to raise an issue for any area not being tested. Try to quantify the
risks of not testing and suggest alternative options, where appropriate.
Each feature not being tested must be cross-referenced to the NHS CFH requirements.

N.B.
A Requirements Matrix is one of the deliverables of CAP and will be created and submitted
at the Design Stage. This Requirements Matrix must be updated to reflect that all items in
this section are out of scope.

4. Test Items
A Test Item is anything that will be delivered: the physical things that are to be tested.
Identify all pre-requisite Test Items required to undertake the testing, including their
version/revision level.
Also identify anything that is needed to convert, transfer or migrate Test Items for the
purpose of testing.
Supply references to the associated item documentation where it exists:
•  Requirements Specification
•  Design Specification
•  Package Components
•  Bespoke Components
•  Package Configuration Specifications
•  Data
•  Implementation Procedures
•  Operations Documentation
•  Training Documentation
•  Production Support / Production Acceptance Criteria
•  Maintenance Agency Acceptance Criteria
•  User Guides
   - User
   - Administration

Reference any outstanding fault or incident reports associated with any Test Item.


Items that are to be specifically excluded from testing may be identified.

5. Item Pass/Fail Criteria
For each item listed in the “Features to be Tested” section of this Test Plan there must be
criteria for passing (or failing) that feature, such as the number of known and predicted
outstanding faults.
For example, “all tests are completed and the percentage of minor faults outstanding = …”.
This section is essential for knowing when to stop testing.

If the Entry and Exit Criteria for this Test Phase differ from what was defined in the Test
Strategy then any update should be detailed in this section.
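A minimal sketch of how such a criterion might be evaluated is shown below. The 5% threshold is an assumed example value, not one agreed for this plan.

```python
# Minimal sketch of an item pass/fail (exit) criterion. The 5% threshold is
# an assumed example value, not one agreed for this plan.
def exit_criteria_met(tests_planned, tests_completed,
                      minor_faults_outstanding, max_minor_fault_pct=5.0):
    all_completed = tests_completed >= tests_planned
    minor_fault_pct = 100.0 * minor_faults_outstanding / max(tests_planned, 1)
    return all_completed and minor_fault_pct <= max_minor_fault_pct

# Example: 40 tests planned, all completed, 1 minor fault outstanding (2.5%).
print(exit_criteria_met(40, 40, 1))  # True under the assumed 5% threshold
```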

6. Approach
Testing Stages are necessary to ensure that the code and configuration of the delivered
Dashboard are working in accordance with the requirements listed. To ensure this, there
must be a rigorous process within which agreed tests are executed, witnessed, documented
(together with any ensuing issues), reviewed and reported on.
It is envisaged that the testing will be conducted over a life cycle consisting of four stages:

•  Module/System Testing
•  Integration Testing
•  Ready for Operations Testing
•  User Acceptance Testing

Go Live will take place after Ready for Operations testing has been completed. Integration,
Ready for Operations and User Acceptance testing will be conducted on the live
environment. System testing is conducted on a separate test environment at System C’s
Warrington Office.

7. Suspension Criteria and Resumption Requirements

Suspension of testing will be called for if there is an unacceptably high number of Level 5
faults/issues. The exact criteria for suspension are to be agreed.

Resumption of testing will be authorised once the number of Level 5 issues has fallen below
the agreed threshold. Authority to resume will be sought from the NHS CFH/Trust resources
by the Supplier Test Co-ordinator.
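A sketch of the suspension and resumption rule is given below. The threshold value is a placeholder, since the exact suspension criteria are still to be agreed.

```python
# Sketch of the suspension rule above. The threshold is a placeholder value,
# since the exact suspension criteria are still to be agreed.
SUSPENSION_THRESHOLD = 10  # assumed value

def testing_suspended(open_issue_levels):
    # open_issue_levels: priority levels (1-5) of currently open issues.
    return sum(1 for level in open_issue_levels if level == 5) > SUSPENSION_THRESHOLD

def resumption_authorised(open_issue_levels):
    # Resume once open Level 5 issues fall back below the agreed threshold.
    return not testing_suspended(open_issue_levels)
```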

8. Test Deliverables
The following test documents will be supplied:
•  Incident Reports
•  Test Plan


•  Test Specification
•  Test Schedule
•  Test Scripts and Results Documentation
•  Test Report
•  Release Notes.

9. Testing Tasks
9.1. Test Planning
To be done by the Test Manager in consultation with the Development Team, Trust
and Authority.
9.2. Test Estimating and Scheduling
As above.
9.3. Allocation and Assignment of Personnel to the Testing Sub-Project
The Test Manager will request personnel from the Programme Manager and Product
Development Manager.
9.4. Test Analysis and Design
This will be done by the Test Department in consultation with the Product
Development Manager and the Programme Manager.
9.5. Test Specification
Covered in a separate document.
9.6. Test Data Preparation
A mixture of test and live data will be used, with live data coming in towards the end of
the test cycle. Live data will be anonymised if necessary.
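A hedged sketch of one possible anonymisation step is shown below. The field names are assumptions, and any real handling of live data must follow the Trust's information governance requirements.

```python
# Hedged sketch of anonymising live records before use in the test cycle.
# Field names are assumptions; real handling must follow the Trust's
# information governance requirements.
import hashlib

def pseudonymise(record, identifying_fields=("nhs_number", "surname", "dob")):
    # Replace direct identifiers with a stable, non-reversible token so that
    # records can still be matched across loads without exposing identity.
    out = dict(record)
    for field in identifying_fields:
        if field in out:
            out[field] = hashlib.sha256(str(out[field]).encode()).hexdigest()[:12]
    return out

print(pseudonymise({"nhs_number": "9434765919", "practice": "P81001"}))
```
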
9.7. Test Execution
9.7.1. Resources
The formal testing activity is estimated to take 5 days elapsed time to complete. The
following resources are required to be available during some or all of the testing
activity, as indicated below.
The actual test execution time for the formal testing activity is estimated at a total of 30
hours, scheduled within the 5 days allocated to the test phase. The detailed
programme for testing will be agreed with the Trust prior to testing. The Authority
resources required and the detailed resource requirements are listed in the Test
Specification.
The Authority will be informed of the date and location of the testing and may provide
witnesses if desired.


Resource Type    Trust                        Authority        Plans for Availability
Management       Test Manager                 None             In person or by a deputy
Execution        Test Team                    None             According to this test plan
Third Party /    Third-party suppliers for    None             Support agreements in place
Subcontractors   support during test as                        with all hardware suppliers
                 required
Support          LSP Services                 None             According to the Environment
                 Infrastructure Support                        plan of the Authority
Witnesses        N/A                          Test Witnesses   According to the Authority’s
                                                               witness strategy

9.7.2. Re-Test Procedure
The use of test cycles means that any test which has not passed will be re-run in a
subsequent cycle, after the agreed remedial work has been completed and the fix has been
verified as capable of allowing the test to pass. A technical impact assessment will be
undertaken on each fix received and appropriate regression testing executed. The timetable
will be amended as required by the number of re-tests needed.
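
A small model of this cycle logic is sketched below, assuming tests and fixes are identified by simple string ids (illustrative only).

```python
# Illustrative model of the re-test cycle: a failed test is queued for the
# next cycle only once its fix has arrived, and the impact assessment of
# each fix may pull additional regression tests into the same cycle.
def plan_next_cycle(failed_tests, fixes_received, regression_for):
    retests = [t for t in failed_tests if t in fixes_received]
    regressions = [r for t in retests for r in regression_for.get(t, [])]
    return sorted(set(retests + regressions))

print(plan_next_cycle(
    failed_tests={"TC-12", "TC-31"},
    fixes_received={"TC-12"},
    regression_for={"TC-12": ["TC-07", "TC-09"]},
))  # ['TC-07', 'TC-09', 'TC-12'] (TC-31 waits for its fix)
```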

9.8. Test Management


Testing is managed through Test Track; as the developers finish a module, it is released to
the Test Department for testing. Once internal testing is complete, the Test Department
notes this in the TT development calls and conducts a test review. The purpose of the test
review is to demonstrate readiness to exit internal testing and commence on-site testing.

9.9. Test Environment Planning and Co-ordination


This is carried out as a joint effort between the Test Manager or his nominated representative
and the Development team. The aim is to ensure consistency between the development, test
and live environments. The live environment will be installed using a software cut from the
test environment once internal testing is judged complete.

9.10. Test Incident Management


All test incidents are to be reported immediately to the Test Manager or his nominated
deputy, for onward transmission to representatives of Connecting for Health.

9.11. Test Automation


None envisaged.


9.12. Test Witnessing
Where test witnessing is required, the dates and schedules for the witnessed activities will
be agreed prior to the commencement of test execution.
Failure of the witnesses to attend on the planned date(s) will not prevent the test plan from
being carried out.
9.12.1. Inspection of Test Results, Scripts and Procedures
Test progress reports will be issued weekly in the agreed format. The test scripts will be
made available prior to the test execution phase, or on request given appropriate notice.

9.13. Reporting
A draft test report will be prepared prior to completion of each Test Stage. This report will be
internally reviewed before formal issue.
The underlying technical test infrastructure and environment will be configured prior to
execution of all tests.
Tests will be executed according to the test schedule included in this test plan and the
associated Test Specification, or during available time after the scheduled test date if a test
could not be completed.
For each test, the recorded results will include as a minimum:
•  the person who executed the test;
•  the date the test was executed;
•  the pass/fail result of the test.
There are two options when a test fails:
•  When the cause of the failure is obvious, such as a configuration error:
   - the test failure is recorded;
   - the cause of the failure is entered as a comment with the test result;
   - the error is fixed immediately;
   - the test is repeated immediately; or
•  When the test fails and the cause is not obvious, a test issue will be raised in the test
   management system.
Once all tests are completed, the development and test teams will review the test results to
recommend whether the software is ready for release. This team can make a positive
recommendation even if not all tests were passed, after carefully evaluating the failures and
their impact.
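A hedged sketch of the minimum result record and the two failure-handling options is given below. The raise_test_issue helper is a hypothetical stand-in for logging an issue in the test management system (Test Track), not a real API call.

```python
# Hedged sketch of the minimum result record and the two failure-handling
# options described above. raise_test_issue is a hypothetical stand-in for
# logging an issue in the test management system, not a real Test Track API.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class TestResult:
    test_id: str
    executed_by: str        # person who executed the test
    executed_on: date       # date the test was executed
    passed: bool            # pass/fail result
    comments: list = field(default_factory=list)

def raise_test_issue(test_id):
    print(f"Test issue raised in test management system for {test_id}")

def handle_failure(result, cause_obvious, cause=""):
    if cause_obvious:
        # Record the failure and its cause; fix and repeat immediately.
        result.comments.append(f"Cause: {cause}; fixed and re-run immediately")
    else:
        raise_test_issue(result.test_id)
```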

9.14. Ensuring Quality, Completeness and Relevance
The test team will review the status of tests prior to the relevant milestone, at least weekly
and more frequently if needed. At this time, the team will:


•  Review progress of tests versus schedule and make adjustments if needed;
•  Review failed tests and the expected resolution date and take further action if
   needed.

10. Test Issue Management
Where a test fails and the defect cannot be immediately repaired, a test issue will be
created, assigning a severity and an owner to enable tracking and reporting. In such a case
the failed test and the subsequent successful test will be recorded, with an explanation of
the defect that was fixed in the comments. No other defect report will be filed.
In addition, the issue will be classified in accordance with the agreed priority levels. This
classification will be included in the text of the test issue and will be reflected in the Test
Report. If the Authority disagrees with the classification, it shall notify the test representatives
by email within one day of receipt of the test report.
10.1. Test Issue Classification
Test Issues (TIs) will initially be assigned a priority classification (P-level) by the tester who
raised the issue. During the daily TI meetings with the Authority, these classifications will be
discussed and amended where necessary, with an appropriate NHS Level (L-level) being
applied at this point. Test Issues will be raised under one of the following priorities:

•  Test Issue Level 1 – prevents a critical element of the Trust Services from
   functioning or being performed, with a direct or indirect impact on patients
   and/or End Users;
•  Test Issue Level 2 – all elements of the Trust Services can still function with a
   workaround; however, functionality or performance is severely impacted;
•  Test Issue Level 3 – all elements of the Trust Services can still function with a
   workaround; however, required functionality or performance is materially
   impacted;
•  Test Issue Level 4 – all elements of the Trust Services can still function; however,
   there is a minor functionality/performance impact; and
•  Test Issue Level 5 – all elements of the Trust Services can still function; however,
   there are minor cosmetic defects with no functional impact and no impact on
   patients or clinical services.
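As a sketch only, the five priority levels above can be expressed as a single shared definition, so that suspension checks and reporting use the same values.

```python
# Sketch only: the five priority levels above expressed as an enum, so that
# suspension checks and reporting can share a single definition.
from enum import IntEnum

class TestIssueLevel(IntEnum):
    LEVEL_1 = 1  # critical element prevented; patient/End User impact
    LEVEL_2 = 2  # workaround exists; severely impacted
    LEVEL_3 = 3  # workaround exists; materially impacted
    LEVEL_4 = 4  # minor functionality/performance impact
    LEVEL_5 = 5  # cosmetic only; no functional or clinical impact
```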

11. Environmental Needs
11.1. Identification of the physical components, the communications, the
system and middleware necessary
Clinical Dashboard is a module of the System C Medway Sigma Business Intelligence
product, running on an application server and a database server. The Application Server
runs Windows 2008 Server and Microsoft SharePoint, among other pieces of software.
The Database Server runs Microsoft SQL Server 2008. Communication is done over a
standard TCP/IP network or the NHS National Network (N3) connection if necessary.


11.2. The mode or state of usage (for example, standalone or dedicated to
the discrete usage of the application alone)
The system is designed to run on two physical or virtual servers that are dedicated to the
Clinical Dashboard product alone.

11.3. Other software or supplies needed to support testing


The tests will be run using Microsoft Excel spreadsheets and therefore all users will need
access to Microsoft Office to append their results to the test sheets.

11.4. Security and access requirements to the test area and equipment
Access to test areas should be restricted to authorised System C, Trust and NHS CFH
resources.

11.5. Test tools and utilities required (including those supplied by NHS
CFH which must be installed and configured prior to Integration
Testing)
None are envisaged apart from Test Track.

11.6. Identify any other testing needs (publications, office space)


Sufficient office space to accommodate the test team will be required for the duration of the
Integration, Ready for Operations and User Acceptance Testing phases.
All NHS documents referenced in the template Test Strategy, Test Plan and Test
Specifications are to be made available to System C at the earliest opportunity.

11.7. Additional software licences to cover test period usage
To be confirmed.
11.8. Support resources and skills required to maintain the test
environments
Trust support resources are to be directed to maintaining the live environment – Integration,
Ready for Operations and User Acceptance Testing will be conducted using test and live data
on the servers that will eventually host the live environment.

11.9. Identify a source for any of the needs that are not currently
available to the test group
To be negotiated.

11.10. Parallel use of above resources


Not envisaged.


11.11. Hardware and software needed to execute the tests


See the Specification Document (Reference required)

11.12. Facilities required, e.g. office space and desks


Enough office space to accommodate the Test Team, Site Test Co-ordinator and Project
Manager(s).

11.13. Information on use of any stubs to mimic end to end process


Not thought to be needed at the time of writing.

11.14. Model to define the relationship between test and live environment
The test environment will be created using a test database on the live database server and
a test site on the live application server. During the test phase, use of the N3 connection will
be restricted as much as possible.
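An illustrative model of this relationship is sketched below: the same physical servers host both environments, separated by database and site. All names are placeholders, not the Trust's actual server, database or site names.

```python
# Illustrative model of the test/live relationship described above: the same
# physical servers host both, separated by database and site. All names are
# placeholders, not the Trust's actual server, database or site names.
ENVIRONMENTS = {
    "live": {
        "db_server": "DASH-SQL01", "database": "ClinicalDashboard",
        "app_server": "DASH-APP01", "site": "/dashboard",
    },
    "test": {
        "db_server": "DASH-SQL01", "database": "ClinicalDashboard_Test",
        "app_server": "DASH-APP01", "site": "/dashboard-test",
    },
}
```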

Figure 1 - Test Environment


12. Responsibilities
This section is included to show who is responsible for which activities and deliverables.
Identify all of the groups responsible for test-related managing, designing, preparing,
executing, witnessing, checking, and resolving:
•  Detail who will prepare test cases, test data and expected results in each Test Stage
•  Detail who will execute the testing
•  Detail who will make the go/no go decisions between Test Stages
In addition, identify the groups responsible for the Test Items identified in Section 4 - Test
Items, and the environmental needs identified in Section 11 - Environmental Needs.
These groups may include the developers, testers, operations/production support staff,
technical support, user representatives, data administration, and quality support staff.
•  Identify the point of contact for particular applications, platforms, systems and
   business functions (especially for interface testing).
•  Identify the point of contact for inter-dependent projects.
•  Identify who is responsible for training.
•  Identify who is responsible for risk management.

13. Staffing and Training Needs
Specify test staffing needs by skill level. Identify training options for providing the necessary
skills.
Training may relate to the system under test, the business, the test techniques, or the tools
being used.

14. Schedule
Include test milestones identified in the Project Schedule as well as key testing milestones
and tasks included in Section 9 - Testing Tasks. However, include a warning that any dates
included are only to aid comprehension and that current dates should always be
ascertained from the current version of the Project Plan.
•  Detail when test cases, test data and expected results will be prepared
•  Detail when each Test Stage will be executed
•  Include test milestones, scheduled dates for test phase sign-off or for testing reviews
•  Identify additional testing activities
•  Provide a reference to the Plan/Gantt Chart appendix
•  Milestones for delivery of the software to the Test Team
•  Availability of the environment
•  Test deliverables
•  Dates for when NHS CFH can witness tests
•  Document how milestone slippage is to be handled.

15. Risks and Contingencies


This section relates to what could go wrong and what will be done to minimise the adverse
impact if a risk does occur.

Assumptions are not explicit within the IEEE 829 standard; however, it is necessary to detail
any in this section. Identify the high-risk assumptions of the Test Plan. Specify the impact
and contingency plans for each.
Possible risks are:
•  Changing Authority requirements
•  Changing end date
•  Quality problems
•  New technology
•  Lack of skilled personnel
•  Lack of good quality test environments.
It is necessary to focus on the risks to the testing outlined within this Test Plan and not on
the project itself. Also note that the risks will be managed via the internal (Supplier) and
external (NHS CFH) project risk registers.

Remember that for each risk identified it is necessary to provide a contingency.
