Testing Checklist V 1.0
Document Information
Project Name:
Project Manager:
Project ID:

Ver. No.: 1.0
Ver. Date: 14-Dec-09
Description:
Prepared By: Anantara
Unit Test Plan & Test Case Review Checklist
Columns: # | Description | Status

1. Is the scope of the Unit Testing specified?
2. Is a mechanism identified to map requirements and specifications to unit test cases?
3. Are standards listed for the unit testing activity?
4. Are tools required to support unit testing activities planned?
5. Does the test plan completely test the functionality of the unit as defined in the design specification and in the system requirements specification?
6. Are all variables (including global variables) input to the unit included in the features to be tested in the test plan?
7. Do all variables output from the unit result from the features to be tested?
8. Is every procedure in the design exercised by the test?
9. Are tests included for critical procedures to demonstrate that performance meets stated requirements?
10. For units that interface with other units, are invalid features tested to check for defensive coding?
11. Are the test features combined into test cases wherever practical to reduce the number of test cases required?
12. Is error handling logic tested?
13. Does each test case have a corresponding expected result?
14. Are generic test cases used wherever possible to reduce the number of test cases?
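Illustration for items 2, 12 and 13 above: a minimal sketch in Python using the standard unittest module. The unit under test (parse_quantity) and the REQ-UT-* requirement IDs are hypothetical placeholders; the point is that each test case is traceable to a requirement, states its expected result, and exercises the unit's error-handling logic.

# Minimal sketch: unit test cases tagged with hypothetical requirement IDs,
# each with an explicit expected result, including the error-handling path.
import unittest


def parse_quantity(value):
    """Hypothetical unit under test: convert a string to a positive integer."""
    quantity = int(value)              # raises ValueError for non-numeric input
    if quantity <= 0:
        raise ValueError("quantity must be positive")
    return quantity


class ParseQuantityTests(unittest.TestCase):
    def test_valid_input_returns_integer(self):
        # Covers hypothetical requirement REQ-UT-01; expected result: 5
        self.assertEqual(parse_quantity("5"), 5)

    def test_non_numeric_input_raises_error(self):
        # Covers hypothetical requirement REQ-UT-02; expected result: ValueError
        with self.assertRaises(ValueError):
            parse_quantity("abc")

    def test_zero_is_rejected(self):
        # Defensive-coding / boundary check; expected result: ValueError
        with self.assertRaises(ValueError):
            parse_quantity("0")


if __name__ == "__main__":
    unittest.main()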
Checklist for Integration Testing
Columns: # | Check Point | Status (Yes / No / NA) | Remarks

1. Are the Integration Test Plan and Test Cases under configuration control?
2. If the test is not performed as per plan, are the variations from the approved plan documented (change in integration sequence, test data, test environment, etc.)?
3. Have all the test cases been executed?
4. Has the Test Log been updated to reflect the test case status (with appropriate comments provided where needed)?
5. Are testing limitations and issues documented and shared with the customers and impacted team(s)?
6. Are the new / modified test cases documented for the functional changes?
7. Are the new test cases mapped in the Requirements Traceability Matrix (RTM) document?
8. Is the defect report updated and baselined?
9. Is the defect report shared with the concerned members / teams?
10. Are the defects regressed and closed?
11. Has the defect data analysis been conducted?
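Illustration for item 7 above: a minimal sketch in Python of keeping the Requirements Traceability Matrix as a simple mapping and flagging requirements with no integration test case mapped to them. The REQ-* and ITC-* identifiers are hypothetical placeholders; real projects typically maintain the RTM in a spreadsheet or test-management tool.

# Minimal sketch: RTM as a requirement -> integration test case mapping,
# with a helper that reports requirements not yet covered by any test case.
rtm = {
    "REQ-101": ["ITC-001", "ITC-002"],   # hypothetical requirement and test case IDs
    "REQ-102": ["ITC-003"],
    "REQ-103": [],                        # not yet mapped to a test case
}


def uncovered_requirements(matrix):
    """Return requirement IDs that have no integration test case mapped."""
    return [req for req, cases in matrix.items() if not cases]


if __name__ == "__main__":
    missing = uncovered_requirements(rtm)
    print("Requirements without integration test cases:", missing or "none")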
System Testing Checklist
Columns: # | Check Point | Status (Yes / No / NA) | Remarks

1. Are the functional requirements available for generating test cases?
2. Are the design specifications / use cases available for generating test cases?
3. Have the testers understood the business requirements to test the application effectively?
4. Have the test cases been reviewed against the Traceability Matrix to check the test coverage?
5. Have you got sample data from the Customer to create test data?
6. Is the test environment set up as per the test plan?
7. Are the test cases taking care of database manipulations?
8. Are the test case records baselined as per configuration management?
9. Are the test cases updated based on the change requests?
10. Is the defect tracking system in place for effective regression testing?
11. Are the test cases / scenarios identified for test automation?
12. Are the test records baselined as per configuration management?
13. Are the test scenarios identified and approved by the Business Analysts and the Customer?
14. Is the test data created for checking the module-to-module integration?
15. Is the test data created for checking the end-to-end business flow?
16. Are test scenarios taking care of uncertain and invalid conditions?
17. Are there test cases generated for checking concurrent user access?
18. Are there test cases generated to check user privileges with respect to the application?
19. Is there a test procedure available to verify backup and restoration procedures?
20. Is there a test procedure available to verify installation procedures independently by a tester?
21. Apart from the test cases / scenarios, has someone tested the application without following any procedures to check the behaviour of the application?
22. Is the application tested by simulating network failure?
23. Is the application checked for its compliance with the targeted Operating Systems?
24. Have all the test results been documented and defects tracked to closure?
25. Are constraints and issues faced during testing documented and shared with the stakeholders?
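Illustration for item 17 above (concurrent user access): a minimal sketch in Python that simulates many users hitting the same operation at once and checks that no errors occur and shared data stays consistent. The place_order function and the inventory record are hypothetical in-process stand-ins; an actual system test would drive the deployed application, e.g. over HTTP or through a load-testing tool.

# Minimal sketch: simulate concurrent users against a hypothetical operation
# and assert that no unhandled errors occur and shared state stays consistent.
import threading

_lock = threading.Lock()
_stock = {"SKU-1": 5}        # hypothetical shared inventory record
errors = []


def place_order(sku):
    """Hypothetical operation exercised by many simulated users at once."""
    with _lock:               # guard against overselling under concurrency
        if _stock[sku] > 0:
            _stock[sku] -= 1
            return True
        return False


def simulated_user(sku):
    try:
        place_order(sku)
    except Exception as exc:  # any unhandled error fails the scenario
        errors.append(exc)


if __name__ == "__main__":
    threads = [threading.Thread(target=simulated_user, args=("SKU-1",)) for _ in range(20)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    assert not errors, f"errors under concurrent access: {errors}"
    assert _stock["SKU-1"] == 0, "stock must not go negative or be oversold"
    print("Concurrent access scenario passed")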
Acceptance Testing Checklist
Columns: # | Check Point | Status (Yes / No / NA) | Remarks

Test Preparation
1. Are the objectives and scope of the Acceptance Testing specified?
2. Are all defects from earlier phases fixed and re-tested?
3. Is the Integrated System built with the latest, tested units?
4. Is the required input / test data for testing available?
5. Is the test environment set up in line with the Contract / as agreed with the Customer?
6. Are test cases identified to test all aspects of the Acceptance Criteria?
7. Are negative test cases identified?
8. Is a System 'Installation Check' performed?
9. Is the System tested for exceptional scenarios and rare conditions?
10. Have you defined acceptance criteria (e.g. performance, portability, throughput, etc.) on which the completion of the acceptance test will be judged?
11. Has the method of handling problems / defects detected during acceptance testing and their disposition been agreed between you and the customer?
12. Are acceptance test cases for new / changed requirements developed?
13. Is the RTM updated to reflect the latest acceptance test cases?

Test Execution and Evaluation
14. Have all the test results been documented and defects tracked to closure?
15. Have all steps of the test run been documented? (optional)
16. Is the customer-accepted version of the Solution archived / baselined for future reference?
17. Are constraints and issues faced during testing documented and shared with the stakeholders?
18. Has the customer sign-off been obtained?
19. Has the customer feedback been administered?
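Illustration for item 10 above: a minimal sketch in Python of recording measurable acceptance criteria (e.g. performance and throughput) alongside observed results, so the pass/fail decision behind customer sign-off rests on explicit numbers. The criteria names, limits and measured values are hypothetical placeholders.

# Minimal sketch: acceptance criteria with limits and measured values,
# evaluated to an explicit PASS/FAIL per criterion (hypothetical figures).
acceptance_criteria = {
    "avg_response_time_ms": {"limit": 2000, "measured": 850},
    "throughput_tps": {"limit": 100, "measured": 140, "higher_is_better": True},
}


def evaluate(criteria):
    """Return a pass/fail flag for each criterion based on its limit."""
    results = {}
    for name, c in criteria.items():
        if c.get("higher_is_better"):
            results[name] = c["measured"] >= c["limit"]
        else:
            results[name] = c["measured"] <= c["limit"]
    return results


if __name__ == "__main__":
    outcome = evaluate(acceptance_criteria)
    for name, passed in outcome.items():
        print(f"{name}: {'PASS' if passed else 'FAIL'}")
    print("All acceptance criteria met:", all(outcome.values()))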
Design Walkthrough
1. Does the algorithm accomplish the desired function?
2. Is the algorithm logically correct?
3. Is the interface consistent with the architectural design?
4. Is the logical complexity reasonable?
5. Has error handling been specified?
6. Are local data structures properly defined?
7. Are structured programming constructs used throughout?
8. Is the design detail amenable to the implementation language?
9. Has maintainability been considered?
10. Are all conditions and processing defined for each decision point?
11. Do all defined and referenced calling sequence parameters agree?
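Illustration for items 5, 6, 7 and 11 above: a minimal sketch in Python of the level of detail a walkthrough examines, with error handling specified explicitly, a properly defined local data structure, only structured constructs, and a caller whose parameters agree with the defined interface. The compute_discount function and its values are hypothetical.

# Minimal sketch: specified error handling (item 5), a defined local data
# structure (item 6), structured constructs only (item 7), and a caller whose
# parameters agree with the interface (item 11). All names are hypothetical.
def compute_discount(order_total: float, customer_tier: str) -> float:
    """Return the discount amount; raises ValueError for an unknown tier."""
    rates = {"standard": 0.00, "silver": 0.05, "gold": 0.10}
    if customer_tier not in rates:
        raise ValueError(f"unknown customer tier: {customer_tier}")
    return order_total * rates[customer_tier]


if __name__ == "__main__":
    print(compute_discount(200.0, "gold"))        # expected: 20.0
    try:
        compute_discount(200.0, "platinum")       # error path is defined, not implied
    except ValueError as exc:
        print("handled:", exc)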