Test Strategy With Examples


Test Strategy Document

Version 3.0

Table of Contents
 1. Document Information and Approvals
 1.1 Approval/Sign-off
 1.2 Document History
 1.3 Related Documents
 1.4 Open Items
 2. Overview
 3. Types of QA Tests
 3.1 Unit Testing
 3.2 Functional Testing
 3.3 System Testing
 3.4 End-to-End Testing
 3.5 Regression Testing
 3.6 Performance Testing
 3.7 Data Migration Testing
 3.8 Cross-Browser Testing
 3.9 User Acceptance Testing
 4. Testing Tools & Processes
 4.1 Test Management
 4.2 Bug Triage and Defect Tracking
 4.3 Defect Severity Levels
 4.4 Release Management
 5. Test Environments
 6. Test Data Management
 6.1 Overview
 6.2 Dependencies
 7. Exit Criteria
 8. QA Deliverables
 9. Toolset
 10. Additional Dependencies

1. Document Information and Approvals

1.1 Approval/Sign-off

| Name | Role | Date | Signature |
|------|------|------|-----------|
| Director of Web App Development | Approver | TBD | TBD |

1.2 Document History

| Date      | Author(s) | Update Summary                        | Version |
|-----------|-----------|---------------------------------------|---------|
| 5/24/2016 | TBD       | Initial draft for review              | 1.0     |
| 6/7/2016  | TBD       | Feedback incorporation                | 2.0     |
| 6/16/2016 | TBD       | Feedback incorporated; final approval | 3.0     |

2. Overview
This document outlines the test strategy for ensuring quality assurance across different
phases of the project. It serves as a guideline for test planning, execution, and reporting
while addressing all functional and non-functional requirements.

3. Types of QA Tests

3.1 Unit Testing


Unit tests will be implemented at the code level to validate individual components.

3.2 Functional Testing


Testing to validate the application functions as intended, aligned with functional
requirements.

3.3 System Testing


System-wide validation, ensuring all integrated components operate as expected.

3.4 End-to-End Testing


Focuses on workflows and overall application performance, including external systems such
as SAP and WMS.

3.5 Regression Testing


Includes both manual and automated test cases for system stability.

3.6 Performance Testing


Covers baseline, peak, stress, and endurance tests to identify system limits.

3.7 Data Migration Testing


Ensures accuracy and reliability of data migration processes.

3.8 Cross-Browser Testing


Validation across a defined list of supported browsers to ensure compatibility.

3.9 User Acceptance Testing


Executed in collaboration with stakeholders to confirm the system meets business
expectations.

4. Testing Tools & Processes

4.1 Test Management


Test cases and test execution will be managed in a test management tool such as HP QC/ALM or Jira.

4.2 Bug Triage and Defect Tracking

Defects will be tracked in Jira, with weekly triage meetings to prioritize fixes.

4.3 Defect Severity Levels


| Severity | Impact |
|----------|----------------------------------------------|
| 1 - Critical | System crashes, no workaround available. |
| 2 - High | Significant impairment, workaround unavailable. |
| 3 - Medium | Minor issues, workaround available. |
| 4 - Low | Cosmetic or minor inconvenience. |

5. Test Environments
Defines environments for testing, such as DEV, QA, UAT, and Performance environments.

6. Test Data Management


Test data will be refreshed periodically. Dependencies with external systems will be
managed collaboratively.

7. Exit Criteria
Testing is complete when no critical or high-severity defects remain and the counts of medium- and low-severity defects are within the agreed thresholds.
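
As an illustration, the exit-criteria gate above can be expressed as a simple check. This is a sketch; the class name and the medium/low threshold values passed in `main` are hypothetical, not taken from this document.

```java
// Sketch of an exit-criteria check: zero critical/high defects, and
// medium/low defect counts at or below agreed thresholds.
public class ExitCriteria {
    public static boolean testingComplete(int critical, int high,
                                          int medium, int low,
                                          int mediumMax, int lowMax) {
        return critical == 0 && high == 0
            && medium <= mediumMax && low <= lowMax;
    }

    public static void main(String[] args) {
        // Example thresholds (hypothetical): at most 5 medium, 10 low.
        System.out.println(testingComplete(0, 0, 3, 7, 5, 10)); // prints true
        System.out.println(testingComplete(0, 1, 0, 0, 5, 10)); // prints false
    }
}
```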

8. QA Deliverables
Includes test plans, defect reports, and execution summaries.

9. Toolset
| Type | Tool |
|------------------|-----------------|
| Unit Testing | JUnit |
| Defect Management| Jira |
| Test Automation | Selenium |
| Performance Testing| LoadRunner, JMeter |

Sample Defect Severity Examples

Examples of defect severity levels and their potential impacts are provided in the attached Excel file.

File: Defect_Severity_Examples.xlsx

Examples and Details

Unit Testing Example


Example: Testing a function in a payment processing module that calculates the total price
including discounts and taxes. Inputs like price, discount rate, and tax percentage are
validated individually.
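
A minimal sketch of such a unit test follows. The `PriceCalculator` class is a hypothetical stand-in for the payment-module function; plain assertions in `main` stand in for what would be JUnit `@Test` methods in practice.

```java
// Hypothetical payment-module function under test: total price with a
// percentage discount applied before a percentage tax.
public class PriceCalculator {
    // Returns the total rounded to cents.
    public static double total(double price, double discountRate, double taxRate) {
        double discounted = price * (1.0 - discountRate);
        double withTax = discounted * (1.0 + taxRate);
        return Math.round(withTax * 100.0) / 100.0; // round to cents
    }

    public static void main(String[] args) {
        // Each input's effect is validated individually, then combined.
        System.out.println(total(100.0, 0.0, 0.0));   // prints 100.0 (baseline)
        System.out.println(total(100.0, 0.10, 0.0));  // prints 90.0 (discount only)
        System.out.println(total(100.0, 0.0, 0.08));  // prints 108.0 (tax only)
        System.out.println(total(100.0, 0.10, 0.08)); // prints 97.2 (combined)
    }
}
```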

Functional Testing Example
Example: Validating the 'Add to Cart' functionality in an e-commerce application. Ensure
that products are added correctly, quantities can be adjusted, and price updates
dynamically.
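
The cart behavior under test can be sketched as follows. The `Cart` class and its method names are hypothetical, used only to make the checks concrete; a real functional test would drive the application UI or API.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical shopping cart illustrating the functional checks: items are
// added correctly, quantities can be adjusted, and the total updates.
public class Cart {
    private final Map<String, Integer> quantities = new LinkedHashMap<>();
    private final Map<String, Double> prices = new LinkedHashMap<>();

    public void add(String sku, double unitPrice, int qty) {
        prices.put(sku, unitPrice);
        quantities.merge(sku, qty, Integer::sum); // re-adding increments qty
    }

    public void setQuantity(String sku, int qty) {
        quantities.put(sku, qty);
    }

    public double total() {
        double sum = 0.0;
        for (Map.Entry<String, Integer> e : quantities.entrySet()) {
            sum += prices.get(e.getKey()) * e.getValue();
        }
        return Math.round(sum * 100.0) / 100.0; // round to cents
    }

    public static void main(String[] args) {
        Cart cart = new Cart();
        cart.add("SKU-1", 10.0, 2);       // product added correctly
        cart.add("SKU-1", 10.0, 1);       // quantity increments to 3
        cart.setQuantity("SKU-1", 2);     // quantity can be adjusted
        System.out.println(cart.total()); // prints 20.0 (price updated)
    }
}
```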

System Testing Example


Example: Testing an order placement process that integrates the front-end, payment
gateway, inventory management system, and notification service. Validate all components
working together.
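
The integration described above can be sketched with in-memory fakes. The component names here (`PaymentGateway`, `Inventory`, `Notifier`, `OrderService`) are hypothetical stand-ins for the real front-end, payment gateway, inventory system, and notification service; a system test would exercise the actual deployed components.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of an order-placement flow wired across components, with
// in-memory fakes standing in for the integrated systems.
public class OrderSystemTest {
    interface PaymentGateway { boolean charge(String user, double amount); }

    static class Inventory {
        int stock;
        Inventory(int stock) { this.stock = stock; }
        boolean reserve(int qty) {
            if (qty > stock) return false;
            stock -= qty;
            return true;
        }
    }

    static class Notifier {
        final List<String> sent = new ArrayList<>();
        void send(String user, String message) { sent.add(user + ": " + message); }
    }

    static class OrderService {
        final PaymentGateway payments;
        final Inventory inventory;
        final Notifier notifier;
        OrderService(PaymentGateway p, Inventory i, Notifier n) {
            payments = p; inventory = i; notifier = n;
        }
        boolean placeOrder(String user, int qty, double amount) {
            if (!inventory.reserve(qty)) return false;        // inventory system
            if (!payments.charge(user, amount)) return false; // payment gateway
            notifier.send(user, "order confirmed");           // notification service
            return true;
        }
    }

    public static void main(String[] args) {
        Inventory inv = new Inventory(10);
        Notifier mail = new Notifier();
        OrderService svc = new OrderService((user, amount) -> true, inv, mail);
        boolean ok = svc.placeOrder("alice", 2, 20.0);
        // prints: true, stock=8, notifications=1
        System.out.println(ok + ", stock=" + inv.stock + ", notifications=" + mail.sent.size());
    }
}
```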

Regression Testing Details


Timelines:
- Sprint 6 QA Deploy: Re-test functionality from Sprints 1–5.
- Post Development Complete: Regression testing for all previously tested functionality.
- Pre-UAT: Full regression to ensure the system is stable before user acceptance testing.

Performance Testing Example


Example: Running a peak load test on a website by simulating 10,000 concurrent users and
measuring the response time of critical transactions like checkout and user login.

Testing Time:
- Baseline: 2 hours at 50% peak load.
- Peak Load: 2 hours at 100% peak load.
- Stress Testing: 4 hours increasing load to 200%.
- Endurance Testing: 8 hours at 75% peak load to identify long-term performance issues.
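
In practice JMeter or LoadRunner would drive the load against the live system, but the measurement idea can be sketched in plain Java. The user count here is scaled down and the `checkout()` stub is hypothetical; a real test would call actual endpoints.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch of a concurrent-load measurement: run many simultaneous
// transactions and record the latency of each one.
public class LoadTestSketch {
    // Stand-in for a critical transaction such as checkout or login.
    static void checkout() {
        long sum = 0;
        for (int i = 0; i < 10_000; i++) sum += i; // simulated work
    }

    // Runs `users` concurrent transactions; returns per-call latencies in ns.
    static List<Long> run(int users) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(Math.min(users, 32));
        List<Future<Long>> futures = new ArrayList<>();
        for (int i = 0; i < users; i++) {
            futures.add(pool.submit(() -> {
                long start = System.nanoTime();
                checkout();
                return System.nanoTime() - start;
            }));
        }
        List<Long> latencies = new ArrayList<>();
        for (Future<Long> f : futures) latencies.add(f.get());
        pool.shutdown();
        return latencies;
    }

    public static void main(String[] args) throws Exception {
        List<Long> latencies = run(100); // scaled down from 10,000 users
        long max = 0;
        for (long v : latencies) max = Math.max(max, v);
        System.out.println("calls=" + latencies.size() + " maxLatencyNs=" + max);
    }
}
```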

Cross-Browser Testing Example


Example: Validating the compatibility of the web application across Chrome, Firefox, Safari,
and Edge. Focus on layout consistency, functionality, and performance on each browser.

Defect Severity Examples


Below are examples of defect severity levels to help identify and classify issues during
testing:
- **Critical:** A crash occurs on login, preventing access for all users.
- **High:** Payment gateway fails, blocking transactions.
- **Medium:** A label is misaligned but still functional.
- **Low:** A typo in the help documentation.
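
The severity examples above can be mirrored as a small classification helper. This is a simplification for illustration; the boolean criteria names are hypothetical, not a formal taxonomy from this document.

```java
// Sketch of defect-severity classification mirroring the severity table:
// Critical = crash/no workaround, High = significant impairment/no
// workaround, Medium = workaround available, Low = cosmetic.
public class SeverityClassifier {
    enum Severity { CRITICAL, HIGH, MEDIUM, LOW }

    static Severity classify(boolean systemCrash, boolean significantImpairment,
                             boolean workaroundAvailable) {
        if (systemCrash && !workaroundAvailable) return Severity.CRITICAL;
        if (significantImpairment && !workaroundAvailable) return Severity.HIGH;
        if (workaroundAvailable) return Severity.MEDIUM;
        return Severity.LOW;
    }

    public static void main(String[] args) {
        // Crash on login, no workaround:
        System.out.println(classify(true, true, false));   // prints CRITICAL
        // Payment gateway fails, blocking transactions:
        System.out.println(classify(false, true, false));  // prints HIGH
        // Misaligned label, still functional (workaround: ignore it):
        System.out.println(classify(false, false, true));  // prints MEDIUM
        // Typo in help documentation:
        System.out.println(classify(false, false, false)); // prints LOW
    }
}
```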

Test Data Management Example


Example: Testing an e-commerce application using the following data:
- Product Catalog: 500 SKUs including discounts and tax variations.
- User Accounts: 200 user profiles with different roles (admin, buyer, guest).
- Orders: 100 completed orders for testing order history and returns.

Testing Time: Test data refresh and preparation require approximately 2 business days per cycle.
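
Seeding the data set described above can be sketched as follows. The `TestDataFactory` name and the SKU/user/order record shapes are hypothetical; only the counts and roles come from the example.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of generating the test data set from the example: 500 SKUs,
// 200 user profiles across three roles, and 100 completed orders.
public class TestDataFactory {
    static List<String> skus(int count) {
        List<String> out = new ArrayList<>();
        for (int i = 1; i <= count; i++) out.add(String.format("SKU-%03d", i));
        return out;
    }

    // Rotates through the three roles from the example.
    static List<String> users(int count) {
        String[] roles = {"admin", "buyer", "guest"};
        List<String> out = new ArrayList<>();
        for (int i = 0; i < count; i++) out.add("user" + i + ":" + roles[i % roles.length]);
        return out;
    }

    static List<String> completedOrders(int count) {
        List<String> out = new ArrayList<>();
        for (int i = 1; i <= count; i++) out.add("ORD-" + i + ":COMPLETED");
        return out;
    }

    public static void main(String[] args) {
        // prints: 500 SKUs, 200 users, 100 orders
        System.out.println(skus(500).size() + " SKUs, "
            + users(200).size() + " users, "
            + completedOrders(100).size() + " orders");
    }
}
```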
