Guiding standards
Standard | Description |
Shared Responsibility | Everyone in the team is responsible for quality. |
Data Management | Production data must be analyzed before being used for testing. |
Test Management | Test cases, code, documents and data must be treated with the same importance as the production system. |
Test Automation | Attempt to automate all types of testing (Unit, Functional, Regression, Performance, Security) as far as feasible. |
Requirements strategy
1. Always implement the highest-priority work items first (each new work item is prioritized by the Product Owner and added to the stack).
2. Work items may be reprioritized or removed at any time.
3. A module that is specified in greater detail should be given higher priority than one specified in lesser detail.
Quality and Test Objectives
Feature | Description | Measure and Target | Priority |
Accuracy | Features and functions work as proposed (i.e. as per requirements). | 100% completion of agreed features, with open defects limited to: Severity 1 = 0; Severity 2 = 0; Severity 3 < 5; Severity 4 < 10. | Must Have |
Integrity | Ability to prevent unauthorized access, prevent information loss, protect against virus infection and protect the privacy of data entered. | All access is via HTTPS (over a secured connection); user passwords and session tokens are encrypted. | Must Have |
Maintainability | Ease of adding features, correcting defects or releasing changes to the system. | Code duplication < 5%; code complexity < 8; unit test coverage > 80%; method length < 20 lines. | Must Have |
Availability | Percentage of planned up-time that the system is required to operate. | System is available for 99.99% of the time, measured through system logs. | Should Have |
Interoperability | Ease with which the system can exchange information with other systems. | User interface renders and functions properly on the following browser versions (and later): IE 9.0, Firefox 18.0, Safari 5.0, Chrome 11.0. | Must Have |
Performance | Responsiveness of the system under a given load and the ability to scale to meet growing demand. | Apdex score > 0.9; response time < 200 ms; throughput > 100 per minute (an Apdex calculation sketch follows this table). | Should Have |
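For the Apdex target above, a minimal calculation sketch is shown below. It assumes the standard Apdex formula, (satisfied + tolerating/2) / total samples, with a hypothetical target threshold of 200 ms; the threshold and sample values are illustrative, not figures from this plan.

```java
import java.util.List;

/** Minimal Apdex calculator: (satisfied + tolerating / 2) / total samples. */
public class ApdexCalculator {

    private final long thresholdMillis; // target threshold T (assumed 200 ms here)

    public ApdexCalculator(long thresholdMillis) {
        this.thresholdMillis = thresholdMillis;
    }

    public double score(List<Long> responseTimesMillis) {
        long satisfied = 0;
        long tolerating = 0;
        for (long t : responseTimesMillis) {
            if (t <= thresholdMillis) {
                satisfied++;                      // within target: satisfied
            } else if (t <= 4 * thresholdMillis) {
                tolerating++;                     // up to 4T: tolerating
            }
            // anything slower counts as frustrated and adds nothing to the score
        }
        return (satisfied + tolerating / 2.0) / responseTimesMillis.size();
    }

    public static void main(String[] args) {
        ApdexCalculator apdex = new ApdexCalculator(200);
        double score = apdex.score(List.of(120L, 180L, 250L, 900L, 150L));
        System.out.printf("Apdex = %.2f (target > 0.9)%n", score);
    }
}
```

In practice the score would normally come from a monitoring or load-testing tool rather than hand-rolled code; the sketch only makes the target measurable and unambiguous.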
Test Scope (both business processes and the technical solution)
In Scope
Identify what is included in testing for this particular project. Consider what is new and what has been changed or corrected for this product release.
- (Automated) Unit testing
- Code analysis (static and dynamic)
- Integration testing
- (Automated) Feature and functional testing
- Data conversion testing
- System testing
- (Automated) Security testing
- Environment testing
- (Automated) Performance and Availability testing
- (Automated) Regression testing
- Acceptance testing
Out of Scope
Identify what is excluded from testing for this particular project.
Testing Types
Testing type | Definition | Test tool examples (remove tools that will not be used) |
Unit testing | Testing that verifies the implementation of software elements in isolation | xUnit test tools (NUnit, JUnit), mocking tools |
Code analysis (static and dynamic) | Walkthrough and code analysis | 1. Static analysis tools: Java – Checkstyle, FindBugs, Jtest, AgileJ Structure Views; .NET – FxCop, StyleCop, CodeRush. 2. Dynamic analysis tools: Avalanche, DynInst, BoundsChecker. |
Integration testing | Testing in which software elements, hardware elements, or both are combined and tested until the entire system has been integrated | VectorCAST C/C++ |
Functional and Feature testing | Testing an integrated hardware and software system to verify that the system meets required functionality: 100% requirements coverage; 100% coverage of the main flows; 100% of the highest risks covered; operational scenarios tested; operational manuals tested; all failures reported. (A sample automated functional test follows this table.) | UFT, Selenium WebDriver, Watir, Canoo WebTest, SoapUI Pro |
System testing | Testing the whole system with end-to-end flows | Selenium, QTP, TestComplete |
Security testing | Verify secure access, secure transmission and password/session security | BFBTester, CROSS, Flawfinder, Wireshark, WebScarab, Wapiti, x5s, Exploit Me, WebSecurify, N-Stalker |
Environment testing | Testing on each supported platform/browser | GASP, QEMU, KVM, Xen, PsTools |
Performance and Availability testing | Load, scalability and endurance tests | LoadRunner, JMeter, AgileLoad, WAPT, LoadUI |
Data conversion testing | Performed to verify the correctness of automated or manual conversions and/or loads of data in preparation for implementing the new system | DTM, QuerySurge, PICT, Slacker |
Regression testing | Testing all the prior features and re-testing previously closed bugs | QTP, Selenium WebDriver |
Acceptance testing | Testing based on acceptance criteria to enable the customer to determine whether or not to accept the system | Selenium, Watir, iMacros, Agile Acceptance Test Tool |
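As a concrete illustration of the automated functional/acceptance testing referenced in the table, here is a minimal Selenium WebDriver sketch in Java. The URL and element locator are placeholders invented for the example; a real test would use the application's own pages and ids, and would normally live in a JUnit test class run as part of the regression suite.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

/** Minimal functional check: load the login page and verify a visible element. */
public class LoginPageSmokeTest {

    public static void main(String[] args) {
        // Assumes the Selenium library is on the classpath and chromedriver is on the PATH.
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://example.com/login");   // placeholder URL, not from this plan

            // Placeholder locator: a real test would use the application's own element ids.
            boolean loginButtonShown = driver.findElement(By.id("login")).isDisplayed();

            if (!loginButtonShown) {
                throw new AssertionError("Login button is not displayed");
            }
            System.out.println("Login page smoke test passed");
        } finally {
            driver.quit();   // always release the browser session
        }
    }
}
```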
Test Design strategy
1. Specification based / Black box techniques (Equivalence classes, Boundary value analysis, Decision tables, State Transitions and Use case testing); a boundary value example follows this list.
2. Structure based / white box techniques (Statement coverage, Decision coverage, Condition coverage and Multi condition coverage)
3. Experience based techniques (Error guessing and Exploratory testing)
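As a small illustration of boundary value analysis applied at the unit level, the sketch below tests a hypothetical `isValidQuantity` rule that accepts quantities from 1 to 100. The class, method and limits are assumptions made for the example and do not come from this plan.

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertTrue;

/** Boundary value analysis around a hypothetical valid range of 1..100. */
class QuantityValidatorTest {

    /** Example rule under test: quantity must be between 1 and 100 inclusive. */
    static boolean isValidQuantity(int quantity) {
        return quantity >= 1 && quantity <= 100;
    }

    @Test
    void valuesJustInsideTheBoundariesAreAccepted() {
        assertTrue(isValidQuantity(1));     // lower boundary
        assertTrue(isValidQuantity(2));     // just above lower boundary
        assertTrue(isValidQuantity(99));    // just below upper boundary
        assertTrue(isValidQuantity(100));   // upper boundary
    }

    @Test
    void valuesJustOutsideTheBoundariesAreRejected() {
        assertFalse(isValidQuantity(0));    // just below lower boundary
        assertFalse(isValidQuantity(101));  // just above upper boundary
    }
}
```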
Test Environments strategy
Name | Description | Data Setup | Usage |
Development | This environment is local and specific to each developer/tester machine. It is based on the version/branch of source code being developed. Integration points are typically impersonated. | Data and configuration are populated through setup scripts (a sample setup script follows this table). | Unit, functional and acceptance tests; test tools e.g. xUnit test tools (NUnit, JUnit) and mocking tools; source code management for version control |
Integration | This environment supports continuous integration of code changes and execution of unit, functional and acceptance tests. Additionally, static code analysis is completed in this environment. | Data and configuration are populated through setup scripts. | Unit, functional and acceptance tests; static code analysis; continuous integration tools e.g. CruiseControl |
Staging | This environment supports exploratory testing. | Populated with post-analysis, obfuscated production data. | Exploratory testing |
Production | Live environment. | New instances will contain standard project reference data; existing instances will have current data migrated into the environment. | Production verification testing |
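The setup scripts referred to for the Development and Integration environments could be as simple as the following sketch, which seeds a table of reference data before tests run. The in-memory database, table and rows are assumptions chosen to keep the example self-contained; a real script would target the project's actual schema.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Statement;

/** Illustrative setup script that seeds reference data into a local test database. */
public class TestDataSetup {

    public static void main(String[] args) throws SQLException {
        // Placeholder connection string: an in-memory H2 database (driver assumed on the classpath).
        try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:testdb")) {
            try (Statement ddl = conn.createStatement()) {
                ddl.execute("CREATE TABLE IF NOT EXISTS country (code VARCHAR(2) PRIMARY KEY, name VARCHAR(64))");
            }
            try (PreparedStatement insert =
                         conn.prepareStatement("MERGE INTO country (code, name) VALUES (?, ?)")) {
                String[][] countries = {{"US", "United States"}, {"GB", "United Kingdom"}, {"IN", "India"}};
                for (String[] c : countries) {
                    insert.setString(1, c[0]);
                    insert.setString(2, c[1]);
                    insert.executeUpdate();      // idempotent upsert of each reference row
                }
            }
            System.out.println("Reference data loaded");
        }
    }
}
```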
Test Execution strategy
We will keep in mind the following points:
- Agile testing is iterative.
- Testers cannot rely on having a complete specification and should be flexible.
- Testers need to be independent and independently empowered in order to be effective.
- Testers should be generalizing specialists, with a wide range of skills and one or more specialties.
- Testers should be embedded in the agile team, be prepared to work closely with developers and contribute in any way they can.
- Focus on value-added activities.
- Focus on what to test, not how to test.
- Keep feedback cycles short.
- Focus on sufficient, straightforward test scenarios.
- Focus on exploratory testing.
- Specify the meaning of "Done", i.e. when activities/tasks performed during system development can be considered complete.
- Define when to continue or stop testing before delivering the system to the customer. Specify which evaluation criteria are to be used (e.g. time, coverage and quality) and how they will be used.
Additionally, use this section to describe the steps for executing tests in preparation for deployment/release/upgrade of the software. Key execution steps could include:
1. Steps to build the system
2. Steps to execute automated tests
3. Steps to populate environment with reference data
4. Steps to generate test report/code metrics
Test Data Management strategy
Use this section to describe the approach for identifying and managing test data. Consider the following guidelines:
1. System and user acceptance tests – a subset of production data should be used to initialize the test environment (a masking sketch follows this list).
2. Performance and availability tests – full-size production data volumes should be used to exercise the performance and volume aspects of the system.
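Because the guiding standards require production data to be analyzed before use, and the staging environment is populated with obfuscated production data, a subset taken for testing would typically be masked first. The sketch below is one simple illustration; the record fields and masking rules are assumptions, not details from this plan.

```java
import java.util.List;
import java.util.stream.Collectors;

/** Illustrative masking of a production record subset before use as test data. */
public class TestDataMasker {

    /** Hypothetical customer record; real field names would come from the production schema. */
    record Customer(String id, String fullName, String email) {}

    /** Replace personally identifiable fields with deterministic placeholder values. */
    static Customer mask(Customer c) {
        return new Customer(
                c.id(),                               // keep the key so relationships survive
                "Customer " + c.id(),                 // drop the real name
                "user" + c.id() + "@example.invalid"  // drop the real e-mail address
        );
    }

    public static void main(String[] args) {
        List<Customer> productionSubset = List.of(
                new Customer("1001", "Jane Smith", "jane.smith@realmail.com"),
                new Customer("1002", "Rahul Mehta", "rahul.mehta@realmail.com"));

        List<Customer> testData = productionSubset.stream()
                .map(TestDataMasker::mask)
                .collect(Collectors.toList());

        testData.forEach(System.out::println);
    }
}
```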
Test Automation strategy
Adopt a planned approach to developing test automation and hold test automation code to the same quality standards as production code. Select the test cases for automation based on the following factors (a simple scoring sketch follows this list):
- Risk
- How long the tests take to run manually
- The cost of automating the test
- How easy the test cases are to automate
- How many times the test is expected to run during the project
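One lightweight way to apply these selection factors is to give each candidate test case a weighted score and automate the highest-scoring ones first. The weights, rating scale and example candidates below are assumptions for illustration only.

```java
import java.util.Comparator;
import java.util.List;

/** Illustrative weighted scoring of test cases as automation candidates. */
public class AutomationCandidateScorer {

    /** Each factor is rated 1 (low) to 5 (high); automationCost counts against the score. */
    record Candidate(String name, int risk, int manualRunTime, int automationCost,
                     int easeOfAutomation, int expectedRuns) {}

    static double score(Candidate c) {
        // Assumed weights: risk and run frequency matter most, automation cost reduces the score.
        return 3.0 * c.risk()
                + 2.0 * c.manualRunTime()
                + 2.0 * c.expectedRuns()
                + 1.0 * c.easeOfAutomation()
                - 2.0 * c.automationCost();
    }

    public static void main(String[] args) {
        List<Candidate> candidates = List.of(
                new Candidate("Login regression", 5, 3, 2, 4, 5),
                new Candidate("Quarterly report layout", 2, 4, 5, 1, 1));

        candidates.stream()
                .sorted(Comparator.comparingDouble(AutomationCandidateScorer::score).reversed())
                .forEach(c -> System.out.printf("%-25s score=%.1f%n", c.name(), score(c)));
    }
}
```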
Test Management
The test plan, test scenarios, test cases and bug reports should be kept in the same system, for example Bugzilla or Jira. Any agile tool can be used in which user stories, the test plan, test scenarios, test cases and bug reports are stored in one place.
Risks and Assumptions
Risks and assumptions raised in the daily stand-up meeting (in front of all team members and the Scrum Master) should be logged and addressed immediately.
Defect Management strategy
Ideally, defects are only raised and recorded when they are not going to be fixed immediately. In this case, the conditions under which they occur and the severity need to be accurately recorded so that the defect can be easily reproduced and then fixed.
Defect Classification
Severity | Description |
Critical | Defect causes critical loss of business functionality or a complete loss of service. |
Major | Defect causes major impact to business functionality and there is not an interim workaround available. |
Minor | Defect causes minor impact to business functionality and there is an interim workaround available. |
Trivial | Defect is cosmetic only and usability is not impacted. |
Defect Lifecycle
Step | Description |
Identify Defect | Ensure defect can be reproduced. Raise in defect tracking system. |
Prioritize Defect | Based on severity defect is prioritized in team backlog. |
Analyze Defect | Analyze the defect to determine acceptance criteria and implementation details for the fix. |
Resolve Defect | Implement changes and/or remediate failing tests. |
Verify Resolution | Execute tests to verify defect is resolved and no regression is seen. |
Close Defect | Close in defect tracking system. |
Specify the shared defect tracking system.