March 19, 2014

Example Test Strategy | Test Plan


A test strategy is a plan (which may exist at any level: project, program, department or organization) that describes how the test objectives will be met effectively with the help of the available resources. If you have a test strategy, it is easier to focus effort on the most important test activities at the time. Moreover, a test strategy provides clarity on the test approach to the project stakeholders. First, view my Test Strategy video. Then read on.
Many readers have asked me for an example software testing strategy document. I requested Varsha, who is a senior member of the Software Testing Space community, to create an example test strategy for a hypothetical agile project. First, view the video, Example Agile Test Strategy, Agile Test Plan. Then read on.
Below is the resulting sample test strategy document. The sections below contain detailed information; additional guidelines are given in italics. I hope that this sample test strategy document helps you create a really effective test strategy for your own project. - Inder P Singh

Example Test Strategy

Introduction to Agile
Agile is an iterative and incremental (evolutionary) approach to software development that is performed in a highly collaborative manner by self-organizing teams within a control framework. High-quality, adaptive software is developed by small teams using the principles of continuous design improvement and testing, based on rapid feedback and change. Agile is people-centric: development and testing are performed in an integrated way, self-organizing teams encourage role interchangeability, the customer plays a critical role, and the project life cycle is guided by product features.

How Agile differs from the Waterfall model
1. Greater collaboration
2. Shorter work cycle and constant feedback
3. Need to embrace change
4. Greater flexibility
5. Greater discipline
6. The goal should be quality and not just speed
7. Greater stakeholder accountability
8. Greater range of skills
9. Go faster and do more
10. Courage
11. Confidence in design

Purpose of this document
The purpose of this Test Strategy is to create a shared understanding of the overall targets, approach, tools and timing of test activities. Our objective is to achieve higher quality and shorter lead times with minimum overhead, frequent deliveries, close teamwork within the team and with the customer, continuous integration, short feedback loops and frequent design changes. The test strategy guides us through common obstacles with a clear view of how to evaluate the system. Testing starts with the exploration of the requirements and of what the customer really wants, by elaborating on the user stories from different perspectives. Testing becomes a continuous and integrated process in which all parties in the project are involved.

Guiding standards
  • Shared Responsibility: Everyone in the team is responsible for quality.
  • Data Management: Production data must be analyzed before being used for testing.
  • Test Management: Test cases, code, documents and data must be treated with the same importance as the production system.
  • Test Automation: Attempt to automate all types of testing (unit, functional, regression, performance, security) as far as feasible.

Requirements strategy
1. Always implement highest priority work items first (Each new work item is prioritized by Product Owner and added to the stack).
2. Work items may be reprioritized at any time or work items may be removed at any time.
3. A work item that is specified in greater detail should have higher priority than one that is specified in lesser detail.

Quality and Test Objectives
Accuracy (Priority: Must Have)
Description: Features and functions work as proposed (i.e. as per requirements).
Measure and Target: 100% completion of agreed features, with open:
  • Severity 1 defects = 0
  • Severity 2 defects = 0
  • Severity 3 defects < 5
  • Severity 4 defects < 10

Integrity (Priority: Must Have)
Description: Ability to prevent unauthorized access, prevent information loss, protect from virus infection and protect the privacy of data entered.
Measure and Target:
  • All access is via HTTPS (over a secured connection).
  • User passwords and session tokens are encrypted.

Maintainability (Priority: Must Have)
Description: Ease of adding features, correcting defects or releasing changes to the system.
Measure and Target:
  • Code Duplication < 5%
  • Code Complexity < 8
  • Unit Test Coverage > 80%
  • Method Length < 20 Lines

Availability (Priority: Should Have)
Description: Percentage of planned up-time for which the system is required to operate.
Measure and Target: System is available for 99.99% of the time, measured through system logs.

Interoperability (Priority: Must Have)
Description: Ease with which the system can exchange information with other systems.
Measure and Target: User interface renders and functions properly on the following browser versions (and later):
  1. IE version = 9.0
  2. Firefox version = 18.0
  3. Safari version = 5.0
  4. Chrome version = 11.0

Performance (Priority: Should Have)
Description: Responsiveness of the system under a given load and the ability to scale to meet growing demand.
Measure and Target (see the Apdex sketch after this table):
  1. Apdex Score > 0.9
  2. Response Time < 200 ms
  3. Throughput > 100 per minute
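
The Apdex target above can be computed directly from raw response-time samples: a sample is "satisfied" if it completes within the threshold T and "tolerating" if it completes within 4T, and Apdex = (satisfied + tolerating/2) / total samples. Below is a minimal Java sketch; the 200 ms threshold matches the response-time target in this table, while the sample data and class name are invented for illustration.

import java.util.Arrays;

public class ApdexCalculator {

    // Apdex = (satisfied + tolerating/2) / total samples. A sample is "satisfied"
    // if it completes within the threshold T, and "tolerating" if it completes
    // within 4T; anything slower counts as "frustrated".
    static double apdex(long[] responseTimesMs, long thresholdMs) {
        long satisfied = Arrays.stream(responseTimesMs)
                .filter(t -> t <= thresholdMs).count();
        long tolerating = Arrays.stream(responseTimesMs)
                .filter(t -> t > thresholdMs && t <= 4 * thresholdMs).count();
        return (satisfied + tolerating / 2.0) / responseTimesMs.length;
    }

    public static void main(String[] args) {
        // Invented sample of 10 response times; T = 200 ms as per the target above.
        long[] samples = {120, 150, 180, 210, 350, 900, 100, 160, 190, 140};
        // 7 satisfied, 2 tolerating (210, 350), 1 frustrated (900): (7 + 1.0) / 10 = 0.80
        System.out.printf("Apdex score: %.2f%n", apdex(samples, 200));
    }
}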

Test Scope (both business processes and the technical solution)
In Scope
Identify what is included in testing for this particular project. Consider what is new and what has been changed or corrected for this product release.
  • (Automated) Unit testing
  • Code analysis (static and dynamic)
  • Integration testing
  • (Automated) Feature and functional testing
  • Data conversion testing
  • System testing
  • (Automated) Security testing
  • Environment testing
  • (Automated) Performance and Availability testing
  • (Automated) Regression testing
  • Acceptance testing
Out of Scope
Identify what is excluded from testing for this particular project.

Testing Types
Remove tools that will not be used.

  • Unit testing: Testing that verifies the implementation of software elements in isolation. Example tools: xUnit test tools (NUnit, JUnit), mocking tools (see the JUnit sketch after this table).
  • Code analysis (static and dynamic): Walkthrough and code analysis. Example tools: static analysis for Java (Checkstyle, FindBugs, Jtest, AgileJ StructureViews) and for .Net (FxCop, StyleCop, CodeRush); dynamic analysis (Avalanche, DynInst, BoundsChecker).
  • Integration testing: Testing in which software elements, hardware elements, or both are combined and tested until the entire system has been integrated. Example tool: VectorCAST/C++.
  • Functional and feature testing: Testing an integrated hardware and software system to verify that the system meets the required functionality: 100% requirements coverage, 100% coverage of the main flows, 100% of the highest risks covered, operational scenarios tested, operational manuals tested, and all failures reported. Example tools: UFT, Selenium WebDriver, Watir, Canoo WebTest, SoapUI Pro.
  • System testing: Testing the whole system with end-to-end flows. Example tools: Selenium, QTP, TestComplete.
  • Security testing: Verify secure access, transmission and password/session security. Example tools: BFBTester, CROSS, Flawfinder, Wireshark, WebScarab, Wapiti, x5s, Exploit-Me, Websecurify, N-Stalker.
  • Environment testing: Testing on each supported platform/browser. Example tools: GASP, QEMU, KVM, Xen, PsTools.
  • Performance and availability testing: Load, scalability and endurance tests. Example tools: LoadRunner, JMeter, AgileLoad, WAPT, LoadUI.
  • Data conversion testing: Performed to verify the correctness of automated or manual conversions and/or loads of data in preparation for implementing the new system. Example tools: DTM, QuerySurge, PICT, Slacker.
  • Regression testing: Testing all prior features and re-testing previously closed bugs. Example tools: QTP, Selenium WebDriver.
  • Acceptance testing: Testing based on acceptance criteria, to enable the customer to determine whether or not to accept the system. Example tools: Selenium, Watir, iMacros, agile acceptance test tools.
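
As an illustration of the unit-testing row above, here is a minimal JUnit 4 sketch that verifies a software element in isolation. The PriceCalculator class and its discount rule are hypothetical, invented for this example; only JUnit itself is among the tools listed.

import static org.junit.Assert.assertEquals;
import org.junit.Test;

// Hypothetical class under test: applies a percentage discount to a price.
class PriceCalculator {
    double applyDiscount(double price, double discountPercent) {
        if (discountPercent < 0 || discountPercent > 100) {
            throw new IllegalArgumentException("Discount must be between 0 and 100");
        }
        return price - (price * discountPercent / 100.0);
    }
}

public class PriceCalculatorTest {

    private final PriceCalculator calculator = new PriceCalculator();

    @Test
    public void appliesDiscountToPrice() {
        // 10% off 200.00 should be 180.00
        assertEquals(180.0, calculator.applyDiscount(200.0, 10.0), 0.001);
    }

    @Test(expected = IllegalArgumentException.class)
    public void rejectsNegativeDiscount() {
        calculator.applyDiscount(200.0, -5.0);
    }
}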

Test Design strategy
1. Specification-based / black box techniques (equivalence classes, boundary value analysis, decision tables, state transitions and use case testing); see the sketch after this list.
2. Structure-based / white box techniques (statement coverage, decision coverage, condition coverage and multiple condition coverage)
3. Experience-based techniques (error guessing and exploratory testing)
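
For instance, boundary value analysis for a quantity field that accepts values from 1 to 100 would exercise 0, 1, 100 and 101. A minimal JUnit 4 sketch follows; the isValidQuantity validator is a hypothetical stand-in for real application logic.

import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;
import org.junit.Test;

public class QuantityBoundaryTest {

    // Hypothetical validator: a quantity is valid if it lies in the range 1..100.
    static boolean isValidQuantity(int quantity) {
        return quantity >= 1 && quantity <= 100;
    }

    @Test
    public void valuesJustInsideTheBoundariesAreAccepted() {
        assertTrue(isValidQuantity(1));    // lower boundary
        assertTrue(isValidQuantity(100));  // upper boundary
    }

    @Test
    public void valuesJustOutsideTheBoundariesAreRejected() {
        assertFalse(isValidQuantity(0));   // just below lower boundary
        assertFalse(isValidQuantity(101)); // just above upper boundary
    }
}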

Test Environments strategy
Development
Description: This environment is local and specific to each developer/tester machine. It is based on the version/branch of source code being developed. Integration points are typically impersonated.
Data Setup: Data and configuration are populated through setup scripts.
Usage: Unit, functional and acceptance tests; test tools, e.g. xUnit test tools (NUnit, JUnit) and mocking tools; source code management for version control.

Integration
Description: This environment supports continuous integration of code changes and execution of unit, functional and acceptance tests. Additionally, static code analysis is completed in this environment.
Data Setup: Data and configuration are populated through setup scripts.
Usage: Unit, functional and acceptance tests; static code analysis; continuous integration tools, e.g. CruiseControl.

Staging
Description: This environment supports exploratory testing.
Data Setup: Populated with post-analysis obfuscated production data.
Usage: Exploratory testing.

Production
Description: Live environment.
Data Setup: New instances will contain standard project reference data. Existing instances will have current data migrated into the environment.
Usage: Production verification testing.

Test Execution strategy
We will keep in mind the following points:
  1. Agile testing must be iterative.
  2. Testers cannot rely on having a complete specification.
  3. Testers should be flexible.
  4. Testers need to be independent and independently empowered in order to be effective.
  5. Be generalizing specialists.
  6. Be prepared to work closely with developers.
  7. Focus on value-added activities.
  8. Be flexible.
  9. Focus on what to test, not how to test.
  10. Testers should be embedded in the agile team.
  11. Be flexible to contribute in any way they can.
  12. Have a wide range of skills, with one or more specialties.
  13. Aim for shorter feedback cycles.
  14. Focus on sufficient and straightforward test situations.
  15. Focus on exploratory testing.
  16. Specify the meaning of "Done", i.e. when activities/tasks performed during system development can be considered complete.
  17. Define when to continue or stop testing before delivering the system to the customer. Specify which evaluation criteria are to be used (e.g. time, coverage and quality) and how they will be used.
Additionally, use this section to describe the steps for executing tests in preparation for deployment/release/upgrade of the software. Key execution steps could include:
1. Steps to build the system
2. Steps to execute automated tests (see the runner sketch below)
3. Steps to populate environment with reference data
4. Steps to generate test report/code metrics
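
As a sketch of step 2 (executing automated tests), the JUnit 4 runner below executes test classes and exits non-zero on failure, which lets a continuous-integration build step fail the build. The class names are the hypothetical examples from the earlier sketches in this document.

import org.junit.runner.JUnitCore;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;

public class TestRunner {

    public static void main(String[] args) {
        // Run the automated suites; the classes named here are the hypothetical
        // examples from the earlier sketches in this document.
        Result result = JUnitCore.runClasses(PriceCalculatorTest.class, QuantityBoundaryTest.class);
        for (Failure failure : result.getFailures()) {
            System.out.println(failure.toString());
        }
        System.out.printf("Tests run: %d, failed: %d%n",
                result.getRunCount(), result.getFailureCount());
        // A non-zero exit code makes the CI build step fail when any test fails.
        System.exit(result.wasSuccessful() ? 0 : 1);
    }
}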


Test Data Management strategy
Use this section to describe the approach for identifying and managing test data. Consider the following guidelines:
1. System and user acceptance tests – a subset of production data should be used to initialize the test environment.
2. Performance and availability tests – full-size production data files should be used to test the performance and volume aspects of the system (production data should be obfuscated first; see the sketch below).
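
As a sketch of the obfuscation required by the guiding standards and used in the Staging environment, the hypothetical helper below masks personally identifiable fields in production data before it is loaded into a test environment; the field formats and masking rules are assumptions for illustration.

import java.util.UUID;

public class TestDataObfuscator {

    // Replace a real email address with a unique, clearly synthetic one so
    // that test runs cannot contact real customers.
    static String maskEmail(String email) {
        return "user-" + UUID.randomUUID().toString().substring(0, 8) + "@example.com";
    }

    // Keep only the last four digits of a card/account number for traceability.
    static String maskAccountNumber(String accountNumber) {
        String last4 = accountNumber.length() >= 4
                ? accountNumber.substring(accountNumber.length() - 4)
                : accountNumber;
        return "************" + last4;
    }

    public static void main(String[] args) {
        System.out.println(maskEmail("jane.doe@realcustomer.com"));   // synthetic address
        System.out.println(maskAccountNumber("4111111111111111"));    // ************1111
    }
}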


Test Automation strategy
Adopt a planned approach to developing test automation. Increase the quality of test automation code. Select the test cases for automation based on the following factors:
  • Risk
  • How long the tests take to run manually
  • The cost of automating the test
  • How easy the test cases are to automate
  • How many times the test is expected to run in the project (see the sketch after this list)
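
For example, a high-priority login flow that runs in every regression cycle is a good automation candidate. Below is a minimal Selenium WebDriver (JUnit 4) sketch of such a check; the URL, element IDs and credentials are hypothetical placeholders, and it assumes a Firefox installation that the driver can launch.

import static org.junit.Assert.assertTrue;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class LoginSmokeTest {

    private WebDriver driver;

    @Before
    public void openBrowser() {
        driver = new FirefoxDriver();
    }

    @Test
    public void userCanLogIn() {
        // The URL, element IDs and credentials below are hypothetical placeholders.
        driver.get("https://app.example.com/login");
        driver.findElement(By.id("username")).sendKeys("testuser");
        driver.findElement(By.id("password")).sendKeys("testpassword");
        driver.findElement(By.id("login-button")).click();
        assertTrue("Dashboard should be shown after login",
                driver.getPageSource().contains("Dashboard"));
    }

    @After
    public void closeBrowser() {
        driver.quit();
    }
}
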
Test Management
The Test Plan, test scenarios, test cases and bug reports should be kept in the same system, for example Bugzilla or Jira. Any agile tool can be used in which user stories, the Test Plan, test scenarios, test cases and bug reports can be stored in the same place.

Risks and Assumptions
Risks and assumptions raised in the daily stand-up meeting (in front of all team members and the Scrum Master) should be logged and addressed immediately.

Defect Management strategy
Ideally, defects are only raised and recorded when they are not going to be fixed immediately. In this case, the conditions under which they occur and the severity need to be accurately recorded, so that the defect can be easily reproduced and then fixed.

Defect Classification
  • Critical: Defect causes critical loss of business functionality or a complete loss of service.
  • Major: Defect causes major impact to business functionality and there is no interim workaround available.
  • Minor: Defect causes minor impact to business functionality and there is an interim workaround available.
  • Trivial: Defect is cosmetic only and usability is not impacted.

Defect Lifecycle
  • Identify Defect: Ensure the defect can be reproduced. Raise it in the defect tracking system.
  • Prioritize Defect: Based on severity, the defect is prioritized in the team backlog.
  • Analyze Defect: Analyze the defect against the acceptance criteria and implementation details.
  • Resolve Defect: Implement changes and/or remediate failing tests.
  • Verify Resolution: Execute tests to verify that the defect is resolved and no regression is seen.
  • Close Defect: Close the defect in the defect tracking system.

Specify the shared defect tracking system.
Copyright © Software Testing Space
Note: This example test strategy has been contributed by Varsha Tomar. Varsha has 9 years of experience in both manual and automated software testing. Currently, she works with Vinculum Solutions as a Senior Test Lead. Her interests include software testing, test automation, training, testing methodologies and exploring testing tools.

Please put any questions that you have in the comments.

Comments:

  1. Thanks for this informative sample. Can I use and apply this at my workplace?

    Replies
    1. Please feel free to use it in your projects. The only condition is that you credit this blog at any place within your document.

  2. Hello, in an Agile environment, what is the smallest test plan/strategy that I can use? That is, how can I build a small document that is reusable for each sprint, rather than one that is long and has sections that would not be used? I would like to keep it basic.

    Replies
    1. Hello! Yes, I get your point about spending minimum time to build an Agile test plan and strategy for a specific sprint. Actually, such a document is not required. For each sprint planning, you can read the project-level Agile test strategy. Then identify the needed tasks for the selected stories and plan points accordingly.

    2. I thought the initial question was interesting, so to expand on it: if you have an agile team doing Scrum with 2-week sprints, for example, is there still value in even having a test plan? That probably sounds like a crazy question to ask, but is the test plan not defined by the sprint backlog now? It determines what is in and out of scope. Types of testing are defined in the individual stories. And most of the resource-related things, like environments, data and personnel, tend to endure through multiple sprints, so why would we document them every sprint? I'm starting to feel that to actually test in an agile manner, we should have an overarching Test Strategy (which can be reviewed at any time) and some smaller enduring docs to track test data and environments, and maybe a doc that tracks overall coverage across multiple sprints, so you don't develop blind spots in your regression. Really curious what you think.

    3. Adding to what I commented earlier: for a 2-week sprint, a separate test plan may not be required. The Test Strategy may exist at the release level or project level (or even at the program, department or company level). Just defining types of testing may be insufficient, e.g. performance testing may need to be further specified as load, scalability or endurance testing. I agree that the Test Strategy should be a live document, meaning it can be reviewed and updated at any time according to the current risks or opportunities. What would you call the smaller enduring docs (to track test data, environments and overall coverage)? Isn't that the Test Plan? Ideally, the Test Plan should also be a live document. Its versions would indicate the content that has changed across the sprints and releases. Thank you for your comment.

