This section establishes the scope and purpose of the test plan. It is where you describe the fundamental aspects of the testing effort.
- Purpose - Describe why the test plan was developed and what its objectives are. These may include documenting test requirements, defining testing strategies, identifying resources, and estimating schedules and project deliverables.
- Background - Explain any events that caused the test plan to be developed. These can include implementing improved processes or adding new environments or functionality.
- Technical Architecture - Diagram the components that make up the system under test. Include data storage and transfer connections, and describe the purpose each component serves, including how it is updated. Document the layers, such as presentation/interface, database, report writer, etc. A higher-level diagram showing how the system under test fits into a larger automation picture can also be included if available.
- Specifications - List all required hardware and software, including vendors and versions.
- Scope - Briefly describe the resources the plan requires, areas of responsibility, stages, and potential risks.
- Project Information - Identify all the information available in relation to this project. User documentation, the project plan, product specifications, training materials, and executive overview materials are examples of project information.
This section of the test plan lists all requirements to be tested. Any requirement not listed is outside of the scope of the test plan. (The day you’re held accountable for a released bug in an untested area, you’ll be glad you had a written, signed document that shows what was in and out of scope when the testing effort was carried out!)
- Functional Test Requirements - List all functions to be tested, such as creating, editing, and deleting records. This can be a fairly comprehensive listing for a full system test, or it may refer to another document.
- Design Requirements - Testing of the user interface, menu structures, or other design elements should also be listed.
- Integration Requirements - The requirements for testing the flow of data from one component to another may be included if this will be part of the test plan.
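Keeping the requirements list in a machine-readable form makes the in-scope/out-of-scope boundary easy to check. The sketch below assumes hypothetical requirement IDs and descriptions; they are illustrative only, not from any real project.

```python
# Hypothetical requirement lists; IDs and descriptions are illustrative only.
FUNCTIONAL_REQUIREMENTS = {
    "FR-01": "User can create a record",
    "FR-02": "User can edit an existing record",
    "FR-03": "User can delete a record",
}
DESIGN_REQUIREMENTS = {
    "DR-01": "All menu items are reachable via the keyboard",
}
INTEGRATION_REQUIREMENTS = {
    "IR-01": "Saved records propagate to the reporting component",
}

def in_scope(requirement_id):
    """A requirement is in scope only if it appears in one of the lists above."""
    return any(requirement_id in reqs
               for reqs in (FUNCTIONAL_REQUIREMENTS,
                            DESIGN_REQUIREMENTS,
                            INTEGRATION_REQUIREMENTS))
```

Anything `in_scope` does not recognize is, by definition, outside the test plan, which is exactly the written record you will want when a released bug is traced to an untested area.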
Use this section to describe how the test objectives will be met for each type of testing that may be part of the test plan: unit, function, integration, system, volume, stress, performance, configuration and/or installation testing. For each subset, detail the following:
- Objective - the overall objective this strategy is designed to meet. For a complete system test, this may be a statement that all functional requirements must behave as expected or as documented.
- Technique - document how test cases were developed, the tool(s) used to store them and where they can be found, how they will be executed, and the data to be used. Make notes here if tests are to be performed in cycles, or in concert with other testing efforts.
- Special Considerations - unique or necessary system setup, data or other test dependencies; environment conditions or other aspects that are required to establish a known state for testing.
- Test Cases - list or refer to the actual test cases that will be carried out to implement the plan. (See Anatomy of a Test Case on page 7 of this issue.)
- Completion Criteria - record the criteria that will be used to determine pass/fail of tests and the action that is to be taken based on test results.
- Assumptions - describe any outside projects or issues that may impact the effectiveness or timeliness of the test effort.
- Tools - document the tools that will be employed for testing. Cite the vendor, version and the help desk number to call for support.
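The strategy elements above (technique, special considerations, completion criteria) can be tied together in a single concrete test case. This is a minimal sketch: `records` and `create_record` are hypothetical stand-ins for the system under test, not a real API.

```python
# Hypothetical data store standing in for the system under test.
records = {}

def create_record(record_id, data):
    """Hypothetical function under test: store a record, rejecting duplicate ids."""
    if record_id in records:
        raise ValueError("duplicate record id")
    records[record_id] = dict(data)

def test_create_record():
    records.clear()                               # special consideration: known starting state
    create_record("R-1", {"name": "sample"})      # technique: execute with known data
    assert records["R-1"] == {"name": "sample"}   # completion criterion: data persisted intact

test_create_record()
```

Each comment maps a line of the test back to the strategy element it satisfies, which makes the test plan easier to audit against the tests actually executed.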
Identify the resource roles and responsibilities that will be required for test plan execution.
- Project Plan - develop a project plan showing the phases, tasks, and resources. Update the project plan as needed to reflect such events as changes in deadlines or available resources.
Document when the application under test will be made available for testing, and the estimated time for executing test cases. Specify whether frequent builds will be provided on a regular basis during the test cycle, or when system components are expected to be ready for testing.
List all the deliverables that are associated with the testing effort, and where copies of these deliverables or documents may be located. This includes the test plan itself, test scripts, test cases and project plan.
Defect Tracking and Reporting
Document the tool and process used to record and track defects. List any reports to be produced and include recipients, frequencies, delivery mechanisms and examples. Identify team resources involved in the defect tracking process.
Describe any ratings, categories or classifications used to identify or prioritize defects. Following are sample categories for prioritizing defects:
- Critical - denotes an unusable function that causes an abend or general protection fault, or when a change in one area of the application causes a problem elsewhere.
- Severe - a function does not perform as required or designed, or an interface object does not work as presented.
- Annoyance - function works but not as quickly as expected, or does not conform to standards and conventions.
- Cosmetic - not critical to system performance: misspelled words, incorrect formatting, vague or confusing error messages or warnings.
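The four sample categories above can be encoded so that defect reports sort consistently. This sketch assumes a numeric ordering (1 = most urgent); the record fields are hypothetical examples.

```python
from enum import IntEnum

class DefectPriority(IntEnum):
    """Sample priority levels from the plan; the numeric ordering is an assumption."""
    CRITICAL = 1   # unusable function, abend/GPF, or a change breaks another area
    SEVERE = 2     # function does not perform as required or designed
    ANNOYANCE = 3  # works, but slowly or against standards and conventions
    COSMETIC = 4   # misspellings, formatting, vague messages or warnings

def most_urgent(defects):
    """Return the defect with the highest priority (lowest numeric value)."""
    return min(defects, key=lambda d: d["priority"])

bugs = [
    {"id": "D-7", "priority": DefectPriority.COSMETIC},
    {"id": "D-9", "priority": DefectPriority.SEVERE},
]
# most_urgent(bugs)["id"] → "D-9"
```

Because `IntEnum` values compare as integers, the same field drives both sorting in reports and escalation rules in the defect-tracking process.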
The test plan should be reviewed by all parties responsible for its execution, and approved by the test team, product and development managers. Provide for approval signatures at the bottom of the test plan. A walkthrough meeting with all parties in attendance is the most effective method of obtaining test plan approval.
When the test effort is complete, document the results. Identify any discrepancies between the plan and the actual implementation, and document how those discrepancies were handled. Get ready for your next successful test plan.