Purpose
  • To identify the individual system elements, both hardware and software, that need to be tested.
Role:  Test Analyst 
Frequency:  This activity is typically conducted multiple times per iteration. 
Steps

Determine what software will be implemented

Purpose:  To understand the main deliverables of the development team in the forthcoming schedule. 

Using the Iteration Plan and other available sources, identify the individual software items that the development team plans to produce for the forthcoming Iteration. Where the development effort is distributed across teams in various locations, you may need to discuss the development plans with each team directly. Check whether any development is subcontracted, and use whatever channels are available to you to gather details of the subcontractors' development effort.

In addition to new software, note changes to infrastructure and shared components. These changes may affect other dependent or associated software elements produced in previous development cycles, making it necessary to test the effect of these changes on those elements. For similar reasons, you should identify any changes and additions to third-party components that the development effort intends to make use of. This includes shared components, base or common code libraries, GUI widgets, persistence utilities, and so on. Review the software architecture to determine what mechanisms are in use that may be affected by third-party component changes.
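The impact analysis described above can be sketched as a simple dependency cross-reference. This is only an illustration, not part of the RUP guidance; all component names below are hypothetical:

```python
# Sketch: find previously delivered elements affected by changes to
# shared or third-party components. All names are hypothetical examples.
depends_on = {
    "OrderEntryUI": {"CommonWidgets", "BaseLib"},
    "ReportModule": {"BaseLib"},
    "HelpViewer": set(),
}

# Shared components the development team plans to change this iteration
changed = {"CommonWidgets"}

# Any element that depends on a changed component is a regression-test
# candidate, even if it was itself delivered in an earlier cycle.
affected = {elem for elem, deps in depends_on.items() if deps & changed}
print(sorted(affected))
```

In practice the dependency information would come from the software architecture description rather than a hand-built table, but the principle is the same: trace each change back to the elements that rely on it.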

Identify candidate system elements to be tested

Purpose:  To identify target items that the testing effort should exercise. 

For each identified test motivator, examine the list of software items to be delivered as part of this development cycle. Make an initial list that excludes any items that cannot be justified as useful in terms of satisfying the test motivators. Remember to include third-party software as well as the software to be developed directly by the project development team.
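The initial filtering step can be sketched as a cross-reference of delivered items against the motivators each one helps satisfy. The item and motivator names here are invented for illustration:

```python
# Sketch: keep only delivered items justified by at least one test
# motivator. All names below are hypothetical examples.
motivators = {"verify-order-entry", "verify-payment-gateway"}

# Items planned for delivery, mapped to the motivators each can satisfy
delivered_items = {
    "OrderEntryUI": {"verify-order-entry"},
    "PaymentAdapter": {"verify-payment-gateway"},  # third-party wrapper
    "LoggingUtil": set(),                          # serves no motivator
}

# An item stays on the candidate list only if it serves some motivator
candidates = {
    item for item, served in delivered_items.items()
    if served & motivators
}
print(sorted(candidates))  # LoggingUtil is excluded
```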

You will also need to consider what impact the various target deployment environments will have on the elements to be tested. Your list of candidate system elements should be expanded to include both the software being developed and the candidate elements of the target environment. These elements will include hardware devices, device-driver software, operating systems, network and communications software, third-party base software components (e.g. eMail client software, Internet Browsers, etc.) and various configurations and settings related to the possible combinations of all these elements.
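Because the target environment elements combine multiplicatively, it can help to enumerate the candidate combinations explicitly before deciding which to test. A minimal sketch, with hypothetical platform names:

```python
from itertools import product

# Sketch: enumerate candidate combinations of target environment
# elements. The specific platforms listed are hypothetical examples.
operating_systems = ["OS-A", "OS-B"]
browsers = ["Browser-X", "Browser-Y"]
mail_clients = ["MailClient-1"]

combinations = list(product(operating_systems, browsers, mail_clients))
print(len(combinations))  # 2 * 2 * 1 = 4 candidate configurations
```

Even this small example yields four configurations; real environments grow quickly, which is why the refinement step that follows matters.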

Where you have identified important target deployment environments, you should consider recording this information by creating or updating one or more outlined Test Environment Configurations; each outline should provide a name and brief description, and enumerate the main requirements or features of the configuration. Avoid spending a lot of time on these outlines; the list of requirements and features will be subsequently detailed in Activity: Define Test Environment Configurations.
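One such outline might be recorded as follows. The configuration name, description, and requirements here are invented placeholders, shown only to illustrate the intended level of brevity:

```python
# Sketch: a minimal outline of a Test Environment Configuration.
# Name, description, and requirements are hypothetical placeholders.
environment_outline = {
    "name": "Desktop-Baseline",
    "description": "Typical corporate desktop deployment",
    "requirements": [
        "Desktop operating system",
        "Internet browser with default settings",
        "eMail client software installed",
    ],
}

# Keep the outline brief; the requirements list is detailed later,
# in Activity: Define Test Environment Configurations.
print(environment_outline["name"])
```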

Refine the candidate list of target items

Purpose:  To eliminate unnecessary targets from the test effort work plan and to add missing elements to it. 

Using the evaluation mission and scope of the test effort agreed with the evaluation stakeholders, examine the list of target items and identify items that do not satisfy the evaluation mission and are obviously out of the test effort scope.

As an opposing check, critically examine the items again and challenge whether the evaluation mission and test effort scope will really be satisfied by the refined list of target items. It may be necessary to add elements to the list of target items to ensure appropriate scope and the ability to achieve the evaluation mission.

Define the list of target items

Purpose:  To communicate the decisions made about the target test items for the forthcoming work. 

Now that you've decided on the target test items, you need to communicate your choices to the test team and other stakeholders in the test effort. Arguably the most common method is to document the decisions about the target items in the Iteration Test Plan.

An alternative is simply to record this information in some form of table or spreadsheet and use it to govern work and responsibility assignment. During test implementation and execution, individual testers will use this information to make tactical decisions about the specific tests to implement and the test results to capture for these target items.
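Such a tracking table can be very lightweight. As a sketch, here it is built in memory and written out as comma-separated rows; the item names, tester assignments, and column headings are hypothetical:

```python
import csv
import io

# Sketch: record target test items with responsibility assignments.
# Item names, testers, and statuses are hypothetical examples.
rows = [
    ("Target Item", "Type", "Assigned Tester", "Status"),
    ("OrderEntryUI", "developed", "tester-a", "planned"),
    ("PaymentAdapter", "third-party", "tester-b", "planned"),
]

# Write the rows as CSV so the table can be opened in a spreadsheet
buffer = io.StringIO()
csv.writer(buffer).writerows(rows)
print(buffer.getvalue())
```

In practice the same table could live directly in a spreadsheet file; the point is only that each target item carries an owner and a status that testers can consult during implementation and execution.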

Evaluate and verify your results

Purpose:  To verify that the activity has been completed appropriately and that the resulting artifacts are acceptable. 

Now that you have completed the work, it is beneficial to verify that the work was of sufficient value, and that you did not simply consume vast quantities of paper. You should evaluate whether your work is of appropriate quality, and that it is complete enough to be useful to those team members who will make subsequent use of it as input to their work. Where possible, use the checklists provided in RUP to verify that quality and completeness are "good enough".

Have the people performing the downstream activities that rely on your work as input take part in reviewing your interim work. Do this while you still have time available to take action to address their concerns. You should also evaluate your work against the key input artifacts to make sure you have represented them accurately and sufficiently. It may be useful to have the author of the input artifact review your work on this basis.

Try to remember that RUP is an iterative process and that in many cases artifacts evolve over time. As such, it is not usually necessary, and is often counterproductive, to fully form an artifact that will only be partially used, or will not be used at all, in immediately subsequent work. This is because there is a high probability that the situation surrounding the artifact will change, and the assumptions made when the artifact was created will be proven incorrect, before the artifact is used, resulting in wasted effort and costly rework. Also avoid the trap of spending too many cycles on presentation to the detriment of content value. In project environments where presentation has importance and economic value as a project deliverable, you might want to consider using an administrative resource to perform presentation tasks.



Rational Unified Process   2003.06.13