Purpose
  • To establish the structure in which the test suite implementation will reside
  • To assign responsibilities for test suite implementation areas and their contents
  • To outline the required Test Suites
Role:  Test Designer 
Frequency:  This activity is typically conducted multiple times per iteration. 
Steps
Input Artifacts:    Resulting Artifacts:   
Tool Mentors:   
More Information: 

Workflow Details:   

Examine the Test Approach, Target Test Items and Assessment Needs

Purpose:  To gain an understanding of how testing will be assessed, and the implications that has on how the specific Test Suites need to be implemented to assess the Target Test Items. 

Starting with a review of the Test Plan to determine the assessment needs, consider how the assessment of the extent of testing and of software quality can be determined using the stated Test Approach. Consider any special needs that need to be addressed related to specific Target Test Items.

Examine the testability mechanisms and supporting elements

Purpose:  To understand the available testability elements, the mechanisms they support, and the benefits they offer. 

Review the mechanisms that are useful to enable testing in this environment, and identify the specific testability elements that implement these mechanisms. This includes reviewing resources such as any function libraries that have been developed by the test team and stubs or harnesses implemented by the development team.

Testability is achieved through a combination of developing software that is testable and defining a test approach that appropriately supports testing. As such, testability is an important aspect of the test team's asset development, just as it is an important part of the software development effort. Achieving Testability (the ability to effectively test the software product) will typically involve a combination of:

  • testability enablers provided by test automation tools
  • specific techniques to create the component Test Scripts
  • function libraries that separate and encapsulate complexity from the basic test procedural definition in the Test Script, providing a central point of control and modification.
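The function-library idea in the last bullet can be sketched in code. The sketch below is illustrative only: the function and script names are hypothetical, and the "authentication" is a stand-in for real application interaction.

```python
# Hypothetical function library: it encapsulates the login flow so each
# Test Script states only its procedural intent. If the login flow changes,
# only this function needs updating, not every Test Script that logs in --
# the "central point of control and modification" described above.

def login(session, user, password):
    """Stand-in for a library function driving the application's login."""
    session["user"] = user
    session["authenticated"] = (password == "secret")  # placeholder check
    return session["authenticated"]

def test_script_view_account():
    """A Test Script that reuses the library instead of repeating steps."""
    session = {}
    assert login(session, "tester", "secret")
    # ... further procedural steps would go here ...
    return "PASS"

print(test_script_view_account())  # -> PASS
```

The Test Script stays short and readable because the complexity lives in the library, which is exactly the separation the bullet describes.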

Analyze distribution requirements

Does the current Test Suite have the requirement to be distributed? If so, make use of the testability elements that support distribution. These elements will typically be features of specific automation support tools that will distribute the Test Suite, execute it remotely and bring back the Test Log and other outputs for centralized results determination.

Analyze concurrency requirements

Does the current Test Suite have the requirement to be run concurrently with other Test Suites? If so, make use of the testability elements that support concurrency. These elements will typically be a combination of specific supporting tools and utility functions that enable multiple Test Suites to execute concurrently on different physical machines. Concurrency requires careful Test Data design and management to ensure no unexpected or unplanned side effects occur, such as two concurrent tests updating the same data record.
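The Test Data concern above can be made concrete with a small sketch. This is not a real tool integration: the suite names and the executor-based runner are hypothetical, but the key design point is genuine, each concurrent suite is bound to a disjoint range of data records so no two suites can update the same record.

```python
from concurrent.futures import ThreadPoolExecutor

def run_suite(name, record_range):
    """Stand-in for executing one Test Suite against its own data partition."""
    results = []
    for record_id in record_range:
        results.append((record_id, "PASS"))  # placeholder for real execution
    return name, results

# Disjoint record ranges prevent the side effect warned about above:
# two concurrent tests updating the same data record.
suites = {
    "order_entry_suite": range(1000, 1010),
    "order_dispatch_suite": range(2000, 2010),
}

with ThreadPoolExecutor() as pool:
    futures = [pool.submit(run_suite, name, rng) for name, rng in suites.items()]
    for future in futures:
        name, results = future.result()
        print(name, "executed", len(results), "tests")
```

Partitioning the Test Data up front is usually cheaper than diagnosing the intermittent failures that shared records cause later.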

Create the initial Test Suite structure

Purpose:  To outline the Test Suite(s) to be implemented. 

Enumerate one or more Test Suites that (when executed) will provide a complete and meaningful result of value to the test team, enabling subsequent reporting to stakeholders. Try to find a balance between enough detail to provide specific information to the project team but not so much detail that it's overwhelming and unmanageable.

Where Test Scripts already exist, you can probably assemble the Test Suite and its constituent parts yourself, then pass the Test Suite stabilization work on to a Test Suite implementer to complete.

For Test Suites that require new Test Scripts to be created, you should also give some indication of the Test Scripts (or other Test Suites) you believe will be referenced by this Test Suite. If it's easy to enumerate them, do that. If not, you might simply provide a brief description that outlines the expected content coverage of the main Test Suite and leave it to the Test Suite implementer to make tactical decisions about exactly what Test Scripts are included.
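One way to picture the structure being outlined here is as an ordered list of Test Script references, where a suite may also reference other suites. The sketch below is a minimal illustration, the suite and script names are invented for the example.

```python
# A Test Suite as an ordered collection of Test Script references;
# a suite can also reference other Test Suites, as described above.
smoke_suite = {
    "name": "smoke",
    "scripts": ["login_script", "create_order_script"],
}
regression_suite = {
    "name": "nightly_regression",
    "scripts": ["login_script", "create_order_script", "dispatch_order_script"],
    "suites": [smoke_suite],  # reference to another Test Suite
}

def enumerate_scripts(suite):
    """Flatten a suite into the full ordered list of scripts it will run."""
    scripts = list(suite.get("scripts", []))
    for sub_suite in suite.get("suites", []):
        scripts.extend(enumerate_scripts(sub_suite))
    return scripts

print(enumerate_scripts(regression_suite))
```

Even when the exact Test Scripts are left to the Test Suite implementer, sketching the containment structure like this gives the team a shared picture of what the suite will cover.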

Adapt the Test Suite structure to reflect team organization and tool constraints

Purpose:  To refine the Test Suite structure to work with the team responsibility assignments. 

It may be necessary to further subdivide or restructure the Test Suites you've identified to accommodate the Work Breakdown Structure (WBS) the team is working to. This will help to reduce the risk that access conflicts might arise during Test Suite development. Sometimes test automation tools might place constraints on how individuals can work with automation assets, so restructure the Test Suites to accommodate this as necessary.

Identify inter-Test Script communication mechanisms

Purpose:  To identify Test Data and System State that needs to be shared or passed between Test Scripts. 

In most cases, Test Suites can simply call Test Scripts in a specific order; this is often sufficient to ensure the correct system state is passed from one Test Script to the next.

However, in certain classes of system, dynamic run-time data is generated by the system or derived as a result of the transactions that take place within it. For example, in an order entry and dispatch system, each time an order is entered a unique order number is system generated. To enable an automated Test Script to dispatch an order, a preceding order entry Test Script needs to capture the unique number the system generates and pass it on to the order dispatch Test Script.

In cases like this, you will need to consider what inter-Test Script communication mechanism is appropriate to use. Typical alternatives include passed parameters, writing and reading values in a disk file, and using global run-time variables. Each strategy has pros and cons that make it more or less appropriate in each specific situation.
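The order-number example above can be sketched as follows. The script names and the fixed order number are hypothetical stand-ins for values captured from a real system; two of the strategies mentioned, a passed parameter and a shared disk file, are shown.

```python
import json
import os
import tempfile

def order_entry_script():
    """Stand-in for a script that captures the system-generated number."""
    order_number = 4711  # in reality, scraped/queried from the system
    return order_number

def order_dispatch_script(order_number):
    """Stand-in for a script that needs the number from the entry script."""
    return f"dispatched order {order_number}"

# Strategy 1: passed parameter -- simple, but both scripts must run in
# the same process or be wired together by the Test Suite.
number = order_entry_script()
print(order_dispatch_script(number))

# Strategy 2: disk file -- survives across separately launched scripts,
# at the cost of file management and cleanup.
handoff = os.path.join(tempfile.gettempdir(), "order_handoff.json")
with open(handoff, "w") as f:
    json.dump({"order_number": number}, f)
with open(handoff) as f:
    print(order_dispatch_script(json.load(f)["order_number"]))
os.remove(handoff)
```

Global run-time variables (the third alternative) behave like the passed parameter but trade explicitness for convenience, which is one of the pros-and-cons judgments the paragraph refers to.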

Define initial dependencies between Test Suite elements

Purpose:  To identify and record the run-time dependencies between Test Suite elements. 

This is primarily associated with the sequencing of the Test Scripts and possibly Test Suites for run-time execution. Tests that run without the correct dependencies being established run the risk of either failing or reporting anomalous data.
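Recording dependencies explicitly also lets a valid execution sequence be derived mechanically. A minimal sketch, using Python's standard-library `graphlib` and invented script names: each entry lists the scripts that must run before it.

```python
from graphlib import TopologicalSorter

# Hypothetical run-time dependencies between Test Scripts: an order must
# be entered before it can be dispatched, and dispatched before invoiced.
dependencies = {
    "login": set(),
    "enter_order": {"login"},
    "dispatch_order": {"enter_order"},
    "invoice_order": {"dispatch_order"},
}

# Derive an execution order that respects every dependency; running in
# any other order risks the failures or anomalous data described above.
execution_order = list(TopologicalSorter(dependencies).static_order())
print(execution_order)
```

If the dependency graph contains a cycle, `TopologicalSorter` raises an error, which is itself a useful early check on the Test Suite's integrity.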

Visually model the test implementation architecture

Purpose:  To make use of a diagram to document and explain how the test implementation is realized. 

If you have access to a UML modeling or drawing tool, you may wish to create a diagram of the Test Implementation Model that depicts the key elements of the automated test software. You might also diagram some key aspects of the Test Automation Architecture in a similar way.

Another approach is to draw these diagrams on a white-board that is easily visible to the test team.

Refine the Test Suite structure

Purpose:  To make necessary adjustments to maintain the integrity of the test implementation. 

As the project progresses, Test Suites are likely to change: new Test Scripts will be added and old Test Scripts updated, reordered or deleted. These changes are a natural part of Test Suite maintenance and you need to embrace them rather than avoid them.

If you don't actively maintain the Test Suites, they will quickly become broken and fall into disuse. Left for a few builds, a Test Suite may take extensive effort to resurrect, and it may be easier to simply abandon it and create a new one from scratch. See the More Information: section in the header table of this page for more guidelines on maintaining automated Test Suites.

Maintain traceability relationships

Purpose:  To enable impact analysis and assessment reporting to be performed on the traced items. 

Using the Traceability requirements outlined in the Test Plan, update the traceability relationships as required.
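A traceability relationship can be as simple as a mapping from each Test Script to the requirements it verifies. The sketch below uses invented script and requirement identifiers; it shows the impact-analysis query this activity enables.

```python
# Hypothetical traceability records: Test Script -> requirement IDs.
traceability = {
    "login_script": ["REQ-01"],
    "create_order_script": ["REQ-12", "REQ-13"],
    "dispatch_order_script": ["REQ-12"],
}

def impacted_scripts(requirement):
    """Impact analysis: which Test Scripts are affected if this changes?"""
    return sorted(s for s, reqs in traceability.items() if requirement in reqs)

print(impacted_scripts("REQ-12"))
```

The same mapping, inverted, supports assessment reporting: for each requirement, which Test Scripts (if any) cover it.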

Evaluate and verify your results

Purpose:  To verify that the activity has been completed appropriately and that the resulting artifacts are acceptable. 

Now that you have completed the work, it is beneficial to verify that the work was of sufficient value, and that you did not simply consume vast quantities of paper. You should evaluate whether your work is of appropriate quality, and that it is complete enough to be useful to those team members who will make subsequent use of it as input to their work. Where possible, use the checklists provided in RUP to verify that quality and completeness are "good enough".

Have the people who will perform the downstream activities that rely on your work take part in reviewing it while it is still in progress. Do this while you still have time available to take action to address their concerns. You should also evaluate your work against the key input artifacts to make sure you have represented them accurately and sufficiently. It may be useful to have the author of the input artifact review your work on this basis.

Try to remember that RUP is an iterative process and that in many cases artifacts evolve over time. As such, it is not usually necessary, and is often counterproductive, to fully form an artifact that will only be partially used or will not be used at all in immediately subsequent work. This is because there is a high probability that the situation surrounding the artifact will change, and the assumptions made when the artifact was created will be proven incorrect, before the artifact is used, resulting in wasted effort and costly rework. Also avoid the trap of spending too many cycles on presentation to the detriment of content value. In project environments where presentation has importance and economic value as a project deliverable, you might want to consider using an administrative resource to perform presentation tasks.



Rational Unified Process   2003.06.13