Wylie College
Requirements Management Plan
Version 2.0
Revision History
| Date | Version | Description | Author |
|---|---|---|---|
| 08/Jan/1999 | 1.0 | Initial Release | Simon Jones |
| 10/Feb/1999 | 2.0 | Extending plan | Simon Jones |
Table of Contents
1.3 Definitions, Acronyms, and Abbreviations
2.1 Organization, Responsibilities, and Interfaces
2.2 Tools, Environment, and Infrastructure
3. The Requirements Management Program
3.1 Requirements Identification
3.2.1 Criteria for Product Requirements
3.2.2 Criteria for Use Case Requirements
3.3.1 Attributes for Use Case Requirements
3.3.2 Attributes for Test Cases
3.5 Requirements Change Management
3.6 Disciplines and Activities
Requirements Management Plan
This Requirements Management Plan identifies and describes the attributes that will be used to manage the requirements for all software projects at Wylie College. In addition, it outlines the requirements traceability that will be maintained on projects during development.
The attributes assigned to each requirement will be used to manage the software development and to prioritize the features for each release.
The objective of requirements traceability is to reduce the number of defects found late in the development cycle. Ensuring all product requirements are captured in the software requirements, design, and test cases improves the quality of the product.
The attribute and traceability guidelines in this document apply to the product requirements, software requirements, and test requirements for all Wylie College software projects.
We use terms as defined in the Rational Unified Process and in Rational RequisitePro documentation.
The following references may be found on the Wylie College Software Process website or via links from it.
1. Wylie College Configuration Management Plan.
2. Rational Unified Process.
3. Wylie College Development Case.
Covered by the individual project's Software Development Plan.
Rational RequisitePro will be used to manage requirements attributes and traceability. Other infrastructure details are covered on the Wylie College Software Process Website.
Each project will identify and manage the following requirement types:
| Artifact (Document Type) | Requirement Type | Description |
|---|---|---|
| Vision | Product requirements | Product features, constraints, quality ranges, and other product requirements. |
| Use-Case Model | Use Case | Use cases, documented in Rational Rose. |
| Test Plan | Test Cases | Cases describing how we will verify that the system behaves as expected. |
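As an illustration only (the actual tag prefixes are configured per project in RequisitePro), product requirements might be identified with tags such as PR1, PR2, ...; use case requirements with UC1, UC2, ...; and test cases with TC1, TC2, .... The traceability links described below are maintained between these tagged items.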
The product requirements defined in the Vision Document will be traced to the corresponding use case or supplementary requirements in the Use Case Specifications and the Supplementary Specification.
Each product requirement traces to 1 or more use case requirements and supplementary requirements.
The use case requirements defined in the Use Case Specifications and the Supplementary Specification will be traced to the corresponding test cases specified in the Test Plan.
Each use case requirement traces to 1 or more system test cases.
The test cases specified in the Test Plan are traced back to the product requirements (from the Vision) and use case requirements which are being verified by the particular test case.
A test case may trace back to 1 or more product and use case requirements. Where a test case verifies a derived requirement or the design, it may have no traceability back to the original product requirements or use case requirements.
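To make these rules concrete, the sketch below (a minimal illustration, not part of the plan) models the forward trace links and flags any product or use case requirement that lacks one. The class names and tags are assumptions for the example; in practice the trace links are recorded and reviewed in RequisitePro.

```python
# Illustrative sketch only: class names, tags, and the check itself are
# assumptions made for this example. The authoritative trace links are
# maintained in Rational RequisitePro, not in code.
from dataclasses import dataclass, field
from typing import List


@dataclass
class TestCase:
    tag: str


@dataclass
class UseCaseRequirement:
    tag: str
    # Traces forward to 1 or more system test cases.
    test_cases: List[TestCase] = field(default_factory=list)


@dataclass
class ProductRequirement:
    tag: str
    # Traces forward to 1 or more use case / supplementary requirements.
    use_case_reqs: List[UseCaseRequirement] = field(default_factory=list)


def untraced(product_reqs: List[ProductRequirement]) -> List[str]:
    """Return tags of requirements that are missing a forward trace."""
    gaps = []
    for pr in product_reqs:
        if not pr.use_case_reqs:
            gaps.append(pr.tag)
        for uc in pr.use_case_reqs:
            if not uc.test_cases:
                gaps.append(uc.tag)
    return gaps


# PR1 traces to UC1, which is verified by TC1; PR2 has no trace yet.
uc1 = UseCaseRequirement("UC1", test_cases=[TestCase("TC1")])
print(untraced([ProductRequirement("PR1", use_case_reqs=[uc1]),
                ProductRequirement("PR2")]))   # -> ['PR2']
```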
The use case requirements and the Supplementary Specification will be managed using the attributes defined in this section. These attributes are useful for managing the development effort, determining iteration content, and associating use cases with their specific Rose models.
Status
Set after analysis has drafted the use case. Tracks progress of the use case from initial drafting through to final validation.
| Value | Description |
|---|---|
| Proposed | Use cases which have been identified though not yet reviewed and approved. |
| Approved | Use cases approved for further design and implementation. |
| Validated | Use cases which have been validated in a system test. |
Priority
Set by the Project Manager. Determines the priority of the use case in terms of the importance of assigning development resources to it and of monitoring its progress. Priority is typically based upon the perceived benefit to the user, the planned release, the planned iteration, the complexity of the use case (risk), and the effort to implement it.
| Value | Description |
|---|---|
| High | Use case is a high priority relative to ensuring that its implementation is monitored closely and that resources are assigned appropriately to the task. |
| Medium | Use case is medium priority relative to other use cases. |
| Low | Use case is low priority. Implementation of this use case is less critical and may be delayed or rescheduled to subsequent iterations or releases. |
Effort
Set by the development team. Because some use cases require more time and resources than others, estimating the number of team-weeks or person-weeks, lines of code, or function points required, for example, is the best way to gauge complexity and set expectations of what can and cannot be accomplished in a given time frame. Effort is used in managing scope and determining development priority. The Project Manager uses these effort estimates to determine the project schedule and to plan the resourcing of tasks effectively.
Estimate effort in person days (assume 7.5 hours in a workday).
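For example, a use case estimated at roughly 60 hours of work would be recorded as an effort of 8 person days (60 ÷ 7.5); the figures here are illustrative only.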
Risk
Set by the development team based on the probability that the use case will experience undesirable events, such as effort overruns, design flaws, a high number of defects, poor quality, or poor performance. Undesirable events such as these are often the result of poorly understood or poorly defined requirements, insufficient knowledge, lack of resources, technical complexity, new technology, new tools, or new equipment.
Wylie College projects will categorize the technical risk of each use case as high, medium, or low.
| Value | Description |
|---|---|
| High | The impact of the risk combined with the probability of the risk occurring is high. |
| Medium | The impact of the risk is less severe and/or the probability of the risk occurring is lower. |
| Low | The impact of the risk is minimal and the probability of the risk occurring is low. |
Planned Iteration
Records the development iteration in which the use case will be implemented. It is anticipated that the development for each release will be performed over several development iterations during the Construction Phase of the project.
The iteration number assigned to each use case is used by the Project Manager to plan the activities of the project team.
The possible values will be of the form <letter>-<iteration number>, where the letter is I, E, C, or T for the Inception, Elaboration, Construction, and Transition Phases respectively. For example:
| Value | Description |
|---|---|
| E-1 | Scheduled for Elaboration Phase, Iteration 1 |
| C-1 | Scheduled for Construction Phase, Iteration 1 |
| C-2 | Scheduled for Construction Phase, Iteration 2 |
| C-3 | Scheduled for Construction Phase, Iteration 3 |
Assigned To
Use cases are assigned to either individuals or development teams for further analysis, design, and implementation. A simple pull-down list will help everyone on the project team better understand responsibilities.
Rose Model
Identifies the Rose use case model associated with the use case requirement.
The test cases will be managed using the following attributes.
Status
Set by the Test Lead. Tracks the status of each test case.
| Value | Description |
|---|---|
| Untested | Test case has not been performed. |
| Failed | Test has been conducted and failed. |
| Conditional Pass | Test has been completed with problems. The test is assigned a status of Pass on the condition that certain actions are completed. |
| Pass | Test has completed successfully. |
Build
Records the system build in which the specific test case will be verified.
Assigned To
Individual assigned to perform and verify the test case. This simple pull-down list will help everyone on the project team better understand responsibilities.
Date
Planned test date or actual test date.
Notes
Any notes associated with planning or executing the test.
TBD
See the Wylie College Configuration Management Plan.
See the Wylie College Development Case.
This is described in each project's Software Development Plan.
This is described in each project's Software Development Plan.