
Principles

  • Metrics must be simple, objective, easy to collect, easy to interpret, and hard to misinterpret.
  • Metrics collection must be automated and non-intrusive, i.e., not interfere with the activities of the developers.
  • Metrics must contribute to quality assessment early in the lifecycle, when efforts to improve software quality are effective.
  • Absolute metric values and trends must be actively used by both management and engineering personnel to communicate progress and quality in a consistent format.

A taxonomy of metrics

Metrics for various aspects of the project include:

  • Progress in terms of size and complexity.
  • Stability in terms of rate of change in the requirements or implementation, size, or complexity.
  • Modularity in terms of the scope of change.
  • Quality in terms of the number and type of errors.
  • Maturity in terms of the frequency of errors.
  • Resources in terms of project expenditure versus planned expenditure.

Trends are often more informative than any single absolute value: monitor how each metric changes over time, not just its current reading.

For each metric, its purpose, and sample measures/perspectives:

Progress (iteration planning; completeness)
  • Number of classes, SLOC, function points, scenarios, or test cases, per iteration or category
  • Amount of rework per iteration (number of classes)

Stability (convergence)
  • Number and type of changes (bug vs. enhancement; interface vs. implementation), per iteration or category
  • Amount of rework per iteration

Adaptability (convergence; software "rework")
  • Average person-hours per change, per iteration or category

Modularity (convergence; software "scrap")
  • Number of classes/categories modified per change, per iteration

Quality (iteration planning; rework indicator; release criterion)
  • Number of errors, defect discovery rate, defect density
  • Depth of inheritance, class coupling, size of interface (number of operations), number of methods overridden, method size
  • ...per class and category

Maturity (test coverage/adequacy; robustness for use)
  • Test hours per failure, and type of failure, per iteration or category

Expenditure profile (financial insight; planned vs. actual)
  • Person-days per class
  • Full-time staff per month
  • % of budget expended

A Small Set of Metrics

This example is extracted from Software Project Management: A Unified Framework [ROY98].

Metrics primitives

Total SLOC                         SLOCt = total size of the code
SLOC under configuration control   SLOCc = current baseline
Critical defects                   SCO0 = number of type 0 SCOs (SCO: software change order)
Normal defects                     SCO1 = number of type 1 SCOs
Improvement requests               SCO2 = number of type 2 SCOs
New features                       SCO3 = number of type 3 SCOs
Number of SCOs                     N = SCO0 + SCO1 + SCO2
Open rework (breakage)             B = cumulative broken SLOC due to type 1 and type 2 SCOs
Closed rework (fixes)              F = cumulative fixed SLOC
Rework effort                      E = cumulative effort expended fixing type 0/1/2 SCOs
Usage time                         UT = hours a given baseline has been operating under realistic usage scenarios
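
As a sketch, the primitives above can be captured in a small record type; this Python dataclass (the field names are my own, not from [ROY98]) also derives N from the SCO counts:

```python
from dataclasses import dataclass

@dataclass
class MetricsPrimitives:
    """Raw measures collected for a given baseline, after [ROY98]."""
    sloc_total: int       # SLOCt: total size of the code
    sloc_cc: int          # SLOCc: SLOC under configuration control (current baseline)
    sco0: int             # critical defects (type 0 SCOs)
    sco1: int             # normal defects (type 1 SCOs)
    sco2: int             # improvement requests (type 2 SCOs)
    sco3: int             # new features (type 3 SCOs)
    breakage: int         # B: cumulative broken SLOC due to type 1/2 SCOs
    fixes: int            # F: cumulative fixed SLOC
    rework_effort: float  # E: cumulative effort fixing type 0/1/2 SCOs (person-hours)
    usage_time: float     # UT: hours under realistic usage scenarios

    @property
    def n_sco(self) -> int:
        # N = SCO0 + SCO1 + SCO2; new features (SCO3) are excluded
        return self.sco0 + self.sco1 + self.sco2
```

A snapshot of these primitives per iteration is enough to compute every derived metric in the next section.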

Quality metrics for the end-product

From this small set of metrics primitives, more interesting metrics can be derived:

Scrap ratio       B/SLOCt, percentage of product scrapped
Rework ratio      E/total effort, percentage of rework effort
Modularity        B/N, average breakage per SCO
Adaptability      E/N, average effort per SCO
Maturity          UT/(SCO0 + SCO1), mean time between defects
Maintainability   (scrap ratio)/(rework ratio), maintenance productivity
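
As an illustration of the derivations above, a small Python helper (the function name and the sample numbers are illustrative, not from the source) computes each derived metric directly from the primitives; ratios are returned as fractions rather than percentages:

```python
def derived_metrics(sloc_t, total_effort, b, e, n, sco0, sco1, ut):
    """Derive the end-product quality metrics of [ROY98] from the primitives."""
    scrap = b / sloc_t         # scrap ratio: fraction of the product scrapped
    rework = e / total_effort  # rework ratio: fraction of effort spent on rework
    return {
        "scrap_ratio": scrap,
        "rework_ratio": rework,
        "modularity": b / n,             # average breakage per SCO
        "adaptability": e / n,           # average effort per SCO
        "maturity": ut / (sco0 + sco1),  # mean time between defects
        "maintainability": scrap / rework,
    }

# Hypothetical project snapshot: 100 KSLOC, 10,000 hours of total effort,
# 4,000 broken SLOC, 500 hours of rework, 17 SCOs, 12 defects, 300 usage hours.
m = derived_metrics(sloc_t=100_000, total_effort=10_000, b=4_000,
                    e=500, n=17, sco0=2, sco1=10, ut=300)
```

With these sample numbers, 4% of the product was scrapped while only 5% of the effort went to rework, so maintainability comes out at 0.8.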

In-progress indicators

Rework stability     B - F, breakage versus fixes over time
Rework backlog       (B - F)/SLOCc, currently open rework
Modularity trend     modularity over time
Adaptability trend   adaptability over time
Maturity trend       maturity over time
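
To sketch how the in-progress indicators might be tracked, the function below (an illustrative example, not from the source) computes rework stability and rework backlog for a series of per-iteration snapshots, giving the trend data to plot over time:

```python
def rework_trend(snapshots):
    """Compute the in-progress rework indicators per iteration.

    snapshots: list of (B, F, SLOCc) tuples, one per iteration, where
    B and F are cumulative broken/fixed SLOC and SLOCc is the size of
    the current baseline under configuration control.
    """
    return [
        {
            "rework_stability": b - f,           # breakage versus fixes
            "rework_backlog": (b - f) / sloc_c,  # currently open rework
        }
        for b, f, sloc_c in snapshots
    ]
```

A stability value shrinking across iterations signals convergence; a growing backlog is an early warning that breakage is outpacing fixes.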

 


 

© Rational Software Corporation 1998 Rational Unified Process 5.1 (build 43)