
Software Inspection Process

The document outlines the software inspection process, detailing its purpose, benefits, and structured activities involved in identifying defects in software work products. It emphasizes the importance of inspections in improving software quality, reducing costs, and enhancing productivity, with various metrics provided to illustrate the return on investment and efficiency. The process includes planning, overview, preparation, meeting, rework, and follow-up activities, each with specific inputs and outputs to ensure thorough defect identification and correction.


Software Inspection Process

Technical Overview

David F. Rico
Table of Contents
• Preface
• Introduction
• Benefits
• Process
• Forms
• Roles
• Case Studies
• Metrics
• Deployment
• Management
• Pitfalls
• Research
• Conclusion
• Resources

2
Preface
Reviews

Reviews are formal team evaluations of software work products, which identify any discrepancies from specifications and standards, or provide recommendations after the examination of alternatives, or both.

4
Walkthroughs

Walkthroughs are unstructured meetings held by software managers to publicize design and implementation concepts, without obligation to use any feedback, alternative ideas, or suggested changes resulting from the meeting.

5
Inspections

Inspections are structured and neutrally facilitated meetings for technical peers to identify defects in software work products which must be corrected, without suggesting solutions or interference from the originator of the work product.

6
Followup Questions
• What is a review ???
• What is a walkthrough ???
• What is an inspection ???

7
Introduction
What is it ?
• A simple process to identify defects
• Highly structured meeting
• Forum for independent evaluation
• Form of static analysis or static testing
• Early, in-process validation technique
• Form of quality and reliability engineering
• Performed by software engineers

9
What are the goals ?
• Identify as many defects as possible
• Identify defects in early stages of life cycle
• Identify defects before testing and fielding
• Identify defects cheaply and inexpensively
• Reduce development and maintenance costs
• Shorten development cycle time
• Quantitatively control quality and reliability

10
Where did it come from ?
• Created by Michael Fagan of IBM in 1972
• Typically referred to as Fagan Inspections
• Adaptation of statistical quality control to
large systems computer programming
• First published in IBM Systems Journal
– “Design and code inspections to reduce errors
in program development”
– Volume 15, Number 3, 1976

11
Why do it ?
• Reduce development and maintenance costs
• Improve software quality and reliability
• Initiate effective verification and validation
• Reduce cost and risk of software testing
• Reduce dependence on quality assurance
• Support SEI Levels 3, 4, and 5 (e.g.,
software quality metrics, process metrics,
defect prevention, change management)
12
What is it like ?
STATIC REVIEW TECHNIQUES

Feature      | Inspection            | Desk Check            | Walkthrough        | Audit                | Phase Review
Purpose      | Defect Identification | Defect Identification | Design Evaluation  | Process Verification | Progress Evaluation
Timing       | Very Early            | Early                 | Early              | Late                 | Very Late
When         | Product Completion    | Product Completion    | Product Completion | Phase Completion     | Phase Completion
People       | 4                     | 1                     | 5-15               | 5                    | 10-200
Cost         | Very Low              | Low                   | High               | Medium               | Very High
Who          | Engineers             | Engineers             | Project Manager    | Quality Assurance    | Customer
Independence | Very High             | Low                   | Very Low           | High                 | High
Rework       | Mandatory             | Not Mandatory         | Not Mandatory      | Not Mandatory        | Not Mandatory
Pace         | Slow                  | High                  | High               | High                 | Very High
Efficiency   | Very High             | Average               | Low                | Low                  | Very Low
Measurable   | Very High             | High                  | Very Low           | Low                  | Low
Metric       | Defect                | Defect                | N/A                | Deviation            | Change Request

13
What isn’t it ?
• Not for software quality assurance group
• Not for design alternative evaluation
• Not for management participation
• Not for individual performance evaluation
• Not for socio-political assassination
• Not at all like a Walkthrough
• Not a notoriously late and ineffective
manufacturing inspection
14
How does it work ?
• Has a singular objective
• Introduces notion of counting defects
• Precision mechanics of an audit
• Mandatory defect correction
• Uses “second set of eyes” principle
• Rational orchestration of facilitated forum
• Identifies defects at early point-of-origin
• Exploits uniquely-skilled domain experts
15
Followup Questions
• Inspections are simple processes for ???
• What is the goal of inspections ???
• Who created inspections ???
• Why perform inspections ???
• What are inspections similar to ???
• What aren’t inspections ???
• How do inspections work ???

16
Benefits
What is the return on investment ?
• Average return on investment of 133:1
– DACS estimates an ROI of 72:1
– AT&T estimates an ROI of 234:1
– Rico estimates an ROI of 160:1
– BNR estimates an ROI of 114:1
– Gilb estimates an ROI of 113:1
– HP estimates an ROI of 104:1

[Chart: savings in millions of dollars, 1989-1998]

18
What is the cycle time reduction ?
• Average cycle time reduction of 5.5x
– DACS estimates a reduction of 1.55x
– Fagan estimates a reduction of 6.67x
– AT&T estimates a reduction of 8.37x
– Rico estimates a reduction of 6.54x
– BNR estimates a reduction of 5.17x
– Gilb estimates a reduction of 5.13x
– HP estimates a reduction of 4.84x

19
What is the quality increase ?
• Average quality increase of 16.4x
– Rico estimates an increase of 3.03x
– Bull HN estimates an increase of 76.93x
– Aetna estimates an increase of 5.56x
– IBM estimates an increase of 14.29x
– BNR estimates an increase of 4.99x
– AT&T estimates an increase of 3.35x
– Fagan estimates an increase of 6.67x

20
What is the productivity increase ?
• Average productivity increase of 6x
– DACS estimates an increase of 1.55x
– Fagan estimates an increase of 6.67x
– AT&T estimates an increase of 8.37x
– Rico estimates an increase of 6.54x
– BNR estimates an increase of 5.17x
– Gilb estimates an increase of 5.13x
– HP estimates an increase of 4.84x

21
What is the defect removal efficiency ?
• Average defect removal efficiency of 82.2%
– Rico estimates an efficiency of 67%
– Bull HN estimates an efficiency of 98.7%
– Aetna estimates an efficiency of 82%
– IBM estimates an efficiency of 93%
– BNR estimates an efficiency of 80%
– AT&T estimates an efficiency of 70%
– Fagan estimates an efficiency of 85%

22
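The averages quoted on these benefit slides are plain arithmetic means of the individual estimates. A quick sanity check against the defect removal efficiencies listed above (the variable names are mine):

```python
from statistics import mean

# Defect removal efficiency estimates (percent) from the slide above:
# Rico, Bull HN, Aetna, IBM, BNR, AT&T, Fagan
efficiencies = [67.0, 98.7, 82.0, 93.0, 80.0, 70.0, 85.0]

average = mean(efficiencies)
print(f"Average defect removal efficiency: {average:.1f}%")  # 82.2%
```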
What is the break even point ?
• Average break even point of 23.02 hours
– AT&T estimates a break even of 21.58 hours
– Rico estimates a break even of 22.43 hours
– BNR estimates a break even of 23.56 hours
– Gilb estimates a break even of 23.61 hours
– HP estimates a break even of 23.95 hours

[Chart: source lines of code (SLOC, 0.00-3.00) versus effort in hours (22-25); the inspection and ad hoc curves cross at effort = 23.02 hours and SLOC = 1.85]

23
What is the accuracy ?
• Average accuracy of 92.3%
– Estimate accuracy of 100% for a 680K project
– Estimate accuracy of 100% for a 30K project
– Estimate accuracy of 75% for a 70K project
– Estimate accuracy of 89% for a 1,700K project
– Estimate accuracy of 86% for a 290K project
– Estimate accuracy of 96% for a 70K project
– Estimate accuracy of 92% for a 540K project
– Estimate accuracy of 100% for a 700K project
24
Followup Questions
• What is the return-on-investment ???
• What is the cycle time reduction ???
• What is the quality increase ???
• What is the productivity increase ???
• What is the defect removal efficiency ???
• What is the break even point ???
• What is the accuracy ???

25
Process
What is the overall process ?
SOFTWARE INSPECTION PROCESS

Purpose ✓ Team Identification of Software Work Product Defects

✓ Predecessor Specifications
✓ Software Work Product Standards
✓ Software Work Product
✓ Software Work Product Overview
Input ✓ Statement-of-Work
✓ Software Defect Types
✓ Checklists
✓ Inspection Defect List

✓ Planning Activity - Organize Inspections
✓ Overview Activity - Describe Software Work Products
✓ Preparation Activity - Analyze Software Work Products
Activity ✓ Meeting Activity - Identify Software Defects
✓ Rework Activity - Correct Software Defects
✓ Followup Activity - Verify Software Defect Corrections

✓ Inspection Defect Summary
Output ✓ Inspection Report
✓ Software Work Product

27
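The six activities above run as a fixed sequence, each with its own purpose. A minimal sketch of that ordering as data (the dictionary and its name are illustrative, not from any standard tooling):

```python
# Software inspection activities in order, mapped to their purpose
# (content taken from the slide above; the structure itself is illustrative).
INSPECTION_PROCESS = {
    "Planning":    "Organize inspections",
    "Overview":    "Describe software work products",
    "Preparation": "Analyze software work products",
    "Meeting":     "Identify software defects",
    "Rework":      "Correct software defects",
    "Followup":    "Verify software defect corrections",
}

for step, (activity, purpose) in enumerate(INSPECTION_PROCESS.items(), 1):
    print(f"{step}. {activity}: {purpose}")
```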
What is the planning activity ?
PLANNING ACTIVITY

Purpose ✓ Organize Software Work Product Inspections

✓ Predecessor Specifications
Input ✓ Software Work Product Standards
✓ Software Work Product

✓ Authors Submit Software Work Products
✓ Moderators/Authors Review Software Work Products
✓ Moderators Select Inspectors/Assign Roles
Activity ✓ Moderators Schedule Overviews/Meetings
✓ Moderators Arrange Overview/Meeting Locations
✓ Moderators Prepare Inspection Meeting Notices
✓ Moderators Distribute Inspection Materials

✓ Inspection Meeting Notice
✓ Statement-of-Work
✓ Predecessor Specifications
Output ✓ Software Work Product Standards
✓ Software Defect Types
✓ Checklists
✓ Software Work Product

28
What is the overview activity ?
OVERVIEW ACTIVITY

Purpose ✓ Introduce/Describe Software Work Products

Input ✓ Software Work Product Overview

✓ Authors Prepare Software Work Product Overviews
✓ Moderators Facilitate Overviews
✓ Authors Distribute Software Work Product Overviews
Activity ✓ Authors Present Software Work Product Overviews
✓ Inspectors Ask Software Work Product Questions
✓ Authors Answer Software Work Product Questions
✓ Inspectors Note Software Work Product Problems

Output ✓ Software Work Product Overview

29
What is the preparation activity ?
PREPARATION ACTIVITY

Purpose ✓ Individually Analyze Software Work Products

✓ Statement-of-Work
✓ Predecessor Specifications
✓ Software Work Product Standards
Input ✓ Software Defect Types
✓ Checklists
✓ Software Work Product

✓ Inspectors/Recorders Review Software Defect Types
✓ Inspectors Review Checklists
✓ Inspectors Review Statements-of-Work
Activity ✓ Inspectors Review Software Work Product Standards
✓ Inspectors Review Predecessor Specifications
✓ Inspectors Review Software Work Products
✓ Readers Select Narration Techniques

Output ✓ Software Work Product

30
What is the meeting activity ?
MEETING ACTIVITY

Purpose ✓ Team Identification of Software Work Product Defects

✓ Statement-of-Work
✓ Predecessor Specifications
✓ Software Work Product Standards
Input ✓ Software Defect Types
✓ Checklists
✓ Software Work Product

✓ Moderators Facilitate Meetings
✓ Readers Narrate Software Work Products
✓ Inspectors Ask Software Work Product Questions
✓ Authors Answer Software Work Product Questions
Activity ✓ Inspectors Identify Software Work Product Defects
✓ Recorders Transcribe Software Work Product Defects
✓ Moderators Review Inspection Defect Lists
✓ Moderators Disposition Software Work Products

Output ✓ Inspection Defect List

31
What is the rework activity ?
REWORK ACTIVITY

Purpose ✓ Mandatory Software Work Product Defect Correction

✓ Inspection Defect List
Input ✓ Software Work Product

✓ Authors Obtain Inspection Defect Lists
✓ Authors Obtain Software Work Products
✓ Authors Review Inspection Defect Lists
Activity ✓ Authors Correct Software Work Product Defects
✓ Authors Correct New Software Work Product Defects
✓ Authors Verify Defect Type, Class, and Severity
✓ Authors Submit Reworked Software Work Products

Output ✓ Software Work Product

32
What is the followup activity ?
FOLLOWUP ACTIVITY

Purpose ✓ Verify/Summarize Software Work Product Corrections

✓ Inspection Defect List
Input ✓ Software Work Product

✓ Moderators Obtain Reworked Software Work Products
✓ Moderators/Authors Review Software Work Products
✓ Moderators Prepare Inspection Defect Summaries
Activity ✓ Moderators Prepare Inspection Reports
✓ Moderators/Authors Verify Summaries/Reports
✓ Moderators Distribute Inspection Reports
✓ Moderators Submit Software Work Products to SCM

✓ Inspection Defect Summary
Output ✓ Inspection Report
✓ Software Work Product

33
Followup Questions
• What is the purpose of inspections ???
• What is the purpose of planning ???
• What is the purpose of the overview ???
• What is the purpose of preparation ???
• What is the purpose of the meeting ???
• What is the purpose of rework ???
• What is the purpose of the followup ???

34
Forms
Inspection meeting notice
Inspection Meeting Notice

Date: Component: Moderator:
Project: Release: Phone:
Activity: Document: Location:

Meeting Type: Overview Inspection Re-Inspection

Inspection Type: Software Installation Plan Software Architecture Description User Documentation Description

Software Integration Plan System Architecture Description Test or Validation Procedure

Test or Validation Plan Software Design Description Software Integration Audit Report

System Requirements Specification Software Interface Design Description Test or Validation Results Report

Database Design Description Software Requirements Description

Meeting Date: Meeting Duration:
Meeting Time: Software Work Product Size:
Meeting Location: Expected Preparation Time:

Meeting Participants: Participant Location: Participant Role:

36
Inspection defect list
Inspection Defect List

Date: Component: Moderator:
Project: Release: Phone:
Activity: Document: Location:

Meeting Type: Overview Inspection Re-Inspection

Inspection Type: Software Installation Plan Software Architecture Description User Documentation Description

Software Integration Plan System Architecture Description Test or Validation Procedure

Test or Validation Plan Software Design Description Software Integration Audit Report

System Requirements Specification Software Interface Design Description Test or Validation Results Report

Database Design Description Software Requirements Description

Disposition: Accept Conditional Re-Inspect

Location: Defect Description: Type: Class: Severity:

Type: Data, Documentation, Functionality, Human Factors, Interface, Input/Output, Logic, Maintainability, Performance, Syntax, Standards, Test, Other
Class: Missing, Wrong, Extra
Severity: Major, Minor

37
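The classification scheme at the bottom of the defect list (type, class, severity) maps naturally onto a validated record. A hypothetical sketch (the class and field names are mine):

```python
from dataclasses import dataclass

# Allowed values, copied from the Inspection Defect List form above.
TYPES = {"Data", "Documentation", "Functionality", "Human Factors",
         "Interface", "Input/Output", "Logic", "Maintainability",
         "Performance", "Syntax", "Standards", "Test", "Other"}
CLASSES = {"Missing", "Wrong", "Extra"}
SEVERITIES = {"Major", "Minor"}

@dataclass
class Defect:
    """One row of the inspection defect list (illustrative structure)."""
    location: str
    description: str
    type: str
    defect_class: str
    severity: str

    def __post_init__(self):
        # Reject any value not on the form's pick lists.
        if self.type not in TYPES:
            raise ValueError(f"unknown defect type: {self.type}")
        if self.defect_class not in CLASSES:
            raise ValueError(f"unknown defect class: {self.defect_class}")
        if self.severity not in SEVERITIES:
            raise ValueError(f"unknown severity: {self.severity}")

d = Defect("module.c:42", "loop bound off by one", "Logic", "Wrong", "Major")
print(d.severity)  # Major
```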
Inspection defect summary
Inspection Defect Summary

Date: Component: Moderator:
Project: Release: Phone:
Activity: Document: Location:

Meeting Type: Overview Inspection Re-Inspection

Inspection Type: Software Installation Plan Software Architecture Description User Documentation Description

Software Integration Plan System Architecture Description Test or Validation Procedure

Test or Validation Plan Software Design Description Software Integration Audit Report

System Requirements Specification Software Interface Design Description Test or Validation Results Report

Database Design Description Software Requirements Description

Disposition: Accept Conditional Re-Inspect

Defect:        MINOR DEFECTS                      MAJOR DEFECTS
               Missing  Wrong  Extra  Total       Missing  Wrong  Extra  Total
Data
Documentation
Functionality
Human Factors
Interface
Input/Output
Logic
Total:

38
Inspection report
Inspection Report

Date: Component: Moderator:
Project: Release: Phone:
Activity: Document: Location:

Meeting Type: Overview Inspection Re-Inspection

Inspection Type: Software Installation Plan Software Architecture Description User Documentation Description

Software Integration Plan System Architecture Description Test or Validation Procedure

Test or Validation Plan Software Design Description Software Integration Audit Report

System Requirements Specification Software Interface Design Description Test or Validation Results Report

Database Design Description Software Requirements Description

Est. Rework Effort: Re-Inspection Date: Duration:

Actual Rework Effort: Inspector Number: Size of Materials:

Size of Materials: Preparation Time: Certification:

Rework Author: Meeting Number: Completion Date:

Inspectors:

Comments:

39
Followup Questions
• What is the inspection meeting notice ???
• What is the inspection defect list ???
• What is the inspection defect summary ???
• What is the inspection report ???
• What is the meeting type ???
• What is the inspection type ???
• What are the defect severity types ???

40
Roles
What are the roles ?
• Moderator
– Facilitator
• Author
– Producer of work product
• Inspector
– Identifier of defects
• Reader and recorder (separate roles)
– Paraphraser of product and logger of defects

42
What is a moderator ?
• Function → Classical facilitator (maestro)
• Who → Specially trained technical lead
• Responsibilities → Careful coordination
– Maintains time limits for all activities
– Verifies entry criteria and schedules meetings
– Manages overview and inspection subprocesses
– Keeps project managers out of inspection meeting
– Allows only inspectors to identify defects
– Mutes author from interfering with inspection
– Prevents inspectors from identifying “solutions”
– Prevents inspectors from insulting author
– Verifies rework and records inspection results

43
What is an author ?
• Function → Developer of work product
• Who → Trained project manager/engineer
– Project or test manager (project or test plan)
– Analyst (requirements specification)
– Designer (design specification)
– Programmer (software source code)
– Tester (test report)
• Responsibilities → Passive participation
– Introduces work product to inspectors
– Answers any questions
– Corrects defects

44
What is an inspector ?
• Function → Identifier of defects
• Who → Trained project manager/engineer
– Project plan
• Program and fellow project managers
• Technical leads (responsible for executing plan)
– Requirements, design, code, and test (domain specialist)
• Engineers, analysts, designers, coders, and testers
• Responsibilities → Passive participation
– Attends product overview
– Analyzes defect history and prepares for inspection
– Identifies defects during inspection subprocess

45
What is a reader ?
• Function → Consumer of work product
• Who → Trained project manager/engineer
– Project plan (technical lead is reader)
– Test plan (tester is reader)
– Requirements (designer is reader)
– Design (programmer is reader)
– Code (tester is reader)
• Responsibilities → Passive participation
– Attends product overview
– Studies and practices reading work product
– Paraphrases work product during inspection subprocess

46
What is a recorder ?
• Function → Writes down defects
• Who → Trained project manager/engineer
• Responsibilities → Passive participation
– Becomes familiar with process and forms
– Writes down defects identified by inspectors
– Negotiates adequate time to write down defects
– Subject to moderator’s direction
– Does not write down information other than defects
– Provides completed defect lists to moderator

47
Followup Questions
• What is a moderator ???
• What is an author ???
• What is an inspector ???
• What is a reader ???
• What is a recorder ???

48
Case Studies
Bull HN Information Systems
• System
– Operating system
– 11 million lines of code
– 600,000 lines of code added annually
– “C” programming language
• Experience
– 7,413 inspections conducted
– 11,557 “major” defects identified
– 98.7% inspection efficiency achieved
– 667,170 inspection data points (in 3 years)

50
Bell Northern Research
• System
– Embedded, real-time digital switching systems
– 15 million lines of code
– 312,500 lines of code added quarterly
– Modern, high-level programming languages
• Experience
– 2,778 inspections conducted
– 240,000 defects identified
– 80% inspection efficiency achieved
– 250,020 inspection data points (in 2 years)

51
IBM AS/400
• System
– Operating system
– 7.1 million lines of code
– 2 million lines of code added annually
– PL/1, Jovial, and RPG programming languages
• Experience
– 7,889 inspections conducted
– 681,600 defects identified
– 70% inspection efficiency achieved
– 710,010 inspection data points (in 3.5 years)
52
AT&T
• System
– Embedded, real-time systems
– 111,600 lines of code
– 9,300 lines of code per project average
– “C” programming language
• Experience
– 324 inspections conducted
– 4,860 defects identified
– 70% inspection efficiency achieved
– 29,160 inspection data points (in 7 years)

53
Applicon
• System
– Computer aided drafting (CAD) tools
– 25,920 lines of code
– 12,960 lines of code per year
– “C” programming language
• Experience
– 211 inspections conducted
– 3,857 defects identified
– 70% inspection efficiency achieved
– 18,990 inspection data points (in 2 years)

54
Lockheed Martin
• System
– Embedded, real-time system
– 2 million lines of code
– 200,000 lines of code added per year
– “C” programming language
• Experience
– 23 inspections conducted
– 324 defects identified
– 67% inspection efficiency achieved
– 2,070 inspection data points (in 1 year)
55
IBM Space Shuttle
• System
– Man-rated spacecraft avionics system
– 500,000 lines of code
– 25,467 lines of code added/maintained per year
– HAL-S programming language (custom)
• Experience
– 1,061 inspections conducted
– 36,672 defects identified
– 90% inspection efficiency achieved
– 95,490 inspection data points (in 15 years)

56
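A pattern worth noting in these case studies: the reported data-point counts work out to exactly 90 per inspection in every case (e.g., Bull HN's 667,170 / 7,413 = 90). The slides never state that factor; it is inferred from the figures above:

```python
# (inspections, data points) as reported in the case studies above
cases = {
    "Bull HN":           (7_413, 667_170),
    "Bell Northern":     (2_778, 250_020),
    "IBM AS/400":        (7_889, 710_010),
    "AT&T":              (324, 29_160),
    "Applicon":          (211, 18_990),
    "Lockheed Martin":   (23, 2_070),
    "IBM Space Shuttle": (1_061, 95_490),
}

for name, (inspections, points) in cases.items():
    print(f"{name}: {points / inspections:.0f} data points per inspection")
```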
Followup Questions
• Who conducted over 7,000 inspections ???
• What was the highest efficiency ???
• How many data points can be generated ???
• How many cases were maintenance ???
• What was the lowest efficiency ???
• What languages were inspected ???
• Do inspections apply to 4GLs ???

57
Metrics
What are the overall metrics ?
• Estimated versus actual effort and duration
– Estimated versus actual defects
• Major/minor defects per hour and inspection
• Defect types per inspection
– Inspection suppression and gain rate
– Participants per hour and inspection
– Subprocess intervals
• Planning-Overview-Preparation-Rework-Followup
• Overview-Preparation-Rework-Followup
• Preparation-Rework-Followup
• Rework-Followup
59
What are the planning metrics ?
• Effort and duration of planning subprocess
– Effort to verify entry criteria
• Number of products passing and failing entry criteria
• Number of passed and failed entry criteria
• Entry criteria validation rate
– Effort to select participants
– Effort to schedule inspection
– Effort to prepare notice

60
What are the overview metrics ?
• Effort and duration of overview subprocess
– Effort to facilitate overview
– Effort to introduce product
• Product presentation rate
• Number of product inquiries
• Number of inquiries handled
• Number of inquiries deferred
– Effort to assign roles
– Number of participants

61
What are the preparation metrics ?
• Effort and duration of preparation subprocess
– Effort to analyze specifications
– Effort to analyze checklists
– Effort to analyze defect history
• Number and type of estimated/expected defects
– Effort to analyze product
• Product analysis rate
– Effort to note potential defects
• Number of potential defects noted
– Number of participants
62
What are the meeting metrics ?
• Effort and duration of meeting subprocess
– Effort to facilitate meeting
– Effort to inspect product
• Product inspection rate
– Effort to review and summarize defects
– Number of major and minor defects
– Number of participants

63
What are the rework metrics ?
• Effort and duration of rework subprocess
– Effort to review defect list
– Effort to correct defects
– Number of major and minor defects corrected
– Effort to verify defect correction

64
What are the followup metrics ?
• Effort and duration of followup subprocess
– Effort to verify rework
– Effort to summarize inspection

65
How are metrics collected ?
• Software metrics plan
• Software quality plan
• Software project plan
• Inspection forms
• Spreadsheets
• Desktop databases
• Multi-user databases

66
Followup Questions
• What were the overall metrics ???
• What were the planning metrics ???
• What were the overview metrics ???
• What were the preparation metrics ???
• What were the meeting metrics ???
• What were the rework metrics ???
• What were the followup metrics ???

67
Deployment
What is a defect ?
• Nonconformance to requirements
• Deviation from specification
• Untestable requirement
• Abnormal condition
• Unmet standards
• Erroneous state
• Failure

69
What should be inspected ?
• Strategic enterprise artifacts
– Statement of work (SOW)
– Project plans
– Requirements
– Designs
– Code
– Tests
– Quality plans

70
What is the required training ?
• Executive overview
– Organizational, economic, and political impacts
• Introductory overview
– Function, uniqueness, and power of inspections
• Management overview
– Do’s, don’ts, and project planning
• Other (technical, metrics, and auditing)
– Mechanics, measurement, and enforcement

71
Who should be trained ?
• Executives
– Costs and benefits
• Managers
– Planning, estimating, and tracking inspections
• Engineers
– Mechanics, rules, and optimization
• Software process and quality analysts
– Deployment support, auditing, and analysis

72
Why is training required ?
• Train managers to
– Plan and manage projects using metrics
– Use metric data responsibly (not abuse staff)
• Certify moderators to
– Facilitate and maintain order
– Yield successful inspections
• Prepare inspectors to
– Identify defects quickly and efficiently

73
Why are moderators certified ?
• Keep managers out
• Planning and coordination
• Halt unproductive inspections
• Maintain non-threatening forum
• Yield optimal defect identification
• Ensure precision process execution
• Maintain independence and objectivity

74
Who enforces the process ?
• Inspectors
– Responsible for obeying rules
• Moderators
– Most effective defense (hence, “certified”)
• Project managers
– Responsible for intra-inspection monitoring
• Software process improvement and quality
– Responsible for optimization and conformance

75
Followup Questions
• What is a defect ???
• What should be inspected ???
• Who should be trained ???
• Why is training required ???
• What are moderators trained to do ???
• Who enforces the process ???
• Who performs inspections ???

76
Management
How is effort estimated ?

Substage effort weights (hours, from the bar chart): Planning 0.5, Overview 1.0, Preparation 1.0, Inspection Meeting 2.0, Rework 1.0, Followup 0.5

Hours = Product Size / (Inspection Rate * 2) * (Team Size * 4 + 1)

Total inspection effort in hours, by team size and inspection rate (SLOC per hour):

People | 10,000 Lines (60 / 120 / 180)  | 100,000 Lines (60 / 120 / 180)   | 1,000,000 Lines (60 / 120 / 180)
4      | 1,417 / 708 / 472              | 14,167 / 7,083 / 4,722           | 141,667 / 70,833 / 47,222
5      | 1,750 / 875 / 583              | 17,500 / 8,750 / 5,833           | 175,000 / 87,500 / 58,333
6      | 2,083 / 1,042 / 694            | 20,833 / 10,417 / 6,944          | 208,333 / 104,167 / 69,444
7      | 2,417 / 1,208 / 806            | 24,167 / 12,083 / 8,056          | 241,667 / 120,833 / 80,556

78
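The effort formula above can be checked directly against the table. A small sketch (the function and variable names are mine):

```python
def inspection_effort_hours(product_size: int, inspection_rate: int,
                            team_size: int) -> float:
    """Total inspection effort per the slide's formula:
    Hours = Size / (Rate * 2) * (Team Size * 4 + 1)."""
    return product_size / (inspection_rate * 2) * (team_size * 4 + 1)

# Reproduce a few cells of the effort table above
print(round(inspection_effort_hours(10_000, 60, 4)))     # 1417
print(round(inspection_effort_hours(100_000, 120, 5)))   # 8750
print(round(inspection_effort_hours(1_000_000, 180, 7))) # 80556
```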
When is it done ?

Month 1 Month 2 Month 3 Month 4 Month 5

I. Software Development

A. Analysis Phase

1. Requirements

2. Inspections

B. Design Phase

1. Design

2. Inspections

C. Code Phase

1. Code

2. Inspections

79
How often is it done ?

Estimated Source Lines of Code (SLOC) per Software Project

                        10,000 SLOC           100,000 SLOC             1,000,000 SLOC
SLOC/Hour               60 / 120 / 180        60 / 120 / 180           60 / 120 / 180
Total Hours, 4 People   1,417 / 708 / 472     14,167 / 7,083 / 4,722   141,667 / 70,833 / 47,222
Total Hours, 5 People   1,750 / 875 / 583     17,500 / 8,750 / 5,833   175,000 / 87,500 / 58,333
Total Hours, 6 People   2,083 / 1,042 / 694   20,833 / 10,417 / 6,944  208,333 / 104,167 / 69,444
Total Hours, 7 People   2,417 / 1,208 / 806   24,167 / 12,083 / 8,056  241,667 / 120,833 / 80,556
Inspection Duration     6 / 6 / 6             6 / 6 / 6                6 / 6 / 6
Number of Inspections   83 / 42 / 28          833 / 417 / 278          8,333 / 4,167 / 2,778
Duration in Hours       500 / 250 / 167       5,000 / 2,500 / 1,667    50,000 / 25,000 / 16,667
Duration in Days        63 / 31 / 21          625 / 313 / 208          6,250 / 3,125 / 2,083
Duration in Weeks       13 / 6 / 4            125 / 63 / 42            1,250 / 625 / 417
Duration in Months      3 / 1 / 1             29 / 14 / 10             288 / 144 / 96

80
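The calendar figures in the table follow from the same inputs: the number of inspections is Size / (Rate × 2), each inspection occupies 6 elapsed hours, and hours convert at 8 hours per day, 5 days per week, and about 21.7 working days per month. Those conversion factors are inferred from the table, not stated on the slide; a sketch:

```python
def inspection_schedule(product_size: int, inspection_rate: int) -> dict:
    """Calendar duration of inspections, reproducing the slide's table.
    Assumed factors (inferred from the table): 6 elapsed hours per
    inspection, 8-hour days, 5-day weeks, ~21.7 working days/month."""
    raw = product_size / (inspection_rate * 2)  # number of inspections
    hours = raw * 6                             # elapsed inspection hours
    days = int(hours / 8 + 0.5)                 # round half up, as the table does
    return {
        "inspections": round(raw),
        "hours": round(hours),
        "days": days,
        "weeks": int(days / 5 + 0.5),
        "months": int(days / 21.7 + 0.5),
    }

print(inspection_schedule(10_000, 60))
# {'inspections': 83, 'hours': 500, 'days': 63, 'weeks': 13, 'months': 3}
```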
How are defects measured ?

Source                   | Metric Name                            | Algorithm
IEEE                     | Defect Density                         | Defects / KSLOC
IBM (Michael Fagan)      | Defect Removal Effectiveness           | (Inspection Defects / Inserted Defects) x 100%
IBM (NASA Space Shuttle) | Early Detection Percentage             | (Major Inspection Defects / Inserted Defects) x 100%
Dunn                     | Effectiveness                          | (Defects / (Current Phase + Post Phase)) x 100%
Motorola                 | Total Defect Containment Effectiveness | Pre-Release Defects / (Pre-Release + Post-Release Defects)
Motorola                 | Phase Containment Effectiveness        | Phase Errors / (Phase Errors + Phase Defects)

81
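Two of the most widely used measures in the table translate directly into code. A sketch (function names are mine, not from any standard):

```python
def defect_density(defects: int, sloc: int) -> float:
    """IEEE defect density: defects per thousand source lines of code."""
    return defects / (sloc / 1000)

def defect_removal_effectiveness(inspection_defects: int,
                                 inserted_defects: int) -> float:
    """IBM (Fagan): percentage of inserted defects found by inspection."""
    return inspection_defects / inserted_defects * 100

print(defect_density(50, 10_000))             # 5.0 defects per KSLOC
print(defect_removal_effectiveness(80, 100))  # 80.0 percent
```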
How are defects estimated ?
• Basic (simple, but powerful)
– Observed defect density (immediate)
– Complete estimation of detection ratio
• Intermediate (popular)
– Partial estimation of detection ratio
(a.k.a. Capture-recapture models)
• Advanced (accurate and methodical)
– Rayleigh life cycle reliability models

82
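The capture-recapture models mentioned above estimate the total defect population from the overlap between two inspectors' independently prepared defect lists, by analogy with wildlife population counting (the Lincoln-Petersen estimator). A minimal sketch (names are mine):

```python
def capture_recapture_estimate(found_by_a: int, found_by_b: int,
                               found_by_both: int) -> float:
    """Lincoln-Petersen estimate of total defects in a work product:
    N = (n_a * n_b) / m, where m is the overlap between two inspectors."""
    if found_by_both == 0:
        raise ValueError("no overlap: estimate is unbounded")
    return found_by_a * found_by_b / found_by_both

# Inspector A finds 20 defects, B finds 25, and 10 are common to both:
print(capture_recapture_estimate(20, 25, 10))  # 50.0 estimated defects
```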
How are metrics analyzed ?

[Control chart: defects per inspection (inspections 1-7) plotted against the mean and the 1σ and 2σ control limits]

Control limits = mean ± k·σ, where:

mean = (Σ xi) / n
σ = sqrt( (Σ xi²) / n − ((Σ xi) / n)² )

83
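The control-chart analysis above reduces to computing the mean defect count and population standard deviation across inspections, then flagging any inspection outside the 2σ band. A sketch with illustrative counts:

```python
from statistics import mean, pstdev

defects_per_inspection = [62, 58, 71, 55, 95, 60, 64]  # illustrative counts

avg = mean(defects_per_inspection)
sigma = pstdev(defects_per_inspection)  # population std dev, as in the formula

# Inspections whose defect counts fall outside the 2-sigma control limits
outliers = [(i, d) for i, d in enumerate(defects_per_inspection, 1)
            if abs(d - avg) > 2 * sigma]
print(f"mean={avg:.1f}, sigma={sigma:.1f}, outside 2-sigma: {outliers}")
```

An inspection outside the limits (here, number 5 with 95 defects) signals an unusual work product or an out-of-control process step worth investigating.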
How are metrics applied ?

[Diagram: defect removal across the life cycle — analysis, preliminary design, detailed design, code, unit test, component test, system test, and field operation. Software inspections remove defects in the early phases and are 10-100x cheaper than software testing; software testing removes defects in the late phases at 10-100x the cost of inspections; residual defects are delivered to customers in field operation. Removing defects earlier drives defect reduction and software process improvement.]

84
Followup Questions
• What is inspection effort a factor of ???
• When are inspections performed ???
• Where are inspections scheduled ???
• How many inspections per work product ???
• Can defects be estimated ???
• Why should defects be analyzed ???
• Why should defects be eliminated early ???

85
Pitfalls
What are the confusing points ?
• What is a defect ?
• Why do inspections ?
• When do inspections occur ?
• Why not just have a meeting ?
• Who’s responsible for inspections ?
• Isn’t it better to use an independent group ?
• Shouldn’t quality assurance do inspections ?

87
Why aren’t inspections used ?
• Programming viewed as a trade
• Large body of amateur practitioners
• Not in computer science curriculum
• Good descriptive literature is lacking
• Inspections spread by word-of-mouth
• Many are highly critical of inspections
• Not in software engineering curriculum
• Benefits of inspections are known by a few
88
What are the common obstacles ?
• Engineers are only interested in design
• Using latest technologies is high priority
• Managers don’t understand their benefits
• Managers don’t perceive them as valuable
• Software quality/reliability is not a priority
• Managers won’t institutionalize inspections
• Engineers will not participate in inspections
• Winning contracts is the only success factor
89
What are the common myths ?
• Too expensive
• No government mandate
• Obsolete mainframe era technique
• Identical to structured walkthroughs
• No more effective than desk checking
• Equivalent of manufacturing inspections
• Not a verification and validation technique
• Not applicable to Internet age technologies
90
What are the common mistakes ?
• Attack the author
• Don’t take them seriously
• Inspect at a high rate of speed
• Inspect for longer than 2 hours
• Don’t prepare for the inspections
• Evaluate design and style alternatives
• Let the participants control the inspection

91
What are the pitfalls of metrics ?
• Used for personal attacks
• Collection is cumbersome
• Dissimilar data often compared
• Transcription errors are common
• Invalid data is also very common
• Data from a chaotic process isn’t as good
• Dissimilar circumstances often compared

92
What are the pitfalls of politics ?
• Programming is very competitive
• Inspections depend on heavy teamwork
• Cooperation and teamwork are uncommon
• Managers don’t want engineers to succeed
• Engineers don’t want managers to succeed
• Managers can use metrics for personal gain
• Engineers report incorrect data to managers

93
Followup Questions
• What are common points of confusion ???
• Is computer programming a trade ???
• Why don’t engineers use inspections ???
• Are inspections too expensive ???
• What is the most common mistake ???
• How are metrics commonly abused ???
• Does division of labor hinder inspections ???

94
Research
David F. Rico
• Research
– Costs, benefits, process modeling, and training
• Findings
– Cost estimation (only published models)
– Advanced cost and benefit evaluation method
– Return-on-investment (ROI) model
– Total life cycle cost analysis
• Contact
– http://davidfrico.com
96
University of Maryland
• Research
– Cost and benefits of inspection “variations”
• Findings
– New inspection metrics and models
– Groups no more effective than individuals
– Reading technologies are promising focus areas
• Contact
– http://www.cs.umd.edu

97
IBM
• Research
– Orthogonal defect classification (ODC)
• Findings
– Inspection defect type classification is subjective
– Designed objective defect classification method
– Defect type signatures identify process failures
• Contact
– http://chillarege.com

98
DACS
• Research
– Costs and benefits of inspections
• Findings
– Basic cost and benefit evaluation method
– Return-on-investment (ROI) model
– Total life cycle cost analysis
• Contact
– http://www.dacs.dtic.mil

99
Fraunhofer Gesellschaft
• Research
– Costs, benefits, and quality modeling
• Findings
– Cost and benefit evaluations
– Capture-recapture models (latest research)
– Inspection metrics and models
• Contact
– http://www.iese.fhg.de

100
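The capture-recapture models mentioned above estimate how many defects remain after an inspection by comparing the overlap between two inspectors' independent findings. A minimal sketch of the classic Lincoln-Petersen estimator (function name and sample data are illustrative, not from the source):

```python
def lincoln_petersen(found_a, found_b):
    """Estimate total defects in a work product from two inspectors'
    independent findings (Lincoln-Petersen capture-recapture)."""
    a, b = set(found_a), set(found_b)
    overlap = len(a & b)
    if overlap == 0:
        raise ValueError("no common defects; estimate is undefined")
    estimated_total = len(a) * len(b) / overlap
    estimated_remaining = estimated_total - len(a | b)
    return estimated_total, estimated_remaining

# Inspector A found defects 1-8, inspector B found 5-12 (4 in common):
total, remaining = lincoln_petersen(range(1, 9), range(5, 13))
# total = 8 * 8 / 4 = 16.0, so ~4 defects likely escaped both inspectors
```

The intuition: the more the two inspectors' findings overlap, the fewer undiscovered defects are likely left.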
University of Hawaii
• Research
– Inspection bibliographic studies
• Findings
– Extensive online annotated bibliography
• Contact
– http://www.ics.hawaii.edu

101
AT&T
• Research
– Costs and benefits of inspection “variations”
• Findings
– Good alternatives to Fagan inspections
– Flagship capture-recapture modeling research
– Testbed for University of Maryland
• Contact
– http://www.research.att.com

102
Followup Questions
• Who has the only published cost models ???
• What does University of Maryland claim ???
• What is orthogonal defect classification ???
• Does ITT do inspection research ???
• Who’s doing the latest research ???
• Who has an extensive bibliography ???
• What was AT&T known for ???

103
Conclusion
Perceived power of inspections
• Group synergy
• Ghost inspector
• Structured teamwork
• Focused human intelligence
• High defect removal efficiency
• Many eyes are better than two eyes
• Group review better than individual review

105
Real power of inspections
• Counting defects
• Identifying defects early
• Mandatory defect correction
• Defect classification & analysis
• Defect prevention (causal analysis)
• Defect modeling (however imperfect)
• Life cycle reliability modeling (Rayleigh)

[Figure: inspection shown within the life cycle phases Planning, Analysis, Design, Code, Compile, Test]
106
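The Rayleigh life cycle reliability modeling noted above assumes defect discovery rises, peaks mid-project, and tails off. A brief illustrative Putnam-style formulation (the parameter values are made up, not from the source):

```python
import math

def rayleigh_cumulative(total_defects, t_peak, t):
    """Cumulative defects expected to be discovered by time t, given a
    Rayleigh discovery curve that peaks at time t_peak."""
    return total_defects * (1 - math.exp(-(t / t_peak) ** 2 / 2))

# With 100 latent defects and discovery peaking in phase 3 of 6:
by_phase = [round(rayleigh_cumulative(100, 3, t), 1) for t in range(1, 7)]
# Discovery accelerates through phase 3, then flattens; defects not
# found by the final phase are the ones that escape to the field.
```

Fitting such a curve to per-phase inspection counts is what lets managers estimate the defects still latent in the product.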
Inspections and management
• Inspections enable managers to
– Estimate defects
– Plan defect removal
– Track defect removal
– Track total life cycle costs
– Quantitatively track progress
– Track return-on-investment (ROI)
– Learn how to manage projects using metrics

107
Inspections and engineering
• Inspections enable engineers to
– Learn how to apply metrics
– Assume responsibility for quality
– Assume responsibility for reliability
– Perform proactive quality engineering
– Perform proactive reliability engineering
– Build trust in the software life cycle process
– Gain respect for software engineering discipline

108
Inspections and quality assurance
• Inspections enable quality assurance to
– Focus on process automation
– Focus on process simplification
– Focus on quality and reliability modeling
– Focus on defect analysis and classification
– Assume their rightful process analysis role
– Focus on root cause analysis and prevention
– Return responsibility for quality to engineering

109
Inspections and testing
• Inspections are
– 10x cheaper than testing
– 100x cheaper than maintenance
• Inspections enable managers to
– Reduce the cost of testing
– Reduce total life cycle costs
– Reduce software maintenance costs
– Quantify and manage the costs of testing

110
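The 10x/100x cost ratios above translate directly into a return-on-investment figure. A hypothetical back-of-the-envelope sketch (all cost figures are made up for illustration):

```python
def inspection_roi(defects_removed, fix_cost_inspection, fix_cost_test):
    """ROI of removing defects during inspection instead of during test:
    (cost avoided - cost spent) / cost spent.  Inputs are hypothetical."""
    spent = defects_removed * fix_cost_inspection
    avoided = defects_removed * fix_cost_test
    return (avoided - spent) / spent

# If a test-phase fix costs 10x an inspection-phase fix (per the slide):
roi = inspection_roi(defects_removed=50, fix_cost_inspection=100,
                     fix_cost_test=1000)
# roi = (50000 - 5000) / 5000 = 9.0, i.e. a 900% return
```

With the 100x maintenance-phase ratio instead, the same arithmetic yields a 99-fold return, which is why early removal dominates total life cycle cost.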
Inspections and the CMM
• Inspections embody key CMM principles
– Software Project Planning
– Software Project Tracking & Oversight
– Peer Reviews
– Software Quality Management
– Quantitative Process Management
– Defect Prevention
– Technology Change Management
– Process Change Management
111
Followup Questions
• What is the perceived power ???
• What is the real power ???
• How do managers use inspections ???
• How do engineers use inspections ???
• How does SQA use inspections ???
• What do inspections mean to testing ???
• What do inspections mean to CMM ???

112
Resources
Books
• “Software Inspection Process”
– Robert G. Ebenau
– Susan H. Strauss
– McGraw-Hill (1993)

• “Software Inspection”
– Tom Gilb
– Dorothy Graham
– Addison Wesley (1993)

114
Industry case studies
• “Lessons from Three Years of Inspection Data”
– Edward F. Weller
– IEEE Software (September 1993)

• “Experience with Inspection in Ultralarge-Scale Developments”
– Glen W. Russell
– IEEE Software (January 1991)

115
Tools
• SDT ReviewPro
– http://www.sdtcorp.com/reviewpr.htm

• SyberNet CheckMate
– http://www.sybernet.ie/source/checkmate.htm

• StrathClyde ASSIST
– http://www.cs.strath.ac.uk/research/efocs/assist.html

116
Training
• Michael Fagan Associates
– http://www.mfagan.com
• Grove Consultants
– http://www.grove.co.uk/Inspection_Courses.html
• Don O'Neill
– http://hometown.aol.com/ONeillDon
• Tom Gilb
– http://www.result-planning.com

117
Websites
• David F. Rico
– http://davidfrico.com

• University of Hawaii
– http://www2.ics.hawaii.edu/%7Ejohnson/FTR/

• DoD Data & Analysis Center for Software
– http://www.dacs.dtic.mil/databases/url/key.hts?keycode=165

118
Scientific research

• University of Maryland
– http://www.cs.umd.edu

• Fraunhofer Gesellschaft
– http://www.iese.fhg.de

119
Standards
• IEEE Standard for Software Reviews and Audits
– http://www1.fatbrain.com/asp/bookinfo/bookinfo.asp?theisbn=999178165X

• NASA Software Formal Inspections Standard
– http://satc.gsfc.nasa.gov/Documents/fi/std/fistdtxt.txt

• NASA Software Formal Inspections Guidebook
– http://satc.gsfc.nasa.gov/fi/gdb/fitext.txt

120

You might also like