
Unit 5

Software Quality Assurance


* The ability to apply methodologies and procedures of the
highest professional level.
* Better mutual understanding and coordination among
development teams but especially between development and
maintenance teams.
* Greater cooperation between the software developer and
external participants in the project.
* Better understanding and cooperation between suppliers and
customers, based on the adoption of standards as part of the
contract.
Quality management standards vs. project process standards:
* The target unit. Quality management standards: management of software development and/or maintenance and the specific SQA units. Project process standards: a software development and/or maintenance project team.
* The main focus. Quality management standards: organization of SQA systems, infrastructure and requirements. Project process standards: methodologies for carrying out software development and maintenance projects.
* The standard's objective. Quality management standards: "what" to achieve. Project process standards: "how" to perform.
* The standard's goal. Quality management standards: assuring the supplier's software quality and assessing its software process capability. Project process standards: assuring the quality of a specific software project's products.
Most prominent developers of SQA standards:
<> IEEE (Institute of Electrical and Electronics Engineers) Computer Society
<> ISO (International Organization for Standardization)
<> DOD (US Department of Defense)
<> ANSI (American National Standards Institute)
<> IEC (International Electrotechnical Commission)
<> EIA (Electronic Industries Association)
* Enable a software development organization to demonstrate
consistent ability to assure acceptable quality of its software
products or maintenance services. Certification is granted by an
external body.
* Serve as an agreed-upon basis for customer and supplier
evaluation of the supplier’s quality management system.
Accomplished by performance of a quality audit by the
customer.
* Support the organization's efforts to improve its quality
management system through compliance with the standard’s
requirements.
* Serve organizations as a tool for self-assessment of their
ability to carry out software development projects.
* Serve for improvement of development and maintenance
processes by application of the standard's directions.
* Help purchasing organizations determine the capabilities
of potential suppliers.
* Guide the training of assessors by delineating qualifications
and training program curricula.
Capability Maturity Models – CMM and CMMI assessment methodology
• Carnegie Mellon University's Software Engineering Institute (SEI) took
the initial steps toward development of what is termed a capability
maturity model (CMM) in 1986.
Process Maturity models of the SEI
• The principles of CMM
• CMM assessment is based on the following concepts and principles:
• ■ Application of more elaborate management methods based on quantitative approaches increases the organization's capability to control the quality and improve the productivity of the software development process.
• ■ The vehicle for enhancement of software development is the five-level capability maturity model. The model enables an organization to evaluate its achievements and determine the efforts needed to reach the next capability level by locating the process areas requiring improvement.
• ■ Process areas are generic; they define the "what", not the "how". This approach enables the model to be applied to a wide range of implementation organizations because:
• – It allows use of any life cycle model
• – It allows use of any design methodology, software development tool and programming language
• – It does not specify any particular documentation standard.
The evolution of CMM
• After 1993, the SEI expanded the original Software Development and Maintenance Capability Maturity Model (SW-CMM) through diversification.
• System Engineering CMM (SE-CMM) focuses on system engineering practices related to product-oriented customer requirements. It deals with product development: analysis of requirements, design of product systems, management and coordination of the product systems and their integration. In addition, it deals with the production of the developed product: planning production lines and their operation.
The evolution of CMM
• Trusted CMM (T-CMM) was developed to serve sensitive and classified
software systems that require enhanced software quality assurance.
• ■ System Security Engineering CMM (SSE-CMM) focuses on security
aspects of software engineering and deals with secured product
development processes, including security of development team members.
• ■ People CMM (P-CMM) deals with human resource development in
software organizations: improvement of professional capacities,
motivation, organizational structure, etc.
• ■ Software Acquisition CMM (SA-CMM) focuses on special aspects of
software acquisition by treating issues – contract tracking, acquisition risk
management, quantitative acquisition management, contract performance
management, etc. – that touch on software purchased from external
organizations.
The evolution of CMM
• Integrated Product Development CMM (IPD-CMM) serves as a
framework for integrating the development efforts that every
department invests in each aspect of the product throughout the
product life cycle.
• CMMI (Capability Maturity Model Integration)
CMM Model Levels and Key process areas
What is CMMI?
• CMMI (Capability Maturity Model Integration) is a proven
industry framework to improve product quality and development
efficiency for both hardware and software
• Sponsored by the US Department of Defense in cooperation
with Carnegie Mellon University and the Software
Engineering Institute (SEI)
• Many companies, such as Motorola and Ericsson, have been
involved in defining CMMI
• CMMI has been established as a model to improve
business results
• CMMI, staged, uses 5 levels to describe the maturity of the
organization, same as predecessor CMM
• Vastly improved version of the CMM
• Emphasis on business needs, integration and
institutionalization
CMMI Staged Representation - 5 Maturity Levels
(Process maturity increases from Level 1 up to Level 5.)

Level 5 – Optimizing: Process performance is continually improved through incremental and innovative technological improvements.
Level 4 – Quantitatively Managed: Processes are controlled using statistical and other quantitative techniques.
Level 3 – Defined: Processes are well characterized and understood. Processes, standards, procedures, tools, etc. are defined at the organizational (Organization X) level. Proactive.
Level 2 – Managed: Processes are planned, documented, performed, monitored, and controlled at the project level. Often reactive.
Level 1 – Initial: Processes are unpredictable, poorly controlled, reactive.
ISO 9000 family
The eight quality management principles:
• Customer focus
• Leadership
• Involvement of people
• Process approach
• System approach to management
• Continual improvement
• Factual approach to decision making
• Mutually beneficial supplier relationships
• The structure and content of IEEE software
engineering standards
• IEEE/EIA Std. 12207 – Software life cycle
processes
• IEEE Std. 1012 - verification and validation
• IEEE Std. 1028 - reviews
• A. Conceptual standards. Guiding principles and overall
approach
* IEEE 1061 – Software Quality Metrics Methodology
* IEEE/EIA 12207.0 — Information Technology Software
Life Cycle Processes
• B. Prescriptive standards of conformance. Requirements to
which a software developer must conform.
* IEEE 829 — Software Test Documentation
* IEEE 1012 – Software Verification And Validation
* IEEE 1028 – Software Reviews
• C. Guidance standards. Implementation of class B
standards.
* IEEE 1233 – Guide for Developing System Requirement
Specifications
* IEEE/EIA 12207.1 – Guide, Information technology – Software Life
Cycle Processes – Life Cycle Data
<> To establish an internationally recognized
model of common software life cycle processes
that can be referenced by the software industry
worldwide.
<> To promote understanding among business
parties by application of commonly recognized
processes, activities and tasks.
Source: IEEE (1992). From IEEE Std 1045-1992. Copyright 1992 IEEE. All rights reserved.
General concepts
• Applicability of the standard in general and its adaptation by tailoring
• Applicability for all participants in the software life cycle
• Flexibility and responsiveness to technological change
• Software links to the system
• TQM consistency
• No certification requirements
• Baselining

Task-related concepts
• Responsibility for activities and tasks
• Modularity of components of software life cycle
• Levels of required conformance
• Nature of evaluation task
* Establish a common framework for V&V activities
and tasks for all software life cycle processes.
* Define V&V requirements, including their inputs
and outputs.
* Define software integrity levels and the
appropriate V&V tasks.
* Define the content of a SVVP (software V&V Plan)
document.
• Broad definition of V&V activities
• Software integrity levels and their V&V requirements
• Prescriptive requirements
* Detailed description of the performance methodology.
* Required inputs.
* Required outputs.
* Definition of integrity levels for which performance of the task
is not mandatory.
* Optional V&V tasks to be performed during selected life cycle
processes.

• Independence of V&V activities


• Compliance and compatibility with international standards
• Special characteristics of reusable software V&V
• Application of V&V metrics
• Quantitative criteria for V&V tasks
(1) Management
(2) Acquisition
(3) Supply
(4) Development
(5) Operation
(6) Maintenance

A three-level tree architecture:
- Processes (each includes 1-6 activities)
- Activities (each includes 3-10 tasks)
- Tasks
• Management reviews
• Technical reviews (in the book "formal design reviews")
• Inspections
• Walkthroughs
• Audits
To define systematic review procedures
that are:
* Applicable for reviews performed
throughout the software life cycle
* Conformant with the review requirements
defined by other standards
• High formality
• Follow-up of corrections
• Compliance with international and IEEE
standards
(1) Introduction
(2) Responsibilities: the responsibilities of the participants in the review.
(3) Input: mandatory and optional data items.
(4) Entry criteria. Common criteria: (a) a statement of the review's objectives; (b) availability of the required input data.
(5) Procedure: required to include management preparations, planning, team preparation, examination of the products, and follow-up of corrections.
(6) Exit criteria: what must be accomplished before the review can be concluded.
(7) Output items
(8) Data collection recommendations: to be used to study the effectiveness and efficiency of current practices.
(9) Improvements: formulate improved procedures, checklists and development processes.
• Objectives of quality measurement
• Classification of software quality metrics
• Process metrics
• Product metrics
• Implementation of software quality metrics
• Limitations of software metrics
• The function point method
(1) A quantitative measure of the degree to which
an item possesses a given quality attribute.
(2) A function whose inputs are software data and
whose output is a single numerical value that can
be interpreted as the degree to which the
software possesses a given quality attribute.
1. Facilitate management control, planning and
managerial intervention.
Based on:
· Deviations of actual from planned performance.
· Deviations of actual timetable and budget
performance from planned.
2. Identify situations for development or maintenance
process improvement (preventive or corrective
actions). Based on:
· Accumulation of metrics information regarding the
performance of teams, units, etc.
General requirements
• Relevant
• Valid
• Reliable
• Comprehensive
• Mutually exclusive
Operative requirements
• Easy and simple
• Does not require independent data collection
• Immune to biased interventions by interested parties
Classification by phases of the software system
• Process metrics – metrics related to the software
development process
• Product metrics – metrics related to software maintenance
Classification by subjects of measurement
• Quality
• Timetable
• Effectiveness (of error removal and maintenance services)
• Productivity
• KLOC — classic metric that measures the size of
software by thousands of code lines.
• Number of function points (NFP) — a measure of
the development resources (human resources)
required to develop a program, based on the
functionality specified for the software system.
Calculation of NCE and WCE

Error severity class | Number of errors (b) | Relative weight (c) | Weighted errors (d = b x c)
low severity | 42 | 1 | 42
medium severity | 17 | 3 | 51
high severity | 11 | 9 | 99
Total | 70 | --- | 192

NCE = 70 (simple count); WCE = 192 (weighted count).
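This calculation is easy to script. Below is a minimal Python sketch of the table, assuming the severity weights 1/3/9 used in this example (the weights are organization-specific choices, not fixed by any standard):

```python
# Simple count (NCE) vs. severity-weighted count (WCE) of detected errors.
# Weights below are the ones used in the example table, not a fixed standard.
ERROR_WEIGHTS = {"low": 1, "medium": 3, "high": 9}

def count_errors(errors_by_severity):
    """Return (NCE, WCE) for a dict mapping severity class -> number of errors."""
    nce = sum(errors_by_severity.values())            # simple error count
    wce = sum(ERROR_WEIGHTS[sev] * n                  # weighted error count
              for sev, n in errors_by_severity.items())
    return nce, wce

nce, wce = count_errors({"low": 42, "medium": 17, "high": 11})
print(nce, wce)  # 70 192, matching the table above
```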


Process metrics categories
• Software process quality metrics
  • Error density metrics
  • Error severity metrics
• Software process timetable metrics
• Software process error removal effectiveness metrics
• Software process productivity metrics
Code | Name | Calculation formula
CED | Code Error Density | CED = NCE / KLOC
DED | Development Error Density | DED = NDE / KLOC
WCED | Weighted Code Error Density | WCED = WCE / KLOC
WDED | Weighted Development Error Density | WDED = WDE / KLOC
WCEF | Weighted Code Errors per Function Point | WCEF = WCE / NFP
WDEF | Weighted Development Errors per Function Point | WDEF = WDE / NFP

NCE = number of code errors detected by code inspections and testing.
NDE = total number of development (design and code) errors detected in the development process.
WCE = weighted total of code errors detected by code inspections and testing.
WDE = weighted total of development (design and code) errors detected in the development process.
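As a sketch of how these six metrics relate, the Python function below divides each error count by the appropriate size measure. Names mirror the table; NCE = 70 and WCE = 192 come from the earlier example, while the remaining input values are hypothetical:

```python
def density_metrics(nce, nde, wce, wde, kloc, nfp):
    """Error density metrics from the table above; kloc and nfp must be non-zero."""
    return {
        "CED":  nce / kloc,   # code error density
        "DED":  nde / kloc,   # development error density
        "WCED": wce / kloc,   # weighted code error density
        "WDED": wde / kloc,   # weighted development error density
        "WCEF": wce / nfp,    # weighted code errors per function point
        "WDEF": wde / nfp,    # weighted development errors per function point
    }

# NCE/WCE from the earlier table; NDE, WDE, KLOC and NFP are assumed values.
print(density_metrics(nce=70, nde=120, wce=192, wde=300, kloc=40, nfp=120))
```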
Code | Name | Calculation formula
ASCE | Average Severity of Code Errors | ASCE = WCE / NCE
ASDE | Average Severity of Development Errors | ASDE = WDE / NDE

NCE = number of code errors detected by code inspections and testing.
NDE = total number of development (design and code) errors detected in the development process.
WCE = weighted total of code errors detected by code inspections and testing.
WDE = weighted total of development (design and code) errors detected in the development process.
Code | Name | Calculation formula
TTO | Time Table Observance | TTO = MSOT / MS
ADMC | Average Delay of Milestone Completion | ADMC = TCDAM / MS

MSOT = milestones completed on time.
MS = total number of milestones.
TCDAM = total completion delays (days, weeks, etc.) for all milestones.
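A minimal sketch of both timetable metrics, using a hypothetical project with 10 milestones:

```python
def timetable_metrics(msot, ms, tcdam):
    """TTO and ADMC per the table above (ms must be non-zero)."""
    return {
        "TTO":  msot / ms,    # fraction of milestones completed on time
        "ADMC": tcdam / ms,   # average delay (in the chosen time unit) per milestone
    }

# Hypothetical: 10 milestones, 7 completed on time, 15 days of delay in total.
print(timetable_metrics(msot=7, ms=10, tcdam=15))  # TTO = 0.7, ADMC = 1.5
```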
Code | Name | Calculation formula
DERE | Development Errors Removal Effectiveness | DERE = NDE / (NDE + NYF)
DWERE | Development Weighted Errors Removal Effectiveness | DWERE = WDE / (WDE + WYF)

NDE = total number of development (design and code) errors detected in the development process.
WDE = weighted total of development (design and code) errors detected in the development process.
NYF = number of software failures detected during a year of maintenance service.
WYF = weighted number of software failures detected during a year of maintenance service.
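Both effectiveness metrics measure the share of errors caught before release, using first-year failures as a proxy for escaped errors. A short sketch with hypothetical values:

```python
def removal_effectiveness(nde, nyf, wde, wyf):
    """DERE and DWERE: share of errors removed during development."""
    return {
        "DERE":  nde / (nde + nyf),   # unweighted removal effectiveness
        "DWERE": wde / (wde + wyf),   # severity-weighted removal effectiveness
    }

# Hypothetical: 120 development errors caught, 30 failures in the first service year.
print(removal_effectiveness(nde=120, nyf=30, wde=300, wyf=90))
# DERE = 0.8, DWERE ~= 0.769
```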
Code | Name | Calculation formula
DevP | Development Productivity | DevP = DevH / KLOC
FDevP | Function Point Development Productivity | FDevP = DevH / NFP
CRe | Code Reuse | CRe = ReKLOC / KLOC
DocRe | Documentation Reuse | DocRe = ReDoc / NDoc

DevH = total working hours invested in the development of the software system.
ReKLOC = number of thousands of reused lines of code.
ReDoc = number of reused pages of documentation.
NDoc = number of pages of documentation.
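A sketch combining the productivity and reuse metrics, again with purely hypothetical inputs:

```python
def productivity_metrics(dev_h, kloc, nfp, re_kloc, re_doc, n_doc):
    """Productivity and reuse metrics per the table above."""
    return {
        "DevP":  dev_h / kloc,     # working hours per KLOC
        "FDevP": dev_h / nfp,      # working hours per function point
        "CRe":   re_kloc / kloc,   # fraction of code that is reused
        "DocRe": re_doc / n_doc,   # fraction of documentation that is reused
    }

# Hypothetical: 8,000 hours, 40 KLOC (12 reused), 120 FP, 80 of 400 doc pages reused.
print(productivity_metrics(dev_h=8000, kloc=40, nfp=120,
                           re_kloc=12, re_doc=80, n_doc=400))
```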
* HD quality metrics:
* HD calls density metrics - measured by the number of calls.
* HD calls severity metrics - the severity of the HD issues raised.
* HD success metrics – the level of success in responding to HD
calls.
* HD productivity metrics.
* HD effectiveness metrics.
* Corrective maintenance quality metrics.
* Software system failures density metrics
* Software system failures severity metrics
* Failures of maintenance services metrics
* Software system availability metrics
* Corrective maintenance productivity and effectiveness metrics.
Code | Name | Calculation formula
HDD | HD calls density | HDD = NHYC / KLMC
WHDD | Weighted HD calls density | WHDD = WHYC / KLMC
WHDF | Weighted HD calls per function point | WHDF = WHYC / NMFP

NHYC = number of HD calls during a year of service.
KLMC = thousands of lines of maintained software code.
WHYC = weighted number of HD calls received during one year of service.
NMFP = number of function points to be maintained.
Code | Name | Calculation formula
ASHC | Average severity of HD calls | ASHC = WHYC / NHYC

NHYC = number of HD calls during a year of service.
WHYC = weighted number of HD calls received during one year of service.
Code | Name | Calculation formula
HDS | HD service success | HDS = NHYOT / NHYC

NHYOT = number of HD calls completed on time during one year of service.
NHYC = number of HD calls during a year of service.
Code | Name | Calculation formula
HDP | HD Productivity | HDP = HDYH / KLMC
FHDP | Function Point HD Productivity | FHDP = HDYH / NMFP
HDE | HD Effectiveness | HDE = HDYH / NHYC

HDYH = total yearly working hours invested in HD servicing of the software system.
KLMC = thousands of lines of maintained software code.
NMFP = number of function points to be maintained.
NHYC = number of HD calls during a year of service.
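The help desk metrics from the last few tables share the same inputs, so they fit naturally in one function. A consolidated sketch with hypothetical yearly figures:

```python
def hd_metrics(nhyc, whyc, nhyot, hdyh, klmc, nmfp):
    """Help desk (HD) density, severity, success, productivity and effectiveness."""
    return {
        "HDD":  nhyc / klmc,    # calls per thousand lines of maintained code
        "WHDD": whyc / klmc,    # weighted calls per KLMC
        "WHDF": whyc / nmfp,    # weighted calls per maintained function point
        "ASHC": whyc / nhyc,    # average severity of a call
        "HDS":  nhyot / nhyc,   # share of calls completed on time
        "HDP":  hdyh / klmc,    # HD hours per KLMC
        "FHDP": hdyh / nmfp,    # HD hours per maintained function point
        "HDE":  hdyh / nhyc,    # HD hours per call
    }

# Hypothetical yearly figures for a 250 KLMC / 1,200 FP system:
print(hd_metrics(nhyc=600, whyc=1500, nhyot=540, hdyh=900, klmc=250, nmfp=1200))
```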
Code | Name | Calculation formula
SSFD | Software System Failure Density | SSFD = NYF / KLMC
WSSFD | Weighted Software System Failure Density | WSSFD = WYF / KLMC
WSSFF | Weighted Software System Failures per Function Point | WSSFF = WYF / NMFP

NYF = number of software failures detected during a year of maintenance service.
WYF = weighted number of software failures detected during a year of maintenance service.
NMFP = number of function points designated for the maintained software.
KLMC = thousands of lines of maintained software code.
Code | Name | Calculation formula
ASSSF | Average Severity of Software System Failures | ASSSF = WYF / NYF

NYF = number of software failures detected during a year of maintenance service.
WYF = weighted number of software failures detected during a year of maintenance service.
Code | Name | Calculation formula
MRepF | Maintenance Repeated repair Failure metric | MRepF = RepYF / NYF

NYF = number of software failures detected during a year of maintenance service.
RepYF = number of repeated software failure calls (service failures).
Code | Name | Calculation formula
FA | Full Availability | FA = (NYSerH - NYFH) / NYSerH
VitA | Vital Availability | VitA = (NYSerH - NYVitFH) / NYSerH
TUA | Total Unavailability | TUA = NYTFH / NYSerH

NYSerH = number of hours the software system is in service during one year.
NYFH = number of hours in which at least one function is unavailable (failed) during one year, including total failure of the software system.
NYVitFH = number of hours in which at least one vital function is unavailable (failed) during one year, including total failure of the software system.
NYTFH = number of hours of total failure (all system functions failed) during one year.
Note that NYFH ≥ NYVitFH ≥ NYTFH, and therefore 1 – TUA ≥ VitA ≥ FA.
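Because the failure-hour counts are nested, the two inequalities above make good sanity checks on the inputs. A sketch with hypothetical hours:

```python
def availability_metrics(ny_ser_h, ny_f_h, ny_vit_f_h, ny_t_f_h):
    """FA, VitA and TUA, with the consistency checks noted above."""
    assert ny_f_h >= ny_vit_f_h >= ny_t_f_h           # NYFH >= NYVitFH >= NYTFH
    fa   = (ny_ser_h - ny_f_h) / ny_ser_h             # full availability
    vita = (ny_ser_h - ny_vit_f_h) / ny_ser_h         # vital availability
    tua  = ny_t_f_h / ny_ser_h                        # total unavailability
    assert 1 - tua >= vita >= fa                      # 1 - TUA >= VitA >= FA
    return {"FA": fa, "VitA": vita, "TUA": tua}

# Hypothetical: 5,000 service hours; 100 h with any failure, 40 h with a vital
# function down, 10 h of total failure.
print(availability_metrics(5000, 100, 40, 10))
# FA = 0.98, VitA = 0.992, TUA = 0.002
```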
Code | Name | Calculation formula
CMaiP | Corrective Maintenance Productivity | CMaiP = CMaiYH / KLMC
FCMP | Function Point Corrective Maintenance Productivity | FCMP = CMaiYH / NMFP
CMaiE | Corrective Maintenance Effectiveness | CMaiE = CMaiYH / NYF

CMaiYH = total yearly working hours invested in the corrective maintenance of the software system.
NYF = number of software failures detected during a year of maintenance service.
NMFP = number of function points designated for the maintained software.
KLMC = thousands of lines of maintained software code.
* Budget constraints in allocating the necessary
resources.
* Human factors, especially opposition of
employees to evaluation of their activities.
* Uncertainty regarding the data's validity,
stemming from partial and biased reporting.
* Parameters used in development process
metrics:
KLOC, NDE, NCE.
* Parameters used in product (maintenance)
metrics:
KLMC, NHYC, NYF.
a. Programming style (KLOC).
b. Volume of documentation comments (KLOC).
c. Software complexity (KLOC, NCE).
d. Percentage of reused code (NDE, NCE).
e. Professionalism and thoroughness of design review and software
testing teams: affects the number of defects detected (NCE).
f. Reporting style of the review and testing results: concise reports vs.
comprehensive reports (NDE, NCE).
a. Quality of the installed software and its documentation (NYF, NHYC).
b. Programming style and volume of documentation comments
included in the code to be maintained (KLMC).
c. Software complexity (NYF).
d. Percentage of reused code (NYF).
e. Number of installations, size of the user population and level of
applications in use (NHYC, NYF).
The function point method
The function point estimation process:
• Stage 1: Compute crude function points (CFP).
• Stage 2: Compute the relative complexity
adjustment factor (RCAF) for the project. RCAF
varies between 0 and 70.
• Stage 3: Compute the number of function points
(FP):
FP = CFP x (0.65 + 0.01 x RCAF)
Software system components | Weight factor: simple | average | complex
User inputs | 3 | 4 | 6
User outputs | 4 | 5 | 7
User online queries | 3 | 4 | 6
Logical files | 7 | 10 | 15
External interfaces | 5 | 7 | 10

For each component type, points = count x weight factor (computed separately for simple, average and complex components); the total CFP is the sum of all points.
No Subject Grade
1 Requirement for reliable backup and recovery 0 1 2 3 4 5
2 Requirement for data communication 0 1 2 3 4 5
3 Extent of distributed processing 0 1 2 3 4 5
4 Performance requirements 0 1 2 3 4 5
5 Expected operational environment 0 1 2 3 4 5
6 Extent of online data entries 0 1 2 3 4 5
7 Extent of multi-screen or multi-operation online data input 0 1 2 3 4 5
8 Extent of online updating of master files 0 1 2 3 4 5
9 Extent of complex inputs, outputs, online queries and files 0 1 2 3 4 5
10 Extent of complex data processing 0 1 2 3 4 5
11 Extent that currently developed code can be designed for reuse 0 1 2 3 4 5
12 Extent of conversion and installation included in the design 0 1 2 3 4 5
13 Extent of multiple installations in an organization and variety of customer 0 1 2 3 4 5
organizations
14 Extent of change and focus on ease of use 0 1 2 3 4 5
Total = RCAF
Software system components | Simple (count x weight = points) | Average | Complex | Total CFP
User inputs | 1 x 3 = 3 | --- | 1 x 6 = 6 | 9
User outputs | --- | 2 x 5 = 10 | 1 x 7 = 7 | 17
User online queries | 1 x 3 = 3 | 1 x 4 = 4 | 1 x 6 = 6 | 13
Logical files | 1 x 7 = 7 | --- | 1 x 15 = 15 | 22
External interfaces | --- | --- | 2 x 10 = 20 | 20
Total CFP | | | | 81
RCAF evaluation for the ATTEND MASTER project: each of the 14 subjects listed above is graded on the 0-5 scale (the individual grades are not reproduced here).
Total RCAF = 41
The ATTEND MASTER – function points calculation

FP = CFP x (0.65 + 0.01 x RCAF)
FP = 81 x (0.65 + 0.01 x 41) = 85.86
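The three stages collapse into a few lines of code. A minimal Python sketch using the weight factors from the CFP table and the ATTEND MASTER counts above:

```python
# Weight factors from the CFP table (simple, average, complex).
WEIGHTS = {
    "user inputs":         (3, 4, 6),
    "user outputs":        (4, 5, 7),
    "user online queries": (3, 4, 6),
    "logical files":       (7, 10, 15),
    "external interfaces": (5, 7, 10),
}

def function_points(counts, rcaf):
    """counts maps component type -> (simple, average, complex) component counts."""
    cfp = sum(c * w                                   # Stage 1: crude function points
              for comp, triple in counts.items()
              for c, w in zip(triple, WEIGHTS[comp]))
    fp = cfp * (0.65 + 0.01 * rcaf)                   # Stage 3: FP = CFP x (0.65 + 0.01 x RCAF)
    return cfp, fp

# The ATTEND MASTER example (RCAF = 41 from Stage 2):
counts = {
    "user inputs":         (1, 0, 1),
    "user outputs":        (0, 2, 1),
    "user online queries": (1, 1, 1),
    "logical files":       (1, 0, 1),
    "external interfaces": (0, 0, 2),
}
cfp, fp = function_points(counts, rcaf=41)
print(cfp, fp)  # 81 85.86, matching the calculation above
```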


Main advantages
• Estimates can be prepared at the pre-project stage.
• Because it is based on requirement specification documents (not on a
specific development tool or programming language), the method's
reliability is relatively high.
Main disadvantages
• FP results depend on the counting instruction manual.
• Estimates are based on detailed requirements specifications, which are
not always available.
• The entire process requires an experienced function point team and
substantial resources.
• The required evaluations involve subjective judgment.
• Successful applications are mostly related to data processing; the
method cannot yet be universally applied.
SQAP (Software Quality Assurance Plan)
Thank You
