SPM Unit-1

UNIT-1

Conventional software management


Conventional software management practices are sound in theory, but practice is still tied to archaic (outdated) technology and techniques. Conventional software economics provides a benchmark of performance for conventional software management principles.
The best thing about software is its flexibility: it can be programmed to do almost anything. The worst thing about software is also its flexibility: the "almost anything" characteristic has made it difficult to plan, monitor, and control software development. Three important analyses of the state of the software engineering industry are:
1. Software development is still highly unpredictable. Only about 10% of software projects are delivered successfully within initial budget and schedule estimates.
2. Management discipline is more of a discriminator in success or failure than are technology advances.
3. The level of software scrap and rework is indicative of an immature process.
All three analyses reached the same general conclusion: the success rate for software projects is very low. The three analyses provide a good introduction to the magnitude of the software problem and the current norms for conventional software management performance.
Waterfall Model
The Waterfall Model was the first process model to be introduced and is also referred to as a linear-sequential life cycle model. It is very simple to understand and use. In a Waterfall model, each phase must be completed before the next phase can begin; there is no overlapping of phases. The Waterfall Model is the earliest SDLC approach used for software development. It represents the software development process as a linear sequential flow: any phase in the development process begins only when the previous phase is complete.
Waterfall Model - Design
The Waterfall approach was the first SDLC model to be widely used in software engineering to ensure the success of a project. In the Waterfall approach, the whole process of software development is divided into separate phases; typically, the outcome of one phase acts as the input for the next phase in sequence.

The sequential phases in the Waterfall model are −


• Requirement Gathering and Analysis − All possible requirements of the system to be developed are captured in this phase and documented in a requirement specification document.
• System Design − The requirement specifications from the first phase are studied in this phase and the system design is prepared. This design helps in specifying hardware and system requirements and in defining the overall system architecture.
• Implementation − With inputs from the system design, the system is first developed in small programs called units, which are integrated in the next phase. Each unit is developed and tested for its functionality; this is referred to as unit testing.
• Integration and Testing − All the units developed in the implementation phase are integrated into a system after each unit has been tested. After integration, the entire system is tested for faults and failures.
• Deployment of System − Once functional and non-functional testing is done, the product is deployed in the customer environment or released into the market.
• Maintenance − Issues that come up in the client environment are fixed by releasing patches, and better versions are released to enhance the product. Maintenance delivers these changes into the customer environment.
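The strict phase gating described above can be sketched in a few lines of Python. This is a hypothetical illustration only; the phase names follow the list above, and the completion check is a stand-in for real exit criteria.

```python
# Minimal sketch of Waterfall's strict sequencing: each phase runs only
# after every earlier phase has completed, with no overlap between phases.
PHASES = [
    "Requirement Gathering and Analysis",
    "System Design",
    "Implementation",
    "Integration and Testing",
    "Deployment of System",
    "Maintenance",
]

def run_waterfall(phases):
    completed = []
    for phase in phases:
        # A phase may begin only when all earlier phases are complete.
        assert len(completed) == phases.index(phase)
        completed.append(phase)  # this phase's output feeds the next phase
    return completed

print(run_waterfall(PHASES))
```

The point of the sketch is the `assert`: in a Waterfall process there is no mechanism for starting a later phase early, which is exactly why the model suits stable, well-documented requirements.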
Waterfall Model - Application
Every software product is different and requires a suitable SDLC approach to be followed based on internal and external factors. Some situations where the use of the Waterfall model is most appropriate are −
• Requirements are very well documented, clear and fixed.
• Product definition is stable.
• Technology is understood and is not dynamic.
• There are no ambiguous requirements.
• Ample resources with required expertise are available to support the product.
• The project is short.

Conventional software management performance


Barry Boehm's “Industrial Software Metrics Top 10 List” provides a set of empirical observations
that offer deep insight into the realities of software engineering. These metrics, derived from Boehm’s
experience and research, reflect cost, productivity, quality, and complexity issues in software
development. Below is a detailed explanation of each of the ten metrics:
1. Finding and fixing a software problem after delivery costs 100 times more than finding and
fixing the problem in early design phases.
• Explanation: This is one of the most famous software engineering principles. Bugs or design
flaws that are not caught early (in requirements or design phase) become significantly more
expensive to fix later, especially after deployment.
• Reason: In the early design stage, fixing a problem might just involve changing a diagram or a
few lines of code. After delivery, however, it could involve:
o Customer complaints
o Downtime or system failures
o Complex patches and redeployment
o Legal liabilities in extreme cases
• Lesson: Invest time in requirement analysis, design reviews, and early testing. Catching
bugs early is crucial.
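As a rough worked example of this metric (the $500 base figure is invented for illustration; only the 100x factor comes from the list above):

```python
# Illustrative arithmetic for metric 1: a defect that costs $500 to fix
# in the early design phase costs roughly 100x that after delivery.
COST_ESCALATION_FACTOR = 100  # from Boehm's top-10 list

def post_delivery_cost(early_fix_cost):
    """Approximate cost of fixing the same defect after delivery."""
    return early_fix_cost * COST_ESCALATION_FACTOR

print(post_delivery_cost(500))  # 50000
```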
2. You can compress software development schedules 25% of nominal, but no more.
• Explanation: This means you can only accelerate (compress) a software schedule by about
25% without significantly increasing risks or compromising quality.
• Why only 25%?
o Software development is a creative and cognitive task, not a mechanical one.
o Adding more developers to a late project often delays it further (Brooks' Law).
• Lesson: Overly aggressive scheduling usually results in:
o Poor quality
o Burnout
o Incomplete features
o Increased technical debt
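The 25% limit translates into a simple floor on any compressed schedule; a small sketch (the 12-month figure is an invented example):

```python
# Metric 2: a nominal schedule can be compressed by at most about 25%,
# so the practical minimum schedule is 75% of the nominal estimate.
MAX_COMPRESSION = 0.25

def minimum_schedule(nominal_months):
    """Shortest achievable schedule without severe quality/risk penalties."""
    return nominal_months * (1 - MAX_COMPRESSION)

print(minimum_schedule(12))  # 9.0 -> a 12-month plan cannot safely go below ~9 months
```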
3. For every $1 you spend on development, you will spend $2 on maintenance.
• Explanation: This highlights the long-term nature of software costs. Development is just the
beginning—maintenance (correcting bugs, adapting to changes, adding enhancements) costs
twice as much over the software's life cycle.
• Maintenance activities include:
o Corrective maintenance (fixing bugs)
o Adaptive maintenance (changing the software for new environments)
o Perfective maintenance (improving performance, usability)
• Lesson: Design software with maintainability in mind. Clean code, documentation, and modular
architecture help reduce future costs.

4. Software development and maintenance costs are primarily a function of the number of
source lines of code (SLOC).
• Explanation: The larger the software in terms of lines of code, the more complex and costly
it becomes to develop and maintain.
• Why?
o More code = more testing, more bugs, more interdependencies.
o Each additional line adds potential points of failure or interaction.
• Lesson: Favor efficient, reusable, and concise coding practices. Don’t write more code than
necessary. Use abstraction and modularization.

5. Variations among people account for the biggest differences in software productivity.
• Explanation: The most productive programmers can be 10x more effective than the least
productive ones.
• Why?
o Skill level
o Experience
o Problem-solving ability
o Communication and collaboration skills
• Lesson: Hiring and retaining top talent and fostering skill development is more effective than
just increasing the number of people.
6. The overall ratio of software to hardware costs is still growing. In 1955 it was 15:85; in 1985,
85:15.
• Explanation: Initially, hardware was more expensive than software. But as hardware became
cheaper and more powerful, software development emerged as the dominant cost factor.
• Today: Software drives innovation; even in hardware-driven devices like smartphones, software
defines their capabilities.
• Lesson: Budgeting and planning should give serious consideration to software-related costs,
not just hardware.
7. Only about 15% of software development effort is devoted to programming.
• Explanation: Contrary to popular belief, coding is a small part of the software engineering
process.
• Other phases include:
o Requirement analysis
o System design
o Testing
o Debugging
o Documentation
o Deployment
• Lesson: Effective project planning must allocate time and resources to non-coding tasks, which
are often more time-consuming and complex.
8. Software systems and products typically cost 3 times as much per SLOC as individual software
programs. Software-system products (i.e., system of systems) cost 9 times as much.
• Explanation: A standalone program is cheaper to develop than a system of integrated
programs or products.
• Why?
o Systems have inter-module dependencies.
o Need for rigorous testing, documentation, integration.
o Higher levels of complexity and required reliability.
• Lesson: As software scales, costs do not grow linearly. Complexity grows faster than size.
Design modular systems with clear boundaries.
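The 1x/3x/9x relationship can be made concrete with a short sketch. The $10-per-SLOC baseline is a made-up assumption; only the multipliers come from the metric above.

```python
# Metric 8: relative cost per SLOC, taking a standalone program as the
# 1x baseline. A software system or product costs ~3x per SLOC; a
# software-system product (system of systems) costs ~9x.
BASELINE_COST_PER_SLOC = 10  # hypothetical $/SLOC for a standalone program
MULTIPLIER = {"program": 1, "system_or_product": 3, "system_product": 9}

def total_cost(kind, sloc):
    return sloc * BASELINE_COST_PER_SLOC * MULTIPLIER[kind]

# The same 10,000 SLOC costs very different amounts depending on context:
print(total_cost("program", 10_000))         # 100000
print(total_cost("system_product", 10_000))  # 900000
```

Note that size is held constant in both calls: the 9x difference is pure complexity cost, which is the metric's point that cost grows faster than size.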
9. Walkthroughs catch 60% of the errors.
• Explanation: Code walkthroughs or peer reviews are informal processes where developers
go through the code or design with others.
• Effectiveness:
o Can catch logic errors, misunderstandings, and design flaws early.
o Help improve code quality before testing.
• Lesson: Incorporate walkthroughs and peer reviews into your development cycle. They are
low-cost and high-impact tools for quality assurance.
10. 80% of the contribution comes from 20% of the contributors.
• Explanation: This is the Pareto Principle applied to software development. A small percentage
of the team usually contributes a disproportionately large portion of the work.
• Implication:
o High-performing developers or designers are key drivers.
o Not all team members have equal impact.
• Lesson: Identify, support, and retain high performers. But also work on improving the
contribution of others through training and better management.

Evolution of software economics


SOFTWARE ECONOMICS
Software economics in software engineering is a mature research area that deals with the difficult and challenging problems of valuing software and estimating the costs involved in its production. Boehm and Sullivan outline these difficulties and challenges and also present how software economics principles can be applied to improve software design, development, and evolution.
Software economics sits at the intersection of information economics and software design and engineering. Most software cost models are abstracted into a function of five basic parameters. These parameters are given below:
• Size –
Size is generally measured in terms of the number of source instructions, source lines of code (SLOC), or the number of function points required to realize the desired capabilities. It is the size of the end product needed to deliver the required functionality.
• Process –
The process is the set of steps used to guide all activities and produce the end products; in particular, the ability of the process to avoid activities that do not add value. A good process supports progress toward the goal and eliminates activities that are not essential.
• Personnel –
The capabilities of the software engineering personnel in general, and particularly their experience with the computer science issues and the application domain issues of the project. This parameter emphasizes the team and its responsibilities.
• Environment –
The tools, techniques, and automated procedures that are available and used to support the software development effort efficiently.
• Quality –
The required quality, including its features, performance, reliability, scalability, portability, usability, user-interface utility, adaptability, and more.
The relationship among these parameters and the estimated cost can be written as follows, with Size raised to an exponent determined by the Process parameter:
Effort = (Personnel) × (Environment) × (Quality) × (Size ^ Process)
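This abstraction, with Process acting as an exponent on Size, can be sketched in Python. All numeric values below are invented for illustration; in real cost models the multipliers and the process exponent are calibrated from project data.

```python
# Abstract cost-model form:
#   Effort = Personnel * Environment * Quality * Size^Process
# Size is in KSLOC; a Process exponent > 1 models diseconomy of scale
# (doubling size more than doubles effort).
def estimate_effort(personnel, environment, quality, size_ksloc, process_exponent):
    return personnel * environment * quality * (size_ksloc ** process_exponent)

# Hypothetical 50-KSLOC project with neutral (1.0) personnel/environment
# multipliers, a 1.2 quality multiplier, and a 1.1 process exponent:
effort = estimate_effort(1.0, 1.0, 1.2, 50, 1.1)
print(round(effort, 1))
```

With an exponent above 1.0, a 100-KSLOC project costs more than twice a 50-KSLOC one, which is exactly the diseconomy of scale that better processes try to reduce.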

Generations of Software Development –


There are three generations of software development as described below :
• Conventional development (1960s and 1970s) –
During this generation, organizations used custom tools, custom processes, and virtually all custom components, built in primitive languages (symbolic assembly and early high-order languages such as Fortran and PL/1). The size was 100% custom. Conventional development of this era is generally regarded as unsuccessful: projects were almost always over budget and behind schedule, and performance was usually less than expected.
• Transition (1980s and 1990s) –
During this generation, organizations used repeatable processes and off-the-shelf tools, with components (still mostly custom) developed in higher-level languages. The size was about 30% component-based and 70% custom. Results were unpredictable, and transition-era development was infrequently on budget and on schedule. Some commercial components were available, such as operating systems, database management systems, networking, and graphical user interfaces, but because of increasing complexity, the available languages and technologies were not enough to deliver the desired business performance.
• Modern (2000 and later) –
Modern development uses managed and measured processes, integrated automated environments, and a mix of about 70% off-the-shelf components and 30% custom. Modern development is usually on budget and on schedule.
An improved process requires improved tools and environment support. The technologies for environment automation, size reduction, and process improvement are not independent of one another; in the modern era, the key is complementary growth across all of these technologies.

PRAGMATIC SOFTWARE COST ESTIMATION:


1. Pragmatic means dealing with things sensibly and realistically, in a way based on practical rather than theoretical considerations.
2. One critical problem in software cost estimation is the lack of well-documented case studies of projects.
3. The software industry has inconsistently defined metrics for cost measurement.
4. It is difficult to collect a homogeneous set of project data.
5. Within one organization, several software cost estimation models and tools may be in use.
6. These cost estimation models include (a) COCOMO, (b) CHECKPOINT, (c) ESTIMACS, (d) Knowledge Plan, (e) SLIM, and (f) SOFTCOST.
7. These models provide only estimates, although they are well documented.
8. The accuracy of cost models such as COCOMO is described as "within 20% of actuals, 70% of the time."
9. Most real-world cost estimates are bottom-up rather than top-down.
10. The software project manager defines the target cost of the software and then manipulates parameters such as size and people until the target can be justified.
11. The software project manager has to analyze cost risks and project risks.
12. A good software cost estimate has the following attributes:
(A) It is conceived and supported by the project manager, architecture team, development team, and test team accountable for performing the work.
(B) It is accepted by all stakeholders as ambitious but realizable.
(C) It is based on a well-defined software cost model with a credible basis.
(D) It is based on a database of relevant project experience that includes similar processes, similar technologies, similar environments, similar quality requirements, and similar people.
(E) It is defined in enough detail so that its key risk areas are understood and the probability of success is objectively assessed.
13. The predominant cost estimation process is shown in the accompanying diagram.
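As a concrete instance of the models listed above, the basic COCOMO model estimates effort as E = a × (KLOC)^b person-months. The sketch below uses the coefficients Boehm published for the three basic project modes; the 32-KLOC example input is invented.

```python
# Basic COCOMO effort estimate: Effort (person-months) = a * KLOC^b,
# using Boehm's published coefficients for the three project modes.
COEFFICIENTS = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def cocomo_effort(kloc, mode="organic"):
    a, b = COEFFICIENTS[mode]
    return a * (kloc ** b)

# Hypothetical 32-KLOC organic project:
print(round(cocomo_effort(32, "organic"), 1))  # effort in person-months
```

Note the exponent b > 1 in every mode: like the abstract effort formula earlier, basic COCOMO builds in a diseconomy of scale, and the "within 20% of actuals, 70% of the time" accuracy claim above applies to calibrated use of such models, not to raw coefficients.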
