Unit 3


The Management Spectrum | 4 P’s in Software Project Planning





To build a product properly, there is an important concept in software project planning that everyone should know. There are 4 critical components in software project planning, known as the 4 P’s, namely:
 Product
 Process
 People
 Project

These components play a central role in helping your team meet its goals and objectives. Let’s look at each of them in a little more detail:
 People
The most important component of a product and of its successful implementation is people. In building a proper product, a well-managed team with clear-cut roles defined for each person and sub-team will lead to the product’s success. A good team is needed to save time, cost, and effort. Typical roles in software project planning are project manager, team leaders, stakeholders, analysts, and other IT professionals. Managing people successfully is a tricky process that a good project manager must master.
 Product
As the name implies, this is the deliverable, the result of the project. The project manager should clearly define the product scope to ensure a successful result, coordinate the team members, and handle the technical hurdles encountered while building the product. The product can be tangible or intangible, such as shifting the company to a new place or introducing new software in a company.
 Process
In any plan, a clearly defined process is key to the success of the product. It regulates how the team will go about development in the given time period. The process involves several phases, such as the documentation phase, implementation phase, deployment phase, and interaction phase.
 Project
The final P in software project planning is the Project, which can be considered a blueprint of the process. Here the project manager plays a critical role. They are responsible for guiding the team members toward the project’s targets and objectives, helping and assisting them with issues, keeping an eye on cost and budget, and making sure the project stays on track with the given deadlines.
W5HH Principle

Barry Boehm proposed a philosophy that produces simple and manageable outlines for software projects, together with a technique for discussing the objectives, management, duties, and technical approach of a project and its necessary resources. He named it the W5HH principle: a series of questions whose answers yield the project’s properties, definition, and resulting plan. The W5HH principle helps project managers pin down objectives, timelines, responsibilities, management styles, and resources. Those questions are:
W5HH questions:
Why is the system being developed?
All stakeholders must assess the validity of the system product/project: will the project’s purpose justify the cost and the time people spend on it?
What will be done?
Here Boehm asks what tasks need to be done for the project at present.
When will it be done?
The team performs project scheduling, recognizing when project tasks will start and when they will enter their final stage on the way to the goal.
Who is responsible for each function?
Every member of the software team carries responsibility, and their roles are defined.
Where are they organizationally located?
Not only software practitioners have roles here; users, customers, and stakeholders also have organizational roles and responsibilities.
How will the job be done technically and managerially?
All technical strategies and management rules of the project are defined once the scope of the project being built is known.
How much of each resource is needed?
Software developers determine this after estimating each resource against the needs of customers/users.
This W5HH principle of Boehm is applicable irrespective of the scale or difficulty of the software project being developed. These questions help the software team plan the outline of the project.
The W5HH principle outlines a series of questions that can help project managers
more efficiently manage software projects. Each letter in W5HH stands for a
question in the series of questions to help a project manager lead. (Notice there are
five ”W” questions and two ”H” questions).
For each W5HH question, here is what it means:
 Why? Why is the system being developed? This focuses a team on the business reasons for developing the software.
 What? What will be done? This is the guiding principle in determining the tasks that need to be completed.
 When? When will it be completed? This includes important milestones and the timeline for the project.
 Who? Who is responsible for each function? This is where you determine which team member takes on which responsibilities. You may also identify external stakeholders with a claim in the project.
 Where? Where are they organizationally located? This step gives you time to determine what other stakeholders have a role in the project and where they are found.
 How? How will the job be done technically and managerially? In this step, a strategy for developing the software and managing the project is settled upon.
 How Much? How much of each resource is needed? The goal of this step is to figure out the number of resources necessary to complete the project.

Software Measurement and Metrics




Software Measurement: A measurement is a manifestation of the size, quantity, amount, or dimension of a particular attribute of a product or process. Software measurement is a quantified attribute of a characteristic of a software product or the software process.
It is a discipline within software engineering. The software measurement process is defined and governed by ISO standards.
Software Measurement Principles
The software measurement process can be characterized by five activities-
1. Formulation: The derivation of software measures and metrics
appropriate for the representation of the software that is being
considered.
2. Collection: The mechanism used to accumulate data required to derive
the formulated metrics.
3. Analysis: The computation of metrics and the application of
mathematical tools.
4. Interpretation: The evaluation of metrics results in insight into the
quality of the representation.
5. Feedback: Recommendation derived from the interpretation of product
metrics transmitted to the software team.
Need for Software Measurement
Software is measured to:
 Establish the quality of the current product or process.
 Anticipate future qualities of the product or process.
 Enhance the quality of a product or process.
 Regulate the state of the project concerning budget and schedule.
 Enable data-driven decision-making in project planning and control.
 Identify bottlenecks and areas for improvement to drive process
improvement activities.
 Ensure that industry standards and regulations are followed.
 Give software products and processes a quantitative basis for evaluation.
 Enable the ongoing improvement of software development practices.
Classification of Software Measurement
There are 2 types of software measurement:
1. Direct Measurement: In direct measurement, the product, process, or
thing is measured directly using a standard scale.
2. Indirect Measurement: In indirect measurement, the quantity or quality
to be measured is measured using related parameters i.e. by use of
reference.
Software Metrics
A metric is a measurement of the degree to which any attribute belongs to a system, product, or process.
Software metrics are a quantifiable or countable assessment of the attributes of a
software product. There are 4 functions related to software metrics:
1. Planning
2. Organizing
3. Controlling
4. Improving
Characteristics of Software Metrics
1. Quantitative: Metrics must possess a quantitative nature. It means
metrics can be expressed in numerical values.
2. Understandable: Metric computation should be easily understood, and
the method of computing metrics should be clearly defined.
3. Applicability: Metrics should be applicable in the initial phases of the
development of the software.
4. Repeatable: When measured repeatedly, the metric values should be the
same and consistent.
5. Economical: The computation of metrics should be economical.
6. Language Independent: Metrics should not depend on any
programming language.
Types of Software Metrics


1. Product Metrics: Product metrics are used to evaluate the state of the product, tracking risks and uncovering prospective problem areas. The team’s ability to control quality is assessed. Examples include lines of code, cyclomatic complexity, code coverage, defect density, and code maintainability index.
2. Process Metrics: Process metrics pay particular attention to enhancing
the long-term process of the team or organization. These metrics are used
to optimize the development process and maintenance activities of
software. Examples include effort variance, schedule variance, defect
injection rate, and lead time.
3. Project Metrics: Project metrics describe the characteristics and execution of a project. Examples include effort estimation accuracy, schedule deviation, cost variance, and productivity. They usually measure:
 Number of software developers
 Staffing patterns over the life cycle of the software
 Cost and schedule
 Productivity
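Several of the example metrics above are simple ratios. As a minimal sketch with hypothetical counts, defect density is the number of defects per thousand lines of code (KLOC):

```python
# Defect density: a simple product metric (hypothetical counts).
def defect_density(defects_found, lines_of_code):
    """Defects per thousand lines of code (KLOC)."""
    return defects_found / (lines_of_code / 1000)

# Hypothetical release: 30 defects found in 12,000 lines of code.
print(defect_density(30, 12_000))  # 2.5 defects per KLOC
```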
Advantages of Software Metrics
1. Reduction in cost or budget.
2. It helps to identify particular areas for improvement.
3. It helps to increase product quality.
4. It helps in managing workloads and teams.
5. Reduction in the overall time to produce the product.
6. It helps to determine the complexity of the code and to test the code with resources.
7. It helps in effective planning, controlling, and managing of the entire product.
Disadvantages of Software Metrics
1. It is expensive and difficult to implement metrics in some cases.
2. The performance of the entire team or of an individual team member cannot be determined; only the performance of the product is determined.
3. Sometimes the quality of the product does not meet expectations.
4. It can lead to measuring unwanted data, which is a waste of time.
5. Measuring incorrect data leads to wrong decision-making.

Measuring Software Quality using Quality Metrics


In software engineering, software measurement is done based on software metrics, which measure various characteristics of a piece of software.
In software engineering, Software Quality Assurance (SQA) assures the quality of the software. A set of SQA activities is applied continuously throughout the software process. Software quality is measured based on software quality metrics.
There are a number of metrics based on which software quality is measured, but among them a few are most useful and essential in software quality measurement. They are –
1. Code Quality
2. Reliability
3. Performance
4. Usability
5. Correctness
6. Maintainability
7. Integrity
8. Security
Now let’s understand each quality metric in detail –
1. Code Quality – Code quality metrics measure the quality of the code used for software project development. Maintaining code quality by writing bug-free and semantically correct code is very important for good software development. Code quality is measured with both quantitative metrics (number of lines, complexity, number of functions, bug generation rate, etc.) and qualitative metrics (readability, code clarity, efficiency, maintainability, etc.).
2. Reliability – Reliability metrics express the reliability of the software under different conditions: whether the software provides the expected service at the right time. Reliability can be checked using Mean Time Between Failures (MTBF) and Mean Time To Repair (MTTR).
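A figure commonly derived from these two metrics is steady-state availability, the fraction of time the system is operational. A minimal sketch with hypothetical figures:

```python
def availability(mtbf_hours, mttr_hours):
    """Fraction of time the system is operational: MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Hypothetical system: fails every 990 h on average, takes 10 h to repair.
print(availability(990.0, 10.0))  # 0.99
```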
3. Performance – Performance metrics measure how well the software performs. Every piece of software is developed for a specific purpose; performance metrics determine whether the software fulfills the user requirements and analyze how much time and how many resources it uses to provide its service.
4. Usability – Usability metrics check whether the program is user-friendly. Since every piece of software is used by end-users, it is important to measure whether the end-users are happy using it.
5. Correctness – Correctness is one of the important software quality metrics, as it checks whether the system or software works correctly, without error, to the user’s satisfaction. Correctness gives the degree to which each function provides its service as designed.
6. Maintainability – Every software product requires maintenance and upgrades. Maintenance is an expensive and time-consuming process, so if the software product is easy to maintain, we can say its quality is up to the mark. Maintainability metrics include the time required to adapt to new features/functionality, Mean Time to Change (MTTC), performance in changing environments, etc.
7. Integrity – Software integrity covers how easily the software integrates with other required software, which increases its functionality, and how well integration by unauthorized software is controlled, which reduces the chances of cyberattacks.
8. Security – Security metrics measure how secure the software is. In the age of
cyber terrorism, security is the most essential part of every software. Security
assures that there are no unauthorized changes, no fear of cyber attacks, etc when
the software product is in use by the end-user.
Software Cost Estimation



Whenever we develop a software project, the main questions that arise are how much it will cost to develop and how much time development will take. These estimates are necessary before initiating development. The critical problem in software cost estimation, however, is the lack of well-documented case studies of projects. The software industry has inconsistently defined and explained its metrics or atomic units of measure, so data from real projects are largely suspect in terms of consistency and comparability. Many questions are debated among developers and vendors of software cost estimation models and tools.
The main topics of these debates are given below:
 Which model of cost estimation should be used?
 Whether to measure software size in source lines of code or function points.
 What constitutes a good estimate?
Nowadays, several cost estimation models are available, such as the COCOMO model, Checkpoint, ESTIMACS, SLIM, and Knowledge Plan. Among them, the COCOMO model is one of the most open and well-documented cost estimation models. At present, most real-world use of cost models is bottom-up rather than top-down. The diagram below illustrates this predominant practice: the software project manager defines a target cost for the software and then manipulates parameters and size until the target cost can be justified. The process described in the diagram is necessary for analyzing and predicting cost risks and for understanding sensitivities and trade-offs objectively. It forces the software project manager to examine the risks associated with achieving the target costs and to discuss this information with the other stakeholders.

Following are the attributes that a Good Software Cost Estimate contains:
 It is conceived, i.e., planned and supported by the project manager, architecture team, development team, and test team responsible for performing the work.
 All the stakeholders generally accept it as ambitious but realizable.
 It is based on a well-defined and efficient cost model of software on a
credible basis.
 It is also based on a database of similar project experience that includes similar processes, relevant technologies, relevant environments, relevant quality requirements, and similar people.
 It is also defined in sufficient detail that all of its key risks are understood and the probability of success can be objectively assessed.
 It contains any extra details, supporting documentation or any
information that could be relevant to the estimate.
 It stresses how important it is to keep the cost estimates updated and
revised on a frequent basis as the project moves forward and new
information becomes available.
 Any assumptions that were made throughout the estimation process are
documented.
 It generates a project schedule and calculates the amount of time needed
for each task or activity.
Extrapolating from a good estimate, an ideal estimate would be derived from a mature cost model with an experience base reflecting several similar projects done by the same team with similarly mature processes and tools.
Use of Cost Estimation –
 During the planning stage, one needs to determine how many engineers are required for the project and to establish a schedule.
 While monitoring the project’s progress, one needs to assess whether the project is progressing toward its goal according to procedure and take corrective action where necessary.
 It makes it possible to allocate resources such as labor, tools and supplies
in a way that maximizes effectiveness and reduces waste.
 Cost estimates are used as a starting point for talks and agreements
between parties when negotiating contracts or project requirements.
 It offers a shared understanding of the financial concerns and facilitates
open and honest communication among project stakeholders.
 It affects resource allocation and project duration, that is why the amount
of time needed for various project activities is a crucial component in
cost assessment.

Software Project Planning


A software project is the complete process of software development, from requirement gathering to testing and maintenance, carried out according to defined execution methodologies within a specified period to achieve the intended software product.

Need of Software Project Management


Software development is a relatively new stream in world business, and there is very little experience in building software products. Most software products are tailored to the customer’s requirements. Most significantly, the underlying technology changes and advances so frequently and rapidly that experience with one product may not carry over to the next. All such business and environmental constraints bring risk to software development; hence, it is essential to manage software projects efficiently.
Software Project Manager
A software project manager is responsible for planning and scheduling project development. They manage the work to ensure that it is completed to the required standard, and they monitor progress to check that development is on time and within budget. Project planning must incorporate the major issues of size and cost estimation, scheduling, project monitoring, personnel selection and evaluation, and risk management.
To plan a successful software project, we must understand:

o Scope of the work to be completed
o Risk analysis
o The resources required
o The project schedule to be accomplished
o The process to be followed

Software project planning starts before technical work begins. The various planning activities are:

The size is the crucial parameter for the estimation of the other activities. Resource requirements are estimated based on cost and development time. The project schedule may prove very useful for controlling and monitoring the progress of the project; it depends on the resources and the development time.

What is software project estimation?


Software project estimation involves predicting the time, effort, and cost
required to complete a software project. Accurate estimation is critical for
planning, budgeting, and managing software development projects. Here are
some key aspects, techniques, challenges, and best practices for software
project estimation:

### Key Aspects of Software Project Estimation

1. **Scope Definition**: Clearly define the project's scope, including all requirements and deliverables. A well-defined scope helps in making more accurate estimates.

2. **Requirements Analysis**: Understand and analyze the functional and non-functional requirements. Detailed requirements help in breaking down the project into estimable tasks.

3. **Work Breakdown Structure (WBS)**: Divide the project into smaller, manageable tasks or components. Each task should be specific and estimable.

4. **Historical Data**: Use data from previous projects with similar scope and complexity to improve the accuracy of estimates.

5. **Team Expertise**: Leverage the knowledge and experience of team members who are familiar with the technology and domain.

### Techniques for Software Project Estimation

1. **Expert Judgment**: Consult experts with experience in similar projects. This can be done through meetings, interviews, or the Delphi technique, where multiple experts provide estimates refined through rounds of discussion.

2. **Analogous Estimation**: Use historical data from similar projects to estimate the current project. This method is relatively quick but depends on the accuracy and relevance of past data.

3. **Parametric Estimation**: Use mathematical models to predict project effort, cost, and duration. For instance, the Constructive Cost Model (COCOMO) uses algorithms based on historical project data and project parameters.

4. **Top-Down Estimation**: Estimate the project as a whole and then break it down into smaller components. Useful in the early stages of project planning when details are sparse.

5. **Bottom-Up Estimation**: Estimate each component or task individually and then aggregate the estimates to get the total project estimate. This method is detailed and time-consuming but can be more accurate.

6. **Three-Point Estimation**: Use three estimates for each task: the optimistic estimate (O), the pessimistic estimate (P), and the most likely estimate (M). The final estimate is calculated using the formula: (O + 4M + P) / 6.

7. **Function Point Analysis (FPA)**: Measure the functionality delivered to the user based on the user's external view of the functional requirements. Quantify software functions like inputs, outputs, user interactions, files, and interfaces.

8. **Story Points and Agile Estimation**: Common in agile methodologies, where teams estimate the relative effort required to implement user stories using story points, abstract units of measure representing the size and complexity of a task.
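The three-point formula in technique 6 can be applied directly; this minimal sketch uses hypothetical task estimates in person-days:

```python
def three_point_estimate(optimistic, most_likely, pessimistic):
    """PERT-weighted estimate: (O + 4M + P) / 6."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

# Hypothetical task: 4 days best case, 6 days most likely, 14 days worst case.
print(three_point_estimate(4, 6, 14))  # 7.0 person-days
```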

### Challenges in Software Project Estimation

- **Changing Requirements**: Frequent changes in requirements can make initial estimates obsolete.

- **Uncertainties**: Technical, operational, and business uncertainties can affect estimates.

- **Overconfidence**: Teams may underestimate complexity and overestimate their ability to deliver.

- **Communication Gaps**: Misunderstandings between stakeholders and developers can lead to inaccurate estimates.

### Best Practices

- **Iterative Estimation**: Regularly revise and update estimates as more information becomes available.

- **Risk Management**: Identify potential risks and incorporate buffer time to handle uncertainties.

- **Stakeholder Involvement**: Ensure continuous communication with stakeholders to understand requirements and expectations.

- **Documentation**: Keep detailed records of estimation assumptions, methodologies, and historical data for future reference.
Accurate software project estimation is fundamental to successful project
management, helping teams deliver on time, within budget, and to the
expected quality standards.

What is Problem Decomposition?





If a computer system is a multiprocessing system, then a single problem/program must be divided into subproblems in order to assign them to the processors. To perform this task, a technique called problem decomposition is used: the process of decomposing a problem/program into multiple subproblems/subprograms. It is the basic building block of parallel computing. Decomposition is needed because a problem must be divided into tasks that are then mapped to processors; a task is a subproblem resulting from the decomposition of a problem.
Techniques for Problem Decomposition
In order to decompose a problem/program, the following techniques can be used
by a computer:
 Recursive Decomposition: This is a general-purpose decomposition technique that can be used for any sort of computational problem. It works on the basis of the “Divide and Conquer” principle: a problem is divided into subproblems (divide), which are then assigned to different processors (conquer). A simple example is sorting an array with the Quick Sort algorithm, which recursively divides the array into smaller parts and then processes them to sort the elements in ascending or descending order.
 Data Decomposition: This decomposition technique is also general-purpose. It divides the data in the program into parts and then assigns them to tasks (instructions). Data decomposition can be illustrated with a matrix multiplication problem. Say we have two matrices A and B, and their product is stored in another matrix C.
Matrix A:    Matrix B:    Matrix C:
A1 A2        B1 B2        C1 C2
A3 A4        B3 B4        C3 C4
So, by data decomposition, the following tasks will be generated to compute the product of A and B and store it in C (each block of C is the sum of two block products):
Task 1: C1 = A1*B1 + A2*B3
Task 2: C2 = A1*B2 + A2*B4
Task 3: C3 = A3*B1 + A4*B3
Task 4: C4 = A3*B2 + A4*B4
Now, these tasks will be assigned to four processors.
 Explorative Decomposition: This is a special-purpose decomposition technique used for certain types of problems. A problem is decomposed with explorative decomposition if decomposing it yields a search space from which every element (subproblem) is processed by the processors. An example is puzzle games, where we have a search space and must check the position of every part of the puzzle to solve it.
 Speculative Decomposition: This is again a special-purpose technique. Here the problem is divided and assigned to the processors without first investigating whether the decision made is correct (hence speculative). The best example is a program containing nested if statements. If an if statement contains another if, and the code is decomposed and assigned to multiple processors without the conditions being checked, the processors run both the outer if and its inner if; later, when a condition is evaluated (say, to false), the work for the discarded branch is thrown away. Its limitation is that it does not wait for correct decisions before dividing and mapping work to the processors, but it can be faster than the other decomposition techniques.
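To make data decomposition concrete, here is a minimal Python sketch of a 2x2 block matrix product split into four independent tasks, one per block of C. Plain numbers stand in for the sub-matrix blocks; a real parallel run would hand each task to its own processor (e.g. via multiprocessing):

```python
# Data decomposition sketch: C = A x B split into four independent tasks.
# Scalars stand in for the sub-matrix blocks A1..A4, B1..B4.
A1, A2, A3, A4 = 1, 2, 3, 4   # blocks of A (row-major)
B1, B2, B3, B4 = 5, 6, 7, 8   # blocks of B (row-major)

# Standard block product: each task computes one block of C and shares no
# intermediate results with the others, so each can run on its own processor.
tasks = {
    "C1": lambda: A1 * B1 + A2 * B3,
    "C2": lambda: A1 * B2 + A2 * B4,
    "C3": lambda: A3 * B1 + A4 * B3,
    "C4": lambda: A3 * B2 + A4 * B4,
}
C = {name: task() for name, task in tasks.items()}
print(C)  # {'C1': 19, 'C2': 22, 'C3': 43, 'C4': 50}
```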
COCOMO Model – Software Engineering

Cocomo (Constructive Cost Model) is a regression model based on LOC, i.e.,


the number of Lines of Code. This article focuses on discussing the Cocomo
Model in detail.
What is the Cocomo Model?
The Cocomo Model is a procedural cost estimate model for software projects and
is often used as a process of reliably predicting the various parameters associated
with making a project such as size, effort, cost, time, and quality. It was proposed
by Barry Boehm in 1981 and is based on the study of 63 projects, which makes it
one of the best-documented models.
The key parameters that define the quality of any software product, which are also outcomes of COCOMO, are primarily effort and schedule:
1. Effort: Amount of labor that will be required to complete a task. It is
measured in person-months units.
2. Schedule: This simply means the amount of time required for the
completion of the job, which is, of course, proportional to the effort put
in. It is measured in the units of time such as weeks, and months.
Boehm classifies software projects into three categories, based on team size, experience, and the familiarity of the problem:
1. Organic
A software project is said to be an organic type if the team size required is
adequately small, the problem is well understood and has been solved in the past
and also the team members have a nominal experience regarding the problem.
2. Semi-detached
A software project is said to be a Semi-detached type if vital characteristics such as team size, experience, and knowledge of the various programming environments lie between organic and embedded. Projects classified as Semi-Detached are comparatively less familiar and more difficult to develop than organic ones and require more experience, better guidance, and creativity. E.g., compilers or different embedded systems can be considered Semi-Detached types.
3. Embedded
A software project requiring the highest level of complexity, creativity, and
experience requirement falls under this category. Such software requires a larger
team size than the other two models and also the developers need to be sufficiently
experienced and creative to develop such complex models.
Detailed Structure of COCOMO Model
Detailed COCOMO incorporates all characteristics of the intermediate version
with an assessment of the cost driver’s impact on each step of the software
engineering process. The detailed model uses different effort multipliers for each
cost driver attribute. In detailed Cocomo, the whole software is divided into
different modules and then we apply COCOMO in different modules to estimate
effort and then sum the effort.
The Six phases of detailed COCOMO are:
1. Planning and requirements
2. System design
3. Detailed design
4. Module code and test
5. Integration and test
6. Cost Constructive model
Phases of COCOMO Model

Different models of COCOMO have been proposed to predict the cost estimation at different levels, based on the amount of accuracy and correctness required. All of these models can be applied to a variety of projects, whose characteristics determine the value of the constants to be used in subsequent calculations. These characteristics of the different system types are given in Boehm’s definitions of organic, semidetached, and embedded systems above.
Importance of the COCOMO Model
1. Cost Estimation: To help with resource planning and project budgeting,
COCOMO offers a methodical approach to software development cost
estimation.
2. Resource Management: By taking team experience, project size, and
complexity into account, the model helps with efficient resource
allocation.
3. Project Planning: COCOMO assists in developing practical project
plans that include attainable objectives, due dates, and benchmarks.
4. Risk management: Early in the development process, COCOMO assists
in identifying and mitigating potential hazards by including risk
elements.
5. Support for Decisions: During project planning, the model provides a
quantitative foundation for choices about scope, priorities, and resource
allocation.
6. Benchmarking: To compare and assess various software development
projects to industry standards, COCOMO offers a benchmark.
7. Resource Optimization: The model helps to maximize the use of
resources, which raises productivity and lowers costs.
Types of COCOMO Model
1. Basic Model
E = a(KLOC)^b
Time = c(Effort)^d
Persons required = Effort / Time
The above formula is used for the cost estimation of the basic COCOMO model
and also is used in the subsequent models. The constant values a, b, c, and d for the
Basic Model for the different categories of the system:
Software Projects      a      b      c      d
Organic                2.4    1.05   2.5    0.38
Semi-Detached          3.0    1.12   2.5    0.35
Embedded               3.6    1.20   2.5    0.32
1. The effort is measured in person-months and, as evident from the
formula, depends on kilo-lines of code (KLOC). The development time is
measured in months.
2. These formulas are used as-is in the Basic Model; because factors such
as reliability and team expertise are not taken into account, the
estimate is rough.
Below is sample output from a Basic COCOMO program:
Output
The mode is Organic
Effort = 10.289 Person-Month
Development Time = 6.06237 Months
Average Staff Required = 2 Persons
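A minimal Python sketch of the Basic model reproduces this output. The constants come from the table above; the input size of 4 KLOC is inferred from the effort figure (2.4 × 4^1.05 ≈ 10.289).

```python
import math

# Basic COCOMO constants (a, b, c, d) per project class, from the table above
MODES = {
    "Organic":       (2.4, 1.05, 2.5, 0.38),
    "Semi-Detached": (3.0, 1.12, 2.5, 0.35),
    "Embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode):
    a, b, c, d = MODES[mode]
    effort = a * kloc ** b                # E = a(KLOC)^b, person-months
    dev_time = c * effort ** d            # Time = c(Effort)^d, months
    staff = math.ceil(effort / dev_time)  # average persons, rounded up
    return effort, dev_time, staff

effort, dev_time, staff = basic_cocomo(4.0, "Organic")
print("The mode is Organic")
print(f"Effort = {effort:.3f} Person-Month")
print(f"Development Time = {dev_time:.5f} Months")
print(f"Average Staff Required = {staff} Persons")
```

Changing the mode key switches the constant set, so the same function covers organic, semi-detached, and embedded projects.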
2. Intermediate Model
The Basic COCOMO model assumes that effort is a function only of the number
of lines of code and some constants chosen according to the type of software
system. In reality, however, no system's effort and schedule can be
calculated solely from lines of code; various other factors such as
reliability, experience, and capability must also be considered. These
factors are known as Cost Drivers, and the Intermediate Model uses 15 of
them for cost estimation.
Classification of Cost Drivers and their Attributes:
Product attributes:
1. Required software reliability
2. Size of the application database
3. Complexity of the product
Hardware attributes:
4. Run-time performance constraints
5. Memory constraints
6. Volatility of the virtual machine environment
7. Required turnaround time
Personnel attributes:
8. Analyst capability
9. Software engineering (programmer) capability
10. Application experience
11. Virtual machine experience
12. Programming language experience
Project attributes:
13. Use of software tools
14. Application of software engineering methods
15. Required development schedule
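The Intermediate model rates each driver, assigns it an effort multiplier, and multiplies the multipliers into an Effort Adjustment Factor (EAF) that scales the nominal estimate. A sketch follows; the coefficient a = 3.2 is the commonly cited intermediate-organic value, and the multiplier ratings below are illustrative rather than a quoted table.

```python
from math import prod

def intermediate_cocomo(kloc, multipliers, a=3.2, b=1.05):
    """Intermediate COCOMO: nominal effort times the Effort Adjustment Factor.

    `multipliers` holds one effort multiplier per rated cost driver;
    unrated drivers are nominal (1.0) and can simply be omitted.
    """
    eaf = prod(multipliers)     # Effort Adjustment Factor
    return a * kloc ** b * eaf  # effort in person-months

# Illustrative rating: high required reliability (1.15) and
# low language experience (1.07); all other drivers nominal.
effort = intermediate_cocomo(10.0, [1.15, 1.07])
print(f"Effort = {effort:.1f} person-months")
```

Multipliers above 1.0 inflate effort and multipliers below 1.0 reduce it, which is how the 15 drivers adjust the size-only Basic estimate.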
CASE Studies and Examples
1. NASA Space Shuttle Software Development: NASA estimated the
time and money needed to build the software for the Space Shuttle
program using the COCOMO model. NASA was able to make well-
informed decisions on resource allocation and project scheduling by
taking into account variables including project size, complexity, and
team experience.
2. Big Business Software Development: The COCOMO model has been
widely used by big businesses to project the time and money needed to
construct intricate business software systems. These organizations were
able to better plan and allocate resources for their software projects by
using COCOMO’s estimation methodology.
3. Commercial Software Products: The COCOMO methodology has also
proven advantageous for software firms that create commercial products.
These businesses were able to make decisions on pricing, time-to-market,
and resource allocation by estimating the time and expense of building
new software products or features.
4. Academic Research Initiatives: To estimate the time and expense
required to create software prototypes or carry out experimental studies,
academic research initiatives have employed COCOMO. Researchers
were able to better plan their projects and allocate resources by using
COCOMO’s estimate approaches.
Advantages of the COCOMO Model
1. Systematic cost estimation: Provides a systematic way to estimate the
cost and effort of a software project.
2. Helps to estimate cost and effort: This can be used to estimate the cost
and effort of a software project at different stages of the development
process.
3. Helps in high-impact factors: Helps in identifying the factors that have
the greatest impact on the cost and effort of a software project.
4. Helps to evaluate the feasibility of a project: This can be used to
evaluate the feasibility of a software project by estimating the cost and
effort required to complete it.
Disadvantages of the COCOMO Model
1. Assumes project size as the main factor: Assumes that the size of the
software is the main factor that determines the cost and effort of a
software project, which may not always be the case.
2. Does not count development team-specific characteristics: Does not
take into account the specific characteristics of the development team,
which can have a significant impact on the cost and effort of a software
project.
3. Imprecise estimates: It does not provide a precise estimate of the cost
and effort of a software project, as it is based on assumptions and
averages.
Best Practices for Using COCOMO
1. Recognize the Assumptions Underpinning the Model: Become
acquainted with the COCOMO model’s underlying assumptions, which
include its emphasis on team experience, size, and complexity.
Understand that although COCOMO offers useful approximations,
project results cannot be predicted with accuracy.
2. Customize the Model: Adapt COCOMO’s inputs and parameters to your
project’s unique requirements, including organizational capacity,
development processes, and industry standards. By doing this, you can
be confident that the estimations produced by COCOMO are more
precise and appropriate for your situation.
3. Utilize Historical Data: To verify COCOMO inputs and improve
estimating parameters, collect and examine historical data from previous
projects. Because real-world data takes project-specific aspects and
lessons learned into account, COCOMO projections become more
accurate and reliable.
4. Verify and validate: Compare COCOMO estimates with actual project
results, and make necessary adjustments to estimation procedures in light
of feedback and lessons discovered. Review completed projects to find
errors and enhance future project estimation accuracy.
5. Combine with Other Techniques: To reduce the biases or inaccuracies of
any single method and to triangulate results, combine COCOMO estimates
with other estimation techniques such as expert judgment, analogous
estimation, and bottom-up estimation.
Conclusion
For both software engineers and project managers, COCOMO is an applicable and
useful tool at a time when effective project planning is essential to success. Its
continued application and adaption in a variety of settings show how valuable it is
in the always-changing field of software development.

Software Engineering: The Software Equation
The software equation is a dynamic multivariable model that assumes a specific distribution
of effort over the life of a software development project. The model has been derived from
productivity data collected for over 4000 contemporary software projects. Based on these
data, an estimation model of the form

E = [LOC × B^0.333 / P]^3 × (1 / t^4), where


E = effort in person-months or person-years
t = project duration in months or years
B = “special skills factor”
P = “productivity parameter” that reflects:

• Overall process maturity and management practices


• The extent to which good software engineering practices are used
• The level of programming languages used
• The state of the software environment
• The skills and experience of the software team
• The complexity of the application
Typical values might be P = 2,000 for development of real-time embedded software; P =
10,000 for telecommunication and systems software; P = 28,000 for business systems
applications. The productivity parameter can be derived for local conditions using historical
data collected from past development efforts. It is important to note that the software
equation has two independent parameters:
(1) an estimate of size (in LOC) and (2) an indication of project duration in calendar months
or years.
To simplify the estimation process and use a more common form for their estimation model,
Putnam and Myers suggest a set of equations derived from the software equation. Minimum
development time is defined as

tmin = 8.14 (LOC/P)^0.43 in months, for tmin > 6 months

E = 180 B t^3 in person-months, for E ≥ 20 person-months (with t in years)

For example, with LOC = 33,200, P = 12,000, and B = 0.28:
tmin = 8.14 (33200/12000)^0.43 = 12.6 calendar months
E = 180 × 0.28 × (1.05)^3 ≈ 58 person-months (t = 12.6 months ≈ 1.05 years)
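This worked example can be reproduced in a few lines; note that the schedule comes out in months but must be converted to years inside the effort equation. The figures (LOC = 33,200, P = 12,000, B = 0.28) are the ones used above.

```python
def putnam_simplified(loc, p, b):
    """Putnam-Myers simplified software equation.

    loc: estimated size in lines of code
    p:   productivity parameter (12,000 in the example above)
    b:   special skills factor (0.28 in the example above)
    """
    t_min = 8.14 * (loc / p) ** 0.43  # minimum schedule, in months
    t_years = t_min / 12.0            # the effort equation expects years
    effort = 180 * b * t_years ** 3   # person-months
    return t_min, effort

t_min, effort = putnam_simplified(33_200, 12_000, 0.28)
print(f"t_min = {t_min:.1f} months, effort = {effort:.0f} person-months")
```

Because effort grows with the cube of duration, even a small compression of the schedule below tmin drives effort up sharply, which is the central lesson of the Putnam model.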

MAKE OR BUY
Introduction

Are you outsourcing enough? This was one of the main questions
asked by management consultants during the outsourcing boom.
Outsourcing was viewed as one of the best ways of getting things
done for a fraction of the original cost.

Outsourcing is closely related to the make-or-buy decision. Corporations
decide what to make internally and what to buy from outside in order to
maximize their profit margins.

As a result, organizational functions were divided into segments, and
some of those functions were outsourced to expert companies that could do
the same job for much less cost.

The make-or-buy decision is always a valid concept in business. No
organization should attempt to make something on its own when it has the
opportunity to buy the same for a much lower price.

This is why most electronic items are manufactured, and many software
systems developed, in Asia on behalf of organizations in the USA and
Europe.

Four Numbers You Should Know

When you are supposed to make a make-or-buy decision, there are four
numbers you need to be aware of. Your decision will be based on the
values of these four numbers. Let's have a look at them now; they are
quite self-explanatory.

 The volume
 The fixed cost of making
 Per-unit direct cost when making
 Per-unit cost when buying

Now, there are two formulas that use the above numbers: 'Cost to Buy' and
'Cost to Make'. The option with the higher cost loses, and the decision
maker can go ahead with the less costly solution.

Cost to Buy (CTB) = Volume x Per-unit cost when buying


Cost to Make (CTM) = Fixed costs + (volume x Per-unit
direct cost )
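The two formulas can be compared directly. A small sketch follows; the volumes and costs are invented example figures, not data from the text.

```python
def make_or_buy(volume, fixed_cost, unit_cost_make, unit_cost_buy):
    """Compare Cost to Make (CTM) with Cost to Buy (CTB)."""
    ctm = fixed_cost + volume * unit_cost_make  # Cost to Make
    ctb = volume * unit_cost_buy                # Cost to Buy
    decision = "make" if ctm < ctb else "buy"   # lower cost wins
    return decision, ctm, ctb

# Invented figures: 10,000 units, 50,000 in tooling, 4.0/unit to make,
# 10.0/unit to buy.
decision, ctm, ctb = make_or_buy(10_000, 50_000, 4.0, 10.0)
print(decision, ctm, ctb)  # make 90000.0 100000.0
```

Note that the fixed cost only burdens the make option, so buying tends to win at low volumes while making wins once volume spreads the fixed cost thin enough.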

Reasons for Making

There are a number of reasons a company would consider making in-house.
Following are a few:

 Cost concerns
 Desire to expand the manufacturing focus
 Need of direct control over the product
 Intellectual property concerns
 Quality control concerns
 Supplier unreliability
 Lack of competent suppliers
 Volume too small to attract a supplier
 Reduction of logistic costs (shipping etc.)
 To maintain a backup source
 Political and environment reasons
 Organizational pride
Reasons for Buying

Following are some of the reasons companies may consider buying from a
supplier:

 Lack of technical experience


 Supplier's expertise on the technical areas and the domain
 Cost considerations
 Need of small volume
 Insufficient capacity to produce in-house
 Brand preferences
 Strategic partnerships
The Process

The make-or-buy decision can occur at many scales. If the decision is
small in nature and has little impact on the business, then even one
person can make the decision. That person can weigh the pros and cons of
making versus buying and arrive at a decision.

When it comes to larger, high-impact decisions, organizations usually
follow a standard method to arrive at a decision. This method can be
divided into four main stages, as below.

1. Preparation
 Team creation and appointment of the team leader
 Identifying the product requirements and analysis
 Team briefing and aspect/area distribution
2. Data Collection
 Collecting information on various aspects of make-or-buy
decision
 Workshops on weightings, ratings, and cost for both make-
or-buy
3. Data Analysis
 Analysis of data gathered
4. Feedback
 Feedback on the decision made

By following the above structured process, an organization can make an
informed make-or-buy decision. Although this is a standard process for
making the make-or-buy decision, organizations can have their own
variations.

Conclusion

The make-or-buy decision is one of the key techniques in management
practice. Due to global outsourcing, make-or-buy decision making has
become popular and frequent.

Since the manufacturing and services industries have diversified across
the globe, there are a number of suppliers offering products and services
for a fraction of the original price. This has enhanced the global
product and service markets, giving the consumer the ultimate advantage.

If you are making a make-or-buy decision that can have a high impact,
always use a process for doing so. When such a process is followed, the
activities are transparent and the decisions are made in the best
interest of the company.

Automation Estimation Tools


Last Updated : 15 Nov, 2023



Decomposition techniques and empirical estimation models are available as
part of a range of software tools. Such automated estimation tools are
helpful in estimating cost and effort and in conducting "what-if" analyses
for important project variables, such as delivery date or staffing. All
automated estimation tools exhibit the same general characteristics, and
all perform the following generic functions:
1. Sizing of Project Deliverables: Estimates the size of one or more work
products (the external representation of the software, the software
itself, its functionality, descriptive information), each approximated
first.
2. Selecting Project Activities: The appropriate process framework is
selected and the software engineering tasks are specified.
3. Predicting Staffing Levels: The number of people available is
specified. This is an important task, because the relationship between
the people available and the work performed is highly nonlinear.
4. Predicting Software Effort: The estimation tool uses one or more
models to relate the size of the project deliverables to the effort
required to produce them.
5. Predicting Software Cost: Software cost can be calculated by assigning
labor rates to the project activities.
6. Predicting Software Schedules: Knowing effort, staffing level, and
project activities, a draft schedule can be produced by allocating labor
across software engineering activities based on a recommended model for
effort distribution.
Here are the few automation estimation tools:
1. Time Monitoring Tools: Programs such as Harvest or Toggl help keep
track of how much time is spent on activities; the historical data they
gather also helps make future estimates more accurate.
2. Tools for Test Automation: Tools such as Selenium
or Appium automate the testing process during the testing phase.
3. Tools for Continuous Integration/Continuous Deployment
(CI/CD): These tools facilitate a more efficient and error-free release
process while also accelerating development.
4. Planning and Estimation Tools: Tools like Planning Poker make it
easier to collaboratively estimate the amount of work needed for projects
or user stories.
5. Requirements Management Tools: Software such as IBM Engineering
automates the process of gathering, monitoring, and maintaining project
requirements.
6. Machine-Learning-Based Estimation Tools: Based on historical
information, team performance, and other project criteria, these tools
use machine learning algorithms to generate more precise, fact-based
estimates.
7. Tools for Resource Management: Applications such as Resource
Guru facilitate effective team resource scheduling and management.
8. Code Review Tools: By evaluating code for quality, security and
maintainability, tools such as Code Climate can automate certain steps in
the code review process.
Applying different estimation tools to the same project data can produce
a relatively large variation in predicted results. More importantly, the
estimated values are often significantly different from the actual
values. This reinforces the notion that the output of an estimation tool
should be used as one data point from which estimates are made, not as
the sole basis for them.

WICOMO (Wang Institute Cost Model), developed at the Wang Institute, and
DECplan, developed by Digital Equipment Corporation, are automated
estimation tools based on COCOMO. Each tool requires the user to provide
preliminary LOC estimates, classified by programming language and type
(i.e., customized code, reused code, new code). The user also specifies
values for the cost driver attributes. Each tool produces an estimated
project duration (in months), effort in staff-months, average staffing
per month, average productivity in LOC/pm, and cost per month.

SLIM is an automated costing system based on the Rayleigh-Putnam model.
SLIM applies the Putnam software model, linear programming, statistical
simulation, and the program evaluation and review technique (PERT) to
derive software project estimates. Once the software size is established,
SLIM computes the size deviation, a sensitivity profile that indicates
the probable deviation of cost and effort, and a consistency check
against data collected for software systems of similar size.
Short note on Project Scheduling
Last Updated : 08 Jul, 2020



A schedule, your project's timetable, consists of sequenced activities
and milestones that must be delivered within a given period of time. A
project schedule is a mechanism used to communicate which tasks need to
be performed, which organizational resources will be allocated to those
tasks, and in what time frame the work must be done. Effective project
scheduling leads to project success, reduced cost, and increased customer
satisfaction. Scheduling in project management means listing out the
activities, deliverables, and milestones within a project. It contains
far more detail than an average weekly planner. The most common and
important form of project schedule is the Gantt chart.

Process:
The manager needs to estimate the time and resources of the project while
scheduling it. All activities in the project must be arranged in a
coherent sequence, meaning they should be laid out in a logical,
well-organized manner that is easy to understand. Initial estimates may
be made optimistically, i.e., assuming everything favorable happens and
no threats or problems arise.
The total work is divided into various small activities or tasks during
project scheduling. The project manager then decides the time required
for each activity or task to be completed. Some activities may even be
performed in parallel for efficiency. The project manager should be aware
that no stage of the project is problem-free.
Problems that arise during the project development stage:
 People may leave or be absent during a particular stage of
development.
 Hardware may fail while the project is in progress.
 A required software resource may not be available at present, etc.
The project schedule is represented as a set of charts in which the
work-breakdown structure and the dependencies among activities are shown.
To complete the project within the given schedule, the required resources
must be available when they are needed. Therefore, resource estimation
should be done before starting development.
Resources required for Development of Project :
 Human effort
 Sufficient disk space on server
 Specialized hardware
 Software technology
 Travel allowance required by project staff, etc.
Advantages of Project Scheduling :
There are several advantages provided by project schedule in our project
management:
 It ensures that everyone remains on the same page regarding task
completion, dependencies, and deadlines.
 It helps in identifying issues and concerns early, such as a lack or
unavailability of resources.
 It also helps to identify relationships and to monitor process.
 It provides effective budget management and risk mitigation.

Project Scheduling: concepts, task sets, defining a task network,
scheduling, earned value analysis

Project scheduling involves several key concepts and methods to effectively plan, organize, and
manage tasks within a project. Here's an overview of the core elements:

### Concepts of Project Scheduling

1. **Task Sets**: A project consists of various tasks or activities that need to be completed to
achieve the project's objectives. Task sets refer to groups of related tasks that contribute to specific
project milestones or deliverables.

2. **Defining a Task Network**: This involves creating a structured representation of tasks and their
dependencies within the project. Tasks are linked together based on their sequencing and
interdependencies, forming a task network or a project schedule network diagram.

3. **Scheduling**: Scheduling is the process of determining when each task will start and finish
based on task dependencies, resource availability, and project constraints. This often involves
creating a project timeline or schedule that outlines the sequence and duration of tasks.

4. **Earned Value Analysis**: Earned Value Analysis (EVA) is a method used to measure a project's
performance and progress by comparing the planned value of work (budgeted cost for work
scheduled, BCWS) with the actual value of work completed (earned value, EV) and the actual cost of
work performed (actual cost, AC). This analysis helps in assessing project performance against the
baseline plan.
### Task Sets and Dependencies

- **Task Sets**: These are groups of related tasks that contribute to specific project objectives. For
example, a task set for building a house might include tasks like laying the foundation, framing,
plumbing, electrical work, and finishing.

- **Dependencies**: Tasks within a project often have dependencies, where the completion of one
task affects the start or finish of another. There are four types of task dependencies:

- **Finish-to-Start (FS)**: The dependent task cannot start until its predecessor task finishes.

- **Start-to-Start (SS)**: The dependent task cannot start until its predecessor task starts.

- **Finish-to-Finish (FF)**: The dependent task cannot finish until its predecessor task finishes.

- **Start-to-Finish (SF)**: The dependent task cannot finish until its predecessor task starts.

### Defining a Task Network

- **Task Network Diagram**: This diagram visually represents tasks and their dependencies using
nodes (tasks) and arrows (dependencies). It helps in understanding the sequence of tasks and
identifying critical paths (the longest sequence of dependent tasks) in the project.

### Scheduling Techniques

- **Critical Path Method (CPM)**: CPM is a technique used to determine the longest sequence of
dependent tasks (critical path) in a project. Tasks on the critical path have zero float or slack,
meaning any delay in these tasks will delay the project.

- **Program Evaluation and Review Technique (PERT)**: PERT is a probabilistic technique that uses
three estimates (optimistic, pessimistic, and most likely) to calculate task durations. It's useful for
projects with high uncertainty.
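Both techniques are easy to sketch: the critical-path length is the longest chain of earliest-finish times through the task network, and PERT's three-point estimate weights the most likely duration. The task names and durations below are invented for illustration.

```python
# Toy task network: task -> (duration, predecessors); figures are invented.
tasks = {
    "A": (3, []),
    "B": (2, ["A"]),
    "C": (4, ["A"]),
    "D": (1, ["B", "C"]),
}

def critical_path_length(tasks):
    """Longest path through the network = earliest possible project finish."""
    finish = {}
    def earliest_finish(t):
        if t not in finish:
            duration, preds = tasks[t]
            finish[t] = duration + max(
                (earliest_finish(p) for p in preds), default=0)
        return finish[t]
    return max(earliest_finish(t) for t in tasks)

def pert_duration(optimistic, most_likely, pessimistic):
    """PERT three-point (beta) estimate of a task's expected duration."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

print(critical_path_length(tasks))  # A -> C -> D: 3 + 4 + 1 = 8
print(pert_duration(2, 4, 12))      # (2 + 16 + 12) / 6 = 5.0
```

Here task B has slack (it finishes at 5 while C finishes at 7), so only delays on the A-C-D chain delay the project.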

### Earned Value Analysis (EVA)

- **Basic Components**:

- **Planned Value (PV)**: The estimated value of work planned to be done.


- **Earned Value (EV)**: The value of work actually completed.

- **Actual Cost (AC)**: The actual cost incurred for the work performed.

- **Key Metrics**:

- **Schedule Performance Index (SPI)**: Ratio of EV to PV, indicating schedule efficiency.

- **Cost Performance Index (CPI)**: Ratio of EV to AC, indicating cost efficiency.

- **Variance Analysis**: Comparing planned versus actual performance to identify deviations and
take corrective actions.
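The metrics above reduce to a few ratios and differences. A sketch follows; the PV, EV, and AC figures are invented example values.

```python
def earned_value_metrics(pv, ev, ac):
    """Earned Value Analysis indices and variances.

    pv: Planned Value, ev: Earned Value, ac: Actual Cost.
    """
    spi = ev / pv  # Schedule Performance Index (>1 ahead, <1 behind)
    cpi = ev / ac  # Cost Performance Index (>1 under budget, <1 over)
    sv = ev - pv   # Schedule Variance, in currency units
    cv = ev - ac   # Cost Variance
    return spi, cpi, sv, cv

# Invented figures: 100k planned, 90k of work earned, 95k actually spent.
spi, cpi, sv, cv = earned_value_metrics(pv=100_000, ev=90_000, ac=95_000)
print(f"SPI={spi:.2f} CPI={cpi:.2f} SV={sv} CV={cv}")
```

With these figures SPI = 0.90 and CPI < 1, so the project is both behind schedule and over budget, and both variances come out negative.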

By leveraging these concepts and techniques, project managers can effectively plan, schedule,
monitor, and control project activities to ensure successful project completion within scope, time,
and budget constraints.

You might also like