V-Model: Unit#5
V-Model
The V-model is a type of SDLC model in which the process executes sequentially in a
V-shape. It is also known as the Verification and Validation model. It is based on
associating a testing phase with each corresponding development stage: each
development step is directly associated with a testing phase, and the next phase
starts only after completion of the previous one, i.e. for each development activity
there is a corresponding testing activity.
Baseline
Producing software from a specification is like walking on water - it's
easier if it's frozen. Barry Boehm
A baseline is a reference point in the software development life cycle
marked by the completion and formal approval of a set of predefined work
products. The objective of a baseline is to reduce a project's vulnerability to
uncontrolled change by fixing and formally change controlling various key
deliverables (configuration items) at critical points in the development life
cycle. Baselines are also used to identify the aggregate of software and
hardware components that make up a specific release of a system.
Baseline purpose. The purpose of a baseline is to provide:
Baseline Progression
Typical Baseline Components
Competitive Edge
Being certified in CMMI, for example, can give a company a competitive
edge and help it win more sales, because the certification is evidence of a
mature software process based on a standard method.
Proven outcome
Time pressure
Because companies face pressure to deliver projects on time, it is hard
for them to dedicate time to an SPI project. While this is a weakness,
time pressure is actually also a driver for SPI.
SPI takes a long time and is a costly process, but it is necessary if
you have the issues discussed before.
Budget Constraints
As just mentioned, SPI is a costly process: it needs time and dedicated
resources, and not just any resources but ones skilled in SPI. You may
also need an SPI consultant, and to train the staff and orient them on
the SPI initiative.
Inadequate metrics
Most small companies do not have metrics to measure and compare their
progress or improvement, which can make it impossible to identify and
measure the improvements from SPI.
Staff turnover
Sometimes a company has high staff turnover, which makes it hard to
embed the SPI culture change and can lead to an endless SPI effort.
Micro Organization
Some organizations are very small and have very few resources; SPI will
be too big an undertaking for that kind of company.
1. DMAIC
2. DMADV
DMAIC
It specifies a data-driven quality strategy for improving
processes. This methodology is used to enhance an
existing business process.
DMADV
It specifies a data-driven quality strategy for designing products and
processes. This method is used to create new product or process designs
in such a way that the result is more predictable, mature, and defect-
free performance.
Topics left:
Process definition techniques
CMMI
Unit 4
What is Software Configuration Management?
Software Configuration Management is defined as a process to systematically
manage, organize, and control the changes in the documents, codes, and
other entities during the Software Development Life Cycle. It is abbreviated as
the SCM process in software engineering. The primary goal is to increase
productivity with minimal mistakes.
SCM is the discipline which:
o Identifies changes
o Monitors and controls changes
o Ensures the proper implementation of changes made to an item
o Audits and reports on the changes made
Baselines
Change Control
Configuration Status Accounting
Configuration Identification:
Configuration identification is a method of determining the scope of the
software system. This step matters because you cannot manage or control
something if you do not know what it is. A configuration identification is a
description that contains the CSCI type (Computer Software Configuration
Item), a project identifier, and version information.
Baseline:
A baseline is a formally accepted version of a software configuration item. It is
designated and fixed at a specific time while conducting the SCM process. It
can only be changed through formal change control procedures.
Change Control:
Change control is a procedural method which ensures quality and consistency
when changes are made to a configuration object. In this step, the change
request is submitted to the software configuration manager.
1. Configuration Manager
2. Developer
3. Auditor
The auditor is responsible for SCM audits and reviews.
Need to ensure the consistency and completeness of release.
4. Project Manager:
5. User
The end user should understand the key SCM terms to ensure they have the
latest version of the software
The SCMP can follow a public standard like the IEEE 828 or
organization specific standard
It defines the types of documents to be managed and a document
naming convention, e.g. Test_v1
SCMP defines the person who will be responsible for the entire SCM
process and creation of baselines.
Fix policies for version management & change control
Define tools which can be used during the SCM process
Configuration management database for recording configuration
information.
Concurrency Management:
When two or more tasks happen at the same time, it is known as a
concurrent operation. Concurrency in the context of SCM means the same
file being edited by multiple people at the same time.
If concurrency is not managed correctly with SCM tools, it may create
many pressing issues.
Version Control:
SCM uses an archiving method to save every change made to a file. With
this archive, it is possible to roll back to a previous version in case
of issues.
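The archive-and-roll-back idea can be sketched in a few lines of Python. All names here (VersionStore, check_in, roll_back) are illustrative, not from any real SCM tool.

```python
# Minimal sketch of the SCM archiving idea: every check-in of a file is
# saved as a new version, so any earlier version can be restored.

class VersionStore:
    def __init__(self):
        self._history = {}  # filename -> list of saved contents

    def check_in(self, filename, content):
        """Archive a new version; returns the new version number (1-based)."""
        self._history.setdefault(filename, []).append(content)
        return len(self._history[filename])

    def check_out(self, filename, version=None):
        """Return the latest version, or a specific earlier one."""
        versions = self._history[filename]
        return versions[-1] if version is None else versions[version - 1]

    def roll_back(self, filename):
        """Discard the latest version, restoring the previous one."""
        self._history[filename].pop()
        return self._history[filename][-1]

store = VersionStore()
store.check_in("report.txt", "draft 1")
store.check_in("report.txt", "draft 2 (buggy)")
store.roll_back("report.txt")  # back to "draft 1"
```

Real SCM tools store deltas rather than full copies, but the rollback contract is the same.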
Synchronization:
Users can check out more than one file, or an entire copy of the repository.
The user then works on the needed files and checks the changes back into the
repository. They can synchronize their local copy to stay updated with the
changes made by other team members.
1. Git: Git is a free and open source tool which helps version control. It is
designed to handle all types of projects with speed and efficiency.
Conclusion:
Configuration Management helps organizations to systematically
manage, organize, and control the changes in the documents, codes,
and other entities during the Software Development Life Cycle.
The primary goal of the SCM process is to increase productivity with
minimal mistakes
The main reason behind configuration management process is that
there are multiple people working on software which is continually
updating. SCM helps establish concurrency, synchronization, and
version control.
A baseline is a formally accepted version of a software configuration
item
Change control is a procedural method which ensures quality and
consistency when changes are made in the configuration object.
Configuration status accounting tracks each release during the SCM
process
Software Configuration audits verify that all the software product
satisfies the baseline needs
Project manager, Configuration manager, Developer, Auditor, and user
are participants in SCM process
The SCM process planning begins at the early phases of a project.
Git, Team Foundation Server and Ansible are a few popular SCM tools
Configuration Control
(Aliases: change control, change management)
It is not the strongest of the species that survive, nor the most intelligent, but the
ones most responsive to change. Charles Darwin
The customer regularly talks directly to software developers asking them to make
'little changes' without consulting the project manager.
The developers are keen to show off the new technology they are using. They
slip in the odd 'neat feature' that they know the customer will love.
Solution: Implement configuration control. Document all requests for change and have
them considered by a Configuration Control Board.
Change Control
Change Control is the process of identifying, documenting, approving or rejecting, and
controlling changes to the project baselines (including scope baselines, schedule
baselines, cost baselines, etc.). In other words, it is used to control changes to all aspects
of an approved project plan. An effective Change Control system ensures that:
Proposed changes are reviewed and their impact is analyzed, prior to approving
or rejecting them.
All requests and changes are properly documented to provide a clear audit trail.
A coding standard serves several purposes:
o To reduce the effort needed to read and understand source code; [1]
o To enable code reviews to focus on issues more important than
arguing over syntax and naming standards;
o To enable code quality review tools to focus their reporting mainly on
significant issues other than syntax and style preferences.
A consistent and descriptive file naming convention serves many purposes, often related to
branding, information management, and usability. The overall goal of intentional file naming is
to increase readability in file names.
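A naming convention is only useful if it is checked. Below is a minimal sketch of enforcing one, assuming a `Name_vN` pattern like the `Test_v1` example above; the exact regular expression is an assumption, not a standard.

```python
import re

# Illustrative check for a naming convention of the form "Test_v1":
# a descriptive name, an underscore, then a version number.
NAME_PATTERN = re.compile(r"^[A-Za-z][A-Za-z0-9]*_v\d+$")

def follows_convention(name):
    """Return True if the file name matches the assumed Name_vN pattern."""
    return bool(NAME_PATTERN.fullmatch(name))

print(follows_convention("Test_v1"))      # True
print(follows_convention("my file (2)"))  # False
```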
COCOMO Model
COCOMO (Constructive Cost Model) is a regression model based on LOC, i.e. the number
of Lines of Code. It is a procedural cost-estimation model for software projects, often
used to reliably predict the various parameters associated with a project, such as
size, effort, cost, time and quality. It was proposed by Barry Boehm in 1981 and is
based on a study of 63 projects, which makes it one of the best-documented models.
The key parameters which define the quality of any software products, which are also
an outcome of the Cocomo are primarily Effort & Schedule:
Effort: Amount of labor that will be required to complete a task. It is measured in
person-months units.
Schedule: Simply the amount of time required to complete the job, which is, of
course, related to the effort put in. It is measured in units of time such as
weeks or months.
Different models of Cocomo have been proposed to predict the cost estimation at
different levels, based on the amount of accuracy and correctness required. All of these
models can be applied to a variety of projects, whose characteristics determine the
value of constant to be used in subsequent calculations. These characteristics
pertaining to different system types are mentioned below.
The initial estimate (also called the nominal estimate) is determined by an equation of
the form used in the static single-variable models, using KDLOC as the measure of size.
To determine the initial effort Ei in person-months, the equation used is of the type
shown below:
Ei = a * (KDLOC)^b
The values of the constants a and b depend on the project type.
1. Organic
2. Semidetached
3. Embedded
1. Organic: A development project can be treated as organic if it deals with
developing a well-understood application program, the size of the development team is
reasonably small, and the team members are experienced in developing similar types of
projects. Examples of this type of project are simple business systems, simple
inventory management systems, and data processing systems.
For the three product categories, Boehm provides a different set of expressions to
predict effort (in units of person-months) and development time from the size estimate
in KLOC (Kilo Lines of Code). The effort estimation takes into account the productivity
loss due to holidays, weekly offs, coffee breaks, etc.
According to Boehm, software cost estimation should be done through three stages:
1. Basic Model
2. Intermediate Model
3. Detailed Model
1. Basic COCOMO Model: The basic COCOMO model gives an approximate estimate of the
project parameters. The following expressions give the basic COCOMO estimation model:
Effort = a1 * (KLOC)^a2 PM
Tdev = b1 * (Effort)^b2 months
Where
KLOC is the estimated size of the software product, expressed in Kilo Lines of Code, and
Effort is the total effort required to develop the software product, expressed in person-
months (PMs).
For the three classes of software products, the formulas for estimating the effort based on
the code size are shown below:
For the three classes of software products, the formulas for estimating the development
time based on the effort are given below:
Some insight into the basic COCOMO model can be obtained by plotting the estimated
characteristics for different software sizes. The figure shows a plot of estimated effort
versus product size. From it, we can observe that the effort is somewhat superlinear in
the size of the software product. Thus, the effort required to develop a product
increases very rapidly with project size.
The development time versus the product size in KLOC is also plotted. It can be
observed that the development time is a sublinear function of the size of the product,
i.e. when the size of the product doubles, the time to develop it does not double but
rises moderately. This can be explained by the fact that for larger products, a larger
number of activities that can be carried out concurrently can be identified. These
parallel activities can be carried out simultaneously by the engineers, which reduces
the time to complete the project. Further, it can be observed that the development time
is roughly the same for all three categories of products. For example, a 60 KLOC program
can be developed in approximately 18 months, regardless of whether it is of organic,
semidetached, or embedded type.
From the effort estimation, the project cost can be obtained by multiplying the required
effort by the manpower cost per month. But, implicit in this project cost computation is the
assumption that the entire project cost is incurred on account of the manpower cost alone.
In addition to manpower cost, a project would incur costs due to hardware and software
required for the project and the company overheads for administration, office space, etc.
It is important to note that the effort and the duration estimations obtained using the
COCOMO model are called a nominal effort estimate and nominal duration estimate. The
term nominal implies that if anyone tries to complete the project in a time shorter than the
estimated duration, then the cost will increase drastically. But, if anyone completes the
project over a longer period of time than the estimated, then there is almost no decrease in
the estimated cost value.
Example 1: Suppose a project was estimated at 400 KLOC. Calculate the effort and
development time for each of the three modes, i.e. organic, semidetached and embedded.
Effort = a1 * (KLOC)^a2 PM
Tdev = b1 * (Effort)^b2 months
Estimated size of project = 400 KLOC
(i) Organic mode
(ii) Semidetached mode
E = 3.0 * (400)^1.12 = 2462.79 PM
D = 2.5 * (2462.79)^0.35 = 38.45 months
Example 2: A project of size 200 KLOC is to be developed. The software development team
has average experience on similar types of projects. The project schedule is not very
tight. Calculate the effort, development time, average staff size, and productivity of
the project.
Solution: The semidetached mode is the most appropriate, keeping in view the size,
schedule and experience of the development team.
Hence E = 3.0 * (200)^1.12 = 1133.12 PM
D = 2.5 * (1133.12)^0.35 = 29.3 months
Average staff size = E / D = 1133.12 / 29.3 = 38.67 persons
P = 200,000 / 1133.12 ≈ 176 LOC/PM
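The calculations in the two examples can be sketched in Python. Only the semidetached constants (3.0, 1.12, 2.5, 0.35) appear in the text itself; the organic and embedded rows below are the commonly published basic-COCOMO constants, included as an assumption.

```python
# Basic COCOMO sketch.
COEFFS = {
    # mode: (a1, a2, b1, b2)
    "organic":      (2.4, 1.05, 2.5, 0.38),
    "semidetached": (3.0, 1.12, 2.5, 0.35),
    "embedded":     (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode):
    """Return (effort in person-months, development time in months)."""
    a1, a2, b1, b2 = COEFFS[mode]
    effort = a1 * kloc ** a2   # Effort = a1 * (KLOC)^a2
    tdev = b1 * effort ** b2   # Tdev   = b1 * (Effort)^b2
    return effort, tdev

# Example 1 (400 KLOC, semidetached): E ≈ 2462.79 PM, D ≈ 38.45 months
e, d = basic_cocomo(400, "semidetached")
print(round(e, 2), round(d, 2))

# Example 2 (200 KLOC, semidetached): E ≈ 1133.12 PM, D ≈ 29.3 months;
# productivity P = 200,000 LOC / E ≈ 176 LOC/PM
e2, d2 = basic_cocomo(200, "semidetached")
print(round(e2, 2), round(d2, 1), 200_000 / e2)
```

Running the function for the organic and embedded modes reproduces the remaining rows of Example 1 under the assumed constants.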
2. Intermediate Model: The basic COCOMO model assumes that effort is only a function
of the number of lines of code and some constants determined by the type of software
system. However, many other project attributes also affect effort. The intermediate
COCOMO model recognizes this fact and refines the initial estimate obtained through
the basic COCOMO model by using a set of 15 cost drivers based on various attributes
of software engineering.
Classification of Cost Drivers and their attributes:
Hardware attributes -
o Run-time performance constraints
o Memory constraints
o Volatility of the virtual machine environment
o Required turnabout time
Personnel attributes -
o Analyst capability
o Software engineering capability
o Applications experience
o Virtual machine experience
o Programming language experience
Project attributes -
o Use of software tools
o Application of software engineering methods
o Required development schedule
E = ai * (KLOC)^bi * EAF
D = ci * (E)^di
Project       ai    bi
Organic       3.2   1.05
Semidetached  3.0   1.12
Embedded      2.8   1.20
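How the cost drivers refine the nominal estimate can be sketched as follows. The driver multipliers shown are illustrative ratings, not the full published COCOMO tables.

```python
# Intermediate-COCOMO sketch: the nominal effort a_i * (KLOC)^b_i is
# scaled by an Effort Adjustment Factor (EAF), the product of all
# cost-driver multipliers. The driver values below are illustrative.
from math import prod

def intermediate_effort(kloc, a_i, b_i, drivers):
    """Effort in person-months: E = a_i * (KLOC)^b_i * EAF."""
    eaf = prod(drivers.values())   # EAF = product of the multipliers
    return a_i * kloc ** b_i * eaf

drivers = {
    "required_reliability": 1.15,  # higher than nominal: inflates effort
    "analyst_capability":   0.86,  # strong analysts: reduces effort
    "language_experience":  1.00,  # nominal: no adjustment
}

# 50 KLOC semidetached project (a_i = 3.0, b_i = 1.12)
e = intermediate_effort(50, 3.0, 1.12, drivers)
print(round(e, 1))
```

A multiplier below 1.0 shrinks the estimate and one above 1.0 inflates it, so the EAF captures how the project's attributes deviate from the nominal case.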
The effort is determined as a function of the program size estimate, with a set of cost
drivers applied according to each phase of the software lifecycle.
1. Stage-I:
It supports estimation of prototyping. For this it uses Application Composition
Estimation Model. This model is used for the prototyping stage of application
generator and system integration.
2. Stage-II:
It supports estimation in the early design stage of the project, when less is known
about it. For this it uses the Early Design Estimation Model. This model is used in the
early design stage of application generators, infrastructure, and system integration.
3. Stage-III:
It supports estimation in the post architecture stage of a project. For this it
uses Post Architecture Estimation Model. This model is used after the completion
of the detailed architecture of application generator, infrastructure, system
integration.
Difference between COCOMO 1 and COCOMO 2
COCOMO 1 Model:
The Constructive Cost Model was first developed by Barry W. Boehm. The model is for
estimating effort, cost, and schedule for software projects. It is also called Basic
COCOMO. This model is used to give an approximate estimate of the various
parameters of the project. Examples of projects based on this model are business
systems, payroll management systems, and inventory management systems.
COCOMO 2 Model:
COCOMO II is the revised version of the original COCOMO (Constructive Cost
Model) and was developed at the University of Southern California. This model calculates
the development time and effort as the total of the estimates of all the individual
subsystems. In this model, the whole software is divided into different modules.
Examples of projects based on this model are spreadsheets and report generators.
COCOMO I vs COCOMO II
o COCOMO I provides estimates of effort and schedule; COCOMO II provides estimates
that represent one standard deviation around the most likely estimate.
o COCOMO I is based upon a linear reuse formula; COCOMO II is based upon a
non-linear reuse model.
o COCOMO I is also based upon the assumption of reasonably stable requirements;
COCOMO II is also based upon a reuse model that adjusts the estimate.
o In COCOMO I the effort equation's exponent is determined by 3 development modes
and 15 cost drivers are assigned; in COCOMO II the exponent is determined by
scale factors and 17 cost drivers are assigned.
Functional Requirement
A functional requirement document defines the functionality of a system or one of its
subsystems. It also depends upon the type of software, expected users and the type of
system where the software is used.
Functional user requirements may be high-level statements of what the system should
do, but functional system requirements should also describe the system services in
detail.
TRACEABILITY
In planning to use this new approach, Lance wants to get support from his team, as well as
project stakeholders. He starts with the disadvantages and advantages of using this approach.
From there, he plans to provide them with a specific example using the approach.
Bottom-Up (Dis)Advantages
Lance's decision to adopt bottom-up estimating came after examining the tradeoffs and
determining that the advantages outweigh the disadvantages. The advantage of bottom-up
estimating is that it leads to greater accuracy. This is exactly what Lance needs. The accuracy
results because this approach takes into consideration each component of the project work.
Accuracy is also achieved because the estimates for each component are given by the
individuals responsible for these components: the ones who know the work well.
The primary disadvantage of bottom-up estimating is the time it takes to complete. While other
forms of estimating can use the high-level requirements used to start the project process as a
basis, bottom-up estimating requires low-level components. In order to take into consideration
each component of the project work, these components must first be identified, through
decomposition. This process is long, and can be even more so when a large amount of work or
complex work is involved.
Another disadvantage of bottom-up estimating is that it can be costly. The time spent
decomposing project work is not free. Additionally, the estimation done for each component is
given by the individuals responsible for completing the components. These team members are
typically not involved in the project during the planning phase. Bringing in individuals, especially
if they are contracted, increases the cost of planning, which increases the cost of the project.
In general, bottom-up estimating is not the best choice for projects that do not allow for long
periods of planning or projects that have contracted resources that typically do not start on the
project much earlier than when the work is going to be completed. Lance can use this approach
because he has a devoted project team who can assist with estimates and because the
stakeholders are more concerned with accuracy than speed.
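At its core, bottom-up estimating is decomposition followed by summation, which can be sketched in a few lines. The components and hour figures below are invented for illustration, not from Lance's project.

```python
# Bottom-up estimating sketch: decompose the work into low-level
# components, let the owner of each component estimate their piece,
# then sum the estimates into the project total.
component_estimates = {
    "requirements analysis": 80,   # hours, from the analyst
    "database layer":        100,  # from the database developer
    "user interface":        70,   # from the front-end developer
    "testing":               50,   # from the test engineer
}

total_hours = sum(component_estimates.values())
print(total_hours)  # 300
```

The accuracy comes from the inputs (each estimate is made by the person who knows the work), while the cost comes from having to produce the decomposition first.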
TOP DOWN ESTIMATION
Cost of a New Project
Lisa is a project manager at Dolphin Boat Company, a business that makes boats of all kinds.
The company wants to add a new model to their speedboat line. However, before they commit
to this project, they need to know if it's a worthwhile endeavor.
Lisa has been asked to provide an estimate on how much the project will likely cost, as this will
help the company determine if the project is feasible or if they should just shelve it.
Top-Down Estimating
Top-down estimating is a technique used by upper-level management to estimate the total
cost of a project by using information from a previous, similar project. In other words, she's
going to estimate the cost of the current project based on the last time they introduced a new
boat model. Top-down estimates may also be based on the experiences of those involved in
developing the cost estimate and expert judgement.
Lisa looks at some of her and the company's previous projects. One previous project also
involved building a speedboat that used the same engine and was similar in size to the new
boat the company wishes to build. Jackpot! The cost of the previous project was $100,000, so
Lisa estimates cost of the current project will also be roughly $100,000.
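Lisa's analogous estimate can be sketched in a few lines. The scale factor is an assumption standing in for expert judgement about relative size; Lisa judges the boats comparable, so it is 1.0 here.

```python
# Top-down (analogous) estimating sketch: scale the cost of a similar
# past project by a rough relative-size factor.
previous_cost = 100_000   # cost of the similar past speedboat project
relative_size = 1.0       # new boat judged comparable in scope

estimate = previous_cost * relative_size
print(estimate)  # 100000.0
```

If the new boat were judged about 20% bigger in scope, the same sketch with `relative_size = 1.2` would yield 120,000.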
Top-Down Estimating Advantages
PLANNING POKER
It is interesting how the idea of a game can be woven into technical
terminology; poker is one such term. Planning poker is an estimation
technique used in agile methodology. It has its origin in an older
estimation methodology known as the Delphi method (an estimation
technique for sales and marketing). It is an approach where a group of
experts come together to analyse the size of the project goals to be
accomplished. Just as in a game of poker, where the cards are face down
and the numbers are not spoken aloud, members reveal their cards only
when it is their turn. The planning poker technique follows the same
ideology: individuals contribute their ideas, which generates a
consensus-based conclusion.
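The reveal-and-revote cycle described above can be sketched as follows. The deck values are the common modified-Fibonacci cards, and the all-must-agree consensus rule is a simplification of real practice.

```python
# Planning-poker sketch: estimators privately pick a card; all cards
# are revealed at once; if they differ, the outliers explain their
# reasoning and the team votes again until the estimates converge.
DECK = [0, 1, 2, 3, 5, 8, 13, 20, 40, 100]

def reveal_round(votes):
    """Return the consensus value, or None if the team must re-vote."""
    if max(votes) == min(votes):
        return votes[0]   # everyone agrees: consensus reached
    return None           # spread too wide: discuss outliers, re-vote

print(reveal_round([5, 5, 5]))   # 5 -> consensus reached
print(reveal_round([3, 8, 13]))  # None -> discuss and vote again
```

Teams often relax the rule to "within one card of each other"; the point is that agreement, not averaging, ends the round.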
Types of Cards :
Benefits :
Demerits :
Conclusion :
The planning poker technique lays a robust foundation for building an
application in planned and quantifiable terms. Interestingly, the
technique acts as a recreational activity: team members have a candid
conversation and get a chance to plan a project playfully, which might
otherwise turn out to be a burdensome task. It is thus one of the most
efficient estimation techniques.
(From Tutorialspoint)
Planning Poker Estimation
Planning Poker is a consensus-based technique for estimating, mostly used to
estimate effort or relative size of user stories in Scrum.
Planning Poker combines three estimation techniques − Wideband Delphi Technique,
Analogous Estimation, and Estimation using WBS.
Planning Poker was first defined and named by James Grenning in 2002 and later
popularized by Mike Cohn in his book "Agile Estimating and Planning", whose
company trademarked the term.
Kickoff Meeting
Estimation Meeting
Step 5.3 − Each team member reads aloud the detailed task list that he/she made,
identifying any assumptions made and raising any questions or issues. The task
estimates are not disclosed.
The individual detailed task lists contribute to a more complete task list when
combined.
Step 5.4 − The team then discusses any doubt/problem they have about the tasks they
have arrived at, assumptions made, and estimation issues.
Step 5.5 − Each team member then revisits his/her task list and assumptions, and
makes changes if necessary. The task estimates also may require adjustments based
on the discussion, which are noted as +N Hrs. for more effort and –N Hrs. for less
effort.
The team members then combine the changes in the task estimates to arrive at the
total project estimate.
Step 5.6 − The moderator collects the changed estimates from all the team members
and plots them on the Round 2 line.
In this round, the range will be narrower compared to the earlier one, as it is more
consensus based.
Step 5.7 − The team then discusses the task modifications they have made and the
assumptions.
Step 5.8 − Each team member then revisits his/her task list and assumptions, and
makes changes if necessary. The task estimates may also require adjustments based
on the discussion.
The team members then once again combine the changes in the task estimate to
arrive at the total project estimate.
Step 5.9 − The moderator collects the changed estimates from all the members again
and plots them on the Round 3 line.
Again, in this round, the range will be narrower compared to the earlier one.
Step 5.10 − Steps 5.7, 5.8, 5.9 are repeated till one of the following criteria is met −
Step 6 − The Project Manager then assembles the results from the Estimation
meeting.
Step 6.1 − He compiles the individual task lists and the corresponding estimates into a
single master task list.
Step 6.2 − He also combines the individual lists of assumptions.
Step 6.3 − He then reviews the final task list with the Estimation team.
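The convergence loop of Steps 5.7 to 5.9 can be sketched as follows. The stopping rule used here (spread within 10% of the mean) is an assumed criterion for illustration, since the text's list of criteria is not reproduced above.

```python
# Wideband-Delphi sketch: after each discussion round, members adjust
# their task totals; rounds repeat until the spread of estimates is
# narrow enough (assumed rule: spread within 10% of the mean).
def spread(estimates):
    """Range of the plotted estimates for one round."""
    return max(estimates) - min(estimates)

def converged(estimates, tolerance=0.10):
    """True when the round's spread is within tolerance of the mean."""
    mean = sum(estimates) / len(estimates)
    return spread(estimates) <= tolerance * mean

rounds = [
    [120, 200, 90, 160],   # Round 1: wide range
    [130, 170, 120, 150],  # Round 2: narrower after discussion
    [140, 150, 138, 145],  # Round 3: within tolerance
]
for i, r in enumerate(rounds, 1):
    print(f"Round {i}: spread={spread(r)}, converged={converged(r)}")
```

Each round's range narrows as assumptions are aligned, mirroring the Round 1 / Round 2 / Round 3 lines the moderator plots.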
Disadvantages
Reviews/ Testing
Customer complaints
Workgroup level -
Organisational level -
Causal chain