CSC 425 Project Management
Course contents
Team Management
Project Scheduling
Software measurement and estimation techniques
Risk analysis
Software quality assurance
Software Configuration Management
Project Management tools
Introduction
This course is ideal for those who want to prepare for a career in computer science without the need to focus solely on one particular subject area at this stage. Its focus on project management allows you to develop specific professional skills, further widening your career options.
IT project management is the process of planning, organizing and delineating responsibility for the completion of an organization's specific information technology (IT) goals.
Organizes
Plans
Executes projects
Example
Technology today is evolving at a rapid pace, enabling faster change and progress and accelerating the rate of change. However, it is not only technology trends and emerging technologies that are evolving; a lot more has changed, making IT professionals realize that their role will not stay the same in the contactless world of tomorrow. An IT professional in 2024 will constantly be learning, unlearning, and relearning (out of necessity, if not desire).
What does this mean for you? It means staying current with emerging technologies and the latest technology trends. And it means keeping your eyes on the future to know which skills you'll need to secure a safe job tomorrow and even learn how to get there. Here are the top emerging technology trends you should watch for and try your hand at in 2024, and possibly secure one of the jobs that will be created by these new technology trends. Starting the list of new tech trends with the talk of the town, gen-AI!
1. Generative AI
Generative AI, a cutting-edge technology, has revolutionized various industries by
enabling machines to create content that resembles human-generated work. It
encompasses a wide range of applications, from text generation to image synthesis
and even music composition. After mastering generative AI, individuals can pursue
exciting job roles in fields such as artificial intelligence research, data science, and
creative industries. The ever-expanding applications of generative AI promise a bright
future for those who master this technology, offering opportunities to shape how we
interact and create content in the digital age. Some of the top job roles include:
AI Researcher, where you can delve deep into the development of advanced
generative models
Data Scientist, using generative AI to extract valuable insights from data
Content Creator, harnessing generative AI for innovative storytelling
AI Ethics Consultant, addressing the ethical implications of AI-generated content
2. Computing Power
Computing power has already established its place in the digital era, with almost every device and appliance being computerized. And there is more to come: data science experts predict that the computing infrastructure we are building right now will only evolve for the better in the coming years. We already have 5G; gear up for an era of 6G, with more power in our hands and in the devices surrounding us. Computing power is also generating more tech jobs in the industry, though candidates will need specialized qualifications to acquire them.
From data science to robotics and IT management, this field will power a large percentage of employment in every country. The more computing our devices need, the more technicians, IT teams, relationship managers, and the customer care economy will flourish. One essential branch under this field that you can learn today is RPA, i.e. Robotic Process Automation. Here are the top jobs you can target after RPA:
Data Scientist
AI Engineer
Robotics Researcher
AI Architect
Robotics Designer
3. Smart(er) Devices
Artificial intelligence has played an essential role in making our world smarter and
smoother. It is not just simulating humans but going the extra mile to make our lives
hassle-free and simpler. These smarter devices are here to stay in 2024 and even
further, as data scientists are working on AI home robots, appliances, work devices,
wearables, and so much more! Almost every job needs smart software applications to make our work life more manageable. Smart devices are an area of high demand in the IT industry as more companies transform into digital workplaces, and almost every higher-level job requires good IT and automation proficiency to thrive. Here are the best jobs you can venture into:
IT Manager
Data Scientists
Product Testers
Product Managers
Automation Engineers
IT Researchers
4. Datafication
Datafication is simply transforming everything in our lives into devices or software
powered by data. So, in short, Datafication is the modification of human chores and
tasks into data-driven technology. From our smartphones, industrial machines, and office applications to AI-powered appliances and everything else, data is here to stay. Keeping our data stored correctly and securely has therefore become an in-demand specialization in our economy.
5. Artificial Intelligence (AI) and Machine Learning
Machine Learning, a subset of AI, is also being deployed in all kinds of industries, creating a huge demand for skilled professionals. Forrester predicts AI, machine learning, and automation will create 9 percent of new U.S. jobs by 2025, jobs including robot monitoring professionals, data scientists, automation specialists, and content curators, making it another new technology trend you must keep in mind! Mastering AI and machine learning will help you secure jobs like:
AI Research Scientist
AI Engineer
Machine Learning Engineer
AI Architect
6. Extended Reality
Extended reality comprises all the technologies that simulate reality, from Virtual Reality and Augmented Reality to Mixed Reality and everything in between. It is a significant technology trend right now, as all of us crave to break away from the so-called real boundaries of the world. By creating a reality without any tangible presence, this technology is massively popular amongst gamers, medical specialists, retailers, and modelers.
Regarding extended reality, gaming is a crucial area for popular careers that don't require high-level qualifications, but rather a passion for online gaming. You can study game design, animation, or even editing programs to pursue a successful career in this specialization. Meanwhile, check out the best jobs in AR, VR, and ER:
To create a safer space for digital users, cybersecurity and ethical hacking are the
major specializations you can check out. In these two, there is an array of jobs you
can discover from junior to senior levels. For ethical hacking, you might have to take
up professional certifications, while for cybersecurity, a diploma or even a master’s
qualification is sufficient to aim for a high-salary role. Here are the top jobs you can
find in cybersecurity and ethical hacking:
Cybersecurity Analyst
Penetration Tester
Security Engineer
Security Architect
Security Automation Engineer
Network Security Analyst
8. 3D Printing
A key trend in innovation and technology is 3D printing, which is used to create prototypes. This technology has been impactful in the biomedical and industrial sectors. Not long ago, none of us imagined printing a real object from a printer; now it's a reality, and 3D printing is another innovation that's here to stay. For companies in the data and healthcare sectors that require a lot of 3D printing for their products, various jobs pay well and are international. You only need sound knowledge of AI, Machine Learning, Modeling, and 3D printing. Let's check out the best jobs in this specialization:
CX Program Manager
3D Printer Engineer
Emulation Prototyping Engineer
Robotics Trainer
AI Engineer
Operations Manager
Organ & Prosthetic Designer
9. Genomics
Imagine a technology that can study and use your DNA to improve your health, helping you fight diseases and more. Genomics is precisely that technology: it studies the make-up of genes and DNA, including their mapping and structure. Further, it can help quantify genes and detect diseases or potential problems that could later become health issues. When it comes to a specialization like Genomics, one can find a variety of technical as well as non-technical roles. Technical jobs in this area are all about designing, analyzing, and diagnostics, while non-technical jobs are concerned with higher levels of research and theoretical analysis. Here are the top jobs in Genomics:
Bioinformatics Analyst
Genome Research Analyst
Full Stack Developer
Software Engineer
Bioinformatician
Genetics Engineer
The move towards greener energy has given rise to another technology trend: new energy solutions. This alternative energy arena is also boosting environment-related and data-oriented careers, which suit those with science specializations as well as social science qualifications. Let's take a look at the top jobs you can find in New Energy:
Although Forrester Research estimates RPA automation will threaten the livelihood of
230 million or more knowledge workers or approximately 9 percent of the global
workforce, RPA is also creating new jobs while altering existing jobs. McKinsey
finds that less than 5 percent of occupations can be totally automated, but about 60
percent can be partially automated.
For an IT professional looking to the future and trying to understand the latest technology trends, RPA offers plenty of career opportunities, including developer, project manager, business analyst, solution architect, and consultant. And these jobs pay well, making RPA the next technology trend you must keep a watch on! Mastering RPA will help you secure high-paying jobs like:
RPA Developer
RPA Analyst
RPA Architect
As the quantity of data organizations are dealing with continues to increase, they have realized the shortcomings of cloud computing in some situations. Edge computing is designed to help solve some of those problems as a way to bypass the latency caused by sending data to a data center for processing. It can exist "on the edge," if you will, closer to where computing needs to happen. For this reason, edge computing can be used to process time-sensitive data in remote locations with limited or no connectivity to a centralized location. In those situations, edge devices can act like mini data centers.
Keeping up with cloud computing (including new-age edge and quantum computing) will help you grab amazing jobs like:
In 2024, we can expect these technologies (AR and VR) to be further integrated into our lives. Usually working in tandem with some of the other emerging technologies we've mentioned in this list, AR and VR have enormous potential in training, entertainment, education, marketing, and even rehabilitation after an injury. Either could be used to train doctors to do surgery, offer museum-goers a deeper experience, enhance theme parks, or even enhance marketing, as with the Pepsi Max bus shelter.
While some employers might look for optics as a skill set, note that getting started in VR doesn't require a lot of specialized knowledge: basic programming skills and a forward-thinking mindset can land you a job, which is another reason why this new technology trend should make it onto your list of lookouts!
15. Blockchain
Although most people think of blockchain technology in relation to cryptocurrencies
such as Bitcoin, blockchain offers security that is useful in many other ways. In
simplest terms, blockchain can be described as data you can only add to, not take
away from, or change. Hence the term “chain” because you’re making a chain of data.
Not being able to change the previous blocks is what makes it so secure. In addition,
blockchains are consensus-driven, so no one entity can take control of the data. With
blockchain, you don’t need a trusted third party to oversee or validate transactions.
If you are intrigued by Blockchain and its applications and want to make your career
in this trending technology, then this is the right time to start. To get into Blockchain,
you need hands-on experience in programming languages, OOPS fundamentals, flat
and relational databases, data structures, web app development, and networking.
Mastering blockchain can help you scale up in a variety of fields and industries:
Risk Analyst
Tech Architect
Crypto Community Manager
Front End Engineer
16. Internet of Things (IoT)
As consumers, we're already using and benefitting from IoT. We can lock our doors
remotely if we forget to when we leave for work and preheat our ovens on our way
home from work, all while tracking our fitness on our Fitbits. However, businesses
also have much to gain now and in the near future. The IoT can enable better safety,
efficiency and decision making for businesses as data is collected and analyzed. It can
enable predictive maintenance, speed up medical care, improve customer service, and
offer benefits we haven’t even imagined yet.
And we’re only in the beginning stages of this new technology trend: Forecasts
suggest that by 2030 around 50 billion of these IoT devices will be in use around the
world, creating a massive web of interconnected devices spanning everything from
smartphones to kitchen appliances. And if you wish to step into this trending technology, you will have to learn about information security, AI and machine learning fundamentals, networking, hardware interfacing, data analytics, automation, and embedded systems, and you must have device and design knowledge.
17. 5G
The next technology trend that follows the IoT is 5G. Where 3G and 4G technologies enabled us to browse the internet, use data-driven services, and enjoy increased bandwidth for streaming on Spotify or YouTube, 5G services are expected to revolutionize our lives by enabling services that rely on advanced technologies like AR and VR, alongside cloud-based gaming services like Google Stadia, NVIDIA GeForce Now, and much more. It is expected to be used in factories, in HD cameras that help improve safety and traffic management, in smart grid control, and in smart retail too. Just about every telecom company, including Verizon, T-Mobile, Apple, Nokia, and Qualcomm, is now working on creating 5G applications. 5G network subscriptions are expected to reach 4.4 billion by the end of 2027, making it an emerging technology trend you must watch out for, and also save a spot in.
18. Cyber Security
Cyber security might not seem like an emerging new technology trend, given that it
has been around for a while, but it is evolving just as other technologies are. That’s in
part because threats are constantly new. The malevolent hackers trying to access data
illegally will not give up any time soon, and they will continue to find ways to get
through even the toughest security measures. It’s also partly because new technology
is being adapted to enhance security. As long as we have hackers, cybersecurity will remain a trending technology because it will constantly evolve to defend against them. Jobs in this field include:
Ethical Hacker
Malware Analyst
Security Engineer
Chief Security Officer
These jobs offer a promising career path for someone who wants to get into and stick with this new trending technology.
20. DevOps
DevOps is a set of practices that focuses on collaboration and communication
between software development (Dev) and IT operations (Ops) teams. It aims to
automate and streamline the software development and deployment lifecycle,
allowing for more frequent and reliable releases. DevOps practices include continuous
integration, continuous delivery, infrastructure as code, and automated testing.
Adopting DevOps leads to faster development cycles, improved software quality, and
greater agility in responding to changes and customer needs.
21. Metaverse
The metaverse is a virtual, interconnected digital universe where users can interact
with each other and digital environments in real-time. It combines augmented reality
(AR), virtual reality (VR), and various technologies to create immersive, shared
experiences. Companies are exploring metaverse applications in gaming, social
networking, education, healthcare, and beyond. This trend represents a convergence
of digital and physical worlds and is expected to have far-reaching impacts on
communication, entertainment, and business collaboration.
Unit one
Team Management
Team management is all about working with your team to help them
collaborate and be more productive. It also refers to the activities
and tools that allow teams to work better together. That means
managing assignments, schedules, workload and more.
Purpose of a programme or project team
The programme or project team ('the team') is a group of individuals with appropriate
and complementary professional, technical or specialist skills. Under the direction of
the programme manager or the project manager, the team is responsible for carrying
out the work detailed in the programme or project plan. The size and make-up of the
team will depend on the nature of the work being undertaken and, on occasion, may
be supplemented by specialists at key points in the programme or project. It may also
include staff from different organisations working together as part of the team.
Teams working in a programme environment will tend to need fewer technical skills but more business- and management-based skills and experience than those working in a project environment.
Before we go into the details of these strategies, you need to understand how an efficient project team operates in project management, what project teams do, and who is involved. A project team consists of different individuals with varying levels of authority. Their way of operating depends on the organizational culture and the methodology in use.
Project manager
Team leader
Team members
Project committee
Project sponsor
Project stakeholder
Team members' responsibilities include:
Working with the project manager throughout the project life cycle
Completing the assigned project deliverables and meeting all requirements
Documenting the process
Contributing to the team’s overall performance
Presenting possible solutions to the managers in case of a bottleneck
Keeping the project manager informed of the project progress
You need to motivate your team to achieve continuous improvement. Managers who lead their team by example and groom future leaders have a history of delivering excellent results in the industry. Below we look at some of the most effective strategies you can use to manage your team and become a project management expert.
Members who are highly skilled but can't operate as part of a team will likely slow you down. Similarly, someone with only interpersonal skills can't add value to the team. When building a team, looking for the right project management skills is necessary, and finding the right balance is the key.
The biggest giveaway of ineffective leaders is their tendency to keep information on a need-to-know basis. It's such a toxic leadership trait that it has a name of its own: mushroom management. With transparency in your projects, you can reap the maximum benefits from each member's skill set, improve internal accountability, keep progress in check, and do much more.
You can also effectively improve the communication and collaboration of your team
through transparency. We’ll further discuss these in the following sections.
Ideas can come from anyone irrespective of their position. That’s why Agile focuses
on creating a culture of respect where everyone gets a chance to be involved. It’s your
job as a leader to satisfy the doubts of your members and consider their ideas if they
have some potential.
Even when you reject an idea for a genuine reason, you should explain why in an appreciative way and encourage your team members to keep participating actively throughout the project.
Delegating tasks helps you become a better manager as well. If you are always busy
micromanaging the smallest of things, you’ll obviously be unable to focus on the
bigger picture and come up with an efficient way of completing the tasks at hand.
If the conflict is about the approach you should take or any other professional matter,
remind everyone of the true objective, and contain the situation. However, if the
conflict is of personal nature, try to give some space to team members. You must
encourage them to work things out as amicably as possible.
No matter what kind of conflict you face, your leadership skills will surely be tested
in that scenario.
9. Be receptive to feedback
Just like you ask your customers for feedback, you should also take feedback from
your project team to improve your leadership style. While leadership is somewhat a
natural skill, it’s important for you to further hone it and customize it according to
your team’s personality traits.
Some members of your project team might excel when they are free to work in their own style, while others perform better with a little oversight. You can find out more about your team's preferences after spending time with them. Some leaders keep a suggestion box or rely on periodic forms to learn more about their teams and change their style if needed.
Cloud-based software tools like Kissflow Project have everything you need to stay on top of things and manage your project teams with precision, even when you are working remotely.
UNIT TWO
PROJECT SCHEDULING
A schedule, or your project's timetable, consists of the sequenced activities and milestones that need to be delivered within a given period of time.
A project schedule is a mechanism used to communicate which tasks need to be done, which organizational resources will be allocated to those tasks, and in what time frame the work needs to be performed. Effective project scheduling leads to project success, reduced cost, and increased customer satisfaction. Scheduling in project management means listing out the activities, deliverables, and milestones within a project and when they are to be delivered. It contains more detail than your average weekly planner. The most common and important form of project schedule is the Gantt chart.
Scheduling Process:
The manager needs to estimate the time and resources of the project while scheduling it. All activities in the project must be arranged in a coherent sequence, that is, in a logical and well-organized manner that is easy to understand. Initial estimates of the project can be made optimistically, which means estimates are made assuming all favorable things will happen and no threats or problems will arise. The total work is divided into various small activities or tasks during project scheduling. The project manager then decides the time required for each activity or task to be completed. Some activities are even conducted in parallel for efficient performance. The project manager should be aware that no stage of the project is problem-free.
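As a concrete illustration of turning sequenced activities and estimates into start and finish dates, here is a minimal sketch in Python; the task names, durations, and dependencies are invented, and a Gantt chart is essentially a drawing of the values computed by this kind of forward pass.

# Each task: (estimated duration in days, list of tasks it depends on).
tasks = {
    "requirements": (3, []),
    "design":       (5, ["requirements"]),
    "coding":       (10, ["design"]),
    "test plan":    (2, ["requirements"]),   # can run in parallel with design/coding
    "testing":      (4, ["coding", "test plan"]),
}

finish = {}  # earliest finish day for each task

def earliest_finish(name: str) -> int:
    # Earliest finish = latest finish among dependencies + own duration.
    if name not in finish:
        duration, deps = tasks[name]
        start = max((earliest_finish(d) for d in deps), default=0)
        finish[name] = start + duration
    return finish[name]

for name in tasks:
    end = earliest_finish(name)
    print(f"{name:12s} starts day {end - tasks[name][0]:2d}, finishes day {end:2d}")

print("Project duration:", max(finish.values()), "days")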
Problems during the project development stage can arise from the availability of resources such as:
Human effort
Sufficient disk space on server
Specialized hardware
Software technology
Travel allowance required by project staff, etc.
Advantages of project scheduling:
It ensures that everyone remains on the same page regarding task completion, dependencies, and deadlines.
It helps identify issues and concerns, such as a lack or unavailability of resources, early.
It also helps to identify relationships and to monitor progress.
It provides effective budget management and risk mitigation.
UNIT THREE
Software measurement
Software measurement is a quantified attribute of a characteristic of
a software product or the software process.
Internal attributes are those that can be measured purely in terms of the entity itself, that is, the process, product, or resource.
External attributes are those that can be measured only with respect to how the entity relates to its environment. For example: the total number of failures experienced by a user, or the length of time it takes to search the database and retrieve information.
The different attributes that can be measured for each of the entities are as follows −
Processes
Processes are collections of software-related activities. Following are some of the
internal attributes that can be measured directly for a process −
The number of incidents of a specified type arising during the process or one
of its activities
Products
Products are not only the items that the management is committed to deliver but also
any artifact or document produced during the software life cycle.
The different internal product attributes are size, effort, cost, specification, length, functionality, modularity, reuse, redundancy, and syntactic correctness. Among these, size, effort, and cost are relatively easier to measure than the others.
The different external product attributes are usability, integrity, efficiency, testability,
reusability, portability, and interoperability. These attributes describe not only the
code but also the other documents that support the development effort.
Resources
These are entities required by a process activity. It can be any input for the software
production. It includes personnel, materials, tools and methods.
The different internal attributes for the resources are age, price, size, speed, memory
size, temperature, etc. The different external attributes are productivity, experience,
quality, usability, reliability, comfort etc.
A widely used approach for deciding what to measure is the Goal-Question-Metric (GQM) paradigm, which involves deriving, from each goal, the questions that must be answered to determine whether the goal is being met, and then the metrics that answer those questions. To help generate the goals, questions, and metrics, Basili & Rombach provided a series of templates.
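As an illustration of how a GQM tree can be written down, the sketch below refines one goal into questions and the metrics that answer them; the goal, questions, and metrics are invented examples rather than anything prescribed by the paradigm.

# Goal-Question-Metric: one goal, the questions derived from it, and candidate metrics.
gqm = {
    "goal": "Improve the reliability of the released product",
    "questions": {
        "How many defects escape to the field?": [
            "post-release defects per KLOC",
            "customer-reported incidents per month",
        ],
        "Are defects found early in the life cycle?": [
            "percentage of defects found in reviews vs. in testing",
            "average time from defect injection to detection",
        ],
    },
}

for question, metrics in gqm["questions"].items():
    print(question)
    for metric in metrics:
        print("  metric:", metric)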
According to the maturity level of the process, the type of measurement and the
measurement program will be different. Following are the different measurement
programs that can be applied at each of the maturity level.
Level 1: Ad hoc
At this level, the inputs are ill-defined, while the outputs are expected. The transition from input to output is undefined and uncontrolled. For this level of process maturity, baseline measurements are needed to provide a starting point for measuring.
Level 2: Repeatable
At this level, the inputs and outputs of the process, the constraints, and the resources are identifiable.
The input measures can be the size and volatility of the requirements. The output may
be measured in terms of system size, the resources in terms of staff effort, and the
constraints in terms of cost and schedule.
Level 3: Defined
At this level, intermediate activities are defined, and their inputs and outputs are known and understood.
The input to and the output from the intermediate activities can be examined,
measured, and assessed.
Level 4: Managed
At this level, the feedback from the early project activities can be used to set priorities
for the current activities and later for the project activities. We can measure the
effectiveness of the process activities. The measurement reflects the characteristics of
the overall process and of the interaction among and across major activities.
Level 5: Optimizing
At this level, the measures from activities are used to improve the process by
removing and adding process activities and changing the process structure
dynamically in response to measurement feedback. Thus, the process change can
affect the organization and the project as well as the process. The process will act as
sensors and monitors, and we can change the process significantly in response to
warning signs.
At a given maturity level, we can collect the measurements for that level and all levels
below it.
Identifying the Level of Maturity
Process maturity suggests measuring only what is visible. Thus, the combination of process maturity with GQM will provide the most useful measures.
For example, at level 3, intermediate activities are defined with entry and exit criteria for each activity.
The goal and question analysis will be the same, but the metric will vary with
maturity. The more mature the process, the richer will be the measurements. The
GQM paradigm, in concert with the process maturity, has been used as the basis for
several tools that assist managers in designing measurement programs.
GQM helps to understand the need for measuring the attribute, and process maturity
suggests whether we are capable of measuring it in a meaningful way. Together they
provide a context for measurement.
While it's true that software estimation methods have some inherent flaws and pitfalls
due to the limitations of human experience and intuition, a combination of techniques
can help mitigate estimation risks. It increases the chance your teams will make
informed decisions, manage risks, and deliver projects on time and within budget.
Teams should review the product backlog in whatever project management tool they use, whether that be Jira, Asana, Linear, or something else, and have a thoughtful and routine effort estimation process.
You’ll notice that there are pitfalls in every estimation technique. My view is that AI
will soon be ready to help estimate far better. Understanding different ways of
estimating will help us to be critical in how we think about AI-assisted estimation.
More on that later.
1. Planning Poker
In Planning Poker, each team member privately picks an estimate for a backlog item (typically from a deck of Fibonacci-numbered cards) and all cards are revealed at the same time; large differences are discussed and the item is re-estimated until the team converges. It promotes collaboration, tries to avoid anchoring bias, and ensures that every team member's opinion is heard.
Planning Poker is a great technique, but it can take a while, especially if you have a big team or a complex project.
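A minimal sketch of one Planning Poker round, with an assumed card deck and an assumed convergence rule (real teams converge by discussion rather than by formula):

FIBONACCI_CARDS = [1, 2, 3, 5, 8, 13, 21]   # a commonly used deck

def poker_round(votes: dict) -> str:
    # All votes are revealed at once; converge only if the spread is small.
    low, high = min(votes.values()), max(votes.values())
    if high <= 2 * low:                      # assumed convergence rule
        return f"agreed on {high} story points"
    return f"spread too wide ({low}..{high}): outliers explain, then re-vote"

print(poker_round({"Ada": 3, "Bob": 5, "Chen": 5}))   # agreed on 5 story points
print(poker_round({"Ada": 2, "Bob": 13, "Chen": 5}))  # spread too wide: re-vote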
2. Three-Point Method
The Three-Point Method is used to estimate the time required to complete a project by
considering the best-case, worst-case, and most-likely scenarios.
These estimates are usually represented using three values: optimistic, pessimistic,
and realistic. The average of the three values is then used to calculate the overall
estimation.
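For example, a minimal sketch of the simple three-point average (the task and its values are invented):

def three_point_estimate(optimistic: float, most_likely: float, pessimistic: float) -> float:
    # Simple three-point estimate: the plain average of the three scenarios.
    return (optimistic + most_likely + pessimistic) / 3

# A task estimated at 2 days (best case), 4 days (most likely), 9 days (worst case).
print(three_point_estimate(2, 4, 9))   # 5.0 days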
The Three-Point Method is particularly useful when dealing with uncertain tasks or
projects. It helps teams to account for the best and worst-case scenarios and consider
the likelihood of each scenario.
One problem with the Three-Point Method is that it sometimes doesn't yield accurate
estimates for uncertain tasks or projects. It's important to ensure that the team has
enough information about the task or project before starting the estimation process.
Good for: Large teams (though works at any size), complex and uncertain projects.
3. PERT Estimation
Team members identify the tasks required to complete the project and estimate the
expected duration for each task. The team members then identify the dependencies
between the tasks and create a network diagram. The expected duration of the project
is then calculated using a weighted average of the expected duration of each task.
But it can be time-consuming and requires a lot of information about the project.
Good for: Large teams (though works at any size), complex and uncertain projects,
and projects requiring risk management.
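A minimal sketch of the PERT weighted average, which gives the most-likely value four times the weight of the two extremes; the tasks and numbers are invented, and simply summing them assumes the tasks run strictly in sequence.

def pert_estimate(optimistic: float, most_likely: float, pessimistic: float) -> float:
    # PERT (beta) estimate: E = (O + 4M + P) / 6.
    return (optimistic + 4 * most_likely + pessimistic) / 6

# Invented sequential tasks: (optimistic, most likely, pessimistic) durations in days.
tasks = {"design": (3, 5, 10), "build": (8, 12, 20), "test": (2, 4, 9)}
for name, t in tasks.items():
    print(name, round(pert_estimate(*t), 2))
print("expected project duration:", round(sum(pert_estimate(*t) for t in tasks.values()), 2), "days")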
4. Analogous Estimation
Start by identifying similar past projects and estimate the effort required to complete
the current project based on the effort required for the past projects. This process is
particularly useful when dealing with similar projects or tasks.
Analogous estimation should be a quick and easy technique to use as it relies on past
project data. It helps teams to save time and resources by avoiding the need to
estimate each task or project from scratch.
You need to watch out for false positives – some similar-looking tickets may not be
so simple under the hood. Diving deeper into requirements can help.
Good for: Any situation where historical data is available and easy to access and
analyse, unless you have AI help. You'll need strong codebase knowledge (or AI) in
your team to capitalise on this method.
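A minimal sketch of the analogous idea, with an invented project history: pick the most similar past project and scale its recorded effort.

# Invented history: (project name, size in screens, actual effort in person-days).
history = [("billing portal", 12, 60), ("hr portal", 20, 95), ("inventory app", 6, 30)]

def analogous_estimate(new_size: int) -> float:
    # Take effort-per-unit-size from the most similar past project and scale it.
    name, size, effort = min(history, key=lambda p: abs(p[1] - new_size))
    return effort / size * new_size

print(round(analogous_estimate(10)))   # 50: scaled from the closest match (billing portal)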
5. The LSU Method
LSU categorises tasks based on their size and uncertainty. Identify the tasks required
to complete the project and categorise them as large, small, or uncertain.
The effort required for each category is then estimated, and the total effort for the
project is calculated.
The LSU Method is particularly useful when dealing with projects that involve a mix
of small and large tasks with varying degrees of uncertainty. It helps teams to account
for the size and uncertainty of each task and provides a more accurate estimation of
the overall effort required for the project.
One potential problem with the LSU Method is that it may be difficult to categorise
tasks accurately, especially if the team members have different opinions on what
constitutes a large, small, or uncertain task. It's important to ensure that the team has a
shared understanding of the task categorisation before starting the estimation process.
Popular but Problematic Software Estimation Techniques:
1. T-shirt sizing
Its simplicity attracts many agile teams as a method to assign story points, but it's a
magnet for inaccurate estimations. It's just too reliant on subjective opinions rather
than objective data, and too reliant on a system that is open to interpretation – in my
experience projects driven by T-shirt sizing often go sideways unexpectedly.
2. Bucket system
This one's really similar to our T-shirts. It works by dividing the project into buckets
or categories based on the level of effort required. Too much lack of clarity, too much
opinion.
3. Affinity mapping
4. Dot voting
Dot Voting estimates the effort required to complete a project by allowing team
members to vote on their estimation using dots. Same stuff – subjective opinions with
no built-in chance to surface underlying complexity or uncertainty.
AI-powered prediction models leverage vast amounts of data and sophisticated algorithms to generate predictions that are largely free from human bias. In sports, for example, teams that use AI-powered prediction models have a significant competitive advantage over those that rely on human estimation alone.
These models analyse large amounts of historical data, identify patterns and trends
that humans might miss, and generate reliable predictions for everything from player
performance to game outcomes.
As AI-powered prediction models continue to evolve and improve, they will become
an essential project planning tool for any organisation that wants to stay ahead of the
competition. Optimised resource management and task management will rely on AI.
Those that fail to adopt these technologies risk falling behind and losing their
competitive edge. In short, the future belongs to those who embrace AI-powered
prediction models.
UNIT FOUR
Risk analysis
An example risk matrix plots the likelihood of a risk against its impact. Organizations perform risk analysis to:
Anticipate and reduce the effect of harmful results from adverse events.
Evaluate whether the potential risks of a project are balanced by its benefits -- this
aids in the decision-making process when evaluating whether to move forward with
the project.
Plan responses for technology or equipment failure or loss from adverse events,
both natural and human-made.
Identify the impact of and prepare for changes in the enterprise environment,
including the likelihood of new competitors entering the market or changes to
government regulatory policies.
Allocate resources, such as time, money and employees, efficiently where they're
most needed.
The benefits of risk analysis include the following:
Minimize losses. Identifying, rating and comparing the overall impact of risks to the organization, in terms of both financial and organizational impacts, can help management preemptively create a risk plan.
Strengthen security. Identifying potential gaps in security can help organizations
determine the steps they need to take to eliminate the weaknesses and strengthen
security.
Mitigate risks. Putting security controls in place can help organizations mitigate the
most important risks.
Improve resource optimization. Prioritizing risks and allocating resources more
effectively can help organizations address the most significant risks.
Increase awareness. Creating awareness among employees, decision-makers and
stakeholders about security measures and risks by highlighting best practices during
the risk analysis process can aid organizations.
Manage costs. Understanding the financial impact of potential security risks can
help organizations develop cost-effective methods for implementing these
information security policies and procedures.
Risk analysis also has limitations:
Uncertain results. Since risk analysis is probabilistic in nature, it can never provide a precise and correct evaluation of risk exposure and could end up overlooking some risks. For instance, risk analysis is unable to forecast unforeseen, black swan events.
Complexity. Risk analysis is often a complex procedure since detecting and
evaluating all potential dangers requires considering a variety of risk factors.
Time consumption. The preparation, collection and analysis of data for a complete
risk analysis often requires a lot of time and effort.
Overemphasis on analysis. Organizations that place an excessive amount of
emphasis on the analysis might devote too much time assessing risks and not
enough time taking steps to address them. Additionally, it could cause companies to
divert resources from other, more profitable uses.
The risk analysis process generally follows these steps:
1. Identify the risk. The reason for performing a risk assessment is to evaluate an IT system or other aspect of the organization to determine the risks to the software, hardware, data and IT employees. What are the possible adverse events that could occur, such as human error, fire, flooding or earthquakes? What is the potential that the integrity of the system will be compromised or that it won't be available?
2. Perform a risk assessment. Getting input from management and department heads
is critical to the risk assessment process. The risk assessment survey is a way to
begin documenting specific risks or potential threats within each department.
3. Analyze the risks. Once the risks are identified, the risk analysis process should
determine the likelihood that each risk will occur, as well as the consequences linked
to each risk and how they might affect the objectives of a project.
4. Develop a risk management plan. Based on an analysis of which assets are valuable
and which threats might affect those assets negatively, the risk analysis should
produce a risk management plan and control recommendations that can be used to
mitigate, transfer, accept or avoid the risk.
5. Implement the risk management plan. The ultimate goal of risk assessment is to
implement measures to remove or reduce the risks. Starting with the high-risk
elements, resolve or at least mitigate each risk so it's no longer a threat.
6. Monitor the risks. The ongoing process of identifying, treating and managing risks
should be an important part of any risk analysis process.
Qualitative vs. quantitative risk analysis
The two main approaches to risk analysis are qualitative and quantitative. Qualitative risk analysis typically means assessing the likelihood that a risk will occur based on subjective qualities and the impact it could have on an organization, using predefined ranking scales. The impact of risks is often categorized into three levels: low, medium or high. The probability that a risk will occur can be expressed the same way or as a percentage likelihood ranging from 0% to 100%. Quantitative risk analysis, by contrast, assigns numerical values to likelihood and impact, for example a probability and an estimated financial loss, so that risks can be compared in measurable terms.
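A minimal sketch of both views, with invented risks and scales: the qualitative score is a position in a likelihood-by-impact matrix, while the quantitative view multiplies a probability by an estimated loss.

LIKELIHOOD = {"low": 1, "medium": 2, "high": 3}
IMPACT = {"low": 1, "medium": 2, "high": 3}

def qualitative_score(likelihood: str, impact: str) -> int:
    # Position in a 3x3 likelihood-by-impact risk matrix.
    return LIKELIHOOD[likelihood] * IMPACT[impact]

def expected_loss(probability: float, impact_cost: float) -> float:
    # Quantitative view: probability of the event times the estimated loss.
    return probability * impact_cost

# Invented risks, rated on both scales for illustration.
risks = [("key developer leaves", "medium", "high", 0.30, 40_000),
         ("server disk failure",  "low",    "high", 0.05, 80_000),
         ("minor UI defects",     "high",   "low",  0.80,  2_000)]

for name, like, imp, prob, cost in sorted(risks, key=lambda r: -qualitative_score(r[1], r[2])):
    print(f"{name:22s} matrix score={qualitative_score(like, imp)}  expected loss={expected_loss(prob, cost):8.0f}")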
Examples of risk analysis in practice:
After receiving a project proposal for a luxury resort, the owner of the construction company conducted a risk analysis to uncover potential hazards, liabilities and risk mitigation strategies.
A car manufacturing plant performs a risk analysis to examine potential hazards in
the manufacturing process. This analysis pinpoints risks such as equipment failure
and accidents, as well as evaluates their likelihood and potential consequences.
An international shipping project being planned by a transport company involves a
risk analysis for potential project hazards, such as shipping costs, product damage
and delays. According to the study, the project is feasible, so the business decides to
reduce risks by purchasing shipment insurance and increasing its contingency
reserve.
UNIT FIVE
Software Quality Assurance
SQA's ultimate goal is to catch a product's shortcomings and deficiencies before the general public sees it. If mistakes get caught in-house, it means fewer headaches for the development team and far fewer angry customers.
These are the characteristics common to all software quality assurance processes:
Additionally, all software quality assurance programs contain the following ten vital
elements:
However, software quality assurance professionals ensure the product meets all the
company's quality standards and meets the client's expectations and demands. That
process covers more than just bad coding.
The Software Quality Assurance Plan (SQAP) identifies the team's SQA responsibilities, identifies the SQA work products, and lists any areas that require reviewing and auditing. A SQAP typically contains the following sections:
1. Purpose
2. Reference
3. Software configuration management
4. Problem reporting and corrective action
5. Tools, technologies, and methodologies
6. Code control
7. Records: Collection, maintenance, and retention
8. Testing methodology
SQA Techniques
Here are some examples of how quality assurance professionals implement SQA.
Auditing
This technique involves QA professionals inspecting the work to see if all standards
are followed.
Reviewing
In-house and outside stakeholders meet to examine the product, make comments on
what they find, and get approval.
Code Inspection
This technique is a formal code review using static testing to find bugs and defects.
This inspection requires a trained peer or mediator, not the original code author. The
inspection is based on established rules, checklists, and entry and exit criteria.
Design Inspection.
Design inspection employs a checklist that covers the following design areas:
Simulation.
Functional Testing.
This technique is a form of black-box testing where the QA person verifies what the
system does without caring about how it got there.
Walkthroughs.
Walkthroughs are peer reviews where the developer guides development team
members through the product. Members then raise queries, suggest alternatives, and
make comments about possible errors, standard violations, or any possible issues.
Stress Testing.
Nothing shows how good a program is like running it under high-demand conditions.
Six Sigma.
This is a well-respected quality assurance philosophy that strives for nearly perfect products or services. Six Sigma's objective is a process so consistent that it produces only 3.4 defects per million opportunities (about 99.99966% defect-free).
What Are the Benefits of Software Quality
Assurance?
By now, you’re probably coming around to the idea that software quality assurance is
essential. Let’s seal the deal by listing some of its most significant advantages.
It saves money. Errors are costly. If a company releases a flawed application, they
will have to follow it up by releasing fixes, patches, and sometimes even complete
upgrades. These cost money. Furthermore, software companies can lose business
(as in, money!) if they have a reputation for poor quality, buggy software.
It saves time. CrossTalk, the Journal of Defense Software Engineering, reports it
could take up to 150 times longer to fix an error in production than to fix that error
in the design stage.
It prevents breakdowns and similar catastrophes. Taking a cue from the first two
points, breakdowns cost money, are time-consuming, and deny customers access to
the product or service. If there’s anything worse than a program with a few kinks
and bugs in it, it’s an application that ultimately fails.
It boosts consumer confidence. You can spend so much time creating a good
reputation, only to lose it overnight. Conversely, customers will flock to companies
that are known for producing quality releases.
It increases your market share. High-quality software puts your company in a
stronger, more dominant market position.
It cuts maintenance costs. Get the release right the first time, and your company can
forget about it and move on to the next big thing. Release a product with chronic
issues, and your business bogs down in a costly, time-consuming, never-ending cycle
of repairs.
It increases product safety. Product safety might sound more applicable to a physical product like a bike helmet, electrical appliance, or automobile, but "safety" becomes relevant when you factor in cybersecurity. Many applications rely on an Internet connection, and if your product leaves your customers vulnerable to data breaches, the results can be catastrophic.
UNIT SIX
Software Configuration Management
History
The history of software configuration management (SCM) in computing can be traced
back as early as the 1950s, when CM (configuration management), originally for
hardware development and production control, was being applied to software
development. Early software had a physical footprint, such as cards, tapes, and other
media. The first software configuration management was a manual operation. With
the advances in language and complexity, software engineering, involving
configuration management and other methods, became a major concern due to issues
like schedule, budget, and quality. Practical lessons learned over the years led to the definition and establishment of procedures and tools. Eventually, the tools became systems to manage software changes.[4] Industry-wide practices were offered as
solutions, either in an open or proprietary manner (such as Revision Control System).
With the growing use of computers, systems emerged that handled a broader scope,
including requirements management, design alternatives, quality control, and more;
later tools followed the guidelines of organizations, such as the Capability Maturity
Model of the Software Engineering Institute.
The goals of SCM generally include configuration identification, configuration control (change control), configuration status accounting, and configuration auditing.
With the introduction of cloud computing and DevOps, the purposes of SCM tools have become merged in some cases. The SCM tools themselves have become virtual appliances that can be instantiated as virtual machines and saved with state and version. The tools can model and manage cloud-based virtual resources, including virtual appliances, storage units, and software bundles. The roles and responsibilities of the actors have become merged as well, with developers now being able to dynamically instantiate virtual servers and related resources.
Configuration Identification
Configuration identification is a method of determining the scope of the software system; you cannot manage or control something if you have not identified what it is. Each configuration item has a description that contains the CSCI type (Computer Software Configuration Item), a project identifier, and version information. Configuration identification includes:
Identification of configuration items like source code modules, test cases, and requirements specifications.
Identification of each CSCI in the SCM repository, by using an object-oriented
approach
The process starts with basic objects which are grouped into aggregate objects.
Details of what, why, when, and by whom changes are made
Every object has its own features that identify it, and its name is unique among all other objects
List of resources required such as the document, the file, tools, etc.
Example:
Instead of naming a file login.php, it should be named login_v1.2.php, where v1.2 stands for the version number of the file.
Baseline
A baseline is a formally accepted version of a software configuration item. It is
designated and fixed at a specific time while conducting the SCM process. It can only
be changed through formal change control procedures.
Change Control
Change control is a procedural method which ensures quality and consistency when changes are made to a configuration object. In this step, a change request is submitted to the software configuration manager.
Participants in the SCM process include:
1. Configuration Manager
The configuration manager is responsible for identifying configuration items and for managing the overall SCM process.
2. Developer
The developer needs to change the code as per standard development activities or change requests and is responsible for maintaining the configuration of the code. The developer should check in the changes and resolve any conflicts.
3. Auditor
4. Project Manager:
5. User
The end user should understand the key SCM terms to ensure they have the latest version of the software.
The Software Configuration Management Plan (SCMP) can follow a public standard like IEEE 828 or an organization-specific standard.
It defines the types of documents to be managed and a document naming convention, for example Test_v1.
The SCMP defines the person who will be responsible for the entire SCM process and the creation of baselines.
It fixes policies for version management and change control.
It defines the tools which can be used during the SCM process.
It specifies the configuration management database for recording configuration information.
Concurrency Management:
When two or more tasks are happening at the same time, it is known as concurrent operation. Concurrency in the context of SCM means the same file being edited by multiple people at the same time. If concurrency is not managed correctly with SCM tools, it may create many pressing issues.
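A minimal sketch in Python of how a tool can detect the problem, using an invented in-memory repository: each check-in records the version it was based on, and a check-in based on a stale version is rejected so the two edits must be merged instead of silently overwriting each other.

# Invented in-memory "repository": current version and content per file.
repo = {"login.php": {"version": 3, "content": "v3 code"}}

def check_in(path: str, based_on: int, new_content: str) -> str:
    # Accept the change only if it was based on the latest version.
    latest = repo[path]["version"]
    if based_on != latest:
        return f"rejected: {path} is now v{latest}, you edited v{based_on}; merge first"
    repo[path] = {"version": latest + 1, "content": new_content}
    return f"accepted: {path} is now v{latest + 1}"

print(check_in("login.php", based_on=3, new_content="Alice's edit"))  # accepted: now v4
print(check_in("login.php", based_on=3, new_content="Bob's edit"))    # rejected: merge first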
Version Control:
SCM uses an archiving method to save every change made to a file. With the help of this archive, it is possible to roll back to a previous version in case of issues.
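A minimal sketch of the archive-and-rollback idea (not a real SCM tool): every saved change is kept, so any earlier version can be restored.

class VersionedFile:
    # Keeps every saved revision so earlier versions can be restored.
    def __init__(self, content: str):
        self.revisions = [content]

    def save(self, content: str) -> int:
        self.revisions.append(content)
        return len(self.revisions) - 1          # new version number

    def roll_back(self, version: int) -> str:
        # Restore an earlier revision by saving it again as the newest one.
        restored = self.revisions[version]
        self.revisions.append(restored)
        return restored

f = VersionedFile("login v1.0")
f.save("login v1.1")
f.save("login v1.2 (buggy)")
print(f.roll_back(1))   # restores "login v1.1"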
Synchronization:
Users can check out more than one file or an entire copy of the repository. The user then works on the needed files and checks the changes back into the repository. They can synchronize their local copy to stay updated with the changes made by other team members.
Conclusion
Configuration management best practices help organizations to systematically manage, organize, and control changes to documents, code, and other entities during the software development life cycle.
The primary goal of the SCM process is to increase productivity with minimal
mistakes
The main reason for the configuration management process is that multiple people work on software that is continually being updated. SCM helps establish concurrency, synchronization, and version control.
A baseline is a formally accepted version of a software configuration item
Change control is a procedural method which ensures quality and consistency when
changes are made in the configuration object.
Configuration status accounting tracks each release during the SCM process
Software configuration audits verify that the software product satisfies the baseline requirements
Project manager, configuration manager, developer, auditor, and user are participants in the SCM process
The SCM process planning begins at the early phases of a project.
Git, Team Foundation Server, and Ansible are a few popular SCM tools.