CLOUD COMPUTING
UNIT - I
Cloud computing is the delivery of computing services such as servers, storage, databases, networking, software, analytics, intelligence, and more over the Internet ("the cloud").
With cloud computing, a cloud vendor is responsible for hardware purchase and maintenance. The vendor also provides a wide variety of software and platforms as a service. We can take any required services on rent, and the cloud computing services are charged based on usage.
The cloud computing model affords the opportunity to deliver applications via the Internet, avoid the costs of owning and operating data centers, and leverage the work of software developers. Cloud computing services are typically classified by the ownership of the infrastructure (and to whom services are offered) and by the general architecture visible to users (e.g., whether they provide a platform for applications or complete application software solutions as a service). Based on the services provided, cloud computing can be divided into three main models:
Software-as-a-Service (SaaS)
Infrastructure-as-a-Service (IaaS) and
Platform-as-a-Service (PaaS)
These three services make up the Cloud Computing Stack, with SaaS on top, PaaS in the middle,
and IaaS on the bottom.
There are more than three service models in wide use today; additional models such as 'Data Analytics as a Service' and 'HPC/Grid as a Service' are also emerging as useful models. To select the right service model, factors such as the availability of suitable application software, the need for a development and testing environment, the need for effective control and management of the computing infrastructure, the required distribution of data, services, and infrastructure, and the existence and complexity of the enterprise IT infrastructure and datacenter/warehouse must all be considered.
Cloud computing can also be categorized based on deployment models. These classifications are based on the ability of an organization to manage business needs and secure assets:
1. Public cloud
2. Private cloud
3. Hybrid cloud
Many of us use cloud computing every day without realizing it. When you ask Google for an answer, your computer or laptop plays only a small part in finding the answer you need.
The words in your query are sent to one of hundreds of thousands of clustered PCs managed by Google, and upon a successful search, you get the answer. The entire process happens in a split second. When you do a Google search, the real work of finding your answer might be done by a computer/server sitting in California, Dublin, Tokyo, or New Zealand.
Web-based email is one of the most common examples; Hotmail was one of the earliest services that carried email off into the cloud.
Preparing documents over the Net is a relatively newer example of cloud computing. Log on to a
web-based service such as Google Documents, Google Forms, etc. and you can create a
document, spreadsheet, presentation, or whatever you like using Web-based software. The
document you produce is stored remotely, on a Web server, so you can access it from any
Internet-connected computer, anywhere in the world, any time you like, and download when you
want it.
When you use your computer for a web-based service like this, you are outsourcing your
computing needs to companies like Google. Google invests in software development and keeps it
up-to-date. They generate revenue by offering a host of paid services and through advertising as
well.
Software as a Service (SaaS) - This means using a complete application running on someone else's system over the Internet; web-based email and Google Documents are well-known examples.
Infrastructure as a Service (IaaS) - This means buying access to raw computing hardware over the Net, such as servers or storage. Web hosting like GoDaddy is a simple example of IaaS: you pay a monthly subscription or a per-megabyte/gigabyte fee to have a hosting company serve up files for your website from their servers.
Platform as a Service (PaaS) - In this model, you develop your own web-based applications so that they run on the systems software and hardware of the cloud provider. A typical example is an e-commerce website where the shopping cart, checkout, and payment mechanism run on a merchant's platform; Big Basket, Flipkart, etc. are built on such platforms.
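To make the IaaS idea concrete, here is a minimal sketch of renting a virtual server programmatically. It is written in Python with the boto3 library against AWS EC2 purely for illustration; the region, machine image ID, and key pair name are hypothetical placeholders, and any IaaS provider's API could be used instead.

# Illustrative IaaS sketch: rent a virtual server, then release it so billing stops.
# Assumes AWS credentials are already configured and boto3 is installed (pip install boto3).
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")       # hypothetical region

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",                      # hypothetical machine image ID
    InstanceType="t3.micro",                              # small pay-per-use server size
    MinCount=1,
    MaxCount=1,
    KeyName="demo-keypair",                               # hypothetical SSH key pair
)
instance_id = response["Instances"][0]["InstanceId"]
print("Rented a virtual server:", instance_id)

# When the workload is finished, terminate the instance; usage-based charging stops here.
ec2.terminate_instances(InstanceIds=[instance_id])

The same pay-per-use pattern applies whether the rented resource is a server, a storage volume, or a managed database.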
Public cloud is the classic cloud computing model. In this model, users can access a large pool of computing power over the internet (whether that is IaaS, PaaS, or SaaS). One of the significant benefits here is the ability to rapidly scale a service. Cloud computing suppliers have vast amounts of computing power, which they share out between a large number of customers -- the 'multi-tenant' architecture. Their huge scale means they have enough spare capacity to easily cope with any customer needing more resources. This is why the public cloud is often used for less-sensitive applications that demand a varying amount of resources.
Private cloud allows organizations to benefit from some of the advantages of public cloud -- but
without the concerns about relinquishing control over data and services because it is tucked away
behind the corporate firewall. Companies can control exactly where their data is being held and
can build the infrastructure in a way they want -- largely for IaaS or PaaS projects -- to give
developers access to a pool of computing power that scales on-demand without putting security
at risk. This additional security comes at a cost, as few companies will have the scale of AWS,
Microsoft, or Google. This means they will not be able to create the same economies of scale to
get costs down. Still, for companies requiring additional security, a private cloud may be a useful
stepping stone, helping them to understand cloud services or rebuild internal applications for the
cloud, before shifting them into the public cloud.
A hybrid cloud is where, in reality, most organizations are: a bit of this, a bit of that. Here, companies can store some data in the public cloud, keep a few projects in the private cloud, manage multiple vendors, and use different levels of cloud services. According to research by TechRepublic, the main reasons for choosing a hybrid cloud include disaster recovery planning and the desire to avoid hardware costs when expanding an existing data center.
Cloud computing is based on the following key technologies:
o Virtualization
o Service-Oriented Architecture (SOA)
o Grid Computing
o Utility Computing
Virtualization :
Virtualization is the process of creating a virtual environment to run multiple applications and operating systems on the same server. The virtual environment can be anything, such as a single instance or a combination of many operating systems, storage devices, network application servers, and other environments.
The concept of virtualization in cloud computing increases the use of virtual machines. A virtual machine is a software computer or software program that works like a physical computer and can perform tasks such as running applications or programs as per our requirement.
Service-Oriented Architecture (SOA) :
Service-Oriented Architecture allows organizations to access on-demand cloud-based computing solutions according to the change of business needs. It can work with or without cloud computing. The advantages of using SOA are that it is easy to maintain, platform independent, and highly scalable.
Service Provider and Service Consumer are the two major roles within SOA.
o In the air force, SOA infrastructure is used to deploy situational awareness systems.
Grid Computing :
Grid computing is also known as distributed computing. It combines various computing resources from multiple locations to achieve a common goal. In grid computing, the grid is connected by parallel nodes to form a computer cluster. These computer clusters can be of different sizes and can run on any operating system.
Grid computing mainly contains the following three types of machines:
1. Control node: A group of servers which administrates the whole network.
2. Provider: A computer which contributes its resources to the network resource pool.
3. User: A computer which uses the resources on the network.
Mainly, grid computing is used in ATMs, back-end infrastructures, and marketing research.
Utility Computing :
Utility computing is one of the most trending IT service models. It provides on-demand computing resources (computation, storage, and programming services via API) and infrastructure based on the pay-per-use method. It minimizes the associated costs and maximizes the efficient use of resources. The advantages of utility computing are that it reduces IT costs, provides greater flexibility, and is easier to manage.
Large organizations such as Google and Amazon have established their own utility services for computing, storage, and applications.
Cloud-based services provide information technology (IT) as a service over the Internet or a dedicated network, with delivery on demand and payment based on usage. Cloud-based services range from full applications and development platforms to servers, storage, and virtual desktops.
Multi-tenancy - resources are pooled and shared among multiple users to gain economies of
scale
Network-access - resources are accessed via web-browser or thin client using a variety of
configurations
Cloud service providers provide various applications in the fields of art, business, data storage and backup, education, entertainment, management, and social networking.
The most widely used cloud computing applications are given below -
1. Art Applications
Cloud computing offers various art applications for quickly and easily designing attractive cards, booklets, and images. Some of the most commonly used cloud art applications are given below:
i. Moo
Moo is one of the best cloud art applications. It is used for designing and printing business cards, postcards, and mini cards.
ii. Vistaprint
Vistaprint allows us to easily design various printed marketing products such as business cards, postcards, booklets, and wedding invitation cards.
iii. Adobe Creative Cloud
Adobe Creative Cloud is made for designers, artists, filmmakers, and other creative professionals.
2. Business Applications
Business applications are based on cloud service providers. Today, every organization requires the cloud business application to grow its business. It also ensures that business applications are 24*7 available to users.
i. MailChimp
MailChimp is an email publishing platform which provides various options to design, send, and save templates for emails.
iii. Salesforce
The Salesforce platform provides tools for sales, service, marketing, e-commerce, and more. It also provides a cloud development platform.
iv. Chatter
Chatter helps us to share important information about the organization in real time.
v. Bitrix24
Bitrix24 is a collaboration platform which provides communication, management, and social collaboration tools.
vi. Paypal
Paypal offers the simplest and easiest online payment mode using a secure internet account.
Paypal accepts the payment through debit cards, credit cards, and also from Paypal account
holders.
vii. Slack
Slack stands for Searchable Log of all Conversation and Knowledge. It provides a user-
friendly interface that helps us to create public and private channels for communication.
viii. Quickbooks
Quickbooks works on the terminology "Run Enterprise anytime, anywhere, on any device." It provides online accounting solutions for businesses and allows more than 20 users to work simultaneously on the same system.
3. Data Storage and Backup Applications
Cloud computing allows us to store information (data, files, images, audio, and videos) on the cloud and access this information using an internet connection. As the cloud provider is responsible for providing security, it offers various backup and recovery applications for retrieving lost data.
A list of data storage and backup applications in the cloud is given below -
i. Box.com
Box provides an online environment for secure content management, workflow, and collaboration. It allows us to store different files such as Excel, Word, PDF, and images on the cloud. The main advantage of using Box is that it provides a drag & drop service for files and easily integrates with Office 365, G Suite, Salesforce, and more than 1400 other tools.
ii. Mozy
Mozy provides powerful online backup solutions for our personal and business data. It schedules an automatic backup each day at a specific time.
iii. Joukuu
Joukuu provides the simplest way to share and track cloud-based backup files. Many users use Joukuu to search for files and folders and to collaborate on documents.
iv. Google G Suite
Google G Suite is one of the best cloud storage and backup applications. It includes Google
Calendar, Docs, Forms, Google+, Hangouts, as well as cloud storage and tools for managing
cloud apps. The most popular app in the Google G Suite is Gmail. Gmail offers free email
services to users.
4. Education Applications
Cloud computing in the education sector has become very popular. It offers various online distance learning platforms and student information portals to students. The advantage of using the cloud in the field of education is that it offers strong virtual classroom environments, ease of accessibility, secure data storage, scalability, greater reach for students, and minimal hardware requirements for students.
Some widely used cloud-based education platforms are given below:
i. Google Apps for Education
Google Apps for Education is the most widely used platform for free web-based email, calendar, documents, and collaborative study.
ii. Chromebooks for Education
Chromebook for Education is one of Google's most important projects. It is designed with the purpose of enhancing education innovation.
iii. Tablets with Google Play for Education
It allows educators to quickly implement the latest technology solutions into the classroom and make them available to their students.
iv. AWS in Education
AWS cloud provides an education-friendly environment to universities, community colleges, and schools.
5. Entertainment Applications
Entertainment industries use a multi-cloud strategy to interact with the target audience. Cloud
computing offers various entertainment applications such as online games and video
conferencing.
i. Online games
Today, cloud gaming has become one of the most important entertainment media. It offers various online games that run remotely from the cloud. Popular cloud gaming services include Shadow, GeForce Now, Vortex, Project xCloud, and PlayStation Now.
ii. Video Conferencing Apps
Video conferencing apps provide a simple and instant connected experience. They allow us to communicate with our business partners, friends, and relatives using cloud-based video conferencing. The benefits of using video conferencing are that it reduces cost, increases efficiency, and saves travel time.
6. Management Applications :
Cloud computing offers various cloud management tools which help admins to manage all
types of cloud activities, such as resource deployment, data integration, and disaster recovery.
These management tools also provide administrative control over the platforms, applications,
and infrastructure.
i. Toggl
Toggl helps users to track allocated time period for a particular project.
ii. Evernote
Evernote allows you to sync and save your recorded notes, typed notes, and other notes in one
convenient place. It is available for both free as well as a paid version.
It is available on platforms such as Windows, macOS, Android, iOS, web browsers, and Unix.
iii. Outright
Outright is used by management users for accounting purposes. It helps to track income, expenses, profits, and losses in real time.
iv. GoToMeeting
GoToMeeting provides video conferencing and online meeting apps, which allow you to start a meeting with your business partners at any time, from anywhere, using mobile phones or tablets.
Using GoToMeeting app, you can perform the tasks related to the management such as join
meetings in seconds, view presentations on the shared screen, get alerts for upcoming
meetings, etc.
7. Social Applications
Social cloud applications allow a large number of users to connect with each other using social
i. Facebook
Facebook is a social networking website which allows active users to share files, photos, videos, and status updates with their friends, relatives, and business partners using the cloud storage system. On Facebook, we always get notifications when our friends like or comment on our posts.
ii. Twitter
Twitter is a social networking site. It is a microblogging system. It allows users to follow high
profile celebrities, friends, relatives, and receive news. It sends and receives short posts called
tweets.
iii. Yammer
Yammer is one of the best team collaboration tools; it allows a team of employees to chat and share images, documents, and videos.
iv. LinkedIn
LinkedIn is a social network for students, freshers, and professionals.
Virtualization :
Types of Virtualization
i. Hardware virtualization
ii. Operating system virtualization
iii. Server virtualization
iv. Storage virtualization
1) Hardware Virtualization:
When the virtual machine software or virtual machine manager (VMM) is directly installed on the hardware system, it is known as hardware virtualization.
The main job of the hypervisor is to control and monitor the processor, memory, and other hardware resources.
After virtualization of the hardware system, we can install different operating systems on it and run different applications on those operating systems.
Usage:
Hardware virtualization is mainly done for the server platforms, because controlling virtual
machines is much easier than controlling a physical server.
2) Operating System Virtualization:
When the virtual machine software or virtual machine manager (VMM) is installed on the host operating system instead of directly on the hardware system, it is known as operating system virtualization.
Usage:
Operating System Virtualization is mainly used for testing the applications on different platforms
of OS.
3) Server Virtualization:
When the virtual machine software or virtual machine manager (VMM) is directly installed on the server system, it is known as server virtualization.
Usage:
Server virtualization is done because a single physical server can be divided into multiple servers on demand and for balancing the load.
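As a small illustration of working with virtual machines on a virtualized server, the sketch below lists the VMs known to a KVM/QEMU hypervisor using the libvirt Python bindings. This is only a sketch; it assumes the libvirt-python package is installed and a local qemu hypervisor is running.

# Minimal sketch: ask a hypervisor which virtual machines it is hosting.
import libvirt

conn = libvirt.open("qemu:///system")        # connect to the local QEMU/KVM hypervisor
try:
    for dom in conn.listAllDomains():        # each "domain" is one virtual machine
        state, _reason = dom.state()
        running = "running" if state == libvirt.VIR_DOMAIN_RUNNING else "not running"
        print(f"VM {dom.name()}: {running}, {dom.maxMemory() // 1024} MB RAM")
finally:
    conn.close()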
4) Storage Virtualization:
Storage virtualization is the process of grouping the physical storage from multiple network
storage devices so that it looks like a single storage device.
Usage:
Storage virtualization is mainly used for back-up and recovery purposes.
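The toy sketch below illustrates the idea behind storage virtualization: several separate backing stores are grouped behind one interface, so callers see a single logical storage device and never need to know which physical device holds their data. The class name and directory paths are hypothetical, and real storage virtualization would also handle striping, replication, and failures.

# Toy illustration of storage virtualization: many physical stores, one logical view.
import os

class VirtualStore:
    def __init__(self, backing_dirs):
        self.backing_dirs = backing_dirs      # stand-ins for physical storage devices
        self.catalog = {}                     # logical file name -> physical location

    def write(self, name, data: bytes):
        # Place the file on whichever backing store currently holds the fewest files.
        target = min(self.backing_dirs, key=lambda d: len(os.listdir(d)))
        path = os.path.join(target, name)
        with open(path, "wb") as f:
            f.write(data)
        self.catalog[name] = path             # the caller never sees which device was used

    def read(self, name) -> bytes:
        with open(self.catalog[name], "rb") as f:
            return f.read()

# Usage (hypothetical mount points standing in for two physical disks):
# store = VirtualStore(["/mnt/disk1", "/mnt/disk2"])
# store.write("report.pdf", b"...")
# data = store.read("report.pdf")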
Load Balancing :
Load balancing is the method that allows you to have a proper balance of the amount of work being done on different pieces of device or hardware equipment. Typically, what happens is that the load of the devices is balanced between different servers or between the CPU and hard drives.
Load balancing was introduced for various reasons. One of them is to improve the speed and performance of each single device, and the other is to protect individual devices from hitting their limits.
Cloud load balancing is defined as dividing workload and computing properties in cloud computing. It enables enterprises to manage workload demands or application demands by distributing resources among multiple computers, networks, or servers. Cloud load balancing involves managing the movement of workload traffic and demands over the Internet.
Load balancing solves the problem of overloading of servers, mainly for popular web servers. There are two primary solutions to overcome the problem of overloading on the servers. The first is a single-server solution, in which the server is upgraded to a higher-performance server; however, the new server may also be overloaded soon, demanding another upgrade, and the upgrading process is arduous and expensive. The second is a multiple-server solution, in which a scalable service system on a cluster of servers is built. That's why it is more cost-effective and more scalable to build a server cluster system for network services.
Cloud-based servers can achieve more precise scalability and availability by using server farm load balancing. Load balancing is beneficial with almost any type of service, such as HTTP, SMTP, DNS, FTP, and POP/IMAP.
1. Static Algorithm
Static algorithms are built for systems with very little variation in load. The entire traffic is
divided equally between the servers in the static algorithm. This algorithm requires in-depth
knowledge of server resources for better performance of the processor, which is determined at the beginning of the implementation.
2. Dynamic Algorithm
The dynamic algorithm first finds the lightest server in the entire network and gives it priority
for load balancing. This requires real-time communication with the network, which can help increase the system's traffic. Here, the current state of the system is used to control the load.
The characteristic of dynamic algorithms is to make load-transfer decisions based on the current system state. In this system, processes can move from a highly used machine to an underutilized machine in real time.
3. Round Robin Algorithm
As the name suggests, the round robin load balancing algorithm uses the round-robin method to assign jobs. First, it randomly selects the first node and then assigns tasks to the other nodes in a round-robin manner. Processors assign each process circularly without defining any priority. It gives a fast response in the case of a uniform workload distribution among the processes.
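The short sketch below contrasts the two approaches described above: a static round-robin dispatcher that cycles through servers in a fixed order, and a dynamic dispatcher that always picks the currently least-loaded server. The server names and load figures are invented for illustration.

# Hedged sketch: static (round-robin) vs. dynamic (least-loaded) load balancing.
import itertools

servers = ["server-a", "server-b", "server-c"]          # hypothetical back-end servers
current_load = {"server-a": 2, "server-b": 0, "server-c": 5}

round_robin = itertools.cycle(servers)                  # static: equal shares, ignores load

def assign_static(request_id):
    return next(round_robin)

def assign_dynamic(request_id):
    # Dynamic: inspect the current state of the system and pick the lightest server.
    target = min(current_load, key=current_load.get)
    current_load[target] += 1                           # the chosen server gets busier
    return target

for req in range(4):
    print(req, "static ->", assign_static(req), "| dynamic ->", assign_dynamic(req))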
Cloud Elasticity :
Elasticity refers to the ability of a cloud to automatically expand or compress the infrastructural resources on a sudden up-and-down in the requirement, so that the workload can be managed efficiently. This elasticity helps to minimize infrastructural cost. It is not applicable to all kinds of environments; it is helpful only in those scenarios where resource requirements fluctuate up and down suddenly for a specific time interval. It is not quite practical to use where a persistent resource infrastructure is required to handle a heavy workload.
It is most commonly used in pay-per-use public cloud services, where IT managers are willing to pay only for the duration for which they consumed the resources.
Cloud Scalability :
Cloud scalability is used to handle the growing workload where good performance is also needed to work efficiently with software or applications. Scalability is commonly used where the persistent deployment of resources is required to handle the workload statically.
Types of Scalability :
1. Vertical Scalability (Scale-up) –
In this type of scalability, we increase the power of existing resources in the working environment in an upward direction.
2. Horizontal Scalability (Scale-out) –
In this kind of scaling, resources are added in a horizontal row, i.e., more machines are added to the existing pool.
3. Diagonal Scalability –
It is a mixture of both horizontal and vertical scalability, where the resources are added both vertically and horizontally.
Cloud Elasticity: It is short-term planning, adopted just to deal with an unexpected increase in demand or seasonal demand.
Cloud Scalability: It is long-term planning, adopted just to deal with an expected increase in demand.
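A minimal sketch of the elasticity idea is shown below: a control loop watches a utilization metric and expands or shrinks the fleet when thresholds are crossed. The thresholds and the get_average_cpu/set_instance_count helpers are hypothetical stand-ins for a real provider's monitoring and auto-scaling APIs.

# Hedged sketch of an elastic scaling loop; all helpers are hypothetical stand-ins.
import random
import time

def get_average_cpu():
    return random.uniform(10, 95)          # pretend metric from a monitoring service

def set_instance_count(n):
    print(f"scaling fleet to {n} instance(s)")

instances, MIN_INSTANCES, MAX_INSTANCES = 2, 1, 10

for _ in range(5):                         # a real control loop would run continuously
    cpu = get_average_cpu()
    if cpu > 80 and instances < MAX_INSTANCES:      # sudden spike: expand
        instances += 1
        set_instance_count(instances)
    elif cpu < 30 and instances > MIN_INSTANCES:    # demand drops: shrink to save cost
        instances -= 1
        set_instance_count(instances)
    time.sleep(0.1)                        # in practice, poll every few minutes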
Today, organizations have many exciting opportunities to reimagine, repurpose and reinvent
their businesses with the cloud. The last decade has seen even more businesses rely on it for
quicker time to market, better efficiency, and scalability. It helps them achieve long-term digital
goals as part of their digital strategy.
The answer to which cloud model is an ideal fit for a business depends on your organization's computing and business needs. Choosing the right one from the various types of
cloud service deployment models is essential. It would ensure your business is equipped with the
performance, scalability, privacy, security, compliance & cost-effectiveness it requires. It is
important to learn and explore what different deployment types can offer - around what
particular problems it can solve.
Read on as we cover the various cloud computing deployment and service models to help
discover the best choice for your business.
It works as your virtual computing environment with a choice of deployment model depending
on how much data you want to store and who has access to the Infrastructure.
Most cloud hubs have tens of thousands of servers and storage devices to enable fast loading. It
is often possible to choose a geographic area to put the data "closer" to users. Thus, deployment
models for cloud computing are categorized based on their location. To know which model
would best fit the requirements of your organization, let us first learn about the various types.
Public Cloud
The name says it all. It is accessible to the public. Public deployment models in the cloud are
perfect for organizations with growing and fluctuating demands. It also makes a great choice for
companies with low-security concerns. Thus, you pay a cloud service provider for networking
services, compute virtualization & storage available on the public internet. It is also a great delivery model for teams with development and testing needs. Its configuration and deployment are quick and easy, making it an ideal choice for test environments.
o Minimal Investment - As a pay-per-use service, there is no large upfront cost, and it is ideal for businesses that need quick access to resources.
o No Hardware Setup - The cloud service providers fully fund the entire Infrastructure
o No Infrastructure Management - This does not require an in-house team to utilize the
public cloud.
o Data Security and Privacy Concerns - Since it is accessible to all, it does not fully protect
against cyber-attacks and could lead to vulnerabilities.
o Reliability Issues - Since the same server network is open to a wide range of users, it can
lead to malfunction and outages
o Service/License Limitation - While there are many resources you can exchange with
tenants, there is a usage cap.
Private Cloud
Now that you understand what the public cloud could offer you, of course, you are keen to know
what a private cloud can do. Companies that look for cost efficiency and greater control over
data & resources will find the private cloud a more suitable choice.
It means that it will be integrated with your data center and managed by your IT team.
Alternatively, you can also choose to host it externally. The private cloud offers bigger
opportunities that help meet specific organizations' requirements when it comes to customization.
It's also a wise choice for mission-critical processes that may have frequently changing
requirements.
o Data Privacy - It is ideal for storing corporate data where only authorized personnel get access.
o Security - Segmentation of resources within the same Infrastructure can help with better
access and higher levels of security.
o Supports Legacy Systems - This model supports legacy systems that cannot access the
public cloud.
o Higher Cost - With the benefits you get, the investment will also be larger than the public
cloud. Here, you will pay for software, hardware, and resources for staff and training.
o Fixed Scalability - The hardware you choose will accordingly help you scale in a certain
direction
o High Maintenance - Since it is managed in-house, the maintenance costs also increase.
Community Cloud
The community cloud operates in a way that is similar to the public cloud. There's just one
difference - it allows access to only a specific set of users who share common objectives and use
cases. This type of deployment model of cloud computing is managed and hosted internally or by
a third-party vendor. However, you can also choose a combination of all three.
o Smaller Investment - A community cloud is much cheaper than the private & public
cloud and provides great performance
o Setup Benefits - The protocols and configuration of a community cloud must align with
industry standards, allowing customers to work much more efficiently.
Hybrid Cloud
As the name suggests, a hybrid cloud is a combination of two or more cloud architectures. While
each model in the hybrid cloud functions differently, it is all part of the same architecture.
Further, as part of this deployment of the cloud computing model, the internal or external
providers can offer resources.
Let's understand the hybrid model better. A company with critical data will prefer storing on a
private cloud, while less sensitive data can be stored on a public cloud. The hybrid cloud is also
frequently used for 'cloud bursting': if an organization runs an application on-premises but experiences a heavy load, it can burst into the public cloud for extra capacity.
o Cost-Effectiveness - The overall cost of a hybrid solution decreases since it majorly uses
the public cloud to store data.
o Security - Since data is properly segmented, the chances of data theft from attackers are
significantly reduced.
o Flexibility - With higher levels of flexibility, businesses can create custom solutions that
fit their exact requirements
What is Replication?
Data replication is the process by which data residing on a physical/virtual server(s) or cloud
instance (primary instance) is continuously replicated or copied to a secondary server(s) or cloud
instance (standby instance). Organizations replicate data to support high availability, backup,
and/or disaster recovery. Depending on the location of the secondary instance, data is either
synchronously or asynchronously replicated. How the data is replicated impacts Recovery Time Objectives (RTOs) and Recovery Point Objectives (RPOs).
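To make the distinction concrete (with illustrative numbers only): if a standby copy is refreshed asynchronously every 15 minutes, then up to 15 minutes of recent transactions could be lost after a failure, so the RPO is roughly 15 minutes; the RTO is the separate question of how long it takes to bring the standby into service, for example 30 minutes of failover and verification.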
For example, if you need to recover from a system failure, your standby instance should be on
your local area network (LAN). For critical database applications, you can then replicate data
synchronously from the primary instance across the LAN to the secondary instance. This makes
your standby instance “hot” and in sync with your active instance, so it is ready to take over
immediately in the event of a failure. This is referred to as high availability (HA).
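The toy sketch below contrasts synchronous and asynchronous replication as described above. The primary and standby stores are plain in-memory dictionaries used only as stand-ins; in a real deployment the replication call would cross the LAN or WAN, which is exactly what makes the synchronous variant slower but keeps the standby hot.

# Toy contrast of synchronous vs. asynchronous replication (in-memory stand-ins).
import queue
import threading

primary, standby = {}, {}
replication_log = queue.Queue()

def write_synchronous(key, value):
    primary[key] = value
    standby[key] = value            # acknowledged only after the copy lands: standby stays in sync (low RPO)
    return "ack"

def write_asynchronous(key, value):
    primary[key] = value
    replication_log.put((key, value))   # copy happens later in the background
    return "ack"                        # faster acknowledgement, but the standby may lag (higher RPO)

def replicator():
    while True:
        key, value = replication_log.get()
        standby[key] = value
        replication_log.task_done()

threading.Thread(target=replicator, daemon=True).start()
write_synchronous("order-1", "paid")
write_asynchronous("order-2", "shipped")
replication_log.join()                  # wait for the background copy to drain
print("primary:", primary, "standby:", standby)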
Cloud monitoring :
Cloud monitoring is a method of reviewing, observing, and managing the operational workflow
in a cloud-based IT infrastructure. Manual or automated management techniques confirm the
availability and performance of websites, servers, applications, and other cloud infrastructure.
This continuous evaluation of resource levels, server response times, and speed predicts possible
vulnerability to future issues before they arise.
The cloud has numerous moving components, and for top performance, it’s critical to safeguard
that everything comes together seamlessly. This need has led to a variety of monitoring
techniques to fit the type of outcome that a user wants. The main types of cloud monitoring are:
Database monitoring :
Because most cloud applications rely on databases, this technique reviews processes, queries,
availability, and consumption of cloud database resources. This technique can also track queries
and data integrity, monitoring connections to show real-time usage data. For security purposes,
access requests can be tracked as well. For example, an uptime detector can alert if there’s
database instability and can help improve resolution response time from the precise moment that
a database goes down.
Website monitoring :
A website is a set of files stored on a web server, which sends those files to other computers over a network. This monitoring technique tracks processes, traffic, availability, and
resource utilization of cloud-hosted sites.
Cloud storage monitoring :
This technique tracks multiple analytics simultaneously, monitoring storage resources and
processes that are provisioned to virtual machines, services, databases, and applications. This
technique is often used to host infrastructure-as-a-service (IaaS) and software-as-a-service (SaaS)
solutions. For these applications, you can configure monitoring to track performance metrics,
processes, users, databases, and available storage. It provides data to help you focus on useful
features or to fix bugs that disrupt functionality.
A key benefit of cloud monitoring is that scaling for increased activity is seamless and works in organizations of any size.
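In the spirit of the techniques above, the sketch below is a very small uptime-style monitor: it polls a few endpoints over HTTP and reports the response time or the failure reason. The URLs are hypothetical examples; a real cloud monitoring service would also collect resource metrics, keep history, and raise alerts.

# Minimal uptime/response-time monitor sketch (the URLs are hypothetical examples).
import time
import urllib.request

ENDPOINTS = [
    "https://example.com/",            # a cloud-hosted website
    "https://example.com/health",      # a hypothetical application health-check endpoint
]

def check(url, timeout=5):
    start = time.time()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status, round((time.time() - start) * 1000)
    except Exception as exc:           # DNS failure, connection error, timeout, HTTP error, ...
        return None, str(exc)

for url in ENDPOINTS:
    status, detail = check(url)
    if status == 200:
        print(f"UP   {url} ({detail} ms)")
    else:
        print(f"DOWN {url} ({detail})")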
Software-defined networking (SDN) is the decoupling of the network control logic from the
devices performing the function, such as routers, which control the movement of information in
the underlying network. This approach simplifies the management of infrastructure, which may
be specific to one organization or partitioned to be shared among several.
SDN features controllers that overlay above the network hardware in the cloud or on-premises, offering policy-based management. Technically speaking, the network control plane is separated from the forwarding plane (the data plane, or underlying infrastructure), enabling the organization to program network control directly. This differs significantly from traditional data center environments. In a traditional environment, a router or switch, whether in the cloud or physically in the data center, is only aware of the status of the network devices adjacent to it. With SDN, the intelligence is centralized: the controller can view and control the entire network.
The components of software-defined networking
Software-defined networking (SDN) consists of two main components that may or may not be
located in the same physical area:
Applications that relay information about the network or requests for specific resource
availability or allocation.
SDN controllers that communicate with the applications to determine the destination of
data packets. The controllers are the load balancers within SDN.
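To illustrate the separation of the control plane from the forwarding plane, the toy model below has one central controller install forwarding rules into the flow tables of two switches, which then only match and forward. The switch names, host address, and port numbers are invented, and no real SDN protocol (such as OpenFlow) is implemented.

# Toy model of SDN: a central controller programs simple forwarding devices.

class Switch:
    """Forwarding plane: only matches packets against its local flow table."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}                       # destination -> output port

    def forward(self, dst):
        port = self.flow_table.get(dst, "flood")   # unknown destinations are flooded
        print(f"{self.name}: packet to {dst} -> port {port}")

class Controller:
    """Control plane: holds the global view and installs rules based on policy."""
    def __init__(self, switches):
        self.switches = switches

    def install_policy(self, dst, routes):
        # routes maps each switch name to the output port for this destination
        for sw in self.switches:
            sw.flow_table[dst] = routes[sw.name]

s1, s2 = Switch("s1"), Switch("s2")
controller = Controller([s1, s2])
controller.install_policy("10.0.0.7", {"s1": 2, "s2": 1})   # hypothetical host and ports
s1.forward("10.0.0.7")
s2.forward("10.0.0.7")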
The term “Virtual Network” is sometimes erroneously used synonymously with the term SDN.
These two concepts are distinctly different, but they do work well together.
Network functions virtualization (NFV) is the replacement of network appliance hardware with
virtual machines. The virtual machines use a hypervisor to run networking software and
processes such as routing and load balancing.
NFV allows for the separation of communication services from dedicated hardware, such as
routers and firewalls. This separation means network operations can provide new services
dynamically and without installing new hardware. Deploying network components with network functions virtualization takes hours instead of the months required with traditional networking. Additional reasons to use network functions virtualization include lower hardware costs, pay-as-you-go licensing, and the ability to scale services up or down on demand.
While NFV separates networking services from dedicated hardware appliances, software-defined
networking, or SDN, separates the network control functions such as routing, policy definition
and applications from network forwarding functions. With SDN, a virtual network control plane
decides where to send traffic, enabling entire networks to be programmed through one pane of
glass. SDN allows network control functions to be automated, which makes it possible for the
network to respond quickly to dynamic workloads. A software-defined network can sit on top of
either a virtual network or a physical network, but a virtual network does not require SDN to
operate. Both SDN and NFV rely on virtualization technology to function.
IAM is a cloud service that controls the permissions and access for users and cloud resources.
IAM policies are sets of permission policies that can be attached to either users or cloud
resources to authorize what they access and what they can do with it.
The concept “identity is the new perimeter” goes as far back as the ancient times of 2012, when
AWS first announced their IAM service. We’re now seeing a renewed focus on IAM due to the
rise of abstracted cloud services and the recent wave of high-profile data breaches.
Services that don’t expose any underlying infrastructure rely heavily on IAM for security. For
example, consider an application that follows this flow: a Simple Notification Service (SNS)
topic triggers a Lambda function, which in turn puts an item in a DynamoDB table. In this type
of application, there is no network to inspect, so identity and permissions become the most
significant aspects of security.
As an example of the impact of a strict (or over-permissive) IAM profile, let’s consider the
Lambda function. The function is only supposed to put items in the DynamoDB table. What
happens if the function has full DynamoDB permissions? If the function is compromised for
whatever reason, the DynamoDB table is immediately compromised as well, since the function
could be leveraged to exfiltrate data.
If the IAM profile follows the “least-privilege” principle and only allows the function to put
items in the table, the blast radius will be greatly reduced in the case of an incident. A hands-on
example of this can be found in this CNCF webinar.
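As an illustration of the least-privilege idea, the snippet below expresses an IAM-style policy (as a Python dictionary) that allows only the PutItem action on one specific DynamoDB table. The account ID and table name are hypothetical placeholders.

# Hedged sketch of a least-privilege IAM policy document (placeholders throughout).
import json

least_privilege_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["dynamodb:PutItem"],   # only the single action the function needs
            "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/orders",  # one table
        }
    ],
}

# An over-permissive alternative would be "Action": ["dynamodb:*"] with "Resource": "*",
# which is exactly what widens the blast radius if the function is ever compromised.
print(json.dumps(least_privilege_policy, indent=2))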
Managing a large number of privileged users with access to an ever-expanding set of services is
challenging. Managing separate IAM roles and groups for these users and resources adds yet
another layer of complexity. Cloud providers like AWS and Google Cloud help customers solve
these problems with tools like the Google Cloud IAM recommender (currently in beta) and the
AWS IAM access advisor. These tools attempt to analyze the services last accessed by users and
resources, and help you find out which permissions might be over-privileged.
Service Level Agreement (SLA) :
A cloud service-level agreement (SLA) is an agreement between a cloud service provider and a customer that ensures a minimum level of service is maintained. It specifies who governs when there is a service interruption and describes penalties if service levels are not met.
A cloud infrastructure can span geographies, networks, and systems that are both physical and virtual. While the exact metrics of a cloud SLA can vary by service provider, the areas covered are uniform: speed, responsiveness, and efficiency.
The SLA document aims to establish a mutual understanding of the services, prioritized areas,
responsibilities, guarantees and warranties provided by the service provider. It clearly outlines
metrics and responsibilities among the parties involved in cloud configurations, such as the
specific amount of response time to report or address system failures.