BCA 2nd Year


1. A peer-to-peer (P2P) network is created when two or more PCs are connected and share
resources without going through a separate server computer.
2. In the peer-to-peer computing model we simply use the same workgroup for all the
computers and a unique name for each computer in the network.
3. There is no master, controller, or central server in this kind of network; the computers
join hands to share files, printers, and Internet access.
4. It is practical for workgroups of a dozen or fewer computers, making it common in
environments where each PC acts as an independent workstation, maintains its own security,
and stores data on its own disk, but can share that data with all other PCs on the network.
5. Software for a peer-to-peer network is included with most modern desktop operating
systems such as Windows and macOS.
6. A peer-to-peer relationship is suitable for small networks having fewer than 10 computers
on a single LAN.
7. A peer-to-peer (P2P) network is a group of computers, each of which acts as a node for
sharing files within the group. Instead of having a central server act as a shared drive,
each computer acts as the server for the files stored upon it, even when the P2P network is
established over the Internet.

Advantages of Peer to Peer Networks


Peer to peer networks have following advantages:
1. Such networks are easy to set up and maintain as each computer manages itself.
2. It eliminates extra cost required in setting up the server.
3. Since each device is its own master, the devices are not dependent on other computers for
their operations.

Disadvantages of Peer to Peer Networks


1. In a peer-to-peer network, the absence of a centralized server makes it difficult to back
up data, as data is located on different workstations.
2. Security is weak as each system manages itself only.
3. There is no central point of data storage for file archiving.
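The "every PC is both client and server" idea above can be sketched in a few lines of Python. This is a purely in-memory simulation with illustrative class and method names, not a real networking library:

```python
# Minimal sketch of the peer-to-peer model (all names are illustrative):
# every Peer both stores its own files (acting as a server) and fetches
# files from other peers (acting as a client). No central server exists.

class Peer:
    def __init__(self, name):
        self.name = name
        self.files = {}        # files this peer serves from its own disk
        self.neighbors = []    # other peers it is connected to

    def share(self, filename, content):
        self.files[filename] = content

    def fetch(self, filename):
        """Look for a file locally first, then ask each neighbor."""
        if filename in self.files:
            return self.files[filename]
        for peer in self.neighbors:
            if filename in peer.files:
                return peer.files[filename]
        return None

# Build a tiny workgroup of three peers, with no central server.
a, b, c = Peer("A"), Peer("B"), Peer("C")
a.neighbors = [b, c]
b.neighbors = [a, c]
c.neighbors = [a, b]

b.share("notes.txt", "BCA 2nd year notes")
print(a.fetch("notes.txt"))   # A gets the file directly from peer B
```

Note how the loss of any one peer only removes that peer's files; the rest of the workgroup keeps functioning, which mirrors point 3 above.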

BASIS FOR COMPARISON: CLIENT-SERVER vs. PEER-TO-PEER

Basic: In client-server, there is a specific server and specific clients connected to the
server. In peer-to-peer, clients and server are not distinguished; each node acts as both
client and server.

Service: In client-server, the client requests a service and the server responds with the
service. In peer-to-peer, each node can both request services and provide services.

Focus: Client-server focuses on sharing the information; peer-to-peer focuses on
connectivity.

Data: In client-server, the data is stored on a centralized server. In peer-to-peer, each
peer has its own data.

Server: When several clients request services simultaneously, a client-server system's
server can become a bottleneck. As services in a peer-to-peer system are provided by several
servers distributed across the system, no single server becomes a bottleneck.

Expense: Client-server systems are expensive to implement; peer-to-peer systems are less
expensive to implement.

Stability: Client-server is more stable and scalable; peer-to-peer suffers as the number of
peers in the system increases.
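The "Server" row of the comparison can be made concrete with a toy load count. This is purely a counting simulation (no real networking; the function names are our own):

```python
# Toy illustration of the bottleneck contrast: in a client-server
# setup every request lands on one machine, while in a peer-to-peer
# setup requests are spread across all nodes.

def client_server_load(n_requests):
    load = {"server": 0}
    for _ in range(n_requests):
        load["server"] += 1          # every request hits the one server
    return load

def peer_to_peer_load(n_requests, peers):
    load = {p: 0 for p in peers}
    for i in range(n_requests):
        load[peers[i % len(peers)]] += 1   # requests rotate over peers
    return load

print(client_server_load(12))                       # {'server': 12}
print(peer_to_peer_load(12, ["A", "B", "C", "D"]))  # 3 requests each
```

Twelve simultaneous requests all pile up on the single server, but spread to only three per node in the four-peer system, which is why the table calls the client-server design the one that "can get bottlenecked".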


A distributed system is a collection of independent computers, interconnected via a network,
capable of collaborating on a task. Distributed computing is a model of computation closely
related to distributed systems: multiple computer systems located at different places are
linked together over a network and used to solve large-scale computations.
According to these definitions, all the computers are tied together in a network, either a
Local Area Network (LAN) or a Wide Area Network (WAN), communicating with each other so that
different portions of a distributed application run on different computers in any
geographical location. Every node in a distributed computation is an autonomous machine: the
nodes do not physically share memory or processors, but they do share resources such as
printers and databases.
Distributed systems are often broken down into two parts: the front end and the back end.
The front end, the part of the application that the user interacts with to determine what
information to examine and how to organize it, runs on the user's computer. The back end,
the part of the application that finds and sorts the requested information, runs on a
central computer somewhere else. This type of distributed computing, also referred to as
"client-server architecture," splits the functioning of applications across separate
computers.
Distributed applications run on multiple systems simultaneously. Traditional applications
must be installed on every system, which makes them hard to maintain; in distributed
computing, a single application runs across all the systems at once. With distributed
computing, if one workstation goes down, another workstation can resume its jobs.
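The idea of splitting one large computation into portions that run on separate machines can be sketched with Python's standard library. Here worker threads stand in for networked nodes (a simplification chosen so the example runs anywhere):

```python
# Sketch of the distributed-computing idea: a large job is split into
# portions, each portion runs on a separate worker (threads here stand
# in for networked nodes), and the partial results are combined.

from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """Work done independently by one 'node'."""
    return sum(x * x for x in chunk)

data = list(range(1, 101))
# Split the job into four portions, one per worker.
chunks = [data[i::4] for i in range(4)]

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(partial_sum, chunks))

print(sum(partials))   # same answer as computing it on one machine
```

If one worker's chunk fails, only that portion needs to be resubmitted to another worker, which is the resumption property described above.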
Advantages of Distributed Computing
Performance – by using the combined processing and storage capacity of many nodes,
performance levels can be reached that are beyond the scope of a centralized machine.
Scalability – resources such as processing and storage capacity can be increased
incrementally.
Inherent distribution – some applications, such as the web, are naturally distributed.
Reliability – by having redundant components, the impact of hardware and software faults on
users can be reduced.
Disadvantages of Distributed Computing
 Multiple points of failure
 Complexity
 Security – more opportunities for unauthorized attack
 Expense
Cloud computing architecture comprises many cloud components, which are loosely coupled. We
can broadly divide the cloud architecture into two parts:

 Front End
 Back End
It is the responsibility of the back end to provide data security for cloud users, along
with traffic control mechanisms. The server also provides middleware, which helps connected
devices communicate with each other.

Server
The server helps to compute the resource sharing and offers other services such as resource
allocation and de-allocation, monitoring the resources, providing security etc.

Storage
Cloud keeps multiple replicas of storage. If one of the storage resources fails, then it can be
extracted from another one, which makes cloud computing more reliable.

Management
It helps to maintain and configure the infrastructure.

Cloud Storage
Cloud storage is a cloud computing model in which data is stored on remote servers accessed
over the internet, or "cloud." It is maintained, operated, and managed by a cloud storage
service provider on storage servers that are built on virtualization techniques.
Cloud storage works through data center virtualization, providing end users and applications
with a virtual storage architecture that is scalable according to application requirements.
In general, cloud storage operates through a web-based API that the client application
invokes remotely for input/output (I/O) and read/write (R/W) operations.
When delivered through a public service provider, cloud storage is known as utility storage.
Private cloud storage provides the same scalability, flexibility and storage mechanism with
restricted or non-public access.
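The R/W interface and replica behavior described above can be sketched as a tiny in-memory store. The class and method names are illustrative, not any real provider's API:

```python
# Minimal in-memory sketch of a cloud storage interface (names are
# our own): clients read and write objects through a small API, and
# the provider keeps multiple replicas so that a failed copy can be
# served from another one, as described in the text.

class CloudStore:
    def __init__(self, replicas=3):
        # Each replica dict stands in for a separate storage server.
        self.replicas = [{} for _ in range(replicas)]

    def put(self, key, data):
        for replica in self.replicas:       # write to every replica
            replica[key] = data

    def get(self, key):
        for replica in self.replicas:       # read from the first replica
            if key in replica:              # that still holds the object
                return replica[key]
        raise KeyError(key)

    def fail_replica(self, i):
        self.replicas[i].clear()            # simulate a storage failure

store = CloudStore()
store.put("photo.jpg", b"...bytes...")
store.fail_replica(0)                       # one storage server dies
print(store.get("photo.jpg"))               # still served from a replica
```

The `get` after `fail_replica(0)` still succeeds, which is exactly the reliability claim made in the Storage paragraph earlier.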

Benefits of Cloud Storage


Storing data in the cloud lets IT departments transform three areas:
1. Total Cost of Ownership. With cloud storage, there is no hardware to purchase, storage to
provision, or capital being used for "someday" scenarios. You can add or remove capacity
on demand, quickly change performance and retention characteristics, and only pay for
storage that you actually use. Less frequently accessed data can even be automatically
moved to lower cost tiers in accordance with auditable rules, driving economies of scale.

2. Time to Deployment. When development teams are ready to execute, infrastructure should
never slow them down. Cloud storage allows IT to quickly deliver the exact amount of
storage needed, right when it's needed. This allows IT to focus on solving complex
application problems instead of having to manage storage systems.

3. Information Management. Centralizing storage in the cloud creates a tremendous leverage


point for new use cases. By using cloud storage lifecycle management policies, you can
perform powerful information management tasks including automated tiering or locking
down data in support of compliance requirements.
Types of Cloud Storage
There are three types of cloud data storage: object storage, file storage, and block
storage. Each offers its own advantages and has its own use cases:
1. Object Storage - Applications developed in the cloud often take advantage of object
storage's vast scalability and metadata characteristics.

2. File Storage - Some applications need to access shared files and require a file system. This
type of storage is often supported with a Network Attached Storage (NAS) server

3. Block Storage - Other enterprise applications like databases or ERP systems often require
dedicated, low latency storage for each host. This is analogous to direct-attached storage
(DAS) or a Storage Area Network (SAN).
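The contrast between object storage (whole objects addressed by key) and block storage (fixed-size blocks addressed by number) can be sketched directly. Everything here is illustrative; the tiny block size is chosen only to keep the example readable:

```python
# Contrast sketch: object storage addresses whole objects by key,
# while block storage exposes fixed-size numbered blocks, like the
# raw volumes a database or ERP system attaches to.

BLOCK_SIZE = 4  # unrealistically tiny, just to keep the example short

class BlockDevice:
    def __init__(self, n_blocks):
        self.blocks = [b"\x00" * BLOCK_SIZE for _ in range(n_blocks)]

    def write_block(self, n, data):
        assert len(data) == BLOCK_SIZE      # blocks are fixed-size
        self.blocks[n] = data

    def read_block(self, n):
        return self.blocks[n]

# Object storage: one key maps to one whole object, any size.
object_store = {}
object_store["report.pdf"] = b"whole file as one object"

# Block storage: the application manages the layout itself.
disk = BlockDevice(8)
disk.write_block(0, b"DATA")
disk.write_block(1, b"BASE")
print(disk.read_block(0) + disk.read_block(1))   # b'DATABASE'
```

A file system is essentially a layer that turns the block interface into the file interface mentioned in point 2, which is why block storage is the lowest-level of the three.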
The hypervisor can be defined as the firmware (a permanent set of instructions or low-level
code programmed into read-only memory) that acts as a manager for the virtual machine. It is
also called a Virtual Machine Monitor (VMM), which creates and runs the virtual machines. It
provides the guest OS with a virtual operating platform and manages the execution of the
guest operating systems. There are two types of hypervisor.
These are:

 Native Hypervisor
 Hosted Hypervisor

In cloud technology, virtualized resources are kept and maintained by the service provider
or the IT department; these resources comprise servers, memory, network switches, firewalls,
load balancers, and storage. In the cloud computing architecture, the cloud infrastructure
refers to the back-end components of the cloud.

Management software helps first to configure the infrastructure and then to maintain it.
Deployment software, on the other hand, is used to deploy and integrate applications on the
cloud.

The network, as we all know, is the key part of cloud technology, allowing users to connect
to the cloud via the internet. Multiple copies of data are kept in the cloud so that if any
storage resource fails, the data can be retrieved from another copy. Storage is therefore
another essential component of cloud infrastructure.

Software as a service (SaaS) is a software distribution model in which a third-party
provider hosts applications and makes them available to customers over the Internet. SaaS
removes the need for organizations to install and run applications on their own computers or
in their own data centers. SaaS is also known as "on-demand software". SaaS allows users to
'rent' or subscribe to a software application and execute it online, rather than purchasing
it to install on in-house computers. The cloud services provider installs, operates, and
maintains the required software application on behalf of the company. This reduces the
installation and maintenance costs typically associated with IT platforms or
infrastructures.
SaaS is universally accessible: the user can run the software on any platform or device.

Advantages of SaaS cloud computing layer


1) SaaS is easy to buy

SaaS pricing is based on a monthly or annual fee, which allows organizations to access
business functionality at a cost lower than that of licensed applications. Applications are
ready to use as soon as the user subscribes.

Unlike traditional software, which is sold under a license with an up-front cost (and often
an optional ongoing support fee), SaaS providers generally price their applications using a
subscription fee, most commonly monthly or annual.

2) Less hardware required for SaaS

The software is hosted remotely, so organizations don't need to invest in additional hardware.

3) Low Maintenance required for SaaS

Software as a service removes the need for installation, set-up, and often the daily upkeep
and maintenance that organizations would otherwise perform. The initial set-up cost for SaaS
is typically lower than for enterprise software. SaaS vendors usually price their
applications based on usage parameters, such as the number of users of the application,
which also makes SaaS usage easy to monitor.

4) No special software or hardware versions required

All users have the same version of the software and typically access it through a web
browser. SaaS reduces IT support costs by outsourcing hardware and software maintenance and
support to the SaaS provider.

5) Scalable usage

Cloud services like SaaS offer high scalability, which gives customers the option to access
more, or fewer, services or features on-demand. Additional storage or services can be
accessed on demand without needing to install new software and hardware

6) Automatic updates:

Rather than purchasing new software, customers can rely on a SaaS provider to automatically
perform updates and patch management. This further reduces the burden on in-house IT staff.
7) Accessibility and persistence: Since SaaS applications are delivered over the Internet,
users can access them from any Internet-enabled device and location.

Disadvantages of SaaS cloud computing layer


1) Security

Because data is stored in the cloud, security may be an issue for some users; cloud
computing is not necessarily more secure than an in-house deployment.

2) Latency issue

Because the data and application are stored in the cloud at a variable distance from the end
user, there is a possibility of higher latency while interacting with the application than
with a local deployment. The SaaS model is therefore not suitable for applications that
demand response times in milliseconds.

3) Total Dependency on Internet

Without internet connection, most SaaS applications are not usable.

4) Switching between SaaS vendors is difficult

Switching SaaS vendors involves the difficult and slow task of transferring very large data
files over the Internet and then converting and importing them into the new SaaS
application.
Infrastructure as a Service (IaaS)
IaaS is one of the layers of the cloud computing platform, wherein the customer organization
outsources its IT infrastructure, such as servers, networking, processing, storage, virtual
machines, and other resources. Customers access these resources over the internet, i.e. the
cloud computing platform, on a pay-per-use model.

IaaS, earlier called Hardware as a Service (HaaS), is a cloud computing platform-based
model.

In traditional hosting services, IT infrastructure was rented out for a specific period of
time, with a pre-determined hardware configuration. The client paid for the configuration
and the time, regardless of actual use. With the IaaS cloud computing platform layer,
clients can dynamically scale the configuration to meet changing requirements, and are
billed only for the services actually used.
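The billing contrast just described is easy to put in numbers. The rates below are made up purely for illustration (in cents, to keep the arithmetic exact):

```python
# Made-up rates, purely to illustrate the billing contrast described
# above: traditional hosting bills for the whole rental period at a
# fixed configuration, while IaaS bills only for hours actually used.

HOURS_IN_MONTH = 720

def traditional_hosting_cost(rate_cents_per_hour):
    # Paid for the full period regardless of actual use.
    return rate_cents_per_hour * HOURS_IN_MONTH

def iaas_cost(rate_cents_per_hour, hours_used):
    # Billed only for the hours the servers actually ran.
    return rate_cents_per_hour * hours_used

rate = 10   # hypothetical 10 cents per server-hour
print(traditional_hosting_cost(rate))   # 7200 cents for the month, used or not
print(iaas_cost(rate, 200))             # 2000 cents for 200 hours of real use
```

A workload that only runs 200 hours a month pays a fraction of the fixed rental under the pay-per-use model, which is the efficiency argument the paragraph above makes.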

IaaS cloud computing platform layer eliminates the need for every organization to maintain
the IT infrastructure.

IaaS is offered in three models: public, private, and hybrid cloud. A private cloud implies
that the infrastructure resides on the customer's premises. In the case of a public cloud,
it is located at the cloud computing platform vendor's data center; a hybrid cloud is a
combination of the two, with the customer choosing the best of both worlds.

IaaS provides the capability to provision processing, storage, networks, and other
fundamental computing resources, offering the customer the ability to deploy and run
arbitrary software, which can include operating systems and applications. IaaS puts these IT
operations into the hands of a third party, so customers should plan options to minimize the
impact if the cloud provider has a service interruption.

Advantages of IaaS cloud computing layer

1) You can dynamically choose a CPU, memory and storage configuration as per your needs.

2) You can easily access the vast computing power available on the IaaS cloud platform.

3) You can eliminate the need for investment in rarely used IT hardware.

4) The IT infrastructure is handled by the IaaS cloud computing platform vendor.


Platform as a Service (PaaS)

Platform as a Service (PaaS) or platform-based service is a category of cloud computing


services that provides a platform allowing customers to develop, run, and manage
applications without the complexity of building and maintaining the infrastructure
typically associated with developing and launching an app.
A developer is able to write the application as well as deploy it directly into this layer
easily.

PaaS extends and abstracts the IaaS layer by removing the hassle of managing individual
virtual machines.

In the PaaS cloud computing platform, back-end scalability is handled by the cloud service
provider, and the end user does not have to worry about managing the infrastructure.

Advantages of PaaS cloud computing layer

1) Simplified Development

Developers can focus on development and innovation without worrying about the
infrastructure.

2) Lower risk

No up-front investment in hardware and software is required. Developers only need a PC and
an internet connection to start building applications.

3) Prebuilt business functionality

Some PaaS vendors also provide predefined business functionality so that users can avoid
building everything from scratch and can start their projects directly.

4) Instant community

PaaS vendors frequently provide online communities where developers can get ideas, share
experiences, and seek advice from others.

5) Scalability

Applications deployed can scale from one to thousands of users without any changes to
the applications.
Disadvantages of PaaS cloud computing layer

1) Vendor lock-in

Applications must be written for the platform provided by the PaaS vendor, so migrating an
application to another PaaS vendor can be a problem.

2) Data Privacy

Corporate data, whether critical or not, is private, so if it is not located within the
walls of the company there can be a risk in terms of data privacy.

3) Integration with the rest of the systems applications

It may happen that some applications are local and some are in the cloud, so there can be
increased complexity when we want to use data in the cloud together with local data.

Top vendors who are providing PaaS cloud computing platform

Google App Engine (GAE)
Salesforce.com
Windows Azure
AppFog
OpenShift
Cloud Foundry from VMware

Desktop as a service (DaaS) is a cloud computing offering in which a third party hosts the
back end of a virtual desktop infrastructure (VDI) deployment.

With DaaS, desktop operating systems run inside virtual machines on servers in a cloud
provider's data center. All the necessary support infrastructure, including storage and
network resources, also lives in the cloud. As with on-premises VDI, a DaaS provider
streams virtual desktops over a network to a customer's endpoint devices, where end users
may access them through client software or a web browser.

Desktop as a Service (DaaS) is a desktop virtualization service that is hosted on the cloud,
so users can access their virtual desktops and applications wherever they go, using
whichever device they need. DaaS is purchased through a subscription and offers a
multitenancy environment. With a DaaS, the Virtual Desktop Infrastructure (VDI) is
deployed and managed by a service provider, which means that the responsibilities of
maintaining security, upgrades, data backup and storage are outsourced to your service
provider.
Because a virtual desktop is stored on a remote server, it is separated from the physical
device that is used to access it. With a DaaS, data gets saved automatically from the
virtual desktop because it is synced with the Cloud. Customers generally manage their
applications and desktop images, while the service provider handles all the back-end
infrastructure maintenance.

Advantages of DaaS are:


Data security
Reliability
Personalization
Increase in performance
Uninterrupted connectivity
Disaster recovery
Minimized complexity
Total cost reduction
Easy platform migration
VDI (Virtual Desktop Infrastructure) is a technology used to create a virtualized desktop
environment on a remote server setup. VDI segments the servers into various virtual desktops
which the users can access remotely through their devices. These virtual desktops are hosted
on Virtual Machines (VM) that are controlled through management software.
As far as the users are concerned, VDI gives you the freedom of accessing your desktop from
anywhere at any time through a VDI client software.
VDI can be classified as persistent and non-persistent. Persistent VDI is customized for a
personal user, and the user can log in to the same desktop each time. Non-persistent VDI
consists of desktops that revert to their initial state after the user logs out.
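The persistent versus non-persistent distinction can be sketched as a small state machine. The classes here are illustrative, not any real VDI product:

```python
# Sketch of the persistent / non-persistent VDI distinction: a
# persistent desktop keeps the user's changes across sessions, while
# a non-persistent one reverts to its initial image on logout.

class VirtualDesktop:
    def __init__(self, persistent):
        self.persistent = persistent
        self.initial_image = {"wallpaper": "default"}
        self.state = dict(self.initial_image)

    def customize(self, key, value):
        self.state[key] = value

    def logout(self):
        if not self.persistent:
            # Non-persistent VDI: revert to the base desktop image.
            self.state = dict(self.initial_image)

personal = VirtualDesktop(persistent=True)    # one user's own desktop
kiosk = VirtualDesktop(persistent=False)      # shared lab / kiosk desktop
for desk in (personal, kiosk):
    desk.customize("wallpaper", "mountains")
    desk.logout()

print(personal.state["wallpaper"])   # mountains - changes kept
print(kiosk.state["wallpaper"])      # default - reverted on logout
```

Non-persistent desktops are cheaper to manage (every login starts from one clean image), while persistent desktops suit users who need a personalized environment.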

Benefits of VDI
1. Access
The most distinguishing feature of VDI is remote access. A traditional desktop can be viewed
as connected (or, one can say, 'restricted') to a single system: as soon as you are away
from that system, you can no longer access your desktop. With VDI, you can access your
desktop from anywhere, day or night.

2. Security
With a traditional desktop, applications and data are all stored on your local hardware,
such as laptops or PCs. If the computer is stolen or damaged, all the data is lost.
With VDI, as remote data centers store the data with high-level redundancy, you do not need
to worry about data loss. Even if you lose the device, you can access your desktop from any
other device.

3. Device Portability
VDI technology enables you to access your desktop from various devices. As in the case of
VDI, the desktop is not bound to the hardware; it can be accessed from multiple devices. You
can use mobile, laptops, tablets or thin clients to view your desktop.
Easy Desktop Provisioning – Since with VDI you don't have to configure each system manually,
it is very easy to provision desktops. Virtual desktops can be provisioned almost
instantaneously, as the settings are mirrored from a desktop image.
4. Data Center Facilities
When you are availing VDI from a cloud service provider, the desktops are hosted on
servers situated in high-performance data centers. You get all the facilities and features
associated with the data center namely advanced security, high-end infrastructure and disaster
recovery plan among others.

5. Cost Reduction
By availing VDI services from a cloud provider, you eliminate the cost of hardware. You can
access your desktop from any device, so even outdated hardware in your office remains
usable; a thin client, mobile, or tablet can also be used for the same purpose.
Web Service

Introduction: The Internet and World Wide Web (WWW) have captured the world's imagination.
The Internet is represented in network diagrams as a cloud. Cloud computing is where
applications and files are hosted on a cloud consisting of thousands of computers and
servers, all linked together and accessible via the internet. Any web service or application
offered via cloud computing is called a cloud service. With this simple but powerful
interface, a user can download a file after accessing a web service from another computer
with only a click of the mouse. Moreover, advances in technology continue to extend the
functionality of the Internet. As web services become increasingly popular, network
congestion and server overloading have become significant problems, so efforts are being
made to address these problems and improve web performance.

A Web service, in the context of .NET, is a component that resides on a Web server and
provides information and services to other network applications using standard Web
protocols such as HTTP and Simple Object Access Protocol (SOAP).

.NET Web services provide asynchronous communications for XML applications that
operate over a .NET communications framework. They exist so that users on the Internet can
use applications that are not dependent on their local operating system or hardware and are
generally browser-based.

The main advantage of a Web service is that its consumers can use the service without
knowing the details of its implementation, such as the hardware platform, programming
language, object model, etc. A Web service provides loose coupling between heterogeneous
systems with the help of XML messages, providing interoperability.

Web services are designed to provide the messaging infrastructure necessary for
communication across platforms using industry standards. Web services also use
asynchronous communication to address the latency issue that arises due to requests from
remote locations across the Internet. This allows the execution of background tasks for the
client (such as responding to user interactions) until the actual completion of the Web service
request.
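The SOAP messages mentioned above are ordinary XML documents. A minimal request envelope can be built with the Python standard library; the operation name "GetPrice" and its parameter are made up for illustration, and only the envelope namespace is the real SOAP 1.1 one:

```python
# A minimal SOAP-style request envelope built with the standard
# library. The "GetPrice" operation and "Item" parameter are
# hypothetical; a real web service would receive XML shaped like
# this over HTTP and reply with a matching response envelope.

import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_request(operation, params):
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, operation)           # the requested operation
    for name, value in params.items():            # its parameters
        ET.SubElement(op, name).text = str(value)
    return ET.tostring(envelope, encoding="unicode")

xml = build_request("GetPrice", {"Item": "Apples"})
print(xml)
```

Because the message is plain XML over HTTP, any platform that can parse XML can consume it, which is the loose-coupling and interoperability point made above.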
On-demand computing is a business computing model in which computing resources are made
available to the user on an "as needed" basis. Rather than provisioning everything at once,
on-demand computing allows cloud hosting companies to provide their clients with access to
computing resources as they become necessary.

On-demand computing (ODC) is an enterprise-level model of technology and computing in


which resources are provided on an as-needed and when-needed basis. ODC makes
computing resources such as storage capacity, computational speed and software applications
available to users as and when needed for specific temporary projects, known or unexpected
workloads, routine work, or long-term technological and computing requirements.

Web services and other specialized tasks are sometimes referenced as types of ODC.

ODC is succinctly defined as “pay and use” computing power. It is also known as OD
computing or utility computing.

For example, if a customer needs to utilize additional servers for the duration of a project,
they can do so and then drop back to the previous level after the project is completed.

The on-demand model was developed to overcome the common challenge to an enterprise of
being able to meet fluctuating demands efficiently. Because an enterprise's demand on
computing resources can vary drastically from one time to another, maintaining sufficient
resources to meet peak requirements can be costly. Conversely, if an enterprise tried to cut
costs by maintaining only minimal computing resources, it is likely there would not be
sufficient resources to meet peak requirements.

The on-demand model provides an enterprise with the ability to scale computing resources up
or down with the click of a button, an API call or a business rule. The model is characterized
by three attributes: scalability, pay-per-use and self-service. Whether the resource is an
application program that helps team members collaborate or additional storage for archiving
images, the computing resources are elastic, metered and easy to obtain.
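The "business rule" that scales resources up or down can be sketched as a simple policy function. The thresholds below are made up for illustration:

```python
# Sketch of the kind of business rule that drives on-demand scaling
# (the 80% / 30% thresholds are made up): add servers when average
# utilization is high, release them when it is low, and pay only for
# what is actually running.

def scale_decision(current_servers, avg_utilization,
                   high=0.80, low=0.30, min_servers=1):
    """Return the server count for the next metering interval."""
    if avg_utilization > high:
        return current_servers + 1              # scale up for peak load
    if avg_utilization < low and current_servers > min_servers:
        return current_servers - 1              # scale down, cut cost
    return current_servers                      # steady state

print(scale_decision(4, 0.92))   # 5 - peak load, add a server
print(scale_decision(4, 0.10))   # 3 - idle, release a server
print(scale_decision(1, 0.10))   # 1 - never drop below the minimum
```

Run on every metering interval, a rule like this gives exactly the three attributes named above: scalability (the count changes), pay-per-use (idle servers are released) and self-service (no human in the loop).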

Many on-demand computing services in the cloud are so user-friendly that non-technical end
users can easily acquire computing resources without any help from the organization's
information technology (IT) department. This has advantages because it can improve
business agility, but it also has disadvantages because shadow IT can pose security risks. For
this reason, many IT departments carry out periodic cloud audits to identify greynet on-
demand applications and other rogue IT.

Discovering Cloud Services Development Services and Tools:

 Cloud computing is at an early stage of its development. This can be seen by observing the
large number of small and start-up companies offering cloud development tools.
 In a more established industry, the smaller players eventually fall by the wayside as larger
companies take center stage.

 Cloud services development services and tools are offered by a variety of companies, both
large and small.

 The most basic offerings provide cloud-based hosting for applications developed from
scratch.

 The more fully featured offerings include development tools and pre-built applications that
developers can use as the building blocks for their own unique web-based applications.

Amazon

 Amazon, one of the largest retailers on the Internet, is also one of the primary providers of
cloud development services.

 Amazon has spent a lot of time and money setting up a multitude of servers to service its
popular website, and is making those vast hardware resources available for all developers to
use.

 The service in question is called the Elastic Compute Cloud, also known as EC2. This is a
commercial web service that allows developers and companies to rent capacity on Amazon’s
proprietary cloud of servers— which happens to be one of the biggest server farms in the
world.

 EC2 enables scalable deployment of applications by letting customers request a set number
of virtual machines, onto which they can load any application of their choice.

 Thus, customers can create, launch, and terminate server instances on demand, creating a
truly “elastic” operation. Amazon’s service lets customers choose from three sizes of virtual
servers:

 Small, which offers the equivalent of a system with 1.7GB of memory, 160GB of
storage, and one virtual 32-bit core processor.

 Large, which offers the equivalent of a system with 7.5GB of memory, 850GB of
storage, and two 64-bit virtual core processors.

 Extra large, which offers the equivalent of a system with 15GB of memory, 1.7TB of
storage, and four virtual 64-bit core processors.

(In other words, you pick the size and power you want for your virtual server, and Amazon
does the rest.)
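The three sizes quoted above fit naturally into a lookup table. The helper that picks the smallest size meeting a memory requirement is our own illustration, not an Amazon API:

```python
# The three EC2 instance sizes quoted in the text, as a lookup table,
# plus an illustrative helper (not an Amazon API) that picks the
# smallest size meeting a given memory requirement.

EC2_SIZES = {
    # name: (memory in GB, storage in GB, virtual cores)
    "small":       (1.7,   160, 1),
    "large":       (7.5,   850, 2),
    "extra large": (15.0, 1700, 4),
}

def smallest_fit(memory_needed_gb):
    # Walk the sizes in order of increasing memory.
    for name, (mem, _storage, _cores) in sorted(
            EC2_SIZES.items(), key=lambda kv: kv[1][0]):
        if mem >= memory_needed_gb:
            return name
    return None   # no listed size is big enough

print(smallest_fit(1.0))    # small
print(smallest_fit(6.0))    # large
print(smallest_fit(12.0))   # extra large
```

The "elastic" part of EC2 is that a customer can request any number of such instances and release them again on demand, paying only while they run.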

 EC2 is just part of Amazon’s Web Services (AWS) set of offerings, which provides developers
with direct access to Amazon’s software and machines.

 By tapping into the computing power that Amazon has already constructed, developers can
build reliable, powerful, and low-cost web-based applications.

 Amazon provides the cloud (and access to it), and developers provide the rest. They pay only
for the computing power that they use.

 AWS is perhaps the most popular cloud computing service to date. Amazon claims a market
of more than 330,000 customers—a combination of developers, start-ups, and established
companies.

Google App Engine


 Google is a leader in web-based applications, so it’s not surprising that the company also
offers cloud development services.

 These services come in the form of the Google App Engine, which enables developers to
build their own web applications utilizing the same infrastructure that powers Google’s
powerful applications.

 The Google App Engine provides a fully integrated application environment. Using Google’s
development tools and computing cloud, App Engine applications are easy to build, easy to
maintain, and easy to scale.

 All you have to do is develop your application (using Google’s APIs and the Python
programming language) and upload it to the App Engine cloud; from there, it’s ready to
serve your users.

 As you might suspect, Google offers a robust cloud development environment. It includes
the following features:

 Dynamic web serving


 Full support for all common web technologies
 Persistent storage with queries, sorting, and transactions
 Automatic scaling and load balancing
 APIs for authenticating users and sending email using Google Accounts

 In addition, Google provides a fully featured local development environment that simulates
the Google App Engine on any desktop computer.

 And here’s one of the best things about Google’s offering: Unlike most other cloud hosting
solutions, Google App Engine is completely free to use—at a basic level, anyway.

 A free App Engine account gets up to 500MB of storage and enough CPU strength and
bandwidth for about 5 million page views a month.

 If you need more storage, power, or capacity, Google intends to offer additional resources
(for a charge) in the near future.
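As noted above, App Engine applications were written in Python. A minimal request handler in the WSGI style that Python hosting platforms build on looks like the following; this is generic WSGI, not Google's actual webapp API, and the `/guestbook` path is invented for the example:

```python
# A minimal WSGI application of the kind a Python platform service
# hosts (generic WSGI, not Google's actual App Engine webapp API):
# the platform calls `application` for each request and the developer
# never touches the server itself.

def application(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    body = f"Hello from the cloud, you requested {path}".encode()
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

# Exercise the app locally without a network, the way a dev server would.
def call(app, path):
    captured = {}
    def start_response(status, headers):
        captured["status"] = status
    chunks = app({"PATH_INFO": path}, start_response)
    return captured["status"], b"".join(chunks)

print(call(application, "/guestbook"))
```

This separation, where the developer writes only the handler and the platform supplies the serving, scaling, and load balancing, is exactly what the feature list above describes.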
IBM

 It’s not surprising, given the company’s strength in enterprise-level computer hardware, that
IBM is offering a cloud computing solution.

 The company is targeting small- and medium-sized businesses with a suite of cloud-based
on-demand services via its Blue Cloud initiative.

 Blue Cloud is a series of cloud computing offerings that enables enterprises to distribute
their computing needs across a globally accessible resource grid.
 One such offering is the Express Advantage suite, which includes data backup and recovery,
email continuity and archiving, and data security functionality—some of the more data-
intensive processes handled by a typical IT department.

 To manage its cloud hardware, IBM provides open source workload-scheduling software
called Hadoop, which is based on the MapReduce software Google uses in its own offerings.
Also included are PowerVM and Xen virtualization tools, along with IBM's Tivoli data center
management software.
Eucalyptus

Eucalyptus is an open source software platform for implementing Infrastructure as a Service
(IaaS) in a private or hybrid cloud computing environment. The Eucalyptus cloud platform
pools together existing virtualized infrastructure to create cloud resources for infrastructure as
a service, network as a service and storage as a service. The name Eucalyptus is an acronym
for Elastic Utility Computing Architecture for Linking Your Programs To Useful Systems.

Eucalyptus features include:

Supports both Linux and Windows virtual machines (VMs).

Application program interface (API) compatible with the Amazon EC2 platform.

Compatible with Amazon Web Services (AWS) and Simple Storage Service (S3).

Works with multiple hypervisors including VMware, Xen and KVM.

Can be installed and deployed from source code or DEB and RPM packages.

Internal process communications are secured through SOAP and WS-Security.

Multiple clusters can be virtualized as a single cloud.

Administrative features such as user and group management and reports.

Version 3.3, which became generally available in June 2013, adds the following features:

Auto Scaling: Allows application developers to scale Eucalyptus resources up or down based
on policies defined using Amazon EC2-compatible APIs and tools

Elastic Load Balancing: AWS-compatible service that provides greater fault tolerance for
applications

CloudWatch: An AWS-compatible service that allows users to collect metrics, set alarms,
identify trends, and take action to ensure applications run smoothly

Resource Tagging: Fine-grained reporting for showback and chargeback scenarios; allows IT/
DevOps to build reports that show cloud utilization by application, department or user

Expanded Instance Types: An expanded set of instance types that aligns more closely with
those available in Amazon EC2, growing from 5 to 15 instance types.

Maintenance Mode: Allows for replication of a virtual machine’s hard drive, evacuation of
the server node and provides a maintenance window.
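The Auto Scaling feature above makes scale-up and scale-down decisions against policy thresholds. The sketch below illustrates the idea only; the function name, thresholds, and limits are assumptions, not Eucalyptus or EC2 APIs (real policies are defined through EC2-compatible tools).

```python
def desired_instances(current, cpu_utilization,
                      scale_up_at=80.0, scale_down_at=20.0,
                      min_instances=1, max_instances=15):
    """Return the instance count a simple CPU-based scaling policy requests.

    Illustrative only: real Auto Scaling policies are configured through
    EC2-compatible APIs, not by calling a function like this directly.
    """
    if cpu_utilization > scale_up_at:
        current += 1          # hot: add capacity
    elif cpu_utilization < scale_down_at:
        current -= 1          # idle: release capacity
    # Clamp to the policy's configured floor and ceiling.
    return max(min_instances, min(current, max_instances))

print(desired_instances(4, 91.0))   # → 5 (scale up under load)
print(desired_instances(4, 12.0))   # → 3 (scale down when idle)
print(desired_instances(1, 5.0))    # → 1 (never below the floor)
```

A CloudWatch-style alarm would feed the `cpu_utilization` metric into such a policy on a schedule.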
Cloud Security Issues
There are many security issues in clouds because they provide hardware and services over
the internet [8].

Data breaches

Cloud providers are an attractive target for hackers because of the massive amounts of
data stored on clouds. How severe an attack is depends on the confidentiality of the data
exposed. The exposed information may be financial or otherwise important, but the damage
is most severe when it includes personal health information, trade secrets, or the
intellectual property of a person or an organization. When a data breach happens,
companies may be fined, lawsuits may be filed against them, and criminal charges may
follow. Breach investigations and customer notifications can rack up significant costs, and
indirect effects such as brand damage and loss of business can impact organizations for
years. Cloud providers typically deploy security controls to protect their environments, but
organizations remain responsible for protecting their own data in the cloud. The CSA has
recommended that organizations use multifactor authentication and encryption to protect
against data breaches [9].

Network security

In SaaS, sensitive data is taken from the enterprise, then processed and stored by the SaaS
provider. To avoid leakage of confidential information, all data travelling over the internet
must be secured, which requires strong encryption of network traffic.

Data locality

Consumers use SaaS applications in the environment provided by the SaaS provider, which
also processes their data. In this situation, cloud users are unaware of where their data is
actually stored. Data locality matters a great deal because, in many countries, laws and
policies regarding the locality of data are strict.

Data access

Data on clouds must be accessible from anywhere, at any time, and from any system, yet
cloud storage raises issues about access from arbitrary devices [10]. Data breaches and
other sorts of attacks flourish in environments with poor user authentication and weak
passwords. Consider the attack on Sony that happened only a few years back: the company
is still feeling the financial and social impacts of the hack, which largely succeeded because
administrators used weak passwords. The cloud is a particularly appealing target since it
presents a centralized data store containing high-value information together with
centralized user access. Use key-management systems in your cloud environment, and be
sure that the encryption keys cannot easily be found online. Require strong passwords, and
put teeth in the requirement by automatically rotating passwords and other means of user
identification. Last but not least, use multifactor authentication.
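The strong, regularly rotated passwords recommended above must come from an unpredictable source. Python's standard `secrets` module is designed for exactly this; the function name and length below are illustrative choices.

```python
import secrets
import string

def generate_password(length=16):
    """Generate a random password from a cryptographically secure source.

    Uses the stdlib `secrets` module, which draws from the OS's CSPRNG;
    the alphabet and default length are illustrative policy choices.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Rotating a credential is then just generating a fresh one on schedule.
print(len(generate_password()))  # → 16
```

Generating passwords this way, rather than letting users choose them, removes the weak-password channel the Sony example illustrates.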

DoS attacks

Denial-of-service attacks cannot be prevented outright; one can mitigate the effects of
these attacks but cannot stop them. DoS attacks overwhelm the resources of a cloud
service so that clients cannot access their data or applications. Politically motivated attacks
get the front-page headlines, but hackers are just as likely to launch DoS attacks for
malicious motives, including extortion. What is more, when a DoS attack happens in a
cloud computing environment, compute usage charges go through the roof. The cloud
provider ought to reverse the charges, but negotiating over what was and was not an
attack takes extra time and aggravation. Most cloud providers are prepared to deflect DoS
attacks, which requires constant monitoring and instant mitigation.
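One common building block of the mitigation described above is request rate limiting. A token-bucket sketch follows; the class, parameters, and injected clock are illustrative, not any provider's API.

```python
import time

class TokenBucket:
    """Allow bursts up to `capacity` requests, refilled at `rate` per second.

    Illustrative sketch of a rate limiter; real DoS defenses combine this
    with upstream filtering, anomaly detection, and traffic scrubbing.
    """

    def __init__(self, rate, capacity, clock=time.monotonic):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.clock = clock          # injectable for deterministic testing
        self.last = clock()

    def allow(self):
        now = self.clock()
        # Refill tokens for the time elapsed since the last check.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False                # over the limit: drop or delay the request

# Deterministic demonstration with a frozen, manually advanced clock:
t = [0.0]
bucket = TokenBucket(rate=10, capacity=5, clock=lambda: t[0])
burst = [bucket.allow() for _ in range(8)]
print(burst.count(True))   # → 5 (the burst capacity is exhausted)
t[0] += 1.0                # one second later the bucket has refilled
print(bucket.allow())      # → True
```

Rate limiting cannot stop a large distributed attack on its own, but it caps the compute charges a single abusive client can run up.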

System vulnerabilities

System vulnerabilities are exploitable program bugs in the OS that attackers intentionally
use to control or infiltrate a computer system. Fortunately, basic IT hygiene goes a long way
toward shielding you from this sort of serious attack. Since the machines reside in your
cloud provider's data centers, be sure that your provider practices regular vulnerability
scanning along with timely security patches and upgrades.

Account hijacking

You may have seen an email that looks completely legitimate. You click on a link, at which
point sirens blare and warning lights flash as your antivirus program goes to battle. Or you
may have been truly unfortunate and had no idea that you had just become the victim of a
phishing attack. When a user chooses a weak password, or clicks on a link in a phishing
attempt, they are at serious risk of becoming the channel for a genuine threat to
information. Cloud-based accounts are no exception. Establish strong two-factor
authentication, and automate strong passwords and password cycling, to help secure
yourself against this kind of cyberattack.

Malicious insiders
Most data loss or damage occurring inside an organization is human error, but malicious
insiders do exist and they do real harm. A malicious insider may be a current or former
employee, contractor, or partner who has the credentials to access company data and
deliberately uses, steals, or damages that information. Defense centers on secure
processes, such as strong access control, together with constantly monitoring processes
and investigating activities that lie outside the bounds of acceptable functions.

The APT parasite

Advanced persistent threats, also called APTs, are long-term cyberattacks that hackers
design to give them continuous access into a system. Examples of entry points include
phishing, installing attack code via USB devices, and intrusion through insecure network
access points. Once in, the intrusion appears as ordinary network traffic and the attackers
are free to act. Alert users and strong access controls are the best lines of defense against
this kind of attack.

Permanent data loss

Any data destruction or loss can cause permanent harm to the business. Cloud data is
subject to the same dangers as on-premises data: accidental deletion by users or provider
staff, natural loss or damage, or terrorist attack. It is the cloud provider's responsibility to
guard against human error and to build robust physical data centers. However, IT should
also protect against cloud data loss by setting up SLAs that include frequent and verifiable
backup to remote sites, and by encrypting files in case of accidental data exposure [11].

Shared technology, shared dangers

Cloud providers deliver services to thousands or even millions of tenants. Services range
from cloud backup to entire infrastructure, platform, and applications as a service. The
provider must design its architecture for strong isolation in multitenant environments: a
successful attack on one client is bad enough, but a multitenant attack that spreads from
one client to thousands is a disaster. When you evaluate cloud providers and multitenant
services, make sure that they have implemented multifactor authentication on all server
hosts and run modern intrusion detection systems.

Compromised credentials and broken authentication

Many cloud applications are geared toward user collaboration, but free software trials and
sign-up openings expose cloud services to malicious users. Several serious attack types can
ride in on a download or sign-in: DoS attacks, email spam, automated click fraud, and
pirated content are only a few of them. Your cloud provider is responsible for strong
incident-response structures to identify and remediate this source of attack. IT is
responsible for checking the quality of that structure and for monitoring its own cloud
environment for abuse of resources.

Hacked interfaces and APIs

APIs and UIs are the backbone of the connections and integration between clients and
cloud computing services. Cloud APIs' IP addresses expose the connection between clients
and the cloud, so securing APIs from intrusion or human error is fundamental to cloud
security. Work with your cloud provider and application vendors to build data flows that do
not expose APIs to easy attack. Invest in applications that model threats in a live
environment, and practice frequent penetration testing.

Solution to Security Issues


There are many security issues in cloud computing that need to be resolved in order to
make clouds more secure. To check the security of a cloud, the following areas should be
discussed with the cloud service provider.

Written security policies

If the cloud service provider has a written plan of security policies, then the security of the
data can be assured; if it does not, the cloud is not safe and the security of the data cannot
be guaranteed. A written plan is the foundation of a data security program. Organizations
that have not formalized their security strategies cannot be trusted with your sensitive
corporate or client information. Policies form the framework and foundation; without them,
security is just an afterthought.

Multifactor authentication

If the cloud provider supports multifactor authentication, for example a one-time password
sent to a mobile device [3], then the data is much better protected because it is guarded by
multiple factors. If someone tries to unlock the data and enters a wrong password, a
one-time code is sent to the data owner's mobile device so that he or she can authenticate
(or reject) the login [12]. Multifactor authentication raises the level of data protection
considerably.
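The one-time codes mentioned above are commonly generated with HOTP (RFC 4226): an HMAC over a moving counter, truncated to a short decimal code. A stdlib-only sketch, checked against the test vectors published in the RFC:

```python
import hmac
import hashlib
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password, per RFC 4226."""
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 Appendix D test vectors for the ASCII secret "12345678901234567890":
secret = b"12345678901234567890"
print(hotp(secret, 0))  # → 755224
print(hotp(secret, 1))  # → 287082
```

Server and device share the secret and counter; because the code changes with every use, a stolen password alone is not enough to log in.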

Access to data
Data of an enterprise must be accessed and seen by the administration, not by ordinary
users. Restricting access in this way provides enhanced security for the data in the cloud.

Appropriate cloud model for business

The appropriate cloud model for business is the private cloud. Private clouds are more
costly than public clouds, but also more secure: they are used by only one organization, so
the security level is higher than in a public cloud. Since a business holds confidential
information, financial transactions, and trade secrets, it needs that additional security,
which makes private clouds the safer choice.

Encryption of backups

Cloud backups of data must be encrypted; otherwise, encrypting the primary data is
meaningless, because any hacker can gain access to backups that are not protected with
appropriate encryption. An untested backup is a useless backup, and an unencrypted
backup defeats the security controls in the production environment. Data should be
secured over its whole lifecycle.
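The "encrypt the backup, then verify it" principle above can be sketched with the standard library alone. This is a toy construction for illustration only; the function names are invented, and production backups should use a vetted cipher (for example AES-GCM from a maintained cryptography library), not this sketch.

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream by hashing key + nonce + counter blocks.

    Toy counter-mode-style construction for illustration; NOT a vetted cipher.
    """
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt by XOR with the keystream (the operation is symmetric)."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, nonce, len(data))))

key, nonce = os.urandom(32), os.urandom(16)
backup = b"customer records 2023-Q4"
encrypted = xor_crypt(key, nonce, backup)

# An untested backup is a useless backup: verify the round trip restores it.
assert xor_crypt(key, nonce, encrypted) == backup
print("backup round trip verified")
```

The verification step matters as much as the encryption: a backup that has never been restored successfully offers no real protection.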

Regulatory compliance

Depending on business requirements, an organization's infrastructure may be subject to
specific compliance regulations. Organizations should have a clear list of compliance
requirements before evaluating cloud service providers, since compliance regulations can
vary from one region to another.

Formal change control process

If an organization has a formal change control process, then its cloud is fast and secure for
time-sensitive data. If it does not have formal change control during routine upgrades, its
servers can go down and no one can access the data; clouds that lack formal change
control are therefore not safe for time-sensitive data. Organizations that execute changes
and configuration in an ad hoc manner are likely to experience significant downtime in
their environment. The leading cause of system outages can be attributed to poor planning
and an absence of change control. If the data you are sending to the cloud is
time-sensitive, you want to go with a provider that adheres to a formal change control
process, thereby managing the inherent risk in unplanned changes.

External third-party contracts and agreements

As with subcontracting, if you entrust a cloud vendor with your data and it in turn uses
another provider (to store your data, for instance), does the original vendor guarantee that
its partners follow the policies and security agreements that were laid out in your contract?
If not, these partners weaken the overall security of the data chain.

Secure data destruction

Secure destruction of data is necessary when data is retired. If the destruction is not
secure, the risk of data leakage remains, because anyone can retrieve data that has not
been safely destroyed. If you are storing confidential or sensitive data in the cloud and the
vendor does not properly destroy data from decommissioned equipment, the data is
needlessly put at risk. Ask providers about their data destruction process.

Encryption scheme design and test

If the encryption schemes are designed and tested by professional and experienced people,
then the security of the cloud can be trusted. To examine these security issues, their
solutions, and the level of security, a questionnaire was designed and conducted. The
respondents to this questionnaire were cloud service providers and cloud users with
expertise and experience in cloud environments.
Security and Privacy Issues in
Cloud Computing
Cloud computing has a number of potential drawbacks, notably privacy and control of
information. Privacy and security are inherent challenges in cloud computing because its
very nature involves storing unencrypted data on a machine owned and operated by
someone other than the original owner of the data.

Issues arise from lack of data control, lack of trust of all parties with access,
uncertainty about the status of data (whether it has been destroyed when it should,
or whether there has been a privacy breach), and compliance with legal flow of
data over borders.

The nature of the risks, of course, varies across scenarios, depending among other things
on what type of cloud is being employed. These concerns are serious enough that, for
example, public clouds are generally not used at all for sensitive information. Privacy issues
in cloud computing include:

 Data protection: Data security plays an important role in a cloud computing
environment, where encryption technology is the best option whether data is at rest or
transmitted over the internet. Hard drive producers now supply self-encrypting drives
that provide automated encryption, although you can also use encryption software to
protect your data. For data in transit, SSL encryption is the best option to secure your
online communications; it also provides authentication for your website and/or
business, assuring data integrity and that users' information is not altered during
transmission.
 User control: This can be both a legal issue and one raised by consumers themselves.
The SaaS environment hands control of consumers' data to the service provider, so data
visibility and control are limited. In that case there is a threat of data being stolen or
misused, as consumers have no control over the cloud. Data transparency is also
missing: for example, where the data is, who owns it, and how it is being used.
Moreover, data exposure is possible during transfer, as many countries have
implemented laws allowing them to access data they find suspicious.
 Employee training and knowledge: A full understanding of when cloud services should
and should not be used needs to be part of basic employee training in many jobs that
involve managing information. Without such training, people may not understand the
privacy impact of the decisions they routinely make.
 Unauthorized usage: This can include usage of data ranging from targeted advertising
to the resale of data on the cloud. The service provider may gain income from
secondary usage of data. Agreements between clients and providers must be specific
about unauthorized usage, as this will enhance trust and lessen security worries.
 Loss of legal protection: Putting data on the cloud can involve a loss of legal privacy
protection. It can be impossible to follow all applicable legislation in cloud computing,
for example Canada's privacy act or health laws. Other policies, such as the U.S. Patriot
Act mentioned above, can actually force exposure of data to third parties. Different
jurisdictions have many different laws to protect (or in some cases infringe on) the
privacy of users. Data in the cloud is, at best, extremely unclear in terms of locality; at
worst, this ambiguous and instantaneous flow of data across borders can make privacy
laws impossible to enforce.
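The SSL/TLS protections mentioned under data protection above, encrypting data in transit and authenticating the server, are the default behavior of Python's standard ssl module:

```python
import ssl

# A default client context verifies the server's certificate chain and
# checks that the certificate matches the hostname: the authentication
# and integrity guarantees described above, with no extra configuration.
context = ssl.create_default_context()

print(context.check_hostname)                     # → True
print(context.verify_mode == ssl.CERT_REQUIRED)   # → True
```

Wrapping a socket with this context (for example via `context.wrap_socket(sock, server_hostname=host)`) is what gives an application's traffic the in-transit encryption the bullet point recommends.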

Cloud computing inarguably holds immense potential for ease of use, convenience, and
scaled expenses, and it is a hard-to-resist tool for businesses. Cloud computing eliminates
many of the financial risks that were previously inherent in doing business digitally. It
provides a platform for sharing information globally, in an age where most business
involves global elements. Along with mobile devices and increasingly reliable internet
access, cloud computing is another step in shrinking the distance between business and
teamwork. The plus sides are many.
