BCA 2nd Year
1. A peer-to-peer (P2P) network is created when two or more PCs are connected and share
resources without going through a separate server computer.
2. In the peer-to-peer computing model we simply use the same workgroup name for all the
computers and a unique name for each computer in the network.
3. There is no master, controller or central server in this network; the computers join
hands to share files, printers and Internet access.
4. It is practical for workgroups of a dozen or fewer computers, which makes it common in
environments where each PC acts as an independent workstation that maintains its own
security and stores data on its own disk, but can share that data with all other PCs on the network.
5. Software for peer-to-peer network is included with most modern desktop operating systems
such as Windows and Mac OS.
6. The peer-to-peer relationship is suitable for small networks having fewer than 10 computers on a
single LAN.
7. A peer-to-peer (P2P) network is a group of computers, each of which acts as a node for sharing
files within the group. Instead of having a central server act as a shared drive, each
computer acts as the server for the files stored upon it. When a P2P network is established
over the Internet, a central server can be used to index files, or a distributed network can be
established in which the sharing of files is split among all the users storing a given file.
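The idea that every node is both client and server can be sketched in a few lines (an in-memory illustration of the model, not a real network protocol; the class and peer names are invented):

```python
class Peer:
    """A node that both serves its own files and requests files from others."""

    def __init__(self, name):
        self.name = name
        self.files = {}      # files this peer stores and serves
        self.neighbors = []  # other peers it is directly connected to

    def share(self, filename, data):
        self.files[filename] = data

    def serve(self, filename):
        # acting as a server: hand out a file stored on this peer
        return self.files.get(filename)

    def request(self, filename):
        # acting as a client: ask every neighbor until one serves the file
        for peer in self.neighbors:
            data = peer.serve(filename)
            if data is not None:
                return data
        return None

# Two workstations join hands with no central server:
alice, bob = Peer("alice"), Peer("bob")
alice.neighbors.append(bob)
bob.share("notes.txt", b"cloud computing notes")
print(alice.request("notes.txt"))  # b'cloud computing notes'
```

Every `Peer` here plays both roles at once, which is exactly what distinguishes the model from a client-server setup.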
Basis for comparison between client-server and peer-to-peer:
Basic - Client-server: there is a specific server and specific clients connected to the server. Peer-to-peer: clients and server are not distinguished; each node acts as both client and server.
Service - Client-server: the client requests a service and the server responds with the service. Peer-to-peer: each node can request services and can also provide them.
Data - Client-server: the data is stored on a centralized server. Peer-to-peer: each peer has its own data.
Server - Client-server: when several clients request services simultaneously, the server can get bottlenecked. Peer-to-peer: as services are provided by several nodes, no single node gets bottlenecked.
Expense - Client-server: expensive to implement. Peer-to-peer: less expensive to implement.
Front End
Back End
It is the responsibility of the back end to provide data security for cloud users, along
with the traffic control mechanism. The server also provides the middleware, which helps
connected devices communicate with each other.
Server
The server handles resource sharing and offers other services such as resource
allocation and de-allocation, monitoring of resources, and security.
Storage
The cloud keeps multiple replicas of stored data. If one of the storage resources fails, the data
can be retrieved from another one, which makes cloud computing more reliable.
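The replica behaviour described above can be sketched as a toy in-memory model (the replica count and object names are invented for illustration):

```python
class ReplicatedStore:
    """Keeps every object on several storage nodes; reads fall back if one fails."""

    def __init__(self, num_replicas=3):
        self.nodes = [{} for _ in range(num_replicas)]  # each dict is one storage node
        self.failed = set()                             # indices of failed nodes

    def put(self, key, value):
        for node in self.nodes:          # write a copy to every replica
            node[key] = value

    def get(self, key):
        for i, node in enumerate(self.nodes):
            if i not in self.failed and key in node:
                return node[key]         # first healthy replica answers
        raise KeyError(key)

store = ReplicatedStore()
store.put("report.pdf", b"contents")
store.failed.add(0)                      # one storage resource fails
print(store.get("report.pdf"))           # still served from another replica
```

A real cloud store adds consistency protocols and re-replication on failure, but the read-from-a-surviving-copy idea is the same.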
Management
It helps to maintain and configure the infrastructure.
Cloud Storage
Cloud storage is a cloud computing model in which data is stored on remote servers accessed
from the internet, or "cloud." It is maintained, operated and managed by a cloud storage
service provider on storage servers that are built on virtualization techniques.
Cloud storage works through data center virtualization, providing end users and applications
with a virtual storage architecture that is scalable according to application requirements. In
general, cloud storage operates through a web-based API that is remotely implemented
through its interaction with the client application's in-house cloud storage infrastructure for
input/output (I/O) and read/write (R/W) operations.
When delivered through a public service provider, cloud storage is known as utility storage.
Private cloud storage provides the same scalability, flexibility and storage mechanism with
restricted or non-public access.
2. Time to Deployment. When development teams are ready to execute, infrastructure should
never slow them down. Cloud storage allows IT to quickly deliver the exact amount of
storage needed, right when it's needed. This allows IT to focus on solving complex
application problems instead of having to manage storage systems.
2. File Storage - Some applications need to access shared files and require a file system. This
type of storage is often supported with a Network Attached Storage (NAS) server.
3. Block Storage - Other enterprise applications like databases or ERP systems often require
dedicated, low latency storage for each host. This is analogous to direct-attached storage
(DAS) or a Storage Area Network (SAN).
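The storage models above differ mainly in the unit of access: whole objects via an API, files by path, or fixed-size blocks by number. A toy comparison (illustrative only; the keys, paths and block size are made up):

```python
import pathlib, tempfile

# Object storage: whole objects addressed by key, via a web-API-style interface
objects = {}
objects["photos/cat.jpg"] = b"\xff\xd8jpegdata"    # PUT an object
photo = objects["photos/cat.jpg"]                  # GET it back

# File storage: a shared hierarchy of paths, as a NAS exposes
share = pathlib.Path(tempfile.mkdtemp())           # stands in for a NAS mount
(share / "report.txt").write_text("Q3 numbers")
text = (share / "report.txt").read_text()

# Block storage: raw fixed-size blocks, as a database on DAS/SAN would use
BLOCK = 512
disk = bytearray(BLOCK * 8)                        # an 8-block "volume"
disk[2 * BLOCK:3 * BLOCK] = b"x" * BLOCK           # write block 2 directly
```

Applications pick the model that matches their access pattern: object stores scale furthest, file storage preserves shared-file semantics, and block storage gives each host low-latency raw access.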
The hypervisor can be defined as firmware (a permanent set of instructions or code
programmed into read-only memory; a low-level program) that acts as a manager for
the virtual machines. It is also called a Virtual Machine Monitor (VMM), and it creates and runs
the virtual machines. It provides each guest OS with a virtual operating platform and manages
the execution of the guest operating systems. There are two types of hypervisor.
These are:
Native hypervisor (Type 1): runs directly on the host hardware.
Hosted hypervisor (Type 2): runs on top of a host operating system.
In cloud technology, virtualized resources are kept and maintained by the service provider or
the IT department; these resources comprise servers, memory, network switches,
firewalls, load balancers and storage. In the cloud computing architecture, the cloud
infrastructure refers to the back-end components of the cloud.
Management software helps first to configure the infrastructure and then to maintain it.
The Deployment software, on the other hand, is used to deploy & combine all applications on
the cloud.
Network, as we all know is the key part of cloud technology allowing users to connect to the
cloud via the internet. Multiple copies of data are kept stored in the cloud. This is because, if
any storage resource fails - then the data can be extracted from another one. So, storage is
another essential component of cloud infrastructure.
SaaS pricing is based on a monthly or annual fee. SaaS allows organizations to access
business functionality at a cost lower than that of licensed applications. Applications are
ready to use once the user subscribes.
Unlike traditional software, which is sold under a license with an up-front cost (and often
an optional ongoing support fee), SaaS providers generally price their applications using a
subscription fee, most commonly a monthly or annual fee.
The software is hosted remotely, so organizations don't need to invest in additional hardware.
Software as a service removes the need for installation, set-up, and often the daily upkeep and
maintenance for organizations. The initial set-up cost for SaaS is typically lower than for
enterprise software. SaaS vendors price their applications based on usage parameters,
such as the number of users using the application, so SaaS usage is easy to monitor.
All users have the same version of the software and typically access it through a web
browser. SaaS reduces IT support costs by outsourcing hardware and software maintenance
and support to the SaaS provider.
5) Scalable usage
Cloud services like SaaS offer high scalability, which gives customers the option to access
more, or fewer, services or features on-demand. Additional storage or services can be
accessed on demand without needing to install new software and hardware
6) Automatic updates:
Rather than purchasing new software, customers can rely on a SaaS provider to automatically
perform updates and patch management. This further reduces the burden on in-house IT staff.
7) Accessibility and persistence: Since SaaS applications are delivered over the Internet,
users can access them from any Internet-enabled device and location.
Since data is stored in the cloud, security may be an issue for some users. However, cloud
computing is no more secure than an in-house deployment.
2) Latency issue
Because data and the application are stored in the cloud at a variable distance from the end user,
there is a possibility of more latency when interacting with the application than with a
local deployment. So the SaaS model is not suitable for applications that demand
response times in milliseconds.
Switching SaaS vendors involves the difficult and slow task of transferring very large
data files over the Internet and then converting and importing them into the new SaaS application.
Infrastructure as a Service (IaaS)
IaaS is one of the layers of the cloud computing platform wherein the customer organization
outsources its IT infrastructure such as servers, networking, processing, storage, virtual
machines and other resources. Customers access these resources over the internet, i.e. a cloud
computing platform, on a pay-per-use model.
IaaS, earlier called Hardware as a Service (HaaS), is a cloud computing platform-based
model.
In traditional hosting services, IT infrastructure was rented out for specific periods of time,
with a pre-determined hardware configuration. The client paid for the configuration and time,
regardless of actual use. With the help of the IaaS cloud computing platform layer, clients can
dynamically scale the configuration to meet changing requirements, and are billed only for the
services actually used.
IaaS cloud computing platform layer eliminates the need for every organization to maintain
the IT infrastructure.
IaaS is offered in three models: public, private, and hybrid cloud. Private cloud implies that
the infrastructure resides at the customer-premise. In case of public cloud, it is located at the
cloud computing platform vendor's data center; and hybrid cloud is a combination of two
with customer choosing the best of both worlds.
1) You can dynamically choose a CPU, memory and storage configuration as per your needs.
2) You can easily access the vast computing power available on an IaaS cloud platform.
PaaS extends and abstracts the IaaS layer by removing the hassle of managing individual
virtual machines.
In the PaaS cloud computing platform, back-end scalability is handled by the cloud service
provider, and the end user does not have to worry about managing the infrastructure.
1) Simplified Development
Developers can focus on development and innovation without worrying about the
infrastructure.
2) Lower risk
Some PaaS vendors also provide pre-defined business functionality so that users can
avoid building everything from scratch and can start their projects directly.
4) Instant community
PaaS vendors frequently provide online communities where developers can get ideas,
share experiences and seek advice from others.
5) Scalability
Applications deployed can scale from one to thousands of users without any changes to
the applications.
Disadvantages of PaaS cloud computing layer
1) Vendor lock-in
One has to write applications according to the platform provided by the PaaS vendor, so
migrating an application to another PaaS vendor can be a problem.
2) Data Privacy
Corporate data, whether critical or not, will be private, so if it is not located
within the walls of the company there can be a risk in terms of data privacy.
3) Integration complexity
It may happen that some applications are local and some are in the cloud, so there will be
increased complexity when we want to use data that is in the cloud together with
local data.
Desktop as a service (DaaS) is a cloud computing offering in which a third party hosts the
back end of a virtual desktop infrastructure (VDI) deployment.
With DaaS, desktop operating systems run inside virtual machines on servers in a cloud
provider's data center. All the necessary support infrastructure, including storage and
network resources, also lives in the cloud. As with on-premises VDI, a DaaS provider
streams virtual desktops over a network to a customer's endpoint devices, where end users
may access them through client software or a web browser.
Desktop as a Service (DaaS) is a desktop virtualization service that is hosted on the cloud,
so users can access their virtual desktops and applications wherever they go, using
whichever device they need. DaaS is purchased through a subscription and offers a
multitenancy environment. With DaaS, the Virtual Desktop Infrastructure (VDI) is
deployed and managed by a service provider, which means that the responsibility for
maintaining security, upgrades, data backup and storage is outsourced to your service
provider.
Because a virtual desktop is stored on a remote server, it is separated from the physical
device used to access it. With DaaS, data from the virtual desktop is saved automatically
because it is synced with the cloud. Customers generally manage their
applications and desktop images, while the service provider handles all the back-end
infrastructure maintenance.
Benefits of VDI
1. Access
The most distinguishing feature of VDI is remote access. Traditional desktops can be viewed
as connected (or, one might say, 'restricted') to a single system. As soon as you are away from
that system, you cannot access your desktop anymore. With VDI, you can access your
desktop from anywhere, day or night.
2. Security
With a traditional desktop, applications and data are all stored on your local hardware, such as
a laptop or PC. If the computer is stolen or damaged, all the data is lost.
With VDI, as remote data centers store the data with high-level redundancy, you do not need
to worry about data loss. Even if you lose the device, you can access your desktop from any
other device.
3. Device Portability
VDI technology enables you to access your desktop from various devices. As the desktop is
not bound to the hardware in VDI, it can be accessed from multiple devices: you
can use a mobile phone, laptop, tablet or thin client to view your desktop.
Easy Desktop Provisioning – Since with VDI you don't have to configure each system
manually, it is very easy to provision desktops. The virtual desktops can be
provisioned almost instantaneously, as the settings are mirrored from a desktop image.
4. Data Center Facilities
When you avail VDI from a cloud service provider, the desktops are hosted on
servers situated in high-performance data centers. You get all the facilities and features
associated with the data center, namely advanced security, high-end infrastructure and disaster
recovery plans, among others.
5. Cost Reduction
By availing VDI services from a cloud provider, you eliminate hardware costs. You can
access your desktop from any device, so even the most outdated hardware in your office
remains usable; a thin client, mobile phone or tablet can serve the same purpose.
Web Service
Introduction: The Internet and World Wide Web (WWW) have captured the world's
imagination. The Internet is represented in network diagrams as a cloud. Cloud computing is where
applications and files are hosted on a cloud consisting of thousands of computers and servers,
all linked together and accessible via the Internet. Any web service or application offered via
cloud computing is called a cloud service. With this simple but powerful interface, a user can
download a file from any web service on another computer with only a click of
the mouse. Moreover, advances in technology continue to extend the functionality of the
Internet. As web services become increasingly popular, network congestion and server
overloading have become significant problems, so efforts are being made to address these
problems and improve web performance.
A Web service, in the context of .NET, is a component that resides on a Web server and
provides information and services to other network applications using standard Web
protocols such as HTTP and Simple Object Access Protocol (SOAP).
.NET Web services provide asynchronous communications for XML applications that
operate over a .NET communications framework. They exist so that users on the Internet can
use applications that are not dependent on their local operating system or hardware and are
generally browser-based.
The main advantage of a Web service is that its consumers can use the service without
knowing about the details of its implementation, such as the hardware platform, programming
language, object model, etc. Web services provide loose coupling between heterogeneous
systems with the help of XML messages, providing interoperability.
Web services are designed to provide the messaging infrastructure necessary for
communication across platforms using industry standards. Web services also use
asynchronous communication to address the latency issue that arises due to requests from
remote locations across the Internet. This allows the execution of background tasks for the
client (such as responding to user interactions) until the actual completion of the Web service
request.
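A SOAP message is just XML carried over HTTP. Building and reading a minimal envelope can be sketched with the standard library (the operation name and namespace below are invented for illustration):

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def make_envelope(operation, params, ns="http://example.com/stock"):
    # Wrap an operation and its parameters in a SOAP Envelope/Body
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{ns}}}{operation}")
    for name, value in params.items():
        ET.SubElement(op, f"{{{ns}}}{name}").text = str(value)
    return ET.tostring(env, encoding="unicode")

xml_request = make_envelope("GetStockPrice", {"symbol": "IBM"})
# A client would now POST xml_request to the service URL with a
# Content-Type: text/xml header; the consumer never sees how the
# service is implemented, only the XML contract.
parsed = ET.fromstring(xml_request)
symbol = parsed.find(".//{http://example.com/stock}symbol").text
print(symbol)  # IBM
```

Because both sides only agree on the XML shape, a Java service and a Python client interoperate without sharing any platform details.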
On-demand computing is a business computing model in which computing resources are
made available to the user on an "as needed" basis. Rather than provisioning everything at
once, on-demand computing allows cloud hosting companies to provide their clients with
access to computing resources as they become necessary.
Web services and other specialized tasks are sometimes referenced as types of ODC.
ODC is succinctly defined as “pay and use” computing power. It is also known as OD
computing or utility computing.
For example, if a customer needs to utilize additional servers for the duration of a project,
they can do so and then drop back to the previous level after the project is completed.
The on-demand model was developed to overcome the common challenge to an enterprise of
being able to meet fluctuating demands efficiently. Because an enterprise's demand on
computing resources can vary drastically from one time to another, maintaining sufficient
resources to meet peak requirements can be costly. Conversely, if an enterprise tried to cut
costs by only maintaining minimal computing resources, it is likely there will not be
sufficient resources to meet peak requirements.
The on-demand model provides an enterprise with the ability to scale computing resources up
or down with the click of a button, an API call or a business rule. The model is characterized
by three attributes: scalability, pay-per-use and self-service. Whether the resource is an
application program that helps team members collaborate or additional storage for archiving
images, the computing resources are elastic, metered and easy to obtain.
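The economics behind the peak-versus-elastic trade-off can be seen with a small calculation (all prices and demand figures below are made-up numbers):

```python
# Hourly demand for one day: quiet most of the time, one sharp peak
demand = [2] * 20 + [50] * 4        # servers needed in each of 24 hours
price_cents = 10                    # assumed flat rate per server-hour, in cents

# Fixed provisioning: pay for peak capacity around the clock
fixed_cost = max(demand) * len(demand) * price_cents

# On-demand (metered): pay only for servers actually used each hour
on_demand_cost = sum(demand) * price_cents

print(fixed_cost, on_demand_cost)   # 12000 2400  (i.e. $120 vs $24)
```

With a spiky workload like this, metered pay-per-use costs a fifth of provisioning for the peak, which is exactly the inefficiency the on-demand model was developed to remove.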
Many on-demand computing services in the cloud are so user-friendly that non-technical end
users can easily acquire computing resources without any help from the organization's
information technology (IT) department. This has advantages because it can improve
business agility, but it also has disadvantages because shadow IT can pose security risks. For
this reason, many IT departments carry out periodic cloud audits to identify greynet on-
demand applications and other rogue IT.
Cloud computing is at an early stage of its development. This can be seen by observing the
large number of small and start-up companies offering cloud development tools.
In a more established industry, the smaller players eventually fall by the wayside as larger
companies take center stage.
Cloud services development services and tools are offered by a variety of companies, both
large and small.
The most basic offerings provide cloud-based hosting for applications developed from
scratch.
The more fully featured offerings include development tools and pre-built applications that
developers can use as the building blocks for their own unique web-based applications.
Amazon
Amazon, one of the largest retailers on the Internet, is also one of the primary providers of
cloud development services.
Amazon has spent a lot of time and money setting up a multitude of servers to service its
popular website, and is making those vast hardware resources available for all developers to
use.
The service in question is called the Elastic Compute Cloud, also known as EC2. This is a
commercial web service that allows developers and companies to rent capacity on Amazon’s
proprietary cloud of servers— which happens to be one of the biggest server farms in the
world.
EC2 enables scalable deployment of applications by letting customers request a set number
of virtual machines, onto which they can load any application of their choice.
Thus, customers can create, launch, and terminate server instances on demand, creating a
truly “elastic” operation. Amazon’s service lets customers choose from three sizes of virtual
servers:
EC2 is just part of Amazon’s Web Services (AWS) set of offerings, which provides developers
with direct access to Amazon’s software and machines.
By tapping into the computing power that Amazon has already constructed, developers can
build reliable, powerful, and low-cost web-based applications.
Amazon provides the cloud (and access to it), and developers provide the rest. They pay only
for the computing power that they use.
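The launch/terminate/pay-only-for-use cycle can be modelled in a few lines (a toy model of the idea; against the real EC2 API one would use an SDK such as boto3, and rates vary by instance type):

```python
import itertools

class ElasticCloud:
    """Toy model of on-demand instances billed only for hours actually used."""

    def __init__(self, rate_per_hour=0.10):       # assumed flat hourly rate
        self.rate = rate_per_hour
        self.ids = itertools.count(1)
        self.running = {}                         # instance id -> hours used
        self.billed_hours = 0

    def launch(self):
        iid = f"i-{next(self.ids):04d}"
        self.running[iid] = 0
        return iid

    def tick_hour(self):                          # one hour passes; meter usage
        for iid in self.running:
            self.running[iid] += 1
            self.billed_hours += 1

    def terminate(self, iid):
        self.running.pop(iid)

    def bill(self):
        return self.billed_hours * self.rate

cloud = ElasticCloud()
a, b = cloud.launch(), cloud.launch()   # scale up for a project
cloud.tick_hour(); cloud.tick_hour()
cloud.terminate(b)                      # drop back when the project ends
cloud.tick_hour()
print(cloud.billed_hours)               # 2*2 + 1 = 5 hours billed
```

Terminating an instance immediately stops its meter, which is what makes the operation "elastic" from a billing point of view.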
AWS is perhaps the most popular cloud computing service to date. Amazon claims a market
of more than 330,000 customers—a combination of developers, start-ups, and established
companies.
Google
These services come in the form of the Google App Engine, which enables developers to
build their own web applications utilizing the same infrastructure that powers Google's
powerful applications.
The Google App Engine provides a fully integrated application environment. Using Google’s
development tools and computing cloud, App Engine applications are easy to build, easy to
maintain, and easy to scale.
All you have to do is develop your application (using Google’s APIs and the Python
programming language) and upload it to the App Engine cloud; from there, it’s ready to
serve your users.
As you might suspect, Google offers a robust cloud development environment. It includes
the following features:
In addition, Google provides a fully featured local development environment that simulates
the Google App Engine on any desktop computer.
And here’s one of the best things about Google’s offering: Unlike most other cloud hosting
solutions, Google App Engine is completely free to use—at a basic level, anyway.
A free App Engine account gets up to 500MB of storage and enough CPU strength and
bandwidth for about 5 million page views a month.
If you need more storage, power, or capacity, Google intends to offer additional resources
(for a charge) in the near future.
IBM
It’s not surprising, given the company’s strength in enterprise-level computer hardware, that
IBM is offering a cloud computing solution.
The company is targeting small- and medium-sized businesses with a suite of cloud-based
on-demand services via its Blue Cloud initiative.
Blue Cloud is a series of cloud computing offerings that enables enterprises to distribute
their computing needs across a globally accessible resource grid.
One such offering is the Express Advantage suite, which includes data backup and recovery,
email continuity and archiving, and data security functionality—some of the more data-
intensive processes handled by a typical IT department.
To manage its cloud hardware, IBM provides open source workload-scheduling software
called Hadoop, which is based on the MapReduce software used by Google in its offerings. Also
included are PowerVM and Xen virtualization tools, along with IBM's Tivoli data center
management software.
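The MapReduce idea that Hadoop implements, a map step that emits key/value pairs followed by a reduce step that combines all values per key, can be illustrated with an in-memory word count (a sketch of the programming model, not Hadoop's Java API):

```python
from collections import defaultdict

def map_phase(documents):
    # map: each document emits a (word, 1) pair for every word it contains
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # shuffle: group values by key; reduce: sum the counts per word
    grouped = defaultdict(list)
    for word, count in pairs:
        grouped[word].append(count)
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["the cloud", "the data center", "cloud storage in the cloud"]
counts = reduce_phase(map_phase(docs))
print(counts["cloud"], counts["the"])  # 3 3
```

Because the map calls are independent per document and the reduce calls are independent per key, a scheduler like Hadoop can spread both phases across many machines.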
Eucalyptus
Compatible with Amazon Web Services (AWS) and Simple Storage Service (S3).
Can be installed and deployed from source code or DEB and RPM packages.
Version 3.3, which became generally available in June 2013, adds the following features:
Auto Scaling: Allows application developers to scale Eucalyptus resources up or down based
on policies defined using Amazon EC2-compatible APIs and tools
Elastic Load Balancing: AWS-compatible service that provides greater fault tolerance for
applications
CloudWatch: An AWS-compatible service that allows users to collect metrics, set alarms,
identify trends, and take action to ensure applications run smoothly
Resource Tagging: Fine-grained reporting for showback and chargeback scenarios; allows IT/
DevOps to build reports that show cloud utilization by application, department or user
Expanded Instance Types: An expanded set of instance types that aligns more closely with those
available in Amazon EC2, growing from 5 to 15 instance types.
Maintenance Mode: Allows replication of a virtual machine's hard drive and evacuation of
the server node, and provides a maintenance window.
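The policy-driven scaling that Auto Scaling evaluates can be sketched as a simple threshold rule (the thresholds, step size and capacity limits below are invented for illustration):

```python
def desired_capacity(current, cpu_percent, lo=30, hi=70,
                     minimum=1, maximum=10, step=1):
    """Scale out above `hi`% CPU, scale in below `lo`%, stay put otherwise."""
    if cpu_percent > hi:
        current += step
    elif cpu_percent < lo:
        current -= step
    return max(minimum, min(maximum, current))   # clamp to the allowed range

# Load rises, the policy adds instances; load falls, it removes them:
n = 2
for cpu in [85, 90, 40, 10, 10]:
    n = desired_capacity(n, cpu)
print(n)  # 85->3, 90->4, 40->4, 10->3, 10->2  => 2
```

A real policy engine feeds such a rule with metrics from a monitoring service (CloudWatch, in the Eucalyptus/AWS case) and applies the result to the instance group.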
Cloud Security Issues
There are many security issues in clouds as they provide hardware and services over
the internet [8].
Data breaches
Cloud providers are an attractive target for hackers because massive amounts of data are
stored on clouds. How severe an attack is depends on the confidentiality of the data
exposed. Exposure of financial information is damaging, but the damage is far more severe
if the exposed information includes health records, trade secrets or the intellectual
property of a person or an organization. When a data breach happens, companies may be
fined, lawsuits may be filed against them, and criminal charges may even follow. Breach
investigations and customer notifications can rack up significant costs. Indirect effects,
such as brand damage and loss of business, can impact organizations for years. Cloud
providers typically deploy security controls to protect their environments; nevertheless,
organizations are responsible for protecting their own data in the cloud. The CSA has
suggested that organizations use multifactor authentication and encryption to protect
against data breaches [9].
Network security
In SaaS, sensitive data is obtained from the enterprise, then processed and stored by the
SaaS provider. To avoid leakage of this confidential information, all data travelling over the
internet must be secured. Strong encryption of network traffic must be used to secure
the network.
Data locality
Consumers use SaaS applications in the environment provided by the SaaS provider, which
also processes their data. In this case, the users or clients of the cloud are unaware of
where their data is being stored. Data locality matters because many countries have strict
laws and policies regarding where data may reside.
Data access
Data on clouds must be accessible from anywhere, at any time, and from any system, yet
cloud storage has some issues regarding access to data from arbitrary devices [10].
Data breaches and other kinds of attacks flourish in environments with poor user
authentication and weak passwords. Look at the attack on Sony that happened only a few
years back: the company is still feeling the financial and social impacts of the hack,
which largely succeeded because administrators used weak passwords. The cloud is a
particularly appealing target because it presents a centralized data store containing
high-value information and centralized user access. Use key management systems in your
cloud environment, and make sure the encryption keys cannot easily be found on the web.
Require strong passwords, and put teeth into the requirement by automatically rotating
passwords and other means of user identification. Last but not least, use multi-factor
authentication.
DoS attacks
One cannot stop denial-of-service attacks entirely; one can mitigate their effect, but not
prevent them. DoS attacks overwhelm the resources of a cloud service so that clients
cannot access their data or applications. Politically motivated attacks get the headlines,
but hackers are just as likely to launch DoS attacks for malicious goals, including
extortion. Moreover, when a DoS attack happens in a cloud computing environment,
compute-cycle charges go through the roof. The cloud provider ought to reverse the
charges, but negotiating over what was an attack and what wasn't takes extra time and
aggravation. Most cloud providers are prepared to deflect DoS attacks, which takes
constant monitoring and instant mitigation.
System vulnerabilities
Account hijacking
You may have seen an email that looks perfectly legitimate. You click on a link, at which
point sirens blare and warning lights flash as your antivirus program goes to battle. Or
you may have been truly unlucky and had no idea that you had just become the victim of a
phishing attack. When a user picks a weak password, or clicks on a link in a phishing
attempt, they are at real risk of becoming the channel for a serious threat to
information. Cloud-based accounts are no exception. Institute strong two-factor
authentication, and automate strong passwords and password cycling to help protect
yourself against this type of cyber attack.
Malicious insiders
Most information loss or damage that happens inside an organization is human error, but
malicious insiders do exist and can do much harm. A malicious insider may be a current or
former employee, contractor, or partner who has the credentials to access organization
data and intentionally uses, steals, or damages that information. Defense centers on
secure processes, such as strong access control, and on constantly monitoring processes
and investigating activities that lie outside the bounds of acceptable functions.
Advanced persistent threats
Also called APTs, these are long-term cyber-attacks that hackers design to give themselves
ongoing access to a network. Examples of entry points include phishing, attack code
installed via USB devices, and intrusion through insecure network access points. Once in,
the intrusion appears as normal network traffic and the attackers are free to act. Aware
users and strong access controls are the best lines of defense against this kind of attack.
Data loss
Any destruction or loss of information can do permanent harm to the business. Cloud data
is subject to the same threats as on-premise data: accidental deletion by users or by the
provider's staff, natural loss or damage, or terrorist attack. It is the cloud provider's
responsibility to guard against human error and to build robust physical data centers.
However, IT should also protect against cloud data loss by setting up SLAs that include
frequent and verifiable backup to remote sites, and by encrypting files in case of
accidental data exposure [11].
Abuse of cloud services
Many cloud applications are geared toward user collaboration, but free software trials and
sign-up offers expose cloud services to malicious users. Several serious attack types can
ride in on a download or sign-in: DoS attacks, email spam, automated click fraud, and
pirated content are only a few of them. Your cloud provider is responsible for strong
incident-response frameworks to identify and remediate this source of attack; IT is
responsible for checking the strength of that framework and for monitoring its own cloud
environment for abuse of resources.
Insecure APIs and UIs
APIs and UIs are the backbone of cloud computing connections and integration between
clients and the cloud. Cloud APIs' IP addresses expose the connection between clients and
the cloud, so securing APIs from intrusion or human error is essential to cloud security.
Work with your cloud provider and application vendors to build data flows that don't
expose APIs to easy attack. Invest in applications that model threats in a live
environment, and practice frequent penetration testing.
Written security policies
If the cloud service provider has a written plan of security policies, the security of the
data can be assured; if it does not, the cloud is not safe and the security of the data
cannot be guaranteed, because the provider's data security program is undeveloped.
Organizations that have not formalized their security strategies cannot be trusted with
your sensitive corporate or client information. Policies form the framework and foundation
of security; without them, security is just an afterthought.
Multifactor authentication
If the cloud provider supports multifactor authentication, for example a
one-time password [3] sent as a mobile code, the security of the data is much
tighter because it is protected by multiple factors. If someone tries to unlock
the data with a password, a one-time password is also sent to the data owner's
mobile so that he or she can authenticate the login [12]. Multifactor
authentication raises the level of protection of the data.
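The one-time mobile codes described above are commonly generated with the TOTP algorithm (RFC 6238). The following is a minimal, stdlib-only sketch for illustration; real deployments should rely on the provider's MFA service rather than hand-rolled code.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at_time=None, digits=6, step=30):
    """Time-based one-time password per RFC 6238 (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32)
    # The moving factor is the number of `step`-second intervals since the epoch.
    counter = int((time.time() if at_time is None else at_time) // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation: pick 4 bytes at an offset given by the last nibble.
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", T = 59 seconds
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at_time=59, digits=8))  # 94287082
```

Because the code depends on both the shared secret (something you have, provisioned on the phone) and the current time window, a stolen password alone is not enough to log in.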
Access to data
Enterprise data must be accessible to and viewable by the administration, not
by ordinary users. Restricting access in this way provides enhanced security
for the data stored in the cloud.
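The administrator-only access rule above amounts to a simple role check before any data operation. A minimal sketch, assuming a hypothetical in-memory role table (a real deployment would query the cloud provider's IAM service instead):

```python
# Hypothetical role-to-permission table, for illustration only.
ROLE_PERMISSIONS = {
    "administrator": {"read", "write", "delete", "configure"},
    "user": {"read"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True only if the given role grants the requested action."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("administrator", "delete"))  # True
print(is_allowed("user", "delete"))           # False
```

Unknown roles fall through to an empty permission set, so the check fails closed rather than open.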
The appropriate cloud model for business is the private cloud. Private clouds
are more costly than public clouds, but they are also more secure: a private
cloud is used by only one organization, so its security level is higher than
that of a public cloud. Because a business holds confidential information,
financial transactions, and trade secrets, more security is needed; hence
private clouds are safer than public clouds for business use.
Encryption of backups
Cloud backups of data must be encrypted; otherwise, encrypting the primary
data is meaningless, because any hacker who gains access to unencrypted
backups can read them. An untested backup is a useless backup, and an
unencrypted backup defeats the security controls of the production
environment. Data should be secured over its whole lifecycle.
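To make the idea concrete, here is a deliberately simplified sketch of encrypting a backup blob before it leaves the production environment. The hash-based keystream below is a teaching toy, not a vetted cipher, and it provides no integrity check; real backups should use an authenticated cipher such as AES-GCM from a maintained cryptography library.

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream by hashing key + nonce + block counter."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_backup(key: bytes, plaintext: bytes) -> bytes:
    """XOR the backup with the keystream; prepend the random nonce."""
    nonce = os.urandom(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt_backup(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks))

key = hashlib.sha256(b"backup-master-key").digest()  # hypothetical key material
blob = encrypt_backup(key, b"quarterly financial records")
print(decrypt_backup(key, blob))  # b'quarterly financial records'
```

The fresh random nonce per backup ensures that encrypting the same data twice yields different ciphertexts, so an attacker holding two backups cannot tell whether the contents changed.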
Regulatory compliance
If the organization has a formal change-control process, the cloud is fast and
safe for time-sensitive data. If the organization does not have formal change
control during regular upgrades, its servers may go down and no one can access
the data; clouds without formal change control are therefore not safe for
time-sensitive data. Organizations that execute changes and configuration in
an ad hoc way are likely to experience significant downtime in their
environment. The main cause of system outages can be attributed to lack of
planning and absence of change control. If the data you are sending to the
cloud is time-sensitive, you should go with a provider that adheres to a
formal change-control process, thus managing the inherent risk of ad hoc
changes.
As with subcontracting in general, if you entrust a cloud vendor with your
data and it in turn uses another provider (to store your data, for instance),
does the initial vendor guarantee that its partners follow the policies and
security agreements that were laid out in your contract? If not, these
partners weaken the overall security of the data chain.
Secure destruction of data is necessary when the data is no longer needed. If
destruction is not done securely, the risk of data leakage remains: anyone may
retrieve data that was not safely destroyed. If you are storing
confidential/sensitive data in the cloud and the vendor does not properly
destroy data from decommissioned equipment, the data is needlessly put at
risk. Ask providers about their data-destruction process.
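The overwrite-before-delete idea behind secure destruction can be sketched as follows. This is illustrative only: on SSDs and journaling or copy-on-write file systems, in-place overwrites do not guarantee the old blocks are erased, which is exactly why proper destruction of decommissioned media matters.

```python
import os

def secure_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file with random bytes several times, then unlink it.

    Illustrative sketch: effective only where the file system rewrites
    blocks in place; not a substitute for destroying decommissioned media.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # force the overwrite to physical storage
    os.remove(path)
```

Simply calling `os.remove` without the overwrite would only drop the directory entry, leaving the file's bytes recoverable on disk.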
If the encryption schemes are designed and tested by professional and
experienced persons, then the security of the cloud can be trusted. A
questionnaire was designed and conducted to assess the security issues, their
solutions, and the level of security. The respondents were cloud service
providers and cloud users who have expertise and experience in cloud
environments.
Security and Privacy Issues in Cloud Computing
Cloud computing has a number of potential drawbacks, notably privacy and
control of information. Privacy and security are inherent challenges in cloud
computing because its very nature involves storing unencrypted data on a
machine owned and operated by someone other than the original owner of the
data. Issues arise from lack of data control, lack of trust of all parties
with access, uncertainty about the status of data (whether it has been
destroyed when it should have been, or whether there has been a privacy
breach), and compliance with legal requirements on the flow of data over
borders.
The nature of the risks, of course, varies in different scenarios, depending
among other things on what type of cloud is being employed. These concerns are
serious enough that, for example, public clouds are generally not used at all
for sensitive information. Privacy issues in cloud computing include: