Cloud Computing
Cloud computing is Internet-based computing, whereby shared resources, software, and information are provided to computers and other devices on demand, like the electricity grid.
Cloud computing is a paradigm shift following the shift from mainframe to client–server in the
early 1980s. Details are abstracted from the users, who no longer need expertise in, or
control over, the technology infrastructure "in the cloud" that supports them.[1] Cloud computing
describes a new supplement, consumption, and delivery model for IT services based on the
Internet, and it typically involves over-the-Internet provision of dynamically scalable and often
virtualized resources.[2][3] It is a byproduct and consequence of the ease-of-access to remote
computing sites provided by the Internet.[4] This frequently takes the form of web-based tools or
applications that users can access and use through a web browser as if they were programs installed
locally on their own computers.[5] NIST provides a somewhat more objective and specific
definition.[6] The term "cloud" is used as a metaphor for the Internet, based on the cloud
drawing used in the past to represent the telephone network,[7] and later to depict the Internet in
computer network diagrams as an abstraction of the underlying infrastructure it represents.[8]
Typical cloud computing providers deliver common business applications online that are
accessed from another Web service or software like a Web browser, while the software and data
are stored on servers. A key element of cloud computing is customization and the creation of a
user-defined experience.
Most cloud computing infrastructures consist of services delivered through shared data centers and
built on servers. Clouds often appear as single points of access for all consumers' computing
needs. Commercial offerings are generally expected to meet quality of service (QoS)
requirements of customers, and typically include SLAs.[9] The major cloud service providers
include Microsoft,[10] Salesforce, Amazon, and Google.[11][12]
Overview
Comparisons
Cloud computing derives characteristics from, but should not be confused with, autonomic computing, the client–server model, grid computing, mainframe computing, utility computing, and peer-to-peer architectures.
Characteristics
In general, cloud computing customers do not own the physical infrastructure, instead avoiding
capital expenditure by renting usage from a third-party provider. They consume resources as a
service and pay only for resources that they use. Many cloud-computing offerings employ the
utility computing model, which is analogous to how traditional utility services (such as
electricity) are consumed, whereas others bill on a subscription basis. Sharing "perishable and
intangible" computing power among multiple tenants can improve utilization rates, as servers are
not unnecessarily left idle (which can reduce costs significantly while increasing the speed of
application development). A side-effect of this approach is that overall computer usage rises
dramatically, as customers do not have to engineer for peak load limits.[17] In addition, "increased
high-speed bandwidth" makes it possible to receive the same response times from centralized
infrastructure at other sites.[citation needed]
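To make the peak-load point concrete, here is a minimal Python sketch of why a tenant who owns fixed capacity must provision for the peak while a pay-per-use tenant is billed only for actual consumption; the hourly demand figures are invented for the example.

# Illustrative only: the demand profile below is invented for the example.
hourly_demand = [4, 3, 2, 2, 3, 8, 20, 35, 40, 38, 30, 12]  # servers needed per hour

# Owning fixed capacity: provision for the peak; most of it sits idle.
peak = max(hourly_demand)
owned_server_hours = peak * len(hourly_demand)

# Elastic pay-per-use: billed only for what each hour actually consumes.
elastic_server_hours = sum(hourly_demand)

print(f"Utilization when peak-provisioned: {elastic_server_hours / owned_server_hours:.0%}")
print(f"Server-hours billed elastically: {elastic_server_hours} vs {owned_server_hours} owned")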
The cloud is becoming increasingly associated with small and medium enterprises (SMEs) as in
many cases they cannot justify or afford the large capital expenditure of traditional IT. SMEs
also typically have less existing infrastructure, less bureaucracy, more flexibility, and smaller
capital budgets for purchasing in-house technology. Similarly, SMEs in emerging markets are
typically unburdened by established legacy infrastructures, thus reducing the complexity of
deploying cloud solutions.[18]
Economics
Cloud computing users avoid capital expenditure (CapEx) on hardware, software, and services
when they pay a provider only for what they use. Consumption is usually billed on a utility
(resources consumed, like electricity) or subscription (time-based, like a newspaper) basis with
little or no upfront cost. Other benefits of this approach are low barriers to entry, shared
infrastructure and costs, low management overhead, and immediate access to a broad range of
applications. In general, users can terminate the contract at any time (thereby avoiding return on
investment risk and uncertainty), and the services are often covered by service level agreements
(SLAs) with financial penalties.[19][20]
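The two billing bases can be sketched in a few lines of Python; the rates and quantities below are hypothetical, not any provider's actual prices.

def utility_bill(units_consumed: float, rate_per_unit: float) -> float:
    """Metered like electricity: pay per unit actually consumed."""
    return units_consumed * rate_per_unit

def subscription_bill(months: int, monthly_fee: float) -> float:
    """Time-based like a newspaper: pay per period regardless of use."""
    return months * monthly_fee

# With light usage the metered model is cheaper; heavy usage can flip it.
print(utility_bill(units_consumed=120, rate_per_unit=0.10))  # 12.0
print(subscription_bill(months=1, monthly_fee=25.0))         # 25.0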
Although companies might be able to save on upfront capital expenditures, they might not save
much and might actually pay more for operating expenses. In situations where the capital
expense would be relatively small, or where the organization has more flexibility in their capital
budget than their operating budget, the cloud model might not make great fiscal sense. Other
factors impacting the scale of any potential cost savings include the efficiency of a company's
data center as compared to the cloud vendor's, the company's existing operating costs, the level
of adoption of cloud computing, and the type of functionality being hosted in the cloud.[22][23]
Among the items that some cloud hosts charge for are instances (often with extra charges for
high-memory or high-CPU instances); data transfer in and out; storage (measured by the
GB-month); I/O requests; PUT and GET requests; IP addresses; and load balancing. In some
cases, users can bid on instances, with pricing dependent on demand for available instances.[citation needed]
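As a hedged sketch of how such line items might combine into a monthly bill, the following Python estimator mirrors the charge categories above; every price and usage figure is invented and reflects no actual host's tariff.

PRICE_PER_UNIT = {
    "instance_hours":       0.12,   # high-memory/high-CPU types cost more
    "gb_transferred":       0.09,
    "gb_months_storage":    0.10,
    "million_io_requests":  0.11,
    "ip_address_months":    3.00,
    "load_balancer_hours":  0.025,
}

usage = {
    "instance_hours":       720,    # one instance for a 30-day month
    "gb_transferred":       50,
    "gb_months_storage":    200,
    "million_io_requests":  4,
    "ip_address_months":    1,
    "load_balancer_hours":  720,
}

total = sum(PRICE_PER_UNIT[item] * quantity for item, quantity in usage.items())
print(f"Estimated monthly bill: ${total:.2f}")  # $132.34 with these made-up rates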
Architecture
Cloud architecture,[24] the systems architecture of the software systems involved in the delivery
of cloud computing, typically involves multiple cloud components communicating with each
other over application programming interfaces, usually web services.[25] This resembles the Unix
philosophy of having multiple programs each doing one thing well and working together over
universal interfaces. Complexity is controlled and the resulting systems are more manageable
than their monolithic counterparts. The two most significant components of cloud computing
architecture are known as the front end and the back end. The front end is the part seen by the
client, i.e. the computer user. This includes the client’s network (or computer) and the
applications used to access the cloud via a user interface such as a web browser. The back end of
the cloud computing architecture is the ‘cloud’ itself, comprising various computers, servers and
data storage devices.
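As a minimal illustration of that split, the sketch below plays the role of a front end calling a back-end component over a web-service API; the endpoint URL and JSON shape are hypothetical, since each provider defines its own interface.

import json
import urllib.request

def call_cloud_service(endpoint: str, payload: dict) -> dict:
    """POST a JSON request to a back-end web service and decode the reply."""
    request = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# Usage against a hypothetical back end:
# result = call_cloud_service("https://api.example.com/v1/instances",
#                             {"action": "resize", "count": 3})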
History
The underlying concept of cloud computing dates back to the 1960s, when John McCarthy opined
that "computation may someday be organized as a public utility". Almost all the modern-day
characteristics of cloud computing (elastic provision, provided as a utility, online, illusion of
infinite supply), the comparison to the electricity industry, and the use of public, private,
government and community forms were thoroughly explored in Douglas Parkhill's 1966 book,
"The Challenge of the Computer Utility".
The actual term "cloud" borrows from telephony in that telecommunications companies, who
until the 1990s primarily offered dedicated point-to-point data circuits, began offering Virtual
Private Network (VPN) services with comparable quality of service but at a much lower cost. By
switching traffic to balance utilization as they saw fit, they were able to utilise their overall
network bandwidth more effectively. The cloud symbol was used to denote the demarcation
point between what was the responsibility of the provider and what was the responsibility of the
user. Cloud computing extends this boundary to cover servers as well as the network infrastructure.[26]
Amazon played a key role in the development of cloud computing by modernizing their data
centers after the dot-com bubble, which, like most computer networks, were using as little as
10% of their capacity at any one time just to leave room for occasional spikes. Having found that
the new cloud architecture resulted in significant internal efficiency improvements whereby
small, fast-moving "two-pizza teams" could add new features faster and more easily, Amazon initiated
a new product development effort to provide cloud computing to external customers and
launched Amazon Web Services (AWS) on a utility computing basis in 2006.[27][28]
In 2007, Google, IBM, and a number of universities embarked on a large-scale cloud computing
research project.[29] In early 2008, Eucalyptus became the first open-source, AWS API-compatible
platform for deploying private clouds. By mid-2008, Gartner saw an opportunity for cloud
computing "to shape the relationship among consumers of IT services, those who use IT services
and those who sell them",[30] and observed that "[o]rganisations are switching from company-
owned hardware and software assets to per-use service-based models" so that the "projected shift
to cloud computing ... will result in dramatic growth in IT products in some areas and significant
reductions in other areas."[31]
In March 2010, Microsoft's CEO, Steve Ballmer, made his strongest statement yet of betting the
company's future on the cloud by proclaiming "For the cloud, we're all in" and further stating
"About 75 percent of our folks are doing entirely cloud based or entirely cloud inspired, a year
from now that will be 90 percent."[32]
Layers
Client
A cloud client consists of computer hardware and/or computer software that relies on cloud
computing for application delivery, or that is specifically designed for delivery of cloud services
and that, in either case, is essentially useless without it. Examples include some computers,
phones and other devices, operating systems and browsers.[44][45][46][47][48]
Application
Cloud application services, or "Software as a Service (SaaS)", deliver software as a service over
the Internet, eliminating the need to install and run the application on the customer's own
computers and simplifying maintenance and support. People tend to use the terms "SaaS" and
"cloud" interchangeably, when in fact they are two different things.[49] Key characteristics
include the following (a brief sketch of the one-to-many model appears after the list):[50]
• Network-based access to, and management of, commercially available (i.e., not custom)
software
• Activities that are managed from central locations rather than at each customer's site,
enabling customers to access applications remotely via the Web
• Application delivery that typically is closer to a one-to-many model (single instance,
multi-tenant architecture) than to a one-to-one model, including architecture, pricing,
partnering, and management characteristics
• Centralized feature updating, which obviates the need for downloadable patches and
upgrades.
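A minimal sketch of the one-to-many model from the list above: a single running instance serves every customer, with each request scoped to its tenant's partition of the shared store. The class and method names are illustrative, not any vendor's design.

from collections import defaultdict

class MultiTenantApp:
    """One shared application instance; per-tenant data isolation by key."""

    def __init__(self):
        self._store = defaultdict(dict)  # tenant_id -> that tenant's records

    def save(self, tenant_id: str, key: str, value: str) -> None:
        self._store[tenant_id][key] = value

    def load(self, tenant_id: str, key: str):
        # A tenant only ever sees its own partition of the shared store.
        return self._store[tenant_id].get(key)

app = MultiTenantApp()                    # single instance, many tenants
app.save("acme", "greeting", "hello")
app.save("globex", "greeting", "bonjour")
assert app.load("acme", "greeting") == "hello"
assert app.load("globex", "greeting") == "bonjour"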
Platform
See also: Category:Cloud platforms
Cloud platform services or "Platform as a Service (PaaS)" deliver a computing platform and/or
solution stack as a service, often consuming cloud infrastructure and sustaining cloud
applications.[51] It facilitates deployment of applications without the cost and complexity of
buying and managing the underlying hardware and software layers.[52][53]
Infrastructure
Cloud infrastructure services, or "Infrastructure as a Service (IaaS)", deliver computer
infrastructure, typically a platform virtualization environment, as a service, so that clients buy
those resources as a fully outsourced service rather than purchasing servers, software,
data-center space or network equipment.
Server
The server layer consists of computer hardware and/or computer software products that are
specifically designed for the delivery of cloud services, including multi-core processors, cloud-
specific operating systems and combined offerings.[44][55][56][57]
Deployment models
Public cloud
Public cloud or external cloud describes cloud computing in the traditional mainstream sense,
whereby resources are dynamically provisioned on a fine-grained, self-service basis over the
Internet, via web applications/web services, from an off-site third-party provider who bills on a
fine-grained utility computing basis.[35]
Community cloud
A community cloud may be established where several organizations have similar requirements
and seek to share infrastructure so as to realize some of the benefits of cloud computing. With
the costs spread over fewer users than a public cloud (but more than a single tenant), this option is
more expensive but may offer a higher level of privacy, security and/or policy compliance.
Examples of community cloud include Google's "Gov Cloud".[58]
Hybrid cloud
A hybrid cloud environment consisting of multiple internal and/or external providers[59] "will be
typical for most enterprises".[60] By integrating multiple cloud services users may be able to ease
the transition to public cloud services while avoiding issues such as PCI compliance.[61]
Another perspective on deploying a web application in the cloud is hybrid web hosting, where
the hosting infrastructure mixes cloud hosting for the web server with a managed dedicated
server for the database server.
Private cloud
The concept of a Private Computer Utility was first described by Douglas Parkhill in his 1966
book "The Challenge of the Computer Utility". The idea was based upon direct comparison with
other industries (e.g. the electricity industry) and the extensive use of hybrid supply models to
balance and mitigate risks.
Private cloud and internal cloud have been described as neologisms; however, the concepts
themselves pre-date the term "cloud" by 40 years. Even within modern utility industries, hybrid
models still exist despite the formation of reasonably well-functioning markets and the ability to
combine multiple providers.
Some vendors have used the terms to describe offerings that emulate cloud computing on private
networks. These (typically virtualisation automation) products offer the ability to deliver some
benefits of cloud computing whilst mitigating some of the pitfalls. These offerings capitalise on
data-security, corporate-governance, and reliability concerns during this time of transition from a
product-based to a service-based industry supported by competitive marketplaces.
They have been criticized on the basis that users "still have to buy, build, and manage them" and
as such do not benefit from lower up-front capital costs and less hands-on management,[60]
essentially "[lacking] the economic model that makes cloud computing such an intriguing
concept".[62][63]
Cloud Storage
Main article: Cloud Storage
Cloud Storage is a model of networked computer data storage where data is stored on multiple
virtual servers, generally hosted by third parties, rather than being hosted on dedicated servers.
Hosting companies operate large data centers, and people who require their data to be hosted buy
or lease storage capacity from them and use it for their storage needs. The data center operators,
in the background, virtualize the resources according to the requirements of the customer and
expose them as virtual servers, which the customers can themselves manage. Physically, the
resource may span across multiple servers.
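That separation between one logical store and many physical servers can be sketched as follows; the class and the hashing scheme are illustrative only, not any operator's actual design.

import hashlib

class VirtualStore:
    """A logical key-value store striped across multiple backing servers."""

    def __init__(self, n_servers: int = 4):
        self.servers = [dict() for _ in range(n_servers)]  # stand-ins for physical machines

    def _server_for(self, key: str) -> dict:
        digest = hashlib.sha256(key.encode()).digest()
        return self.servers[digest[0] % len(self.servers)]  # deterministic placement

    def put(self, key: str, blob: bytes) -> None:
        self._server_for(key)[key] = blob

    def get(self, key: str) -> bytes:
        return self._server_for(key)[key]

store = VirtualStore()
store.put("backups/2010-07.tar", b"archive bytes")
assert store.get("backups/2010-07.tar") == b"archive bytes"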
The Intercloud
The Intercloud scenario is based on the key concept that each single cloud does not have infinite
physical resources. If a cloud saturates the computational and storage resources of its
virtualization infrastructure, it would be unable to satisfy further requests for service allocations
from its clients. The Intercloud scenario aims to address such situations: in theory, each
cloud could use the computational and storage resources of the virtualization infrastructures of
other clouds. Such a form of pay-for-use could introduce new business opportunities among cloud
providers if they manage to move beyond the theoretical framework. Nevertheless, the Intercloud raises
many more challenges than solutions concerning cloud federation, security, interoperability,
QoS, vendor lock-in, trust, legal issues, monitoring and billing.[citation needed]
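A toy sketch of that federation idea, assuming a saturated cloud that delegates an allocation request to a peer rather than refusing it; the classes and capacities are entirely hypothetical.

class Cloud:
    def __init__(self, name: str, capacity: int, peers=None):
        self.name, self.free, self.peers = name, capacity, peers or []

    def allocate(self, units: int) -> str:
        if units <= self.free:            # serve locally while capacity lasts
            self.free -= units
            return f"allocated {units} units on {self.name}"
        for peer in self.peers:           # otherwise try federated peers
            if units <= peer.free:
                return peer.allocate(units)
        raise RuntimeError("no capacity anywhere in the federation")

peer = Cloud("cloud-b", capacity=100)
home = Cloud("cloud-a", capacity=10, peers=[peer])
print(home.allocate(50))                  # served by cloud-b, not cloud-a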
The concept of a competitive utility computing market combining many computer utilities
was originally described by Douglas Parkhill in his 1966 book, "The Challenge of the
Computer Utility". The concept has been revisited many times over the subsequent 40 years
and is essentially identical to the Intercloud.
Issues
Privacy
The cloud model has been criticized by privacy advocates for the greater ease with which the
companies hosting the cloud services can control, and thus monitor at will, lawfully or
unlawfully, the communication and data stored between the user and the host company.
Instances such as the secret NSA program that, working with AT&T and Verizon, recorded
over 10 million phone calls between American citizens cause uncertainty among privacy
advocates about the greater power such arrangements give telecommunication companies to monitor user
activity.[70] While there have been efforts (such as US-EU Safe Harbor) to "harmonise" the legal
environment, providers such as Amazon still cater to major markets (typically the United States
and the European Union) by deploying local infrastructure and allowing customers to select
"availability zones."[71]
Compliance
To comply with regulations including FISMA, HIPAA and SOX in the United
States, the Data Protection Directive in the EU and the credit card industry's PCI DSS, users may
have to adopt community or hybrid deployment modes which are typically more expensive and
may offer restricted benefits. This is how Google is able to "manage and meet additional
government policy requirements beyond FISMA"[72][73] and Rackspace Cloud are able to claim
PCI compliance.[74] Customers in the EU contracting with Cloud Providers established outside
the EU/EEA have to adhere to the EU regulations on export of personal data.[75]
Many providers also obtain SAS 70 Type II certification (e.g. Amazon,[76] Salesforce.com,[77]
Google[78] and Microsoft[79]), but this has been criticised on the grounds that the hand-picked set
of goals and standards determined by the auditor and the auditee are often not disclosed and can
vary widely.[80] Providers typically make this information available on request, under non-
disclosure agreement.[81]
Legal
In March 2007, Dell applied to trademark the term "cloud computing" (U.S. Trademark
77,139,082) in the United States. The "Notice of Allowance" the company received in July 2008
was cancelled in August, resulting in a formal rejection of the trademark application less than a
week later.
Since 2007, the number of trademark filings covering cloud computing brands, goods and
services has increased at an almost exponential rate. As companies sought to better position
themselves for cloud computing branding and marketing efforts, cloud computing trademark
filings increased by 483% between 2008 and 2009. In 2009, 116 cloud computing trademarks
were filed, and trademark analysts predict that over 500 such marks could be filed during 2010.[82]
Open source software has provided the foundation for many cloud computing implementations.[83]
In November 2007, the Free Software Foundation released the Affero General Public License,
a version of GPLv3 intended to close a perceived legal loophole associated with free software
designed to be run over a network.[84]
Most cloud providers expose APIs which are typically well-documented (often under a Creative
Commons license[85]) but also unique to their implementation and thus not interoperable. Some
vendors have adopted others' APIs[86] and there are a number of open standards under
development, including the OGF's Open Cloud Computing Interface. The Open Cloud
Consortium (OCC)[87] is working to develop consensus on early cloud computing standards and
practices.
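The interoperability gap can be pictured with a small adapter sketch: portable tooling codes against one common interface, with an adapter per vendor hiding each proprietary API. The provider classes and method names below are invented for illustration.

from abc import ABC, abstractmethod

class ComputeProvider(ABC):
    """The vendor-neutral interface a portable tool could code against."""
    @abstractmethod
    def start_instance(self, image: str) -> str: ...

class VendorAAdapter(ComputeProvider):
    def start_instance(self, image: str) -> str:
        # would call vendor A's proprietary instance-launch endpoint here
        return f"vendor-a instance from {image}"

class VendorBAdapter(ComputeProvider):
    def start_instance(self, image: str) -> str:
        # would call vendor B's differently shaped server-create API here
        return f"vendor-b server from {image}"

def launch(provider: ComputeProvider, image: str) -> str:
    return provider.start_instance(image)  # caller stays vendor-neutral

print(launch(VendorAAdapter(), "base-linux"))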
Security
The relative security of cloud computing services is a contentious issue which may be delaying
its adoption.[88] Some argue that customer data is more secure when managed internally, while
others argue that cloud providers have a strong incentive to maintain trust and as such employ a
higher level of security.[89]
The Cloud Security Alliance is a non-profit organization formed to promote the use of best
practices for providing security assurance within Cloud Computing.[90]
In addition to concerns about security, businesses are also worried about acceptable levels of
availability and performance of applications hosted in the cloud.[91]
There are also concerns about a cloud provider shutting down for financial or legal reasons,
which has happened in a number of cases.[92]
Although cloud computing is often assumed to be a form of "green computing", there is as yet
no published study to substantiate this assumption.[93] The siting of the servers affects the
environmental impact of cloud computing: in areas where the climate favors natural cooling and
plentiful renewable electricity is available, the environmental effects will be more moderate. Thus countries with
favorable conditions, such as Finland,[94] Sweden[95] and Switzerland,[96] are trying to attract cloud
computing data centers.
Research
A number of universities, vendors and government organizations are investing in research
around the topic of cloud computing.[97] Academic institutions include University of Melbourne
(Australia), Georgia Tech, Yale, Wayne State, Virginia Tech, University of Wisconsin–Madison,
Boston University, Carnegie Mellon, MIT, Indiana University, University of Massachusetts,
University of Maryland, North Carolina State, Purdue, University of California, University of
Washington, University of Virginia, University of Utah, and University of Minnesota, among others.[98]
Joint government, academic and vendor collaborative research projects include the IBM/Google
Academic Cloud Computing Initiative (ACCI). In October 2007, IBM and Google announced the
multi-university project designed to enhance students' technical knowledge to address the
challenges of cloud computing.[99] In April 2009, the National Science Foundation joined the
ACCI and awarded approximately $5 million in grants to 14 academic institutions.[100]
In July 2008, HP, Intel Corporation and Yahoo! announced the creation of a global, multi-data
center, open source test bed, called Open Cirrus,[101] designed to encourage research into all
aspects of cloud computing, service and data center management.[102] Open Cirrus partners
include the NSF, the University of Illinois (UIUC), Karlsruhe Institute of Technology, the
Infocomm Development Authority (IDA) of Singapore, the Electronics and Telecommunications
Research Institute (ETRI) in Korea, the Malaysian Institute for Microelectronic
Systems (MIMOS), and the Institute for System Programming at the Russian Academy of
Sciences (ISPRAS).[103]
In July 2010, HP Labs India announced a new cloud-based technology designed to simplify
making content mobile-enabled, even from low-end devices.[104] Called
SiteonMobile, the new technology is designed for emerging markets where people are more
likely to access the internet via mobile phones rather than computers.[105]
The IEEE Technical Committee on Services Computing[106] in IEEE Computer Society sponsors
the IEEE International Conference on Cloud Computing (CLOUD).[107] CLOUD 2010 was held
on July 5–10, 2010 in Miami, Florida.
During a video interview, Forrester Research VP Frank Gillett expressed criticism about the
nature of, and motivations behind, the push for cloud computing. He described what he calls
"cloud washing" in the industry, whereby companies relabel their products as cloud computing,
resulting in a lot of marketing innovation layered on top of real innovation. The result is a great
deal of overblown hype surrounding cloud computing. Gillett sees cloud computing as revolutionary in
the long term but over-hyped and misunderstood in the short term, representing more of a
gradual shift in our thinking about computer systems than a sudden transformational change.[108][109]
Larry Ellison, CEO of Oracle Corporation, has stated that cloud computing has been defined as
"everything that we already do" and that it will have no effect except to "change the wording on
some of our ads".[110][111] Oracle Corporation has since launched a cloud computing center and
worldwide tour. Forrester Research Principal Analyst John Rymer dismisses Ellison's remarks by
stating that his "comments are complete nonsense and he knows it".[112][113][114]
Richard Stallman said that cloud computing was simply a trap aimed at forcing more people to
buy into locked, proprietary systems that would cost them more and more over time. "It's
stupidity. It's worse than stupidity: it's a marketing hype campaign", he told The Guardian.
"Somebody is saying this is inevitable – and whenever you hear somebody saying that, it's very
likely to be a set of businesses campaigning to make it true." [115]