Cloud computing

>> Saturday, September 26, 2009

Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Users need not have knowledge of, expertise in, or control over the technology infrastructure in the "cloud" that supports them.
The concept generally incorporates combinations of the following:
infrastructure as a service (IaaS)
platform as a service (PaaS)
software as a service (SaaS)
other recent (ca. 2007–09) technologies that rely on the Internet to satisfy the computing needs of users.
Cloud computing services often provide common business applications online that are accessed from a web browser, while the software and data are stored on the servers.
The term cloud is used as a metaphor for the Internet, based on how the Internet is depicted in computer network diagrams and is an abstraction for the complex infrastructure it conceals.
The first academic use of this term appears to be by Prof. Ramnath K. Chellappa (currently at Goizueta Business School, Emory University) who originally defined it as a computing paradigm where the boundaries of computing will be determined by economic rationale rather than technical limits.


Cloud computing customers do not generally own the physical infrastructure serving as host to the software platform in question. Instead, they avoid capital expenditure by renting usage from a third-party provider. They consume resources as a service and pay only for resources that they use. Many cloud-computing offerings employ the utility computing model, which is analogous to how traditional utility services (such as electricity) are consumed, while others bill on a subscription basis. Sharing "perishable and intangible" computing power among multiple tenants can improve utilization rates, as servers are not unnecessarily left idle (which can reduce costs significantly while increasing the speed of application development). A side effect of this approach is that overall computer usage rises dramatically, as customers do not have to engineer for peak load limits. Additionally, "increased high-speed bandwidth" makes it possible to receive the same response times from centralized infrastructure at other sites.


Cloud computing users can avoid capital expenditure (CapEx) on hardware, software, and services when they pay a provider only for what they use. Consumption is usually billed on a utility basis (e.g. resources consumed, like electricity) or a subscription basis (e.g. time based, like a newspaper) with little or no upfront cost. A few cloud providers are now beginning to offer the service for a flat monthly fee as opposed to utility billing. Other benefits of this time-sharing-style approach are low barriers to entry, shared infrastructure and costs, low management overhead, and immediate access to a broad range of applications. Users can generally terminate the contract at any time (thereby avoiding return on investment risk and uncertainty), and the services are often covered by service level agreements (SLAs) with financial penalties.
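The utility vs. flat-fee trade-off above can be sketched with a small calculation. All rates and usage figures here are illustrative assumptions, not any real vendor's pricing:

```python
# Hypothetical comparison of utility-style vs. flat-fee billing.
# Rates and usage figures are illustrative assumptions only.

def utility_cost(hours_used: float, rate_per_hour: float) -> float:
    """Pay only for resources actually consumed, like electricity."""
    return hours_used * rate_per_hour

def flat_fee_cost(months: int, monthly_fee: float) -> float:
    """Fixed subscription charge regardless of consumption."""
    return months * monthly_fee

# A bursty workload that runs 200 hours in one month at $0.10/hour,
# vs. a hypothetical $50/month flat plan:
metered = utility_cost(200, 0.10)   # about $20
flat = flat_fee_cost(1, 50.00)      # $50

print(f"metered: ${metered:.2f}  flat: ${flat:.2f}")
```

Light, bursty use favors metered billing, while steady heavy use (say, 600 hours a month at the same rate) can make the flat fee cheaper, which is why providers offer both models.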
According to Nicholas Carr, the strategic importance of information technology is diminishing as it becomes standardized and less expensive. He argues that the cloud computing paradigm shift is similar to the displacement of electricity generators by electricity grids early in the 20th century.
Although companies might be able to save on upfront capital expenditures, they might not save much and might actually pay more for operating expenses. In situations where the capital expense would be relatively small, or where the organization has more flexibility in their capital budget than their operating budget, the cloud model might not make great fiscal sense. Other factors impacting the scale of any potential cost savings include the efficiency of a company’s data center as compared to the cloud vendor’s, the company’s existing operating costs, the level of adoption of cloud computing, and the type of functionality being hosted in the cloud.


VMware, Sun Microsystems, Rackspace US, IBM, Amazon, Google, BMC, Microsoft, and Yahoo are some of the major cloud computing service providers. Cloud services are being adopted by everyone from individual users to large enterprises, including VMware, General Electric, and Procter & Gamble.
As of 2009, new players, such as Ubuntu Cloud Computing, are gaining attention in the industry.
The majority of cloud computing infrastructure, as of 2009, consists of reliable services delivered through data centers and built on servers with different levels of virtualization technologies. The services are accessible anywhere that provides access to networking infrastructure. Clouds often appear as single points of access for all consumers' computing needs. Commercial offerings are generally expected to meet quality of service (QoS) requirements of customers and typically offer SLAs. Open standards are critical to the growth of cloud computing, and open source software has provided the foundation for many cloud computing implementations.

Criticism and disadvantages

Because cloud computing does not allow users to physically possess their data storage (except where data can be backed up to a user-owned device, such as a USB flash drive or hard disk), it leaves responsibility for data storage and control in the hands of the provider.
Cloud computing has been criticized for limiting the freedom of users and making them dependent on the cloud computing provider, and some critics have alleged that it is only possible to use applications or services that the provider is willing to offer. The London Times thus compares cloud computing to the centralized systems of the 1950s and 60s, in which users connected through "dumb" terminals to mainframe computers. Typically, users had no freedom to install new applications and needed approval from administrators to accomplish certain tasks, which limited both freedom and creativity. The Times argues that cloud computing is a regression to that era.
Similarly, Richard Stallman, founder of the Free Software Foundation, believes that cloud computing endangers liberties because users sacrifice their privacy and personal data to a third party. He stated that cloud computing is "simply a trap aimed at forcing more people to buy into locked, proprietary systems that would cost them more and more over time."
Further to Stallman's observation, hosting and maintaining intranet and access-restricted sites (for government, defense, institutional, and similar uses) in the cloud would be a challenge. Commercial sites that rely on tools such as web analytics may also be unable to capture the right data for their business planning.

Risk mitigation

Corporations or end-users who want to ensure continued access to their data, or to avoid losing it altogether, are typically advised to research vendors' policies on data security before using their services. Gartner, a technology analyst and consulting firm, lists several security issues that one should discuss with cloud-computing vendors:
Privileged user access—Who has specialized access to data, and how are such administrators hired and managed?
Regulatory compliance—Is the vendor willing to undergo external audits and/or security certifications?
Data location—Does the provider allow for any control over the location of data?
Data segregation—Is encryption available at all stages, and were these encryption schemes designed and tested by experienced professionals?
Recovery—What happens to data in the case of a disaster, and does the vendor offer complete restoration, and, if so, how long does that process take?
Investigative support—Does the vendor have the ability to investigate any inappropriate or illegal activity?
Long-term viability—What happens to data if the company goes out of business, and is data returned and in what format?
Data availability—Can the vendor move your data onto a different environment should the existing environment become compromised or unavailable?
In practice, one can best determine data-recovery capabilities by experiment; for example, by asking to get back old data, seeing how long it takes, and verifying that the checksums match the original data. Determining data security can be more difficult, but one approach is to encrypt the data yourself. If you encrypt data using a trusted algorithm, then, regardless of the service provider's security and encryption policies, the data will only be accessible with the decryption keys. This leads, however, to the problem of managing private keys in a pay-on-demand computing infrastructure.
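The checksum verification step described above can be sketched in a few lines. This is a minimal illustration using the standard library; the file name and data are made up, and the encrypt-it-yourself step would additionally require a vetted cryptography library before upload, which is omitted here to keep the sketch dependency-free:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Content fingerprint recorded before upload."""
    return hashlib.sha256(data).hexdigest()

# Record a checksum before handing data to the provider.
original = b"quarterly-report-2009.csv contents"  # hypothetical payload
checksum_before = sha256_hex(original)

# ... upload, then later ask the provider for the data back ...
retrieved = original  # stand-in for the provider's response

# Recompute and compare: a mismatch means the data was corrupted,
# truncated, or altered while in the provider's hands.
assert sha256_hex(retrieved) == checksum_before, "data does not match original"
```

The same round trip, timed, also answers the "how long does recovery take" question from the Gartner list above.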

Key characteristics

Agility improves with users able to rapidly and inexpensively re-provision technological infrastructure resources. The cost of overall computing is unchanged, however, and the providers will merely absorb up-front costs and spread costs over a longer period.
Cost is claimed to be greatly reduced, and capital expenditure is converted to operational expenditure. This ostensibly lowers barriers to entry, as infrastructure is typically provided by a third party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained, with usage-based options, and fewer in-house IT skills are required for implementation. Some would argue that, given the low cost of computing resources, the IT burden merely shifts the cost from in-house to outsourced providers. Furthermore, any cost reduction must be weighed against a corresponding loss of control, and against access and security risks.
Device and location independence enable users to access systems using a web browser regardless of their location or what device they are using (e.g., PC, mobile). As the infrastructure is off-site (typically provided by a third party) and accessed via the Internet, users can connect from anywhere.
Multi-tenancy enables sharing of resources and costs across a large pool of users thus allowing for:
Centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
Peak-load capacity increases (users need not engineer for highest possible load-levels)
Utilization and efficiency improvements for systems that are often only 10–20% utilized.
Reliability improves through the use of multiple redundant sites, which makes cloud computing suitable for business continuity and disaster recovery. Nonetheless, many major cloud computing services have suffered outages, and IT and business managers can at times do little when they are affected.
Scalability improves via dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis in near real-time, without users having to engineer for peak loads. Performance is monitored, and consistent, loosely coupled architectures are constructed using web services as the system interface.
Security typically improves due to centralization of data, increased security-focused resources, etc., but concerns can persist about loss of control over certain sensitive data. Security is often as good as or better than under traditional systems, in part because providers are able to devote resources to solving security issues that many customers cannot afford. Providers typically log accesses, but accessing the audit logs themselves can be difficult or impossible. Ownership, control and access to data controlled by "cloud" providers may be made more difficult, just as it is sometimes difficult to gain access to "live" support with current utilities. Under the cloud paradigm, management of sensitive data is placed in the hands of cloud providers and third parties.
Sustainability comes about through improved resource utilization, more efficient systems, and carbon neutrality. Nonetheless, computers and associated infrastructure are major consumers of energy, and a given (server-based) computing task will use roughly the same amount of energy whether it runs on-site or off.
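The utilization and efficiency gains from multi-tenancy noted above can be illustrated with a back-of-the-envelope calculation. All figures here are assumptions for illustration, not measurements:

```python
# Illustrative consolidation arithmetic for the 10-20% utilization
# figure cited above. All numbers are assumptions.

servers = 100               # dedicated servers in a traditional setup
avg_utilization = 0.15      # each ~15% busy on average
target_utilization = 0.60   # a conservative multi-tenant target

# Total useful work, expressed in "fully busy server" units:
useful_work = servers * avg_utilization            # about 15

# Servers needed if the same work is pooled at the target utilization:
pooled_servers = useful_work / target_utilization  # about 25

print(f"{servers} lightly used servers -> ~{pooled_servers:.0f} pooled servers")
```

Under these assumptions, a shared pool does the same work on roughly a quarter of the hardware, which is the mechanism behind both the cost and the sustainability claims in this list.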


A cloud application leverages the Cloud in software architecture, often eliminating the need to install and run the application on the customer's own computer, thus alleviating the burden of software maintenance, ongoing operation, and support. For example:
Peer-to-peer / volunteer computing (Bittorrent, BOINC Projects, Skype)
Web application (Facebook)
Software as a service (Google Apps, SAP and Salesforce)
Software plus services (Microsoft Online Services)


Cloud infrastructure, such as Infrastructure as a service, is the delivery of computer infrastructure, typically a platform virtualization environment, as a service. For example:
Full virtualization (GoGrid, Skytap, iland)
Grid computing (Sun Cloud)
Management (RightScale)
Compute (Amazon Elastic Compute Cloud)
Platform
Storage (Amazon S3, Nirvanix, Rackspace)


A cloud platform, such as Platform as a service (PaaS), delivers a computing platform and/or solution stack as a service, facilitating the deployment of applications without the cost and complexity of buying and managing the underlying hardware and software layers. For example:
Code-based web application frameworks
Java Google Web Toolkit (Google App Engine)
Python Django (Google App Engine)
Ruby on Rails (Heroku)
.NET (Azure Services Platform)
Non-code-based web application frameworks
Cloud Computing Application & Web Hosting (Rackspace Cloud)
Proprietary


A user is a consumer of cloud computing. The privacy of users in cloud computing has become an increasing concern. The rights of users are also an issue, which is being addressed via a community effort to create a bill of rights. The Franklin Street Statement was drafted with an eye toward protecting users' freedoms.

