Category Archive: Research

Cloud Computing Entering Hypergrowth Phase

Cloud services and cloud platforms are now an undeniable part of the IT landscape. Forrester research indicates the shift has begun from exploration of cloud as a potential option, to rationalization of cloud services within the overall IT portfolio.

Cloud platforms, most notably Amazon Web Services, collectively generated only $4.7 billion last year but are maturing quickly thanks to stronger recent solutions from traditional IT vendors IBM, HP and Microsoft. The growth in use, maturity, and financial viability of public cloud platforms is proving their value as legitimate deployment options for enterprise applications. While not a one-for-one replacement for on-premises deployment, hosting, or colocation, cloud platforms are ideal deployment options for elastic and transient workloads built on modern application architectures.

For applications and services built in an agile mode with modern architectures, discrete cloud services, such as database, storage, integration and other standalone cloud middleware components, will empower developers by freeing them from the management and maintenance of these components, reducing overall deployment footprint and cost. These services are also managed and enhanced by vendors, often as frequently as daily, delivering new capabilities that can help a company keep pace with the changing desires of an empowered customer base.

As the largest clouds continue to invest in efficiencies that can only be achieved at their massive scales, the gulf between the cost efficiencies that can be had from the cloud and what is possible on-premise or through other outsourcing and hosting options will widen dramatically.

How Forrester came to these conclusions.

Stanford Researchers Create Tool to Triple Cloud Server Efficiency

Two Stanford engineers have created a cluster management tool that can triple server efficiency while delivering reliable service at all times, allowing data center operators to serve more customers for each dollar they invest.

“This is a proof of concept for an approach that could change the way we manage server clusters,” said Jason Mars, a computer science professor at the University of Michigan at Ann Arbor.

Kushagra Vaid, general manager for cloud server engineering at Microsoft Corp., said that the largest data center operators have devised ways to manage their operations but that a great many smaller organizations haven’t.

“If you can double the amount of work you do with the same server footprint, it would give you the agility to grow your business fast,” said Vaid, who oversees a global operation with more than a million servers catering to more than a billion users.

How the tool, called Quasar, works takes some explaining, but one key ingredient is a sophisticated algorithm modeled on the way companies such as Netflix and Amazon recommend movies, books and other products to their customers. Instead of asking developers to estimate how much capacity they are likely to need, the Stanford system would start by asking what sort of performance their applications require.
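To make the Netflix analogy concrete, here is a minimal, entirely hypothetical sketch of the idea (not Quasar's actual code): observed performance scores form a sparse workloads-by-server-configurations matrix, and a low-rank approximation, the same family of technique used in recommendation systems, predicts scores for configurations a workload has never been profiled on. All data values are made up for illustration.

```python
import numpy as np

# Rows: workloads, columns: candidate server configurations.
# NaN marks configurations a workload has never been profiled on.
observed = np.array([
    [0.9, 0.4, np.nan],
    [0.8, np.nan, 0.2],
    [np.nan, 0.5, 0.3],
])

# Fill missing cells with column means as a starting guess.
col_means = np.nanmean(observed, axis=0)
filled = np.where(np.isnan(observed), col_means, observed)

# A rank-1 SVD approximation smooths the matrix, predicting how each
# workload would perform on every configuration, profiled or not.
U, s, Vt = np.linalg.svd(filled, full_matrices=False)
predicted = s[0] * np.outer(U[:, 0], Vt[0, :])

# Recommend the best predicted configuration for each workload.
best_config = predicted.argmax(axis=1)
```

A real system would iterate the fill-and-factor step and combine it with scheduling constraints, but the core "recommend resources like you recommend movies" intuition is the same.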

Read much more detail here.


Will Disks Using Shingled Magnetic Recording Kill Tape for Cold Storage?

We previously reported on the rumored Seagate/eVault “cold storage” tech initiative seeking to use disks to supplant tape libraries.

Now comes this analysis from The Register.

We know Facebook’s Open Compute Project has a cold storage vault configuration using shingled magnetic recording drives. Both Google (mail backup and more) and Amazon (Glacier) have tape vaults in their storage estate. Shingled drives could change that equation because the cost/GB of a 6TB shingled drive is probably a lot less than that of a 4TB drive and, over, say, 500,000 drives, that saving turns into a big sum of dollars.
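A quick back-of-envelope calculation shows how a modest cost/GB gap compounds at fleet scale. Only the drive capacities and the 500,000-drive figure come from the article; the dollar prices below are made-up placeholders.

```python
# Hypothetical drive prices, for illustration only.
shingled_price, shingled_tb = 250.0, 6          # assumed 6TB shingled drive
conventional_price, conventional_tb = 220.0, 4  # assumed 4TB conventional drive

cost_per_tb_shingled = shingled_price / shingled_tb              # ~$41.67/TB
cost_per_tb_conventional = conventional_price / conventional_tb  # $55.00/TB

# Capacity of a 500,000-drive shingled fleet, and the saving versus
# buying the same capacity in conventional drives.
fleet_tb = 500_000 * shingled_tb
saving = fleet_tb * (cost_per_tb_conventional - cost_per_tb_shingled)
```

With these placeholder prices the fleet-wide saving comes to $40 million, which is the kind of sum that makes operators rethink tape.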

What are shingled drives, you ask? This video from Seagate explains:

Cloud Computing is Dead. Long Live Quantum Cloud Computing

Qcloud “…aims to provide resources for anybody interested in quantum technologies, in particular those who want to have some practical experience of using and manipulating information using quantum computers.”

We don’t even pretend to understand quantum computing. Now it’s in the cloud?!?

The Bloch sphere is a representation of a qubit, the fundamental building block of quantum computers (source: Wikipedia).

Survey Shows Extent of NSA/PRISM’s Damage to US Cloud Companies

A survey by the Cloud Security Alliance found that 56% of non-US residents were now less likely to use US-based cloud providers, in light of recent revelations about government access to customer information.

During June and July of 2013, news of a whistleblower, US government contractor Edward Snowden, dominated global headlines. Snowden provided evidence of US government access to information from telecommunications and Internet providers via secret court orders as specified by the Patriot Act. The subsequent news leaks indicated that allied governments of the US may have also received some of this information and acted upon it in unknown ways. As this news became widespread, it led to a great deal of debate and soul searching about appropriate access to an individual’s digital information, both within the United States and in other countries.

CSA initiated this survey to collect a broad spectrum of member opinions about this news, and to understand how this impacts attitudes about using public cloud providers.

Study Finds Enterprise Cloud Focus Shifting From Adoption to Optimization

Cloudyn together with The Big Data Group has released the latest AWS customer optimization data, reinforcing the positive growth trend expected for the year ahead.

“We set out to evaluate whether the projected 2013 ‘year of cloud optimization’ is on course and discovered that we are well into the public cloud adoption life cycle. In 2011 and 2012 the conversation centered around how and when to move to the cloud. Now it is all about companies looking for efficiencies and cost controls,” commented David Feinleib, Managing Director of The Big Data Group.

The study, based on over 450 selected AWS and Cloudyn customers, highlights a more mature approach to cloud deployments, reflected in a deeper understanding of where inefficiencies lurk and how to optimize them. EC2 makes up 62% of total AWS spend, with more than 50% of customers now using Reserved Instances in their deployment mix. However, On-Demand pricing remains the top choice for most, accounting for 71% of EC2 spend. Even for customers using reservations, there is still opportunity for further efficiency.

For example, Cloudyn’s Unused Reservation Detector has assisted customers in finding a startling 24% of unused reservations. These can be recycled by relocating matching On-Demand instances to the availability zone of the unused reservation.
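The recycling idea described above can be sketched in a few lines. This is an illustrative toy, not Cloudyn's actual detector: it pairs each unused reservation with a running On-Demand instance of the same type, suggesting a move into the reservation's availability zone. Instance IDs, types, and zones are invented.

```python
def suggest_moves(reservations, instances):
    """Pair each unused reservation with a matching On-Demand instance,
    returning (instance_id, target_availability_zone) suggestions."""
    moves, pool = [], list(instances)
    for res in reservations:
        for inst in pool:
            if inst["type"] == res["type"]:
                moves.append((inst["id"], res["az"]))
                pool.remove(inst)  # each instance fills at most one reservation
                break
    return moves

# Hypothetical inventory for illustration.
unused_reservations = [
    {"type": "m1.medium", "az": "us-east-1a"},
    {"type": "m1.large", "az": "us-east-1b"},
]
on_demand = [
    {"id": "i-123", "type": "m1.medium", "az": "us-east-1c"},
    {"id": "i-456", "type": "m1.small", "az": "us-east-1a"},
]

moves = suggest_moves(unused_reservations, on_demand)
```

Here only the m1.medium reservation finds a match, so the suggestion is to move i-123 into us-east-1a, where the already-paid-for reservation would cover it.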

There is also a shift away from large instance types to medium, where two medium instances cost the same as one large, but can produce 30% more output. However, with the low 8-9% utilization rates of the popular instance types, there is certainly more work to be done on the road to cloud optimization.
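Putting the article's figures into code makes the economics of the medium-versus-large shift explicit: equal cost, 30% more output, but utilization so low that most paid-for capacity sits idle either way. The normalization to 1.0 is ours; the 30% and 8-9% figures come from the study.

```python
# Normalized units: one large instance costs 1.0 and produces 1.0 output.
large_cost, large_output = 1.0, 1.0
two_medium_cost, two_medium_output = 1.0, 1.3  # same cost, 30% more output

# Throughput per dollar improves by 30% when switching to two mediums.
throughput_per_dollar_gain = (two_medium_output / two_medium_cost) / \
                             (large_output / large_cost) - 1

# At the reported 8-9% utilization (midpoint used here), the larger
# problem is idle capacity, regardless of instance size.
utilization = 0.085
idle_share = 1 - utilization
```

In other words, right-sizing instance types is worth 30%, but attacking the roughly 91% of capacity that goes unused is the bigger prize.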

Cloudyn and The Big Data Group will host a webinar on May 1, 2013 at 9:00 am PT focused on deployment efficiency.

How Common are Private Clouds? How About Enterprise Use of Public Cloud Services?

Joe McKendrick, who covers technology at Forbes, has good coverage and analysis of a new survey (pdf) just published by Unisphere Research on adoption of cloud technology by corporations:

Close to two-fifths of organizations now run private clouds in one form or another, and one-fourth are using public cloud services in an enterprise capacity. Private clouds are being extended deeper into the organizations that have them — a majority expect to be running most of their workloads in the cloud within the next 12 months, especially Platform as a Service middleware.  In addition, close to one-third of public cloud users report they are employing hosted services to run their private clouds for them.

Read the full article.

Survey Says 40 Per Cent of IT Managers Have Suffered a Cloud Outage

According to a survey by Kelton done for TeamQuest, nearly four in ten respondents reported having suffered a cloud outage:

Many survey respondents believe the reported outages could have been prevented. Capacity management is cited as one way to minimize the risks associated with cloud computing, according to respondents in a survey from Kelton Research, commissioned by TeamQuest Corporation.