An Assessment of Public Cloud Storage Offerings

Earlier in the year, Gartner forecast that by year-end 2016, more than 50 percent of global companies will have stored customer-sensitive data in a public cloud platform. These services can scale very quickly, which has made them popular for applications that require flexibility.

Cloud storage offers organizations significant cost and agility benefits, but it can pose security, privacy, accessibility and performance challenges. A big part of selecting a best-fit cloud storage provider is therefore investing the time to research and fully understand its capabilities.

Furthermore, most prior industry analyst market studies have found that line-of-business leaders are the most likely to procure public cloud service offerings for their organizations.

However, survey participants at two recent business technology events on the East Coast were divided about who in their organization is primarily driving their cloud computing strategy.

So, given these latest findings, who exactly is leading the transition to cloud services? It varies by type of organization: top-level executive decision makers and lower-level IT management staff respondents were within 4 percentage points of each other.

Profile of Cloud Storage Service Demand

Avere Systems released the findings of a recent cloud adoption study conducted at the “2014 Cloud Expo New York” and the “2014 AWS Summit,” also held in New York.

Which industries are leading the move to public cloud service adoption? In addition to a cross-section of technology companies, life sciences and finance organizations were most interested in adopting cloud computing solutions.

Forty-six percent of respondents are already building a hybrid cloud — 35 percent on their own and 11 percent with a partner. Meanwhile, 31 percent of respondents do not yet have a hybrid cloud plan in place.

Thirty-three percent are planning to move over half of their storage to the public cloud. Fifty-two percent are planning to move less than half to a public cloud service provider.

Data archiving was the most common use for cloud migration, cited by 28 percent of survey respondents. Corporate file sharing was next, at 22 percent.

Somewhat surprisingly, 18 percent of survey respondents indicated that they are migrating all of their data-related business processes to a cloud services model.

Other key findings from the market study include:

  • Hybrid Cloud Adoption – 58 percent of attendees are adopting a hybrid cloud strategy. Similarly, 58 percent of attendees plan on migrating at least some of their on-premises applications to the cloud within the next two years.
  • Cloud Leadership – 26 percent of attendees indicated that their top-level executive team was driving cloud strategy within their organizations, while 22 percent said storage or data management teams were spearheading these decisions.
  • Public Cloud Vendors – 49 percent of respondents indicated that Amazon Web Services (AWS) is their cloud of choice for storage, with 56 percent considering the service for object storage. Nineteen percent of respondents are considering Google Cloud Platform (GCP), and 13 percent Microsoft Azure.

The entire survey was conducted on-site at the two industry events in New York. A total of 205 respondents were asked a series of questions relating to cloud adoption and cloud storage usage.

read more

Announcing @Stratogent to Exhibit at @CloudExpo Silicon Valley [#Cloud]

SYS-CON Events announced today that Stratogent will exhibit at SYS-CON’s 15th International Cloud Expo®, which will take place on November 4–6, 2014, at the Santa Clara Convention Center in Santa Clara, CA.
Stratogent is a custom managed services organization based in San Mateo, California. We design, implement, and support mission-critical infrastructure 24×7 on premises, in datacenters and in the cloud. Since 2005, we have acted as an extension of internal IT teams, achieving a customer retention rate of 100%.

read more

Meet @Seagate November 4-6 at @CloudExpo Silicon Valley [#Cloud]

Seagate has a strong track record of collaborating with others to develop better cloud solutions. The Seagate Cloud Builder Alliance program, for example, leverages the company’s knowledge of storage and cloud-optimized solutions to give cloud service providers the customized, flexible and scalable server and storage solutions to meet the high levels of service their customers demand. Seagate also is a member of the OpenStack Foundation and Open Compute Project to help define and promote open-source standards for cloud computing.

read more

Meet @CiscoCloud November 4-6 at @CloudExpo Silicon Valley [#Cloud #IoT]

The Internet of Things (IoT) is going to require a new way of thinking and of developing software for speed, security and innovation. This requires IT leaders to balance business as usual while anticipating the next market and technology trends. Cloud provides the right IT asset portfolio to help today’s IT leaders manage the old and prepare for the new. Today the cloud conversation is evolving from private and public to hybrid. This session will provide use cases and insights to reinforce the value of the network in helping organizations maximize their cloud experience.

read more

Meet @DataCenters_QTS on November 4-6 at @CloudExpo [#Cloud]

What process has your provider undertaken to ensure that the cloud tenant will receive predictable performance and service? What was involved in the planning? Who owns and operates the data center? What technology is being used? How is it being supported? In his session at 14th Cloud Expo, Dave Weisbrot, Cloud Business Manager for QTS, will provide the attendees a look into what it takes to stand up and stand behind a highly available certified cloud IaaS.

read more

Microsoft takes clear lead in IaaS second place race, AWS still way out in front

Picture credit: iStockPhoto

Microsoft has leapt ahead of its competition for second place in global infrastructure as a service (IaaS) revenues, yet still continues to be dwarfed by Amazon Web Services, according to the latest analysis from Synergy Research.

The analyst house, which has covered the IaaS market in depth over the past few years, released a graph today showing competitive positioning for cloud infrastructure services in Q3. Amazon retains its huge lead in the market, yet Microsoft is racing clear of competitors Google, IBM, Salesforce and Rackspace:

Synergy puts AWS’ market share at 27%, gaining ground after what the research firm called a “relatively soft” Q2. While the graph shows Microsoft having by far the highest percentage growth, it’s all relative: AWS’ revenue growth over the past four quarters is greater than Microsoft’s total cloud infrastructure revenue.

“AWS remains in a league of its own for scale,” the researchers argue.

The second quarter analysis can be found here, with AWS’ percentage growth at 49% compared with Microsoft’s 164% and IBM’s 86%. Back then the headline story was Google’s comparative lack of growth, at ‘just’ 47%.

Back then Synergy Research chief analyst John Dinsdale argued it was only going to be a blip for Amazon, telling CloudTech he didn’t think the company was resting on its laurels, “quite the opposite actually”. So it has proved in Q3.

Synergy estimates that quarterly cloud infrastructure service revenues, including IaaS, PaaS, private and hybrid cloud, have now hit the $4bn mark. The overall market grew by almost half (49%), yet the four main operators grew even more rapidly than that.

Microsoft announced last week it was offering unlimited OneDrive storage to Office 365 customers, while AWS launched a series of data centres in Frankfurt. Yet the Q3 revenue figures of the bigger players haven’t all hit the mark; Google’s came back lower than Wall Street expected, while IBM’s were also criticised.

How software defined networking and cloud computing pave the way towards a digital society

Picture credit: iStockPhoto

By Antonio Manzalini

Ultra-broadband network proliferation, advances in information technology and the evolution of endpoint devices have created the conditions for re-inventing telecommunications networks and services architectures.

Software defined networking (SDN) and network function virtualization (NFV) are just two facets of the so-called “IT-zation” or “softwarization” of telecom infrastructures. SDN decouples the software (the control plane) from the forwarding hardware of nodes such as routers and switches, and executes the control software in the cloud or on any standard or hybrid processing resources made available, such as blades or servers. SDN doesn’t just affect the evolution of Layer 2 and Layer 3 services such as switching and routing, but also impacts Layer 4 to Layer 7 network functions.
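The control/forwarding split described above can be sketched as a minimal match-action flow table. The sketch below is a toy illustration only, with hypothetical class and method names; it is not a real SDN API such as OpenFlow, but it shows the essential idea: the switch holds no policy of its own, and the controller (which could run anywhere, including in the cloud) pushes rules down to it.

```python
# Toy sketch of the SDN split: a controller (software, possibly cloud-hosted)
# installs match-action rules; the "switch" only matches packets against its
# flow table and forwards. All names here are illustrative assumptions.

class Switch:
    """Forwarding element: no routing logic of its own, only a flow table."""
    def __init__(self):
        self.flow_table = []  # list of (match_fn, out_port) pairs

    def install_rule(self, match_fn, out_port):
        self.flow_table.append((match_fn, out_port))

    def forward(self, packet):
        for match_fn, out_port in self.flow_table:
            if match_fn(packet):
                return out_port
        return None  # table miss: a real switch would punt to the controller

class Controller:
    """Control plane: decides policy and pushes it down to switches."""
    def connect(self, switch):
        # Example policy: web traffic out port 1, everything else port 2.
        switch.install_rule(lambda p: p.get("dst_port") == 80, out_port=1)
        switch.install_rule(lambda p: True, out_port=2)

switch = Switch()
Controller().connect(switch)
print(switch.forward({"dst_port": 80}))  # 1
print(switch.forward({"dst_port": 22}))  # 2
```

Changing network behavior then means reprogramming the controller, not touching each forwarding box, which is the operational point the paragraph makes.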

In reality, there are a lot of middle-boxes deployed in current networks, including Wide Area Network (WAN) optimizers, Network Address Translation (NAT) devices, performance-enhancing proxies, intrusion detection and prevention systems, and firewalls. Virtualizing these middle-box network functions allows for considerable cost savings, and this is where NFV plays a role: virtualized network functions can be dynamically deployed and moved to various locations of an infrastructure where processing power is available, not only distributed into the network but even into the cloud.
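The middle-box virtualization idea can be illustrated as service chaining: once a firewall or NAT is just software, functions can be composed in any order and relocated wherever compute is available. The following is a minimal sketch under assumed names; the function names and packet fields are hypothetical, not any real NFV framework.

```python
# Illustrative NFV service chain: middle-box functions become plain
# software functions composed in sequence. Names and fields are assumptions.

def firewall(packet):
    # Drop traffic to a blocked port (e.g. telnet); pass everything else.
    if packet["dst_port"] in {23}:
        return None
    return packet

def nat(packet):
    # Rewrite a private source address to a public one.
    return dict(packet, src="203.0.113.10")

def service_chain(packet, functions):
    """Run a packet through an ordered chain of virtual network functions."""
    for fn in functions:
        packet = fn(packet)
        if packet is None:  # dropped somewhere along the chain
            return None
    return packet

chain = [firewall, nat]
print(service_chain({"src": "10.0.0.5", "dst_port": 80}, chain))
print(service_chain({"src": "10.0.0.5", "dst_port": 23}, chain))  # None
```

Because each function is ordinary software, an orchestrator could run this chain on any available server, in the network or in the cloud, which is exactly the flexibility the paragraph attributes to NFV.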

However, SDN and NFV are not dependent on each other, though they are certainly mutually beneficial. For example, it will be possible in the medium term to develop network functions and services spanning L2-L7 as applications and execute them on virtual resources (e.g. virtual machines) hosted either in centralized cloud computing data centers or in distributed clusters of mini-data centers.

SDN and NFV are not new principles; they were proposed and demonstrated as far back as the ’90s. Now, however, the conditions for a breakthrough exist: the telecom industry considers them potentially impactful due to the emergence of ultra-broadband, low-latency connections and high-performance processing power. Technology today appears to be ready for a sustainable deployment of SDN and NFV.

This is a potentially big transformation of the telecom infrastructure.

In the long term, the distinction between the network and the cloud is likely to disappear, as more and more functions will be performed either in the network or in the cloud depending on performance requirements and cost optimization. Control and orchestration capabilities will be the key factor for success in taming the complexity of an infrastructure executing millions of software transactions or service chains. The most important requirement will be ensuring ultra-low application latency.

Looking at the evolution of telecom from this perspective, the so-called role of a “software-defined operator” is gaining traction. A software-defined operator could be described as an operator that essentially owns software networks and services platforms, whose L2-L7 functions are decoupled from the hardware and executed and operated either on distributed computing capabilities or in the cloud. This will be possible in less than five years, but full adoption of this innovation will depend on a number of factors, including business models, sustainability and proper regulation.

In general, software-defined operators will see dramatic cost reductions, including 40-50% savings in energy, CAPEX reductions, improved efficiency in overall operations (as much as 35% OPEX savings just by automating processes), and reduced time-to-market when deploying services. Other strategic scenarios are even possible: these operators could “upload and execute” their networks and services platforms anywhere in the world if there are infrastructure providers willing to rent hardware resources such as processing, storage and transmission. This might represent a disruptive evolution of the business towards OPEX-centric models, which are typical of the cloud.

It should be mentioned that IT advances in computing and storage hardware have investment profiles, in terms of CAPEX and OPEX, that are quite different from those of traditional networks. Telecom network investments span longer periods, anywhere from five to ten years, and require geographically distributed maintenance efforts that increase operational costs. Software, on the other hand, implies large upfront costs for development and testing and requires a longer initial phase to guarantee carrier-grade applications.

Still, these trends are making network and cloud computing innovation accessible to all enterprises in almost any part of the world on an equal basis. This evolution will lower the barriers for new players to enter the telecom and information communication technology (ICT) markets: competition is moving to the realm of software, and lower CAPEX and/or OPEX will be required to start providing ICT services.

In general, we can argue that these trends are accelerating the transition towards the Digital Society and the Digital Economy, in which network infrastructures, increasingly pervasive and embedding processing and storage capabilities, will become the “nervous system” of our society. This will enable new service scenarios.

In the book The Second Machine Age, the authors Brynjolfsson and McAfee argue that the exponential growth in the computing power of machines, the amount of digital information and the number of relatively cheap interconnected devices will soon bring “machines” to do things that we humans usually do today. This is another facet of the same IT-zation trend: the creation of a new and pervasive “machine intelligence,” supported by a highly flexible network, capable of fundamentally changing the economy.

In fact, even today data and information instantaneously reach almost every corner of the world through ultra-broadband, low-latency networks, where a huge amount of computing via the cloud is available to transform them into knowledge. This is essentially the definition of “intelligence”: the capability of processing and exchanging information to understand what’s happening in the environment, to adapt to changes and to learn. The availability of huge amounts of cloud processing and storage, interconnected by flexible and fast networks, will create a pervasive “machine intelligence” able to morph the space-time physical dimensions of life, as the direct physical presence of humans will be less and less required to perform certain jobs or tasks.

When these intelligent machines “flood the society landscape,” there will be a number of socio-economic impacts: reduced human effort in jobs subject to computerization and robotization; increased local production; reduced long-distance transportation; optimized socio-economic processes; and no more need for industries to relocate to where human labor costs are lower.

Eventually, because of this evolution, several economists, as well as technologists, have started to wonder whether the usual representation of relationships among the myriad players in the telecom and cloud domains can still be modeled on the basis of value chains. There is a growing consensus that value-chain modeling will have to be complemented by a broader view that considers the business ecosystems of the true Digital Economy.

Antonio Manzalini received the M. Sc. Degree in Electronic Engineering from the Politecnico of Turin and is currently senior manager at the Innovation Dept. (Future Centre) of Telecom Italia. His current development interests are in the area of Software Defined Networks (SDN) and Network Functions Virtualization (NFV), as it relates to the evolution to 5G. Manzalini is also chair of the IEEE SDN Initiative, which is now seeking authors to present at the IEEE International Conference on Network Softwarization (NetSoft 2015), its flagship event, 13-17 April 2015, at the Cruciform Building at University College London.

Harnessing Rapid ITSM for Business Agility | @CloudExpo [#Cloud]

A new wave of ITSM technologies and methods allows for more rapid ITSM adoption — and that means better, faster support of agile business processes.
Businesses of all stripes rate the need to move faster as a top priority, and many times, that translates into the need for better and faster IT projects. But traditional IT processes and disjointed project management don’t easily afford rapid, agile, and adaptive IT innovation.

read more

How to Avoid the #BigData Black Hole | @BigDataExpo [#IoT]

It’s hard to miss the world of opportunities that data collection and analysis have opened up. But how can you avoid information overload?
It takes a lot of willpower in our data-obsessed world to say “too much!” However, there are many cases where too much information destroys productivity and actually causes bad decision-making rather than good. Yet it is hard to ignore the world of opportunities that data collection and analysis have opened up. So how do you balance the two? The first step is to understand that there is a big difference between data collection and its utilization. While the distinction seems subtle, it is key, and utilization is where many make mistakes.

read more

Internet of Things – Meet @AriaSystemsInc Nov 4-6 @CloudExpo [#Cloud #IoT]

SYS-CON Events announced today that Aria Systems, the recurring revenue expert, has been named “Bronze Sponsor” of SYS-CON’s 15th International Cloud Expo®, which will take place on November 4-6, 2014, at the Santa Clara Convention Center in Santa Clara, CA. Aria Systems helps leading businesses connect their customers with the products and services they love. Industry leaders like Pitney Bowes, Experian, AAA NCNU, VMware, HootSuite and many others choose Aria to power their recurring revenue business and deliver exceptional experiences to their customers.

read more