Category archive: Public Cloud

57% of organizations still don’t have a multi-cloud strategy – survey

Research from VMTurbo has highlighted that 57% of organizations have no multi-cloud strategy at all, while 35% do not have a private cloud strategy and 28% lack one for public cloud.

Although hybrid cloud is considered one of the growing trends within the industry, the research suggests the noise behind multi-cloud strategies is coming from either a small number of customers, or from vendor organizations themselves. Of those who would be considered in the ‘Functional Multi-cloud Owner’ group, which only represented 10.4% of the respondents, almost half were using a two-cloud model, and just over a quarter were using a three-cloud model. The multi-cloud strategy was favoured by larger organizations in general.

“A lack of cloud strategy doesn’t mean an organization has studied and rejected the idea of the cloud; it means it has given adoption little or no thought at all,” said Charles Crouchman, CTO of VMTurbo. “As organizations make the journey from on-premise IT, to public and private clouds, and finally to multi- and hybrid clouds, it’s essential that they address this.

“Having a cloud strategy means understanding the precise costs and challenges that the cloud will introduce, knowing how to make the cloud approach work for you, and choosing technologies that will supplement cloud adoption. For instance, by automating workload allocation so that services are always provided with the best performance for the best cost. Without a strategy, organizations will be condemning themselves to higher-than-expected costs, and a cloud that never performs to its full potential.”

The survey also demonstrated that total cost of ownership is not fully understood within the community, and even less so within smaller organizations. SMEs planning to build private cloud environments estimated their budget to be in the region of $150,000 (the average across all respondents), whereas the total bill for those who had already completed such projects averaged $898,508, roughly six times the estimate.

The statistic backs up the view of a number of organizations that there should be more to the business case for moving to the cloud than simply reducing CAPEX and OPEX. Last month, BCN spoke to Gwil Davies, Director & Cloud Lead in the EMEA IT Infrastructure Centre of Excellence at Deloitte, to understand the economics behind cloud computing. Davies believes a successful journey to the cloud is not just about reducing CAPEX and OPEX throughout the organization, but about identifying where value can be achieved through a cloud-enabled business.

“I think it’s more important for organizations to get a real understanding of how to use the cloud, and perhaps not automatically assume that moving all of their current IT into the cloud is going to be the cheaper solution,” said Davies.

The business case for the cloud is almost entirely dependent on the long-term ambitions of the business itself, though the survey does imply a need to further educate some corners of the IT industry on the benefits and perceived cost of private cloud. Cloud computing as a concept may have penetrated the mainstream market, but its benefits appear to be less well understood.

Microsoft shifts focus to Chinese cloud market

Microsoft has announced a successful year in the Chinese market, as well as intentions to step up its expansion plans in the region, according to China Daily.

The company claims it now has more than 65,000 corporate clients, and appetite for its Azure offering among Chinese enterprise organizations is steadily increasing. As part of the expansion plans, Microsoft lowered its prices for Chinese customers earlier this month, seemingly in an effort to undercut its global competitor AWS, as well as local powerhouses such as Alibaba and Tencent.

“Though the GDP growth is slowing down, Chinese companies still need to focus on three points to remain relevant and competitive: innovation, productivity and the return of investments,” said Ralph Haupter, CEO of Microsoft in China. “And cloud computing can help in all of the above three aspects. We will focus on manufacturing, retail, automotive, media and other industries to further expand market share.”

While China has proved to be one of the top priorities for the majority of cloud players in recent years, a recent report from BSA highlighted that the country was one of the poorest performers in the global IT community. Measuring each country on its cloud policies and legislation, as well as the readiness of its enterprises, the report ranked China 23rd out of the 24 top IT nations worldwide, owing mainly to poor scores in the data privacy, cybercrime, promotion of free trade and security categories, though it performed poorly across every category.

Despite concerns from the BSA, Ji Yanhang, an analyst at Analysys International, believes the market has strong potential, stating “China’s national strategies, such as boosting high-end manufacturing, will increase demand for cloud services in the coming years.”

The announcement follows last week’s quarterly earnings call, where CEO Satya Nadella reported that Office commercial products and cloud services revenue grew 7%, Office consumer products and cloud services revenue grew 6%, and Dynamics products and cloud services revenue grew 9%. Azure revenues grew 120% over the period, though this is down from 140% growth in the previous quarter.

Dropbox launches Project Infinite to bolster mobility capabilities

At its Dropbox Open London event, Dropbox announced the launch of Project Infinite, a new offering which the company claims meets expectations for how people find, access, and collaborate on large amounts of data.

Building on the ideas and trends of mobility, collaboration and accessibility, Dropbox believes traditional tools, such as shared network drives and browser-based solutions, no longer meet these standards. The company claims Project Infinite will enable customers to work directly from the cloud, removing any concerns about the power and storage capabilities of their device.

“With Project Infinite, we’re addressing a major issue our users have asked us to solve,” said Genevieve Sheehan, Product Manager at Dropbox. “The amount of information being created and shared has exploded, but most people still work on devices with limited storage capacity. While teams can store terabyte upon terabyte in the cloud, most individuals’ laptops can only store a small fraction of that. Getting secure access to all the team’s data usually means jumping over to a web browser, a clunky user experience at best.

“Project Infinite will enable users to seamlessly and securely access all their Dropbox files from the desktop, regardless of how much space they have available on their hard drives. Everything in the company’s Dropbox that you’re given access to, whether it’s stored locally or in the cloud, will show up in Dropbox on your desktop. If it’s synced locally, you’ll see the familiar green checkmark, while everything else will have a new cloud icon.”

The company also announced it has been growing in Europe, supported by the appointment of a new European Vice President, Philip Lacor, who joins from Vodafone in Germany. The company now claims more than 500 million registered users, as well as use within 52% of companies in the Fortune 500, 33% of companies in the FTSE 100, and 29% of companies in the Global 2000.

Software-Defined Data Centre to become a common fixture in US – survey

A survey from security and compliance company HyTrust claims the Software-Defined Data Centre (SDDC) is on the verge of becoming a common fixture in corporate America.

65% of respondents predict faster deployment in 2016, while 62% anticipate increased adoption of the SDDC. Nearly half see greater adoption of network virtualization, while even more, 53%, anticipate increased adoption of storage virtualization. Half of respondents also anticipate higher levels of public cloud adoption over the course of 2016.

“This survey is truly interesting in that it uncovers a new level of maturity in organizations pursuing a SDDC leveraging virtualization and the cloud. It’s long been happening, but now faster and with greater conviction and comfort than perhaps ever before,” said Eric Chiu, President of HyTrust. “Security and privacy have always been the critical inhibitors, and no one denies that these issues still concern senior executives.

“But now we can also see that technologies like those offered by HyTrust, which balance a high level of security and control with smooth automation, are having a major impact. The benefits of virtualized and cloud infrastructures are undeniable—think agility, flexibility and lower cost, among many other advantages—and the obstacles to enjoying those benefits are increasingly being overcome.”

From a security perspective, 74% of respondents believe security is less of an obstacle to adoption than it was 12 months ago, though that is not to say security challenges have diminished significantly. 54% of respondents believe there will be an increased number of breaches throughout 2016, whereas only 11% say the contrary. In terms of migration, 67% believe security will ultimately slow down the process, and 70% believe there will be the same or even greater levels of internal compliance and auditing challenges following the transition to an SDDC platform.

While the Software-Defined Data Centre should not be considered a new term or trend within the industry, levels of adoption and trust have been lower than for other technologies in the cloud world. As the industry continues its journey towards automation, the SDDC conversation will likely only grow louder, as the survey demonstrates.

AWS launches new features at Chicago Summit

Amazon Web Services has launched a number of new features, along with the announcement that AWS Import/Export Snowball is now available in four new regions, including Europe.

Speaking at the AWS Chicago Summit, the team announced several updates, including new security features, tools which simplify the movement of data around an organization’s cloud, platforms for automatically deploying and running apps on Amazon’s cloud infrastructure, testing features, and authentication services.

First up is an update to AWS Device Farm, a service initially introduced last June which enables customers to test mobile apps on real devices. The service is built on the concept of ‘write once, test everywhere’, giving developers the chance to test apps in more than 200 unique environments (a variety of carriers, manufacturers, models, operating systems and so on). The update now provides customers with remote access to devices for interactive testing.

Writing on the AWS blog, Jeff Barr, Chief Evangelist at Amazon Web Services said, “you simply open a new session on the desired device, wait (generally a minute or two) until the device is available, and then interact with the device via the AWS Management Console. You can gesture, swipe, and interact with devices in real time directly through your web browser as if the device was on your desk or in your hand. This includes installing and running applications.”
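For teams that script their testing, the same remote access sessions can also be requested through the Device Farm API. Below is a minimal sketch using Python and boto3, assuming an existing Device Farm project; the project ARN and session name are placeholders, not values taken from the announcement.

```python
# Minimal sketch: request an interactive remote access session on a real
# device via the boto3 'devicefarm' client. The project ARN is hypothetical.
import boto3

# Device Farm's API is served from the us-west-2 region.
devicefarm = boto3.client("devicefarm", region_name="us-west-2")

# Pick the first Android device in the catalogue that supports remote access.
device = next(
    d for d in devicefarm.list_devices()["devices"]
    if d["platform"] == "ANDROID" and d.get("remoteAccessEnabled")
)

# Ask for an interactive session on that device for manual, in-browser testing.
session = devicefarm.create_remote_access_session(
    projectArn="arn:aws:devicefarm:us-west-2:123456789012:project:EXAMPLE",  # placeholder
    deviceArn=device["arn"],
    name="interactive-smoke-test",
)

# The session starts as PENDING while the device is prepared, then becomes RUNNING.
print(session["remoteAccessSession"]["status"])
```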

Amazon S3 and Snowball, designed to increase the speed of the data migration process, also received attention during the event. AWS Import/Export Snowball was launched for customers who need to move larger amounts of data, generally 10 terabytes or more, and has now been beefed up once again. New features for S3 make use of the AWS edge infrastructure to increase transfer speed, while Snowball gains a larger-capacity appliance and is now available in four new regions.

“Many AWS customers are now using AWS Import/Export Snowball to move large amounts of data in and out of the AWS Cloud,” said Barr. “The original Snowball appliances had a capacity of 50 terabytes. Today we are launching a newer appliance with 80 terabytes of capacity.”
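The article does not name the S3 feature, but the description of using AWS edge infrastructure to speed up transfers matches S3 Transfer Acceleration. As a rough, hedged sketch of what enabling it looks like from boto3 (the bucket and file names below are placeholders):

```python
# Minimal sketch: enable S3 Transfer Acceleration on a bucket, then upload
# through the accelerated (edge-routed) endpoint. Names are hypothetical.
import boto3
from botocore.client import Config

s3 = boto3.client("s3")

# Turn on Transfer Acceleration for an existing bucket.
s3.put_bucket_accelerate_configuration(
    Bucket="example-migration-bucket",
    AccelerateConfiguration={"Status": "Enabled"},
)

# Subsequent clients can opt in to the accelerated endpoint for transfers.
s3_accelerated = boto3.client(
    "s3", config=Config(s3={"use_accelerate_endpoint": True})
)
s3_accelerated.upload_file(
    "large-dataset.tar", "example-migration-bucket", "large-dataset.tar"
)
```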

Amazon Kinesis, a service which enables users to manage data streamed into the cloud, has been updated to allow users to deploy, run, and scale Elasticsearch in the AWS Cloud, as well as to integrate with Amazon CloudWatch, AWS’s monitoring service.
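As a rough illustration of the ingest side that these updates build on, the sketch below pushes a JSON event into a Kinesis stream with boto3; the stream name and event are hypothetical, and delivery into Elasticsearch or CloudWatch would be configured separately on the AWS side.

```python
# Minimal sketch: write one event into a Kinesis stream. Records sharing a
# partition key land on the same shard, preserving ordering for that key.
import json
import boto3

kinesis = boto3.client("kinesis")

event = {"user_id": 42, "action": "checkout", "amount_usd": 19.99}  # sample event

kinesis.put_record(
    StreamName="example-clickstream",            # hypothetical stream name
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=str(event["user_id"]),
)
```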

The Cognito service allows apps to add authentication, user management, and data synchronization without having to write backend code or manage any infrastructure. The new ‘Your User Pools’ feature allows developers to build a user directory that can scale to hundreds of millions of users, helping them manage the authentication process.

“Using a user pool gives you detailed control over the sign-up and sign-in aspects of your web and mobile SaaS apps, games, and so forth,” said Barr. “Building and running a directory service at scale is not easy, but is definitely undifferentiated heavy lifting, with the added security burden that comes when you are managing user names, passwords, email addresses, and other sensitive pieces of information. You don’t need to build or run your own directory service when you use Cognito Identity.”
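For a sense of what the sign-up flow against a user pool looks like from application code, here is a minimal sketch using boto3’s ‘cognito-idp’ client; the app client ID, region, and user details are placeholders rather than values from the announcement.

```python
# Minimal sketch: register a user against a Cognito User Pool app client.
import boto3

cognito = boto3.client("cognito-idp", region_name="us-east-1")

# Create the user; Cognito handles password storage and verification flows.
cognito.sign_up(
    ClientId="example-app-client-id",            # hypothetical user pool app client
    Username="jane.doe@example.com",
    Password="CorrectHorseBatteryStaple1!",
    UserAttributes=[{"Name": "email", "Value": "jane.doe@example.com"}],
)

# Cognito then sends a confirmation code, which the app confirms with:
# cognito.confirm_sign_up(ClientId="example-app-client-id",
#                         Username="jane.doe@example.com",
#                         ConfirmationCode="123456")
```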

Finally, Elastic Beanstalk, which automatically deploys and runs apps on Amazon’s cloud infrastructure, has also been updated with support for managed platform updates. Developers can now select a maintenance window, and the new feature will automatically update the environment to the latest platform version.

“The updates are installed using an immutable deployment model to ensure that no changes are made to the existing environment until the updated replacement instances are available and deemed healthy (according to the health check that you have configured for the application),” said Barr.
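As a sketch of how such a maintenance window might be set programmatically, the snippet below uses boto3’s Elastic Beanstalk client. The environment name is a placeholder, and the managed-actions namespaces and option names are stated as assumptions about the configuration surface, not quoted from the article.

```python
# Minimal sketch: enable managed platform updates on an existing environment
# and restrict them to a weekly maintenance window.
import boto3

eb = boto3.client("elasticbeanstalk")

eb.update_environment(
    EnvironmentName="example-env",               # hypothetical environment
    OptionSettings=[
        {   # turn managed actions on
            "Namespace": "aws:elasticbeanstalk:managedactions",
            "OptionName": "ManagedActionsEnabled",
            "Value": "true",
        },
        {   # weekly maintenance window (UTC) in which updates may run
            "Namespace": "aws:elasticbeanstalk:managedactions",
            "OptionName": "PreferredStartTime",
            "Value": "Sun:02:00",
        },
        {   # limit automatic updates to minor/patch platform releases
            "Namespace": "aws:elasticbeanstalk:managedactions:platformupdate",
            "OptionName": "UpdateLevel",
            "Value": "minor",
        },
    ],
)
```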

Google cloud team launches damage control mission

Google will offer service credits to all customers affected by the Google Compute Engine outage, in what appears to be a damage control exercise as the company looks to gain ground on AWS and Microsoft Azure in the public cloud market segment.

On Monday, 11 April, Google Compute Engine instances in all regions lost external connectivity for a total of 18 minutes. The outage has been blamed on two separate bugs, either of which alone would not have caused any major problems, though in combination they resulted in a service outage. Although the incident has seemingly caused embarrassment for the company, it did not impact more visible consumer services such as Google Maps or Gmail.

“We recognize the severity of this outage, and we apologize to all of our customers for allowing it to occur,” said Benjamin Treynor Sloss, VP of Engineering at Google, in a statement on the company’s blog. “As of this writing, the root cause of the outage is fully understood and GCE is not at risk of a recurrence. Additionally, our engineering teams will be working over the next several weeks on a broad array of prevention, detection and mitigation systems intended to add additional defence in depth to our existing production safeguards.

“We take all outages seriously, but we are particularly concerned with outages which affect multiple zones simultaneously because it is difficult for our customers to mitigate the effect of such outages. It is our hope that, by being transparent and providing considerable detail, we both help you to build more reliable services and we demonstrate our ongoing commitment to offering you a reliable Google Cloud platform.”

While the outage would not appear to have caused any major damage for the company, competitors in the space may secretly be pleased with the level of publicity the incident has received. Google has been ramping up efforts in recent months to bolster its cloud computing capabilities and tackle the public cloud market, hiring industry heavy-hitters such as Diane Greene, pursuing rumoured acquisitions, and announcing plans to open 12 new data centres by the end of 2017.

The company currently sits in third place in the public cloud market segment, behind AWS and Microsoft Azure, though it had been demonstrating healthy growth in the months prior to the outage.

Public cloud spend to increase by 14.1% in 2016

Research firm IDC has released findings which demonstrate healthy growth in the cloud market throughout 2016.

IDC’s Worldwide Quarterly Cloud IT Infrastructure Tracker estimates that spending on public cloud infrastructure will increase by 14.1% over the course of the year to $24.4 billion, while spending on private cloud platforms could be up 11.1% to $13.9 billion.

“For the majority of corporate and public organizations, IT is not a core business but rather an enabler for their core businesses and operations,” said Natalya Yezhkova, Research Director for the storage systems group at IDC. “Expansion of cloud offerings creates new opportunities for these businesses to focus efforts on core competences while leveraging the flexibility of service-based IT.”

Total spend on cloud IT infrastructure products is expected to increase by 18.9% over the course of 2016 to reach $38.2 billion, though it has yet to surpass spending on traditional, non-cloud environments, which will decrease by 4%. Non-cloud platforms will still account for the majority of enterprise IT spend, at 62.8%. From a cloud-deployment product perspective, Ethernet switching spend will increase by 26.8%, with investments in servers and storage growing at 12.4% and 11.3% respectively.

The report also detailed vendor revenue from sales of infrastructure products over the course of 2015, which grew 21.9% to $29 billion. Revenues for Q4 grew at a slower rate of 15.7%, but still accounted for $8.2 billion, with public cloud grabbing the lion’s share at $4.9 billion. Japan saw the largest growth, at 50%, whereas Central and Eastern Europe declined 9.3%, seemingly owing to political and economic turmoil that could be linked to a reduction in IT spend.

“The cloud IT infrastructure market continues to see strong double-digit growth with faster gains coming from public cloud infrastructure demand,” said Kuba Stolarski, Research Director for Computing Platforms at IDC. “End customers are modernizing their infrastructures along specific workload, performance, and TCO requirements, with a general tendency to move into 3rd Platform, next-gen technologies.

“Public cloud as-a-service offerings also continue to mature and grow in number, allowing customers to increasingly use sophisticated, mixed strategies for their deployment profiles. While the ice was broken a long time ago for public cloud services, the continued evolution of the enterprise IT customer means that public cloud acceptance and adoption will continue on a steady pace into the next decade.”

HPE continued as the market leader for cloud IT infrastructure vendor revenues, bringing in around $4.55 billion over the course of 2015 and increasing its market share from 15% to 15.7%. Dell, Cisco, EMC and IBM completed the top five, with only IBM losing market share over the period: its revenue fell 24.6% to roughly $1.24 billion, taking its share of the overall segment down from 6.9% to 4.3%.

Google continues public cloud charge with 12 new data centres

Google has continued its expansion plans in the public cloud sector after announcing it will open 12 new data centres by the end of 2017.

In recent weeks, Google has been expanding its footprint in the cloud space with rumoured acquisitions, hires of industry big-hitters and blue-chip client wins, however its new announcement adds weight to the moves. With two new data centres to open in Oregon and Tokyo by the end of 2016, and a further ten by the end of 2017, Google is positioning itself to challenge Microsoft and AWS for market share in the public cloud segment.

“We’re opening these new regions to help Cloud Platform customers deploy services and applications nearer to their own customers, for lower latency and greater responsiveness,” said Varun Sakalkar, Product Manager at Google. “With these new regions, even more applications become candidates to run on Cloud Platform, and get the benefits of Google-level scale and industry leading price/performance.”

Google currently operates in four cloud regions and the new data centres will give the company a presence in 15. AWS and Microsoft have built a market-share lead over Google thanks in part to the fact that they operate in 12 and 22 regions respectively, with Microsoft planning to open a further five.

Recent findings from Synergy Research Group show AWS is still the clear leader in the cloud space with a market share of 31%, with Microsoft accounting for 9% and Google controlling 4%. Owing to its private and hybrid cloud offerings, IBM accounts for 7% of the global market, according to Synergy.

Growth at AWS was measured at 63%, whereas Microsoft and Google reported 124% and 108% respectively. Industry insiders have told BCN that Microsoft and Google have been making moves to improve their offerings through talent and company acquisitions. Greater proactivity in the market from the two challengers could explain the difference in growth figures over the last quarter.

Alongside the new data centres, Google’s cloud business leader Diane Greene has announced a change to the way the company operates its sales and marketing divisions. According to Bloomberg Business, Greene told employees that Google will be going on a substantial recruitment drive, while also changing the way it sells its services, focusing more on customer interaction and feedback. This practice would not be seen as unusual for its competitors, however Google’s model has been so far built on the idea of customer self-service. The cloud sales team on the west coast has already doubled in size to fifty, with the team planning on widening this recruitment drive.

While Google’s intentions have been made clear over recent months, there are still some who remain unconvinced. 451 Group Lead Analyst Carl Brooks believes the company is still not at the same level as its competitors, needing to add more enterprise compatibility, compliance, and security features. “They are probably the most advanced cloud operation on the planet. It also doesn’t matter,” he said.

GoDaddy launches cloud services tailored for small businesses

Web hosting company GoDaddy has expanded its offering for small business customers to include Cloud Servers and Bitnami-powered Cloud Applications.

GoDaddy, which claims to have more than 61 million domain names under management, will offer its customers a “pay as you go” utility billing model, enabling them to build, test and scale cloud solutions on GoDaddy’s infrastructure. The company’s traditional playing field has been giving customers access to site-building software such as WordPress, but the new move will provide an environment where they can build and run just about any software they like.

“With the launch of Cloud Servers, GoDaddy aims to extend our lead as the number one trusted provider of Cloud Hosting solutions for individual developers and technologists. We’re looking to make it easy for developers to serve small businesses with the technology they want,” said Jeff King, GM Hosting, Security at GoDaddy. “By offering a powerful, yet simple cloud offering that integrates domains, DNS, security and backups all in one place, developers can save time and exceed their clients’ expectations.”

Unlike its better-known rivals in the cloud space, GoDaddy will build on its traditional business model of targeting individual developers, tech entrepreneurs and small-scale businesses with the new solution. The services will offer a number of features to smaller businesses that cannot afford or justify an all-encompassing service offered by the traditional players in the public cloud market. The company claims virtual instances can be built, tested, cloned and re-provisioned in less than a minute, meeting market expectations.

Alongside the servers, GoDaddy’s Cloud Applications are powered by Bitnami, an open source server application deployment library. “As a GoDaddy technology partner on Cloud Applications, we’re excited for GoDaddy’s international customer base to take advantage of our capabilities – joining the millions of developers and business users who save time and effort with our library’s consistent, secure and optimized end-user experience,” said Erica Brescia, Co-Founder at Bitnami. “We’re proud to partner with GoDaddy in serving this global market of advanced SMB-focused developers.”

The new offering from GoDaddy has seemingly been in the works for some time, as the team announced the acquisition of the public cloud customer division of Apptix for $22.5 million last September.

“With the acquisition of Apptix’s public cloud customer base, we have an opportunity to take customers using Hosted Exchange and bring them over to GoDaddy’s Microsoft Office 365 offering,” said Dan Race, GoDaddy’s VP of Corporate Comms, at the time.

With Microsoft and Google making moves to take market share away from AWS in the corporate space, GoDaddy is targeting the small business market, a niche that appears to be relatively overlooked.

Google said to be on cloud shopping spree

Google is rumoured to be planning the acquisition of a number of businesses to bolster its cloud computing platform and suite of workplace applications.

According to Re/code, the tech giant has amassed a short-list of various start-ups and niche service providers including automated app services start-up Metavine, e-commerce public company Shopify, and payroll and health benefits services business Namely. Re/code sources have stressed that the approaches are preliminary, and none of the companies involved have commented on the rumours.

The moves seem to address two challenges currently facing the Google team. Firstly, there is a notable gap of ‘middle range’ customers for Google Apps. The company traditionally does well with small and large companies, but has struggled with the lucrative market in between. Last year, Google attempted to lure the middle market onto Google Apps for Work by offering the service for free while companies saw out their existing enterprise agreements, and then charging $25 per user after that point.

Secondly, the acquisitions would enable Google to move its internal systems to its cloud platform, potentially creating a more solid offering to challenge AWS and Microsoft Azure.

The reports back up recent moves in the market which have indicated Google’s intention of increasing its stake in the cloud. While AWS and Microsoft are firmly planted as the number one and number two players in the public and private cloud space, Google is closing the gap, making a number of company and talent acquisitions to improve its proposition.

Aside from the recent hire of VMware founder Diane Greene to lead its cloud business, last year SVP of Technical Infrastructure Urs Hölzle highlighted that Google cloud platform revenues could surpass Google’s advertising revenue within five years.

“The goal is for us to talk about Google as a cloud company by 2020,” said Hölzle in October. “Our cloud growth rate is probably industry-leading…and we have lots of enterprise customers, happy enterprise customers.”

The rumours shouldn’t come as a surprise, as Hölzle also said that there would be a number of announcements which would “remove any doubt” from Google’s future plans.

While the approaches remain rumours, GCP Next 2016, the company’s cloud developer conference taking place this week, may provide some clarity on Google’s aspirations.