Archivo de la categoría: Public Cloud

Software-Defined Data Centre to become a common fixture in US – survey

A survey from security and compliance company HyTrust claims the Software-Defined Data Centre (SDDC) is on the verge of becoming a common fixture in corporate America.

65% of the respondents predict faster deployment in 2016, while 62% anticipate increased adoption of the SDDC. Nearly half see greater adoption of network virtualization, while even more, 53%, anticipate increased adoption of storage virtualization. Half of the respondents also anticipate higher levels of public cloud adoption over the course of 2016.

“This survey is truly interesting in that it uncovers a new level of maturity in organizations pursuing a SDDC leveraging virtualization and the cloud. It’s long been happening, but now faster and with greater conviction and comfort than perhaps ever before,” said Eric Chiu, President of HyTrust. “Security and privacy have always been the critical inhibitors, and no one denies that these issues still concern senior executives.

“But now we can also see that technologies like those offered by HyTrust, which balance a high level of security and control with smooth automation, are having a major impact. The benefits of virtualized and cloud infrastructures are undeniable—think agility, flexibility and lower cost, among many other advantages—and the obstacles to enjoying those benefits are increasingly being overcome.”

From a security perspective, 74% of the respondents believe security is less of an obstacle to adoption than it was 12 months ago; however, that is not to say security challenges have been reduced significantly. 54% of the respondents believe there will be an increased number of breaches throughout 2016, whereas only 11% expect the contrary. In terms of migration, 67% believe security will ultimately slow down the process, and 70% believe there will be the same or even greater levels of internal compliance and auditing challenges following the transition to an SDDC platform.

While the Software-Defined Data Centre should not be considered a new term or trend within the industry, levels of adoption and trust have been lower than for other technologies in the cloud world. As the industry continues its journey towards automation, the SDDC trend will, as the survey demonstrates, likely only grow louder.

AWS launches new features at Chicago Summit

Amazon Web Services has launched a number of new features, along with the announcement that AWS Import/Export Snowball is now available in four new regions, including Europe.

Speaking at the AWS Chicago Summit, the team announced several updates, including new security features, tools which simplify the movement of data around an organization’s cloud, platforms for automatically deploying and running apps on Amazon’s cloud infrastructure, testing features, and authentication services.

Firstly, AWS Device Farm, a service initially introduced last June, enables customers to test mobile apps on real devices. The service is built on the concept of ‘write once, test everywhere’, giving developers the chance to test apps in more than 200 unique environments (a variety of carriers, manufacturers, models, operating systems and so on). The update now provides customers with remote access to devices for interactive testing.

Writing on the AWS blog, Jeff Barr, Chief Evangelist at Amazon Web Services, said: “You simply open a new session on the desired device, wait (generally a minute or two) until the device is available, and then interact with the device via the AWS Management Console. You can gesture, swipe, and interact with devices in real time directly through your web browser as if the device was on your desk or in your hand. This includes installing and running applications.”
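
For developers who prefer to script these sessions rather than drive them through the console, the sketch below shows roughly what this looks like with Python’s boto3 SDK. It is a minimal illustration only: the project ARN is a hypothetical placeholder, and the device filter is arbitrary.

# Minimal sketch: opening a Device Farm remote access session via boto3.
# The project ARN below is a hypothetical placeholder.
import boto3

# Device Farm is served out of the us-west-2 region.
devicefarm = boto3.client('devicefarm', region_name='us-west-2')

# Pick a physical device to test on, e.g. the first Android handset listed.
devices = devicefarm.list_devices()['devices']
device = next(d for d in devices if d['platform'] == 'ANDROID')

# Request an interactive session on that device; it generally takes a
# minute or two before the device becomes available.
session = devicefarm.create_remote_access_session(
    projectArn='arn:aws:devicefarm:us-west-2:123456789012:project:EXAMPLE',
    deviceArn=device['arn'],
    name='interactive-smoke-test',
)
print(session['remoteAccessSession']['status'])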

Amazon S3 and Snowball, designed to increase the speed of data migration, also received attention during the event. AWS Import/Export Snowball was launched for customers who intend to move larger amounts of data, generally 10 terabytes or more, and has now been beefed up once again. New features for S3 make use of the AWS edge infrastructure to increase transfer speeds, while Snowball now has a larger capacity and is available in four new regions.
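
The S3 improvement described here appears to be the new edge-accelerated transfer option, which is switched on per bucket. As a minimal sketch with Python’s boto3 SDK (the bucket and file names are hypothetical placeholders):

# Minimal sketch: enabling S3's edge-accelerated transfers on a bucket
# and uploading through the accelerate endpoint. Names are placeholders.
import boto3
from botocore.config import Config

s3 = boto3.client('s3')

# One-time bucket configuration.
s3.put_bucket_accelerate_configuration(
    Bucket='example-bucket',
    AccelerateConfiguration={'Status': 'Enabled'},
)

# A client that routes transfers through the accelerated edge endpoint.
s3_accel = boto3.client('s3', config=Config(s3={'use_accelerate_endpoint': True}))
s3_accel.upload_file('large-dataset.tar', 'example-bucket', 'large-dataset.tar')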

“Many AWS customers are now using AWS Import/Export Snowball to move large amounts of data in and out of the AWS Cloud,” said Barr. “The original Snowball appliances had a capacity of 50 terabytes. Today we are launching a newer appliance with 80 terabytes of capacity.”

Amazon Kinesis, a service which enables users to manage data streamed into the cloud, has been updated to allow users to deploy, run, and scale Elasticsearch in the AWS Cloud, as well as to interact with Amazon CloudWatch, Amazon’s monitoring service.
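
Assuming the update refers to Kinesis Firehose delivery streams gaining Elasticsearch as a destination, the producer side is straightforward. The sketch below, using Python’s boto3 SDK, pushes a record into a delivery stream that has already been configured (in the console or via the API) to index into an Elasticsearch domain; the stream name is a hypothetical placeholder.

# Minimal sketch: sending a record into a Firehose delivery stream whose
# destination has been configured as an Elasticsearch domain.
import json
import boto3

firehose = boto3.client('firehose')

record = {'service': 'checkout', 'latency_ms': 42}
firehose.put_record(
    DeliveryStreamName='example-to-elasticsearch',  # hypothetical stream
    Record={'Data': json.dumps(record).encode('utf-8') + b'\n'},
)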

The Cognito service lets developers add authentication, user management, and data synchronization to their apps without writing backend code or managing any infrastructure. The ‘Your User Pools’ feature update allows developers to build a user directory that can scale to hundreds of millions of users, helping to manage the authentication process.

“Using a user pool gives you detailed control over the sign-up and sign-in aspects of your web and mobile SaaS apps, games, and so forth,” said Barr. “Building and running a directory service at scale is not easy, but is definitely undifferentiated heavy lifting, with the added security burden that comes when you are managing user names, passwords, email addresses, and other sensitive pieces of information. You don’t need to build or run your own directory service when you use Cognito Identity.”
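
To give a feel for the developer experience, the sketch below registers and confirms a user against a user pool with Python’s boto3 SDK. The app client ID and user details are hypothetical placeholders; the pool itself would be created beforehand in the Cognito console or API.

# Minimal sketch: user sign-up against a Cognito user pool via boto3.
# ClientId and user details are hypothetical placeholders.
import boto3

cognito = boto3.client('cognito-idp')

# Register the user; Cognito handles password storage and verification.
cognito.sign_up(
    ClientId='example-app-client-id',
    Username='jane.doe@example.com',
    Password='CorrectHorse!42',
    UserAttributes=[{'Name': 'email', 'Value': 'jane.doe@example.com'}],
)

# Complete registration once the user enters the emailed verification code.
cognito.confirm_sign_up(
    ClientId='example-app-client-id',
    Username='jane.doe@example.com',
    ConfirmationCode='123456',
)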

Finally, Elastic Beanstalk, which automatically deploys and runs apps on Amazon’s cloud infrastructure, has also been updated with support for managed platform updates. Developers can now select a maintenance window, and the new feature will update the environment to the latest platform version automatically.

“The updates are installed using an immutable deployment model to ensure that no changes are made to the existing environment until the updated replacement instances are available and deemed healthy (according to the health check that you have configured for the application),” said Barr.
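
For environments managed through the API rather than the console, opting in looks roughly like the boto3 sketch below. The environment name and maintenance window are hypothetical placeholders; the option namespaces are Elastic Beanstalk’s managed-actions settings.

# Minimal sketch: enabling managed platform updates on an existing
# Elastic Beanstalk environment with a weekly maintenance window.
import boto3

eb = boto3.client('elasticbeanstalk')

eb.update_environment(
    EnvironmentName='example-env',  # hypothetical environment
    OptionSettings=[
        {'Namespace': 'aws:elasticbeanstalk:managedactions',
         'OptionName': 'ManagedActionsEnabled', 'Value': 'true'},
        {'Namespace': 'aws:elasticbeanstalk:managedactions',
         'OptionName': 'PreferredStartTime', 'Value': 'Sun:02:00'},
        {'Namespace': 'aws:elasticbeanstalk:managedactions:platformupdate',
         'OptionName': 'UpdateLevel', 'Value': 'minor'},
    ],
)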

Google cloud team launches damage control mission

Google will offer service credits to all customers affected by the Google Compute Engine outage, in what would appear to be a damage control exercise as the company looks to gain ground on AWS and Microsoft Azure in the public cloud market segment.

On Monday 11 April, Google Compute Engine instances in all regions lost external connectivity for a total of 18 minutes. The outage has been blamed on two separate bugs, neither of which would have caused major problems on its own, but which combined to bring down the service. Although the outage has seemingly caused embarrassment for the company, it did not impact more visible consumer services such as Google Maps or Gmail.

“We recognize the severity of this outage, and we apologize to all of our customers for allowing it to occur,” said Benjamin Treynor Sloss, VP of Engineering at Google, in a statement on the company’s blog. “As of this writing, the root cause of the outage is fully understood and GCE is not at risk of a recurrence. Additionally, our engineering teams will be working over the next several weeks on a broad array of prevention, detection and mitigation systems intended to add additional defence in depth to our existing production safeguards.

“We take all outages seriously, but we are particularly concerned with outages which affect multiple zones simultaneously because it is difficult for our customers to mitigate the effect of such outages. It is our hope that, by being transparent and providing considerable detail, we both help you to build more reliable services and we demonstrate our ongoing commitment to offering you a reliable Google Cloud platform.”

While the outage would not appear to have caused any major damage for the company, competitors in the space may secretly be pleased with the level of publicity the incident has received. Google has been ramping up efforts in recent months to bolster its cloud computing capabilities and tackle the public cloud market segment, with hires of industry heavy-hitters such as Diane Greene, rumoured acquisitions, and plans to open 12 new data centres by the end of 2017.

The company currently sits in third place in the public cloud market segment, behind AWS and Microsoft Azure, though it had been demonstrating healthy growth in the months prior to the outage.

Public cloud spend to increase by 14.1% in 2016

Research firm IDC has released findings which point to healthy growth in the cloud market throughout 2016.

IDC’s Worldwide Quarterly Cloud IT Infrastructure Tracker estimates spending on public cloud infrastructure will increase by 14.1% over the course of the year to $24.4 billion, while spending on private cloud platforms could be up 11.1% to $13.9 billion.

“For the majority of corporate and public organizations, IT is not a core business but rather an enabler for their core businesses and operations,” said Natalya Yezhkova, Research Director for the storage systems group at IDC. “Expansion of cloud offerings creates new opportunities for these businesses to focus efforts on core competences while leveraging the flexibility of service-based IT.”

Total spend on cloud IT infrastructure products is expected to increase by 18.9% over the course of 2016 to reach $38.2 billion, though it has yet to surpass spending on traditional, non-cloud environments, which will decrease by 4%. Non-cloud platforms will still account for the majority of enterprise IT spend, at 62.8%. From a cloud-deployment product perspective, Ethernet switching spend will increase by 26.8%, with investments in servers and storage growing at 12.4% and 11.3% respectively.

The report also detailed vendor revenue from sales of infrastructure products over the course of 2015, which grew 21.9% to $29 billion. Revenues for Q4 grew at a slower rate of 15.7%, but still accounted for $8.2 billion, with public cloud grabbing the lion’s share at $4.9 billion. Japan saw the largest growth, at 50%, whereas Central and Eastern Europe declined 9.3%, seemingly owing to political and economic turmoil that has curtailed IT spend.

“The cloud IT infrastructure market continues to see strong double-digit growth with faster gains coming from public cloud infrastructure demand,” said Kuba Stolarski, Research Director for Computing Platforms at IDC. “End customers are modernizing their infrastructures along specific workload, performance, and TCO requirements, with a general tendency to move into 3rd Platform, next-gen technologies.

“Public cloud as-a-service offerings also continue to mature and grow in number, allowing customers to increasingly use sophisticated, mixed strategies for their deployment profiles. While the ice was broken a long time ago for public cloud services, the continued evolution of the enterprise IT customer means that public cloud acceptance and adoption will continue on a steady pace into the next decade.”

HPE continued as market leader for cloud IT infrastructure vendor revenues, bringing in around $4.55 billion over the course of 2015 and increasing its market share from 15% to 15.7%. Dell, Cisco, EMC and IBM completed the top five, with only IBM losing market share over the period. IBM’s revenues decreased 24.6% to roughly $1.24 billion, with its share of the overall segment falling from 6.9% to 4.3%.

Google continues public cloud charge with 12 new data centres

Google has continued its expansion plans in the public cloud sector after announcing it will open 12 new data centres by the end of 2017.

In recent weeks, Google has been expanding its footprint in the cloud space with rumoured acquisitions, hires of industry big-hitters and blue-chip client wins, and its new announcement adds weight to those moves. With two new data centres to open in Oregon and Tokyo by the end of 2016, and a further ten by the end of 2017, Google is positioning itself to challenge Microsoft and AWS for market share in the public cloud segment.

“We’re opening these new regions to help Cloud Platform customers deploy services and applications nearer to their own customers, for lower latency and greater responsiveness,” said Varun Sakalkar, Product Manager at Google. “With these new regions, even more applications become candidates to run on Cloud Platform, and get the benefits of Google-level scale and industry leading price/performance.”

Google currently operates in four cloud regions and the new data centres will give the company a presence in 15. AWS and Microsoft have built a market-share lead over Google thanks in part to the fact that they operate in 12 and 22 regions respectively, with Microsoft planning to open a further five.

Recent findings from Synergy Research Group show AWS is still the clear leader in the cloud space with a market share of 31%, with Microsoft accounting for 9% and Google 4%. Owing to its private and hybrid cloud offerings, IBM accounts for 7% of the global market, according to Synergy.

Growth at AWS was measured at 63%, whereas Microsoft and Google reported 124% and 108% respectively. Industry insiders have told BCN that Microsoft and Google have been making moves to improve their offerings through talent and company acquisitions. Greater proactivity in the market from the two challengers could explain the difference in growth figures over the last quarter.

Alongside the new data centres, Google’s cloud business leader Diane Greene has announced a change to the way the company operates its sales and marketing divisions. According to Bloomberg Business, Greene told employees that Google will be going on a substantial recruitment drive, while also changing the way it sells its services, focusing more on customer interaction and feedback. This practice would not be seen as unusual among its competitors; however, Google’s model has so far been built on the idea of customer self-service. The cloud sales team on the west coast has already doubled in size to fifty, with plans to widen the recruitment drive.

While Google’s intentions have been made clear over recent months, there are still some who remain unconvinced. 451 Group Lead Analyst Carl Brooks believes the company is still not at the same level as its competitors, needing to add more enterprise compatibility, compliance, and security features. “They are probably the most advanced cloud operation on the planet. It also doesn’t matter,” he said.

GoDaddy launches cloud services tailored for small businesses

Web hosting company GoDaddy has expanded its offering for small business customers to include Cloud Servers and Bitnami-powered Cloud Applications.

GoDaddy, which claims to have more than 61 million domain names under management, will offer its customers a “pay as you go” utility billing model, enabling customers to build, test and scale cloud solutions on GoDaddy’s infrastructure. The company’s traditional business has been giving customers access to site-building software such as WordPress, but the new move will provide an environment where they can build and run just about any software they like.

“With the launch of Cloud Servers, GoDaddy aims to extend our lead as the number one trusted provider of Cloud Hosting solutions for individual developers and technologists. We’re looking to make it easy for developers to serve small businesses with the technology they want,” said Jeff King, GM Hosting, Security at GoDaddy. “By offering a powerful, yet simple cloud offering that integrates domains, DNS, security and backups all in one place, developers can save time and exceed their clients’ expectations.”

Unlike its better-known rivals in the cloud space, GoDaddy will build on its traditional business model of targeting individual developers, tech entrepreneurs and small-scale businesses with the new solution. The services will offer a number of features to smaller businesses that cannot afford or justify the all-encompassing services offered by the traditional players in the public cloud market. The company claims virtual instances can be built, tested, cloned and re-provisioned in less than a minute, in line with market expectations.

Alongside the servers, GoDaddy’s Cloud Applications are powered by Bitnami, a library of open-source server application deployments. “As a GoDaddy technology partner on Cloud Applications, we’re excited for GoDaddy’s international customer base to take advantage of our capabilities – joining the millions of developers and business users who save time and effort with our library’s consistent, secure and optimized end-user experience,” said Erica Brescia, Co-Founder at Bitnami. “We’re proud to partner with GoDaddy in serving this global market of advanced SMB-focused developers.”

The new offering from GoDaddy has seemingly been in the works for some time, as the team announced the acquisition of the public cloud customer division of Apptix for $22.5 million last September.

“With the acquisition of Apptix’s public cloud customer base, we have an opportunity to take customers using Hosted Exchange and bring them over to GoDaddy’s Microsoft Office 365 offering,” said Dan Race, GoDaddy’s VP of Corporate Comms, at the time.

With Microsoft and Google making moves to take market share away from AWS in the corporate space, GoDaddy is targeting the small business market, a niche that appears to be relatively overlooked.

Google said to be on cloud shopping spree

Google is rumoured to be planning the acquisition of a number of businesses to bolster its cloud computing platform and suite of workplace applications.

According to Re/code, the tech giant has amassed a short-list of various start-ups and niche service providers including automated app services start-up Metavine, e-commerce public company Shopify, and payroll and health benefits services business Namely. Re/code sources have stressed that the approaches are preliminary, and none of the companies involved have commented on the rumours.

The moves seem to address two challenges currently facing the Google team. Firstly, there is a notable gap of ‘middle range’ customers for Google Apps. The company traditionally does well with small and large companies, but has struggled in the lucrative market in between. Last year, Google attempted to lure the middle market onto Google Apps for Work by offering the service for free while customers saw out their current enterprise agreements, charging $25 per user after that point.

Secondly, the acquisitions would enable Google to move its internal systems to its cloud platform, potentially creating a more solid offering to challenge AWS and Microsoft Azure.

The reports back up recent moves in the market which have indicated Google’s intention to increase its stake in the cloud. While AWS and Microsoft are firmly planted as the number one and number two players in the public and private cloud space, Google is closing the gap, making a number of company and talent acquisitions to improve its proposition.

Aside from the recent hire of VMware co-founder Diane Greene to lead its cloud business, last year SVP of Technical Infrastructure Urs Hölzle suggested that Google’s cloud platform revenues could surpass its advertising revenue within five years.

“The goal is for us to talk about Google as a cloud company by 2020,” said Hölzle in October. “Our cloud growth rate is probably industry-leading…and we have lots of enterprise customers, happy enterprise customers.”

The rumours shouldn’t come as a surprise, as Hölzle also said that there would be a number of announcements which would “remove any doubt” from Google’s future plans.

While the approaches remain rumours, GCP Next 2016, the company’s cloud developer conference taking place this week, may provide some clarity on Google’s aspirations.

Apple reportedly shifts iCloud from AWS to Google Cloud

Apple has moved some of its iCloud services onto Google Cloud, reducing its reliance on AWS, according to a CRN report.

Though Apple will remain an AWS customer, the report states that Google claims Apple will now spend between $400 million and $600 million on its cloud platform. Last month, financial services firm Morgan Stanley estimated Apple spends $1 billion annually on AWS public cloud, though this is likely to fall over the coming years as Apple invests more in its own datacentres.

The company currently operates four datacentres worldwide and apparently has plans to open three more. It has been widely reported that Apple has set aside $3.9 billion to open datacentres in Arizona, Ireland and Denmark, with plans to open the first later this year.

Google has been struggling to keep pace with AWS and Microsoft’s Azure, but recent deals indicate an improved performance. A recent survey from Rightscale demonstrated AWS’ dominance in the market, accounting for 57% of public cloud market share, while Azure commands second place and Google accounts for only 6% of the market.

To bolster its cloud business, Google hired VMware co-founder Diane Greene to lead the business unit, which includes Google for Work, Cloud Platform, and Google Apps. The appointment, together with the acquisition of bebop, which was founded by Greene, highlights the company’s ambitions in the cloud world, where it claims it has larger data centre capacity than any other public cloud provider.

Industry insiders have told BCN that acquisitions such as this are one of the main reasons the public cloud market segment is becoming more competitive. Despite AWS’ market dominance, which some insiders attribute to it being first to market, offerings like Azure and Google are becoming more attractive propositions thanks in part to company and talent acquisitions.

Last month, the Google team secured another significant win after confirming music streaming service Spotify as a customer. Spotify had toyed with the idea of managing its own datacentres but said on its blog: “The storage, compute and network services available from cloud providers are as high quality, high performance and low cost as what the traditional approach provides.” The company also highlighted that the decision was based on Google’s value-adds in its data platform and tools.

While Google and Apple have yet to comment on the deal, an Amazon spokesperson has implied the deal may not have happened at all, sending BCN the following emailed statement. “It’s kind of a puzzler to us because vendors who understand doing business with enterprises respect NDAs with their customers and don’t imply competitive defection where it doesn’t exist.”

The rumoured Apple/Google deal caps a tough couple of weeks for AWS. Aside from Apple and Spotify, the company also lost the majority of Dropbox’s business. AWS still occupies a strong position in the public cloud market, but there are increasing signs its competitors are raising their game.

Dropbox drops Amazon Web Services for in-house system

Dropbox has announced that it will no longer rely on Amazon Web Services’ cloud infrastructure, favouring its own in-house solution.

The project, named “Magic Pocket”, has been in the works for over two and a half years, and will store and serve over 90% of users’ data on the company’s own custom-built infrastructure. Dropbox was one of Amazon’s first customers to use the S3 service to store bulk data eight years ago, but has said that the relationship will continue in certain areas.

“As the needs of our users and customers kept growing, we decided to invest seriously in building our own in-house storage system,” said Akhil Gupta, Dropbox VP of Engineering. While the company has traditionally stored file content on Amazon, the hosting of metadata and Dropbox web servers has always been in data centres managed by Dropbox itself.

“There were a couple reasons behind this decision. First, one of our key product differentiators is performance. Bringing storage in-house allows us to customize the entire stack end-to-end and improve performance for our particular use case,” said Gupta. “Second, as one of the world’s leading providers of cloud services, our use case for block storage is unique. We can leverage our scale and particular use case to customize both the hardware and software, resulting in better unit economics.”

The company has witnessed healthy growth over recent years, recently passing the milestone of 500 million users and 500 petabytes of user data, prompting the in-house move. Back in 2012, the company only had around 40 petabytes of user data, demonstrating 12-fold growth in the last four years. Dropbox initially began building its own storage infrastructure in 2013, with the company first storing user files in house in February 2015. The team hit its goal of storing 90% of its data in-house on 7 October 2015.

“Magic Pocket became a major initiative in the summer of 2013. We’d built a small prototype as a proof of concept prior to this to get a sense of our workloads and file distributions. Software was a big part of the project, and we iterated on how to build this in production while validating rigorously at every stage,” said Gupta. “We knew we’d be building one of only a handful of exabyte-scale storage systems in the world. It was clear to us from the beginning that we’d have to build everything from scratch, since there’s nothing in the open source community that’s proven to work reliably at our scale.”

The move highlights how building private infrastructure can become a business benefit once an enterprise reaches a certain scale. Zynga has a similar story, having moved between private and public cloud in recent years, and is now in the process of shifting its data back onto in-house infrastructure. Dropbox’s move demonstrates the potential for overhead reductions when such a transition is executed well, though if the company fails to scale as planned, the move could become a financial burden.

While the move does result in AWS losing a substantial amount of business, it is not the end of the relationship. The team will continue to partner with Amazon for new projects, but will also offer its European customers the opportunity to store data on AWS infrastructure in Germany, should they request it.

Deutsche Telekom aims to increase European market share with Open Telekom Cloud launch

Deutsche Telekom has launched Open Telekom Cloud, a new public cloud platform with Huawei as the hardware and software solution provider, in an effort to increase its market share in the European public cloud segment.

The service will offer European enterprises on-demand, pay-as-you-go cloud services via an OpenStack-based Infrastructure-as-a-Service solution operated by T-Systems. The company’s ambition is to strengthen its position in the market segment, which is currently dominated by US players.
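
Because the platform is OpenStack-based, provisioning should feel familiar to anyone who has scripted against other OpenStack clouds. The sketch below uses the openstacksdk Python library with hypothetical endpoints, credentials and resource names; Open Telekom Cloud’s actual catalogue and authentication details will differ.

# Minimal sketch: booting a server on an OpenStack-based IaaS with
# openstacksdk. All endpoints and names are hypothetical placeholders.
import openstack

conn = openstack.connect(
    auth_url='https://iam.example-cloud.example/v3',
    project_name='example-project',
    username='example-user',
    password='example-password',
    user_domain_name='Default',
    project_domain_name='Default',
)

# Look up an image, flavor and network from the provider's catalogue.
image = conn.compute.find_image('Ubuntu 16.04')
flavor = conn.compute.find_flavor('s1.medium')
network = conn.network.find_network('example-net')

# Boot the instance and wait until it is ACTIVE.
server = conn.compute.create_server(
    name='demo-server',
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{'uuid': network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.status)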

“We are adding a new, transformational cloud offering to our existing portfolio of cloud services,” said Deutsche Telekom CEO Tim Höttges at CeBIT in Hanover. “For our business customers in Europe this is an important new service to support their digitization, and a critical milestone for us in our ambition to be the leading provider of cloud services in Europe.”

“More and more customers are discovering the advantages of the public cloud. But they want a European alternative,” said Anette Bronder, Head of the T-Systems Digital Division. The move aims to capitalize on recent industry concerns over where data is being stored, as European customers are increasingly demanding that their data remain within the boundaries of the EU.

With the platform located in Biere, Saxony-Anhalt, all data will be subject to German data protection law, recognized as one of the most stringent regimes globally. “Access to a scalable, inexpensive public cloud provided by a German service provider from a German data centre under German law will be very attractive to many customers in Germany,” said Andreas Zilch, SVP at analyst firm Pierre Audoin Consultants. “The combination of a competitive service and German legal security represents a unique selling point right now.”

Deutsche Telekom and its subsidiary T-Systems have been offering cloud solutions since 2005. The data centre in Biere, together with its twin in Magdeburg, hosts almost all of the company’s ecosystem partners, which include the likes of Microsoft, SAP, Cisco, Salesforce, VMware, Huawei, Oracle, SugarCRM, and Informatica.

The announcement also strengthens Huawei’s position in the European market, a long-term ambition for the Chinese tech giant. Huawei will provide hardware and software solutions, including servers, storage, networking and Cloud OS, as well as technical support for the public cloud services.

“The strategic partnership allows each party to fully play to their strengths, providing enterprises and the industry with various innovative public cloud services that are beyond those provided by over-the-top content players,” said Huawei Rotating CEO Eric Xu. “At Huawei, we are confident that, with esteemed partners like Deutsche Telekom, we can turn Open Telekom Cloud into the standard of public cloud services for the industry at large.”