Category archive: Infrastructure as a Service

Google Cloud adds Microsoft support as Windows Server 2003 reaches EOL

Google made Windows Server support generally available this week

Making good on commitments the cloud provider made in December last year, Google has announced general availability of Windows Server on the Google Cloud Platform. The move comes the same week Windows Server 2003 reached its end of life.

“Making sure Google Cloud Platform is the best place to run your workloads is our top priority, so we’re happy that today Windows Server on Google Compute Engine graduates to General Availability, joining the growing list of OSes we support. We’re also introducing several enhancements for Windows Server users,” the company said in a statement on its cloud blog.

“With its graduation to General Availability, Windows Server instances are now covered by the Compute Engine SLA. Windows Server users can now easily deploy a server running Active Directory or ASP.NET using the Cloud Launcher, and can securely extend their existing infrastructure into Google Cloud Platform using VPN.”
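
For readers who want to script this rather than use the Cloud Launcher, the sketch below shows roughly what creating a Windows Server instance on Compute Engine looks like from Python. It is a minimal, illustrative example only: it assumes the google-api-python-client library and Application Default Credentials are set up, and the zone, machine type and image family shown are placeholders rather than values taken from Google’s announcement.

```python
# Minimal sketch: create a Windows Server VM on Google Compute Engine.
# Assumes google-api-python-client is installed and Application Default
# Credentials are configured; names below are illustrative placeholders.
import google.auth
from googleapiclient import discovery

credentials, project = google.auth.default()
compute = discovery.build('compute', 'v1', credentials=credentials)

zone = 'europe-west1-b'  # illustrative zone
config = {
    'name': 'windows-demo',
    'machineType': f'zones/{zone}/machineTypes/n1-standard-2',
    'disks': [{
        'boot': True,
        'autoDelete': True,
        # Windows Server 2012 R2 image family published by Google
        'initializeParams': {
            'sourceImage': 'projects/windows-cloud/global/images/family/windows-2012-r2',
        },
    }],
    'networkInterfaces': [{
        'network': 'global/networks/default',
        'accessConfigs': [{'type': 'ONE_TO_ONE_NAT', 'name': 'External NAT'}],
    }],
}

operation = compute.instances().insert(project=project, zone=zone, body=config).execute()
print('Started operation:', operation['name'])
```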

Google also said customers that purchase GCP support packages can get architectural and operational support for their Windows Server deployments on its cloud platform. And with Microsoft ceasing support for Windows Server 2003, Google is looking to lure in Microsoft developers by committing to support migration to more current Windows Server releases (2008, 2012).

In December last year, the company announced it would begin offering Microsoft license mobility for the Google Cloud Platform, enabling existing Microsoft server application users to bring their own licenses and apps – SQL Server, SharePoint, Exchange – from on-premises to the cloud, without incurring any additional fees.

As before, the move to expand support for the Microsoft ecosystem is likely to come as welcome news to the .NET crowd, which is fairly sizeable: Microsoft commands a 32.8 per cent share of all public web server infrastructure, according to W3Techs.

IBM, Mubadala joint venture to bring Watson cloud to MENA

IBM is bringing Watson to the Middle East

IBM is teaming up with Abu Dhabi-based investment firm Mubadala Development Company to create a joint venture based in Abu Dhabi that will deliver IBM’s cloud-based Watson service to customers in the Middle East and North Africa (MENA) region.

The companies will set up the joint venture through Mubadala’s subsidiary, Injazat, which will be the sole provider of the Watson platform in the region.

The companies said the move will help create an ecosystem of MENA-based partners, software vendors and startups developing new solutions based on the cognitive compute platform.

“Bringing IBM Watson to the region represents the latest major milestone in the global adoption of cognitive computing,” said Mounir Barakat, executive director of ICT at Aerospace & Engineering Services, Mubadala.

“It also signals Mubadala’s commitment to bringing new technologies and spurring economic growth in the Middle East, another step towards developing the UAE as a hub for the region’s ICT sector,” Barakat said.

Mike Rhodin, senior vice president of IBM Watson, said Mubadala’s knowledge of the local corporate ecosystem will help the company expand its cognitive compute cloud service in the region.

IBM has enjoyed some Watson wins in the financial services, healthcare and utilities sectors, but the company has been fairly quiet on how much the division rakes in. Over the past year it has made strides to expand the platform in the US, Africa and Japan, and recently made a number of strategic acquisitions in software automation to boost Watson’s appeal in customer engagement and health services.

EMEA cloud infrastructure spending swells 16% in Q1 2015

Spending on cloud as a proportion of overall IT expenditure is growing at healthy rates

Cloud-related IT infrastructure spending in the EMEA region grew 16 per cent year on year to reach $1.01bn in the first quarter of this year, representing just under 20 per cent of the overall IT infrastructure spend, according to analyst house IDC.

Spending on IT infrastructure (servers, disk storage and Ethernet switches) for public cloud accounts for about 8 per cent of the overall spend, and private cloud for about 11 per cent; the firm previously estimated that growth in public cloud spending would outpace private cloud spending by nearly 10 percentage points (25 per cent and 16 per cent, respectively).
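
As a rough sanity check on those figures (treating “just under 20 per cent” as the sum of the public and private cloud shares, which is our assumption, not IDC’s), the arithmetic below backs out the implied overall EMEA infrastructure spend for the quarter:

```python
# Back-of-the-envelope check of the IDC figures quoted above.
# Assumption: the ~8% public and ~11% private shares together make up the
# "just under 20 per cent" cloud slice of overall infrastructure spend.
cloud_spend_bn = 1.01        # EMEA cloud infrastructure spend, Q1 2015 ($bn)
cloud_share = 0.08 + 0.11    # public + private share of overall spend

overall_spend_bn = cloud_spend_bn / cloud_share
print(f"Implied overall spend: ${overall_spend_bn:.2f}bn")         # ~ $5.3bn
print(f"Public cloud portion:  ${0.08 * overall_spend_bn:.2f}bn")  # ~ $0.43bn
print(f"Private cloud portion: ${0.11 * overall_spend_bn:.2f}bn")  # ~ $0.58bn
```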

Michal Vesely, research analyst for European infrastructure at IDC, said much of the expenditure in Western Europe was fuelled mainly by public cloud and large-scale datacentre installations.

“Private cloud expenditure, especially on premises, on the other hand, is more directly connected to regular IT investments by enterprises,” he explained. “Private cloud spending saw a slower pace as users assess their storage, as well as integrated and hyperconverged systems, strategies. Once decisions are made, we expect another major push in the forthcoming period.”

The firm also said unstable macroeconomic conditions in Southern and Western Europe haven’t adversely impacted spending trends, although on-premises deployments seem to be growing at a slower rate – in part due to an increased shift to cloud. According to the analyst house this shift is in full swing: in April the firm forecast that cloud will make up nearly half of all IT infrastructure spending within four years.

OVH adds ARM to public cloud

OVH has launched an ARM-based public cloud service just 8 months after going to market with a Power8-based cloud platform

French cloud and hosting provider OVH said this week it will add Cavium ARM-based processors to its public cloud platform by the end of next month. The move comes just 8 months after the company added the Power8 architecture to its cloud arsenal.

The company said it will add Cavium’s flagship 48-core, 64-bit ARMv8-A ThunderX workload-optimized processor to its RunAbove public cloud service.

“This deployment is an example of OVH.Com’s leadership in delivering latest industry leading technologies to our customers,” said Miroslaw Klaba, vice president of research & development at OVH.

“With RunAbove ThunderX based instances, we can offer our users breakthrough performance at the lowest cost while optimizing the infrastructure for targeted compute and storage workloads delivering best in class TCO and user experience.”

OVH, which serves 700,000 customers from 17 datacentres globally, said it wanted to offer a more diversified technology stack and cater to growing demand for cloud-based high performance compute workloads, and drop the cost per VM.

“Cloud service operators are looking to gain the benefits and flexibility of end to end virtualization while managing dynamically changing workloads and massive data requirements,” said Rishi Chugh, director of marketing at Cavium. “ThunderX based RunAbove instances provide exceptional processing performance and flexibility by integrating a tremendous amount of IO along with targeted workload accelerators for compute, security, networking and storage at the lowest cost per VM for RunAbove – into a power, space and cost-optimized form factor.”

OVH is among just a handful of cloud service providers offering a variety of cloud compute platforms beyond x86. Late last year the company launched a cloud service based on OpenStack and IBM’s Power8, an open processor architecture tailored specifically for big data applications.

But while cloud compute is becoming more heterogeneous, far fewer workloads are being created natively for the still relatively young ARM and Power8 platforms than for x86, so it will likely take some time for asset utilisation rates (and TCO) to catch up with where x86 servers are today.

AWS and Chef cook up DevOps deal

Chef is moving onto the AWS Marketplace

IT automation specialist Chef and AWS announced a deal this week that will see Chef’s flagship offering made available via the AWS Marketplace, a move the companies said would help drive DevOps uptake in the cloud.

Tools like Chef and Puppet, which use an intermediary service to help automate a company’s infrastructure, have grown increasingly popular with DevOps personnel in recent years – particularly given not just the growth but also the heterogeneity of cloud today. And with DevOps adoption continuing to grow – by 2016 nearly a quarter of the largest enterprises globally will have adopted a DevOps strategy, according to Gartner – it’s clear both AWS and Chef see a sizeable opportunity to onboard more users onto the former’s cloud service.

As one might expect, the companies touted the ability to use Chef to migrate workloads off premises and into AWS without losing the code already developed to automate lower-level services.

Though both Chef and Puppet can already be deployed on, and automate, AWS cloud resources, the Chef / AWS deal will see Chef gain one-click deployment and a more prominent placement in the AWS Marketplace catalogue of available services.
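
One-click deployment happens through the AWS console, but a Marketplace listing can also be launched programmatically once a subscription is in place. The snippet below is a hedged sketch using boto3: the AMI ID, instance type and key pair are placeholders rather than values from the actual Chef listing.

```python
# Hedged sketch: launch an EC2 instance from an AWS Marketplace AMI with boto3.
# The AMI ID below is a placeholder; look up the real Chef listing in the
# Marketplace catalogue and subscribe to it before launching.
import boto3

ec2 = boto3.client('ec2', region_name='us-east-1')

response = ec2.run_instances(
    ImageId='ami-00000000',   # hypothetical Marketplace AMI ID
    InstanceType='m4.large',  # illustrative instance type
    KeyName='my-keypair',     # assumed existing EC2 key pair
    MinCount=1,
    MaxCount=1,
)
print('Launched instance:', response['Instances'][0]['InstanceId'])
```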

“Chef is one of the leading offerings for DevOps workflows, which engineers and developers depend on to accelerate their businesses,” said Dave McCann, vice president, AWS Marketplace. “Our customers want easy-to-use software like Chef that is available for immediate purchase and deployment in AWS Marketplace. This new partnership demonstrates our focus on offering low-friction DevOps tools to power customers’ businesses.”

Ken Cheney, vice president of business development at Chef said: “AWS’s market leadership in cloud computing, coupled with our expertise in IT automation and DevOps practices, brings a new level of capabilities to our customers. Together, we’re delivering a single source for automation, cloud, and DevOps, so businesses everywhere can spend minimal calories on managing infrastructure and maximise their ability to develop the software driving today’s economy.”

Dev-focused DigitalOcean raises $83m from Access Industries, Andreessen Horowitz

DigitalOcean raised $83m this week, which it will use to add features to its IaaS platform

DigitalOcean this week announced it has raised $83m in a series B funding round the cloud provider said would help it ramp up global expansion and portfolio development.

The round was led by Access Industries with participation from seasoned tech investment firm Andreessen Horowitz.

DigitalOcean offers infrastructure as a service in a variety of Linux flavours and aims its services primarily at developers. The company said the latest round of funding, which brings the total it has secured since its founding in 2012 to $173m, will be used to aggressively expand its feature set.

“We are laser­-focused on empowering the developer community,” said Mitch Wainer, co-founder and chief marketing officer at DigitalOcean. “This capital infusion enables us to expand our world­-class engineering team so we can continue to offer the best infrastructure experience in the industry.”

The company is fairly young and operates just ten datacentres globally, yet it claims to serve roughly 500,000 individual developers deploying cloud services on its IaaS platform, a respectable base by any measure. In April it added another European datacentre, in Frankfurt, the company’s third on the continent.
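
To give a flavour of that developer-first pitch, the sketch below creates a small “droplet” (DigitalOcean’s term for a VM) in the new Frankfurt region through the company’s public v2 REST API. It is illustrative only: the API token is a placeholder, and the size and image slugs are assumptions to be checked against the current API reference.

```python
# Minimal sketch: create a DigitalOcean droplet in Frankfurt (fra1) via the
# v2 REST API. Token, size and image slugs are placeholders/assumptions.
import requests

API_TOKEN = 'your-digitalocean-api-token'  # placeholder

resp = requests.post(
    'https://api.digitalocean.com/v2/droplets',
    headers={'Authorization': f'Bearer {API_TOKEN}'},
    json={
        'name': 'dev-box-1',
        'region': 'fra1',             # the Frankfurt datacentre mentioned above
        'size': '512mb',              # entry-level slug of the era (assumption)
        'image': 'ubuntu-14-04-x64',  # assumed base image slug
    },
)
resp.raise_for_status()
print('Created droplet:', resp.json()['droplet']['id'])
```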

But with bare-bones IaaS competition getting more intense it will be interesting to see how DigitalOcean evolves; given its emphasis on developers it is possible the company’s platform could evolve into something more PaaS-like.

“We began with a vision to simplify infrastructure that will change how millions of developers build, deploy and scale web applications,” said Ben Uretsky, chief exec and co-­founder of DigitalOcean. “Our investors share our vision, and they’ll be essential partners in our continued growth.”

SoftLayer ups RAM, drops storage and compute costs

SoftLayer rejigged its cloud pricing

SoftLayer announced a new pricing model it said would make the company more competitive with other cloud providers, in part by not charging separately for many networking-related costs.

“While other cloud providers advertise “low” prices for incomplete solutions, they neglect to mention extra charges for essential resources like network bandwidth, primary system storage, and support. At SoftLayer, our servers already include these necessary resources at no additional charge,” the company explained on its blog.

“Our new pricing model includes a redeveloped ordering and provisioning system that offers even more granular pricing for every SoftLayer bare metal and virtual server, from the processor to the RAM, storage, networking, security, and more.”

It also announced location-based pricing, meaning the company will price cloud services according to datacentre location.

Under the new cost model, compute (dual Xeon E5-2620 4U processors) and storage costs dropped while RAM prices increased slightly – though the company said users can expect to save close to 40 per cent overall.

The latest round of cloud cost cutting follows similar moves from others to strip out fees and drop cloud service prices. AWS, Google and VMware have all adjusted their pricing downward in the past few months.

AWS to expand to India in 2016

AWS said India is the next big market for public cloud expansion

Amazon unveiled plans this week to bring its Amazon Web Services (AWS) infrastructure to India in 2016, in a bid to expand into the quickly growing public cloud services market there.

AWS is already available in India and the company claims to have over 10,000 local customers using the platform, but the recently announced move would see the company set up its own infrastructure in-country rather than relying on delivering services from nearby regions such as Singapore.
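
Region choice is just a client-side setting in the AWS SDKs, which is why an in-country region matters mainly for latency and data residency rather than for code changes. A minimal boto3 sketch, assuming an Indian customer currently pins its clients to the Singapore region and would simply switch the region name once the local region launches:

```python
# Minimal sketch: pinning an AWS client to the nearest region with boto3.
# Today that might be Singapore (ap-southeast-1); once the Indian region is
# live, only the region_name string needs to change.
import boto3

s3 = boto3.client('s3', region_name='ap-southeast-1')

for bucket in s3.list_buckets().get('Buckets', []):
    print(bucket['Name'])
```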

The company says the move will likely improve the performance of the cloud services on offer to local organisations.

“Tens of thousands of customers in India are using AWS from one of AWS’s eleven global infrastructure regions outside of India. Several of these customers, along with many prospective new customers, have asked us to locate infrastructure in India so they can enjoy even lower latency to their end users in India and satisfy any data sovereignty requirements they may have,” said Andy Jassy, senior vice president, AWS.

“We’re excited to share that Indian customers will be able to use the world’s leading cloud computing platform in India in 2016 – and we believe India will be one of AWS’s largest regions over the long term.”

The India expansion comes at a time when the local market is maturing rapidly.

According to analyst and consulting house Gartner, public cloud services revenue in India will reach $838m by the end of 2015, an increase of almost 33 per cent – making it one of the fastest growing markets for public cloud services in the world (global average growth rates sit in the mid-twenties range, depending on the analyst house). The firm believes many local organisations in India are shifting away from more traditional IT outsourcing and using public cloud services instead.

Green America hits out at Amazon for its dirty cloud

Amazon has committed to bolstering its use of renewables, but Green America thinks it needs to go further

Not-for-profit environmental advocacy group Green America has launched a campaign to try to convince Amazon to reduce its carbon footprint and catch up with other large cloud incumbents’ green credentials.

Green America said Amazon is behind other datacentre operators – including some of its large competitors like Google, Apple and Facebook – in terms of its renewable energy use and reporting practices.

“Every day, tens of millions of consumers are watching movies, reading news articles, and posting to social media sites that all use Amazon Web Services. What they don’t realize is that by using Amazon Web Services they are contributing to climate change,” said Green America’s campaigns director Elizabeth O’Connell.

“Amazon needs to take action now to increase its use of renewables to 100 percent by 2020, so that consumers won’t have to choose between using the internet and protecting the planet,” O’Connell said.

Executive co-director Todd Larsen also commented on Amazon’s green cred: “Amazon lags behind its competitors, such as Google and Microsoft, in using renewable energy for its cloud-based computer servers. Unlike most of its competitors, it also fails to publish a corporate responsibility or sustainability report, and it fails to disclose its emissions and impacts to the Carbon Disclosure Project.”

Amazon has recently taken strides towards making its datacentres greener. In November last year the company committed to using 100 per cent renewable energy for its global infrastructure, bowing to pressure from organisations like Greenpeace which have previously criticised the company’s reporting practices around its carbon footprint. But organisations like Green America still believe the company is way off the mark on its commitment.

Green America’s campaign is calling on Amazon to commit to full use of renewables for its datacentres by 2020; submit accurate and complete data to the Carbon Disclosure Project; and issue an annual sustainability report.

An Amazon spokesperson told BCN that the company and its customers are already showing environmental leadership by adopting cloud services in the first place.

“AWS customers have already shown environmental leadership by moving to cloud computing, which is inherently more environmentally friendly than traditional computing. Any analysis on the climate impact of a datacentre should take into consideration resource utilization and energy efficiency, in addition to power mix,” the spokesperson said.

“On average, AWS customers use 77 per cent fewer servers, 84 per cent less power, and utilize a 28 per cent cleaner power mix, for a total reduction in carbon emissions of 88 per cent from using the AWS Cloud instead of operating their own datacentres. We believe that our focus on resource utilization and energy efficiency, combined with our increasing use of renewable energy, will help our customers achieve their carbon reduction and sustainability goals. We will continue to provide updates of our progress on our AWS & Sustainable Energy page,” she added.
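
Those numbers are roughly self-consistent: on the assumption that the server reduction is already reflected in the power figure, and that emissions scale with power used multiplied by the carbon intensity of the mix, the quoted 84 per cent and 28 per cent figures multiply out to roughly an 88 per cent cut.

```python
# Rough check of the AWS figures quoted above (assumption: emissions scale as
# power used x carbon intensity of the power mix; the 77% server reduction is
# taken to be already reflected in the 84% power reduction).
power_used = 1 - 0.84        # 16% of the on-premises power draw
carbon_intensity = 1 - 0.28  # 72% of the on-premises grid carbon intensity

remaining = power_used * carbon_intensity  # ~0.115
print(f"Implied emissions reduction: ~{(1 - remaining) * 100:.0f}%")  # ~88%
```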

CIF: Enterprises still struggling with cloud migration

UK enterprises are still struggling with their cloud migrations, the CIF research shows

The latest research published by the Cloud Industry Forum (CIF) suggests over a third of UK enterprise IT decision-makers believe cloud service providers could have better supported their migration to the cloud.

The CIF, which polled 250 senior IT decision-makers in the UK earlier this year to better understand where cloud services fit into their overall IT strategies, said it’s clear UK businesses are generally satisfied with their cloud services and plan to use more of them. But 35 per cent of those polled also said their companies still struggle with migration.

“The transition to cloud services has, for many, not been as straightforward as expected. Our latest research indicates that the complexity of migration is a challenge for a significant proportion of cloud users, resulting in unplanned disruption to the business,” said Alex Hilton, chief executive of the Cloud Industry Forum.

“There may be a case that cloud service providers need to be better at either setting end user expectations or easing the pain of migration to their services. But equally, it’s important that end users equip themselves with enough knowledge about cloud to be able to manage it and ensure that the cloud-based services rolled out can support business objectives, not hinder them.”

Piers Linney, co-chief executive of Outsourcery, said the research highlights the need for providers to develop a “strong integrated stack of partners.”

“IT leaders looking for a provider should first assess their existing in-house skills and experience to understand how reliant they will be on the supplier to ensure a smooth transition. Equally, cloud suppliers need to be more sensitive to their customers’ requirements and tailor their service to the level of support needed for successful cloud adoption,” he said.

“The most critical factor is for IT leaders to really get under the bonnet of their potential cloud provider, make sure that they have a strong and highly integrated stack of partners and a proven track record of delivery for other customers with needs similar to their own.”