Category archive: Infrastructure as a Service

SoftLayer ups RAM, drops storage and compute costs

SoftLayer rejigged its cloud pricing

SoftLayer announced a new pricing model it said would make the company more competitive with other cloud providers, in part by not charging separately for many networking costs.

“While other cloud providers advertise ‘low’ prices for incomplete solutions, they neglect to mention extra charges for essential resources like network bandwidth, primary system storage, and support. At SoftLayer, our servers already include these necessary resources at no additional charge,” the company explained on its blog.

“Our new pricing model includes a redeveloped ordering and provisioning system that offers even more granular pricing for every SoftLayer bare metal and virtual server, from the processor to the RAM, storage, networking, security, and more.”

It also announced location-based pricing, meaning the company will uniquely price cloud services based on datacentre location.

Under the new cost model, compute (dual Xeon E5-2620 processors, 4U) and storage costs dropped while RAM prices increased slightly – though the company said users can expect to save close to 40 per cent overall.

The latest round of cloud cost cutting follows similar moves from others to strip out fees and drop cloud service prices. AWS, Google and VMware have all adjusted their pricing downward in the past few months.

AWS to expand to India in 2016

AWS said India is the next big market for public cloud expansion

Amazon unveiled plans this week to bring its Amazon Web Services (AWS) infrastructure to India by 2016 in a bid to expand into the quickly growing public cloud services market there.

AWS is already available in India and the company claims to have over 10,000 local customers using the platform, but the recently announced move would see the company set up its own infrastructure in-country rather than relying on delivering the services from nearby regions such as Singapore.

The company says the move will likely improve the performance of the cloud services on offer to local organisations.

“Tens of thousands of customers in India are using AWS from one of AWS’s eleven global infrastructure regions outside of India. Several of these customers, along with many prospective new customers, have asked us to locate infrastructure in India so they can enjoy even lower latency to their end users in India and satisfy any data sovereignty requirements they may have,” said Andy Jassy, senior vice president, AWS.

“We’re excited to share that Indian customers will be able to use the world’s leading cloud computing platform in India in 2016 – and we believe India will be one of AWS’s largest regions over the long term.”

The India expansion comes at a time when the local market is maturing rapidly.

According to analyst and consulting house Gartner, public cloud services revenue in India will reach $838m by the end of 2015, an increase of almost 33 per cent – making it one of the fastest growing markets for public cloud services in the world (global average growth rates sit in the mid-twenties range, depending on the analyst house). The firm believes many local organisations in India are shifting away from more traditional IT outsourcing and using public cloud services instead.

Green America hits out at Amazon for its dirty cloud

Amazon has committed to bolstering its use of renewables, but Green America thinks it needs to go further

Not-for-profit environmental advocacy group Green America has launched a campaign to convince Amazon to reduce its carbon footprint and catch up with the green credentials of other large cloud incumbents.

Green America said Amazon is behind other datacentre operators – including some of its large competitors like Google, Apple and Facebook – in terms of its renewable energy use and reporting practices.

“Every day, tens of millions of consumers are watching movies, reading news articles, and posting to social media sites that all use Amazon Web Services. What they don’t realize is that by using Amazon Web Services they are contributing to climate change,” said Green America’s campaigns director Elizabeth O’Connell.

“Amazon needs to take action now to increase its use of renewables to 100 percent by 2020, so that consumers won’t have to choose between using the internet and protecting the planet,” O’Connell said.

Executive co-director Todd Larsen also commented on Amazon’s green cred: “Amazon lags behind its competitors, such as Google and Microsoft, in using renewable energy for its cloud-based computer servers. Unlike most of its competitors, it also fails to publish a corporate responsibility or sustainability report, and it fails to disclose its emissions and impacts to the Carbon Disclosure Project.”

Amazon has recently taken strides towards making its datacentres greener. In November last year the company committed to using 100 per cent renewable energy for its global infrastructure, bowing to pressure from organisations like Greenpeace which have previously criticised the company’s reporting practices around its carbon footprint. But organisations like Green America still believe the company is way off the mark on its commitment.

Green America’s campaign is calling on Amazon to commit to full use of renewables for its datacentres by 2020; submit accurate and complete data to the Carbon Disclosure Project; and issue an annual sustainability report.

An Amazon spokesperson told BCN that the company and its customers are already showing environmental leadership by adopting cloud services in the first place.

“AWS customers have already shown environmental leadership by moving to cloud computing, which is inherently more environmentally friendly than traditional computing. Any analysis on the climate impact of a datacentre should take into consideration resource utilization and energy efficiency, in addition to power mix,” the spokesperson said.

“On average, AWS customers use 77 per cent fewer servers, 84 per cent less power, and utilize a 28 per cent cleaner power mix, for a total reduction in carbon emissions of 88 per cent from using the AWS Cloud instead of operating their own datacentres. We believe that our focus on resource utilization and energy efficiency, combined with our increasing use of renewable energy, will help our customers achieve their carbon reduction and sustainability goals. We will continue to provide updates of our progress on our AWS & Sustainable Energy page,” she added.
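
Those figures hang together arithmetically. As a rough back-of-the-envelope check – a simplified model assumed here, not AWS’s published methodology – treat carbon emissions as power consumed multiplied by the carbon intensity of the power mix; the quoted power saving and cleaner mix then imply roughly the 88 per cent reduction AWS cites:

```python
# Back-of-the-envelope check of the quoted figures. This simple model is an
# illustrative assumption, not AWS's published methodology: it treats carbon
# emissions as power consumed multiplied by the carbon intensity of that power.

power_saving = 0.84   # "84 per cent less power" -> 16% of the on-premises power remains
cleaner_mix = 0.28    # "28 per cent cleaner power mix" -> 72% of the carbon intensity remains

relative_emissions = (1 - power_saving) * (1 - cleaner_mix)   # 0.16 * 0.72 = 0.1152
implied_reduction = 1 - relative_emissions

print(f"Relative emissions vs on-premises: {relative_emissions:.3f}")   # ~0.115
print(f"Implied carbon reduction: {implied_reduction:.0%}")             # ~88%, matching the quote
```

The 77 per cent server figure is assumed here to be already reflected in the power saving, so it is not counted twice.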

CIF: Enterprises still struggling with cloud migration

UK enterprises are still struggling with their cloud migrations, the CIF research shows

The latest research published by the Cloud Industry Forum (CIF) suggests over a third of UK enterprise IT decision-makers believe cloud service providers could have better supported their migration to the cloud.

The CIF, which polled 250 senior IT decision-makers in the UK earlier this year to better understand where cloud services fit into their overall IT strategies, said it’s clear UK businesses are generally satisfied with their cloud services and plan to use more of them. But 35 per cent of those polled also said their companies still struggle with migration.

“The transition to cloud services has, for many, not been as straightforward as expected. Our latest research indicates that the complexity of migration is a challenge for a significant proportion of cloud users, resulting in unplanned disruption to the business,” said Alex Hilton, chief executive of the Cloud Industry Forum.

“There may be a case that cloud service providers need to be better at either setting end user expectations or easing the pain of migration to their services. But equally, it’s important that end users equip themselves with enough knowledge about cloud to be able to manage it and ensure that the cloud-based services rolled out can support business objectives, not hinder them.”

Piers Linney, co-chief executive of Outsourcery, said the research highlights the need for providers to develop a “strong integrated stack of partners.”

“IT leaders looking for a provider should first assess their existing in-house skills and experience to understand how reliant they will be on the supplier to ensure a smooth transition. Equally, cloud suppliers need to be more sensitive to their customers’ requirements and tailor their service to the level of support needed for successful cloud adoption,” he said.

“The most critical factor is for IT leaders to really get under the bonnet of their potential cloud provider, make sure that they have a strong and highly integrated stack of partners and a proven track record of delivery for other customers with needs similar to their own.”

iomart buys cloud consultancy SystemsUp for up to £12.5m

iomart is buying IT consultancy SystemsUp for up to £12.5m

UK cloud service provider iomart announced it has entered into a deal to acquire IT consultancy SystemsUp, which specialises in designing and delivering cloud solutions, for up to £12.5m.

The deal will see iomart pay £9m in an initial cash consideration for the London-based consultancy with a contingent consideration of up to £3.5m depending on performance.

iomart said the move would broaden its cloud computing expertise. SystemsUp designs and delivers solutions made to run on Google, AWS and Microsoft public clouds among other platforms, and specialises in public sector cloud strategies.

“The market for cloud computing is becoming incredibly complex and the demand for public cloud services is increasing at pace,” said Angus MacSween, chief executive of iomart. “With the acquisition of SystemsUp, iomart has broadened its ability to engage at a strategic level and act as a trusted advisor on cloud strategy to organisations wanting to create the right blend of cloud services, both public and private, to fit their requirements.”

While iomart offers its own cloud services, the company seems to recognise the need to build up skills in a range of other platforms; the company said SystemsUp will remain an “impartial, agnostic, expert consultancy.”

Peter Burgess, managing director of SystemsUp said: “We have already built up a significant reputation and expertise in helping organisations use public cloud to drive down IT costs and improve efficiency. As part of iomart we can leverage their award winning managed services offerings to deepen and widen our toolset to deliver a broader set of cloud services, alongside continuing to deliver the strategic advice and deployment of complex large public and private sector cloud projects.”

The move comes six months after iomart’s last acquisition, when the company announced it had bought ServerSpace, a rival cloud service provider, for £4.25m.

Alibaba announces partner programme to boost cloud efforts

Alibaba’s partner programme will help it expand internationally

Alibaba’s cloud division Aliyun has launched a global partnership programme aimed at bolstering global access to its cloud services.

The company’s Marketplace Alliance Program (MAP) will see it partner with large tech and datacentre operators, initially including Intel, Singtel, Meeras, Equinix and PCCW among others to help localise its cloud computing services and grow its ecosystem.

“The new Aliyun program is designed to bring our customers the best cloud computing solutions by partnering with some of the most respected technology brands in the world. We will continue to bring more partners online to grow our cloud computing ecosystem,” said Sicheng Yu, vice president, Aliyun.

Raejeanne Skillern, general manager of cloud service provider business at Intel said: “For years Intel and Alibaba have collaborated on optimizing hardware and software technology across the data center for Alibaba’s unique workloads. As a partner in Aliyun’s Marketplace Alliance Program, Intel looks forward to continuing our collaboration to promoting joint technology solutions that are based on Intel Architecture specifically tailored to the rapidly growing market of international public cloud consumers.”

The move is part of Alibaba’s efforts to rapidly expand its presence internationally. This year the company opened its first datacentre in the US, and just last week announced Equinix would offer direct access to its cloud platform globally. The company, often viewed as the Chinese Amazon, also plans to set up a joint venture with Meeras in Dubai that specialises in systems integration with a focus on big data and cloud-based services.

Equinix to offer direct access to Alibaba’s cloud service

Equinix will offer direct links to Alibaba’s cloud

Equinix has signed an agreement with Alibaba that will see the American datacentre incumbent provide direct access to the Chinese ecommerce firm’s cloud computing service.

The deal will see Equinix add Aliyun, Alibaba’s cloud computing division, to its growing roster of cloud services integrated with its cloud interconnection service, and offer direct access to Aliyun’s IaaS and SaaS offerings in both Asia and North America.

Equinix said it’s aiming this primarily at large multinationals looking to expand their infrastructure into Asia.

“Our multi-national enterprise customers are increasingly asking for access to the Aliyun cloud platform, as they deploy cloud-based applications across Asia,” said Chris Sharp, vice president of cloud innovation, Equinix.

“By providing this access in two strategic markets, we’re empowering businesses to build secure, private clouds, without compromising network and application performance,” Sharp said.

Sicheng Yu, vice president of Aliyun said: “Aliyun is very excited about our global partnership with Equinix, who not only has a global footprint of cutting-edge datacentres, but has also brought together the most abundant cloud players and tenants in the cloud computing ecosystem on its Equinix Cloud Exchange platform. Connecting the Equinix ecosystem with our Aliyun cloud services on Cloud Exchange will provide customers with the best-of-breed choices and flexibility.”

The move will see Equinix expand its reach in Asia, a fast-growing market for cloud services, and comes just one week after Equinix announced it would bolster its European footprint with the TelecityGroup merger.

Containers ready for primetime, Rackspace CTO says

John Engates was in London for the Rackspace Solve conference

Linux containers have been around for some time but only now is the technology reaching a level of maturity enterprise cloud developers are comfortable with, explained John Engates, Rackspace’s chief technology officer.

Linux containers have been all the rage the past year, and Engates told BCN the volume of the discussion is only likely to increase as the technology matures. But the technology is still young.

“We tried to bring support for containers to OpenStack around three or four years back,” Engates said. “But I think that containers are finally ready for cloud.”

One of the projects Engates cited to illustrate this is Project Magnum, a young OpenStack sub-project that builds on Heat to provision Nova instances on which application containers run. Magnum adds native container capabilities (like support for different scheduling techniques), effectively enabling users and service providers to offer containers-as-a-service, and improves the portability of containers between different cloud platforms.
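
For readers unfamiliar with the pattern, the sketch below shows the containers-as-a-service flow Magnum is aiming at: request a container cluster, wait while the orchestration layer stands up the underlying Nova instances, then schedule containers onto it. The client object and method names here are hypothetical placeholders rather than Magnum’s actual API.

```python
# Illustrative containers-as-a-service flow in the style Project Magnum targets.
# `cloud`, `create_cluster`, `get_cluster` and `run_container` are hypothetical
# helpers standing in for a real OpenStack/Magnum client; the actual library,
# resource names and statuses differ.
import time


def deploy_app(cloud, image="nginx:latest", nodes=3):
    # 1. Ask the cloud for a container cluster. Behind the scenes Magnum drives
    #    Heat, which boots the Nova instances that act as container hosts.
    cluster = cloud.create_cluster(name="web-cluster", node_count=nodes)

    # 2. Poll until orchestration reports the cluster is ready.
    while cloud.get_cluster(cluster.id).status != "READY":
        time.sleep(10)

    # 3. Schedule application containers onto the cluster; the scheduling
    #    technique (spread, pack, etc.) is a property of the cluster's engine.
    return cloud.run_container(cluster_id=cluster.id, image=image)
```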

“While containers have been around for a while they’ve only recently become the darling of the enterprise cloud developers, and part of that is because there’s a growing ecosystem out there working to build the tools needed to support them,” he said.

A range of use cases for Linux containers has emerged over the years: as a transport method, as a way of quickly deploying and porting apps between different sets of infrastructure, and as a way of standing up a cloud service that offers greater billing granularity (more accurate, more efficient usage). But the technology is still maturing and has suffered from a lack of tooling; doing anything like complex service chaining is still challenging with existing tools, though that is improving.

Beyond LXC, one of the earliest Linux container projects, there are now CoreOS, Docker, Mesos, Kubernetes and a whole host of related technologies that bring the microservices and lightweight-OS architecture, along with deployment scheduling and cluster management tools, to market.
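
Part of the appeal is how little ceremony is involved compared with provisioning a virtual machine. The snippet below is a minimal illustration – assuming Docker is installed and its daemon is running locally – of launching a throwaway container with a single call:

```python
# Minimal illustration of the low ceremony involved: launching an isolated
# container is a single command. Assumes Docker is installed locally and the
# daemon is running; uses the public 'alpine' image.
import subprocess

result = subprocess.run(
    ["docker", "run", "--rm", "alpine", "echo", "hello from a container"],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout.strip())
```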

“We’re certainly hearing more about how we can help support containers, so we see it as pretty important from a service perspective moving forward,” he added.

Skyscape, DeepSecure strike cloud data compliance deal

Skyscape is partnering with DeepSecure to bolster its security cred

Cloud service provider Skyscape is partnering with DeepSecure in a move the companies said would help public sector cloud users meet their compliance needs.

DeepSecure traditionally sells to the police, defence and intelligence sectors, providing secure data sharing and data management services as well as cybersecurity systems. The partnership will see Skyscape offer its customers DeepSecure’s suite of data sharing and security services.

The move could give Skyscape, which already heavily targets the public sector, a way in with some of the more heavily regulated clients (security-wise) there.

“We’re delighted to announce our partnership with DeepSecure, a likeminded company with a significant track record when it comes to helping organisations share data securely,” said Simon Hansford, chief executive of Skyscape Cloud Services.

“DeepSecure is certainly a good cultural fit for us as a fellow UK sovereign SME that specialises in delivering secure digital services to the UK public sector.  The firm also shares our commitment to offering a consumption-based pricing model for its security services, which aligns with our own pay-as-you-go model for our full catalogue of assured cloud services,” Hansford said.

OpenStack does some soul searching, finds its core self

Bryce: ‘OpenStack will power the planet’s clouds’

The OpenStack Foundation announced new interoperability and testing requirements as well as enhancements to the software’s implementation of federated identity which the Foundation’s executive director Jonathan Bryce says will take the open source cloud platform one step closer to world domination.

OpenStack’s key pitch, beyond being able to spin up scalable compute, storage and networking resources fairly quickly, is that OpenStack-based private clouds should be able to burst into public cloud or other private cloud instances if need be. That kind of capability is essential if the platform is going to take on the likes of AWS, VMware and Microsoft, but implementations have so far been quite basic.

But for that kind of interoperability to happen you need three things: the ability to federate the identity of a cloud user, so permissions and workloads can port over to whatever platforms are being deployed on (and so those workloads remain secure); a definition of what vendors, service providers and customers can reliably call core OpenStack, so they can all expect a standard collection of tools, services and APIs in every distribution; and a way to test the interoperability of OpenStack distributions and appliances.
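
In practice, federated identity means a user authenticated against their home cloud can obtain credentials a second, trusted cloud will accept, without maintaining a separate set of accounts there. The sketch below illustrates that token-exchange idea only; the objects and method names are hypothetical and do not reflect Keystone’s real federation API.

```python
# Conceptual sketch of identity federation for cloud bursting. The objects and
# method names here are hypothetical illustrations of the token-exchange idea,
# not Keystone's real federation API.

def burst_workload(home_cloud, remote_cloud, workload):
    # 1. Authenticate against the home (private) cloud as usual.
    local_token = home_cloud.authenticate(user="deployer", password="secret")

    # 2. Exchange the local token for an assertion the remote cloud trusts
    #    (Keystone does this with SAML-based Keystone-to-Keystone federation;
    #    modelled here as a single hypothetical call).
    assertion = home_cloud.issue_assertion(local_token, audience=remote_cloud.id)

    # 3. Present the assertion to the remote (public) cloud and receive a
    #    scoped token -- no separate user account is needed on the remote side.
    remote_token = remote_cloud.accept_assertion(assertion)

    # 4. Deploy the bursting workload with the remote-scoped credentials.
    return remote_cloud.deploy(workload, token=remote_token)
```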

To that end, the Foundation announced a new OpenStack Powered interoperability testing programme, so users can validate the interoperability of their own deployments as well as gain assurances from vendors that clouds and appliances branded as “OpenStack Powered” meet the same requirements. About 16 companies already have either certified cloud platforms or appliances available on the OpenStack Marketplace as of this week, and Bryce said there’s more to come.

The latest release of OpenStack, Kilo, also brings a number of improvements to federated identity, making it much easier to implement as well as more dynamic in terms of workload deployment, and Bryce said that over 30 companies have committed to implementing federated identity (which has been available since the Icehouse release) by the end of this year – meaning the OpenStack cloud footprint just got a whole lot bigger.

“It has been a massive effort to come to an agreement on what we need to have in these clouds, how to test it,” Bryce said. “It’s a key step towards the goal of realising an OpenStack-powered planet.”

The challenge is, as the code gets bulkier and as groups add more services, joining all the bits and making sure they work together without one component or service breaking another becomes much more complex. That said, the move marks a significant milestone for the DefCore group, the internal committee in charge of setting base requirements by defining 1) capabilities, 2) code and 3) must-pass tests for all OpenStack products. The group have been working for well over a year on developing a standard definition of what a core OpenStack deployment is.