Category archive: News & Analysis

IBM adds second SoftLayer datacentre in the Netherlands

IBM is launching a second SoftLayer datacentre in the Netherlands

IBM has announced the launch of a SoftLayer datacentre in the Netherlands, its second in the country. The move comes the same week IBM reported cloud revenue increases of close to 75 per cent.

The company said the new datacentre, located in Almere just outside Amsterdam, will double SoftLayer capacity in the region and provide customers with more in-country options for data storage and geographically isolated services.

“This new facility demonstrates the demand and success IBM Cloud is having at delivering high-value services right to the doorstep of our clients,” said James Comfort, IBM general manager of cloud services.

“We’re reaching customers in a way that takes all the guesswork out of moving to the cloud. They can build and scale applications, run the toughest big data workloads, have the level of security they need, all in country and connected to a truly global platform,” Comfort said.

IBM has moved to rapidly expand its cloud services in the past year. The company has opened 13 new SoftLayer datacentres in the past 10 months alone as it looks to shift its focus onto higher-margin strategic initiatives like cloud, big data and security.

That said, despite sequential quarterly revenue declines, the company recently reported that its annual “as-a-service” run rate stands at $3.8bn, up $1.5bn in the last year. Cloud revenue was up over 75 per cent from last year; on a trailing 12-month basis, the company reported cloud revenue of $7.7bn, with analytics up more than 20 per cent and social more than 40 per cent.

ISO 27018 and protecting personal information in the cloud: a first year scorecard

ISO 27018 has been around for a year – but is it effective?

A year after it was published, ISO 27018 – the first international standard focusing on the protection of personal data in the public cloud – continues, unobtrusively and out of the spotlight, to move centre stage as the battle for cloud pre-eminence heats up.

At the highest level, this is a competitive field for those with the longest investment horizons and the deepest pockets – think million-square-foot datacentres with 100,000+ servers using enough energy to power a city. According to research firm Synergy, the cloud infrastructure services market – Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and private and hybrid cloud – was worth $16bn in 2014, up 50 per cent on 2013, and is predicted to grow 30 per cent to over $21bn in 2015. Synergy estimated that the four largest players accounted for 50 per cent of this market, with Amazon at 28 per cent, Microsoft at 11 per cent, IBM at 7 per cent and Google at 5 per cent. Of these, Microsoft’s 2014 revenues almost doubled over 2013, whilst Amazon’s and IBM’s were each up by around half.

Significantly, the proportion of computing sourced from the cloud compared to on-premise is set to rise steeply: enterprise applications in the cloud accounted for one fifth of the total in 2014 and this is predicted to increase to one third by 2018.

This growth represents a huge year-on-year increase in the amount of personal data (PII, or personally identifiable information) going into the cloud, and in the number of cloud customers contracting for the various and growing types of cloud services on offer. But as the cloud continues to grow at these startling rates, the biggest inhibitor to cloud services growth – trust in the security of personal data in the cloud – continues to hog the headlines.

Under data protection law, the Cloud Service Customer (CSC) retains responsibility for ensuring that its PII processing complies with the applicable rules.  In the language of the EU Data Protection Directive, the CSC is the data controller.  In the language of ISO 27018, the CSC is either a PII principal (processing her own data) or a PII controller (processing other PII principals’ data).

Where a CSC contracts with a Cloud Service Provider (CSP), Article 17 of the EU Data Protection Directive sets out how the relationship is to be governed. The CSC must have a written agreement with the CSP; must select a CSP providing ‘sufficient guarantees’ over the technical security measures and organizational measures governing PII in the cloud service concerned; must ensure compliance with those measures; and must ensure that the CSP acts only on the CSC’s instructions.

As the pace of migration to the cloud quickens, the world of data protection law continues both to be fragmented – 100 countries have their own laws – and to move at a pace driven by the need to mediate all competing interests rather than the pace of market developments.

In this world of burgeoning cloud uptake, ISO 27018 is proving effective at bridging the gap between the dizzying pace of cloud market development and the slow and uncertain rate of legislative change by providing CSCs with a workable degree of assurance in meeting their data protection law responsibilities. Almost a year on from publication of the standard, Microsoft became (in February 2015) the first major CSP to achieve ISO 27018 certification for its Microsoft Azure (IaaS/PaaS), Office 365 (PaaS/SaaS) and Dynamics CRM Online (SaaS) services (verified by BSI, the British Standards Institution) and its Microsoft Intune SaaS services (verified by Bureau Veritas).

In the context of privacy and cloud services, ISO 27018 builds on other information security standards within the ISO 27000 family. This layered, interlocking approach is proving supple enough in practice to deal with the increasingly wide array of cloud services. For example, it is not tied to any particular kind of cloud service and, as Microsoft’s certifications show, applies to IaaS (Azure), PaaS (Azure and Office 365) and SaaS (Office 365 and Intune). If, as shown in the graphic below, you consider computing services as a stack of layered elements ranging from networking (at the bottom of the stack) up through equipment and software to data (at the top), and that each of these elements can be carried out on premise or from the cloud (from left to right), then ISO 27018 is flexible enough to cater for all situations across the continuum.

Software as a Licence to Software as a Service: the cloud continuum

Indeed, the standard specifically states at Paragraph 5.1.1:

“Contractual agreements should clearly allocate responsibilities between the public cloud PII processor [i.e. the CSP], its sub-contractors and the cloud service customer, taking into account the type of cloud service in question (e.g. a service of an IaaS, PaaS or SaaS category of the cloud computing reference architecture). For example, the allocation of responsibility for application layer controls may differ depending on whether the public cloud PII processor is providing a SaaS service or rather is providing a PaaS or IaaS service upon which the cloud service customer can build or layer its own applications.”
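
To make that allocation concrete, here is a minimal sketch in Python of the kind of responsibility matrix Paragraph 5.1.1 contemplates. The layer names and assignments are illustrative assumptions drawn from the commonly cited shared-responsibility model, not text from the standard itself.

```python
# Illustrative only: a simplified shared-responsibility matrix across the
# IaaS/PaaS/SaaS continuum. Layer names and assignments are assumptions
# based on the commonly cited model, not wording from ISO 27018.
RESPONSIBILITY = {
    # layer             IaaS     PaaS     SaaS
    "application":     ("CSC",   "CSC",   "CSP"),
    "runtime":         ("CSC",   "CSP",   "CSP"),
    "os":              ("CSC",   "CSP",   "CSP"),
    "virtualisation":  ("CSP",   "CSP",   "CSP"),
    "hardware":        ("CSP",   "CSP",   "CSP"),
    "networking":      ("CSP",   "CSP",   "CSP"),
}

MODELS = ("IaaS", "PaaS", "SaaS")

def responsible_party(layer: str, model: str) -> str:
    """Return who is responsible for a layer under a given service model."""
    return RESPONSIBILITY[layer][MODELS.index(model)]

# Application-layer controls shift to the provider only for SaaS, which is
# exactly the distinction Paragraph 5.1.1 draws.
assert responsible_party("application", "IaaS") == "CSC"
assert responsible_party("application", "SaaS") == "CSP"
```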

Equally, CSPs will generally not know whether their CSCs are sending PII to the cloud and, even if they do, they are unlikely to know whether or not particular data is PII. Here, another strength of ISO 27018 is that it applies regardless of whether particular data is, or is not, PII: certification simply assures the CSC that the service the CSP is providing is suitable for processing PII in relation to the performance by the CSP of its PII legal obligations.

Perhaps the biggest practical boon to the CSC, however, is the contractual certainty that ISO 27018 certification provides. As more work migrates to the cloud, particularly in the enterprise space, the IT procurement functions of large customers will be following structured processes in order to meet the requirements of their business and, in certain cases, their regulators. In their requests for information, proposals and quotations from prospective CSPs, CSCs now have a range of interlocking standards including ISO 27018 to choose from in their statements of requirements for a particular cloud procurement. As well as short-circuiting the need for CSCs to spend time writing up detailed specifications of their own requirements, verified compliance with these standards for the first time provides meaningful assurance and protection from risk around most aspects of cloud service provision. Organisations running competitive tenders can benchmark bidding CSPs against each other on their responses to these requirements, and then include the obligations to meet the requirements of the standards concerned as binding commitments in the contract when it is let.

In the cloud contract lifecycle, the flexibility provided by ISO 27018 certification, along with the contract and the CSP’s policy statements, goes beyond this: it gives the CSC a framework for discussing with the CSP, on an ongoing basis, the PII protection measures taken and their adequacy.

In its first year, it is emerging that complying, and being seen to comply, with ISO 27018 is providing genuine assurance for CSCs in managing their data protection legal obligations.  This reassurance operates across the continuum of cloud services and through the procurement and contract lifecycle, regardless of whether or not any particular data is PII.  In customarily unobtrusive style, ISO 27018 is likely to go on being a ‘win’ for the standards world, cloud providers and their customers, and data protection regulators and policy makers around the world.


IDC: Cloud to make up nearly half of IT infrastructure spending by 2019

Enterprise adoption of public cloud services seems to be outstripping private cloud demand

Total cloud infrastructure spending will grow by 21 per cent year over year to $32bn this year, accounting for approximately 33 per cent of all IT infrastructure spending, up from about 28 per cent in 2014, according to IDC.

The research and analyst house echoed claims that cloud computing has been significantly disrupting the IT infrastructure market over the past couple of years. The firm estimates last year cloud infrastructure spending totalled $26.4bn, up 18.7 per cent from the year before.

Kuba Stolarski, research manager for server, virtualization and workload research at IDC, said much of the growth over the next few years will be driven by public cloud adoption.

Private cloud infrastructure spending will grow by 16 per cent year on year to $12bn, while public cloud IT infrastructure spending will grow by a whopping 25 per cent in 2015 to $21bn – nearly twice as much, the firm believes.

“The pace of adoption of cloud-based platforms will not abate for quite some time, resulting in cloud IT infrastructure expansion continuing to outpace the growth of the overall IT infrastructure market for the foreseeable future,” Stolarski explained.

“As the market evolves into deploying 3rd Platform solutions and developing next-gen software, organizations of all types and sizes will discover that traditional approaches to IT management will increasingly fall short of the simplicity, flexibility, and extensibility requirements that form the core of cloud solutions.”

By 2019, the firm believes, cloud infrastructure spending will top $52bn and represent 45 per cent of total IT infrastructure spend; public cloud will account for about $32bn of that amount, and private cloud the remaining $20bn.
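
As a quick sanity check, the quoted figures hang together arithmetically (all figures rounded, in $bn):

```python
# Quick arithmetic check of the IDC figures quoted above (rounded, $bn).
spend_2014 = 26.4
spend_2015 = spend_2014 * 1.21                 # 21% year-over-year growth
print(f"2015 estimate: ${spend_2015:.1f}bn")   # ~31.9, i.e. roughly $32bn

# Implied compound annual growth from ~$32bn in 2015 to $52bn in 2019.
cagr = (52 / 32) ** (1 / 4) - 1
print(f"implied 2015-2019 CAGR: {cagr:.1%}")   # ~12.9% a year
```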

According to IDC, 15 per cent of the overall infrastructure spend in EMEA was related to cloud environments in 2014, up from 8 per cent in 2011. Some $3.4bn was spent on hardware going to cloud environments in EMEA in 2013, up 21 per cent from 2012.

Citrix, bowing to momentum, joins OpenStack

Citrix is rejoining OpenStack, the open source cloud project it abandoned for its own rival initiative

Virtualisation specialist Citrix has announced it is officially joining the OpenStack Foundation as a corporate sponsor – the open source organisation it left four years ago in order to pursue the rival CloudStack initiative.

Citrix said it had contributed to the OpenStack community fairly early on, but wanted to re-join the community in order to more formally demonstrate its commitment towards cloud interoperability and standards development.

As part of the announcement the company also said it has integrated NetScaler and XenServer with OpenStack.

“We’re pleased to formally sponsor the OpenStack Foundation to help drive cloud interoperability standards. Citrix products like NetScaler, through the recently announced NetScaler Control Center, and XenServer are already integrated with OpenStack,” said Klaus Oestermann, senior vice president and general manager, delivery networks at Citrix.

“Our move to support the OpenStack community reflects the great customer and partner demand for Citrix to bring the value of our cloud and networking infrastructure products to customers running OpenStack,” Oestermann added.

Citrix is one of the biggest backers of CloudStack, an Apache open source project that rivals OpenStack. Citrix was aligned with OpenStack at the outset but in 2012 ended its commitment to that project in order to pursue CloudStack development.

That said, the move would suggest Citrix is aware it can’t continue going against the grain too long when it comes to vendor and customer mind-share. OpenStack, despite all of its own internal politics and technological gaps, seems to have far more developers involved than CloudStack. It also has more buy-in from vendors.

All of this is to say that going the CloudStack route exclusively is counterintuitive, especially in cloud, which is all about heterogeneity (meaning interoperability is, or should be, among the top priorities of the vendors involved). Citrix maintains, however, that it will continue to invest in CloudStack development.

Laurent Lachal, lead analyst in Ovum’s software practice, told BCN the move is a classic case of “if you can’t beat ‘em, join ‘em.”

“But there needs to be more clarity around how OpenStack fits with CloudStack,” he explained. “The CloudStack initiative is no longer as dependent on Citrix as it used to be, which is a good thing. But the project still needs to get its act together.”

Microsoft to improve transparency, control over cloud data

Microsoft wants to improve the security of its offerings

Microsoft has announced a series of measures to give customers more control over their cloud-based data, a move it claims will improve transparency around how data is treated as well as the security of that data.

The company announced enhanced activity logs of user, admin and policy-related actions, which customers and partners can tap into through a new Office 365 Management Activity API to use for compliance and security reporting.
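
For illustration, here is a minimal sketch of how a customer might pull audit events through that API, assuming the subscription endpoints and content types as publicly documented; the tenant ID and bearer token are placeholders you would obtain through Azure AD.

```python
# A minimal sketch of pulling audit events from the Office 365 Management
# Activity API. Endpoint paths and content types follow Microsoft's public
# documentation; tenant ID and token below are placeholders.
import requests

TENANT_ID = "<your-tenant-guid>"      # placeholder
TOKEN = "<azure-ad-bearer-token>"     # placeholder: obtained via OAuth2
BASE = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# 1. Subscribe to a content type (a one-off step per tenant/content type).
requests.post(
    f"{BASE}/subscriptions/start",
    params={"contentType": "Audit.General"},
    headers=HEADERS,
).raise_for_status()

# 2. List available content blobs, then fetch each blob's events.
listing = requests.get(
    f"{BASE}/subscriptions/content",
    params={"contentType": "Audit.General"},
    headers=HEADERS,
)
listing.raise_for_status()
for blob in listing.json():
    events = requests.get(blob["contentUri"], headers=HEADERS).json()
    for event in events:
        print(event.get("CreationTime"), event.get("Operation"))
```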

Microsoft said by the end of this year it plans to introduce a Customer Lockbox for Office 365, which will give Office users the ability to approve or reject a Microsoft engineer’s request to log into the Office 365 service.

“Over the past few years, we have seen the security environment change and evolve. Cyber threats are reaching new levels, involving the destruction of property, and governments now act both as protectors and exploiters of technology. In this changing environment, two themes have emerged when I talk with our customers – 1) they want more transparency from their providers and more control of their data, and 2) they are looking for companies to protect their data through leading edge security features,” explained Scott Charney, corporate vice president, trustworthy computing at Microsoft.

“In addition to greater control of their data, companies also need their technology to adhere to the compliance standards for the industries and geographic markets in which they operate.”

The company is also upping its game on security and encryption. Office 365 already encrypts data in transit, but in the coming months Charney said the company plans to introduce content-level encryption, and by 2016 it plans to give customers the ability to require Microsoft to use customer-generated, customer-controlled encryption keys to encrypt their content at rest.
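
The customer-controlled key model works roughly as follows: the customer generates and retains the key, so the provider only ever stores ciphertext. Below is a conceptual sketch using Python’s cryptography package; it illustrates the general pattern, not Microsoft’s implementation.

```python
# Conceptual sketch of customer-controlled encryption at rest: the customer
# generates and holds the key, and only ciphertext reaches the provider's
# storage. Illustrates the pattern, not Microsoft's implementation.
# Requires the 'cryptography' package.
from cryptography.fernet import Fernet

customer_key = Fernet.generate_key()   # generated and held by the customer
cipher = Fernet(customer_key)

document = b"quarterly forecast: confidential"
stored_blob = cipher.encrypt(document)   # what the cloud provider stores

# Without customer_key the provider cannot recover the plaintext; with it,
# the customer can decrypt on demand.
assert Fernet(customer_key).decrypt(stored_blob) == document
```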

It also plans to bolster network security through Azure-focused partnerships with the likes of Barracuda, Check Point, Fortinet, Websense, Palo Alto Networks, F5 and Alert Logic, and broaden the security capabilities of its enterprise mobility management suite.

Microsoft has over the past couple of years evolved into a strong proponent of, and active participant in, discussions around data security and data protection, including legislative change impacting these areas in the US. It is also among a number of US cloud providers convinced that lingering distrust of cloud security is hampering their ability to make inroads into the cloud market – an added incentive to double down on securing its own offerings.

Cisco, Elastica join forces on cloud security monitoring

Cisco will resell Elastica’s cloud service monitoring technology

Networking giant Cisco is teaming up with Elastica, a cloud security startup, in a move that will see the two firms combine their threat intelligence and cloud service monitoring technologies.

The partnership will also see Cisco resell Elastica’s cloud application security and monitoring solution (CloudSOC) to its customers.

“The combination of Cisco’s threat-centric security portfolio and Elastica’s innovation in cloud application security provides a unique opportunity. Our global customers gain additional levels of visibility and control for cloud applications and it enhances our portfolio of advanced cloud-delivered security offerings,” said Scott Harrell, vice president of product management, Cisco Security Business Group.

“We are excited to partner with Elastica to deliver an even richer portfolio of on–premises and cloud application security to protect businesses across the attack continuum – before, during, and after an attack,” Harrell said.

The move is a big win for Elastica, a startup that exited stealth in early 2014 and just last month secured $30m in funding. Cisco will provide the security startup with a large and varied channel spanning both the enterprise and scale-out markets, while plugging a gap in its own burgeoning cloud-centric portfolio (that said, it’s possible the move is a precursor to an acquisition).

“CIOs want to empower employees with advanced cloud apps that help enterprises stay agile, productive and competitive in the marketplace. The power of these cloud apps – information sharing and built-in collaboration capabilities – also require a completely new approach to security,” said Rehan Jalil, president and chief executive of Elastica.

“Elastica’s cloud app security technology, together with Cisco’s broad security portfolio and footprint, will help us catalyze the safe and compliant use of cloud apps so that our customers can continue to securely make their businesses more agile and productive,” Jalil said.

IBM reports flat revenues but cloud revenue is up 75%

IBM reported strong performance in cloud despite nearly three years of sequential quarterly declines

For Q1 2015 IBM reported flat revenues year on year and operating income slightly up on last year, due in part to currency impacts and some of the recent restructuring efforts at the firm, respectively. But the company also reported strong performance in its ‘as-a-service’ segment.

The company reported strong growth in its Power and mainframe businesses, with quarterly mainframe revenue more than doubling (with particularly strong growth in China). The company said Power showed strong performance in the scale-out systems market as well, in part due to the expansion of Power architecture in SoftLayer datacentres.

But at $19.6bn, first-quarter 2015 revenue dropped for the 12th consecutive quarter at IBM once a stronger dollar and the impact of divested businesses are taken into consideration.

The company’s chief financial officer Martin Schroeter aimed to reassure the market that bold moves to invest in new areas like Internet of Things and restructure its business were having a positive impact.

IBM is spending billions to shift its focus onto higher-margin strategic initiatives like cloud, big data, mobile, security and IoT, and is continuing to “rebalance” its workforce at the same time.

“As we continue the transformation of our business, I’d expect a similar level of workforce rebalancing next quarter, which will impact our year-to-year profit performance,” Schroeter said.

“At our investor briefing at the end of February, we spent a lot of time on how we are transforming our business to where we see long-term value in enterprise IT. We have a core portfolio that’s high value to our clients and high value to us. Quite frankly, it’s essential.”

“While the market for these capabilities isn’t necessarily growing, we continue to reinvent and innovate to deliver that value,” he added.

But performance in areas of strategic importance for IBM looks promising. Schroeter said the annual “as-a-service” run rate stands at $3.8bn, up $1.5bn in the last year. Cloud revenue was up over 75 per cent from last year; on a trailing 12-month basis, the company reported cloud revenue of $7.7bn, with analytics up more than 20 per cent and social more than 40 per cent.

VMware open sources IAM, cloud OS tools

VMware is open sourcing cloud tools

VMware has open sourced two sets of tools that the company said would accelerate cloud adoption in the enterprise and improve enterprises’ security posture.

The company announced Project Lightwave, which the company is pitching as the industry’s first container identity and access management tool for cloud-native applications, and Project Photon, a lightweight Linux operating system optimised for running these kinds of apps in vSphere and vCloud Air.

The move follows Pivotal’s recent launch of Lattice, a container cluster scheduler for Cloud Foundry that the software firm is pitching as a more modular way of building apps, exposing CF components as standalone microservices (and thus making apps built with Lattice easier to scale).

“Through these projects VMware will deliver on its promise of support for any application in the enterprise – including cloud-native applications – by extending our unified platform with Project Lightwave and Project Photon,” said Kit Colbert, vice president and chief technology officer for Cloud-Native Applications, VMware.

“Used together, these new open source projects will provide enterprises with the best of both worlds. Developers benefit from the portability and speed of containerized applications, while IT operations teams can maintain the security and performance required in today’s business environment,” Colbert said.

Earlier this year VMware went on the container offensive, announcing an updated vSphere platform that would enable users to run Linux containers side by side with traditional VMs as well as its own distribution of OpenStack.

The latest announcement – particularly Lattice – is part of a broader industry trend that sees big virtualisation incumbents embrace a more modular, cloud-friendly architecture (which many view as synonymous with containers) in their offerings. This week one of VMware’s chief rivals in this area, Microsoft, announced its own container-like architecture for Azure following a series of moves to improve support for Docker on its on-premise and cloud platforms.

Microsoft debuts container-like architecture for cloud

Microsoft is trying to push more cloud-friendly architectures

Microsoft has announced Azure Service Fabric, a framework for ISVs and startups developing highly scalable cloud applications that combines a range of microservices, orchestration, automation and monitoring tools. The move comes as the software company looks to deepen its use of – and ties to – open source tech.

Azure Service Fabric, which is based in part on technology included in Azure App Fabric, breaks apart apps into a wide range of small, independently versioned microservices, so that apps created on the platform don’t need to be re-coded in order to scale past a certain point. The result, the company said, is the ability to develop highly scalable applications while enabling low-level automation and orchestration of its constituent services.

“Service Fabric was born from our years of experience delivering mission-critical cloud services and has been in production for more than five years. It provides the foundational technology upon which we run our Azure core infrastructure and also powers services like Skype for Business, InTune, Event Hubs, DocumentDB, Azure SQL Database (across more than 1.4 million customer databases) and Bing Cortana – which can scale to process more than 500 million evaluations per second,” explained Mark Russinovich, chief technology officer of Microsoft Azure.

“This experience has enabled us to design a platform that intrinsically understands the available infrastructure resources and needs of applications, enabling automatically updating, self-healing behaviour that is essential to delivering highly available and durable services at hyper-scale.”
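
The “self-healing” behaviour Russinovich describes boils down to a supervision loop: probe each independently versioned service and restart whatever fails. Here is a toy sketch of the pattern in Python; it is conceptual only and does not use Service Fabric’s actual API.

```python
# Toy illustration of self-healing supervision: a loop polls independently
# versioned service instances and restarts any that fail a health check.
# Conceptual sketch of the pattern, not Service Fabric's API.
import random

class ServiceInstance:
    def __init__(self, name: str, version: str):
        self.name, self.version, self.healthy = name, version, True

    def health_check(self) -> bool:
        # Stand-in for a real probe; fails randomly to exercise the loop.
        self.healthy = self.healthy and random.random() > 0.2
        return self.healthy

    def restart(self) -> None:
        self.healthy = True

def reconcile(instances: list) -> None:
    """One supervision pass: restart anything that fails its probe."""
    for svc in instances:
        if not svc.health_check():
            print(f"restarting {svc.name} v{svc.version}")
            svc.restart()

fleet = [ServiceInstance("billing", "1.4.2"), ServiceInstance("search", "2.0.1")]
for _ in range(5):   # in production this loop runs continuously
    reconcile(fleet)
```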

A preview of the service will be released to developers at the company’s Build conference next week.

The move is part of a broader architectural shift in the software stack powering cloud services today. It’s clear the traditional OS/hypervisor model is limited in its ability to ensure services are scalable and resilient for high-I/O applications, which has manifested in, among other things, a shift towards breaking applications down into a series of connected microservices – something many equate with Docker and OpenStack, among other open source software projects.

Speaking of open source, the announcement comes just days after Microsoft said MS Open Tech, its standalone open source subsidiary, will rejoin the parent company, a move Microsoft hopes will drive further engagement with open source communities.

“The goal of the organization was to accelerate Microsoft’s open collaboration with the industry by delivering critical interoperable technologies in partnership with open source and open standards communities. Today, MS Open Tech has reached its key goals, and open source technologies and engineering practices are rapidly becoming mainstream across Microsoft. It’s now time for MS Open Tech to rejoin Microsoft Corp, and help the company take its next steps in deepening its engagement with open source and open standards,” explained Jean Paoli, president of Microsoft Open Technologies.

“As MS Open Tech rejoins Microsoft, team members will play a broader role in the open advocacy mission with teams across the company, including the creation of the Microsoft Open Technology Programs Office. The Programs Office will scale the learnings and practices in working with open source and open standards that have been developed in MS Open Tech across the whole company.”

AWS bolsters GPU-accelerated instances

AWS is updating its GPU-accelerated cloud instances

Amazon has updated its family of GPU-accelerated instances (G2) in a move that will see AWS offer up to four times more GPU power at the top end.

At the tail end of 2013, AWS teamed up with graphics processing specialist Nvidia to launch the Amazon EC2 G2 instance, a GPU-accelerated instance specifically designed for graphically intensive cloud-based services.

Each Nvidia Grid GPU offers up to 1,536 parallel processing cores and gives software-as-a-service developers access to higher-end graphics capabilities, including fully supported 3D visualization for games and professional services.

“The GPU-powered G2 instance family is home to molecular modeling, rendering, machine learning, game streaming, and transcoding jobs that require massive amounts of parallel processing power. The Nvidia Grid GPU includes dedicated, hardware-accelerated video encoding; it generates an H.264 video stream that can be displayed on any client device that has a compatible video codec,” explained Jeff Barr, chief evangelist at AWS.

“This new instance size was designed to meet the needs of customers who are building and running high-performance CUDA, OpenCL, DirectX, and OpenGL applications.”

The new g2.8xlarge instance, available in US East (Northern Virginia), US West (Northern California), US West (Oregon), Europe (Ireland), Asia Pacific (Singapore) and Asia Pacific (Tokyo), offers four times the GPU power of standard G2 instances, including: 4 GB of video memory and the ability to encode either four real-time HD video streams at 1080p or eight real-time HD video streams at 720p; 32 vCPUs; 60 GiB of memory; and 240 GB (2 x 120) of SSD storage.
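
For readers who want to try the new size, here is a minimal boto3 sketch; the AMI ID is a placeholder for a GPU-enabled image (for example, one with the Nvidia drivers and CUDA toolkit preinstalled) in your region.

```python
# Minimal sketch of launching the new instance size with boto3.
# The AMI ID below is a placeholder; substitute a GPU-enabled image
# available in your chosen region.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")   # US East (N. Virginia)
response = ec2.run_instances(
    ImageId="ami-xxxxxxxx",        # placeholder GPU AMI
    InstanceType="g2.8xlarge",     # 4 GPUs, 32 vCPUs, 60 GiB of memory
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```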

GPU virtualisation is still fairly early on in its development but the technology does open up opportunities for the cloudification of a number of niche applications in pharma and engineering, which have a blend of computational and graphical requirements that have so far been fairly difficult to replicate in the cloud (though bandwidth constraints could still create performance limitations).