Rackspace’s new UK data centre aims to combine efficiency with environmental change

Picture credit: Rackspace

Rackspace has officially launched its new UK data centre in Crawley, West Sussex, cutting the ribbon in front of local dignitaries, partners and a smattering of hacks.

The launch, on April 22 – not coincidentally Earth Day – offered the chance for Rackspace and its partner Digital Realty to map out their future and the importance of two key areas: the Open Compute Project – to which the data centre design has been donated – and environmental change.

The 130,000 square foot facility, built on a 15 acre campus, has a power usage effectiveness (PUE) rating of 1.15, which Digital Realty SVP sales and marketing Matt Miszewski noted was “almost unheard of in commercially available multi-tenant data centres.” The average data centre PUE is nearer 1.7, and Rackspace admitted their original target was 1.25.
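
PUE is simply the total power drawn by a facility divided by the power delivered to its IT equipment, so a figure of 1.15 means only 0.15W of overhead (cooling, lighting, power distribution) for every watt of compute. A quick worked illustration, using hypothetical loads, is sketched below:

    def pue(total_facility_kw: float, it_load_kw: float) -> float:
        """Power usage effectiveness: total facility power / IT equipment power."""
        return total_facility_kw / it_load_kw

    print(pue(11_500, 10_000))  # 1.15 -> 1,500 kW of overhead on a 10 MW IT load
    print(pue(17_000, 10_000))  # 1.70 -> 7,000 kW of overhead for the same IT load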

Other environmentally conscious features include a sloped roof to harvest rainwater, circular lights which utilise natural light, and cooling using natural air – or, as one exec put it, taking advantage of the UK’s natural temperature.

“The progress has been incredibly impressive,” Miszewski said. “In partnership we can say this is one of the greatest data centres on the planet.”

“We’ve heavily invested here, and we appreciate the opportunity to serve the community,” added Rackspace COO Mark Roenigk.

PUE remains an important metric for assessing the environmental impact data centres create; however, it is not universally recognised. Professor Ian Bitterlin, chair of the British Computer Society’s data centre group on server lifecycle, recently told E&T: “The problem with PUE is that people confuse it with a data centre ‘goodness’ metric. It is not.” He added: “A data centre can have a PUE of 1.2 but be wasting 90% of the total energy because the utilisation is low. Improving server effectiveness is the only way to improve data centre effectiveness.”

Rackspace’s alliance with the Open Compute Project enables further energy efficiencies, with servers that carry less weight, less waste and less wattage than traditional designs.

The company anticipates that customers from around the globe will utilise the Crawley data centre, but expects most of the demand to come from the UK and Europe, where various other vendors have been building data centres to offer European customers lower latency and data sovereignty.

Rackspace was unable to detail specifics on customer movement, but CloudTech understands customers will be migrating data over to the centre by the end of May.

Disclosure: Your correspondent’s travel expenses for this story were paid for by Rackspace.

Rackspace moves managed cloud into UK Digital Realty facility

Rackspace has moved its managed cloud platform into Digital Realty’s new Sussex datacentre

Rackspace has launched its latest UK datacentre this week, moving its cloud platform into datacentre giant Digital Realty’s new facility in Crawley, West Sussex.

Digital Realty said the datacentre is among the most environmentally friendly in the UK. It delivers Power Usage Effectiveness (PUE) of 1.15 and deploys ‘indirect outside air’ cooling technology instead of mechanical cooling, which according to the company means the overhead energy required to operate the datacentre has been cut by almost 80 per cent. That figure squares with the headline PUE: overhead of 0.15W per watt of IT load at a PUE of 1.15, against roughly 0.7W at the industry-average 1.7, is a reduction of just under 80 per cent.

Rackspace said the 130,000 square foot datacentre will house its managed cloud services.

“This data centre is the epitome of intelligent 21st century infrastructure engineering.  We partnered with industry leaders to design and deliver one of the most environmentally friendly and reliable data centres in Europe. Our customers depend on us for their mission critical managed IT services and this new data centre furthers our commitment to delivering world class services to those customers,” said Mark Roenigk, chief operating officer of Rackspace.

“This is our tenth global datacentre and this expansion will enable us to grow with our customers for many years to come.  We are proud of the energy efficiency achieved with the innovative design that will become the starting point for boosting the adoption of more efficient technologies in the UK and Europe.  We are honoured to operate and provide a positive impact in the Crawley community,” Roenigk said.

The facility provides 6MW of capacity across two halls, which will double to four halls and a total of 12MW in the near future, and was built to house up to 50,000 servers. It’s also Open Compute-compliant, so it can house Open Compute Project-based rack designs, which no doubt factored into Rackspace’s decision to move into the facility; the company is an active participant in the open source hardware project.

“With the addition of the Crawley site, the Digital Realty – Rackspace collaboration, which began in 2011, has been extended to a third continent. We are delighted to see Rackspace establish its new managed cloud data centre with such outstanding eco-credentials,” said William Stein, Digital Realty’s chief executive officer.

“With competition growing for facility services across the UK and Europe, we are pleased Rackspace chose Digital Realty as a provider to collaborate on this bespoke facility in Crawley,” he added.

From Dev to Prod in the Cloud By @XebiaLabs | @DevOpsSummit [#DevOps]

As a developer, I want the pain and effort required to go from a code change to a running version of my app (which I can test) to be as minimal as possible. Luckily, there are a whole bunch of frameworks and tools that will give me an on-demand environment: Vagrant, Terraform, all the virtualization and cloud management platforms and now, of course, containers and all the related orchestration frameworks too.

But I really also don’t want to have to re-tool my environment provisioning scripts or container definitions every time I add or change some aspect of my app. And there are other use cases to consider: if I’m working in a plane, I’ll (for a little while longer, at least ;-)) want an offline version using Vagrant, Docker or so; when I want to test a bigger setup in e.g. EC2 that is a bit more like my production environment, I’ll use something like AWS OpsWorks.

Salesforce: Use of wearables in the enterprise to triple in two years

Salesforce says use of wearables in the enterprise will triple over the next couple of years – under the right conditions

Use of wearables in the enterprise will more than triple in the next two years, with smartwatches emerging as a popular candidate to deliver sales and customer service improvements, Salesforce claims.

The CRM company surveyed over 1,400 working adults, 500 of whom are wearable tech adopters, to find out how wearable technology is being used in the enterprise. Smartwatches emerged as the most impactful platform in terms of delivering improved sales or customer service experiences, or data that can generate insights for improving those processes (digital lanyards and smart glasses rank second and third, respectively).

While wearable tech is still quite a nascent segment (only a fifth of those surveyed overall use wearables for the most basic use cases), the research does suggest employees are sold on the potential of these technologies to have a material impact on their businesses.

Salesforce said 79 per cent of adopters agree wearables will be strategic to their company’s future success; 76 per cent report improvements in business performance since deploying wearables in the enterprise; and 86 per cent of adopters plan to increase their wearables spend over the next 12 months.

Just over half of adopters (54 per cent) claim their company supports a Bring Your Own Wearable (BYOW) model, while 40 per cent said their companies plan to support BYOW in the future.

“Wearables are the next phase of the mobile revolution. Like smartphones before them, the key to success for wearables in the enterprise is all about the killer business apps,” said Lindsey Irvine, global director of strategic partnerships, Salesforce. “This research demonstrates the tremendous opportunity for wearable use cases to drive significant business value.”

About 52 per cent of respondents said they use or plan to use wearables for real-time access to customer data; 49 per cent for hands-free instruction or guides to field service; and 48 per cent for access to business analytics and alerts.

But according to the research about 30 per cent of adopters cite the lack of business applications as a primary challenge in deploying wearables, and just 8 per cent of wearable adopters said they’re ready to gain actionable insights from the volume of employee and customer data generated from wearables.

Salesforce said that a rich app ecosystem will be required for enterprises to feel confident in deploying and integrating wearables with their existing IT landscape and business processes. Improvements in wearable tech will also be required – among respondents who indicated they have yet to incorporate wearables into their business plans, 25 per cent said that they’d be motivated by lower cost and 15 per cent by devices that can better multitask.

OMG targets data residency in the cloud with new working group

The OMG is forming a working group to develop practical solutions for managing data residency requirements in cloud services

The Object Management Group (OMG) has formed a new working group to study issues of documenting and controlling data across distributed cloud environments, a big inhibitor of cloud adoption for those with strict data sovereignty requirements.

OMG’s Data Residency Working Group will study how to document and control where data and online documents physically reside, and work with experts to provide practical, multi-disciplinary solutions to help organisations manage the growing gap between regulation and technology.

Richard Soley, chairman and chief executive of OMG, said the move is in response to growing uptake of cloud services, at a time when data residency and data privacy laws don’t necessarily align with the technology trend.

“There is a groundswell of concern about data residency, especially in Europe,” Soley said.

“For example, European Union Safe Harbour Principles mandate that companies outside the EU that store Personally Identifiable Information (PII) about EU residents must comply with EU data protection requirements. Many other countries have also restricted how data originating within their borders can be stored abroad.”

“The goal of the Working Group is to develop a taxonomy to help organizations realize the promise of cloud computing while complying with new and evolving privacy regulations, and user demands for data residency.”

Data residency is a huge challenge for many firms looking to use cloud services, in part because it’s difficult to satisfy regulatory requirements for keeping data in-country; data is often sharded or backed up in a range of different locations, particularly for platforms offered by some of the larger geographically distributed cloud service providers. Forming consensus around standard, practical procedures to manage data residency within the context of cloud specifically could go some way towards satisfying regulators in certain niches (e.g. financial services, healthcare) and allowing enterprises to broaden their options when it comes to their IT systems.

Seth Proctor, chief technology officer of NuoDB, a database firm that recently worked with the OMG to survey its members on their data woes, said the organisation found that nearly nine in ten respondents claimed to have data residency challenges.

“As data increasingly is accessed and shared across geographic boundaries, governmental and other regulatory agencies worldwide have begun adopting stringent laws and regulations about how data can be collected, stored, shared, and transferred,” Proctor said.

“To meet these new data protection and privacy requirements, we need consistency in how we define, discuss, and address the issue of data residency, so that we as an industry can create practical solutions.”

For those of you who may be interested in participating, the Data Residency Working Group is due to have its first meeting in Berlin, Germany on Tuesday June 16.

IBM adds second SoftLayer datacentre in the Netherlands

IBM is launching a second SoftLayer datacentre in the Netherlands

IBM has announced the launch of a SoftLayer datacentre in the Netherlands, its second in the country. The move comes the same week IBM reported cloud revenue increases of close to 75 per cent.

The company said the new datacentre, located in Almere just outside Amsterdam, will double SoftLayer capacity in the region and provide customers with more in-country options for data storage and geographically isolated services.

“This new facility demonstrates the demand and success IBM Cloud is having at delivering high-value services right to the doorstep of our clients,” said James Comfort, IBM general manager of cloud services.

“We’re reaching customers in a way that takes all the guesswork out of moving to the cloud. They can build and scale applications, run the toughest big data workloads, have the level of security they need, all in country and connected to a truly global platform,” Comfort said.

IBM has moved to rapidly expand its cloud services in the past year. The company has opened up 13 new SoftLayer datacentres in the past 10 months alone as it looks to shift its focus onto higher-margin strategic initiatives like cloud, big data and security.

That said, despite sequential quarterly revenue declines, the company recently reported its annual “as-a-service” run rate stands at $3.8bn, up $1.5bn in the last year. Cloud revenue was up over 75 per cent from last year; on a trailing 12-month basis, the company reported cloud revenue of $7.7bn, with analytics up more than 20 per cent and social more than 40 per cent.

DevOps Drives Record Growth for @CloudTest | @DevOpsSummit [#DevOps]

SOASTA, the leader in performance analytics, today reported record growth of the CloudTest community, exceeding 30,000 registered users of the CloudTest platform in Q1 2015. SOASTA also announced widespread adoption of its Web and mobile testing solutions, with more than 1,600 customers completing more than 285,000 tests using CloudTest during the quarter. This rapid growth shows that DevOps-driven digital businesses are embracing a more continuous approach to testing, and CloudTest is meeting their needs for fast and efficient global performance measurement.

Moving workloads to the cloud: A comprehensive guide

(c)iStock.com/mattjeacock

The rapid diffusion of the cloud computing paradigm and the promised benefits of adopting cloud infrastructure are attracting a growing number of businesses and organisations.

Of course, it is essential for organisations to maximise the benefits of migration to a cloud architecture by reducing costs and minimising risks. Cloud computing represents a fundamental change in how companies use and provide their services. For many small and medium-sized businesses, it represents a chance to compete in a business environment with powerful competitors.

IT managers are today inundated with countless business proposals. For this reason, I will give you some useful insights for moving workloads to the cloud.

Identify decision makers within the upper management of the enterprise and be sure of their commitment

The adoption of a cloud architecture is a process that requires a strong effort from the entire enterprise. Every function, application and piece of data has to be moved to the cloud; for this reason, it is necessary to have a strong commitment from management.

Top management is responsible for the harmonious growth of the company, and technology represents a key factor for business development today.

Managers have to establish reasonable goals for adopting the cloud computing paradigm. A migration to the cloud requires a team effort to plan, design, and execute all the activities needed to move the workloads to the new IT infrastructure. The migration process could be managed by three different teams with deep expertise in the following areas:

  • Infrastructure
  • Data and application
  • Cyber security

The divisions have to coordinate their efforts, defining the transition plan and focusing on those activities that need a joint effort.

Public or private cloud: which to choose?

Enterprises have to choose the proper cloud architecture. One of the most important decisions is related to the adoption of a public or private cloud infrastructure.

The choice depends on various factors, including the size of the enterprise and the budget reserved for the company’s IT services. A public cloud is usually offered by large, specialised companies (e.g. Amazon, Google and Microsoft) which provide cloud infrastructure at low cost, including the expenses for ordinary management of the architecture and of the hosted data.

Companies that choose a public cloud have little control over their data. Data and applications are shared among numerous businesses, with obvious repercussions for security and privacy.

In a private cloud, company data and applications are hosted in a remote data centre dedicated to a single business, giving the business much more control in terms of security, privacy and flexibility. Obviously, a private cloud is more expensive than a public one.

A third option is the turnkey cloud: pre-tested and certified software and/or hardware and storage that can be quickly deployed by private companies and cloud providers. Turnkey clouds are especially convenient for organisations that lack IT resources; they allow small enterprises to adopt standard business applications from a big cloud provider through a software as a service (SaaS) model and use a cloud data centre for services like email.

Choose the right cloud service provider

The choice of a provider requires the evaluation of a long list of options specifically related to the user’s business. The principal elements to consider for almost every company are:

Service levels: This characteristic is essential when businesses have strict needs in terms of availability, response time, capacity and support. Cloud service level agreements (cloud SLAs) are an important element in choosing the right provider and in establishing a clear contractual relationship between a cloud service customer and a cloud service provider. Particular attention has to be paid to the legal requirements for the protection of personal data hosted in the cloud service.

Support: Support is a parameter to consider carefully. It could be offered online or through a call centre, and in some cases it may be necessary to have access to a dedicated resource with explicit timing constraints.

Security: What level of security does each provider offer, and which mechanisms are in place to preserve our applications and data? These and many other questions have to be put to the cloud provider to evaluate this essential feature of the overall architecture.

Compliance: Choose the cloud architecture according to its compliance with the standards for your specific industry. Privacy, security and quality are the principal compliance areas to evaluate in this phase.
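
One way to keep this evaluation from becoming purely subjective is to weight and score each criterion per candidate provider. A minimal sketch, in which the weights, provider names and scores are all hypothetical:

    # Hypothetical weighted scoring of candidate cloud providers (1-5 scale).
    weights = {"service_levels": 0.35, "support": 0.15, "security": 0.35, "compliance": 0.15}

    providers = {
        "ProviderA": {"service_levels": 4, "support": 3, "security": 5, "compliance": 4},
        "ProviderB": {"service_levels": 5, "support": 4, "security": 3, "compliance": 3},
    }

    for name, scores in providers.items():
        total = sum(weights[c] * scores[c] for c in weights)
        print(f"{name}: {total:.2f}")  # higher is better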

Prepare a detailed business plan to move to the cloud

A business plan is necessary to define the workflow for the migration to a cloud infrastructure. The plan has to detail the resources involved in the process and the related effort. It must include the list of the services to migrate, the timeline of the operations, and the related costs on an annual basis.

In drafting the document, it is necessary to consider the company’s business needs and the requirements for the cloud provider to be chosen. The migration impacts every sector of the company, from IT staff to the legal team that will deal with new types of technology contracts, so it is necessary to prepare personnel in good time.

Map business services to cloud IT services

The cloud computing model can be implemented at different levels. It can be very useful to list all the traditional IT services used or provided by the business and map them to the corresponding cloud services.
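
As a hypothetical illustration of such a mapping (the services and target models will differ for every organisation):

    # Hypothetical mapping of traditional IT services to cloud service models.
    service_map = {
        "self-hosted email":        "SaaS (hosted email suite)",
        "CRM on in-house servers":  "SaaS (hosted CRM)",
        "custom web applications":  "PaaS (managed runtime)",
        "virtual machine estate":   "IaaS (compute instances)",
        "tape backup":              "IaaS (object storage)",
    }

    for legacy, cloud in service_map.items():
        print(f"{legacy:25} -> {cloud}")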

Assess company applications and workloads

Once traditional IT services are mapped to cloud services, it is necessary to assess applications and workloads individually. In this phase, the IT staff in charge of the migration need to determine which applications and data can be readily moved to a cloud infrastructure, which service to adopt, and which delivery model (public, private, or hybrid) meets the business needs of the company. It is good practice to start with the lowest-risk applications, which usually have minimal impact on the business continuity of the organisation.

Adopt a flexible interoperability model

Almost every application migrated to a cloud service has connections with various other applications and systems. It is crucial to evaluate the impact of the migration on these connections in advance and prevent any interruption in data flows.

The communication between applications is typically classified into three categories (a brief sketch follows the list):

  • Process integration, where an application invokes another in order to execute a specific operation.
  • Data integration, where applications share common data.
  • Presentation integration, where different applications provide computational results at the same time, mainly for the composition of a user’s dashboard.
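
A rough sketch of the first two categories, assuming a hypothetical billing service reachable over HTTP and a shared orders database (none of these names come from the article):

    import sqlite3
    import urllib.request

    # Process integration: one application invokes another to execute an operation
    # (the billing endpoint is hypothetical).
    def request_invoice(order_id: int) -> bytes:
        url = f"https://billing.example.com/invoices/{order_id}"
        with urllib.request.urlopen(url) as resp:
            return resp.read()

    # Data integration: applications share common data, here an orders table.
    def shared_orders(db_path: str = "orders.db"):
        conn = sqlite3.connect(db_path)
        return conn.execute("SELECT id, status FROM orders").fetchall()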

The migration to a cloud infrastructure must be supported by a careful review of the overall interoperability of the business. Every interaction between systems inside the company and with outside entities has to be assessed and maintained in the new cloud infrastructure.

In many cases it is not so easy to maintain the integration level and to ensure interoperability; a ‘reintegration’ of all the components subject to the migration is then necessary.

Avoid being locked into a particular cloud service supplier/vendor

One of the greatest concerns for company managers in the migration phase is being locked into a particular cloud service provider. The problem is particularly acute at the Software as a Service (SaaS) and Platform as a Service (PaaS) levels.

For upper management and IT staff, it is important to have an alternative strategy defined before the migration process starts.
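
A common mitigation is to code against a thin internal interface instead of a provider’s native API, so that a second implementation can be swapped in if the exit strategy is ever triggered. A minimal sketch (the class names are illustrative, not from the article):

    from abc import ABC, abstractmethod

    class ObjectStore(ABC):
        """Thin internal interface; application code depends only on this."""

        @abstractmethod
        def put(self, key: str, data: bytes) -> None: ...

        @abstractmethod
        def get(self, key: str) -> bytes: ...

    class LocalStore(ObjectStore):
        """In-memory fallback, usable for tests or an on-premise exit path."""

        def __init__(self) -> None:
            self._data = {}

        def put(self, key: str, data: bytes) -> None:
            self._data[key] = data

        def get(self, key: str) -> bytes:
            return self._data[key]

    # A store wrapping a specific vendor SDK would implement the same interface,
    # so changing providers means replacing one class, not the application code.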

Implement security and privacy requirements

Security and privacy are probably the most pressing concerns for enterprises that decide to adopt a cloud infrastructure. Below are just a few of the questions that every IT security manager has in mind when approaching the cloud computing paradigm.

  • Is confidential data securely stored in the cloud?
  • What are the risks related to exposure to cyber threats?
  • Can we trust the cloud service provider’s personnel?
  • What level of security is offered in the SLA?
  • What security mechanisms are in place?
  • Are we compliant with security standards? If so, which ones?

Privacy is closely related to security. A huge amount of sensitive data and personally identifiable information (PII) is stored by enterprises in cloud architectures, and it needs to be protected from intentional cyber attacks and accidental incidents.

An efficient approach to privacy and security issues is necessary to avoid the loss of business caused by incidents (e.g. a data breach) and by non-compliance with government regulations.

Companies have to consider security and privacy issues according to the needs of the industry they work in. The key security constructs against which security policies must be analysed are infrastructure, data, identity, and end-user devices.

To improve security and privacy of cloud architecture, companies that decide to move their workloads to the cloud have to:

  • Decide which data to migrate to the cloud and request the implementation of the measures necessary to ensure the integrity of the information and preserve its confidentiality. Imagine, for example, that the source code of a company’s core applications needs to be moved into the cloud; the software repository needs to be hardened against external attacks, and access to it must be regulated to prevent data leakage by insiders
  • Map company data for requesting security classification
  • Review the cloud providers’ security/privacy measures (e.g. physical security, incident notifications) and make sure that they are documented in the cloud SLA
  • Identify sensitive data
  • Define/Review the authorisation and authentication processes
  • Examine applicable regulations and carefully evaluate what needs to be done to meet them after a migration to cloud computing
  • Manage the risks of security or privacy violations, evaluating the impact on the company business for every task/activity moved to the cloud

It is crucial to understand that the migration process itself could expose company data to cyber threats and cause incidents. That is why IT staff have to consider how to secure data and applications during the transition.
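
For example, data can be encrypted client-side before it ever leaves the old environment, so that a compromised transfer channel or staging area exposes only ciphertext. A minimal sketch using the Python cryptography package (the file names are hypothetical, and key management is deliberately omitted):

    from cryptography.fernet import Fernet  # pip install cryptography

    key = Fernet.generate_key()  # in practice, keep this in a key management system
    fernet = Fernet(key)

    with open("customer_records.csv", "rb") as src:
        ciphertext = fernet.encrypt(src.read())

    with open("customer_records.csv.enc", "wb") as dst:
        dst.write(ciphertext)  # only the encrypted copy is staged for migration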

Manage the migration as a project

The migration to a cloud architecture must be formalised by IT staff and shared with the managers of the different departments inside the company. Every activity must be defined, planned and executed, and the transition itself must be managed as an articulated project. As described above, it is necessary to define a formal project plan accepted by upper management. Every activity must be tracked, and the related costs and risks must be monitored during the migration.

It could be useful to prepare a Statement of Objectives (SOO), which describes the goals that every department expects to achieve with regard to the migration of its services and applications to the cloud.

A similar document, ordinarily used in government environments, has the primary goal of preparing personnel for moving their activities to the cloud infrastructure.

The SOO could include information regarding the following activities:

  • Conducting an inventory of every asset and service of the company
  • Defining metrics to evaluate the evolution of activities during the migration to the cloud
  • Mapping applications
  • Identifying appropriate service models (e.g. SaaS, IaaS) and deployment models (e.g. private, public)
  • Developing the business case to quantify costs and benefits
  • Planning the migration

Once the migration is complete, it is necessary to verify the efficiency of procedures and services in the new environment according to the metrics defined in the SOO document. The test phase has to be conducted while limiting the impact on the strategic functions of the company and, if possible, using non-critical data.

I always suggest paying particular attention to privacy and security issues due to the rapid evolution of the security industry, which requires a dynamic approach. Security and risk assessments must be conducted continuously, in compliance with international standards.

ISO 27018 and protecting personal information in the cloud: a first year scorecard

ISO 27018 has been around for a year – but is it effective?

A year after it was published, ISO 27018 – the first international standard focusing on the protection of personal data in the public cloud – continues, unobtrusively and out of the spotlight, to move centre stage as the battle for cloud pre-eminence heats up.

At the highest level, this is a competitive field for those with the longest investment horizons and the deepest pockets – think million square foot data centres with 100,000+ servers using enough energy to power a city. According to research firm Synergy, the cloud infrastructure services market – Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and private and hybrid cloud – was worth $16bn in 2014, up 50 per cent on 2013, and is predicted to grow 30 per cent to over $21bn in 2015. Synergy estimated that the four largest players accounted for 50 per cent of this market, with Amazon at 28 per cent, Microsoft at 11 per cent, IBM at 7 per cent and Google at 5 per cent. Of these, Microsoft’s 2014 revenues almost doubled over 2013, whilst Amazon’s and IBM’s were each up by around half.

Significantly, the proportion of computing sourced from the cloud compared to on-premise is set to rise steeply: enterprise applications in the cloud accounted for one fifth of the total in 2014 and this is predicted to increase to one third by 2018.

This growth represents a huge year-on-year increase in the amount of personal data (PII, or personally identifiable information) going into the cloud and in the number of cloud customers contracting for the various and growing types of cloud services on offer. But as the cloud continues to grow at these startling rates, the biggest inhibitor to that growth – trust in the security of personal data in the cloud – continues to hog the headlines.

Under data protection law, the Cloud Service Customer (CSC) retains responsibility for ensuring that its PII processing complies with the applicable rules.  In the language of the EU Data Protection Directive, the CSC is the data controller.  In the language of ISO 27018, the CSC is either a PII principal (processing her own data) or a PII controller (processing other PII principals’ data).

Where a CSC contracts with a Cloud Service Provider (CSP), Article 17 of the EU Data Protection Directive sets out how the relationship is to be governed. The CSC must have a written agreement with the CSP; must select a CSP providing ‘sufficient guarantees’ over the technical security measures and organizational measures governing PII in the cloud service concerned; must ensure compliance with those measures; and must ensure that the CSP acts only on the CSC’s instructions.

As the pace of migration to the cloud quickens, the world of data protection law continues both to be fragmented – 100 countries have their own laws – and to move at a pace driven by the need to mediate all competing interests rather than the pace of market developments.

In this world of burgeoning cloud uptake, ISO 27018 is proving effective at bridging the gap between the dizzying pace of Cloud market development and the slow and uncertain rate of legislative change by providing CSCs with a workable degree of assurance in meeting their data protection law responsibilities.  Almost a year on from publication of the standard, Microsoft has become the first major CSP (in February 2015) to achieve ISO 27018 certification for its Microsoft Azure (IaaS/PaaS), Office 365 (PaaS/Saas) and Dynamics CRM Online (SaaS) services (verified by BSI, the British Standards Institution) and its Microsoft Intune SaaS services (verified by Bureau Veritas).

In the context of privacy and cloud services, ISO 27018 builds on other information security standards within the IS 27000 family. This layered, interlocking approach is proving supple enough in practice to deal with the increasingly wide array of cloud services. For example, it is not tied to any particular kind of cloud service and, as Microsoft’s certifications show, applies to IaaS (Azure), PaaS (Azure and Office 365) and SaaS (Office 365 and Intune). If, as shown in the graphic below, you consider computing services as a stack of layered elements ranging from networking (at the bottom of the stack) up through equipment and software to data (at the top), and that each of these elements can be carried out on premise or from the cloud (from left to right), then ISO 27018 is flexible enough to cater for all situations across the continuum.

Software as a Licence to Software as a Service: the cloud continuum

Indeed, the standard specifically states at Paragraph 5.1.1:

“Contractual agreements should clearly allocate responsibilities between the public cloud PII processor [i.e. the CSP], its sub-contractors and the cloud service customer, taking into account the type of cloud service in question (e.g. a service of an IaaS, PaaS or SaaS category of the cloud computing reference architecture).  For example, the allocation of responsibility for application layer controls may differ depending on whether the public cloud PII processor is providing a SaaS service or rather is providing a PaaS or IaaS service upon which the cloud service customer can build or layer its own applications.”

Equally, CSPs will generally not know whether their CSCs are sending PII to the cloud and, even if they do, they are unlikely to know whether or not particular data is PII. Here, another strength of ISO 27018 is that it applies regardless of whether particular data is, or is not, PII: certification simply assures the CSC that the service the CSP is providing is suitable for processing PII in relation to the performance by the CSP of its PII legal obligations.

Perhaps the biggest practical boon to the CSC however is the contractual certainty that ISO 27018 certification provides.  As more work migrates to the cloud, particularly in the enterprise space, the IT procurement functions of large customers will be following structured processes in order to meet the requirements of their business and, in certain cases, their regulators. In their requests for information, proposals and quotations from prospective CSPs, CSCs now have a range of interlocking standards including ISO 27018 to choose from in their statements of requirements for a particular Cloud procurement.  As well as short-circuiting the need for CSCs to spend time in writing up detailed specifications of their own requirements, verified compliance with these standards for the first time provides meaningful assurance and protection from risk around most aspects of cloud service provision. Organisations running competitive tenders can benchmark bidding CSPs against each other on their responses to these requirements, and then include as binding commitments the obligations to meet the requirements of the standards concerned in the contract when it is let.

In the cloud contract lifecycle, the flexibility provided by ISO 27018 certification, along with the contract and the CSP’s policy statements, goes beyond this to provide the CSC with a framework for discussing with the CSP, on an ongoing basis, the cloud PII measures taken and their adequacy.

In its first year, it is emerging that complying, and being seen to comply, with ISO 27018 is providing genuine assurance for CSCs in managing their data protection legal obligations.  This reassurance operates across the continuum of cloud services and through the procurement and contract lifecycle, regardless of whether or not any particular data is PII.  In customarily unobtrusive style, ISO 27018 is likely to go on being a ‘win’ for the standards world, cloud providers and their customers, and data protection regulators and policy makers around the world.

 

How to Prepare Your Environment for the Software Defined Networking Era

Whether it’s VMware NSX or Cisco ACI, to adopt any software defined networking solution there is a lot of backend work that needs to be done. Before you get into the weeds around specific products, take a step back. To be successful, you’re going to need to have a level of understanding about your applications you’ve never needed before. The key is to take the proper steps now to make sure you can adopt software defined networking technologies when the time comes.

 

Preparing Your Environment for the Software Defined Networking Era

https://www.youtube.com/watch?v=Y6pVmNrOnCA

If you’re interested in speaking to Nick in more detail about software defined technology, reach out!

By Nick Phelps, Principal Architect