How data classification and security issues are affecting international standards in public sector cloud

Cloud technology is rapidly becoming the new normal, replacing traditional IT solutions. The revenues of the top cloud service providers are doubling each year, at the start of a predicted period of sustained growth in cloud services. The private sector is leading this growth in workloads migrating to the cloud. Governments, however, are bringing up the rear, with under 5 percent of a given country’s public sector IT budget dedicated to cloud spending. Once the public sector tackles the blockers that are preventing uptake, spending looks likely to increase rapidly.

The classic NIST definition of the Cloud specifies Software (SaaS), Platform (PaaS) and Infrastructure (IaaS) as the main Cloud services (see figure 1 below), where each is supplied via network access on a self-service, on-demand, one-to-many, scalable and metered basis, from a private (dedicated), community (group), public (multi-tenant) or hybrid (load balancing) Cloud data centre.
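To make the continuum concrete, the sketch below models the taxonomy in code; the layer-by-layer responsibility split is a simplified assumption for illustration and is not part of the NIST definition itself.

```python
# A minimal sketch of the NIST cloud taxonomy and the management continuum
# of Figure 1. The layer-by-layer responsibility split is an illustrative
# assumption, not part of the NIST definition itself.
from enum import Enum

class ServiceModel(Enum):
    IAAS = "Infrastructure as a Service"
    PAAS = "Platform as a Service"
    SAAS = "Software as a Service"

class DeploymentModel(Enum):
    PRIVATE = "private (dedicated)"
    COMMUNITY = "community (group)"
    PUBLIC = "public (multi-tenant)"
    HYBRID = "hybrid (load balancing)"

# Stack layers, from the facility upwards. Who manages each layer shifts
# from the customer towards the provider as we move from IaaS to SaaS.
LAYERS = ["networking", "storage", "servers", "virtualisation",
          "operating system", "middleware", "runtime", "data", "application"]

PROVIDER_MANAGED = {
    ServiceModel.IAAS: {"networking", "storage", "servers", "virtualisation"},
    ServiceModel.PAAS: {"networking", "storage", "servers", "virtualisation",
                        "operating system", "middleware", "runtime"},
    ServiceModel.SAAS: set(LAYERS),  # everything is provider managed
}

def responsibility(model: ServiceModel) -> None:
    """Print who manages each layer for a given service model."""
    for layer in LAYERS:
        owner = "provider" if layer in PROVIDER_MANAGED[model] else "customer"
        print(f"{model.name:<5} {layer:<17} -> {owner}")

if __name__ == "__main__":
    for m in ServiceModel:
        responsibility(m)
        print()
```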

Figure 1: Customer Managed to Cloud Service Provider Managed: The Continuum of Cloud Services

The benefits of the cloud are real and evidenced, especially when comparing private and public clouds: public cloud economies of scale, demand diversification and multi-tenancy are estimated to reduce costs by up to ninety percent relative to an equivalent private cloud.

Equally real are the blockers to public sector cloud adoption. Studies consistently show that management of security risk sits at the centre of practical, front-line worries about cloud take-up, and that removing those blockers will be indispensable to unlocking the potential for growth. Demonstrating effective management of cloud security to and for all stakeholders is therefore central to cloud adoption by the public sector and a key driver of government cloud policy.

A number of governments have been at the forefront of developing an effective approach to cloud security management, especially the UK, which has published a full suite of documentation covering the essentials. (A list of the UK government documentation – which serves as an accessible ‘how to’ for countries that do not want to reinvent this particular wheel – is set out in the Annex to our white paper, Seeding the Public Cloud: Part II – the UK’s approach as a pathfinder for other countries.) The key elements for effective cloud security management have emerged as:

  • a transparent and published cloud security framework based on the data classification;
  • a structured and transparent approach to data classification; and
  • the use of international standards as an effective way to demonstrate compliance with the cloud security framework.

Data classification enables a cloud security framework to be developed and mapped to the different kinds of data. Here, the UK government has published a full set of cloud security principles, guidance and implementation detail dealing with the range of relevant issues, from data-in-transit protection through to security of the supply chain, personnel, service operations and consumer management. These cloud security principles have been taken up by the supplier community, and tier one providers like Amazon and Microsoft have published documentation based on them in order to assist UK public sector customers in making cloud service buying decisions consistent with the mandated requirements.

Data classification is the real key to unlocking the cloud. It allows organisations to categorise the data they hold by sensitivity and business impact in order to assess risk. The UK has recently moved to a three-tier classification model (OFFICIAL → SECRET → TOP SECRET) and has indicated that the OFFICIAL category ‘covers up to ninety percent of public sector business’, including most policy development, service delivery, legal advice, personal data, contracts, statistics, case files and administrative data. OFFICIAL data in the UK ‘must be secured against a threat model that is broadly similar to that faced by a large UK private company’, with levels of security controls that ‘are based on good, commercially available products in the same way that the best-run businesses manage their sensitive information’.
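As a rough illustration of how classification can drive control selection, the sketch below maps the three UK tiers to hypothetical control baselines; the controls listed are assumptions for illustration, not the published UK framework.

```python
# A minimal sketch of tier-based data classification. The OFFICIAL/SECRET/
# TOP SECRET tiers follow the UK model; the control baselines listed here
# are illustrative assumptions, not the published UK security framework.
from enum import IntEnum

class Tier(IntEnum):
    OFFICIAL = 1
    SECRET = 2
    TOP_SECRET = 3

# Hypothetical control baselines keyed by tier.
BASELINE_CONTROLS = {
    Tier.OFFICIAL: ["TLS in transit", "encryption at rest",
                    "role-based access control", "audit logging"],
    Tier.SECRET: ["dedicated tenancy", "hardware security modules",
                  "enhanced personnel vetting"],
    Tier.TOP_SECRET: ["air-gapped or accredited facilities only"],
}

def required_controls(tier: Tier) -> list[str]:
    """Controls accumulate: a higher tier inherits everything below it."""
    controls: list[str] = []
    for t in Tier:
        if t <= tier:
            controls.extend(BASELINE_CONTROLS[t])
    return controls

if __name__ == "__main__":
    asset = {"name": "case files", "tier": Tier.OFFICIAL}
    print(asset["name"], "->", required_controls(asset["tier"]))
```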

Compliance with the published security framework, in turn based on the data classification, can then be evidenced through procedures designed to assess and certify achievement of the cloud security standards. The UK’s cloud security guidance on standards references ISO 27001 as a standard against which to assess implementation of its cloud security principles. ISO 27001 sets out control objectives and controls for managing information security against which an organisation can be certified, audited and benchmarked. Organisations can request third-party certification assurance, and this certification can then be provided to the organisation’s customers. ISO 27001 certification is generally expected for approved providers of UK G-Cloud services.

Allowing the public sector cloud to achieve its potential will take a combination of comprehensive data classification, effective cloud security frameworks, and the pragmatic assurance provided by evidenced adherence to generally accepted international standards. These will remove the blockers on the public sector cloud, unlocking the clear benefits.

Written by Richard Kemp, Founder of Kemp IT Law

HPE launches 3PAR flash storage – building block for all-flash data centres

Hewlett Packard Enterprise (HPE) has launched new flash storage devices which it claims will bring the day of the all-flash data centre and lightning-fast cloud services closer.

The HPE 3PAR StoreServ Storage systems will be the data storage building blocks of the flash data centres of the future, it claims. When all the memory, storage and processing of data runs on flash technology, data centres will create the most competitive environment possible for cloud services, according to HPE.

HPE has also integrated 3PAR StoreServ with its new HPE StoreOnce and HPE StoreEver product lines to ensure protection and retention keep pace with demand. It is this integration which will speed the progress of modernising data centres, according to HPE, because it means that new and mixed media types can work together in the same array while maintaining performance and enterprise-class resiliency.

Earlier in November the Storage Performance Council reported that a new world record speed had been achieved by the 3PAR StoreServ 20850 all-flash array. HPE claims it produced better performance than the rival EMC VMAX 400K, but at half the price.

Among the new HPE offerings are a 3PAR Flash Acceleration system for Oracle, 3PAR Online Import software and support for 3D NAND drives.

The Flash Acceleration system could make databases perform 75% faster while enabling legacy systems like EMC VMAX to remain in place, claims HPE. This, it says, costs half the price of upgrading the legacy storage system.

3PAR Online Import software makes it easier to move off hard disk drive (HDD)-bound legacy storage, such as EMC, HDS and IBM XIV, and onto flash. Support for 3D NAND drives means that solid state drive (SSD) technology can be installed cheaply.

HPE claims customers can avoid the massive expense involved in buying pure flash systems thanks to a flash-optimised design that supports both file and block storage as well as a secondary tier of HDDs.

HPE also announced new systems to help customers as they move away from traditional backup silos in favour of integrated flash array and application data protection.

“Organisations want game-changers like flash without introducing risk,” said Manish Goel, HPE’s general manager for storage. “To meet those demands, Hewlett Packard Enterprise simplifies flash storage from the entry to enterprise.”

Microsoft Office 365: Expectations vs. Reality

There are many benefits to implementing Microsoft Office 365, including reducing capital expenditures, the ability to scale your business quickly, and simplified licensing. There have also been increased features and functionality such as Yammer, Delve and Skype for Business. Keep in mind, however, that there can be some challenges associated with Office 365 implementations. Organizations need to take the proper measures to prepare for a quality migration and the ongoing management of this critical suite of end-user productivity services.

I’ll be hosting a webinar on 11/18, with my colleagues Jay Keating and Geoff Smith, to cover strategies for migrating and supporting mobile workforces. If you’re considering implementation, I highly recommend you register. Below are some of the topics Jay, Geoff and I will be covering:

  • Office 365 capabilities and use cases
  • Microsoft Cloud IaaS (Azure) considerations
  • Hidden challenges of migrating to Office 365
  • Licensing, version control, & AD Premium & Office 365 E3 issues
  • The ugly side of post-migration user support
  • SLAs, Quality of Service, and accountability challenges
  • Security, confidentiality, and compliance pitfalls
  • Having the CFO talk – risks, benefits, costs

Register now for the upcoming webinar, “Microsoft Office 365: Expectations vs. Reality.” Bring your questions, as there will be a Q&A session at the end of the webinar!

 

By David Barter, Practice Manager, Microsoft Technologies

Six Reasons Your API Is the Windows Vista of APIs By @JustinRohrman | @CloudExpo #API #Cloud

Does your API suck? Okay, that one needs a little explanation.

If you’ve developed an API, it exposes some functionality to users. It might suck to learn: the documentation might be unclear and the function signatures counter-intuitive. It might suck to use, doing a lot of things but never quite what you need right now.

After working extensively with companies developing new API functionality, and building out demo material from publicly available APIs (starting with the thought “this should be easy…”), I have developed some opinions on the subject. Just like a restaurant that doesn’t pay attention to detail, an awkward API can have a dozen small things that add up to a big problem. Misplaced silverware, a long wait time, a slow waiter, details wrong on the order… no one of these will make you want to stand up and leave, but put together, they’ll make sure you never come back.
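To illustrate the “counter-intuitive signature” point, here is a hypothetical before-and-after; both functions are invented for the example and belong to no real library.

```python
# Hypothetical example of the "counter-intuitive signature" problem.
# Both functions are invented for illustration; neither is a real API.

# Awkward: positional booleans and a magic mode flag force callers to
# memorise the parameter order and consult the docs every time.
def fetch(resource, flag1=False, flag2=True, mode=3):
    ...

# fetch("orders", True, False, 2)   # what does this call actually do?

# Clearer: keyword-only parameters with descriptive names make the call
# site read like documentation.
def fetch_orders(*, include_archived=False, verify_tls=True, page_size=50):
    ...

# fetch_orders(include_archived=True, page_size=100)
```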


The World’s Many IoTs ‘Power Panel’ at @ThingsExpo | #IoT #M2M #BigData

The Internet of Things is clearly many things: data collection and analytics, wearables, Smart Grids and Smart Cities, the Industrial Internet, and more. Cool platforms like Arduino, Raspberry Pi, Intel’s Galileo and Edison, and a diverse world of sensors are making the IoT a great toy box for developers in all these areas.
In this Power Panel at @ThingsExpo, moderated by Conference Chair Roger Strukhoff, panelists discussed which things are the most important, which will have the most profound effect on the world, and what we should expect to see over the next couple of years.


Analysing the enterprise ‘multi-cloud’ market: IBM and Virtustream lead the way


By Neil McAvoy, CEO, CloudBestPractices.net

Although the hybrid cloud concept has become the popular idea of how the enterprise market will inch its way into adoption of cloud services, twinning internal private clouds with public resources, it will prove a short-term label, soon replaced by ‘multi-cloud’.

I agree with the general direction of travel; however, I feel we’ll soon see the hybrid cloud definition as somewhat inadequate, as we’re really dealing with a generalised evolution to a ‘multi-cloud’ environment, as SearchCloudComputing similarly describes in its article on the same topic.

Enterprise data centre transformation: Harnessing an enterprise cloud marketplace

This shift will go hand in hand with an associated evolution of enterprise data centre practices, including one aspect of multi-cloud implementation: the enterprise cloud marketplace.

In this HP article, the company says the enterprise IT organisation should evolve to become a service broker, as a foundation for organisational change, while Gartner lays out a data centre transformation framework that encompasses aspects like hybrid cloud outsourcing.

The common theme is establishing more of a brokerage operation as part of the increasing maturity of cloud service procurement.

As organisations go beyond the one-off, internet-centric apps that Amazon is ideal for and into their broader IT portfolio, they will increasingly look for tools that aid in portfolio analysis, planning and migration.

An ‘enterprise cloud marketplace’ is an ideal platform for this type of functionality. Vendors like Gravitant offer a suite of tools that manage the life-cycle of matching application design blueprints to possible cloud hosting options, conducting price comparisons and so forth. To illustrate this type of function they performed a test to determine the best enterprise cloud provider, with Virtustream and IBM leading the pack. A very interesting development, then, is that IBM has now acquired Gravitant.
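A minimal sketch of the kind of matching such a marketplace tool performs is shown below; the blueprint, provider names and prices are invented for illustration and are not Gravitant’s actual catalogue or API.

```python
# A minimal sketch of blueprint-to-provider matching and price comparison.
# Provider names, prices and the blueprint are invented for illustration;
# this is not Gravitant's actual catalogue or API.

blueprint = {"vcpus": 8, "ram_gb": 32, "region": "eu", "compliance": {"ISO27001"}}

providers = [
    {"name": "ProviderA", "vcpus": 16, "ram_gb": 64, "region": "eu",
     "compliance": {"ISO27001", "SOC2"}, "price_per_hour": 1.10},
    {"name": "ProviderB", "vcpus": 8, "ram_gb": 32, "region": "eu",
     "compliance": {"ISO27001"}, "price_per_hour": 0.85},
    {"name": "ProviderC", "vcpus": 32, "ram_gb": 128, "region": "us",
     "compliance": {"ISO27001"}, "price_per_hour": 0.60},
]

def satisfies(offer: dict, bp: dict) -> bool:
    """An offer matches if it meets capacity, region and compliance needs."""
    return (offer["vcpus"] >= bp["vcpus"]
            and offer["ram_gb"] >= bp["ram_gb"]
            and offer["region"] == bp["region"]
            and bp["compliance"] <= offer["compliance"])

matches = [p for p in providers if satisfies(p, blueprint)]
best = min(matches, key=lambda p: p["price_per_hour"])
print("Cheapest compliant option:", best["name"])  # -> ProviderB
```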

Open standards SDDC

A second aspect of multi-cloud capability is best understood in conjunction with another key ongoing trend: data centre virtualisation spreading into the telco industry, headlined by the drive towards SDN- and NFV-powered telco networks.

Pioneering telco providers like AT&T are transforming their core network systems to a Cloud-centric approach via their Domain 2.0 program, describing how it will enable IoT innovations.

The enterprise market will be able to harness and build upon this wave of innovation, applying the technologies within their own data centres as well as using the new telco services they enable. As SearchDataCenter describes, NFV offers to unify the data centre.

Industry forum the TM Forum (TMF) is pioneering the best practices that will enable other service providers to undertake this transformation, working with vendors via ‘catalyst projects’ such as this case study with Microsoft, which also offers a multi-cloud reference architecture, to define a combined multi-cloud SDN.

Open standards like TOSCA from OASIS are key to this scenario, providing the cloud standards for matching blueprints to cloud providers and orchestrating the subsequent provisioning of services.

Sadhav of IBM discusses his involvement with the standard and how it might be applied with OpenStack, the open source cloud platform, in detail in this whitepaper and in this video:

“OpenStack Heat is gaining momentum as a DevOps tool to orchestrate the creation of OpenStack cloud environments. Heat is based on a DSL describing simple orchestration of cloud objects, but lacks better representation of the middleware and the application components as well as more complex deployment and post-deployment orchestration workflows.

“The Heat community has started discussing a higher level DSL that will support not just infrastructure components. This session will present a further extended suggestion for a DSL based on the TOSCA specification, which covers broader aspects of an application behavior and deployment such as the installation, configuration management, continuous deployment, auto-healing and scaling.” 
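To make the idea concrete, here is a hypothetical, much-simplified TOSCA-style service template expressed as a Python dictionary rather than the YAML a real orchestrator would consume; the node and type names are assumptions for illustration.

```python
# A much-simplified, hypothetical TOSCA-style service template. Node and
# type names are assumptions for illustration; a real template would be
# YAML validated against the TOSCA specification.
service_template = {
    "description": "Two-tier web application",
    "node_templates": {
        "web_server": {
            "type": "tosca.nodes.WebServer",
            "requirements": [{"host": "app_vm"}],
        },
        "app_vm": {
            "type": "tosca.nodes.Compute",
            "capabilities": {"host": {"properties": {"num_cpus": 2,
                                                      "mem_size": "4 GB"}}},
        },
        "database": {
            "type": "tosca.nodes.Database",
            "requirements": [{"host": "db_vm"}],
        },
        "db_vm": {"type": "tosca.nodes.Compute"},
    },
}

def deployment_order(template: dict) -> list[str]:
    """Naively order nodes so that anything hosting another node comes first."""
    nodes = template["node_templates"]
    hosts = {req["host"] for n in nodes.values()
             for req in n.get("requirements", [])}
    return sorted(nodes, key=lambda name: 0 if name in hosts else 1)

print(deployment_order(service_template))
# -> ['app_vm', 'db_vm', 'web_server', 'database']
```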

We’re additionally seeing how these same innovations can apply to the telco SDN scenario. For example, in this blog post Cloudify describe how they implement TOSCA on their platform and can use this to configure NFV services as well.

DRaaS maturity

From an ROI and enterprise strategy point of view, the most interesting dynamic of multi-cloud capabilities is that they deliver multiple benefits in different areas. One of the first is improved business continuity capacity.

This Redmond article describes how Microsoft is building multi-cloud orchestration into Windows Server to offer DRaaS via Azure, and this presentation explains how DRaaS can be achieved using OpenStack.

The post IBM and Virtustream lead Enterprise Multi-Cloud market appeared first on Cloud Best Practices.

Equinix cleared to buy Telecity but must sell London, Amsterdam and Frankfurt facilities

The European Commission has approved the proposed acquisition of data centre operator Telecity by rival Equinix. However, to assuage anti-competition concerns, Equinix had to agree to sell off a number of data centres in Amsterdam, London and Frankfurt.

BCN reported in May that Equinix and TelecityGroup had agreed the $2.35bn takeover, in which US-based Equinix would buy all issued Telecity shares. The acquisition gives Equinix a stronger presence in the UK and would extend its footprint into new locations with identified cloud and interconnection needs, including Dublin, Helsinki, Istanbul, Milan, Stockholm and Warsaw. Equinix provides colocation services in 33 metropolitan areas worldwide; Telecity operates data centres in 12 metropolitan areas in the European Economic Area (EEA) and Turkey.

However, the activities of Equinix and Telecity overlap in the four EEA metro areas of Amsterdam, Frankfurt, London and Paris.

In a statement, the EC Commissioner in charge of competition policy, Margrethe Vestager, said the growing economic importance of cloud services makes it crucial to maintain competition between data centres. However, the deal does not necessarily stifle competition, Vestager said. “The Commission is satisfied that the commitments offered by Equinix will ensure that companies continue to have a choice for hosting their data at competitive prices,” she said.

The Commission had concerns that the concentration of data centres controlled by one vendor could lead to higher prices for colocation services in the Amsterdam, London and Frankfurt metropolitan areas. The remaining competitors in these areas would be unlikely to match the competitive pressure currently exerted by Telecity, it concluded, and new players would face significant difficulties entering the market owing to the high investment and long deployment times needed.

To address the Commission’s concerns, Equinix submitted commitments, offering to divest a number of data centres in Amsterdam, London and Frankfurt.

What Type of Gamer Are You?

Are you secretly a Guardian of Light and the Traveler after work? Do you like to fight all the monsters, and run to pick up the next God of Diablo Vania game when it releases? Or do you schedule days off of work to hunker down and dig into the big Dragon Age game when […]

The post What Type of Gamer Are You? appeared first on Parallels Blog.

IHS forecast shows SDN deployments ramping up in 2015


Software defined networking (SDN) will move from early adopters into the hands of mainstream buyers by 2017, according to the latest research note from IHS.

The findings, which appear in the latest IHS Infonetics Data Centre and Enterprise SDN Hardware and Software report, also show the market for ‘in use’ SDN Ethernet switches – ostensibly the real market for SDN, compared with the Ethernet switches that lie dormant – at $1.4 billion this year, nearly doubling from last year.

During the first half of 2015, bare metal switches comprised almost half (45%) of global in-use SDN-capable Ethernet switch revenue. Dell garners 100% of branded bare metal switch revenue, while HP has the largest share of SDN-capable – both in use and not in use – branded Ethernet switch ports.

Cliff Grossner, research director for data centre, cloud and SDN at IHS, notes the continued acceleration in the marketplace. “New SDN use cases continue to emerge, and the first half of 2015 was no exception with the establishment of the software defined enterprise WAN market,” he said. “The SD-WAN market is still small, but many startups and traditional WAN optimisation appliance vendors and network vendors have jumped in.”

SD-WAN is a natural progression from SDN, theoretically benefiting wide area networking as it allows businesses to mix and match their WAN network types, better utilising their network resources. As Cahit Akin, CEO of Mushroom Networks, points out, VoIP packets can be sent with better quality and reliability by optimising the WAN through software-configured algorithmic nodes to reduce latency and jitter.
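A toy sketch of the kind of per-class path selection this implies is shown below; the links, measurements and thresholds are invented for illustration, and a real SD-WAN controller would measure them continuously per tunnel.

```python
# A toy sketch of SD-WAN path selection for latency-sensitive traffic.
# The links, measurements and thresholds are invented for illustration.

links = [
    {"name": "MPLS",      "latency_ms": 18, "jitter_ms": 2,  "loss_pct": 0.1},
    {"name": "Broadband", "latency_ms": 35, "jitter_ms": 12, "loss_pct": 0.8},
    {"name": "LTE",       "latency_ms": 60, "jitter_ms": 25, "loss_pct": 1.5},
]

VOIP_SLA = {"latency_ms": 150, "jitter_ms": 30, "loss_pct": 1.0}

def pick_path(traffic_class: str) -> dict:
    """Prefer links that meet the VoIP SLA; rank by latency plus jitter."""
    if traffic_class == "voip":
        eligible = [l for l in links
                    if l["latency_ms"] <= VOIP_SLA["latency_ms"]
                    and l["jitter_ms"] <= VOIP_SLA["jitter_ms"]
                    and l["loss_pct"] <= VOIP_SLA["loss_pct"]]
    else:
        eligible = links  # bulk traffic can use any link
    return min(eligible, key=lambda l: l["latency_ms"] + l["jitter_ms"])

print(pick_path("voip")["name"])  # -> MPLS
print(pick_path("bulk")["name"])  # -> MPLS (lowest combined latency + jitter)
```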

In June, IHS released a note which forecast the in use software defined networking market would hit $13bn by 2019, up from 2014’s figure of $781m.

Microsoft Plans to Expand into Germany with Help from T-Systems

Microsoft has recently partnered with T-Systems, a subsidiary of Deutsche Telekom, to offer Azure cloud services in Germany. This announcement comes after news of a major expansion into the United Kingdom. Two data centers will be located in Magdeburg and Frankfurt, and will start delivering Azure, Office 365 and Dynamics CRM Online to Germany during the second half of 2016.

Microsoft CEO Satya Nadella commented: “Our new datacenter regions in Germany, operated in partnership with Deutsche Telekom, will not only spur local innovation and growth, but offer customers choice and trust in how their data is handled and where it is stored.” T-Systems will control access to customer data stored on the cloud servers. In order for Microsoft to access information stored on the servers, it will have to obtain permission from either customers or T-Systems; once permission is granted, T-Systems will supervise the activity. Alex Stüger, area vice president for Microsoft Germany, stated: “This will help us meet growing demand for Microsoft cloud services in Germany, and across Europe, by providing an innovative, scalable and consistent cloud computing platform combined with a German data trustee model.”


In addition to Stüger’s comments, Deutsche Telekom CEO Timotheus Höttges stated: “Microsoft is pioneering a new, unique solution for customers in Germany and Europe. Now, customers who want local control of their data combined with Microsoft’s cloud services have a new option, and I anticipate it will be rapidly adopted.”

The company is also investing in cloud security. The centers will contain features such as biometric scanning, smart cards and other physical access controls, while data will be protected by Secure Sockets Layer/Transport Layer Security (SSL/TLS) encryption based on German certificates and by multifactor authentication. To ensure data stays within Germany, a private network will be used to share data between the regions. Doug Hauger, general manager of Microsoft National Cloud Programs, recently wrote in a blog post: “These new regions will adhere to the same service and quality standards of all Microsoft Cloud regions. In addition, the services will feature the same industry-leading levels of security, privacy and control, compliance, and transparency that define the Microsoft trusted cloud. Microsoft is committed to innovating to meet the needs of its customers, and understands that customers and partners will only choose technology they trust.”
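As a small illustration of the transport-layer piece only, the snippet below opens an ordinary certificate-verified TLS connection in Python; the hostname is a placeholder, and this is standard library usage, not Microsoft’s or T-Systems’ actual configuration.

```python
# A small illustration of certificate-verified TLS from a client's side.
# The hostname is a placeholder; this is ordinary Python ssl usage, not
# Microsoft's or T-Systems' actual configuration.
import socket
import ssl

HOST = "example.com"  # placeholder endpoint

context = ssl.create_default_context()            # verifies certificates by default
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions

with socket.create_connection((HOST, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        print("negotiated:", tls_sock.version())
        print("peer certificate subject:", tls_sock.getpeercert()["subject"])
```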

The post Microsoft Plans to Expand into Germany with Help from T-Systems appeared first on Cloud News Daily.