Category Archives: News & Analysis

Cloud28+ promises to clear up the cloudy issues of compliance

Hewlett Packard Enterprise (HPE) claims its new Cloud28+ cloud service catalogue will simplify the search for compliant cloud services for European enterprises.

Cloud28+ is a community of commercial and public sector organisations aimed at expanding cloud service adoption across Europe. The Cloud28+ catalogue, on the other hand, is a centralised enterprise app store which now lists 680 cloud services from 150 members across the range of Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS) offerings. To date, 1,000 end-user organisations have pre-registered to use the catalogue.

The matchmaking Cloud28+ online catalogue, now generally available, promises a broad range of benefits for European customers. It allows customers to specify data centre locations and providers in accordance with local laws and business requirements. It helps users find cloud-native independent software vendors with whom they can partner, and it helps companies market themselves more expansively by letting them publish their own services in the catalogue. This could allow end-user organisations to turn their IT teams into ‘revenue-generating engines’, claims HPE.

The main benefit of the Cloud28+ service catalogue, HPE claims, is that it gives open access to huge numbers of enterprise cloud services. This will help cloud buyers to compare the cloud market, on functional and non-functional criteria, including price, service level agreements and certification levels.

One of the main selling points of the system is that it makes it easier to comply with increasingly strict data protection laws in the EU, according to James Kinsella, founder of Zettabox, a cloud storage and team-sharing system and the latest addition to the Cloud28+ catalogue. “It’s a logical community for Zettabox to join, as its mission is to build a cohesive and collaborative cloud environment, for Europeans by Europeans,” said Kinsella.

The Cloud28+ technology framework is based on HPE Helion OpenStack. This gives it cloud service portability and eliminates vendor lock-in, said Xavier Poisson, Hybrid IT VP at HPE. “This is an important milestone on the journey to a European Digital Single Market,” said Poisson.

The overturning of the Safe Harbour agreement in European courts had tremendous implications for cloud service providers, according to one analyst. “It certainly makes services that comply with European data privacy requirements more attractive,” said William Fellows, analyst at 451 Research.

Azure Backup gets fine tuned with speed, cache and retention improvements

Microsoft’s Azure has promised more speed, lower cache demands and better data retention among a range of improvements to its cloud backup services for enterprise data.

Azure Backup now uses a technology called Update Sequence Number (USN) Journal in Windows to track the files that have changed between consecutive backups. USN keeps track of these changes to files and directories on the volume and this helps to identify changed files quickly.

The upshot of this tweak is a faster backup time. “We’ve seen up to a 50% reduction of backup times when using this optimization,” said Giridhar Mosay, Azure’s Program Manager for Cloud and Enterprise. Individual file server backup times will vary according to the number and size of files and the directory structure, Mosay warned.
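The journal-based approach can be sketched conceptually. The real USN Journal is an NTFS feature read through Windows system calls; the names and record layout below are illustrative only. The key idea is that the backup consumes change records logged since its last checkpoint, rather than rescanning the whole volume:

```python
# Conceptual sketch of change-journal-driven incremental backup.
# The real USN Journal is an NTFS feature; this model is illustrative.
from dataclasses import dataclass

@dataclass
class ChangeRecord:
    usn: int      # monotonically increasing sequence number
    path: str     # file affected by the change
    reason: str   # e.g. "DATA_OVERWRITE", "FILE_CREATE"

def changed_files_since(journal, last_checkpoint_usn):
    """Return the unique, sorted set of paths changed after the last
    backup checkpoint, without walking the whole volume."""
    changed = {}
    for rec in journal:
        if rec.usn > last_checkpoint_usn:
            changed[rec.path] = rec.reason  # later records win
    return sorted(changed)

journal = [
    ChangeRecord(101, "/data/a.txt", "DATA_OVERWRITE"),
    ChangeRecord(102, "/data/b.txt", "FILE_CREATE"),
    ChangeRecord(103, "/data/a.txt", "DATA_EXTEND"),
]
# Only files touched after USN 101 need to be re-read by the backup.
print(changed_files_since(journal, last_checkpoint_usn=101))
```

The cost of finding changed files then scales with the number of changes rather than the number of files on the volume, which is where the speedup comes from.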

A new algorithm that computes metadata has slashed the amount of cache space needed for each Azure Backup by 66%. The standard allocation of 15% of the volume size being backed up had proved prohibitive for volumes greater than 10TB. The new algorithm makes cataloguing the file space to be backed up far more efficient, creating so much less metadata that it demands only 5% cache space, or less. Azure is accordingly reducing its cache space requirement to a third of the old level.
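The saving is easy to quantify for the 10TB case the old rule struggled with (simple arithmetic on the figures above, nothing vendor-specific):

```python
def cache_required_gb(volume_gb, fraction):
    """Cache space reserved as a fraction of the volume being backed up."""
    return volume_gb * fraction

volume_gb = 10_000                           # a 10 TB volume
old = cache_required_gb(volume_gb, 0.15)     # old 15% rule -> 1,500 GB
new = cache_required_gb(volume_gb, 0.05)     # new 5% rule  ->   500 GB
print(old, new, f"{(old - new) / old:.0%} less cache")
```

At 10TB, the old rule demanded 1.5TB of cache just to run a backup; the new rule needs a third of that, which is where the quoted 66% reduction comes from.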

Meanwhile the resilience of the system has improved as Azure Backup has increased the number of recovery points for cloud backups. This allows for flexible retention policies to meet stringent compliance requirements such as HIPAA (the federal Health Insurance Portability and Accountability Act of 1996) for large enterprises. The new maximum number of recovery points has increased from 366 to 9999.
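A rough back-of-the-envelope reading of that increase, assuming one recovery point per day (an assumption for illustration; real retention policies mix daily, weekly, monthly and yearly points):

```python
def max_retention_years(recovery_points, points_per_day=1):
    """How many years of history the recovery-point cap can cover."""
    return recovery_points / points_per_day / 365.25

old_cap = max_retention_years(366)    # old limit: barely a year of dailies
new_cap = max_retention_years(9999)   # new limit: decades of daily points
print(round(old_cap, 1), round(new_cap, 1))
```

Even with a tiered retention scheme, it is the higher cap that makes multi-year compliance retention of the HIPAA variety feasible for daily backups.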

Other tweaks include more timeouts across the various phases of the backup process to ensure that long-running jobs complete reliably. Cloud backups will also run more efficiently as a result of decoupling the cataloguing and uploading of backup data. Intermittent failures in the service that handles incremental backups have also been identified and resolved, according to Mosay. “We are continuing our journey to make Azure backup enterprise grade,” he said.

ItsOn gets $12.5 million funding to take Smart Services into LatAm and EMEA

Cloud-based mobile service provider ItsOn has raised $12.5 million in a Series D funding round led by Delta Partners Capital Limited, with follow-on investments from Verizon Ventures, Andreessen Horowitz and Tenaya Capital.

ItsOn’s technology aims to make mobile commerce a more enjoyable and secure experience through a range of services, content and apps. It currently runs its Smart Services primarily from North American data centres, but the cash injection will help it fund regional data centres as it launches into markets in South America, the Middle East, Africa and Europe.

ItsOn says it gives mobile customers better ways to buy wireless services and interact with their service providers. Its service is described as a ‘digital transformation platform for wireless operators’ that includes an integrated cloud service, on-device software and a mobile operator interface, the Service Design Center. These three components connect to IT and business systems, so operators can provide better experiences with a faster time to market for services, offers and mobile commerce growth.

Mobile operators desperately need to improve their social skills with end users and that requires a digital transformation according to Kristoff Puelinckx, co-founder at one of ItsOn’s investors, Delta Partners. Puelinckx said ItsOn’s mobile commerce platform is ‘at least five years ahead’ of every other player in this space, thanks to its engagement skills and contextual marketing for new products, services and incentives.

The ‘great digital experience’ is generally lacking among mobile operators, who rely on time-consuming and inconvenient store visits and call centre based cold callers in order to sell new services. Operators have suddenly woken up to the fact that they need to show greater transparency and more compelling service, according to Puelinckx.

Verizon Ventures started investing in ItsOn when it invented a virtual end-to-end carrier IT system, and it poured even more money in when it then created a cloud solution for OSS, BSS and user engagement, said Verizon Ventures director Ed Ruth. “It moved the mobile service market forward and we are pleased to continue investing in ItsOn,” said Ruth. The new system, he says, will help operators sell a lot more services to consumers, SMBs and IoT companies.

“There’s a rapidly growing demand for our technology as wireless service providers face increasing end-user expectations, new opportunities and new competition,” said ItsOn CEO Dr Greg Raleigh.

EMC launches new open source tech for the software defined datacentre

EMC is launching RackHD and revised versions of CoprHD and REX-Ray in its quest to be a leading open source influence on tomorrow’s software defined datacentre industry.

RackHD is hardware management and orchestration software that promises to automate functions such as the discovery, description, provisioning and programming of servers. EMC says it will speed up the installation of third-platform apps by automatically updating firmware and installing operating systems.

Meanwhile, version 2.4 of storage automator CoprHD was improved with help from Intel and Oregon State University. It can now centralise and transform storage from multiple vendors into a simple management platform and interface, EMC claims.

Version 0.3 of the storage orchestration engine REX-Ray adds storage platform support for Google Compute Engine, in addition to EMC Isilon and EMC VMAX.

These products are aimed at modern data centres with a multi-vendor mix of storage, networking and servers and an increasing use of commodity hardware as the building blocks of software defined hyperscale infrastructure. In such environments, installing low-level operating systems or updating firmware and BIOS across numerous devices is a cumbersome manual task for data centre engineers, says EMC. RackHD was created to automate and simplify these fundamental tasks across a broad range of datacentre hardware.

According to EMC, developers can use the RackHD API as a component in a larger orchestration system or create a user interface for managing hardware services regardless of the underlying hardware in place.

Intel and Oregon State University have joined EMC’s CoprHD Community as the newest contributors to the storage vendor’s open source initiative. Intel is leading a project to integrate Keystone with CoprHD, allowing the use of the Cinder API and the CoprHD API to provide block storage services.

“We discovered how difficult it was to implement any kind of automation tooling for a mix of storage systems,” said Shayne Huddleston, Director of IT Infrastructure at Oregon State University. “Collaborating with the CoprHD community will allow us to avoid vendor lock-in and support our entire infrastructure.”

Riverbed says it’ll make apps respond faster on BT’s cloud of clouds

BT is to use Riverbed’s SteelHead application accelerator in its global telecoms network to bolster its cloud of clouds strategy.

BT and Riverbed will embed the service at global business hubs in Europe, North America and Asia. Installations are to be made at any location where BT has direct links to major cloud providers and high-capacity internet breakout. The service will be globally available from early 2016 and accessible through BT’s IP Connect VPN from 198 countries and territories.

SteelHead is designed to boost application performance and optimise bandwidth use. As a result, customers should get faster responses from BT’s own cloud services and other vendors’ Software-as-a-Service (SaaS) offerings. The partnership marks the first time Riverbed technology has been installed in the core of a global telecoms network.

App acceleration and bandwidth efficiencies aside, customers using the new service will have greater control over their applications, a more commanding view of performance across the network and significantly more reliability and security from applications delivered over the internet, says BT.

The new service uses network function virtualisation (NFV) to help customers get a broader range of virtualised functions, such as application performance management and fast access to private and public clouds.

The inclusion of Riverbed helps BT tackle the performance and reliability of applications in the cloud, which have become a big issue for clients, according to Keith Langridge, VP of network services at BT Global Services. “This joint offering with Riverbed is a milestone on the journey to software-defined networks and creates an additional differentiator against our competitors,” said Langridge.

CIOs want the benefits of a hybrid enterprise without the challenges of application delivery that this complex environment creates, according to Paul O’Farrell, General Manager for SteelHead at Riverbed. “Riverbed invented WAN optimization in 2004 with SteelHead and now it’s the leader in application performance infrastructure,” said O’Farrell, “we’re offering an easier on-ramp to cloud computing with BT’s Cloud Connect service.”

IBM acquires Clearleap’s cloud based video

IBM says it has acquired cloud-based video service provider Clearleap in a bid to make video a strategic source of data on any device at any time.

Clearleap’s video services will be offered through IBM Cloud data centres around the world, which will give clients global 24×7 service and technical support for problem identification and resolution. Clients using the service can now share data and content across geographies and hybrid clouds. IBM will offer the Clearleap APIs on IBM Bluemix in 2016 so clients can build new video offerings quickly and easily.

IBM says Clearleap’s open API framework makes it easy to build video into applications and adapt it to specific business needs like custom workflows and advanced analytics. The framework also means that it works with many third-party applications that customers may already have.

In addition, the Clearleap platform includes subscription and monetization services and data centres from which to host digital video assets. This means IBM customers can pass the multi-screen video experience on to their own clients.

Clearleap will be integrated into the IBM Cloud platform to make it easy for clients to make money from user video experiences. IBM says this is part of its broader strategy to help clients realise the value of video as it becomes increasingly important in business.

With businesses increasingly using video for CEO webcasts, conference keynotes, customer care and how-to videos, a secure, scalable and open cloud-based system for managing these services has become a priority, says IBM.

Clearleap’s ability to instantly ramp up capacity has won it clients such as HBO, A+E Networks, the NFL, BBC America, Sony Movie Channel, Time Warner Cable and Verizon Communications. Clearleap is headquartered in Atlanta and has data centres in Atlanta, Las Vegas, Frankfurt, and Amsterdam.

“Clearleap joins IBM as visual communications are exploding across every industry,” said Robert LeBlanc, Senior VP of IBM Cloud, “clients want content delivered quickly and economically to any device in the most natural way.”

Meanwhile, in a move that will support the delivery of video services over the cloud, IBM announced a new system that lets developers create apps that tap into vast amounts of unstructured data.

IBM Object Storage, now available on Bluemix, promises simple and secure storage and access functions. According to IBM, 80% of the 2.5 billion gigabytes of data created every day is unstructured content, with most of it video.

Deutsche Telekom launches pan-European public cloud on Cisco platform

Deutsche Telekom has announced the start of a new pan-European public cloud service aimed at businesses of all sizes. The debut offering will be DSI Intercloud, run by T-Systems, which will offer Infrastructure-as-a-Service (IaaS) to businesses across Europe. Software-as-a-Service (SaaS) and Platform-as-a-Service (PaaS) offerings will follow in the first half of 2016.

The service, built on a Cisco platform by T-Systems, the business division of Deutsche Telekom, will run from German data centres and be subject to Germany’s data sovereignty regulations.

The pay-as-you-go cloud services can be ordered through Telekom’s new cloud portal, with no minimum purchase requirements or contract periods. Prices start at €0.05 per hour for computing resources and €0.02 per gigabyte for storage. Deutsche Telekom said it hopes to create the foundation for a secure European Internet of Things with high availability and scalability for real time analytics.
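At those list prices a rough monthly bill is straightforward to estimate. The workload below is hypothetical; only the two unit prices come from the announcement:

```python
COMPUTE_EUR_PER_HOUR = 0.05   # announced compute price
STORAGE_EUR_PER_GB = 0.02     # announced storage price

def monthly_cost(vm_hours, storage_gb):
    """Pay-as-you-go estimate: billed compute hours plus stored gigabytes."""
    return vm_hours * COMPUTE_EUR_PER_HOUR + storage_gb * STORAGE_EUR_PER_GB

# one instance running all month (roughly 730 hours) with 500 GB stored
print(f"EUR {monthly_cost(730, 500):.2f}")
```

With no minimum commitment, a workload that runs for only a few hours a day scales the compute term down proportionally, which is the appeal of the model for smaller businesses.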

Data security company Covata test piloted the platform and will be the first customer to use the DSI Intercloud infrastructure service. Another beta tester was communications company Unify, which used it to investigate the viability of open source cloud platforms running from German data centres.

The new DSI Intercloud marks the latest chapter in the Cisco Intercloud initiative. In June BCN reported how Cisco had bolstered the Intercloud, which it launched in 2014, with 35 partnerships as it aimed to simplify hybrid clouds. Cisco and Deutsche Telekom say they will focus on delivering high availability and scalability for real-time analytics at the edge of the network, in order to cater for IoT experiences. Big data analytics at the network edge is set to become a key concept in the IoT, BCN reported in December. Last week Hewlett Packard Enterprise (HPE) revealed how it is helping IoT system users to decentralise their processing jobs and devolve decision making to local areas. The rationale is to keep masses of data off the networks and deal with it locally.

Deutsche Telekom said the Cisco partnership is an important building block in expanding its cloud business, and it aims to at least double its cloud revenue by the end of 2018. In fiscal year 2014, net sales of cloud solutions at T-Systems increased by double digits, mainly in secure private clouds.

Red Hat helps Medlab share supercomputer in the cloud

A cloud of bioinformatics intelligence has been harmonised by Red Hat to create ‘virtual supercomputers’ that can be shared by the eMedlab collective of research institutes.

The upshot is that researchers at institutes such as the Wellcome Trust Sanger, UCL and King’s College London can carry out much more powerful data analysis when researching cancers, cardio-vascular conditions and rare diseases.

Since 2014 hundreds of researchers across eMedlab have been able to use a high performance computer (HPC) with 6,000 cores of processing power and 6 Petabytes of storage from their own locations. However, the cloud environment now collectively created by technology partners Red Hat, Lenovo, IBM and Mellanox, along with supercomputing integrator OCF, means none of the users has to shift their data to the computer. Each of the seven institutes can configure its share of the HPC according to its needs, self-selecting the memory, processors and storage it requires.

The new HPC cloud environment uses a Red Hat Enterprise Linux OpenStack platform with Lenovo Flex hardware to create virtual HPC clusters bespoke to each individual researcher’s requirements. The system was designed and configured by OCF, working with partners Red Hat, Lenovo, Mellanox and eMedlab’s research technologists.

With the HPC hosted at a shared data centre for education and research, the cloud configuration has made it possible to run a variety of research projects concurrently. The facility, aimed solely at the biomedical research sector, changes the way data sets are shared between leading scientific institutions internationally.

The eMedLab partnership was formed in 2014 with funding from the Medical Research Council. Original members University College London, Queen Mary University of London, London School of Hygiene & Tropical Medicine, the Francis Crick Institute, the Wellcome Trust Sanger Institute and the EMBL European Bioinformatics Institute have been joined recently by King’s College London.

“Bioinformatics is a very data-intensive discipline,” says Jacky Pallas, Director of Research Platforms at University College London. “We study a lot of de-identified, anonymous human data. It’s not practical for scientists to replicate the same datasets across their own, separate physical HPC resources, so we’re creating a single store for up to 6 Petabytes of data and a shared HPC environment within which researchers can build their own virtual clusters to support their work.”

In other news, Red Hat has announced an upgrade of CloudForms offering better hybrid cloud management through added support for Microsoft Azure, advanced container management and improvements to its self-service features.

MapR claims world’s first converged data platform with Streams

Apache Hadoop system specialist MapR Technologies claims it has invented a new system to make sense of all the disjointed streams of real time information flooding into big data platforms. The new MapR Streams system will, it says, blend everything from system logs to sensors to social media feeds, whether transactional or tracking data, and manage it all under one converged platform.

Streams is described as a stream processing tool that offers real-time event handling and high scalability. When combined with other MapR offerings, it can harmonise existing storage data and NoSQL tools to create a converged data platform. This, the company says, is the first of its kind in the cloud industry.

Starting from early 2016, when the technology becomes available, cloud operators can combine Streams with MapR-FS for storage and the MapR-DB in-Hadoop NoSQL database, to build a MapR Converged Data Platform. This will liberate users from having to monitor information from streams, file storage, databases and analytics, the vendor says.

Since it can handle billions of messages per second and join clusters from separate data centres across the globe, the tool could be of particular interest to cloud operators, according to Michael Brown, CTO at comScore. “Our system analyses over 65 billion new events a day, and MapR Streams is built to ingest and process these events in real time, opening the doors to a new level of product offerings for our customers,” he said.
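To put Brown’s figure in perspective, a quick sanity check (simple arithmetic, not a benchmark) of the average ingest rate it implies:

```python
events_per_day = 65_000_000_000
seconds_per_day = 24 * 60 * 60            # 86,400 seconds in a day
avg_rate = events_per_day / seconds_per_day
print(f"{avg_rate:,.0f} events/sec on average")  # roughly 750,000/sec
```

That is a sustained average of around three-quarters of a million events per second, with real traffic peaking well above it, which explains the emphasis on ingest throughput.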

While traditional workloads are being optimised, new workloads from emerging IoT dataflows present far greater challenges that need to be solved in a fraction of the time, claims MapR. MapR Streams will help companies deal with the volume, variety and speed at which data has to be analysed while simplifying the multiple layers of hardware stacks, networking and data processing systems. Blending MapR Streams into a converged data system eliminates separate silos of data for streaming, analytics and traditional systems of record, the company claims.

MapR Streams supports standard application programming interfaces (APIs) and integrates with other popular stream processors such as Spark Streaming, Storm, Flink and Apex. When available, the MapR Converged Data Platform will be offered as a free-to-use Community Edition to encourage developers to experiment.
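The ‘converged’ idea — many disjointed feeds consumed through one interface — can be sketched in a few lines. This is an in-memory toy, not the MapR API: each source emits timestamped events, and a single merged iterator presents them in time order.

```python
import heapq

# Toy stand-in for converged event feeds: each source yields
# (timestamp, source, payload) tuples, already sorted by timestamp.
syslog  = [(1, "syslog", "disk warning"), (4, "syslog", "disk error")]
sensors = [(2, "sensor", "temp=71C"), (5, "sensor", "temp=75C")]
social  = [(3, "social", "#outage trending")]

def converged_stream(*sources):
    """Lazily merge independently sorted feeds into one time-ordered stream."""
    return heapq.merge(*sources)  # compares tuples, so timestamp wins

for ts, source, payload in converged_stream(syslog, sensors, social):
    print(ts, source, payload)
```

A consumer written against the merged stream never needs to know how many underlying feeds exist, which is the property the “one converged platform” pitch is selling.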

Microsoft goes open source on Chakra JavaScript engine

Microsoft is to make the Chakra JavaScript engine open source and will publish the code on its GitHub page next month. The rationale is to extend the functions of the code, used in the Edge and Internet Explorer 9 browsers, to a much wider role.

The open source version of the Chakra engine is to be known as ChakraCore. Announcing the changes at the JavaScript conference JSConf US in Florida, Microsoft said it intends to run ChakraCore’s development as a community project, which both Intel and AMD have expressed interest in joining. Initially the code will be Windows-only, but the rationale behind the open source strategy is to take ChakraCore across platforms, in a repeat of the exercise it pioneered with .NET.

In a statement, Gaurav Seth, Microsoft’s Principal Programme Manager, explained that as JavaScript’s role widens, so must the community of developers that supports it, and opening up the code base will help support that growth.

“Since Chakra’s inception, JavaScript has expanded from a language that primarily powered the web browser experience to a technology that supports apps in stores, server side applications, cloud based services, NoSQL databases, game engines, front-end tools and now the Internet of Things,” said Seth. Over time, Chakra evolved to fit many of these roles, which meant that, beyond throughput, it had to support native interoperability and scalability and manage resource consumption. Its interpreter played a key role in moving the technology across platform architectures, but it can only take it so far, said Seth.

“Now we’re taking the next step by giving developers a fully supported and fully open-source JavaScript engine available to embed in their projects, innovate on top of, and contribute back to ChakraCore,” said Seth. The modern JavaScript Engine must go beyond browser work and run everything from small-footprint devices for IoT applications to high-throughput, massively parallel server applications based on cloud technologies, he said.

ChakraCore already fits into any application stack that calls for speed and agility but Microsoft intends to give it greater license to become more versatile and extend beyond the Windows ecosystem, said Seth. “We are committed to bringing ChakraCore to other platforms in the future. We’d invite developers to help us in this pursuit by letting us know which other platforms they’d like to see ChakraCore supported on to help us prioritize future investments, or even by helping port it to the platform of their choice,” said Seth.