
Box, IBM and Black Duck announce security offerings amid open source vulnerabilities

Two more services have been launched with the aim of shoring up the security of the cloud, as its popularity sees it becoming increasingly targeted for attack.

File sharing company Box has launched a customer-managed encryption service, KeySafe, in a bid to give clients more control over their encryption keys without sacrificing the ease of use and collaboration features of Box. Meanwhile US-based open source security vendor Black Duck has been recognised under IBM PartnerWorld’s ‘Ready for IBM Security Intelligence’ designation.

Box’s KeySafe aims to centralise sensitive content in the cloud, and promises new levels of productivity and faster business processes. Box Enterprise Key Management (EKM) uses Amazon Web Services (AWS) and a dedicated hardware security module (HSM) to protect the keys used to encrypt sensitive data. Box also has an option that integrates with the AWS Key Management Service so customers can control their encryption keys; it is intended to be simple and uses a software-based approach that does not require dedicated HSMs.
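For illustration, the customer-held key control Box describes can be sketched with the AWS Key Management Service that underpins its software-based option. The snippet below is not Box’s implementation; the key alias, region and sample plaintext are placeholders, and it simply shows how decryption depends on a key the customer controls.

```python
# Illustrative sketch of customer-controlled encryption via AWS KMS
# (boto3). This is not Box's KeySafe code; the key alias, region and
# sample plaintext are placeholders.
import boto3

kms = boto3.client("kms", region_name="eu-west-1")

# Encrypt a per-file content key under the customer-managed KMS key.
encrypted = kms.encrypt(
    KeyId="alias/customer-managed-key",          # hypothetical alias
    Plaintext=b"per-file content encryption key",
)["CiphertextBlob"]

# Decryption only works while the customer keeps the key enabled and
# grants access -- the lever that customer-managed key schemes rely on.
recovered = kms.decrypt(CiphertextBlob=encrypted)["Plaintext"]
assert recovered == b"per-file content encryption key"
```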

Box says it can never access a customer’s encryption keys, which the customer owns. The main selling points of KeySafe, in addition to this independent key control, are unchangeable usage policies and audit logs, and a ‘frictionless end user experience’. Pricing will be based on the size of the deployment.

In another security announcement, Black Duck’s new offering through IBM follows research findings that 95% of mission-critical apps now contain open source components and that 98% of companies are using open source software they don’t know about. With 4,000 new open source vulnerabilities reported every year, Black Duck claims that cloud computing is creating greater exposure to these flaws.

IBM has announced that Black Duck Hub has been validated to integrate with IBM Security AppScan in order to identify and manage application security risks in custom-developed and open source code. The hub now provides a clearer view within IBM Security AppScan, which should help teams spot problems more quickly. Black Duck Hub identifies and logs the open source used in applications and containers and maps any known security vulnerabilities by comparing that inventory against data from the National Vulnerability Database (NVD) and VulnDB.
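The matching step Black Duck describes, checking an inventory of open source components against the NVD, can be approximated with a simple lookup against the NVD’s public REST API. The sketch below is not Black Duck Hub’s code; the endpoint version and example components are assumptions for illustration.

```python
# Rough sketch of checking open source components against the NVD's
# public REST API (version 2.0). Not Black Duck Hub's implementation;
# the example components are hypothetical inventory entries.
import requests

NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def known_cves(component: str, limit: int = 20) -> list[str]:
    """Return CVE identifiers whose records mention the given component."""
    resp = requests.get(
        NVD_API,
        params={"keywordSearch": component, "resultsPerPage": limit},
        timeout=30,
    )
    resp.raise_for_status()
    return [item["cve"]["id"] for item in resp.json().get("vulnerabilities", [])]

# Scan a (hypothetical) inventory extracted from a build manifest.
for component in ["openssl heartbeat", "commons-collections deserialization"]:
    print(component, "->", known_cves(component)[:5])
```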

“It’s not uncommon for open source software to make up 50 per cent of a large organisation’s code base. By integrating Black Duck Hub with AppScan, IBM customers will gain visibility into and control of the open source they’re using,” said Black Duck CEO Louis Shipley.

Skyhigh, Check Point claim cloud security simplification

Cloud access security broker Skyhigh Networks and security vendor Check Point claim they’ve jointly made security, compliance and governance policies for cloud services a lot easier to manage.

The initial launch of their combined service is aimed at governing software-, platform- and infrastructure-as-a-service (SaaS, PaaS and IaaS) offerings.

The integration of their security offerings means that mutual customers can use Skyhigh’s cloud access security broker (CASB) and Check Point’s firewall more effectively while taking less time to set up and enforce internal policies. The idea is to alleviate the work of enterprise security managers as they try to comply with external regulations and protect corporate data.

Meanwhile Skyhigh is offering a free cloud audit, claiming that all-time-high cloud adoption has not been matched by cloud security standards. According to the Q4 2015 Skyhigh Cloud Adoption and Risk Report, the average company uses 1,154 cloud services and uploads over 5.6 TB to file sharing services each month. However, this vast migration of data to the cloud is creating a security gulf, it claims, because the rush to cut costs has seen companies lose visibility and control over their IT estate.

The combined Skyhigh Check Point service promises to shed more light on the state of the network, enforce data loss prevention (DLP) policies, protect company data, consolidate usage of cloud services, identify risky uploads to or downloads from questionable service providers, and protect against data exfiltration attempts. By applying threat intelligence to analyse cloud traffic patterns, detect anomalous behaviour and remediate against risky users or cloud services, the two partners claim they can restore the level of security enterprises need while making it easier to implement.
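The anomaly detection described above can be illustrated, in very simplified form, with a statistical baseline over upload volumes. This generic sketch is not Skyhigh’s or Check Point’s analytics; the threshold and the sample traffic figures are arbitrary.

```python
# Very simplified illustration of flagging anomalous upload volumes
# against a historical baseline. Not Skyhigh's or Check Point's
# analytics; the threshold and the sample figures are arbitrary.
from statistics import mean, stdev

def is_anomalous(history: list[float], latest: float, threshold: float = 3.0) -> bool:
    """Flag the latest daily upload volume if it deviates strongly from the baseline."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > threshold

baseline_mb = [120, 135, 110, 140, 125, 130, 118]   # a user's normal uploads (MB/day)
print(is_anomalous(baseline_mb, 9500))  # True: possible exfiltration attempt
print(is_anomalous(baseline_mb, 128))   # False: normal traffic
```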

“Companies want to embrace cloud services, but they can’t leave behind security controls as corporate data moves off-premises,” said Chris Cesio, business development VP at Skyhigh Networks.

MapR gets converged data platform patented

California-based open source big data specialist MapR Technologies has been granted patent protection for its technique for converging open source processing, enterprise storage, NoSQL and event streams in a single platform.

The United States Patent and Trademark Office recognised the degree of differentiation in the Hadoop specialist’s work within the free, Java-based Hadoop programming framework. Though the platform is derived from technology created by the open source-oriented Apache Software Foundation, the patent office judged that MapR’s performance, data protection, disaster recovery and multi-tenancy features merit a recognisable level of differentiation.

The key components of the patent claims include a design based on containers: self-contained, autonomous units with their own operating system and application software. Containers can ring-fence data against loss, optimise replication techniques and create a system that can tolerate multiple node failures in a cluster.

Other vital components of the system are transactional read-write-update semantics with cluster-wide consistency, recovery techniques and update techniques. The recovery features can reconcile divergence of replicated data after a node failure, even while transactional updates continue to arrive. The update techniques allow for extreme variations in performance and scale while supporting familiar application programming interfaces (APIs).

MapR claims its Converged Data Platform allows clients to innovate with open source, provides a foundation for analytics by converging all their data, delivers enterprise-grade reliability in one open source platform and makes instant, continuous data processing possible.

It is the differentiated core combined with standard APIs that makes the platform stand out from other Apache-based projects, MapR claims, while its ability to handle converged workloads on a single cluster makes it easier to manage and secure.

“The patent details how our platform gives us an advantage in the big data market. Some of the most demanding enterprises in the world are solving their business challenges using MapR,” said Anil Gadre, MapR Technologies’ senior VP of product management.

IBM launches 26 new cloud services for data scientists

IBM is launching 26 new services on its IBM Cloud, which it describes as a ‘sweeping portfolio for data scientists and app developers’. The new offering includes 150 publicly available datasets.

The new initiative aims to help developers build and manage applications and to help data scientists read events in the cloud more intuitively. The hybrid cloud service spans multiple cloud providers and uses open systems which, IBM says, will create a ready flow of data across different services.

The new cloud offerings will provide self-service data preparation, migration and integration, IBM claims, with users given tools for advanced data exploration and modelling. The four main pillars of the new portfolio are Compose Enterprise, Graph, Predictive Analytics and Analytics Exchange.

IBM Compose Enterprise is a managed platform that aims to help developers build web-scale apps faster by giving them access to resources such as open source databases and their own dedicated cloud servers. Graph is a managed graph database service built on Apache TinkerPop, with a stack of business-ready apps for uses such as real-time recommendations, fraud detection, IoT and network analysis. Predictive Analytics promises developers easy, self-built machine learning models drawn from a library of predictive apps generally used by data scientists. Analytics Exchange contains the catalogue of 150 publicly available datasets.

Apache TinkerPop and its Gremlin graph traversal language will be the primary interface to IBM’s Graph service. IBM has previously pushed for TinkerPop to join the Apache Software Foundation. In September, BCN reported that IBM was to open a San Francisco facility with resources dedicated to the Apache Spark processing technology, as the vendor seeks to get Spark users interested in IBM’s Watson developer cloud.
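Since Gremlin is the primary interface, queries against a TinkerPop-enabled service look broadly like the sketch below, written with the gremlinpython driver. The endpoint URL and the schema (person vertices, ‘bought’ edges) are invented for illustration and are not IBM’s actual service details.

```python
# Sketch of a Gremlin traversal against a TinkerPop-enabled graph
# service using the gremlinpython driver. The endpoint and the schema
# (person vertices, 'bought' edges) are hypothetical.
from gremlin_python.process.anonymous_traversal import traversal
from gremlin_python.process.traversal import P
from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection

conn = DriverRemoteConnection("wss://graph.example.com/gremlin", "g")
g = traversal().withRemote(conn)

# Recommendation-style query: items bought by people who bought the
# same things as 'alice', excluding what she already owns.
recommendations = (
    g.V().has("person", "name", "alice")
     .out("bought").aggregate("owned")
     .in_("bought").out("bought")
     .where(P.without("owned"))
     .values("title").dedup().toList()
)
print(recommendations)
conn.close()
```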

Data handlers are currently handicapped by having to use disparate systems for different data needs, IBM claims. “Our goal is to move data into a one-stop shop,” said Derek Schoettle, general manager of Analytics Platform and Cloud Data Services.

$19 billion Western Digital acquisition of SanDisk gets EC approval

The European Commission has announced its approval of the proposed takeover of storage vendor SanDisk by Western Digital. The merger of the two US-based storage rivals will not adversely affect competition in Europe, the EC has ruled.

In October 2015 BCN reported that Western Digital had announced plans to buy chip maker SanDisk for around $19 billion. Flash specialist SanDisk is ranked by IDC as the largest manufacturer of NAND flash memory chips. NAND flash’s ability to store data in a small footprint while using less power and providing faster access to data has made it the storage technology of choice in the data centres that support cloud computing.

The market for NAND flash chips was worth $28.9 billion in 2014, according to IDC.

The Commission found that the only overlap between the two manufacturers’ activities is in selling flash memory storage systems and solid-state drives to the enterprise market. In this segment, it ruled, the effects of the merger on competition will be minimal despite the pair’s relatively high combined market share: the presence of Intel, Toshiba, Micron and Samsung in the same market will exert sufficient competitive pressure to prevent a Western Digital hegemony.

The Commission also investigated the vertical link between SanDisk’s production of flash memory and the downstream markets for enterprise flash memory storage systems. With flash memory an essential component of solid-state drives and other flash storage systems, the EC investigated whether Western Digital would be in a position to block competitors’ access to flash memory.

It also studied the likelihood that competing producers of flash memory might find themselves with an unsustainable customer base. However, SanDisk’s presence in the upstream flash memory market was judged ‘limited’, and the presence of several active competitors makes this a manageable risk.

“This multi-billion dollar deal can go ahead without delay,” said competition policy commissioner Margrethe Vestager.

Microsoft creates Azure hub for Internet of Things

Microsoft has made its new Azure IoT Hub generally available. In a statement, it claims the new system will be a simple bridge between its customers’ devices and their systems in the cloud, and that the preconfigured offering, when used with the Azure IoT Suite, can be used to create a machine-to-machine network and a storage system for its data in minutes.

The new Azure IoT Hub promises ‘secure, reliable two-way communication from device to cloud and cloud to device’. It uses open protocols widely adopted in machine-to-machine technology, such as MQTT, HTTPS and AMQPS. Microsoft claims the IoT Hub will integrate easily with other Azure services such as Azure Machine Learning and Azure Stream Analytics. The Machine Learning service uses algorithms to spot patterns (such as unusual activity, hacking attempts or commercial trends) that might be useful to data scientists, while Azure Stream Analytics allows data scientists and decision makers to act on those insights in real time, through a system with the capacity to simultaneously monitor millions of devices and take automatic action.
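In practice most devices talk to the hub through Microsoft’s device SDKs rather than raw protocol code. The sketch below uses the azure-iot-device Python package, which speaks MQTT to the hub by default; the hub name, device ID and key are placeholders taken from a hypothetical device registration.

```python
# Sketch of a device-to-cloud message using Microsoft's azure-iot-device
# SDK, which talks MQTT to the IoT Hub by default. The hub name, device
# ID and shared access key are placeholders.
import json
from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = (
    "HostName=my-hub.azure-devices.net;"   # hypothetical hub
    "DeviceId=thermostat-01;"              # hypothetical device
    "SharedAccessKey=<device key>"
)

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

# Telemetry lands on the hub's event stream, where services such as
# Azure Stream Analytics or Machine Learning can pick it up.
client.send_message(Message(json.dumps({"temperature": 21.5, "unit": "C"})))

client.disconnect()
```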

Microsoft launched the Azure IoT Suite in September 2015 with a pledge to guarantee standards through its Certified for IoT programme, promising to verify partners that work with operating systems such as Linux, mbed, RTOS and Windows. Microsoft claims its initial backers were Arduino, Beagleboard, Freescale, Intel, Raspberry Pi, Samsung and Texas Instruments. In the three months since the IoT Suite’s launch it has added ‘nearly 30’ more partners, it claims, notably Advantech, Dell, HPE, and Libelium.

“IoT is poised for dramatic growth in 2016 and we can’t wait to see what our customers and partners will continue to build on our offerings. We’re just getting started,” wrote blog author Sam George, Microsoft’s partner director for Azure IoT.

Privacy Shield data agreement dismissed as ‘reheated Safe Harbour’

The new framework for transatlantic data flows proposed by the European Commission has had a mixed reaction from the cloud industry.

The EU-US Privacy Shield agreement on data transfer replaces the 15-year-old arrangement that was voided by the Court of Justice of the European Union in October. The new arrangement still has to win approval from all 28 member states of the European Union. If it does, both sides will finalise the details of the new pact within the next fortnight and the agreement could come into effect in April.

The foundation of the agreement is that American intelligence agencies will no longer have indiscriminate access to Europeans’ data when it is stored in the US. EC Commissioner Věra Jourová claimed that Europeans can now be sure their personal data is fully protected and that the EC will closely monitor the new arrangement to make sure it keeps delivering.

“For the first time ever, the United States has given the EU binding assurances that the access of public authorities for national security purposes will be subject to clear limitations, safeguards and oversight mechanisms,” said Jourová, who promised that EU citizens will benefit from redress if violations occur. “The US has assured that it does not conduct mass or indiscriminate surveillance of Europeans,” said Jourová.

Whether the decision really will build a Digital Single Market in the EU, a trusted environment and closer partnership with the US remains a moot point among cloud industry experts.

Approval of the arrangement cannot be taken for granted, according to a spokesperson for The Greens and the European Free Alliance. “This new framework amounts to little more than a reheated serving of the pre-existing Safe Harbour decision. The EU Commission’s proposal is an affront to the European Court of Justice, which deemed Safe Harbour illegal, as well as to citizens across Europe, whose rights are undermined by the decision,” said Green home affairs and data protection spokesperson Jan Philipp Albrecht. The proposal creates no legally binding improvements, Albrecht said, and the authorities must make clear that this ‘legally dubious declaration’ will not stand.

The EU-US data sharing deal won’t stop surveillance, according to former White House security advisor French Caldwell. As a Gartner research VP, Caldwell once advised on national and cyber security and led the first ever cyber wargame, Digital Pearl Harbor. Now chief evangelist at software vendor MetricStream, Caldwell said there were many flaws in the logic of the agreement.

“The legal definitions of personal data are so antiquated that, even if that data covered under privacy law is protected, there is still so much data around people’s movements and online activities that an entire behavioural profile can be built without accessing that which is considered legally protected,” said Caldwell.

Privacy protections have evolved significantly in the US, Caldwell said, and US authorities are much more aggressive than EU authorities in penalising companies that don’t follow privacy policies. “It is hard to discount nationalism and trade protectionism as underlying motivations [for European legislation],” said Caldwell.

It should alarm cloud customers to see how little has been done to assure their privacy, said Richard Davies, CEO of UK-based ElasticHosts. “This gives little assurance to EU customers trusting a US provider with hosting their websites or sensitive data.” Customers whose servers sit with US companies in the EU are likely to move their data to non-US providers to minimise risk, Davies said.

Businesses will need to be much more involved with where their information lives and how it is stored. Until details of the new Privacy Shield emerge, many European companies won’t want to risk putting data on US servers, warned Ian Wood, senior director of global solutions.

However, this could be a business opportunity for the cloud industry to come up with a solution, according to one commentator. The need for transparency and accountability calls for new data management skills, according to Richard Shaw, senior director of field technical operations at converged data platform provider MapR.

“Meeting EU data privacy standards is challenging at the best of times, let alone when the goal posts are constantly being moved,” said Shaw. The only way to give the US authorities the information they demand, while complying with regulations, is to automate governance processes around management, control and analysis of data, Shaw said.

Would the Privacy Shield and the attendant levels of new management affect performance?

Dave Allen, general counsel at internet performance specialist Dyn, said regional data centres are a start, but that a data-residency perspective alone is incomplete at best and gives a false sense of confidence that the myriad of regulations has been properly addressed.

“Businesses will now need to understand the precise paths that their data travels down, which will be a more complex problem given the amount of cross-border routing of data across several sovereign states. Having access to traffic patterns in real time, along with geo-location information, provides a much more complete solution to the challenges posed by the EU-US Privacy Shield framework,” said Allen.

Hitachi launches Hyper Scalable Platform with in-built Pentaho

Hitachi Data Systems (HDS) has launched a rapid-assembly data centre infrastructure product that comes with a ready-mixed enterprise big data system built in.

The HDS Hyper Scalable Platform (HSP) is an infrastructure building block that comes with computing, virtualisation and storage pre-configured, so that modules can be snapped together quickly without any need to integrate three different systems. HDS has taken the integration a stage further by embedding the big data technology it acquired when it bought Pentaho in 2015. As a consequence, the new HSP 400 is a simple-to-install but sophisticated system for building enterprise big data platforms fast, HDS claims.

HDS claims that the HSP’s software-defined architecture centralises the processing and management of large datasets and supports a pay-as-you-grow model. The systems can be supplied pre-configured, which means installing and supporting production workloads can take hours where comparable systems can take months. The order of the day, says HDS, is to make it simple for clients to create elastic data lakes by bringing all their data together and integrating it in preparation for advanced analytics.

The system’s virtualised environments can work with open source big data frameworks, such as Apache Hadoop, Apache Spark and commercial open source stacks like the Hortonworks Data Platform (HDP).
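The kind of data-lake workload such a platform is meant to host might look like the Spark sketch below. The HDFS paths and column names are hypothetical, and nothing here is specific to HSP or Pentaho.

```python
# Illustrative Spark job of the sort a converged data-lake platform is
# built to host: read raw events from HDFS and aggregate them. The
# paths and column names are hypothetical; nothing here is specific to
# HSP or Pentaho.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("data-lake-aggregation").getOrCreate()

events = spark.read.json("hdfs:///datalake/raw/events/2016/02/*.json")

daily_totals = (
    events.groupBy("customer_id", F.to_date("timestamp").alias("day"))
          .agg(F.count("*").alias("events"),
               F.sum("bytes_transferred").alias("bytes"))
)

daily_totals.write.mode("overwrite").parquet("hdfs:///datalake/curated/daily_totals")
spark.stop()
```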

Few enterprises have the internal expertise to run analytics on complex big data sources in production environments, according to Nik Rouda, senior analyst at the Enterprise Strategy Group (ESG). Most want to avoid experimenting with still-nascent technologies and want a clear direction without risk and complexity. “HSP addresses the primary adoption barriers to big data,” said Rouda.

Hitachi will offer HSP in two configurations: one with Serial Attached SCSI (SAS) disk drives, generally available now, and an all-flash version expected to ship in mid-2016. These will support all enterprise applications and performance eventualities, HDS claims.

“Our enterprise customers say data silos and complexity are major pain points,” said Sean Moser, senior VP at HDS. “We have solved these problems for them.”

Survey reveals support for OpenStack but fears over hidden costs

Almost all IT professionals want to adopt OpenStack but fear the hidden costs, according to a new study by SUSE Linux.

Positive sentiment could evaporate in the face of challenges such as difficult installation, skills shortages and the fear of vendor lock-ins, the report has warned.

The study was commissioned by enterprise Linux, cloud and storage infrastructure provider SUSE. Researcher Dynamic Markets interviewed 813 senior IT professionals in the US, Canada, Germany, France, Italy and the Nordics, along with 110 from the UK. According to SUSE, 80% of the UK group said they are planning to adopt or have already moved to OpenStack private cloud. But there is serious concern about the aforementioned private cloud installation challenges and possible vendor lock-in.

Though 88% of companies said they have a private cloud at work, an even higher percentage (96%) said they would use a cloud solution for business-critical workloads. Almost as many, 94%, said they see infrastructure-as-a-service as the future of the data centre.

However, many respondents confessed that the practicalities of OpenStack might get in the way and gave a series of responses that indicate there may be a high degree of difficulty involved.

Almost half of the UK enterprises that have tried to implement an OpenStack cloud have failed, according to SUSE, and 57% said they found the implementation experience difficult. Meanwhile, another 30% could be about to endure an off-putting experience, SUSE suggests, since they plan to download and install the OpenStack software themselves, which could exacerbate their difficulties.

Despite the open ethos of OpenStack, an alarming 91% of UK respondents are wary about falling victim to vendor lock-in when they choose a private cloud infrastructure.

Keeping control of the infrastructure will be made even harder by the difficulty of finding staff, the report said, as 89% say a lack of available talent in the market is making them reluctant to embark on a private cloud project.

The cloud may be the future, but there are clear concerns about how it should be integrated and managed, according to Mark Smith, SUSE’s senior product marketing manager. With cost the primary motivator for adopting the cloud, many IT professionals worry that there will be a price to pay later, according to SUSE.

Oracle launches a mission critical PaaS from its Slough data centre

Oracle has added new Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS) cloud offerings from its Slough data centre, which currently caters for 500 UK and global customers.

Clients from both the private and public sector are being promised tailored versions of the new services, which include Oracle’s Database, Dedicated Compute, Big Data and Exadata cloud services.

Oracle claims it is offering enterprises a mission-critical PaaS and outlined four main selling points for the new services. First, clients will be able to develop, test and launch applications much more rapidly and cheaply, it claims, although no supporting figures were given. Secondly, the new service will give companies greater flexibility without compromising their security.

Thirdly, it will use Hadoop’s open source software framework for storing data and running applications on clusters of commodity hardware. This, says Oracle, will provide massive storage for any kind of data, boost the available pool of processing power and allow the data centre to handle a far greater volume of concurrent jobs, delivered as a secure, automated service that meshes with existing enterprise data in Oracle Database. The fourth plank of the offering is instant access to a virtual computing environment for running large-scale applications on the Oracle Cloud.
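As a rough illustration of the storage model Hadoop provides, the sketch below lands newline-delimited JSON in a Hadoop cluster over WebHDFS using the third-party hdfs Python package. The namenode address, user and path are placeholders, and Oracle’s managed service would expose its own interfaces rather than raw WebHDFS.

```python
# Rough illustration of landing data in a Hadoop cluster over WebHDFS
# with the third-party 'hdfs' package. The namenode address, user and
# path are placeholders; Oracle's managed Big Data service exposes its
# own interfaces rather than raw WebHDFS.
import json
from hdfs import InsecureClient

client = InsecureClient("http://namenode.example.com:9870", user="etl")

records = [{"order_id": 1, "amount": 99.5}, {"order_id": 2, "amount": 42.0}]

# Hadoop splits the file into blocks, spreads them across the cluster's
# commodity nodes and replicates them for resilience.
with client.write("/landing/orders/2016-02-08.json", overwrite=True,
                  encoding="utf-8") as writer:
    for rec in records:
        writer.write(json.dumps(rec) + "\n")
```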

Oracle currently has 19 data centres running the Oracle Cloud around the globe. Last week it announced its intention to open a new cloud data centre in Abu Dhabi, and it will be investing in two new cloud sales centres in Amsterdam and Cairo along with new offices opening this year in Dubai, Dublin and Prague.

In December 2015, BCN reported that Oracle’s co-chief executive Safra Catz warned fiscal 2016 will be “a trough year for profitability as we move to the cloud.”

In January 2016, however, BCN reported that Oracle had announced aggressive expansion plans with a recruitment drive for junior and senior sales staff to be based in six cities across EMEA.

The cloud software giant is now actively headhunting for 1,400 new cloud sales staff to work out of sales HQs in Amsterdam, Cairo, Dubai, Dublin, Malaga and Prague.