All entries by Business Cloud News

IBM and SanDisk join forces to create software defined flash storage for cloud

Flash storage maker SanDisk and IBM are working together on a new software-defined, all-flash storage system for data centres. News of the collaboration comes days after BCN revealed that EMC had introduced a new category of flash storage for the same market.

SanDisk’s new InfiniFlash, a high-capacity, high-performance flash-based software-defined storage system, features IBM’s Spectrum Scale file system. The joint product is the two manufacturers’ answer to the increasing demands on data centres, which can never get enough capacity and performance and will need more flexibility in future. An all-flash system provides the requisite computing power and the IBM-authored software definition provides the agility, according to SanDisk.

Flash is the only technology that can support the many variables of the modern hybrid cloud, according to SanDisk, which listed bi-modal IT, traditional and cloud-native applications and the increasing workload created by social, mobile and real-time processing as drivers of the need for a powerful storage infrastructure. The InfiniFlash for IBM Spectrum is described as ultra-dense and scalable, meaning that it can be bought in small increments that can be easily snapped together to quickly build a hyperscale infrastructure. SanDisk claimed it offers the lowest price per IOPS/TB on the market and the option of independent storage scaling.

SanDisk claims InfiniFlash has five times the density, fifty times the performance and four times the reliability of traditional hard disks, while using 80% less power. Pricing starts at $1 per gigabyte (GB) for an all-flash system. When used with software stacks designed to reduce data (through de-duplication and other techniques), the cost of storage could fall to around 20 cents per GB, claims SanDisk.
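As a rough illustration of the arithmetic behind those figures: a fall from $1 per GB to around 20 cents per GB implies roughly a 5:1 data-reduction ratio. That ratio is an assumption inferred from the quoted prices, not a published SanDisk specification.

```python
# Effective cost per usable gigabyte after data reduction.
# The 5:1 reduction ratio is an assumption inferred from the
# quoted $1/GB and ~20c/GB figures, not a published spec.

def effective_cost_per_gb(raw_cost_per_gb: float, reduction_ratio: float) -> float:
    """Cost per logical GB when each physical GB holds `reduction_ratio` GB of data."""
    return raw_cost_per_gb / reduction_ratio

raw_cost = 1.00  # SanDisk's quoted starting price per physical GB, all-flash
print(effective_cost_per_gb(raw_cost, 5.0))  # -> 0.2
```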

IBM Spectrum Scale, meanwhile, uses software definition to create efficiencies through file, object and integrated data analytics designed for technical computing, big data analysis, cognitive computing, the Hadoop Distributed File System, private cloud and content repositories.

Ravi Swaminathan, SanDisk’s general manager of System and Software Solutions, promised the ‘best of both worlds’ to data centres. “Customers can afford to deploy flash at petabyte-scale, which drives business growth through new services and offerings for their end-customers,” said Swaminathan.

Red Hat and Eurotech to jointly re-engineer the cloud for better IoT

Red Hat is teaming up with Italy’s Eurotech in a bid to help Internet of Things projects get bigger and more flexible without sacrificing their security.

The companies have pooled their technical resources to combat the scale, latency, reliability and security weaknesses of complex Internet of Things deployments. Their joint ambition is to obviate the need for the mass consignment of data to the cloud for real-time processing. Instead, they want to set up a more robust alternative in which essential data aggregation, transformation, integration and routing happen closer to the devices.

North Carolina-based open source champion Red Hat and Amaro-based machine-to-machine (M2M) system maker Eurotech say they have two objectives for the IoT: simplify and accelerate. They aim to combine open source cloud software and M2M platforms into a single architecture that bridges the gap between operational and information technology.

All the inherent weaknesses of the IoT – from its lack of scalability to its insecurity – can be tackled by pushing computing power to the network edge, according to the partners. This will help IoT project managers to avoid the risk of shipping masses of data to the cloud for real-time processing. With all the essential data aggregation, data transformation, integration and routing taking place locally, and less exposed to a journey across the cloud, security and performance can be tightened up.
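The edge-processing pattern the partners describe can be sketched in a few lines. This is a minimal illustration of the idea, not Eurotech’s or Red Hat’s actual framework; the sensor readings, summary fields and alert threshold are all invented.

```python
# Minimal edge-gateway sketch: aggregate raw sensor readings locally
# and forward only a compact summary to the cloud, instead of
# shipping every data point for real-time processing.
# All readings, fields and thresholds are invented for illustration.

from statistics import mean

def summarise(readings: list[float]) -> dict:
    """Reduce a window of raw readings to the fields the cloud needs."""
    return {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
        "alert": max(readings) > 100.0,  # business rule triggered at the edge
    }

window = [97.2, 98.1, 101.4, 96.8]   # invented temperature readings
summary = summarise(window)          # only this dict travels to the cloud
print(summary["alert"])              # -> True
```

Triggering the business rule locally, as in the `alert` field above, is what lets remote devices act without a round trip to the cloud.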

Another productivity dividend will come from placing the processes close to the operational devices. By devolving power away from the centre and allowing remote devices to trigger business rules the partners aim to automate greater numbers of machine processes.

The foundations of this new architecture will be Red Hat’s Enterprise Linux and JBoss Middleware along with Eurotech’s Everyware Software Framework and Everyware Cloud. These are to be integrated to provide the security, management and application support spanning the whole hierarchy of the cloud from device tier to the data centre, according to a Red Hat statement.

“Open Source and Java are important pillars in both our strategies. These factors ensure a good alignment,” said Eurotech CMO Robert Andres.

Cisco to buy cloud orchestrator CliQr for $260 million

Cisco has announced its intention to buy CliQr Technologies, a Californian start-up that specialises in making apps run faster in the new bare metal, virtualised and container environments that are becoming increasingly pivotal in cloud computing.

Under the terms of the agreement, Cisco will pay $260 million in cash and assumed equity awards, plus retention-based incentives. The acquisition is expected to close in the third quarter of 2016, subject to closing conditions. The CliQr team will join Cisco’s Insieme Business Unit, reporting to Cisco general manager Prem Jain.

Announcing the acquisition at Cisco’s Partner Summit, Cisco’s VP of Corporate Development Robert Salvagno said the new technology will help its systems integrators and service provider partners to simplify the marshalling of resources and get private, public and hybrid cloud projects running more quickly. CliQr has out-of-the-box support for all major public cloud environments.

Cisco had already integrated CliQr with its Application Centric Infrastructure (ACI) and Unified Computing System (UCS) prior to the acquisition, in a bid to improve the movement of applications between on-premise and cloud environments. Having achieved that, it now aims to integrate CliQr across its data centre portfolio, extending the ‘orchestration of services’ to cover eventualities such as bare metal computing, containerised systems and the various types of virtualisation.

CliQr simplifies management by giving customers a single system for managing the application lifecycle across hybrid IT environments. Cisco claims the system is intuitive and can simplify the most complex systems. With computing becoming a hybrid of traditional on-premise IT and services running in the cloud, many CIOs and network managers have been left behind by the new complexity, and inefficiencies and blockages have emerged that Cisco claims it can now smooth out.

Among the productivity improvements promised by CliQr is a feature that allows managers to create a single, secure application profile that can then be whizzed across any data centre, public or private cloud. Other managerial time-savers include a consistent policy-making scheme, an application optimiser for hybrid systems, one-click rollout and ‘complete visibility’ and control across applications, cloud environments and users.

OPNFV announces second major release – Brahmaputra

The Linux Foundation-inspired OPNFV Project has taken a step closer to its ideal of network liberalisation with a new release of its software.

Network Function Virtualisation (NFV), the telecoms industry’s answer to the Stock Market’s Big Bang, aims to open the market for creating software that runs the multitude of functions within any network. The OPNFV Project aims to create a carrier-grade, integrated, open source platform that uses NFV to create telecoms networks that are infinitely more flexible and adaptable than the traditional proprietary systems that locked the software within the rigid backbone of telecoms hardware.

The Project has announced the availability of a new, improved version of its original offering, code-named Arno, which Telecoms.com reported on in June 2015. The new release, Brahmaputra, offers a more comprehensive set of tools for testing NFV functionality and use cases. Brahmaputra is OPNFV’s first full experience with a massively parallel simultaneous release process and helps developers to collaborate with upstream communities. By encouraging group collaboration on feature development and addressing multiple technology components across the ecosystem, the Project aims to improve the stability, performance and automation of the system, and to consolidate its features.

The extent of the collaboration is ambitious: OPNFV aims to bring together at least 165 developers from network operators, solution providers and vendors. The focus of their joint efforts will be on integration, deployment and the testing of upstream components to meet NFV’s needs. During the integration process that created the Brahmaputra release, code was contributed by programmers in the OpenStack, OpenDaylight, OpenContrail, ONOS and ETSI developer communities. Meanwhile, 30 different projects were accepted, adding new capabilities, specifications and community resources to the system.

Among the improvements are Layer 3 VPN instantiation and configuration, initial support for IPv6 deployment and testing in IPv6 environments, better fault detection and recovery, performance boosts through data plane acceleration and much fuller infrastructure testing.

“The strength of any open source project depends on the community developing it,” said OPNFV director Heather Kirksey. “With an entire industry involved in the development of NFV, we’re seeing more collaboration, and the strides we made in Brahmaputra create a framework for even more developers to come together.”

Enterprises pay nearly a fifth more to use European cloud – report

The cloud service market in the US is much more competitively priced than in Europe, but Latin America gets the worst deals in the world, according to a new study. Europeans pay up to 19% more for the same services when they are hosted in home territory.

According to the new Cloud Price Index report from 451 Research, Americans enjoy the most competitive prices globally. On average, Europeans pay between 7% and 19% more, depending on the complexity of the application. Asia Pacific comes second from bottom in the price-performance study. However, anomalies exist and deals are available to those who shop around, says the report.
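The premium arithmetic behind those percentages is straightforward. In this sketch the percentage figures come from the report as quoted above, but the $1,000 monthly baseline bill is purely an invented example:

```python
# Apply a regional 'protection premium' to a US baseline price.
# The 7-19% and 38% figures are the 451 Research numbers quoted
# in the article; the $1,000/month US baseline is invented.

def regional_price(us_price: float, premium_pct: int) -> float:
    """Price after adding a percentage premium to the US baseline."""
    return us_price * (100 + premium_pct) / 100

baseline = 1000.0  # hypothetical monthly US hosting bill, dollars
print(regional_price(baseline, 7))   # -> 1070.0 (simple apps, Europe)
print(regional_price(baseline, 19))  # -> 1190.0 (complex apps, Europe)
print(regional_price(baseline, 38))  # -> 1380.0 (Asia Pacific / Latin America)
```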

The ‘protection premium’, the extra cost of hosting services in-country or in-region rather than using the cheaper option of US services, is not just the cost of compliance. The extra investment needed by European cloud users is the result of three pressures: the need to meet local regulations, the need to boost performance by bringing apps closer to users, and the use of local customer service.

In Europe, soaring local cloud demand, driven by data protection legislation, has created uncertainty about access and responsibility and confused cloud buyers and service providers. The net effect of issues like Safe Harbor, the Patriot Act and the new US-EU Privacy Shield agreement is that European buyers will pay more.

Don’t expect that to change for the better just yet, said Penny Jones, Senior Analyst for European Services. “It won’t be clear what the European Court of Justice thinks about the legislation until they have reviewed a case or two,” said Jones.

Cloud services are even more pricey in Asia Pacific and Latin America, according to the report. Comparable hosting in Asia Pacific and Latin America can cost 38% more than in the US. Taking average prices as a benchmark, Latin America has the most extreme variations in prices, thanks to its limited selection of hosting providers.

There is also an extreme price polarity between small and large applications in Europe. Users pay double the premium for a large application, composed of computing, storage, platforms and support, compared with simpler virtual machines. These discrepancies are the result of skills shortages and an SME market willing to pay more for support on complex applications.

The lesson is that cloud buyers must be more diligent about researching huge price variations according to 451 Research Director Dr Owen Rogers. “One provider charged more than twice the average US price for hosting in Latin America. Another offered an 11% discount for hosting in Europe compared to the US,” said Rogers.

EMC claims it can make data centres All Flash and no downtime

As EMC prepares for its takeover by Dell, it claims to have made ‘significant changes’ to its storage portfolio at its Quantum Leap event: converting its primary offering to All Flash, modernising array pricing and introducing a new category of flash storage, DSSD D5.

EMC’s flagship VMAX All Flash enterprise data services platform and its new DSSD D5 rack-scale flash system are part of a new drive to persuade data centres to use flash technology as their primary storage medium. The vendor claims that by 2020 all storage used for production applications will be flash-based, with traditional disk relegated to the role of bulk storage and archiving.

The new all-flash portfolio will be used by databases, analytics, server virtual machines and virtual desktop infrastructures, says EMC, which predicts that the need for predictable performance with sub-millisecond latencies will persuade data centres to make the extra investment. EMC’s new XtremIO is designed for high-end enterprise workloads, while VMAX All Flash will consolidate mixed block and file workloads that require up to 99.9999% availability, as well as rich data services, IBM mainframe and iSeries support and scalable storage growth up to four petabytes (PB) of capacity.

The DSSD D5 Rack-Scale Flash, meanwhile, is for the most performance-intensive, traditional and next-generation use cases, such as getting microsecond response times on Oracle and Hadoop based analytics jobs. Meanwhile, the new VNX Series arrays represent an entry level all-flash offering which starts at $25,000.

EMC announced that the VMAX array has been re-engineered to offer two new all-flash models: the EMC VMAX 450 and EMC VMAX 850. Both are designed to capitalise on the performance of flash and the economics of today’s latest large-capacity SSDs.

Finally, EMC claimed the DSSD D5 and its new Rack-Scale Flash will be a quantum leap in storage technology. EMC said the new invention will be used in high-production applications such as genetic sequencing calculations, fraud detection, credit card authorisation and advanced analytics.

EMC claims it will deliver a tenfold surge in performance levels. The storage hardware is capable of 100-microsecond latency, 100 GB/s throughput and up to 10 million IOPS in a 5U system. EMC DSSD D5 will be generally available in March 2016.
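A quick back-of-envelope check on those quoted peaks: dividing throughput by IOPS gives the implied average transfer size per operation. This assumes both peaks are achievable simultaneously and uses decimal gigabytes, neither of which EMC states explicitly.

```python
# Implied average I/O size from EMC's quoted DSSD D5 peak figures.
# Assumes decimal gigabytes and that both peaks occur together,
# which is a simplifying assumption, not an EMC statement.

throughput_bytes = 100 * 10**9   # 100 GB/s
iops = 10 * 10**6                # 10 million operations per second

avg_io_bytes = throughput_bytes / iops
print(avg_io_bytes)  # -> 10000.0, i.e. roughly 10 KB per operation
```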

EC clears acquisition of EMC by Dell – won’t distort competition

The European Commission has approved the acquisition of storage and software giant EMC by PC and server maker Dell.

In a statement Commissioner Margrethe Vestager declared that the deal meets the criteria of the EU’s Merger Regulation. The strategic importance of the data storage sector meant that the EC was able to approve Dell’s multi-billion dollar takeover of EMC within a short space of time, according to Vestager, who thanked the Federal Trade Commission for close cooperation.

The Commission assessed the effects of the transaction on the market for external enterprise storage systems. The Commission also investigated the risk that the merged entity could attempt to restrict access to VMware’s software for competing hardware vendors. The Commission is convinced there will be no adverse effects on customers, according to Vestager.

The Commission found that the merged entity has a moderate market share in the market for external enterprise storage systems and the increment brought about by the merger is small. The new Dell/EMC entity will continue to face strong competition from established players, such as Hitachi, HP, IBM and NetApp, as well as from new entrants, it said.

Despite VMware’s ‘strong market position’ in server virtualization software, the available evidence led the EC investigators to conclude that the merged entity would have neither the ability nor the incentive to shut out competitors. The likes of Citrix, Microsoft and Red Hat can give it plenty of competition in the server virtualisation market, the EC has judged, and it predicted that the EMC/Dell hybrid won’t have things its own way in new technology markets.

Since customers typically multi-source from more than one server virtualization software provider and VMware’s approach has traditionally been hardware and software-neutral, it offers work opportunities to a large number of vendors. Equally, in the server market, Dell has strong competitors that will continue to operate either in partnership with VMware or with third party virtualisation software providers.

The combination of Dell’s and EMC’s external enterprise storage systems products won’t have an impact on competition given the number of alternatives to VMware’s software.

The Commission also asked whether the merged entity could shut competitors out of the virtualization software used for converged and hyper-converged infrastructure systems. Here, too, it found no cause for concern. The merger, first reported by BCN in October 2015, was valued at $60 billion.

Most data in the cloud is exposed says Thales/Ponemon study

A new study into encryption and key management suggests that nearly all of the world’s companies are planning to make a potentially fatal security mistake.

If the new global study is an accurate gauge of global trends, 84% of companies across the world are about to commit sensitive data to the cloud by 2018. However, only 37% of the same survey sample has an encryption plan or strategy.

With consultant PwC recently declaring that cloud computing is attracting the attention of the world’s cyber criminals and fuelling a mini-boom in hacking attacks, the lack of data encryption could prove fatally negligent.

The Global Encryption Trends report, commissioned by Thales e-Security and conducted by IT security think-tank Ponemon, revealed that though the use of encryption is increasing, the security industry isn’t keeping pace with its criminal opponents. In the study, Ponemon interviewed 5,009 individuals across multiple industry sectors in 11 of the world’s top economies, including the US, the UK, Germany, France, Brazil and Japan. If that survey is an accurate reflection of the global state of cloud security, there are some worrying trends, according to Thales.

While use of encryption is on the up, with nearly three times more organisations classifying themselves as extensive users than a decade earlier, there is ‘still some way to go’, according to Thales. In 2005 a Thales study found that 16% of its global survey sample used encryption. By 2015 the proportion of encryption users had risen to 41% of those surveyed. That still means that only a minority of companies around the world are using a baseline level of cyber security, according to John Grimm, Senior Director at Thales e-Security. To make matters worse, in that time the cyber crime industry has been far more agile and fast-moving.

Other findings were that 40% of cloud data at rest is unprotected and 57% of companies don’t even know where their sensitive data resides. Sensitive data discovery ranked as the top challenge to planning and executing an encryption strategy, according to researchers.

Support for both cloud and on-premise deployment was rated the most important feature of an encryption solution, and 58% of companies said they leave their cloud provider responsible for protecting sensitive data transferred in the cloud.

Docker launches DDC to support ‘container as a service’ offering

Container company Docker has announced Docker Datacenter along with the new concept of ‘containers as a service’ in a bid to extend its cloud-based technology to customer sites.

The Docker Datacenter (DDC) resides on the customer’s premises and gives them a self-service system for building and running applications across multiple production systems while under operational controls.

It has also announced the general availability of Docker Universal Control Plane, a service that has been in beta testing since November 2015 and which underpins the containers-as-a-service (CaaS) offering.

The advantage of the DDC is that it creates a native environment for lifecycle management of Dockerized applications. Docker claims that 12 Fortune 500 companies have been beta testing the DDC, along with smaller companies in a range of industries.

Since every company has different systems, tools and processes the DDC was designed to work with whatever the clients have got and adjust to their infrastructure without making them recode their applications, explained Docker spokesman Banjot Chanana on the Docker website. Networking plugins, for example, can be massively simplified if clients use Docker to define how app containers network together. They can do this by choosing from any number of providers to provide the underlying network infrastructure, rather than have to tackle the problem themselves. Similarly, connecting to an internal storage infrastructure is a lot easier. Application programming interfaces provided by the on site ‘CaaS’ allow developers to move stats and logs in and out of logging and monitoring systems more easily.
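The approach Chanana describes, where an application asks for a provider by name and the platform supplies it, follows a familiar plugin-registry pattern. The sketch below is a generic illustration of that pattern, not Docker’s actual code; the driver names and classes are invented.

```python
# Generic plugin-registry sketch of the idea behind pluggable
# networking providers: the app requests a driver by name, so the
# underlying infrastructure can be swapped without recoding the app.
# All driver names and classes here are invented for illustration.

NETWORK_DRIVERS = {}

def register(name):
    """Class decorator that records a driver under a provider name."""
    def wrap(cls):
        NETWORK_DRIVERS[name] = cls
        return cls
    return wrap

@register("overlay")
class OverlayDriver:
    def connect(self, container: str) -> str:
        return f"{container} joined overlay network"

@register("bridge")
class BridgeDriver:
    def connect(self, container: str) -> str:
        return f"{container} joined bridge network"

def connect_container(container: str, driver: str = "bridge") -> str:
    """The application code: unchanged no matter which provider is chosen."""
    return NETWORK_DRIVERS[driver]().connect(container)

print(connect_container("web-1", driver="overlay"))
```

Because `connect_container` never names a concrete driver class, switching infrastructure is a one-word configuration change rather than an application rewrite, which is the point Chanana is making.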

“This model enables a vibrant ecosystem to grow with hundreds of partners,” said Chanana, who promised that Docker users will have much better options for their networking, storage, monitoring and workflow automation challenges.

Docker says its DDC is integrated with Docker’s commercial Universal Control Plane and Trusted Registry software. It achieved this with open source Docker projects Swarm (orchestration), Engine (container runtime), Content Trust (security) and Networking. Docker and its partner IBM provide dedicated support, product engineering teams and service level agreements.

Salesforce quarterly figures prove cloud industry resistant to IT downturn

Salesforce’s latest quarterly figures have reversed the conventional logic of valuing cloud company stock, judging by the stock market’s reaction.

Before the cloud giant’s latest figures were released, many Wall Street analysts were looking for signs of a downturn in the cloud industry, according to Reuters, which reported that the cloud software leader is regarded as a barometer for conditions across the sector. A poor sales outlook from Tableau earlier this month had reinforced those expectations. Conventional wisdom in the money markets was that poor cloud performance would follow a downturn in the IT industry, related to worries about the economy.

However, when Salesforce returned better-than-expected revenue in its quarterly report and raised its yearly revenue forecast, analysts began to speculate that cloud sales and IT investment may be inversely related. At the end of the first day’s trading after Salesforce’s figures were released, its stock had risen 7.2%, reported Reuters.

The cloud giant upped its revenue forecast for the year from $8.0 billion-$8.1 billion to $8.08 billion-$8.12 billion. Analysts on average were expecting a profit of 99 cents per share on revenue of $8.08 billion.

Salesforce’s Chief Financial Officer Mark Hawkins dismissed the pessimistic outlook the money markets have for the cloud industry in the current uncertain economy. “We aren’t seeing an economic impact,” said Hawkins.

The opposite of analysts’ expectations is taking place, he argued, since the cloud computing sector thrives when businesses make more careful buying decisions and choose cheaper, simpler-to-install services that can be costed more flexibly. Another point of departure between cloud and IT company stocks is that they are bought by different people. Salesforce is often installed over the head of the IT department, Hawkins said.

In January BCN reported how BT had effectively become a reseller channel for Salesforce, giving its corporate customers the option of a hotline to Salesforce’s cloud service through its BT Cloud Connect service.