
Dell launches new backup and recovery services that straddle the cloud and the client

Dell has announced a programme of new data protection initiatives to protect systems, apps and data that straddle on-premises computers and the cloud.

There are four main strands to the new offerings: Dell Data Protection and Rapid Recovery, three new data deduplication appliance models, Dell Data Protection and Endpoint Recovery and a new Dell Data Protection and NetVault Backup offering.

The Dell Data Protection and Rapid Recovery system integrates with previous Dell offerings such as AppAssure to help eliminate downtime in customer environments. Dell claims that users get ‘ZeroImpact’ recovery of systems, applications and data across physical, virtual and cloud environments. The Rapid Snap for Applications technology works by taking snapshots of entire physical or virtual environments every five minutes, so users get immediate access to data in the event of an incident. Rapid Snap for Virtual technology also offers agentless protection of VMware virtual machines.
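Dell has not published Rapid Snap's internals, but the five-minute cadence it describes amounts to a fixed-interval snapshot loop. A minimal Python sketch of that idea, with take_snapshot standing in for whatever actually captures the environment (all names here are hypothetical):

```python
import time
from datetime import datetime, timezone

SNAPSHOT_INTERVAL_SECONDS = 5 * 60  # the five-minute cadence Dell describes

def take_snapshot(environment: str) -> str:
    """Stand-in for capturing a point-in-time image of an environment."""
    label = f"{environment}-{datetime.now(timezone.utc):%Y%m%dT%H%M%SZ}"
    print(f"snapshot captured: {label}")
    return label

def snapshot_loop(environment: str) -> None:
    # Capture a snapshot, then wait out the interval; a real agent would
    # also prune old snapshots and handle failures.
    while True:
        take_snapshot(environment)
        time.sleep(SNAPSHOT_INTERVAL_SECONDS)
```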

The new Dell DR deduplication appliances are the DR4300e, DR4300 and DR6300. The mid-market DR4300 offers up to 108TB of usable capacity while ingesting up to 23TB of data per hour. The entry-level DR4300e is a smaller-scale, low-cost appliance that can scale up to 27TB. The DR6300 is a larger midmarket and small-enterprise model that delivers up to 360TB of usable capacity while ingesting up to 29TB of data per hour.

Dell Data Protection and Endpoint Recovery is a lightweight, easy-to-use software offering that gives customers an endpoint protection and recovery solution for Windows clients. It is aimed at single users and is initially free.

Dell NetVault Backup is a cross-platform, enterprise backup and recovery solution that supports a broad spectrum of operating systems, applications and backup targets. A new option lets customers break backups into smaller, simultaneously executed chunks to improve performance.
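Dell has not said how NetVault implements the chunking option, but the general pattern of splitting one large job into fixed-size pieces and running them concurrently is easy to sketch in Python (backup_chunk and parallel_backup are illustrative names, not NetVault APIs):

```python
from concurrent.futures import ThreadPoolExecutor

def backup_chunk(chunk: list) -> int:
    """Stand-in for streaming one chunk of files to the backup target."""
    for path in chunk:
        pass  # copy `path` to the backup target here
    return len(chunk)

def parallel_backup(files: list, chunk_size: int = 100, workers: int = 4) -> int:
    # Split the backup set into fixed-size chunks and execute them
    # simultaneously, mirroring the option described above.
    chunks = [files[i:i + chunk_size] for i in range(0, len(files), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(backup_chunk, chunks))
```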

Spotify shifts all music from data centres to Google Cloud

Music streaming service Spotify has announced that it is changing how it stores tunes for customers and is copying all the music from its data centres onto Google’s Cloud Platform.

In a blog post, Spotify’s VP of Engineering & Infrastructure Nicholas Harteau explained that though the company’s data centres had served it well, the cloud is now sufficiently mature to surpass the quality, performance and cost Spotify got from owning its infrastructure. Spotify will now get its platform infrastructure from Google Cloud Platform ‘everywhere’, Harteau revealed.

“This is a big deal,” he said. Though Spotify has taken a traditional approach to delivering its music streams, it no longer feels it needs to buy or lease data-centre space, server hardware and networking gear to guarantee being as close to its customers as possible, according to Harteau.

“Like good engineers, we asked ourselves: do we really need to do all this stuff? For a long time the answer was yes. Recently that balance has shifted,” he said.

Operating data centres had been a painful necessity for Spotify since it launched in 2008, because it was the only way to guarantee the quality, performance and cost of its service. These days, however, the storage, computing and network services available from cloud providers are as high quality, high performance and low cost as anything Spotify could create with the traditional ownership model, said Harteau.

Harteau explained why Spotify preferred Google’s cloud service to that of runaway market leader Amazon Web Services (AWS). The decision was shaped by Spotify’s experience with Google’s data platform and tools. “Good infrastructure isn’t just about keeping things up and running, it’s about making all of our teams more efficient and more effective, and Google’s data stack does that for us in spades,” he continued.

Harteau cited Dataproc’s batch processing, event delivery with Pub/Sub and the ‘nearly magical’ capacity of BigQuery as the three most persuasive features of Google’s cloud service offering.
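To give a flavour of what event delivery with Pub/Sub involves for a developer, here is a minimal publish using the google-cloud-pubsub client library; the project and topic names are placeholders:

```python
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# "my-project" and "playback-events" are placeholder names.
topic_path = publisher.topic_path("my-project", "playback-events")

# publish() returns a future that resolves to the server-assigned
# message ID once the event has been accepted for delivery.
future = publisher.publish(topic_path, data=b'{"event": "track_started"}')
print(future.result())
```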

Google launches Dataproc after successful beta trials

Google has announced that its big data analysis tool Dataproc is now on general release. The utility, which was one of the factors that persuaded Spotify to choose Google’s Cloud Platform over Amazon Web Services, is a managed tool based on the Hadoop and Spark open source big data software.

The service first became available in beta in September and was tested by global music streaming service Spotify, which was evaluating whether to move its music files away from its own data centres and into the public cloud, and which cloud service could support that move. In its beta form Dataproc supported the MapReduce engine, the Pig platform for writing programs and the Hive data warehousing software. Google says it has added new features and sharpened the tool since then.

While in its beta testing phase, Cloud Dataproc added features such as property tuning, VM metadata and tagging, and cluster versioning. “In general availability new versions of Cloud Dataproc will be frequently released with new features, functions and software components,” said Google product manager James Malone.

Cloud Dataproc aims to minimise cost and complexity, which Malone describes as the two major distractions of data processing.

“Spark and Hadoop should not break the bank and you should pay for what you actually use,” he said. As a result, Cloud Dataproc is priced at 1 cent per virtual CPU per hour. Billing is by the minute with a 10-minute minimum.
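Working through that pricing makes it concrete. The sketch below encodes the stated rate, per-minute billing and the 10-minute minimum; rounding partial minutes up is our assumption:

```python
import math

VCPU_HOUR_PRICE_USD = 0.01  # the stated 1 cent per virtual CPU per hour

def dataproc_cost(vcpus: int, minutes: float) -> float:
    # Billing is by the minute with a 10-minute minimum; we assume
    # partial minutes round up.
    billed_minutes = max(math.ceil(minutes), 10)
    return vcpus * (billed_minutes / 60) * VCPU_HOUR_PRICE_USD

# A 40-vCPU cluster running for 25 minutes: 40 * (25/60) * $0.01, about $0.17.
print(f"${dataproc_cost(40, 25):.2f}")
```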

Analysis should run faster, Malone said, because clusters in Cloud Dataproc can start and stop in less than 90 seconds, where other big data systems take minutes. This can make analyses run up to ten times faster. The general release of Cloud Dataproc will also be easier to manage, since clusters need no specialist administration staff or software.

Cloud Dataproc also tackles two other data processing bugbears, scale and productivity, promised Malone. This tool complements a separate service called Google Cloud Dataflow for batch and stream processing. The underlying technology for the service has been accepted as an Apache incubator project under the name Apache Beam.

IBM launches object-based storage for the cloud

New object-based cloud storage could tackle the growing challenge presented by unstructured data, according to IBM.

Announcing a new Cloud Object Storage at its InterConnect 2016 event in Las Vegas, IBM said the object storage technology it acquired from Cleversafe creates a fast, flexible, hybrid cloud storage service that gives companies new options for managing and analysing data.

Research firm IDC says 80% of new cloud apps will be big-data intensive. Cloud, mobile, IoT, analytics, social media, cognitive and other technologies all conspire to increase the data management workload, said John Morris, general manager of IBM Cloud Object Storage. Bringing Cleversafe technology to the cloud will give clients a way to keep on top of the problem.

The service offers a choice of multiple application programming interfaces and the option to store massive amounts of data on-premises, on the IBM Cloud or in a hybrid of both.

When the Cloud Object Storage service launches in June, it will come in three configurations: Nearline, Standard and Dedicated.

Nearline is a cloud infrastructure for infrequently accessed data, charged at a lower cost and ideal for archive, backup and other non-time-critical workloads. The Standard offering will provide a higher-performance public cloud service based on the Cleversafe technology, with three new APIs into S3 object storage.
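IBM had not published the API documentation at the time of writing, but an S3-compatible interface implies that any standard S3 client can be pointed at the service simply by changing the endpoint. A sketch using Python's boto3 library, with a placeholder endpoint and credentials:

```python
import boto3

# The endpoint URL and credentials are placeholders; IBM's actual
# endpoints and key scheme may differ.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstorage.example.ibmcloud.com",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Because the API is S3-compatible, standard S3 calls work unchanged.
s3.put_object(Bucket="archive-bucket", Key="report.csv", Body=b"col1,col2\n1,2\n")
print(s3.get_object(Bucket="archive-bucket", Key="report.csv")["Body"].read())
```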

The Dedicated option gives a single-tenant IBM Object Storage system running on dedicated servers in IBM Cloud data centres. It is available as an IBM managed service or as a self-managed cloud solution, and gives clients access to object storage without the need for extra hardware or data centre space.

IBM Cloud Object Storage will be available in a variety of licensing models, including perpetual, subscription or consumption. This means customers can buy storage capacity with the flexibility to move data between the enterprise and the cloud as business needs change. It will also support both file and object workloads, so enterprises can have a single data storage hub that supports both traditional and web-based applications.

Huawei unveils Cloud Data Center for operators at MWC 2016

Equipment maker Huawei has built a cloud data centre infrastructure vehicle for transporting operators into the cloud.

It unveiled the service at Mobile World Congress 2016 in Barcelona, on the same day that Amdocs and Red Hat announced they had created a system to help mobile operators throw off their fixed infrastructure shackles.

The Huawei Cloud Data Center is to be an open ecosystem with joint innovations from SAP, Accenture and a range of other partners. Cloud migration services will be provided by SAP, while Accenture will offer the development of enterprise-class private cloud applications.

At the launch of the new ‘application centric, cloud 3.0 data center’ Zheng Yelai, President of Huawei IT Product Lines, promised Huawei would combine mission-critical servers, storage consolidation, cloud fabric software defined networking, modular data centers and other IT concepts into a single, highly flexible cloud data center platform.

Carriers can now get a single, simple system for resource management, elastic expansion, convergence and visualized operations and management, Yelai claimed. Huawei promised to harmonise each carrier’s services, operations, infrastructures and networks at Huawei’s data centers. The problem for most carriers, according to Yelai, is that their data sits in silos, in business support systems and operations support systems that cannot be fashioned, in their existing format, into a cohesive system. This leaves carriers unable to compete with newcomers that, having been born in the age of the cloud, can run their data services across any telco’s network. By hosting the carriers, Huawei’s Cloud Data Center can liberate them, said Yelai.

The cloud, Yelai said, could smash the silo-like structure of conventional IT and empower carriers with more choices in their cloud transformations. With their revitalized deployment strategies, telcos become strategically positioned as enablers of the digital economy, claimed Yelai.

Huawei’s Cloud Data Center will be a strong advocate of open standards in cloud platforms, it said, as it contributes to open source communities such as OpenStack, Hadoop and Spark. In January 2016, Huawei was elected to the OpenStack board of directors.

“We are building clouds that benefit carriers the most through shortened service provisioning, reduced OPEX and automated operations and management,” said Yelai. “These improvements allow carriers to develop new business in public cloud and effectuate their transformations.”

Amdocs combines NCSO with Red Hat OpenStack in telco cloud play

Customer experience specialist Amdocs claims it has created a system to convert mobile operators from physical network users into comms service providers in the cloud. It unveiled details of the new service at Mobile World Congress 2016 in Barcelona.

It has achieved this by blending its Network Cloud Service Orchestrator (NCSO) with the Red Hat Enterprise Linux OpenStack Platform. This, it says, creates an open, catalogue-driven system that works with any vendor’s equipment. Amdocs claims it can help mobile operators transform from fixed infrastructure users into cloud-friendly communications service providers (CSPs).

The NCSO can orchestrate the mapping of telecommunications services onto a software-led environment, claimed Amdocs. It does this by creating the conditions for continuous design, instantiation and assurance of complex network services built from virtual network functions (VNFs).

By virtualising functions that were previously bound up with hardware, the NCSO creates a greater degree of fluidity and flexibility. This means CSPs can introduce new services and adapt to customer demand in a fraction of the time, claims Amdocs.

Amdocs chose Red Hat because its Enterprise Linux OpenStack system has emerged as a cloud platform for network function virtualisation, it said.

An Amdocs NCSO, which uses Red Hat Enterprise Linux OpenStack, has been part of several NFV lab trials with tier one telco providers globally. In the beta trials the telco users have created a range of use cases with multiple vendors, including virtual CPE (customer premises equipment), virtual EPC (evolved packet core) and virtual IMS (IP Multimedia Subsystem).

Red Hat Enterprise Linux uses the high-performance Kernel-based Virtual Machine (KVM) hypervisor, which Red Hat claims makes for a more stable, secure and reliable operating system.

“OpenStack has become a de facto choice for NFV trials across the globe,” said Radhesh Balakrishnan, general manager of OpenStack at Red Hat.

Oracle buys enterprise workload manager Ravello Systems for a reported $500m

Oracle has disclosed details of its acquisition of workload management specialist Ravello Systems. No financial terms were revealed, but sources familiar with the company value the deal at $500 million, according to venture capital news site VentureBeat.

Ravello, which makes tools that help enterprises manage their enterprise workloads in the cloud, signed an agreement to be acquired on February 22 with all employees joining Oracle’s Public Cloud division.

The new management features will help Oracle’s Public Cloud beef up the performance of its computing, storage and networking workloads. Oracle has launched a number of initiatives aimed at positioning its cloud business more favourably against market leaders Amazon Web Services (AWS) and Microsoft Azure.

In February BCN reported how Oracle had added new Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS) cloud offerings from its Slough data centre, which currently caters for 500 UK and global customers. Clients from both the private and public sector are being promised tailored versions of the new services, which include Oracle’s Database, Dedicated Compute, Big Data and Exadata cloud services.

Palo Alto-based Ravello was founded in 2011 and balances cloud workloads for clients such as Arista, Brocade, Red Hat, SUSE and Symantec. In total it had raised $54 million in funding from venture capital firms such as Sequoia Capital, Norwest Venture Partners and Bessemer Venture Partners, because its Cloud Application Hypervisor offered enterprises a way to unify the application environment across public and private clouds.

Ravello CEO Rami Tamir explained on the company web site why the technology will be part of the Oracle Public Cloud. “This agreement will accelerate our ability to reach more customers,” said Tamir. “Our top priority is ensuring an uninterrupted service and seamless experience for you and all of our customers and partners. Rest assured, Ravello’s service will continue as is. Ravello will join Oracle’s IaaS mission to allow customers to run any type of workload in the cloud.”

Hitachi launches customer-centric predictive analytics for telcos

Mobile operators, telcos and service providers could soon stem the tide of subscriber defections thanks to a new cloud-based predictive analytics service from Hitachi Data Systems (HDS). By forecasting customer behaviour, HDS aims to improve subscriber satisfaction and reduce churn for its clients.

The new offering, announced at the 2016 Mobile World Congress (MWC) in Barcelona, will run as the Hitachi Unified Compute Platform (UCP) 6000 for Predictive Analytics system. It uses the latest analytics software to find the patterns characteristic of unhappy customers and predict customer attrition. The system uses predictive scoring – based on events such as repeated help-desk contact and network failures – to give support staff the information needed for real-time decision making. Once identified, churn-prone subscribers can be targeted with compensatory offers before they defect.
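HDS has not disclosed its scoring model, but the idea of predictive scoring over events such as help-desk contact and network failures can be illustrated with a deliberately crude additive score; the weights and threshold below are invented for the example:

```python
from dataclasses import dataclass

# Illustrative weights only; a production system would learn these
# from historical churn data rather than hard-coding them.
HELPDESK_WEIGHT = 0.15
NETWORK_FAILURE_WEIGHT = 0.25
AT_RISK_THRESHOLD = 0.7

@dataclass
class Subscriber:
    helpdesk_calls_30d: int
    network_failures_30d: int

def churn_score(s: Subscriber) -> float:
    """Crude additive score in [0, 1]; higher means more churn-prone."""
    raw = (s.helpdesk_calls_30d * HELPDESK_WEIGHT
           + s.network_failures_30d * NETWORK_FAILURE_WEIGHT)
    return min(raw, 1.0)

# A subscriber with four help-desk calls and two network failures in 30
# days scores 1.0 and would be flagged for a retention offer.
flagged = churn_score(Subscriber(4, 2)) > AT_RISK_THRESHOLD
print(flagged)  # True
```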

The Hitachi UCP 6000 for Predictive Analytics is built on converged infrastructure for SAP HANA, which can run in-memory interrogations of big data. HDS claims its UCP 6000 for SAP HANA can simplify the deployment of SAP solutions for telcos, which in turn will help them minimise IT infrastructure disruption and maximise application performance.

As part of the solution, SAP HANA and SAP Predictive Analytics will allow users to run predictive models on massive amounts of data from external data points. Because the data is crunched in memory, clients will get their insights in seconds and can nip customer dissatisfaction in the bud. SAP’s Predictive Analytics software will automatically handle the wide dataset and make predictive modelling more effective, according to HDS.

HDS described the churn-busting service as an ‘immense opportunity’ to translate data into tangible business outcomes.

IBM launches Swift answer to Lambda at Interconnect 2016

IBM has unveiled a raft of new announcements at Interconnect 2016, its largest ever cloud event. The rally, in Las Vegas, attracted 25,000 clients, partners and developers, who were briefed on new partnerships with VMware, IBM’s work with Apple’s Swift language, Bitly, Gigster, GitHub, Watson APIs and a new platform, Bluemix OpenWhisk.

Bluemix OpenWhisk is IBM’s answer to Amazon Web Services’ event-driven system Lambda, which lets developers create automated responses that fire when certain conditions are met. Automated responses have become a critical area for public cloud service providers, and BCN recently reported how Google launched Google Cloud Functions to match the AWS offering. All these systems aim to give developers a way to program responses without implementing integration-related changes in the architecture, but IBM claims OpenWhisk is the only one whose underlying code will be available under an open-source license on the code publishing site GitHub.
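The shape of an OpenWhisk action is simple: a function named main that receives a dictionary of parameters and returns a dictionary that becomes the JSON response. A minimal example in Python (the article discusses JavaScript and Swift support; Python support here is an assumption):

```python
# hello.py: deployed with the OpenWhisk CLI, e.g. `wsk action create hello hello.py`
def main(params):
    # Parameters arrive as a dict; whatever dict we return is
    # serialised as the action's JSON response.
    name = params.get("name", "world")
    return {"greeting": f"Hello, {name}!"}
```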

By allowing all users open access to inspect the code, IBM says it can inspire greater developer collaboration. IBM said OpenWhisk is highly customisable through either web services or commands, and can be adapted to company requirements rather than being an inflexible cloud service.

OpenWhisk will work with both Node.js, the server-side JavaScript framework, and Apple’s increasingly popular Swift programming language. With a range of application programming interfaces (APIs), IBM claims the OpenWhisk service will have greater flexibility than the rival services from Google and AWS.

In a statement IBM explained the next phase of its plan to bring Swift to the cloud, with a preview of a Swift runtime and a Swift Package Catalog to help developers create apps for the enterprise. The new Swift runtime builds on the Swift Sandbox IBM launched in December, and allows developers to write applications in Swift in the cloud and create continuous integration (CI) and continuous delivery (CD) pipelines that run apps written in Swift in production on the IBM public cloud.

IBM also announced a new option for users to run GitHub Enterprise on top of IBM Bluemix and in a company’s own data centre infrastructure.

In another announcement IBM gave details of a new partnership with VMware aimed at helping enterprises take better advantage of the cloud’s speed and economics. The new working arrangement means enterprise customers will find it easier to extend existing workloads from their on-premises software-defined data centre to the cloud. The partnership gives IBM users the option to run VMware computing, storage and networking workloads on top of the IBM cloud. The new level of integration applies to vSphere, Virtual SAN, NSX, vCenter and vRealize Automation. In addition, the IBM cloud is now part of the vCloud Air Network from VMware and the two partners will jointly sell hybrid cloud services.

Salesforce bolsters machine learning business with PredictionIO acquisition

Open source machine learning software vendor PredictionIO has announced it is to become part of Salesforce. The Palo Alto start-up has stressed that the software will continue to be available under an open source Apache license.

The addition of analytics and machine learning has become a key strategy for Salesforce as it bids to build on its cloud offerings. Last year BCN reported how Salesforce was adding new Wave Actions to its Analytics Cloud intelligence tool. More recently it bought machine learning companies RelateIQ and Tempo AI and integrated their staff into its data science teams.

Machine learning, which can be used in many cloud applications, has become a contested area in the cloud industry, with other start-ups in the field, such as H2O and Skytree, the subject of takeover rumours.

California-based PredictionIO was formed in 2013 and a year later received $2.5 million in backing from investors including Azure Capital Partners. Other backers include CrunchFund, the Stanford-StartX Fund and Kima Ventures. Dropbox is PredictionIO’s most prominent client.

CEO Simon Chan explained the rationale for selling the firm on his company blog. As part of Salesforce, PredictionIO’s machine learning system will get immediate access to the entire Salesforce cloud portfolio. The opportunity to extend SalesforceIQ’s machine learning and intelligence was a chance not to be passed up, he said. “Being a part of Salesforce will give us an amazing opportunity to continue building our open source machine learning platform on a much larger scale,” said Chan.
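For developers, a deployed PredictionIO engine serves predictions over a simple REST interface. A minimal query using the PredictionIO Python SDK might look like this; the URL is a placeholder and the query fields depend on the engine template in use:

```python
import predictionio

# Placeholder URL for a locally deployed engine.
engine = predictionio.EngineClient(url="http://localhost:8000")

# Ask the engine for four recommendations for user "u1"; the schema of
# the query dict is defined by the engine template.
result = engine.send_query({"user": "u1", "num": 4})
print(result)
```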

Chan’s objective will be the same within Salesforce: to simplify the development of machine learning technology and build it up. PredictionIO now has 8,000 developers creating over 400 apps. Chan pledged that PredictionIO’s open source technology will stay that way and will continue to be free to all users. To mark the Salesforce deal it is dropping the fee for PredictionIO Cluster software on AWS CloudFormation, which is now free for the first time in the company’s history.