Category archive: Big Data

Splunk CEO quits while ahead as shares rise 2% on strong revenue growth

As big data pioneer Splunk reported better-than-expected quarterly revenues and its shares surged by 2%, its leader of the last seven years has announced his retirement.

Splunk CEO Godfrey Sullivan, who successfully steered the software company through an initial public offering in 2012, has stepped down to be replaced by Doug Merritt, a senior VP. However, Sullivan will remain at the company as non-executive chairman and has promised a smooth transition.

Sullivan, previously the boss of Hyperion until it was acquired by Oracle, has steered the company since the days when the value of machine-to-machine (M2M) intelligence and big data analytics was relatively unknown. The Splunk IPO in 2012 was a landmark for the industry, being the first time the public were invited to invest in a company specialising in the then-new concept of big data. Splunk’s IPO was acclaimed as one of the most successful tech offerings of the decade, with share prices surging 108% on the first day of trading.

Under Sullivan, Splunk’s customer base grew from 750 to 10,000 and annual revenues from $18 million to $600 million, according to a Splunk statement. His successor, Merritt, a veteran of SAP, Peoplesoft and Cisco, has been with Splunk since 2011 and said he would work with Sullivan on a smooth transition. “We will continue our laser focus on becoming the data fabric for businesses, government agencies, universities and organisations,” he said.

“Doug brings enormous management, sales, product and marketing skills to his new role,” said Splunk’s lead independent director John Connors. “As senior vice president for field operations for Splunk, Doug has consistently delivered outstanding financial results.”

In its results for the third quarter of financial year 2016, Splunk reported total revenues of $174.4 million, up 50% year-over-year and ahead of analyst expectations by $14.4 million.

WANdisco’s new Fusion system aims to take the fear out of cloud migration

Software vendor WANdisco has announced six new products to make cloud migration easier and less risky as companies plan to move away from DIY computing.

The vendor says its latest Fusion system creates a safety net of continuous availability and streaming back-up. Building on that, the platform offers uninterrupted migration and gives hybrid cloud systems the capacity to expand across both private and public clouds if necessary. These four fundamental conditions are built on seven new software plug-ins designed to make the transition from production systems into live cloud systems smoother, says DevOps specialist WANdisco.

The backbone of Fusion is WANdisco’s replication technology, which ensures that all servers and clusters are fully readable and writeable, always in sync and can recover automatically from each other after planned or unplanned downtime.
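WANdisco has not published Fusion’s internals in this announcement, but the active-active behaviour described above can be illustrated with a deliberately simplified sketch: every replica applies the same writes in the same agreed order, so any copy stays readable and writeable, and a node that was down catches up by replaying what it missed. The `Replica` class and shared log below are illustrative assumptions, not WANdisco code.

```python
# Conceptual sketch of active-active replication: one agreed ordering of
# writes is applied by every replica, so all copies stay writeable and
# consistent, and a recovering node replays the entries it missed.
# This is an illustration only, not WANdisco Fusion's implementation.

class Replica:
    def __init__(self, name):
        self.name = name
        self.state = {}        # replicated key/value data
        self.applied = 0       # index of the next log entry to apply

    def apply_log(self, log):
        """Replay any log entries this replica has not yet applied."""
        while self.applied < len(log):
            key, value = log[self.applied]
            self.state[key] = value
            self.applied += 1


shared_log = []                              # agreed total order of writes
replicas = [Replica("site-a"), Replica("site-b")]

def write(key, value):
    """A write accepted at any site is appended to the shared log."""
    shared_log.append((key, value))
    for r in replicas:
        r.apply_log(shared_log)              # in practice, asynchronous

write("/data/file1", "v1")                   # accepted at site-a
write("/data/file1", "v2")                   # accepted at site-b
assert replicas[0].state == replicas[1].state   # both sites converge
```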

The plug-ins that address continuous availability, data consistency and disaster recovery are named as Active-Active Disaster Recovery, Active-Active Hive and Active-Active HBase. The first guarantees data consistency with failover and automated recovery over any network, and prevents Hadoop cluster downtime and data loss. The second ensures consistent query results across all clusters and locations. The third, HBase, aims to create continuous availability and consistency across all locations.

Three further plug-ins address the threat of heightened exposure that is created when companies move their systems from behind a company firewall onto a public cloud. These plug-ins are named Active Back-up, Active Migration and Hybrid Cloud. To supplement these offerings WANdisco has also introduced the Fusion Software Development Kit (SDK) so that enterprise IT departments can program their own modifications.

“Ease of use isn’t the first thing that comes to mind when one thinks about Big Data, so WANdisco Fusion sets out to simplify the Hadoop crossing,” said WANdisco CEO David Richards.

Huawei shows how FusionSphere runs SAP HANA

SAP partner and telecoms equipment maker Huawei used the SAP TechEd conference in Barcelona to demonstrate how its FusionSphere operating system handles SAP’s HANA big data processing platform on Huawei hardware.

Huawei’s FusionSphere is an open, enterprise-level cloud operating system based on OpenStack architecture. The telecoms equipment maker says it integrates software-defined computing, storage, networking and cloud management components into one system that can support private, public, and hybrid clouds. Delegates at the SAP conference were invited to see demonstrations of how the ‘open and agile’ operating system could handle running enterprise software.

Huawei argued that since FusionSphere can run business applications that have traditionally been run ‘on premise’, it will create new opportunities for mass processing of big data in the cloud. With the economies of scale and greater choice over resources that cloud gives the managers of IT operations, purchasers of IT services will have much greater buying power, according to Huawei.

Consequently, it argued, cloud-based HANA projects could be run at much lower capital and operating costs, with greater efficiency and better service quality.

Huawei became a SAP global technology partner in 2012. Since then it has supported the SAP HANA platform and associated applications via a series of other products alongside FusionServer, with one, the Huawei Appliance for SAP HANA, currently being used in China, Europe, the Middle East and Africa.

Meanwhile, in San Francisco, Huawei used the Linux-oriented OPNFV Summit to unveil its new inventions for OPNFV Mobile Networking.

Huawei Chief Technology Officer Pang Kang told the conference the next generation of mobile architecture will be built on an open source version of NFV (network functions virtualisation).

“The OPNFV platform will be a significant step in the move to a new mobile architecture that scales, is elastic, resilient and agile,” said Pang Kang. “A carrier grade virtual infrastructure will help Huawei deliver the next-generation in mobile networking to our customers.”

Among the projects that Huawei is contributing to within the OPNFV organisation are Pinpoint, a big data system for failure prediction, and Multi-site Virtualized Infrastructure, a distributed architecture for OPNFV.

During the OPNFV Summit, Huawei will demonstrate three OPNFV-based solutions that are targeted for commercial deployment: VNFs over OPNFV; VTN of ONOS (a carrier-grade open source SDN controller) for OPNFV; and Compass for OPNFV, a DevOps tool.

Veritas warns of ‘databerg’ hidden dangers

Backup specialist Veritas Technologies claims European businesses waste billions of euros on huge stores of useless information which are growing every year. By 2020, it claims, the damage caused by this excessive data will cost over half a trillion pounds (£576bn) a year.

According to the Veritas Databerg Report 2015, 59% of data stored and processed by UK organisations is invisible and could contain hidden dangers. From this it has estimated that the average mid-sized UK organisation holding 1,000 terabytes of information spends £435k annually on Redundant, Obsolete or Trivial (ROT) data. According to its estimate, just 12% of the cost of data storage is justifiably spent on business-critical intelligence.
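Restating the report’s quoted figures as a quick back-of-the-envelope calculation (the numbers below come straight from the paragraph above; nothing new is added):

```python
# Back-of-the-envelope restatement of the quoted Veritas figures.
rot_spend_per_year = 435_000      # £ spent annually on ROT data (quoted)
stored_terabytes = 1_000          # data held by the average mid-sized UK organisation (quoted)
business_critical_share = 0.12    # share of storage spend deemed justified (quoted)

print(rot_spend_per_year / stored_terabytes)   # ~£435 per terabyte per year on ROT
print(1 - business_critical_share)             # 0.88: 88% of storage spend is not business-critical
```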

The report blames employees and management for the waste. The first group treats corporate IT systems as their own personal infrastructure, while management are too reliant on cloud storage, which leaves them open to compliance violations and a higher risk of data loss.

The survey identified three major causes of Databerg growth, stemming from volume, vendor hype and the values of modern users. These root causes create problems in which IT strategies are based on data volumes rather than business value. Vendor hype, in turn, has convinced users to become increasingly reliant on free storage in the cloud, and this consumerisation has led to a growing disregard for corporate data policies, according to the report’s authors.

As a result, big data and cloud computing could lead corporations to hit the databerg and incur massive losses. They could also sink under a prosecution for compliance failings, according to the key findings of the Databerg Report 2015.

It’s time to stop the waste, said Matthew Ellard, Senior VP for EMEA at Veritas. “Companies invest a significant amount of resources to maintain data that is totally redundant, obsolete and trivial.” This ‘ROT’ costs a typical midsize UK company, which can expect to hold 500 terabytes of data, nearly a million pounds a year on photos, personal ID documents, music and videos.

The study was based on a survey answered by 1,475 respondents in 14 countries, including 200 in the UK.

IBM adds Universal Behavior Exchange into its Marketing Cloud

IBM says the new Universal Behavior Exchange (UBX) in its Marketing Cloud will help businesses to understand their customers better.

The vendor and cloud service provider claims UBX can solve the problem of connecting up all the different sources of customer information available to a business. In some companies this means taking data from up to 30 different systems. The cloud service aims to connect and personalise all relevant information and allow marketing staff to devise more effective campaigns on Facebook and across the Web.

UBX is supported by an open ecosystem of certified partners that includes social, mobile, CRM and paid advertising solutions. Vendor partners include MediaMath, Spredfast, MutualMind, SugarCRM and Exchange Solutions.

Features in the cloud-based system include a click-to-connect integration that should simplify how marketers obtain and use data. A pre-integrated network of the vendor partners’ technologies should give clients faster access to a wide range of customer behaviour types, with event and audience data available across a range of paid, owned and earned channels. The system ultimately allows users to study the behaviour of customers and create a highly personalised interaction in response, according to IBM.

“IBM is making it simpler to understand how customers prefer to engage,” said MediaMath president Mike Lamb. “Connecting advertiser data to other channels could create more timely and relevant interactions.”

In a related announcement, mobile marketing system vendor Vibes has announced a complementary offering. The Vibes mobile marketing platform will now personalise mobile campaigns with IBM Campaign for targeted text messaging and mobile wallet offers. It will also work with IBM Marketing Cloud systems to trigger transactional and service-oriented mobile messages, such as appointment reminders and service updates.

“UBX is cracking the code on big data applied to the marketing cloud, and we’re thrilled to be a part of this emerging ecosystem,” said Vibes CEO Jack Philbin.

IBM to create HPC and big data centre of excellence in UK

IBM and the UK’s Science & Technology Facilities Council (STFC) have jointly announced they will create a centre that tests how to use high performance computing (HPC) for big data analytics.

The Hartree Power Acceleration and Design Centre (PADC) in Daresbury, Cheshire, is the first UK facility to specialise in modelling and simulation and their use in Big Data Analytics. It was recently the subject of UK government investment in big data research and was tipped as the foundation for chancellor George Osborne’s northern technology powerhouse.

The new facility launch follows the government’s recently announced investment and expansion of the Hartree Centre. In June Universities and Science Minister Jo Johnson unveiled a £313 million partnership with IBM to boost Big Data research in the UK. IBM said it will further support the project with a package of technology and onsite expertise worth up to £200 million.

IBM’s contributions will include access to the latest data-centric and cognitive computing technologies, with at least 24 IBM researchers to be based at the Hartree Centre to work side-by-side with existing researchers. It will also offer joint commercialization of intellectual property assets produced in partnership with the STFC.

The supporting specialists have a brief to help users coax the fullest possible performance out of all the components of the POWER-based system, and have specialised knowledge of architecture, memory, storage, interconnects and integration. The Centre will also be supported by the expertise of other OpenPOWER partners, including Mellanox, and will host a POWER-based system with the Tesla Accelerated Computing Platform. This will provide options for using energy-efficient, high-performance NVIDIA Tesla GPU accelerators and enabling software.

One of the target projects will be a search for ways to boost application performance while minimising energy consumption. In the race towards exascale computing significant gains can be made if existing applications can be optimised on POWER-based systems, said Dr Peter Allan, acting Director of the Hartree Centre.

“The Design Centre will help industry and academia use IBM and NVIDIA’s technological leadership and the Hartree Centre’s expertise in delivering solutions to real-world problems,” said Allan. “The PADC will provide world-leading facilities for Modelling and Simulation and Big Data Analytics. This will develop better products and services that will boost productivity, drive growth and create jobs.”

New Egnyte service promises to impose strict version control in the cloud

Cloud file service provider Egnyte has launched a Smart Reporting and Auditing service which promises to impose order on the way content is created, edited, viewed and shared.

The service is currently exclusive to Egnyte customers who want visibility and control over their organisation’s entire content life-cycle, whether files are in-house or in the cloud. The rationale is to help companies stop wasting money on the multiplication of effort involved when multiple versions of the same file exist across the diaspora of in-house systems, private and public clouds.

The promised returns on investment in these cloud services, the company says, are lower costs, less risk and higher productivity through visibility. Cost savings are promised from reduced bandwidth consumption, fewer support issues and less wasted employee time. Risk will be minimised, according to Egnyte, as fewer files will be leaked out of the organisation and suspicious activities – both internal and external – can be highlighted. Visibility improvements will boost productivity by speeding the progress of projects and preventing unchecked document replication and mutation, which leads to multiple teams working on different versions of the same project.
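Egnyte has not described how its analytics spot replicated documents; purely as an illustration of the problem outlined above, one simple way to surface identical copies scattered across a file store is to group files by a content hash, as in the hypothetical sketch below (the paths and the choice of SHA-256 are assumptions, not Egnyte’s method).

```python
# Hypothetical illustration of spotting replicated documents by content hash.
# Files whose bytes are identical share a digest, so copies scattered across
# folders or synced shares can be grouped and reported.
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root):
    """Return digests that map to more than one file under `root`."""
    groups = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    return {d: paths for d, paths in groups.items() if len(paths) > 1}

if __name__ == "__main__":
    for digest, copies in find_duplicates(".").items():
        print(digest[:12], [str(p) for p in copies])
```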

Companies and vendors have still not cracked version control yet, said one analyst, and the cloud will only make the task more complicated.

“Content is at the core of just about every business process today, but users are accessing files across multiple devices, anywhere, any time,” said Terri McClure, senior analyst at the Enterprise Strategy Group. “It is entirely too costly and there is simply too much data.”

Solving the big data analytics problem will be increasingly important, said McClure.

Devcon Construction, the largest general contractor in Silicon Valley, has used the service on trial to track confidential design plans and blueprints. “It gives complete visibility on how the files are shared and accessed, so we can effectively manage desktop and tablet device workflows out in the field,” said Joe Tan, director of IT at Devcon Construction.

The cloud service now makes detailed file analytics and insights possible, claimed Isabelle Guis, chief strategy officer at Egnyte. “It’s critical for businesses to optimise file infrastructure and protect against potential threats,” she said.

SAP unveils new powers within Analytics in the Cloud

SAP has unveiled a new user-friendly analytics service for enterprises which it claims will give better insights by offering an ‘unparalleled user experience’.

The SAP Cloud for Analytics will be delivered through a planned software as a service (SaaS) offering that unifies all SAP’s analytical functions into one convenient dashboard.

Built natively on the SAP HANA Cloud platform, it will be a scalable, multi-tenant environment at a price which SAP says is affordable to companies and individuals. The new offering aims to bring together a variety of existing services including business intelligence, planning, budgeting and predictive capacity.

According to SAP, it has fine-tuned workflows so that it is easier for users to get from insight to action, as a single application carries users through this journey more rapidly. It achieves this by giving universal access to all data, digesting it and forwarding the right components to the right parts of the organisation. An intuitive user interface (UI) will help all users, from specialists such as finance professionals to generalists such as line-of-business analysts, to build connected planning models, analyze data and collaborate. It can extend to unstructured data, helping users to spot market trends within social media and correlate them with company inventories, SAP claims.

It’s all about breaking down the divisions between silos and blending the data to make visualization and forecasting possible, said Steve Lucas, president, Platform Solutions, SAP. “SAP Cloud for Analytics will be a new cloud analytics experience. That to me is more than visualization of data, that’s realization of success,” said Lucas.

SAP said it is also working with partners to provide seamless workflows.

SAP and Google are collaborating to extend the levels of analysis available to customers, according to Prabhakar Raghavan, VP of Engineering at Google Apps. “These innovations are planned to allow Google Apps for Work users to embed, refresh and edit SAP Cloud for Analytics content directly in Google Docs and Google Sheets,” said Raghavan.

IBM augments Watson with new cognitive business consulting arm

Enterprise tech giant IBM has announced the creation of IBM Cognitive Business Solutions, a consulting practice designed to help businesses get into the cognitive computing game.

IBM continues to invest heavily in its Watson cognitive computing operation, which uses artificial intelligence and machine learning to better deal with unstructured data. This consulting business will have access to over 2,000 consultants across a wide range of industries.

“Our work with clients across many industries shows that cognitive computing is the path to the next great set of possibilities for business,” said Bridget van Kralingen, SVP of IBM Global Business Services.

“Clients know they are collecting and analyzing more data than ever before, but 80 percent of all the available data — images, voice, literature, chemical formulas, social expressions — remains out of reach for traditional computing systems. We’re scaling expertise to close that gap and help our clients become cognitive banks, retailers, automakers, insurers or healthcare providers.”

“Before long, we will look back and wonder how we made important decisions or discovered new opportunities without systematically learning from all available data,” said Stephen Pratt, global leader, IBM Cognitive Business Solutions. “Over the next decade, this transformation will be very personal for professionals as we embrace learning algorithms to enhance our capacity. For clients, cognitive systems will help organizations that adopt these powerful tools to outperform their peers.”

Speaking at a Gartner symposium, IBM’s CEO indicated that the cognitive business is a cornerstone of IBM’s overall strategy. IBM says it has already invested over a billion dollars in Watson and intends to train another 25,000 IBM consultants and practitioners on cognitive computing before the end of this year.

AWS: examine fine print in data transfer legislation

In a week that has seen the European Court of Justice rule the Safe Harbour agreement on data transfer invalid, the significance of data transfer legislation in South East Asia has been under discussion at Cloud South East Asia.

Answering audience questions following his Cloud South East Asia keynote this morning, Blair Layton, Head of Database Services for Amazon Web Services, argued that some of the legislation against data transfer is not always as cast-iron as it appears.

Acknowledging that such legal concerns were indeed “very legitimate,” and that there were certainly countries with stringent legal provisions that formed an obvious barrier to the adoption of cloud services such as Amazon Web Services, Layton nonetheless stressed that it was always worth examining the relevant legislation “in more detail.”

“What we’ve found in some countries is that, even though the high-level statement might be that data has to reside in one country, what you find in the fine print is that it actually says, ‘if you inform users then it is fine to move the data’,” he told delegates. “Also, for sensitive data you think you may not be able to move – because of company controls, board level concerns etc. – we can have many discussions about that. For instance, if you just want to move data for back-up and recovery, you can encrypt that on premise, maintain the keys on premise, and shift that into the cloud for storage.”
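Layton did not name specific tooling, but the pattern he describes – encrypt on premise, keep the keys on premise, store only ciphertext in the cloud – can be sketched roughly as follows. The bucket name and file paths are placeholders, and the `cryptography` and `boto3` libraries are one possible choice rather than anything prescribed in the talk.

```python
# Sketch of the back-up pattern described above: encrypt data on premise,
# keep the key on premise, and ship only the ciphertext to cloud storage.
# Bucket name and paths are placeholders; library choice is an assumption.
import boto3
from cryptography.fernet import Fernet

key = Fernet.generate_key()              # generated and retained on premise
with open("onprem-backup.key", "wb") as key_file:
    key_file.write(key)                  # the key is never uploaded

with open("backup.tar", "rb") as backup_file:
    ciphertext = Fernet(key).encrypt(backup_file.read())

s3 = boto3.client("s3")                  # credentials via the usual AWS config
s3.put_object(Bucket="example-backup-bucket",
              Key="backups/backup.tar.enc",
              Body=ciphertext)           # the cloud only ever sees encrypted bytes
```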

In the same session, Layton, when not extolling the impressive scope and effectiveness of Amazon Web Services in the South East Asian region and beyond, discussed other reasons for the arguable disparity between the evident regional interest in cloud services and the actual uptake of them.

“There are different cultures in different countries, and they have different levels of interest in technology. For example, you’ll see that… people in Singapore are very conservative compared to the Taiwanese. In other countries their IT is not as mature and they’re not as willing to try new things, and that’s simply cultural.”