Category Archives: Big Data

IBM bolsters Watson Healthcare capabilities with $1bn Merge acquisition

IBM is bolstering its Watson Health Cloud with the Merge acquisition

IBM announced its intention to acquire Merge Healthcare, a medical imaging and processing platform provider, which it plans to integrate with Watson. The company said the move would bolster the cognitive computing cloud’s clinical and medical capabilities.

Merge claims its technology is used at more than 7,500 US healthcare sites and many of the world’s largest clinical research institutes and pharmaceutical firms to manage and process medical images.

IBM said it plans to integrate Merge’s medical image handling technologies with the Watson Health Cloud. The company said the move would enable it to extend Watson’s analytics to medical images and create a consolidated platform to store, analyse and suggest treatments based on them, as well as cross-reference the images against a growing trove of lab results, electronic health records, clinical studies and other healthcare-related research and data.
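
To make the idea concrete, here is a toy sketch (in no way IBM's actual approach) of applying a machine-learning classifier to image data, the kind of task the Watson integration is aimed at. The synthetic "scans", labels and scikit-learn model are all assumptions for illustration only.

```python
# Toy illustration (not IBM's method) of applying machine learning to image
# data: flatten small greyscale "scans" into feature vectors and train a
# classifier to separate two hypothetical findings. Real medical imaging
# pipelines are vastly more sophisticated.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic 16x16 "images": the second class has a brighter central region
normal = rng.normal(0.2, 0.05, size=(100, 16, 16))
abnormal = rng.normal(0.2, 0.05, size=(100, 16, 16))
abnormal[:, 6:10, 6:10] += 0.5

X = np.concatenate([normal, abnormal]).reshape(200, -1)
y = np.array([0] * 100 + [1] * 100)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```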

“As a proven leader in delivering healthcare solutions for over 20 years, Merge is a tremendous addition to the Watson Health platform. Healthcare will be one of IBM’s biggest growth areas over the next 10 years, which is why we are making a major investment to drive industry transformation and to facilitate a higher quality of care,” said John Kelly, senior vice president, IBM Research and Solutions Portfolio.

“Watson’s powerful cognitive and analytic capabilities, coupled with those from Merge and our other major strategic acquisitions, position IBM to partner with healthcare providers, research institutions, biomedical companies, insurers and other organizations committed to changing the very nature of health and healthcare in the 21st century. Giving Watson ‘eyes’ on medical images unlocks entirely new possibilities for the industry.”

“Medical images are some of the most complicated data sets imaginable, and there is perhaps no more important area in which researchers can apply machine learning and cognitive computing. That’s the real promise of cognitive computing and its artificial intelligence components – helping to make us healthier and to improve the quality of our lives,” he added.

IBM sees huge potential for its Watson service in healthcare, and has moved to back that belief with a flurry of acquisitions and partnerships.

Earlier this year it bought Phytel, which provides cloud-based software that helps healthcare providers and care teams coordinate activities across medical facilities by automating certain aspects of patient care, and acquired Explorys, whose cognitive cloud-based analytics derive insights for care facilities from numerous and diverse financial, operational and medical record systems.

It also announced a partnership with Apple that is seeing IBM offer its Watson Health Cloud platform as a storage and analytics service for HealthKit data aggregated from iOS devices, and open the platform up for health and fitness app developers as well as medical researchers.

WTA, SAP team on tennis analytics using HANA cloud

SAP and the WTA are partnering to develop a cloud-based analytics service for tennis players and coaches

The Women’s Tennis Association (WTA) and SAP have partnered to develop tennis analytics software for players and coaches based on SAP HANA.

The cloud-based analytics platform will offer players and coaches side-by-side comparisons of the full list of match stats for both players, updated in near real-time; scoring data that analyses players’ service performance, success rate in closing out a game while serving and number of break points saved; and tracking data showing players’ serve direction and placement on the court, contact point for returning a serve and placement of rally shots.
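
As a rough illustration of the serve-related statistics listed above, the sketch below computes service points won and break points saved from hypothetical point-by-point data. The data layout is invented for this example; SAP and the WTA have not published their schema here.

```python
# Illustrative only: computes a few of the serve statistics mentioned above
# from hypothetical point-by-point data; not SAP's or the WTA's actual schema.

points = [
    # (server, won_by, was_break_point)
    ("Player A", "Player A", False),
    ("Player A", "Player B", True),
    ("Player A", "Player A", True),
    ("Player B", "Player B", False),
]

def serve_stats(points, player):
    served = [p for p in points if p[0] == player]
    won = sum(1 for p in served if p[1] == player)
    break_points = [p for p in served if p[2]]
    saved = sum(1 for p in break_points if p[1] == player)
    return {
        "service_points_won_pct": 100 * won / len(served) if served else 0.0,
        "break_points_faced": len(break_points),
        "break_points_saved": saved,
    }

print(serve_stats(points, "Player A"))
```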

The organisations said players and coaches will be able to access the analytics platform from WTA-authorised tablets.

“The WTA and SAP Tennis Analytics is a game-changer that will not only enhance our athletes’ preparation and performance but also the fans’ experiences when watching women’s tennis,” said Stacey Allaster, chief executive and chairman of the WTA.

“Analyzing data is fundamental to player and coach development, and this state-of-the-art technology, which more and more of our performers are now using, will take our sport to a new and exciting level and lead the way in sports technology,” she said.

Quentin Clark, chief technology officer and member of the global managing board, SAP said: “Our relationship with the WTA is another example of how SAP collaborates with partners to create ground-breaking solutions that change the way athletes utilize data and information to optimise their performance.”

SAP has partnered with a number of sports associations in recent years. Earlier this year it partnered with the National Hockey League (NHL) to roll out a co-designed cloud-based platform which the two organisations said would help bring vast amounts of official NHL statistical information directly to the league’s website in real time.

Networking the Future with SDN

SDN will be vital for everything from monitoring to security

The nature of business is constantly changing; customers are demanding faster, more responsive services, and as a result, firms need to ensure that their backend technology is up to scratch. Increasing adoption of cloud, mobility and big data technologies has encouraged IT departments to address how they can best support these developing trends whilst benefiting the customer and employee experience.

By looking at the heart of their infrastructure, the network, businesses can provide more agile and flexible IT services that can quickly meet user demand. So what improvements can be made to the network to satiate customer demand?

Software-defined networking (SDN) is emerging as an obvious approach for technology decision makers, empowering them to provide a faster, more agile and scalable infrastructure. SDN is considered the next evolution of the network, providing a way for businesses to upgrade their networks through software rather than hardware – at a much lower cost.

SDN provides holistic network management and the ability to apply more granular unified security policies whilst reducing operational expenses such as the need to use specific vendor hardware and additional technology investments. In fact, IDC recently predicted that this market is set to grow from $960 million in 2014 to more than $8 billion by 2018, globally.

A Growing Trend

Datacentres and service providers have, until now, been the most common adopters of SDN solutions. As a result, there have been notable improvements in customer service and response times, with firms deploying new and innovative applications more quickly than ever. In the past year, we have seen firms in sectors like healthcare and education take advantage of the technology. However, while SDN is developing quickly, it is still in its early stages, with several industries yet to consider it.

One effort to encourage more firms to recognise the benefits of SDN is the OpenDaylight Project, a collaborative open source project which aims to accelerate the adoption of SDN. Having already laid the foundation for today’s SDN deployments, it is considered to be the central control component and intelligence that allows customers to achieve network-wide objectives in a much more simplified fashion. The community, which includes more than a dozen vendors, is addressing the need for an open reference framework for programmability and control, enabling accelerated innovation for customers of any size and in any vertical.

Driving Business Insights

Looking ahead to the future of this new way of networking, there are a number of ways SDN can benefit the business. For example, SDN looks set to emerge as the new choice for deploying analytics in an economical and distributed way – in part due to the flexible nature of its infrastructure and the growing prominence of APIs – as an SDN-optimised network can be maintained and configured with fewer staff and at a lower cost.

Data analytics-as-a-service is being tipped as the vehicle that will make big data commoditised and consumable for enterprises in the coming years; analyst house IDC predicts that by 2017, 80% of CIOs’ time will be focused on analytics, and Gartner predicts that by 2017 most business users and analysts in organisations will have access to self-service tools to prepare data for analysis themselves.

However, the right network environment will be key if data analytics is to flourish. An SDN implementation offers a more holistic approach to network management, with the ability to apply more granular unified security policies while reducing operational expenses. Being able to manage the network centrally is a huge benefit for firms as they look to increase innovation and become more flexible in response to changing technology trends.

Using analytics in tandem with a newly optimised SDN can empower IT to quickly identify any bottlenecks or problems and also help to deploy the fixes. For example, if a firm notices that one of its applications is suffering from slow response times and sees that part of the network is experiencing a lot of latency at the same time, it could immediately address the issue and re-route traffic to a stronger connection.
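
A minimal sketch of that feedback loop, assuming a hypothetical SDN controller with a northbound REST API: poll latency metrics for an application's path and ask the controller to re-route traffic when a threshold is crossed. The endpoints, payloads and controller address are invented for illustration; a real deployment would use the specific controller's documented API (OpenDaylight exposes its functions over RESTCONF, for example).

```python
# Hypothetical sketch of the analytics-plus-SDN feedback loop described above.
# The REST paths and payloads are invented for illustration only.

import requests

CONTROLLER = "https://sdn-controller.example.com"  # placeholder address
LATENCY_THRESHOLD_MS = 50

def check_and_reroute(app_path_id):
    # Fetch per-path latency measurements gathered by the analytics layer
    metrics = requests.get(f"{CONTROLLER}/analytics/paths/{app_path_id}").json()
    if metrics["latency_ms"] > LATENCY_THRESHOLD_MS:
        # Ask the controller to install flows along a lower-latency path
        requests.post(
            f"{CONTROLLER}/flows/reroute",
            json={"path_id": app_path_id, "strategy": "lowest-latency"},
        )
        return "rerouted"
    return "ok"

if __name__ == "__main__":
    print(check_and_reroute("crm-app-path"))
```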

Realising the Potential of SDN

To implement an SDN solution, enterprises must firstly familiarise themselves with the technology and its components, creating cross-functional IT teams that span applications, security, systems and networking to establish what they wish to achieve; and secondly, investigate best-of-breed vendors that can deliver innovative and reliable SDN solutions which leverage existing investments without the need to overhaul longstanding technologies. This way, businesses can reap the benefits of SDN whilst saving time and money and mitigating risk.

Using analytics and SDN in combination is just one future possibility which could make it far simpler for businesses to deploy servers and support users in a more cost-effective and less resource-intensive way. It can also provide an overall improved user experience. With SDN offering the power to automate and speed up the network, and big data providing the brains behind the operation, it’s an exciting match that could be an enterprise game changer.

Written by Markus Nispel, vice president of solutions architecture and innovation at Extreme Networks

Media giant hoovers up analytics specialist for $500m

Advance is buying 1010data for $500m

Advance/Newhouse, the parent company of Advance Publications – owner or majority stakeholder in some of the world’s largest digital publications including Condé Nast and Reddit – announced this week it has acquired analytics and data warehouse specialist 1010data for $500m.

Advance said it intends to help scale 1010data’s operations globally, with the capital being used to invest strongly in sales, marketing and engineering efforts.

“Advance believes that 1010data has a compelling vision for helping businesses unlock their true potential through data. It has created a truly innovative approach that is speeding this transition across industries,” said Nomi Bergman, president of Advance’s cable television affiliate, Bright House Networks, and a new member of 1010data’s Board.

“We believe that in the 21st century data and analytics platforms will be a core building block of all businesses. The opportunities that lie ahead for 1010data are boundless, and through our acquisition of 1010data, we are excited to partner with the company’s management team and provide the resources it needs to capitalize on all of them,” Bergman said.

1010data offers big data discovery and data sharing platforms as well as an integrated big data cloud service that can be used to analyse large datasets; it can also be plugged into third-party big data applications and mash up data from a range of sources.

The company claims to have over 750 customers across a range of sectors including telecoms, manufacturing and engineering.

Sandy Steier, co-founder and chief executive of 1010data said: “Advance is the perfect partner to help us maximize our growth potential as they fully recognize how revolutionary and impactful our technology can be. Becoming a part of Advance ensures that there will be no disruption to our customers, our employees or our business while enabling our organization to scale at an even faster rate. We are very excited about being able to leverage Advance’s significant resources to deliver an even better solution for our customers.”

Pivotal teams with Telstra on enterprise big data

Telstra and Pivotal are teaming up to push Cloud Foundry and big data services in Australia

Pivotal announced a partnership with Australian telco Telstra that will see the two firms jointly marketing Pivotal’s big data development services to Telstra enterprise customers.

Pivotal will also offer enterprise customers training at its newly established Pivotal Labs office in Sydney, Australia, its 16th globally.

The company said the move would help it reach a broader enterprise customer base in the region by leveraging Telstra’s existing local relationships and experience in the ICT sector.

“The arrival of Pivotal Labs in Australia presents a great opportunity for local organisations. Pivotal Labs will quickly become a software innovation hub for Australia’s largest enterprises in all industries to transform into great software companies – taking digital disruption to the next level,” said Melissa Ries, vice president and general manager APJ, Pivotal.

“Pivotal has a great relationship with Telstra, which is built on a foundation of shared visions. With Telstra’s involvement in the Cloud Foundry Foundation and our joint venture, we’re partnering to help Telstra’s customers transform into great software companies.”

The companies said that by using tools like Pivotal CF, the Cloud Foundry-based platform as a service (PaaS), and Pivotal Big Data Suite (BDS), enterprise customers will be able to improve how they develop their web and mobile services.

Kate McKenzie, chief operations officer at Telstra said: “In conjunction with our new Gurrowa Innovation Lab, Pivotal Labs will enhance our innovation offering for our customers and create a pipeline of skills to grow our development capabilities. Innovation at Telstra is about helping our customers get the best out of technology for the future and ultimately providing access to the best networks from which they can innovate, and the partnership will allow us to do just that.”

Alibaba to bolster cloud performance, proposes data protection pact

Alibaba is boosting the performance of its cloud services and reassuring customers on data protection

Alibaba unveiled a series of performance upgrades to its cloud platform this week in a bid to compete more effectively for big data workloads with other large cloud incumbents, and clarified its position on data protection.

The company said it is adding solid state drive (SSD) backed cloud storage, which will massively improve read-write performance over its existing HDD-based offerings, and virtual private cloud services (VPC) for high performance compute and analytics workloads. It’s also boosting performance with virtualised GPU-based technology.

“The huge amount of data and advanced computing capacity has brought great business opportunities to the industry,” said Wensong Zhang, chief technology officer of Aliyun, Alibaba’s cloud division.

“Deep learning and high-performance computing have been widely adopted in Alibaba Group for internal use. Aliyun will roll out high-performance computing services and accelerators based on GPU technology that could be applied in image recognition and deep learning to expand the boundaries of business,” Zhang said.

The company also released what it is calling a data protection pact. In its proposal Alibaba said customers will have “absolute ownership” over all of the data generated or sent to the company’s cloud services, and the “right to select whatever services they choose to securely process their data.”

It also said it would strengthen its threat protection and disaster recovery capabilities in order to reassure customers of its ability to guard their data – and the data of their clients. The company did not, however, cite any specific standards or internationally recognised guidelines on data protection in its plans.

“Without the self-discipline exercised by the banking industry, the financial and economic prosperity that exists in modern-day society would not have ensued. Similarly, without common consensus and concrete action dedicated to data protection, the future for the [data technology] economy would be dim,” the company said in a statement.

“We hereby promise to strictly abide by this pledge, and encourage the entire industry to collectively exercise the self-regulation that is vital in promoting the sustainable development of this data technology economy.”

The Natural Capital project deploys cloud, big data to better quantify the value of nature

Microsoft is teaming up with several US universities to use cloud and big data technologies to forward natural conservation efforts

The Natural Capital Project, a ten-year partnership between Stanford University, The Nature Conservancy, the World Wildlife Fund and the University of Minnesota to determine the economic value of natural landscapes, is using Microsoft’s cloud and big data technologies to help analyse and visualise data that can help municipal policy-makers improve the environment in and around cities.

The recently announced partnership will see Microsoft offer up a range of technologies to help the project’s researchers better analyse the features impacting natural ecosystems surrounding cities, and quantify the impact of natural disasters, development or how other dependencies are brought to bear on those ecosystems.

Mary Ruckelshaus, managing director of the Natural Capital Project told BCN the project is important because it will help demonstrate both how people depend on the environment and increase awareness of their impact on nature.

“City dwellers depend on nature in many ways: wetlands, marshes, and dunes protect them and their property from coastal flooding, trees and other vegetation filter particulates for clean air, and green spaces reduce temperature stress and improve cognitive function and mental health, just to name a few,” she said.

The researchers will collect data from a broad set of sources, including satellite imagery, remote sensors and social media, and use Microsoft Azure to model the data and deliver the results to a range of mobile devices.
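
As one hypothetical ingestion step, the sketch below uploads a small batch of field-sensor readings to Azure Blob Storage so they can be modelled downstream. The storage account, container name and data format are assumptions; the article does not specify which Azure services the project actually uses.

```python
# A minimal sketch of one possible ingestion step for the project described
# above; the container name, schema and connection string are placeholders.

import json
import os
from datetime import datetime, timezone

from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

readings = [
    {"site": "wetland-03", "metric": "water_level_cm", "value": 42.7},
    {"site": "dune-11", "metric": "vegetation_index", "value": 0.61},
]

def upload_readings(readings):
    client = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONN"])
    container = client.get_container_client("ecosystem-readings")
    blob_name = f"batch-{datetime.now(timezone.utc):%Y%m%dT%H%M%S}.json"
    container.upload_blob(blob_name, json.dumps(readings))
    return blob_name

if __name__ == "__main__":
    print("uploaded", upload_readings(readings))
```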

“Our focus with The Natural Capital Project is on enabling leaders in the public and private sector to have access to the best data, powerful analytic and visualization tools so that they can more deeply understand historical trends and patterns within the city or company, predict future situations, model “what-if” scenarios, and gain vital situational awareness from multiple data streams such as satellite imagery, social media and other public channels,” explained Josh Henretig, senior director of environmental sustainability at Microsoft.

“The increased prevalence and availability of data from satellite imagery, remote sensors, surveys and social media channels means that we can analyse, model and predict an extremely diverse set of properties associated with the ecosystems on which we depend,” he said.

Henretig explained to BCN that the Natural Capital Project is the first to try to quantify the economic and social value of natural capital, which means developing the models and tools needed to complete the analysis will be a challenging undertaking in itself.

“That is a huge, complex undertaking, without any precedent to guide it. As a result, we face the challenge of driving awareness that these tools and this knowledge is available for leaders to draw from. In addition, the sheer diversity of global ecosystems, shared ecosystems, their states of health or decline and differing local and regional priorities make creating tools that can be adapted to assess a variety of circumstances quite a challenge.”

While Henretig acknowledged that it’s often hard for municipal policy-makers to make long-term environmental decisions when people are struggling with more immediate needs, he said the Project will help generate both vital data on the economic value of natural systems and suggestions for how they can move forward in policy terms.

“In partnership with cities, we are going to help turn this data—produced across multiple systems for, among other things, buildings, transportation, energy grids, and forests, streams and watersheds—into actionable information and solutions,” he said, adding that the company hopes to apply the models and techniques generated by the research partners to other cities.

Data-as-a-service specialist Delphix scores $75m in latest round

Delphix secured $75m in its latest funding round this week

Data-as-a-service specialist Delphix announced this week that the company has concluded a $75m funding round that will be used by the company to bolster its cloud and security capabilities.

The funding round, led by Fidelity Management and Research Company, brings the total amount secured by the company to just over $119m since its 2008 founding.

Delphix offers what is increasingly referred to as data-as-a-service, though a more accurate description of what it does is data compression and replication as a service: the ability to virtualise, secure, optimise and move large databases – whether from an application like an ERP system or a data warehouse – from on-premise to the cloud and back again.

It offers broad support for most database technologies including Oracle, Oracle RAC, Oracle Exadata, Microsoft SQL Server, IBM DB2, SAP ASE, PostgreSQL, and a range of other SQL and NewSQL technologies.

The company said the additional funding will be used to expand its marketing activities and “aggressively invest” in cloud, analytics and data security technologies in a bid to expand its service capabilities.

“Applications have become a highly contested battleground for businesses across all industries,” said Jedidiah Yueh, Delphix founder and chief executive.

“Data as a Service helps our customers complete application releases and cloud migrations in half the time, by making data fast, light, and unbreakable—a huge competitive advantage,” he said.

Camden Council uses big data to help reduce fraud, save money

Camden Council is using big data to tackle fraud and save cash as its budgets slim

Camden Council is using a big data platform to create a ‘Residents Index’ to help tackle debt collection, illegal subletting and fraud.

The service, based on IBM’s InfoSphere platform, centrally stores and manages citizen data collected from 16 different systems across London – including data from Electoral Services, Housing and Council Tax Services – to help give a single view of local residents.

Authorised users can access the platform to search relevant data and highlight discrepancies in the information given to the Council by residents to help reduce fraud and save money on over-procurement of public services.
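
The sketch below illustrates the general idea of that kind of cross-system check, using pandas to join records from two hypothetical council systems and flag address discrepancies. The field names and matching key are invented, and IBM's InfoSphere platform performs far more sophisticated probabilistic matching than this.

```python
# Illustrative sketch of a cross-system discrepancy check: join records from
# two council systems on a resident identifier and flag mismatched addresses.
# Fields and keys are invented; not the Residents Index's actual data model.

import pandas as pd

council_tax = pd.DataFrame({
    "resident_id": [1, 2, 3],
    "address": ["1 High St", "2 Park Rd", "3 Mill Ln"],
})
electoral_roll = pd.DataFrame({
    "resident_id": [1, 2, 3],
    "address": ["1 High St", "9 Other Ave", "3 Mill Ln"],
})

merged = council_tax.merge(electoral_roll, on="resident_id", suffixes=("_tax", "_roll"))
discrepancies = merged[merged["address_tax"] != merged["address_roll"]]
print(discrepancies)
```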

It’s also using the Index to improve the accuracy of its electoral register. Using the platform, it said it was able to fast-track the registration of more than 80 per cent of its residents and identify new residents who still need to register to vote.

“Big data is revolutionising the way we work across the borough, reducing crime and saving money just when public services are facing huge funding cuts,” said Camden councillor Theo Blackwell.

“Take school admission fraud: parents complain about people gaming the system by pretending to reside in the borough to get their kids into the most sought-after schools. Now, with the Residents Index in place, Council staff can carry out detailed checks and identify previously hidden discrepancies in the information supplied to the Council to prove residency. We have already withdrawn five school places from fraudulent applicants, making sure that school places go fairly to those who are entitled to them.”

“The Resident Index has proven its worth, helping the Council to become more efficient, and now contains over one million relevant records. This is just one example and we have other plans to use the benefits of data technology to improve public services and balance the books.”

Early last year Camden Borough laid out its three-year plan to use more digital services in a bid to save money and improve the services it offers to local residents, which includes using cloud services to save on infrastructure costs and big data platforms to inform decision-making at the Council.

IBM calls Apache Spark “most important new open source project in a decade”

IBM is throwing its weight behind Apache Spark in a bid to bolster its IoT strategy

IBM said it will throw its weight behind Apache Spark, an open source community developing a processing engine for large-scale datasets, putting thousands of internal developers to work on Spark-related projects and contributing its machine learning technology to the code ecosystem.

Spark, an Apache open source project born in 2009, is essentially an engine that can process vast amounts of data very quickly. It runs in Hadoop clusters through YARN or as a standalone deployment and can process data in HDFS, HBase, Cassandra, Hive, and any Hadoop InputFormat; it currently supports Scala, Java and Python.

It is designed to perform general data processing (like MapReduce), but one of the exciting things about Spark is that it can also handle new workloads like streaming data, interactive queries, and machine learning – making it a good match for Internet of Things applications, which is why IBM is so keen to go big on supporting the project.
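
For a sense of what this looks like in practice, here is a minimal PySpark sketch combining a batch-style aggregation with a simple machine-learning step. The file path and column names are placeholders, and the streaming side of Spark is omitted for brevity.

```python
# A minimal PySpark sketch of the workloads described above: load a dataset,
# run an interactive-style aggregation, and fit a simple machine-learning
# model. Paths and column names are placeholders.

from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression

spark = SparkSession.builder.appName("spark-sketch").getOrCreate()

# Batch / interactive query: aggregate sensor readings by device
readings = spark.read.json("hdfs:///data/device_readings.json")  # placeholder path
readings.groupBy("device_id").avg("temperature").show()

# Machine learning: predict temperature from two other readings
features = VectorAssembler(inputCols=["humidity", "pressure"], outputCol="features")
train = features.transform(readings).select("features", "temperature")
model = LinearRegression(labelCol="temperature").fit(train)
print(model.coefficients)

spark.stop()
```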

The company said the technology brings huge advances when processing massive datasets generated by Internet of Things devices, improving the performance of data-dependent apps.

“IBM has been a decades long leader in open source innovation. We believe strongly in the power of open source as the basis to build value for clients, and are fully committed to Spark as a foundational technology platform for accelerating innovation and driving analytics across every business in a fundamental way,” said Beth Smith, general manager, analytics platform, IBM Analytics.

“Our clients will benefit as we help them embrace Spark to advance their own data strategies to drive business transformation and competitive differentiation,” Smith said.

In addition to backing Spark, IBM said it would build the technology into the majority of its big data offerings, and offer Spark-as-a-Service on Bluemix. It also said it will open source its IBM SystemML machine learning technology, and collaborate with Databricks, a Spark-as-a-Service provider, to advance Spark’s machine learning capabilities.