© Ryoji Ikeda, data.tron [8k enhanced version], audiovisual installation, 2008-09. Photo by Liz Hingley
Every minute, 2 million searches are made, half a billion links are shared and 48 hours of footage are uploaded. That is a lot of data. And yet it barely scratches the surface of how much is being produced worldwide. That is Big Data.
Big Data is the term used to describe data sets that are so large and complex that it takes a phenomenal amount of processing power to interrogate them. So why do it?
Fourteen seconds before the 2011 earthquake struck Japan, every bullet train and every factory came to a halt. Many lives were saved thanks to the Quake-Catcher Network.
This network is made up of thousands of laptops running free software in the background. The software makes use of the built-in motion sensors designed to protect the hard drive if the laptop is dropped. When an earthquake hits, thousands of those sensors in the affected region trigger at almost the same moment. By continuously aggregating and processing the readings from all of them, it is possible to brace for impact before the strongest shaking arrives.
Fourteen seconds before, as it turns out.
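To make the idea concrete, here is a minimal sketch of that kind of correlated-trigger detection: each laptop reports a timestamped acceleration spike, and an event is declared when enough reports from the same region land within a short window. This is an illustration only, not the Quake-Catcher Network's actual code; the thresholds, the `SensorReport` structure and the region grouping are assumptions.

```python
# Illustrative sketch: declare an "event" when many sensors in the same
# region report a strong acceleration spike within a short time window.
# Thresholds and the report format are assumptions, not the real system.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class SensorReport:
    region: str          # coarse location of the laptop
    timestamp: float     # seconds since epoch
    acceleration: float  # peak acceleration, in g

SPIKE_THRESHOLD_G = 0.05  # ignore readings weaker than this
WINDOW_SECONDS = 2.0      # spikes must land within this window
MIN_SENSORS = 50          # how many laptops must agree

def detect_events(reports):
    """Return (region, time) pairs where enough sensors spiked together."""
    spikes = defaultdict(list)
    for r in reports:
        if r.acceleration >= SPIKE_THRESHOLD_G:
            spikes[r.region].append(r.timestamp)

    events = []
    for region, times in spikes.items():
        times.sort()
        start = 0
        # slide a window over the sorted timestamps
        for end in range(len(times)):
            while times[end] - times[start] > WINDOW_SECONDS:
                start += 1
            if end - start + 1 >= MIN_SENSORS:
                events.append((region, times[start]))
                break
    return events
```

A single dropped laptop produces one spike and is ignored; only a burst of simultaneous spikes across a region looks like an earthquake, which is what makes the aggregation valuable.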
In an increasingly connected world, our ability to capture and store data is staggering. We have sensors in everything, from running shoes to mobile phones. We are divulging more and more personal information to social networks. We supply more and more customer data to retailers, on and offline. Around 90 per cent of the data in the world today has been created in the last two years alone. Thanks to the Web, we have gone from information scarcity to information overload in two decades.
Big Data needs big computers to process it. The algorithms that crunch Big Data require thousands of servers running in parallel. Currently, only governments and web giants like Google and Amazon have the necessary resources.
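As a toy illustration of the divide-and-aggregate pattern those parallel algorithms rely on (popularised by Google's MapReduce), here is a sketch that counts words across chunks of text using local worker processes; a real deployment would spread the same map and reduce steps across thousands of machines. The word-count task and the chunking are illustrative assumptions, not a description of any particular company's system.

```python
# Toy illustration of the map/reduce pattern behind many Big Data jobs:
# split the input, process the pieces in parallel, then merge the results.
# Here the "cluster" is just local worker processes counting words.
from collections import Counter
from multiprocessing import Pool

def map_count(chunk: str) -> Counter:
    """Map step: count words in one chunk of text."""
    return Counter(chunk.lower().split())

def reduce_counts(partials) -> Counter:
    """Reduce step: merge the per-chunk counts into one total."""
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    chunks = [
        "big data needs big computers",
        "thousands of servers running in parallel",
        "big computers crunch big data",
    ]
    with Pool() as pool:
        partial_counts = pool.map(map_count, chunks)
    print(reduce_counts(partial_counts).most_common(3))
```

The point of the pattern is that the map step never needs to see the whole data set at once, so adding more machines lets the same job scale to data far too large for any single computer.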
Barack Obama got elected off the back of it. Twice. By unifying vast commercial, political and social databases, his team was able to understand and influence individual swing voters. Google uses it to predict flu outbreaks, identify human trafficking hot spots and sell advertising.
When the Web was first conceived, it was intended to be more than an interconnected repository of information. The ultimate aim was the Semantic Web, a Web that drew meaning from information. Big Data is half the equation.
Extracted from 100 Ideas That Changed the Web by Jim Boulton, published by Laurence King Publishing, £9.95.