All posts by Jamie Davies

What is the promise of big data? Computers will be better than humans

Big data as a concept has in fact been around longer than computer technology, which may surprise a number of people.

Back in 1944, Wesleyan University librarian Fremont Rider wrote a paper estimating that American university libraries were doubling in size every sixteen years, meaning the Yale Library of 2040 would occupy over 6,000 miles of shelves. This is not big data as most people would know it, but the rapid increase in the quantity and variety of information in the Yale library follows the same principle.
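The arithmetic behind Rider's projection is simple compounding: a collection that doubles every sixteen years grows 64-fold over roughly a century. A minimal sketch of that calculation (the starting shelf length below is a made-up figure for illustration, not a number from Rider's paper):

```python
def shelves_after(years, start_miles, doubling_period=16):
    """Shelf miles after `years`, doubling every `doubling_period` years."""
    return start_miles * 2 ** (years / doubling_period)

# Six doubling periods (96 years) multiplies the collection 64-fold:
growth = shelves_after(96, 1)
print(growth)  # 64.0
```

The same exponential shape is what makes the modern data-growth problem feel familiar to anyone who has read Rider.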

The concept was not called big data back then, but technologists today face the same challenge of handling a vast amount of information: not necessarily how to store it, but how to make use of it. The promise of big data, and data analytics more generally, is to provide intelligence, insight and predictability, but only now are we reaching a stage where technology is advanced enough to capitalise on the vast amount of information available to us.

Google published its papers on the Google File System (2003) and MapReduce (2004), which are generally credited as the beginning of the Apache Hadoop platform. At that point, few people could anticipate the explosion of technology we have since witnessed. Cloudera Chairman and CSO Mike Olson is one of those people, and he now leads a company regularly cited as one of the go-to organizations for the Apache Hadoop platform.
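For readers unfamiliar with the model those papers describe, MapReduce boils down to three phases: a map step emits key-value pairs, a shuffle groups them by key, and a reduce step aggregates each group. A single-process toy sketch of the idea, using word counting as the classic example (Hadoop itself distributes each phase across a cluster; this is only the shape of the algorithm):

```python
from collections import defaultdict

def map_phase(document):
    # Emit (word, 1) for every word in the input document.
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    # Group all emitted values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate each group; for word counting, sum the counts.
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data big ideas", "big data at scale"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
print(counts["big"])  # 3
```

The appeal of the model is that the map and reduce steps are independent per key, so a framework like Hadoop can run them on thousands of machines without the programmer managing the distribution.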

“We’re seeing innovation in CPUs, in optical networking all the way to the chip, in solid state, highly affordable, high performance memory systems, we’re seeing dramatic changes in storage capabilities generally. Those changes are going to force us to adapt the software and change the way it operates,” said Olson, speaking at the Strata + Hadoop event in London. “Apache Hadoop has come a long way in 10 years; the road in front of it is exciting but is going to require an awful lot of work.”

Analytics was previously seen as an opportunity for companies to look back at their performance over a defined period and develop lessons for employees on how future performance could be improved. Today, advanced analytics is applied to real-time performance: a company can react in real-time to shift the focus of a marketing campaign, or alter a production line to improve the outcome. The promise of big data and IoT is predictability and data-defined decision making, which can shift a business from a reactionary position to a predictive one. Understanding trends can create proactive business models which advise decision makers on how to steer a company. But what comes next?


Cloudera Chairman and CSO Mike Olson

For Olson, machine learning and artificial intelligence are where the industry is heading. We’re at a stage where big data and analytics can be used to automate processes and replace humans for simple tasks. In a short period of time we’ve seen some significant advances in applications of the technology, most notably Google’s AlphaGo beating world Go champion Lee Se-dol and Facebook’s use of AI in picture recognition.

Computers taking on humans in games of strategy is not a new PR stunt; IBM’s Deep Blue defeated chess world champion Garry Kasparov in 1997. But this is a very different proposition. While chess is a game which relies on strategy, Go is another beast: due to the vast number of permutations available, strategies within the game rely on intuition and feel, a complex task for the Google team. The fact AlphaGo won the match demonstrates how far researchers have progressed in making machine learning and artificial intelligence a reality.

“In narrow but very interesting domains, computers have become better than humans at vision and we’re going to see that piece of innovation absolutely continue,” said Olson. “Big Data is going to drive innovation here.”

This may be difficult for a number of people to comprehend, but big data has entered the business world; true AI and automated, data-driven decision making may not be too far behind. Data is driving the direction of businesses by providing a better understanding of the customer, increasing the security of an organization, and offering a clearer view of the risk associated with any business decision. Big data is no longer a theory, but an established business strategy.

Olson is not saying computers will replace humans, but the number and variety of processes which can be handed to machines is certainly growing, and growing faster every day.

IBM makes software defined infrastructure smarter

IBM has expanded its portfolio of software-defined infrastructure solutions, adding cognitive features to speed up analysis of data, integrate Apache Spark and help accelerate research and design, the company claims.

The new offering, called IBM Spectrum Computing, is designed to aid companies in extracting full value from their data by adding scheduling capabilities to the infrastructure layer. The product offers workload and resource management features to research scientists for high-performance research, design and simulation applications. The new proposition focuses on three areas.

Firstly, Spectrum Computing works with cloud applications and open source frameworks to assist in sharing resources between the programmes to speed up analysis. Secondly, the company believes it makes the adoption of Apache Spark simpler. And finally, the ability to share resources will accelerate research and design by up to 150 times, IBM claims.

By incorporating the cognitive computing capabilities into the software-defined infrastructure products, IBM believes the concept on the whole will become more ‘intelligent’. The scheduling competencies of the software will increase compute resource utilization and predictability across multiple workloads.

The software-defined data centre market has been growing steadily, and is forecast to continue its healthy growth over the coming years. Research has highlighted the market could be worth in the region of $77.18 billion by 2020, growing at a CAGR of 28.8% from 2015 to 2020. Growth is driven primarily by the attraction of simplified scalability, as well as interoperability. North America and Asia are expected to hold the biggest market share worldwide, though Europe is expected to grow at a faster rate.

“Data is being generated at tremendous rates unlike ever before, and its explosive growth is outstripping human capacity to understand it, and mine it for business insights,” said Bernard Spang, VP for IBM Software Defined Infrastructure. “At the core of the cognitive infrastructure is the need for high performance analytics of both structured and unstructured data. IBM Spectrum Computing is helping organizations more rapidly adopt new technologies and achieve greater, more predictable performance.”

IBM and Cisco combine to deliver IoT insight on the network edge

IBM and Cisco have extended a long-standing partnership to enable real-time IoT analytics and insight at the point of data collection.

The partnership will focus on combining the cognitive computing capabilities of IBM’s Watson with Cisco’s analytics competencies to support data action and insight at the point of collection. The team is targeting companies that operate in remote environments or on the network edge, for example oil rigs, where time is of the essence but access to the network can be limited or intermittent.

The long-standing promise of IoT has been to increase the amount of data organizations can collect, which, once analysed, can be used to gain a greater understanding of a customer, environment or asset. Cloud computing offers organizations an opportunity to realize the potential of real-time insight, but for those with remote assets, where access to high-bandwidth connectivity is not a given, the promise has always been out of reach.

“The way we experience and interact with the physical world is being transformed by the power of cloud computing and the Internet of Things,” said Harriet Green, GM for IBM Watson IoT Commerce & Education. “For an oil rig in a remote location or a factory where critical decisions have to be taken immediately, uploading all data to the cloud is not always the best option.

“By coming together, IBM and Cisco are taking these powerful IoT technologies the last mile, extending Watson IoT from the cloud to the edge of computer networks, helping to make these strong analytics capabilities available virtually everywhere, always.”

IoT insight at the point of collection has been an area of interest to enterprises for a number of reasons. Firstly, by decreasing the quantity of data which has to be moved, transmission costs and latency are reduced and the quality of service is improved. Secondly, the bottleneck of traffic at the network core can potentially be removed, reducing the likelihood of failure. And finally, the ability to virtualize on the network edge can extend the scalability of an organization.

ABI Research has estimated that 90% of the data collected through IoT-connected devices is stored or processed locally, making it inaccessible for real-time analytics unless it is transferred to another location for analysis. As the number of these devices increases, the quantity of data which must be transferred, stored and analysed also increases. The cost of data transmission and storage could soon prohibit some organizations from achieving the goals of IoT. The new team is hoping the combination of Cisco’s edge analytics capabilities and the Watson cognitive solutions will enable real-time analysis at the scene, removing a number of these challenges.

“Together, Cisco and IBM are positioned to help organizations make real-time informed decisions based on business-critical data that was often previously undetected and overlooked,” said Mala Anand, SVP of the Cisco Data & Analytics Platforms Group. “With the vast amount of data being created at the edge of the network, using existing Cisco infrastructure to perform streaming analytics is the perfect way to cost-effectively obtain real-time insights. Our powerful technology provides customers with the flexibility to combine this edge processing with the cognitive computing power of the IBM Watson IoT Platform.”

EMC launches storage provisioning framework for containers

EMC has announced the launch of libStorage, an open source, vendor- and platform-agnostic storage framework released through the EMC {code} program.

Containers have been one of the biggest buzzwords to hit the IT industry through 2015 and 2016, but unifying individual container platforms has been a challenge for developers. While several container platforms may be running in one environment, each has its own language, requiring users to treat them as silos. EMC believes libStorage is the solution.

The offering is claimed to provide orchestration through a common model and API, creating a centralized storage capability for a distributed, container-driven ecosystem. libStorage aims to create one storage language which speaks to all container platforms, and one common method of support.

“The benefits of container technology are widely recognized and gaining ground all the time,” said Josh Bernstein, VP of Technology at EMC {code}. “That provides endless opportunity to optimize containers for IT’s most challenging use cases. Storage is a critical piece of any technology environment, and by focusing on storage within the context of open source, we’re able to offer users—and storage vendors—more functionality, choice and value from their container deployments.”

The offering, which is available on GitHub, will support Cloud Foundry, Apache Mesos, DC/OS, Docker and Kubernetes.

“DC/OS users—from startups to large enterprises—love the portable container-operations experience our technology offers, and it’s only natural they would desire a portable storage experience as well,” said Tobias Knaup, CTO at Mesosphere. “libStorage promises just this, ensuring users a consistent experience for stateful applications via persistent storage, whatever container format they’re running.”

Box sets target on US government and Europe following 37% growth in Q1

Box co-founder and chief executive Aaron Levie briefing journalists and analysts in London this week


Box has reported healthy growth over the last quarter, increasing revenues 37% to $90.2 million, which the company has attributed to a more diversified portfolio. Public sector organizations and the European market are now in the crosshairs for future growth.

The US government is an area which has seemingly been prioritized by CEO Aaron Levie and the Box team, following the announcement that Box for Government has achieved FedRAMP certification from the Department of Defense. As the Department of Defense applies some of the highest degrees of scrutiny to cloud platforms and technology, the team believes the certification will create a ripple effect throughout the US.

As a number of state and local government agencies lean on federal standards for guidance on what cloud technologies to adopt, the certification could lead to positive strides for the company. Levie highlighted the certification, as well as the partnership with IBM, has created a healthy sales pipeline for the team over recent months in the public sector segment.

The company added more than 5,000 customers to its ranks over the period, taking the total number to more than 62,000 businesses. Box now has 46 million users worldwide, of which 13% are now paying. Levie also highlighted work on its customer services processes has paid off over the quarter as customer churn rate is now below 3%.

“In Q1 we achieved record revenue of $90 million, up 37% year over year,” said Levie. “We also continue to gain operational efficiency and demonstrate leverage in our business model as we move towards our commitment to achieve positive free cash flow in the fourth quarter and in January 2017. Looking ahead, underlying demand for Box remains very strong and our competitive position in the market has never been better.

“We created record sales pipeline in the quarter with several seven-figure deals in the mix. This has been driven by the growing demand for a modern approach to enterprise content management, our differentiated product offerings and our maturing partnerships that are becoming an integral part of our go-to-market strategy.”

Box’s expansion strategy over recent months has been built upon the diversification of its product portfolio, as well as its partner ecosystem. Firstly, from a product perspective, the team launched Box Zones, which enables organizations to dictate where data is stored around the world. This offering was brought about through the partnership with IBM.

Data residency has proven a sensitive area in recent months, following the decision of the Court of Justice of the European Union to strike down the Safe Harbour agreement and the subsequent criticism its successor, the EU-US Privacy Shield, has received. The Box Zones offering would appear to be the company’s means of negating data residency concerns by removing the need for transatlantic data transmission. The team claims the offering has not only gained traction with new customers, but also created a number of upselling opportunities among companies that operate in regions where data protection rules are more stringent than in the US.

Beyond Box Zones, the team has also launched a number of new offerings, including its Governance product, KeySafe and the aforementioned Box for Government. As well as creating new opportunities in the US, this product diversification has been credited with growth in new regions, a key pillar of Box’s expansion plans.

From a partner ecosystem perspective, the quarter saw a number of new announcements, as well as positive wins from longer-standing relationships. Box announced a new partnership with Adobe in April, aiming to simplify working with digital documents in the cloud, though Levie was particularly focused on the relationship with Microsoft, which has yielded positive results throughout the quarter.

“And nowhere is our ecosystem strategy more relevant than our partnership with Microsoft which continues to yield significant dividends,” said Levie. “For the first time ever customers can now collaboratively edit their Office documents that are stored in Box or edit them on their iPad or iPhone. Adoption of Office 365 continues to be a key driver for new customers to invest in Box as well as allow existing customers to expand their usage of Box.”

Partnerships currently influence around 20% of Box’s revenues, with partners including Microsoft, AT&T and IBM. The partnership with IBM has been particularly successful in the company’s drive into Europe, where the option to store data in Big Blue’s German and Irish data centres is attractive, according to Levie.

Singapore tests out its green finger on data centres

Singapore has continued its drive towards becoming the world’s smartest nation by announcing trials of a Tropical Data Centre (TDC), which could potentially reduce data centre energy consumption by up to 40%.

The Infocomm Development Authority of Singapore (IDA), the government body responsible for the development and growth of the infocomm sector, will conduct the proof of concept (PoC) with the aim of creating a data centre which can operate at up to 38 degrees Celsius and humidity levels of up to 90%. Data centres are generally cooled to between 20 and 25 degrees Celsius, and operate efficiently at humidity between 50% and 60%. The PoC will be conducted with simulated data, creating various ‘live’ conditions such as peak surges or transfers of data.

The trial is part of the IDA’s Green Data Centre Programme which was launched in 2014 and aims to create a more energy efficient data centre, as well as guidelines for more sustainable computing, through the implementation of emerging technologies. Partners of the programme include Dell, ERS, Fujitsu, Hewlett Packard Enterprise, Huawei, Intel, Keppel Data Centres, The Green Grid, and Nanyang Technological University.

“With Singapore’s continued growth as a premium hub for data centres, we want to develop new technologies and standards that allow us to operate advanced data centres in the most energy efficient way in a tropical climate,” said Khoong Hock Yun, Assistant Chief Executive of the IDA. “New ideas and approaches, such as raising either the ambient temperature or humidity, will be tested to see if these can greatly increase our energy efficiency, with insignificant impact on the critical data centre operations.

“To create new value in our Smart Nation journey, we need to embrace an attitude of experimentation, to be willing to develop new ideas together, and test the feasibility of progressive and positive technological advancements that has a good possibility to enhance our industry’s competitiveness.”

The IDA has run a number of initiatives over recent years in its quest for Singapore to be named the world’s first ‘Smart Nation’. The country has already received an impressive number of accolades, including world’s fastest broadband nation according to Ookla, and top and fastest-changing digital economy according to Tufts University. Singapore is currently being impacted by a number of global trends, including population growth, emissions and mobility, which are driving its smart nation efforts.

Singapore is one of the most densely populated nations in the world, with nearly 8,000 people per square kilometre, and these numbers are expected to rise. This is having a substantial impact on emission levels, traffic, mobility, employment and energy demands in the city state. Singapore’s response has been to create a nation state driven by big data and analytics technologies, and next-generation connected and sensor networks. The new initiatives have seemingly had a positive impact on innovation in the city, as the number of start-ups has increased from 24,000 in 2005 to 55,000 in 2014.

EU-US privacy debate continues as EDPS says try again

Ongoing efforts to provide clarity and guidance on transatlantic data transmission are unlikely to conclude soon, as the European Data Protection Supervisor (EDPS) has outlined concerns over the robustness of the Safe Harbour successor, the EU-US Privacy Shield.

European Data Protection Supervisor Giovanni Buttarelli outlined his concerns over whether the proposed agreement will provide adequate protection against indiscriminate surveillance, as well as its obligations on oversight, transparency, redress and data protection rights.

“I appreciate the efforts made to develop a solution to replace Safe Harbour but the Privacy Shield as it stands is not robust enough to withstand future legal scrutiny before the Court,” said Buttarelli. “Significant improvements are needed should the European Commission wish to adopt an adequacy decision, to respect the essence of key data protection principles with particular regard to necessity, proportionality and redress mechanisms. Moreover, it’s time to develop a longer term solution in the transatlantic dialogue.”

This is in fact the second time in a matter of months an official body has expressed concerns over the EU-US Privacy Shield, as the Article 29 Working Party (WP29) previously voiced concerns over the mass surveillance and oversight shortcomings it believes are found in the pact. Back in April, WP29 commented that the Privacy Shield had made progress, but still hadn’t covered the cracks which saw Safe Harbour struck down last year.

“The WP29 notes the major improvements the Privacy Shield offers compared to the invalidated Safe Harbour decision. Given the concerns expressed and the clarifications asked, the WP29 urges the Commission to resolve these concerns, identify appropriate solutions and provide the requested clarifications in order to improve the draft adequacy decision and ensure the protection offered by the Privacy Shield is indeed essentially equivalent to that of the EU,” said the WP29 group in its official opinion at the time.

The new Privacy Shield agreement does encourage European businesses and organizations to be more considered and conservative when sharing data with US entities; however, critics have highlighted that there are still too many exceptions through which the US and its intelligence agencies can work around the agreement.

While the opinion of WP29 is respected throughout the industry, it was not a concrete sign that anything within the agreement would change, and the same is true of the EDPS. There are no guarantees the agreement will be amended following Buttarelli making his opinion public, though it may be a good indicator of what needs to be done to ensure the pact stands up to scrutiny from the European Court of Justice. This is certainly the view of David Mount, Director of Security Solutions at Micro Focus.

“Buttarelli talks of a need for significant improvements before the agreement can be viable, which raises a key point around the self-certification aspects of Safe Harbour as it once was,” said Mount. “In the past, businesses could self-certify as compliant with Safe Harbour by simply ticking a box. But this does not create a transparent and trusting climate – in fact it does the very opposite, as is the case in any self-regulated environment.

“Any new agreement must be more robust, as per Buttarelli’s comments, and addressing the key issue of self-certification would be a significant step. It will be interesting to see how the EU Commission responds to the EDPS and how negotiations will continue to address the varying issues of self-certification and trust.”

Support for the agreement has been mixed: some European corners have voiced concerns, while some US opinions have been relatively positive, though this may be considered unsurprising. MEP Jan Philipp Albrecht and Edward Snowden were two who demonstrated a critical stance (see accompanying picture), while Microsoft became one of the first major US tech companies to confirm its support of the EU-US Privacy Shield.

Back in April, John Frank, Vice President of EU Government Affairs at Microsoft, said: “We recognize that privacy rights need to have effective remedies. We have reviewed the Privacy Shield documentation in detail, and we believe wholeheartedly that it represents an effective framework and should be approved.”

Although Microsoft has demonstrated a desire to bring the issue to an end, it has also found itself on the wrong side of data requests from the US government, proving it’s no pushover. The company has been involved in a drawn-out lawsuit, having refused the US government access to data which it has stored in its Dublin data centre, telling the government it “must respect the sovereignty of other countries”.

The company has also filed a lawsuit against the US government and its associated agencies, arguing that customers should have the right to know when the state accesses their emails or records, as well as creating the Data Trustee model. The model is seemingly an effort to rebuild trust in US business, as it hands control of data over to a European company, in this case Deutsche Telekom, which has to give consent before a Microsoft employee can access the data.

“Businesses have already started looking to alternatives for legitimate data transfers out of the EU in case the Privacy Shield option, once formally adopted, should be taken away,” said Deema Freij, Global Privacy Officer at Intralinks. “For example, Binding Corporate Rules and EU Model Clauses are still seen as strong alternatives. Businesses have been switching to EU Model Clauses to transfer personal data to the US, which they can continue to do on an ongoing basis.

“The responsibility for businesses is only going to increase when the General Data Protection Regulation (GDPR) comes into full effect in May 2018. The next two years will be a huge test for organisations across the world as they begin to realise that data sharing practices will continue to fall under close scrutiny as the concept of data privacy evolves further.”

The EU-US Privacy Shield has made progress in addressing the concerns voiced by European citizens, companies and legislative bodies in recent months, though it is unlikely to be the final answer. In three months, two separate, independent and widely respected opinions have highlighted the shortcomings of the agreement, which doesn’t inspire a huge level of confidence. How the Privacy Shield’s creators react is yet to be seen, though it could be one of the deciding factors in how long the transatlantic data transmission argument continues.

UK Competition and Markets Authority gives cloud providers a telling off

The Competition and Markets Authority (CMA) is concerned that a proportion of cloud storage providers could be breaching consumer protection law in their terms and conditions, as well as their business practices.

Alongside the report, the CMA has sent an open letter to all cloud providers outlining guidance on how they can ensure they remain compliant with the Consumer Rights Act 2015, as well as advice to consumers on the topic.

The concerns are mainly focused around three areas. Firstly, some cloud providers are currently able to change the service or the terms of the contract without giving customers prior notice. Secondly, the cloud provider currently has the ability to suspend or terminate the contract without notice for any reason. And finally, cloud providers are able to automatically renew a contract at the end of a fixed term without giving notice or withdrawal rights.

“Cloud storage offers a convenient means of keeping family photos, favourite music and films and important documents safe, and accessing them quickly from any device,” said Nisha Arora, CMA Senior Director for Consumer. “Our review found that people find these services really valuable. However, we also heard some complaints resulting from unfair terms in contracts. If left unchanged, these terms could result in people losing access to their treasured possessions or facing unexpected charges.

“In this rapidly-developing market, it’s important that we act now to ensure that businesses comply with the law and that consumers’ trust in these valuable services is maintained. We welcome the fact that a number of companies have already agreed to change their terms, and expect to see improvements from other companies.”

Although the CMA has not confirmed which cloud providers were potentially in breach of consumer protection law, it did comment that Dixons Carphone, JustCloud and Livedrive have committed to changing their terms, as well as their business practices.

The CMA also commented that, while it is confident there will not be further breaches of consumer protection law following the report, any non-compliance in the future could lead to enforcement action, and the CMA could apply to a court for an enforcement order. Breaching such an order could constitute contempt of court and lead to an unlimited fine.

Intel continues to innovate through Itseez acquisition

Intel has continued its strides into the IoT market through the acquisition of Itseez, a computer vision and machine learning company.

Itseez, which was founded by two former Intel employees, specializes in computer vision algorithms and implementations, which can be used for a number of applications including autonomous driving, digital security and surveillance, and industrial inspection. The Itseez deal bolsters Intel’s ability to develop technology which electronically perceives and understands images.

“As the Internet of Things evolves, we see three distinct phases emerging,” said Doug Davis, GM for the Internet of Things Group at Intel. “The first is to make everyday objects smart – this is well underway with everything from smart toothbrushes to smart car seats now available. The second is to connect the unconnected, with new devices connecting to the cloud and enabling new revenue, services and savings. New devices like cars and watches are being designed with connectivity and intelligence built into the device.

“The third is just emerging when devices will require constant connectivity and will need the intelligence to make real-time decisions based on their surroundings. This is the ‘autonomous era’, and machine learning and computer vision will become critical for all kinds of machines – cars among them.”

The acquisition strengthens Intel’s position in the potentially lucrative IoT segment, as the company continues its efforts to diversify its reach and enter new growth markets. Last month, CEO Brian Krzanich outlined the organization’s new strategy, which is split into five sections: cloud technology, IoT, memory and programmable solutions, 5G, and developing new technologies under the concept of Moore’s Law. Efforts have focused on changing the perception of Intel from a PC and mobile device brand to one built on a foundation of emerging technologies.

Intel would appear to have decided that innovation through acquisition is a safer bet than organic, in-house innovation. There have been a small number of examples of successful organic diversification, Apple’s iPhone being one, but the safer route away from a core competence is generally through acquisition.

Intel has dipped its toe into organic diversification before, attempting to develop a portfolio of chips for mobile devices, though this would generally not be considered a successful venture, much like Google's continued efforts to organically grow into social, which could be seen as stuttering. By contrast, Google's advertising revenues now account for $67.39 billion (2015), with its platform built almost entirely on acquisitions. The AdSense and AdWords services have been built and bolstered through various purchases including Applied Semantics ($102 million in 2003), dMarc Broadcasting ($102 million in 2006), DoubleClick ($3.1 billion in 2007), AdMob ($750 million in 2009) and Admeld ($400 million in 2011).

While diversification through acquisition can be seen as the safer, more practical and efficient means of moving into new markets, it is by no means a guaranteed strategy. Intel's approach looks a sensible option, as there are far more examples of successful diversification through acquisition than through organic growth. The jury is still out on Intel's position in the IoT market, but the company is backing the tried and tested route to diversification.

What did we learn from PwC’s local government survey?

PwC has recently released findings from its annual survey, The Local State We're In, which assesses the challenges facing local government and its responses to them, as well as looking at public opinion on these organizations' capabilities.

Here are four of the lessons we learnt from the report:

Data Analytics is top of the agenda for CEOs and Local Government Leaders

A healthy 91% of the Chief Execs surveyed confirmed data analytics was an area in which they were well equipped. This was in fact the most popular answer to this particular question, as other areas such as business intelligence (59%), supply chain management (55%) and information governance & records management (40%) fared less well.

While it is encouraging that leaders are confident in their teams' ability to perform in the data analytics world, the research also found local government's use of structured and unstructured data varies considerably. 71% of the Chief Execs agreed they were using structured data (e.g. information in government-controlled databases), whereas this number drops to 33% when unstructured data (e.g. social media and data generated through search engines) is the focal point of the question.

As consumers continue their drive towards digital and the connected world, the level of insight which can be derived from unstructured data, social media in particular, will only increase. Back in 1998 Merrill Lynch estimated that 80-90% of all potentially usable business information may originate in unstructured form. This rule of thumb is not based on primary or any quantitative research, but is still accepted by some in the industry. Even if the true figure is lower, a vast amount of information and insight is being missed by local government.

But data driven decision making isn’t

Throughout the industry, data-driven decision making has been seen as one of the hottest growing trends, and also as the prelude to the introduction of artificial intelligence.

Despite the media attention such ideas are receiving, these trends do not appear to be translating through to local government. Only 41% of respondents said their organization uses data analytics to inform decision making and strategy. It would appear local government is quite effective (or at least confident) at managing data, but not so much at using it for insight.

Public is not confident in local government's ability to embrace digital

Although leadership within the local authorities themselves is happy with the manner in which their organizations have embraced digital, this confidence is not shared by the general public.

76% of Chief Execs who participated in the research are confident in their own digital strategies, yet only 23% of the general public are confident in their council's ability to manage the transition to digital. This is down from 28% in the same survey in 2015 and 29% in 2014. The findings could demonstrate the rigidity of government bodies, especially at a local level, as the evolution of emerging technologies appears to be outstripping local government's ability to incorporate these new ideas and tools.

There is also a significant difference in how the public and the Chief Execs view cyber security. While only 17% of the Chief Execs believe their organization is at risk from cyber threats, 70% of the general public are not confident local government will be able to manage and share their personal information appropriately. 2016 has already seen a number of high-profile data breaches which could be shaping public opinion: if tech-savvy enterprise organizations such as TalkTalk cannot defend themselves, public sector organizations may be perceived as even less able to do so.

However, local government does have the backing from the public to invest in digital

The general public may not currently have great confidence in local government's ability to embrace the digital age, but they have seemingly given their blessing for local government to continue investing.

39% of the general public who completed the survey said their preferred means of engaging with local government would be through a digital platform, compared to the 24% who would prefer the telephone and 28% who would rather engage in person. Unfortunately, while digital is the most popular option for engagement, only 37% were satisfied with current digital access to local government, down from 38% in last year's research.