Category Archives: data analytics

What is the promise of big data? Computers will be better than humans

Big data as a concept has in fact been around longer than computer technology, which may surprise a number of people.

Back in 1944 Wesleyan University Librarian Fremont Rider wrote a paper which estimated American university libraries were doubling in size every sixteen years, meaning the Yale Library of 2040 would occupy over 6,000 miles of shelves. This is not big data as most people would know it, but the rapid increase in the quantity and variety of information in the Yale library follows the same principle.

The concept was not known as big data back then, but technologists today face a similar challenge in handling such a vast amount of information: not necessarily how to store it, but how to make use of it. The promise of big data, and data analytics more generally, is to provide intelligence, insight and predictability, but only now are we reaching a stage where technology is advanced enough to capitalise on the vast amount of information available to us.

Back in 2003 and 2004 Google published papers on its Google File System and MapReduce technologies, which are generally credited as the starting point of the Apache Hadoop platform. At that point, few people could have anticipated the explosion of technology we've since witnessed. Cloudera Chairman and CSO Mike Olson is one of the few who did, and he now leads a company regularly cited as one of the go-to organizations for the Apache Hadoop platform.
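For readers unfamiliar with the programming model those papers describe, the idea is simple to sketch. The following is a minimal, illustrative word count in Python (not Hadoop code itself): the map step emits key-value pairs, the framework groups them by key, and the reduce step aggregates each group; Hadoop's contribution was distributing exactly this pattern across large clusters.

from collections import defaultdict

# Minimal illustration of the MapReduce model: map emits (key, value) pairs,
# the framework groups them by key, and reduce aggregates each group.
# A real Hadoop job distributes these steps across many machines.

def map_phase(document):
    """Emit (word, 1) for every word in the input document."""
    for word in document.lower().split():
        yield (word, 1)

def shuffle(pairs):
    """Group intermediate values by key, as the framework would."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    """Sum the counts emitted for a single word."""
    return (key, sum(values))

documents = ["big data is big", "data drives decisions"]
intermediate = [pair for doc in documents for pair in map_phase(doc)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(intermediate).items())
print(counts)  # {'big': 2, 'data': 2, 'is': 1, 'drives': 1, 'decisions': 1}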

“We’re seeing innovation in CPUs, in optical networking all the way to the chip, in solid state, highly affordable, high performance memory systems, we’re seeing dramatic changes in storage capabilities generally. Those changes are going to force us to adapt the software and change the way it operates,” said Olson, speaking at the Strata + Hadoop event in London. “Apache Hadoop has come a long way in 10 years; the road in front of it is exciting but is going to require an awful lot of work.”

Analytics was previously seen as an opportunity for companies to look back at their performance over a defined period and develop lessons for employees on how future performance could be improved. Today, advanced analytics is applied to improve performance in real time: a company can react immediately to shift the focus of a marketing campaign, or alter a production line to improve the outcome. The promise of big data and IoT is predictability and data-defined decision making, which can shift a business from a reactive position to a predictive one. Understanding trends can create proactive business models which advise decision makers on how to steer a company. But what comes next?

Mike Olson

Cloudera Chairman and CSO Mike Olson

For Olson, machine learning and artificial intelligence are where the industry is heading. We’re at a stage where big data and analytics can be used to automate processes and replace humans for simple tasks. In a short period of time, we’ve seen some significant advances in the applications of the technology, most notably Google’s AlphaGo beating world Go champion Lee Se-dol and Facebook’s use of AI in picture recognition.

Although computers taking on humans in games of strategy is not a new PR stunt (IBM’s Deep Blue defeated chess world champion Garry Kasparov in 1997), this is a very different proposition. While chess is a game which relies on calculable strategy, Go is another beast: due to the vast number of permutations available, strategies within the game rely on intuition and feel, a far more complex task for the Google team. The fact AlphaGo won the match demonstrates how far researchers have progressed in making machine learning and artificial intelligence a reality.

“In narrow but very interesting domains, computers have become better than humans at vision and we’re going to see that piece of innovation absolutely continue,” said Olson. “Big Data is going to drive innovation here.”

This may be difficult for a number of people to comprehend, but big data has entered the business world; true AI and automated, data-driven decision making may not be too far behind. Data is already driving the direction of businesses, whether through a better understanding of the customer, increased security for the organization or a clearer view of the risk associated with any business decision. Big data is no longer a theory, but an established business strategy.

Olson is not saying computers will replace humans, but the number and variety of processes which can be handed over to machines is certainly growing, and growing faster every day.

IBM makes software defined infrastructure smarter

IBM has expanded its portfolio of software-defined infrastructure solutions, adding cognitive features which the company claims speed up data analysis, integrate Apache Spark and help accelerate research and design.

The new offering, called IBM Spectrum Computing, is designed to aid companies in extracting full value from their data by adding scheduling capabilities to the infrastructure layer. The product offers workload and resource management features to research scientists for high-performance research, design and simulation applications. The new proposition focuses on three areas.

Firstly, Spectrum Computing works with cloud applications and open source frameworks to assist in sharing resources between the programmes to speed up analysis. Secondly, the company believes it makes the adoption of Apache Spark simpler. And finally, the ability to share resources will accelerate research and design by up to 150 times, IBM claims.
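IBM has not published code alongside these claims, but the appeal of Spark to analytics teams is easy to illustrate. The snippet below is a hypothetical PySpark sketch, unrelated to Spectrum Computing itself, showing the kind of distributed aggregation such a scheduler would be asked to run; the asset names and values are invented for illustration.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: aggregate sensor readings with Spark's DataFrame API.
# Spectrum Computing's role, per IBM, is scheduling the underlying resources;
# this sketch only shows the kind of Spark workload that would be scheduled.
spark = SparkSession.builder.appName("reading-summary").getOrCreate()

readings = spark.createDataFrame(
    [("pump-1", 71.2), ("pump-1", 74.8), ("pump-2", 66.3)],
    ["asset_id", "temperature"],
)

summary = readings.groupBy("asset_id").agg(
    F.avg("temperature").alias("avg_temp"),
    F.max("temperature").alias("max_temp"),
)
summary.show()
spark.stop()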

By incorporating the cognitive computing capabilities into the software-defined infrastructure products, IBM believes the concept on the whole will become more ‘intelligent’. The scheduling competencies of the software will increase compute resource utilization and predictability across multiple workloads.

The software-defined data centre market has been growing steadily and is forecast to continue its healthy growth over the coming years. Research has suggested the market could be worth in the region of $77.18 billion by 2020, growing at a CAGR of 28.8% between 2015 and 2020. Growth is primarily driven by the appeal of simplified scalability and interoperability. North America and Asia are expected to hold the largest market shares worldwide, though Europe as a region is expected to grow at a faster rate.

“Data is being generated at tremendous rates unlike ever before, and its explosive growth is outstripping human capacity to understand it, and mine it for business insights,” said Bernard Spang, VP for IBM Software Defined Infrastructure. “At the core of the cognitive infrastructure is the need for high performance analytics of both structured and unstructured data. IBM Spectrum Computing is helping organizations more rapidly adopt new technologies and achieve greater, more predictable performance.”

Wipro open sources big data offering

Wipro has announced it has open sourced its big data solution, Big Data Ready Enterprise (BDRE), partnering with California-based Hortonworks to push the initiative forward.

The company claims the BDRE offering addresses the complete lifecycle of managing data across enterprise data lakes, allowing customers to ingest, organize, enrich, process, analyse, govern and extract data at a faster pace. BDRE is released under the Apache Public License v2.0 and hosted on GitHub. Teaming up with Hortonworks will also give the company additional clout in the market, as Hortonworks is generally considered one of the top three Hadoop distribution vendors.

“Wipro takes pride in being a significant contributor to the open source community, and the release of BDRE reinforces our commitment towards this ecosystem,” said Bhanumurthy BM, COO at Wipro. “BDRE will not only make big data technology adoption simpler and effective, it will also open opportunities across industry verticals that organizations can successfully leverage. Being at the forefront of innovation in big data, we are able to guide organizations that seek to benefit from the strategic, financial, organizational and technological benefits of adopting open source technologies.”

Companies open sourcing their own technologies has become somewhat of a trend in recent months, as product owners appear to be moving towards a service model as opposed to a traditional vendor model. According to ‘The Open Source Era’, an Oxford Economics study commissioned by Wipro, 64% of respondents believe that open source will drive Big Data efforts in the next three years.

The report also claims open source has become a foundation stone of the technology roadmap for a number of businesses: 75% of respondents believe integration between legacy and open source is one of the main challenges, and 52% said open source is already supporting the development of new products and services.

IBM and Cisco combine to deliver IoT insight on the network edge

IBM and Cisco have extended a long-standing partnership to enable real-time IoT analytics and insight at the point of data collection.

The partnership will focus on combining the cognitive computing capabilities of IBM’s Watson with Cisco’s analytics competencies to support data action and insight at the point of collection. The team are targeting companies who operate in remote environments or on the network edge, for example oil rigs, where time is of the essence but access to the network can be limited or disrupted.

The long-standing promise of IoT has been to increase the amount of data organizations can collect, which once analysed can be used to gain a greater understanding of a customer, environment or asset. Cloud computing offers organizations an opportunity to realize the potential of real-time insight, but for those with remote assets where access to high-bandwidth connectivity is not a given, the promise has always been out of reach.

“The way we experience and interact with the physical world is being transformed by the power of cloud computing and the Internet of Things,” said Harriet Green, GM for IBM Watson IoT Commerce & Education. “For an oil rig in a remote location or a factory where critical decisions have to be taken immediately, uploading all data to the cloud is not always the best option.

“By coming together, IBM and Cisco are taking these powerful IoT technologies the last mile, extending Watson IoT from the cloud to the edge of computer networks, helping to make these strong analytics capabilities available virtually everywhere, always.”

IoT insight at the point of collection has been an area of interest to enterprises for a number of reasons. Firstly, by decreasing the quantity of data which has to be moved, transmission costs and latency are reduced and the quality of service is improved. Secondly, the bottleneck of traffic at the network core can potentially be removed, reducing the likelihood of failure. And finally, the ability to virtualize on the network edge can extend the scalability of an organization.

ABI Research has estimated 90% of the data collected by IoT-connected devices is stored or processed locally and never made available for real-time analytics; to be analysed, it must be transferred to another location. As the number of these devices increases, the quantity of data which must be transferred, stored and analysed also increases, and the cost of transmission and storage could soon prohibit some organizations from achieving the goal of IoT. The new partnership hopes the combination of Cisco’s edge analytics capabilities and the Watson cognitive solutions will enable real-time analysis at the scene, removing a number of these challenges.
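The underlying idea of edge analytics is straightforward to sketch, independent of the IBM and Cisco products themselves: process readings where they are produced and transmit only summaries and exceptions. The Python snippet below is a generic, hypothetical illustration; the threshold and readings are invented.

import statistics

# Generic edge-analytics sketch: summarise a window of raw sensor readings
# locally and forward only the summary (plus any out-of-range alerts),
# instead of streaming every reading over a constrained link.

THRESHOLD = 95.0  # hypothetical alert threshold for this sensor

def summarise_window(readings):
    """Reduce a window of readings to a compact payload worth transmitting."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
        "alerts": [r for r in readings if r > THRESHOLD],
    }

window = [88.1, 90.4, 89.7, 96.2, 91.0]
payload = summarise_window(window)
print(payload)  # five raw readings collapse into one small message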

“Together, Cisco and IBM are positioned to help organizations make real-time informed decisions based on business-critical data that was often previously undetected and overlooked,” said Mala Anand, SVP of the Cisco Data & Analytics Platforms Group. “With the vast amount of data being created at the edge of the network, using existing Cisco infrastructure to perform streaming analytics is the perfect way to cost-effectively obtain real-time insights. Our powerful technology provides customers with the flexibility to combine this edge processing with the cognitive computing power of the IBM Watson IoT Platform.”

What did we learn from PwC’s local government survey?

PwC has recently released findings from its annual survey, The Local State We’re In, which assesses the challenges facing local government and its responses to them, as well as looking at public opinion on these organizations’ capabilities.

Here, we’ve pulled out four of the lessons we learnt from the report:

Data Analytics is top of the agenda for CEOs and Local Government Leaders

A healthy 91% of the Chief Execs surveyed confirmed data analytics was an area in which they were well equipped. This was in fact the most popular answer for this question; other areas such as business intelligence (59%), supply chain management (55%) and information governance & records management (40%) fared less well.

While it is encouraging the leaders are confident in their team’s ability to perform in the data analytics world, the research also stated local government’s use of structured and unstructured data varies quite considerably. 71% of the Chief Execs agreed they were using structured data (e.g. information in government controlled databases), whereas this number drops to 33% when unstructured data (e.g. social media and data generated through search engines) is the focal point of the question.

As the consumer continues its drive towards digital and the connected world, the level of insight which can be derived through unstructured data, social media in particular, will continue to increase. Back in 1998 Merrill Lynch said 80-90% of all potentially usable business information may originate in unstructured form. This rule of thumb is not based on primary or any quantitative research, but is still accepted by some in the industry. Even if this number has dropped, there is a vast amount of information and insight which is being missed by the local government.

But data driven decision making isn’t

Throughout the industry, data-driven decision making has been seen as one of the fastest-growing trends, and also as the prelude to the introduction of artificial intelligence.

Despite the media attention such ideas are receiving, it would appear these trends are not translating through to local government. Only 41% of the respondents said their organization is using data analytics to inform decision making and strategy. It would appear local government is quite effective (or at least confident) at managing data, but not so much at using it for insight.

Public is not confident in local government’s ability to embrace digital

Although the leadership within local authorities is happy with the manner in which their organizations have embraced digital, this confidence is not reflected by the general public.

76% of Chief Execs who participated in the research are confident in their own digital strategies; however, only 23% of the general public are confident in their council’s ability to manage the transition to digital. This is down from 28% in the same survey in 2015 and 29% in 2014. The findings could demonstrate the rigidity of government bodies, especially at a local level, as it would appear the evolution of emerging technologies is outstripping local government’s ability to incorporate these new ideas and tools.

There is also quite a significant difference in how the public and the Chief Execs view cyber security. While only 17% of the Chief Execs believe their organization is at risk from cyber threats, 70% of the general public are not confident local government will be able to manage and share their personal information appropriately. 2016 has already seen a number of high-profile data breaches which could have influenced public opinion. If tech-savvy enterprise organizations such as TalkTalk cannot defend themselves, the perception may be that public sector organizations are even less likely to do so.

However, local government does have the backing from the public to invest in digital

The general public may not currently have great confidence in local government’s ability to embrace the digital age; however, they have seemingly given their blessing for local government to continue investing.

39% of the general public who completed the survey said their preferred way of engaging with local government would be through a digital platform, compared with the 24% who would prefer the telephone and 28% who would rather engage in person. Unfortunately, while digital is the most popular option for engagement, only 37% were satisfied with current digital access to local government, down from 38% in last year’s research.

Salesforce SMB’s business leader talks data analytics, AI and the age of entrepreneurship

Sanj Salesforce

Sanj Bhayro, SVP EMEA Commercial at Salesforce

While the business world has traditionally favoured the biggest and the richest, cloud as a technology is seen as the great equalizer. Through the transition to the cloud, SMBs are being empowered to take on their enterprise rivals, with the number of wins growing year-on-year.

This, according to Salesforce’s Sanj Bhayro, is one of the most exciting trends we’re now witnessing in business throughout the world. Bhayro currently leads the EMEA SMB business at Salesforce and for almost 11 years has been part of the team which has seen the power of intelligent CRM systems grow backroom businesses into industry giants. Just look at the growth and influence of companies such as Uber and Airbnb for justification of his claims.

“The SMB business in Salesforce is one of the most exciting, because we get to work with really innovative companies,” said Bhayro. “All the innovation in the industry is coming from these small to medium sized businesses. They are disrupting the traditional market which is in turn forcing the traditional players to transform their own business models.

“Something which is interesting from our perspective at Salesforce is that when we started 17 years ago the internet wasn’t that prevalent, the cloud wasn’t a word that was used that often, and it was the SMB companies who adopted our technology. The cloud offered them the operational efficiency, the scale and the reach to take on these traditional players. These smaller organizations are looking more and more towards technology as the enabler for innovation.”

The majority of SMBs could be considered too small to drive innovation in-house. For the most part, the IT department is small and responsible for ‘keeping the lights on’; working through the cloud has enabled innovation and created opportunities for these organizations. Indeed, the ability to be innovative is often much more prominent in smaller organizations.


The fail-fast business model is one which has captured the imagination of numerous enterprise organizations around the world. Amazon CEO Jeffrey Bezos recently claimed the fail-fast model was the catalyst for recent growth within the AWS business, though most organizations are seemingly struggling to implement a culture which encourages learning and innovating through failure. For the majority, failure is simply failure, not part of the journey to success.

But this in itself is one of the ways in which smaller, more agile organizations are innovating and catching enterprise-scale businesses. The implementation of cloud platforms speeds up the failures and lessens their negative impact on the business, further driving the journey to innovation.

“For start-ups and early stage companies, failing is an accepted mentality. How many companies are actually the same as when they started? They failed, learned and then progressed. As businesses become bigger and bigger it becomes a lot more difficult. Certainly for larger companies there is a lot more friction around the fail-fast model. Smaller companies are culturally set up to allow them to pivot and try new things, whereas larger ones, purely because of their size, are constrained.”

Outside of the SMB team, Salesforce engineers have been prioritizing the use of artificial intelligence for future product launches and updates. This was reinforced during the company’s quarterly earnings call in recent weeks as CEO Marc Benioff backed AI as the next major growth driver. While there is potential for AI in the SMB marketplace, for the moment it is only for those who are ahead of the curve.

For the most part, data analytics is starting to filter down into smaller organizations, though there is still a substantial amount of data which is not being utilized. For Bhayro, as the concept of the cloud is now ubiquitous, the opportunities are almost limitless, but only once these organizations have got on top of managing their own data and broken down the silos within the business.

“AI translates well into the SMB business model and it will be the SMBs who drive where AI goes,” said Bhayro. “There are generally two camps when it comes to the SMB market: those who are cloud-native and capitalizing on the sharing economy, and those who are more traditional organizations. The shift that the traditional business has to make to break down the silos, and to move towards a cloud back-end, is far more difficult than for a company like Deliveroo who started in the cloud and can scale. Nevertheless, that shift has to be made.”

“So much data is being created and there’s so much that you can do with it. The problem is that so many companies are not doing enough with their data. Recent reports stated that most companies can only analyse 1% of their data. Even before we start moving towards AI technologies, the way we service intelligence is through insight. We need to provide the right tools to make data available and malleable, to everybody in your business. These data analytics tools are the first steps and then we can look forward to AI technologies.”

The UK government has made numerous schemes available to SMBs to encourage the growth of this subsector in recent years, and Bhayro believes these efforts are paying off in the international markets.

“I’m delighted to say that the UK takes a leadership position (in relation to SMB growth and innovation in comparison to the rest of Europe),” said Bhayro. “Something in the region of 95-96% of the companies in the UK are SMBs, and the government is currently doing the right things to encourage and propel entrepreneurs. I think we’re in the time of entrepreneurship, and this is the time for people to have the vision and grow. These companies are having wonderful ideas, and they are moving into the growth period, but it’s the customer experience which really differentiates them from the competition. Not many of these companies are set up to achieve customer experience objectives, but this is where we (Salesforce) come in.”

Accenture outlines future of cloud and data analytics in sport

Although the digital age has created a wealth of opportunities for organizations to create new revenue streams and attract new audiences, maintaining the engagement of these customers is becoming an increasingly difficult job, according to Accenture’s Nick Millman.

The availability and ease of access to information in the 21st century has created a new dynamic: consumers are becoming increasingly competent at multi-tasking and operating several devices, which has made the task of keeping a viewer’s attention throughout the course of a sporting event more challenging. Millman, who leads Big Data & Analytics Delivery at Accenture, is using this dynamic to create new engagement opportunities for the Six Nations.

“There will be a number of people who will watch the entirety of a match, however there will be others who will be playing with their tablet or phone and enjoying the multi-screen experience,” said Millman. “To keep the level of engagement, sports need to become more digital themselves, providing more insight and data to fans who are watching the game. Ideally you want them to be on their phone looking at something which is relevant to the game as opposed to Facebook or what their friends are doing.”

Accenture first teamed up with the Six Nations as a technology partner four years ago, where the initial partnership focused on demonstrating the company’s mobility capabilities through creating the official app. What started as a basic app now acts as a delivery platform where Accenture can showcase their data analytics capabilities, processing more than 2 million rows of data per game and creating visuals in (near) real-time to tell a different story behind the sport itself.

The data itself is not necessarily of the greatest use to fans, so Accenture has brought in rugby experts year-on-year to help explain the nuances of the information; this year Nick Mallett, Ben Kay and David Flatman helped the team tell the story. The same is true in the business world. Data analysts themselves may not be able to make the right decisions when it comes to the application of the data, as they don’t understand the market in the same way as a Managing Director who has been in the industry for 30 years. The application of data in sport and the business world will only be effective when it is merged with expertise and experience to provide context.

“One of the interesting things which we saw is that there is now an interesting dynamic between data driven decisions and gut feel,” Millman highlighted. “In some cases when you are watching the game you may think that one player would be considered the best on the park, but the data tells a different story. Seeing one hooker for example hit every line out perfectly might make him look like the most effective, but the data might suggest the opposition hooker who produced several small gains when carrying the ball had a greater impact on the game.

“This can translate into the business world also, as a marketing team may have a better feel about a product which it wants to push out to the market, but the data team have evidence which shows resource should be focused on a different area of the business,” said Millman. “I don’t think there is a right answer to what is better, data driven decision making or intuition, but it’s an interesting dynamic. The successful businesses will be the ones who are effective at blending the data and the skills to come to the right outcome.”

While the role of analytics is becoming more prominent in sport and the business world, there is still some education to be done before the concepts could be considered mainstream. Analytics may be big business in the enterprise segments, but there are still a large proportion of SMBs who do not understand the power of data analytics for their own business. The ability to cross sell, develop a stronger back story of your customer, maintain engagement or even implement artificial intelligence programs is only available once the core competencies of big data and analytics are embraced within the organization.

For Accenture, wearables and IoT are next on the horizon, and potentially virtual reality in the future. This year the app was available on the Apple Watch, and Millman is starting to see trends which could shift the consumption of data once again.

“It’s still early days, but some of the consumption of data is likely to shift from tablets and smartphones,” said Millman. “Like it shifted from desktops to laptops to smartphones and tablets, it may shift to wearable devices in the future.

“Also this year we built a prototype using virtual reality to immerse people into the rugby experience. I’m not sure VR will become mainstream in a sporting context in the next 12-18 months but I think increasingly VR and AR (augmented reality) will become a part of the sports viewing experience.”

Can your analytics tools meet the demands of the big data era?

Speaking at Telco Cloud, Actian’s CTO Michael Hoskins outlined the impact big data is having on the business world, and the challenges faced by those who are not keeping up with the explosion of data now available to decision makers.

The growth of IoT and the subsequent increase in data has been widely reported. Last year, Gartner predicted the number of connected ‘things’ would exceed 6.4 billion by the end of 2016 (an increase of 22% from 2015), and continue to grow to beyond 20.8 billion by 2020. While IoT is a lucrative industry, businesses are now facing the task of not only managing the data, but gaining insight from such a vast pool of unstructured information.

“Getting a greater understanding of your business is the promise of big data,” said Hoskins. “You can see things which you never were able to before, and it’s taking business opportunities to the next generation. The cloud is really changing the way in which we think about business models – it enables not only for you to understand what you are doing within your business, but the industry on the whole. You gain insight into areas which you never perceived before.”

Actian is one of a number of companies who are seemingly capitalizing on not only the growth of IoT and big data, but also the fact it has been rationalized by decision makers within enterprise as a means to develop new opportunities. The company has been building its presence in the big data arena for five years, and has invested more than $300m in growing organically, as well as acquiring new technology capabilities and expertise externally. As Hoskins highlighted to the audience, big data is big business for Actian.

Actian - Mike Hoskins

Actian’s CTO Michael Hoskins

But what are the challenges which the industry is now facing? According to Hoskins, the majority of us don’t have the right tools to fully realize the potential of big data as a business influencer.

“The data explosion which is hitting us is so violent, it’s disrupting the industry. It’s like two continents splitting apart,” said Hoskins. “On one continent we have the traditional tools, and on the other we have the new breed of advanced analytics software. The new tools are drifting away from the traditional, and the companies who are using the traditional are being left behind.”

Data analytics as a business practice is by no means a new concept, but the sheer volume, variety and speed at which data is being collected means the traditional technologies used to analyse it are being made redundant. Hoskins highlighted they’re too slow (they can’t keep up with the velocity of collection), too rigid (they can’t comprehend the variety of data sets), and too cumbersome (they can’t manage the sheer volume of data). In short, these tools are straining under the swell.

The next challenge is scaling current technologies to meet these demands, which in most cases is a very difficult proposition: it is often too short-term and too expensive, and the skills aren’t abundant enough. Hoskins believes the time-cost-value proposition simply does not make sense.

“The journey of modernization goes from traditional, linear tools, through to business intelligence and discovery, this is where we are now, through to decision science,” said Hoskins. “Traditional tools enable us to look back at what we’ve done and make reactive decisions, but businesses now want to have a forward looking analytics model, drawing out new insights to inform decision making. But this cannot be done with traditional tools.

“This is the promise of advanced analytics. The final stage is where we can use data analytics to inform business decisions; this is where data becomes intelligence.”

GE launches asset management offering for manufacturing industry

GE Digital has launched its suite of Asset Performance Management (APM) solutions, a cloud-based offering running on its Predix platform, to monitor industrial and manufacturing equipment and software.

The company claims industrial customers can now use data and cloud-based analytics to improve the reliability and availability of their GE and non-GE assets. While APM would generally not be considered a new concept, GE claims its offering is the first commercially available to support the industrial data generated by a company’s assets, both physical and software-based.

The launch builds on underlying IoT trends within the industrial and manufacturing industry towards a proactive performance strategy for assets: repairing them before a maintenance issue arises as opposed to reacting to a fault.

“GE’s deep expertise in developing and servicing machines for industry gives us a greater understanding of real business operations and the insights to deliver on industry needs,” said Derek Porter, GM for Predix Applications at GE Digital. “With the launch of our APM solutions suite, GE is commercialising its own best practices for customers.”

The offering is split into three tiers. Firstly, a machine and equipment health reporting system will provide a health check on the asset, detailing performance levels in real time. Secondly, a reliability tool predicts potential problems within an asset, allowing engineers to schedule maintenance activities. And finally, a maintenance optimization tool will be available later in 2016 to optimize long-term maintenance strategies, which GE claims will enable customers to extend the lifecycle of the asset and reduce downtime.
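GE has not detailed the underlying models, but the general shape of the first two tiers, continuous health reporting plus simple predictive maintenance, can be sketched generically. The Python snippet below is a hypothetical illustration rather than the Predix or APM API; the limit, horizon and readings are invented.

# Simplified, generic asset-health sketch (not the Predix/APM API):
# flag an asset when a linear trend over recent vibration readings
# projects a breach of the allowable limit within the planning horizon.

VIBRATION_LIMIT = 7.1   # hypothetical limit in mm/s
HORIZON_HOURS = 72      # hypothetical planning horizon

def needs_maintenance(readings, hours_between_readings=1):
    """Fit a crude linear trend and project it forward over the horizon."""
    if len(readings) < 2:
        return False
    slope = (readings[-1] - readings[0]) / ((len(readings) - 1) * hours_between_readings)
    projected = readings[-1] + slope * HORIZON_HOURS
    return projected > VIBRATION_LIMIT

recent = [4.9, 5.1, 5.4, 5.8, 6.1]
print(needs_maintenance(recent))  # True: the upward trend breaches the limit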


The company also launched the generally available module of GE Digital’s Brilliant Manufacturing software suite, Efficiency Analyzer, which will be available through a new SaaS pricing model. Once again, the product offering is built on the need to analyse and activate data collected within manufacturing operations, to improve operational efficiency. One of the first use cases advertised by the company has been within its own transportation division.

“GE’s Brilliant Manufacturing Suite has enabled significant reduction in unplanned machine downtime resulting in higher plant efficiency,” said Bryce Poland, Advanced Manufacturing Brilliant Factory Leader, GE Transportation. “As part of our digital thread strategy, we will increase our machines and materials visibility by 400% in 2016.”

From Data Collection and Analysis to Business Action

Guest post from Azmi Jafarey. Azmi is an IT leader with over 25 years of experience in IT innovation. He was CIO at Ipswitch, Inc. for the last nine years, responsible for operations, infrastructure, business apps and BI. In 2013, he was named CIO of the Year by Boston Business Journal and Mass High Tech. You can hear more from Azmi on his blog: http://hitechcio.com/

 

Here is a progression that most businesses experience in the data arena.

  • You go from no data or bad data to “better” data.
  • You start having reports regularly show up in your mailbox.
  • The reports go from being just tables to showing trend lines.
  • You evolve to dashboards that bring together data from many sources.
  • You fork into sets of operational and strategic reports and dashboards, KPI-driven, with drill-down.

By this point, you have Operational Data Stores (ODSs), data warehouses, a keen sense of the need for Master Data, keeping all systems in sync and an appreciation of defined data dictionaries.  You expect data from all functions to “tie together” with absolute surety – and when it does not, it is usually traced to a different understanding of data sources or data definitions.  But you are there, feeling good about being “data driven”, even as you suspect that that last huge data clean-up effort may already be losing its purity to the expediency of daily operations.  How?  Well, someone just created a duplicate Opportunity in your CRM, rather than bother to look up whether one exists.  Another person changed a contact’s address locally, rather than in a Master.  And so it goes.
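Much of that drift can be caught with very simple guards at the point of entry. The Python sketch below is a hypothetical example of one such guard, checking for a likely duplicate Opportunity name before a new CRM record is created; the matching threshold and example names are invented.

from difflib import SequenceMatcher

# Hypothetical guard against the duplicate-Opportunity problem described above:
# before creating a record, compare its name against existing ones for the
# same account and block near-matches for manual review.

def is_probable_duplicate(new_name, existing_names, threshold=0.85):
    """Return True if new_name closely matches any existing opportunity name."""
    new_name = new_name.strip().lower()
    return any(
        SequenceMatcher(None, new_name, existing.strip().lower()).ratio() >= threshold
        for existing in existing_names
    )

existing = ["Acme Corp - 2016 renewal", "Acme Corp - support upgrade"]
print(is_probable_duplicate("ACME Corp 2016 Renewal", existing))       # True
print(is_probable_duplicate("Acme Corp - new EMEA rollout", existing))  # False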

Sadly, for most businesses “data-driven” stops at “now you have the numbers” — an end in itself.  At its worst, reports become brochure-ware, a travel guide for the business that is “interesting” and mainly used to confirm one’s suspicions and biases.  Also at its worst, many “followed” KPIs consume enormous amounts of time and effort to come up with a number and paint it green, yellow or red against a target, and then act mainly as trigger points for meetings rather than for a measured response.

I have nothing against meetings.  I am just anxious for the business mindset to go beyond “descriptive” and “predictive” analytics to “prescriptive” analytics.   Thus for Sales we seem to stop at “predictive” – forecasts are the holy grail, a look into the future, couched in probability percentages.  Forecasts are indeed very useful and get reacted to.  It is just that it is a reaction whose direction or magnitude are usually delinked from any explicit model.  In today’s world instinct cannot continue to trump analysis.  And analysis is meaningful only in the context of suggesting specific action, tied to business results as expected outcomes.  The data must not only punt the can down the road – it must tell you exactly how hard and in which direction to punt.  And the result must be measured for the next round to follow.

One of the really interesting things about data modeling, predictive and prescriptive analytics is that for all three the starting point is precisely the same data.  After all, that is what you know and have.   The difference is the effort to model and the feedback loop where measurable action and measured consequence can be used to refine action and hence outcomes.  Part of the problem is also that the paradigm in today’s business world is for leaders who provide direction on actions to be farthest from those who know data well.  Without personal exploration of relevant data, you revert to an iterative back-and-forth requesting new data formats from others.  The time to search for such “insight” can be dramatically shortened by committing to modeling and measuring results from the get go.  Bad models can be improved.  But lacking one is to be adrift.
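To make the distinction concrete, the Python sketch below is a hypothetical illustration of the progression: the same pipeline data feeds a naive forecast (predictive), the forecast is turned into a specific recommended action (prescriptive), and the measured outcome is recorded so the rule can be refined on the next cycle. The target, win rate and figures are invented.

# Hypothetical illustration of descriptive -> predictive -> prescriptive:
# the same pipeline data feeds a naive forecast, the forecast triggers a
# specific recommended action, and the measured outcome is logged so the
# decision rule can be tuned on the next cycle.

QUARTER_TARGET = 1_000_000  # hypothetical bookings target

def forecast(pipeline, historical_win_rate):
    """Predictive step: naive probability-weighted forecast."""
    return sum(amount * historical_win_rate for amount in pipeline)

def prescribe(forecast_value):
    """Prescriptive step: turn the prediction into a concrete action."""
    gap = QUARTER_TARGET - forecast_value
    if gap <= 0:
        return "Hold course; protect delivery capacity for closed deals."
    return f"Shift budget to demand generation to cover a {gap:,.0f} gap."

def record_outcome(log, action, actual_bookings):
    """Feedback step: store action and result so the model can be refined."""
    log.append({"action": action, "actual": actual_bookings})
    return log

open_pipeline = [250_000, 400_000, 600_000, 150_000]
predicted = forecast(open_pipeline, historical_win_rate=0.3)
action = prescribe(predicted)
outcomes = record_outcome([], action, actual_bookings=480_000)
print(predicted, action, outcomes)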

Before you begin to wonder “Is the next step Big Data?  Should we be thinking of getting a Data Scientist?” start with the basics: training on analytics, with a commitment to model.  Then use the model and refine.