Guest post from Azmi Jafarey. Azmi is an IT leader with over 25 years of experience in IT innovation. He was CIO at Ipswitch, Inc. for the last nine years, responsible for operations, infrastructure, business apps and BI. In 2013, he was named CIO of the Year by Boston Business Journal and Mass High Tech. You can hear more from Azmi on his blog: http://hitechcio.com/
Here is a progression that most businesses experience in the data arena.
- You go from no data or bad data to “better” data.
- You start having reports regularly show up in your mailbox.
- The reports go from being just tables to showing trend lines.
- You evolve to dashboards that bring together data from many sources.
- You fork into sets of operational and strategic reports and dashboards, KPI-driven, with drill-down.
By this point, you have Operational Data Stores (ODSs), data warehouses, a keen sense of the need for Master Data and for keeping all systems in sync, and an appreciation of defined data dictionaries. You expect data from all functions to “tie together” with absolute surety – and when it does not, the cause is usually traced to a differing understanding of data sources or data definitions. But you are there, feeling good about being “data driven”, even as you suspect that that last huge data clean-up effort may already be losing its purity to the expediency of daily operations. How? Well, someone just created a duplicate Opportunity in your CRM rather than bother to look up whether one exists. Another person changed a contact’s address locally, rather than in a Master. And so it goes.
Sadly, for most businesses “data-driven” stops at “now you have the numbers” — an end in itself. At its worst, reporting becomes brochure-ware, a travel guide for the business that is “interesting” and mainly used to confirm one’s suspicions and biases. Also at its worst, many “followed” KPIs consume enormous amounts of time and effort to produce a number and paint it green, yellow or red against a target — and then act mainly as trigger points for meetings rather than for measured response.
I have nothing against meetings. I am just anxious for the business mindset to go beyond “descriptive” and “predictive” analytics to “prescriptive” analytics. For Sales, we seem to stop at “predictive” – forecasts are the holy grail, a look into the future couched in probability percentages. Forecasts are indeed very useful and get reacted to. It is just that the reaction’s direction and magnitude are usually delinked from any explicit model. In today’s world, instinct cannot continue to trump analysis. And analysis is meaningful only in the context of suggesting specific action, tied to business results as expected outcomes. The data must not only punt the can down the road – it must tell you exactly how hard and in which direction to punt. And the result must be measured for the next round to follow.
One of the really interesting things about data modeling, predictive analytics and prescriptive analytics is that all three start from precisely the same data. After all, that is what you know and have. The difference is the effort to model, and the feedback loop in which measurable action and measured consequence are used to refine the action and hence the outcomes. Part of the problem is that in today’s business world, the leaders who set direction on actions tend to be farthest from those who know the data well. Without personal exploration of relevant data, you revert to an iterative back-and-forth, requesting new data formats from others. The time to search for such “insight” can be dramatically shortened by committing to modeling and measuring results from the get-go. Bad models can be improved. But lacking one is to be adrift.
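To make the progression concrete, here is a minimal sketch of the idea that the same data feeds all three kinds of analytics, with a feedback loop at the end. Every number, the target, and the conversion-rate rule are hypothetical illustrations, not anything from the post itself:

```python
# Hypothetical monthly revenue figures -- the single shared starting point.
sales = [100, 104, 110, 113, 119, 124]

# Descriptive: summarize what happened.
average = sum(sales) / len(sales)

# Predictive: a naive trend forecast for next month
# (average month-over-month change added to the last observation).
deltas = [b - a for a, b in zip(sales, sales[1:])]
trend = sum(deltas) / len(deltas)
forecast = sales[-1] + trend

# Prescriptive: turn the forecast into a sized, explicit action.
# Made-up rule: if the forecast falls short of target, recommend a number
# of extra qualified leads proportional to the gap.
target = 130

def recommend_leads(gap, conversion_rate=0.5):
    """Suggest how many extra leads to pursue (hypothetical model)."""
    if gap <= 0:
        return 0
    return round(gap / conversion_rate)

action = recommend_leads(target - forecast)

# Feedback loop: after acting, compare the measured result to the forecast
# and refine the model (e.g., update conversion_rate) for the next round.
```

The point is not the arithmetic but the shape: the prescriptive step emits a specific, measurable action, so the next month's actuals can be compared against both the forecast and the recommendation, and the (deliberately crude) model can be improved rather than replaced by instinct.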
Before you begin to wonder “Is the next step Big Data? Should we be thinking of getting a Data Scientist?” start with the basics: training on analytics, with a commitment to model. Then use the model and refine.