Paralysis by data analysis is a common ailment in organizations, and it’s probably going to get worse. The more data available, the greater the temptation to examine, twist and tweak. In the bad old days, companies had just a couple of data points and some gut feeling. Decisions were taken on incomplete data, but they were taken all the same. Nowadays, decisions are often pushed out until all the numbers can be run, and big data compounds the problem. Yet at least one survey suggests that the quantity of data is not the real issue: the difficulty lies in turning the right analysis into decision support, and then acting on it.
Computing power alone is not the answer
One corporate data analysis dream goes like this: powerful computers analyze tons of data in real time to generate decision support in the form of prioritized insights and action items, while human beings, freed from data analysis drudgery, concentrate on matters of strategic importance. Well… nice try! Data analysis needs to begin with strategic decisions about what’s important and what isn’t; it takes human beings to figure out what can usefully be fed into the computer in the first place. Oil extraction offers an analogy. Oil companies don’t sink huge pipes to vacuum everything up; they drill smaller exploratory wells first to confirm that the field is there, and only then extract the crude and send it off for refining. It’s the same with data: confirm where the value lies before extracting and refining in bulk.
The overfitting problem
In a way, big data seems to have blindsided many people with a phenomenon they already knew about. Statisticians have long been aware of the danger of ‘overfitting’: building a model that pays so much attention to random ‘noise’ that it misses the underlying signal. With big data, the issue is very similar. There are real success stories, but what is gold dust for a minority of users is simply noise for others.
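To see overfitting concretely, here is a minimal sketch in plain NumPy, with invented numbers: a high-degree polynomial fitted to a handful of noisy samples of a simple linear trend drives its training error down while its error on fresh data gets worse, because it is modeling the noise.

```python
# Minimal overfitting demo: fit polynomials of degree 1 and 15 to 20 noisy
# samples of a linear trend, then compare error on held-out data. The
# high-degree fit chases the noise, so it typically does worse on new data.
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)

# Underlying signal is linear; the measurements add noise.
x_train = np.linspace(0, 1, 20)
y_train = 2.0 * x_train + rng.normal(scale=0.2, size=x_train.size)
x_test = np.linspace(0, 1, 200)
y_test = 2.0 * x_test + rng.normal(scale=0.2, size=x_test.size)

for degree in (1, 15):
    p = Polynomial.fit(x_train, y_train, deg=degree)
    train_mse = np.mean((p(x_train) - y_train) ** 2)
    test_mse = np.mean((p(x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")
```

The same mechanism scales up: mine enough columns and some pattern will always fit the noise in your sample, then fail on the next batch of data.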
Steps to overcome paralysis
A few simple steps may be enough. Firstly, start with the end in mind and use smart data rather than all the data. Secondly, understand that there is no ‘big data’ endpoint where absolute truth can be found: in other words, seek excellence, but don’t chase perfection. When you know what results you want and which key performance indicators (KPIs) will tell you whether you’re on track, you can also triage the conclusions of your analysis more quickly. Is a finding relevant to your objective? Then use it to get there faster or more efficiently. Is it irrelevant? Then put it to one side, at least for the moment.
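As a rough illustration of ‘smart data rather than all the data’, the sketch below (all metric names are hypothetical) screens candidate metrics against a chosen KPI and parks the ones that show no meaningful relationship, so analysis effort goes where the objective is.

```python
# Hedged sketch of "start with the end in mind": keep only the candidate
# metrics whose correlation with the target KPI clears a threshold, and set
# the rest aside for now rather than analyzing everything.
import numpy as np

def relevant_metrics(data: dict[str, np.ndarray], kpi: str, threshold: float = 0.3):
    """Split candidate metrics into 'use now' and 'park for later' buckets."""
    target = data[kpi]
    use_now, park = [], []
    for name, values in data.items():
        if name == kpi:
            continue
        corr = abs(np.corrcoef(values, target)[0, 1])
        (use_now if corr >= threshold else park).append((name, round(corr, 2)))
    return sorted(use_now, key=lambda t: -t[1]), park

rng = np.random.default_rng(1)
n = 500
churn = rng.normal(size=n)
data = {
    "monthly_churn": churn,                               # the KPI we care about
    "support_tickets": 0.7 * churn + rng.normal(size=n),  # genuinely related
    "page_views": rng.normal(size=n),                     # unrelated noise
    "discount_rate": rng.normal(size=n),                  # unrelated noise
}

use_now, park = relevant_metrics(data, kpi="monthly_churn")
print("use now:", use_now)   # support_tickets should clear the threshold
print("parked:", park)
```

Correlation is a deliberately crude filter here; the point is the triage, not the statistic.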
Importance analysis
As part of the ‘smart data’ approach, Analytica also provides importance analysis on different factors, showing whether they should be kept in a model or discarded because they are unlikely to affect outcomes. While such choices may only become apparent after a first round of analysis, they help organizations converge faster on clear understanding and action plans. Accepting a minimum of iteration, and a ‘good enough’ rather than an ‘absolute best’ result, also helps to de-paralyze an organization.
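Importance analysis of this general kind can be sketched in a few lines: rank Monte Carlo inputs by the absolute rank-order (Spearman) correlation of their samples with the output. The Python below is only an illustration of the idea, not Analytica’s implementation, and the toy profit model and variable names are invented.

```python
# Rank uncertain inputs by how strongly their Monte Carlo samples track the
# output (absolute Spearman correlation). Low scorers are candidates to drop.
import numpy as np

def rank(a: np.ndarray) -> np.ndarray:
    """Return the rank of each element (0 = smallest)."""
    return np.argsort(np.argsort(a))

def importance(inputs: dict[str, np.ndarray], output: np.ndarray):
    """Absolute Spearman correlation of each input's samples with the output."""
    r_out = rank(output)
    scores = {
        name: abs(np.corrcoef(rank(samples), r_out)[0, 1])
        for name, samples in inputs.items()
    }
    return sorted(scores.items(), key=lambda kv: -kv[1])

rng = np.random.default_rng(2)
n = 10_000
inputs = {
    "unit_cost": rng.normal(10, 2, n),
    "demand": rng.lognormal(3, 0.5, n),
    "fx_rate": rng.normal(1.0, 0.01, n),  # barely varies, so barely matters
}
profit = inputs["demand"] * (15 - inputs["unit_cost"]) * inputs["fx_rate"]

for name, score in importance(inputs, profit):
    print(f"{name:10s} importance {score:.2f}")
# Near-zero factors like fx_rate can be simplified away in the next iteration.
```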
If you’d like to know how Analytica, the modeling software from Lumina, can help you make clear, realistic forecasts and action plans through better, simpler data analysis, try the free edition of Analytica to see what it can do for you.