8 not-very-technical problems with analytic technology
In a couple of talks, including last Thursday’s, I’ve rattled off a list of eight serious problems with analytic technology, all of them human or organizational much more than purely technical. At best, these problems stand in the way of analytic success, and at least one is a lot worse than that.
The bulleted list in my notes is:
- Individual-human
  - Expense of expertise
  - Limited numeracy
- Organizational
  - Limited budgets
  - Legacy systems
  - General inertia
- Political
  - Obsolete systems
  - Obsolete legal framework
  - Technology-challenged lawmakers
I shall explain.
The expense of expertise. Highly skilled Oracle DBAs are expensive. The same can be said for many other categories of people, whether in IT or business units, needed to exploit the opportunities of analytic technology. Newer, simpler approaches to analytic database management can clearly help. The verdict is more mixed so far on newer business intelligence or data mining technologies.
Limited numeracy. If you’re reading this, you’re very likely more numerate (i.e., more capable with numbers, mathematics, and analysis) than the average person, or indeed than the average knowledge worker. A typical knowledge worker can understand an analytic claim, on some level – but can he think critically about it? Can he make valid analytic arguments of his own? That’s less clear. Maybe it’s possible to build a new company in which analytic competence is a prerequisite for employment (Google comes to mind as one candidate). But uniform analytic ability is almost inconceivable at an established enterprise. That’s one of the biggest reasons converting a going concern into a top-to-bottom “analytic enterprise” is a lot harder than business intelligence gurus sometimes suggest.
Actually, I do think practical numeracy goes up each decade. Go, for example, to a message board discussing sports, and the chances are high you’ll pretty soon see concepts like “small sample size” used perfectly correctly. That would have been much less likely 30 years ago, even aside from the fact that in those days no such things as “message boards” happened to exist. I credit this favorable trend to a multitude of factors, from the availability of BI and related technologies, to the promotion of probability and statistics at multiple levels of the educational curriculum, at least in the United States. If nothing else – when I was in school, it was very rare for a girl to be serious about learning calculus or any kind of advanced mathematics.* Thankfully, that seems to have changed.
*I do recall my main college girlfriend cursing her way through a physical chemistry course, which implies some working knowledge of partial differential equations. But she was pretty unusual for her day.
Even so, that’s not enough. Get me in a discussion about politics or charity, and I’ll argue that few things are more important than boosting numeracy through the education system. But that’s hardly an answer in a business-today time frame. If your enterprise is trying to deploy analytics universally across the organization today – well, that’s a very hard challenge.
Limited budgets. For every benefit there is a cost. And when it comes to analytics, the costs (at least some of them) can be a lot easier to quantify than the benefits. This can make it hard to get the budget to do proper analytics.
Legacy systems. In many cases, the best and most cost-effective analytic products are fairly new ones. But companies tend to have the older and costlier ones already in place. Yes, I’m seeing a reasonable number of “escape from Oracle” or even “escape from DB2” projects. But a lot more enterprises just try to work within what they have.
And that’s even before we start to talk about stovepipes, silos, or data integration. It’s also before we consider the difficulty of modernizing OLTP systems to either gather more data for analysis or incorporate more analytics into operational business processes.
General inertia. Budget limitations and legacy limitations can both be excuses for doing nothing. But sometimes nothing gets done even without the aid of such excuses.
Obsolete systems. In the particular case of government, the legacy systems problem can be ridiculously bad. In large part, this is due to broken procurement processes. Computer systems are bundled up into specific government contracts, which are then bid and awarded in a long process, whose results can be challenged, making the process even longer. When the contract is finally awarded, a long implementation cycle begins. Systems are years out of date before they are put in. By the time they are replaced, they are a lot more ancient than that. And, as big as they are, they are isolated projects, with little cross-government standardization, and with little effort put into facilitating data integration across systems.
The consequences are horrific. Better collaboration tools might have averted the 9/11 attacks. Weak analytics allow all kinds of fraud to continue. And considering just the costs of data integration projects, untold billions of dollars are spent directly because of government computing failings.
Obsolete legal framework. The procurement process is just one major example of obsolete technology-related laws. The liberty and privacy problem is even more profound. Intellectual property rights and censorship,* of course, are other major areas of nonsense, even in countries that are basically pretty free. Part of the problem is that legal structures devised for a less advanced world are sometimes ill-suited to the power of modern technology.
*Of course, those areas are not particularly tied to analytics. But I thought I’d throw them in anyway while I was on a roll.
Technology-challenged lawmakers. These issues are hard, and even technologically savvy lawmakers would struggle with them. At least, I think so; to my knowledge, that hypothesis has never come close to being tested. What I do know is that when laws meet technology, nonsense commonly ensues.