As you might expect, I choose topics to write about in response to the conversations and questions I hear week after week: mostly with clients, sometimes with colleagues, and occasionally at industry events. There has been a lot of chatter about the cloud recently, for obvious reasons, but this last week was marked by two interesting conversations with very large global organisations in fast-moving markets.
Essentially, it was the same question asked twice, but one that has been coming up in various ways all year. What stuck out in one of the meetings was the way the question was positioned by the CEO. It was clearly designed to be a challenge to the CIO: ‘I want to know the real activity in our key markets and across our product lines as and when I need to focus in. However, I can’t trust either the P&L reporting, as this is a smoothed and massaged set of figures, or the conventional reporting, as this doesn’t give me the flexibility to slice and dice what is actually behind the figures’.

So there you have it. A perfect example of the shift from traditional back office business intelligence (BI) reporting, delivered in a structured manner, to the unstructured use of various forms of data around front office market activities, a point I made in an earlier blog post. You can work out the obvious responses to this problem: data collection, analysis engines, and so on. But all of this assumes that the work is structured and repeatable (and expensive and time-consuming to build and maintain as well), and 'structured' here really presupposes stability in business activities.
The real question in all of this is how to deal with rapidly changing, dynamic situations which are inherently low on repeatability, maybe even one-offs, where the consequences are significant and the need for information with which to make a decision is vital. The second, and equally unspoken, part of this is that vast quantities of information are being created and stored both internally and externally, so why can't this be accessed and used? Well, we all know the answer to that. It's because the data is inconsistent in format and in semantic or contextual meaning, so we need to extract, transform, and load it (ETL) into a database before it can be used. With that, we are right back where we started, facing the same cost and time challenges posed by the construction of a structured data warehouse.
But do we need all three stages of ETL for transitory data that we will use for a very short period around something unique and then discard? There are now several technology vendors offering tools for very effective extraction, to access the data from a variety of chosen sources, and to hold it for direct use by some equally effective new analysis engines. Just think for a moment about this whole process. Normal reporting across an enterprise is about exceptions, so normalisation and standardisation are obviously key, and, given the amount of data involved, the amount of 'human time' spent working with the results needs to be low.
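To make the extract-and-analyse idea concrete, here is a minimal, hypothetical sketch of the pattern: records are pulled straight from two differently formatted feeds (inline strings stand in for the real sources an extraction tool would reach) and sliced in memory, with no transform-and-load stage into a warehouse. All names and figures are invented for illustration.

```python
import csv
import io
import json
from collections import defaultdict

# Hypothetical extracts from two different sources: a CSV export and a JSON
# API feed. In practice these would come from the extraction tooling, not
# from inline strings.
csv_feed = """market,product,units
UK,WidgetA,120
UK,WidgetB,80
DE,WidgetA,200
"""

json_feed = (
    '[{"market": "UK", "product": "WidgetA", "units": 30},'
    ' {"market": "DE", "product": "WidgetB", "units": 50}]'
)

def extract_records(csv_text, json_text):
    """Pull records straight out of both formats; no transform/load stage."""
    records = list(csv.DictReader(io.StringIO(csv_text)))
    records += json.loads(json_text)
    return records

def units_by_market(records):
    """Ad hoc 'slice and dice': total units per market, computed in memory."""
    totals = defaultdict(int)
    for record in records:
        totals[record["market"]] += int(record["units"])
    return dict(totals)

records = extract_records(csv_feed, json_feed)
print(units_by_market(records))  # {'UK': 230, 'DE': 250}
```

Once the one-off question is answered, the extracted records can simply be discarded, which is exactly the point: no schema, warehouse, or maintenance cost outlives the decision.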
Now go back to the original question at the beginning of this piece: the people involved are already committed to wanting, and needing, to spend time on the topic, and they are not asking for a normalised comparison within their current reporting. So if you can extract the data you need from the sources that exist in a focussed enough manner, the final analysis of this critical data can be a lot more human-intensive. Given the kind of situation this is likely to be, i.e. a sharply focussed event or issue at the edge of the business that needs to be addressed in a very specific way, it becomes a much more manageable matter. The question is all about being able to access and import the key extracts of data from a wide selection of sources and formats to improve the information on which the decision will be based.
So who should do this, and how? Well, in both the defence and financial industries there are well-established players able to deliver really impressive results, but it has always been expensive. Now that is changing as these companies move into mainstream verticals. We are getting some great results with players such as Kapow Technologies – have a look at their Web extraction capabilities. Or take a look at the sophisticated work that Palantir has done with some government agencies in this excellent video.
I leave you with a last thought: if the goal of an organisation is to win business through better knowledge of its markets, then why would we be looking internally for the information we need to understand those markets and activities? And, if we are not looking internally, then why should our standard internal methods be the right approach for the very different external requirements?