Again, not a new topic – but I realise increasingly that many basic requirements don't change much; it's the capabilities at our disposal that change, and this leads to new answers. As real-time data analysis becomes the norm for helping front office workers make optimised decisions based on sliced-and-diced information, the amount of information to assimilate from the screen when making that all-important human judgement will grow. At the same time as processing sharply focussed event data streams, there will be a need to understand more about the context, or background situation.
Think of it this way: it's good to know that temperatures are climbing by 4°C when you are selling ice cream, but if that rise comes as you emerge from the biggest cold spell of the winter it won't have the same impact as the same 4°C rise on top of the biggest heat wave of the year! In real life it won't be such a simple challenge; a host of other factors will need to be considered. What about the impact on a supply chain of a hot spell ending suddenly with a cold snap, followed by temperatures climbing sharply again? The likelihood is that the supply chain will still contain ice cream that must be sold before further orders can come through, so that's another factor to consider. Add a market the size of North America or Europe and every issue is magnified, from weather forecasting, to the length of the supply chain, to local views of what constitutes ice cream weather, and probably a few other things as well.
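The point about context can be sketched in a few lines of Python. Everything here is invented for illustration – the function name, the thresholds, the classifications – but it shows the core idea: the identical event (a 4°C rise) produces a very different demand signal depending on the contextual baseline it arrives on.

```python
# Illustrative sketch only: names and thresholds are hypothetical,
# chosen to show how context changes the meaning of the same event.

def demand_signal(baseline_c: float, rise_c: float) -> str:
    """Classify the likely impact of a temperature rise on ice cream
    demand, given the baseline temperature it starts from."""
    new_temp = baseline_c + rise_c
    if new_temp < 15:
        return "negligible"   # still too cold for ice cream weather
    elif new_temp < 25:
        return "moderate"     # pleasant, some uplift in demand
    else:
        return "strong"       # the rise lands on top of a heat wave

# The same +4°C event, read against two different contexts:
print(demand_signal(baseline_c=2, rise_c=4))   # after a cold spell -> negligible
print(demand_signal(baseline_c=28, rise_c=4))  # after a heat wave  -> strong
```

A real model would of course fold in the other factors mentioned above – supply chain stock, regional definitions of "ice cream weather", forecast reliability – but the shape of the logic is the same: the event stream alone is not enough; the baseline context decides what the event means.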
IBM Research labs have been experimenting with the whole notion of making large amounts of data easy to understand using new visual formats. Best of all, it's being done in public, so you get a real chance to explore it yourself. Called 'A Visual Bill Explorer', it is a web-based visualisation of 2009 U.S. congressional legislation – a dry, complex, large and difficult-to-understand topic if ever there was one! The demonstration tour does add a certain amount of amusement by featuring Tiger Woods and his congressional award, a sure-fire hit when picked before the media stories following his car crash started to explode!
In the same vein, but with a strongly practical application, MIT has been experimenting with visualisation in the supply chain. In their work on this topic they realised that the real value lay in using visualisation to display the context and source of the data. To quote from their very interesting report: "SourceMap's real virtue is that the data itself is fully sourced. Like links at the bottom of a Wikipedia article and the accompanying edit history, you know exactly who added the data and where the data came from". In the MIT example this is used to build a map of the supply chain behind a specific item of IKEA furniture.
The MIT work seems to have achieved something very important: a way to deal with the oldest problem in computing, GIGO, or Garbage In, Garbage Out. The danger with real-time analytics is that we have found the power to produce garbage much faster, as we feed the funnel with more and more data from external sources about which we may not know very much. Context data is particularly risky in this respect, as by its very nature we are relying on others to help establish and maintain it.
But just how will contextual data look? An example that illustrates it well is the hugely interesting Housing and Transportation Affordability Index site covering the USA. It's a particular favourite of mine for several reasons: it illustrates what a mashup is meant to achieve, before the term and many examples became sanitised into 'portals'; it shows huge amounts of data organised simply and visually; and it is a perfect example of a community-built and maintained contextual database.
Don't just take a quick look; take the time to really understand exactly what is available here and how it could be used. Three examples are included: household greenhouse gas emissions, the impact of gas prices, and how to compare neighbourhoods on various factors. Think of this as the underlying contextual input to a detailed and specific set of business models for real-time data analysis in, say, an energy supply company. These are new aspects of an old topic that will increasingly be part of the game change that genuine real-time data analysis will bring, and they will require new skills and capabilities to use successfully. It's not just the 'slice and dice' of internal data; it's the 'compare and contrast' with market data that is significant!