Werner Heisenberg was a German physicist and one of the key creators of quantum mechanics. In 1927 he published the uncertainty principle for which he is best known. The principle states: “It is impossible to determine accurately both the position and the velocity of a particle at the same instant.” This might require a short explanation for physics illiterates like me. Position is the identification of a relative location, in other words: where you are. Velocity is speed plus direction, in other words: where you are going. There is a short ‘true story’ that illustrates this. It is rumored that Heisenberg went for a drive one day and got stopped by a traffic cop. The cop asked, “Do you know how fast you were going?” and Heisenberg replied, “No, but I know where I am.”

Data at rest

The same seems to apply to many organizations today. They know where they are (or have been), but they do not know where they are going. The main reason for this is that their (big) data is at rest: mostly inactive data that is stored physically in some digital form, for example a database or a data warehouse. It is used primarily for historic reporting or analysis of mostly internal data. Although the quality is high (data warehouses, for example, are often associated with a high level of data quality and the creation of a single version of the truth), the data is typically slow to arrive (batch-oriented overnight architectures) and its value is therefore relatively low.

Data in motion

To cope with the new economic reality many organizations could do with an intelligence make-over and transform from hindsight analysts into foresight action takers. There is a need for a new type of organization where the (big) data is not at rest but in motion, or even in use; where it flows quickly through the organization and changes business outcomes on the fly, and where not just a small team of IT experts but everybody within the organization participates. Research (for example by The Economist: http://www.capgemini.com/resources/the-deciding-factor-big-data-decision-making) shows that most C-level executives are convinced that value from data can be found in real time. One of the winners of the CRM excellence award, Turkcell (a Turkish telco), found that the right offering to random people yielded more than batch-time offerings to segmented groups. Another example of why speed is important.


Big Data has created a paradigm shift in the way we look at decision making. We see structured data coming from inside the organization (like ERP and POS systems) alongside unstructured data from sources like sensors in machines or applications. But this is also the time when external data, from websites or social media, tells us much more about our own performance. Not with facts or dimensions from your data warehouse, but with opinions on Twitter and likes on Facebook. This is the time when Facebook can predict that somebody is about to cheat or commit suicide, when Google can predict a flu outbreak, and when retailers can predict that your teenage daughter is pregnant. It is about bringing outside intelligence inside your organization.

It’s the speed, stupid

Although the name suggests otherwise, Big Data is not about volume. Volume is about data at rest: storing massive volumes of data at lower cost, probably in the cloud. Take for example my CD collection. I used to have two shelves of CDs. Now they are empty. Why? Because I have Spotify. I have nearly all the songs in the world available via the internet at a fraction of the cost (one-fourth of the price of a single new CD covers my monthly subscription). Big Data, then, is not about where you are (position) but about where you are going (velocity), with speed as the deciding factor. To paraphrase Bill Clinton’s successful 1992 campaign slogan: It’s the speed, stupid. Supported by technology such as in-memory computing, we must quickly find value in data, for example by using complex event processing tools or data exploration and visualization. For me, there is no uncertainty in that.
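To make "data in motion" a little more concrete, here is a minimal sketch of the kind of pattern a complex event processing tool evaluates: react to events as they stream past, rather than after an overnight batch. The event shape, the threshold, and the window size below are illustrative assumptions, not a specific product's API.

```python
from collections import deque
from datetime import datetime, timedelta

# Illustrative sliding-window rule: alert when a sensor reports a value
# above THRESHOLD at least MIN_HITS times within a 60-second window.
# All names and numbers here are hypothetical examples.
THRESHOLD = 100.0
WINDOW = timedelta(seconds=60)
MIN_HITS = 3

def process_stream(events):
    """events: iterable of (timestamp, sensor_id, value), in time order."""
    hits = {}    # sensor_id -> deque of timestamps of above-threshold readings
    alerts = []  # (timestamp, sensor_id) pairs, emitted as the stream flows
    for ts, sensor, value in events:
        if value <= THRESHOLD:
            continue
        window = hits.setdefault(sensor, deque())
        window.append(ts)
        # Drop readings that have slid out of the 60-second window.
        while window and ts - window[0] > WINDOW:
            window.popleft()
        if len(window) >= MIN_HITS:
            alerts.append((ts, sensor))
            window.clear()  # reset so one burst triggers one alert
    return alerts
```

The point of the sketch is the shape of the computation: the decision ("alert now") is taken while the data is still moving, not in a report the next morning.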