Whether you work in a managerial or operational environment, you are likely to be bombarded with information. The challenge is typically described as that of having access to the right facts and figures at the right time. However, I believe there is a more fundamental challenge that is sometimes overlooked: the challenge of interpretation. For example, you are presented with information which, you are assured by your team, is correct and can be used confidently in a certain way. This, as I hope to show, is not always the case. Worryingly, interpreting data and information is increasingly being pushed as a science rather than an art. Depending on the environment you work in, the apparent veracity of the information can be very misleading. Let me describe this using three examples.

The Good

Take the retail environment. Perhaps you own, or franchise, stores, and the whole organisation uses the same EPOS tills and stock codes. Everything is precise, barring the odd cross-stock mistake. Depending on your control mechanisms, errors introduced by this type of mistake are likely to be quite small. As a manager, you can probably trust your reports. This doesn't necessarily mean other large challenges don't exist but, in this instance, perhaps you have an error rate of 1-2%.

The Bad

Take the example of Local to Central Government reporting structures: it is easy to see how demands for additional statistics create issues. Typically, the information asked for is unusual; it may not be routinely collected locally in the manner required. It is also usually required on a timescale that demands the use of existing information, rather than allowing the luxury of collecting it properly. So the information is gathered locally and massaged into the format demanded. The challenge here is whether the information from each source is semantically the same.
Can you rely on a report detailing school absenteeism when families move between councils and schools, and when figures are demanded per term but the information is collected per annum? What is the error rate then? 20-30%?

The Ugly

One police force reports an incident and identifies a person using structured reporting that uniquely identifies that person's role as a witness. In another force, the same type of incident is captured in unstructured form: witnesses may be identified in the text by a description such as "John SMITH identified the accused because of his...". If this information needs to be shared nationally in structured form, is John Smith the accused because his name appears within three words of "the accused", or is he the witness? Will Mr Smith get a tap on the shoulder from a force that has not read the original unstructured report? In this case, the information could be 100% wrong!

Is this inevitable, and is it even possible to fix? In my next articles I want to explore, in greater detail, the issues behind each of these examples.
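To make the Ugly example concrete, here is a minimal sketch of the kind of naive proximity rule described there. Everything in it is hypothetical: the function name, the three-word window, and the report text are my own illustration, not any force's actual matching system.

```python
def naive_role(text: str, name: str, window: int = 3) -> str:
    """Crude heuristic: label `name` as 'accused' if their surname
    appears within `window` words of the word 'accused'; otherwise
    assume they are a witness."""
    words = [w.strip(".,\"'").lower() for w in text.split()]
    surname = name.split()[-1].lower()
    if surname not in words or "accused" not in words:
        return "unknown"
    distance = abs(words.index(surname) - words.index("accused"))
    return "accused" if distance <= window else "witness"

report = "John SMITH identified the accused because of his scar."
# "smith" sits exactly three words from "accused", so the heuristic
# labels the witness as the accused.
print(naive_role(report, "John Smith"))
```

Run against the sample sentence, the rule confidently mislabels the witness as the accused, which is precisely how a 100%-wrong structured record can be generated from a perfectly accurate unstructured report.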