Picture the scene:

Your name is John and you have decided to go to the shopping mall to choose a birthday gift for your wife. As you walk through the mall towards John Lewis, a video display of Angelina Jolie announces:

 “Treat your wife this year John…Bulgari diamonds for that special occasion”. 

When you arrive in the store, another display, this time featuring David Beckham, asks:

“Welcome back John, how’s that pair of jeans working out?… There’s a sweatshirt on sale today that would go really well with them.”

Aside from the privacy implications and ethical constraints in this scenario, do you think this is realistic? Does the technology currently exist to deliver this? The illustration of futuristic personal advertising in the film Minority Report has had much written about it since the film was released in 2002. Retina scanners, insect robots, jet packs and personalised advertising, to name a few, were all concepts developed at a 1999 think tank of leading scientists convened to identify which new technologies would be commonplace by 2054. For those who aren’t familiar, here is a one-minute cut from the film.

As futuristic as this clip seems, all of the technology that underpins the personalised marketing already exists. The only difference is that it hasn’t yet been combined into a single, instantaneous system as it is in the film. So what are the developments in this area that have made it possible? There are three components to consider:

  • Individual identification: recognising who the person is
  • Understanding their state of mind / need state
  • Executing personalised display adverts

Individual identification: there are quite a few technologies that can recognise an individual from video footage or scans.

Figure 1: (From left to right) ATM retina scanners, facial recognition and gait technology

  1. Retina scanners: as used in the film

Despite being one of the most widely known biometric technologies, retina scanning is one of the least utilised. Developed in 1975, retina scanners have been around for a while, but until recently their use has been limited to government security. An individual’s retinal pattern is unique, and in the last few years banks have developed retina scanners for ATM security. So it seems realistic that they could start to be used for other commercial purposes.

  2. Facial recognition:

Computer algorithms can now identify an individual from the structure of their face. Although less accurate than retina scanning, Nametag, an app that uses Google Glass, can take a photo of someone and match it against a database of individuals to provide personal information from their social media profiles. There are currently more than 2 million faces in its database.
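To give a feel for what happens under the hood (a minimal sketch, not Nametag’s actual method): face recognition systems typically reduce a face image to a numeric “embedding” vector, then search a database of known embeddings for the closest match. The names, vectors and threshold below are all invented for illustration; real systems use embeddings with hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(probe, database, threshold=0.8):
    """Return the best-matching name, or None if nothing is close enough."""
    best_name, best_score = None, threshold
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy database: name -> made-up face embedding
database = {
    "John":  [0.9, 0.1, 0.3],
    "Sarah": [0.1, 0.8, 0.5],
}

probe = [0.88, 0.12, 0.28]  # embedding extracted from the shopper's face
print(identify(probe, database))  # → John
```

The threshold matters: set it too low and strangers are misidentified; set it too high and returning customers go unrecognised.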

  3. Body identification:

Although the least well developed and probably the least accurate of the three identification technologies, it has been shown that anything from someone’s walk (gait) to their knees can be used to identify them.

Understanding their state of mind / need state
Advertising is most effective when it arouses emotion, and a company called EMOVU can now analyse the emotional responses of participants watching video content. By analysing a respondent’s facial structure it first detects their age and gender; it can then track their facial reactions to a video to determine basic emotional responses. What makes the technology objective and the outcome measurable is that, despite a multitude of cultural differences, humans tend to elicit the same basic emotional responses. And whilst the offering is currently used to understand the emotional response to watching an advert, the application could be reversed: detect an individual’s initial emotional state, then select an advertisement in real time that targets the desired emotional state. As in Minority Report: “John Anderton…you could use a Guinness right about now.”
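The “reversed” application described above could be sketched as follows. This is purely illustrative and not based on EMOVU’s product: the emotion scores are stand-ins for what a video-analytics engine might emit, and the ad catalogue is invented.

```python
# Hypothetical emotion scores for a shopper, as a video-analytics
# engine might report them (values sum to 1.0)
detected_state = {"happy": 0.1, "neutral": 0.2, "stressed": 0.7}

# Each advert is tagged with the emotional state it is designed to target
ad_catalogue = {
    "Guinness: you could use one right about now": "stressed",
    "Bulgari: diamonds for that special occasion": "happy",
    "John Lewis: everyday essentials": "neutral",
}

def select_advert(state, catalogue):
    """Pick the advert targeting the shopper's dominant emotion."""
    dominant = max(state, key=state.get)
    for advert, target in catalogue.items():
        if target == dominant:
            return advert
    return None

print(select_advert(detected_state, ad_catalogue))
# → Guinness: you could use one right about now
```

In a real deployment the hard part is not this lookup but producing reliable emotion scores from video in the first place.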

Figure 2: Example of EMOVU Analytics

Advert Execution:
There are several technical and functional requirements for executing adverts in the same way as in Minority Report. The main one is integrating the technologies at real-time speed. ‘Individual identification’, ‘understanding the need state’ and ‘execution’ happen instantaneously in the film, although at the current speed of development it surely won’t take 40 years.

In terms of the actual delivery, whilst it doesn’t incorporate the need state, Alan Sugar’s ‘OptimEyes’ system is now displaying targeted adverts at petrol stations. This doesn’t recognise who an individual is, so it could never link to a CRM database profile, but it does use facial recognition to identify age and gender and so deliver personalised advertising at a customer-segment level. In terms of tailoring the advert by linking to a CRM database and including John’s name in the advert itself, a company called rednun has created the capability to do this, but it requires you to receive a text and provide information upfront. So what if you can’t detect who someone is from video, scanners or an online form? This is where predictive analytics comes in. By using past behaviour (e.g. your social, search or browsing history) to predict what you might do next, ‘hyper-personalised’ content can be delivered. It’s not tailored to you specifically, and doesn’t account for your need state, but it’s a step up from content targeted at your ‘age and gender grouping’.
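The predictive step can be sketched very simply: infer likely interests from past browsing behaviour, then rank candidate adverts by how well their tags overlap with those interests. The history, tags and adverts below are all invented, and real systems use far richer signals and models than keyword overlap.

```python
# Made-up browsing history for a shopper
browsing_history = ["slim-fit jeans", "denim jacket", "trainers"]

# Candidate adverts, each tagged with related interests
adverts = {
    "Sweatshirt sale": {"jeans", "casual", "trainers"},
    "Diamond jewellery": {"luxury", "gifts"},
    "Garden furniture": {"home", "outdoor"},
}

def interests_from_history(history):
    """Crude interest extraction: split each viewed item into keyword tokens."""
    tokens = set()
    for item in history:
        tokens.update(item.replace("-", " ").split())
    return tokens

def rank_adverts(history, adverts):
    """Rank adverts by how many of their tags match the inferred interests."""
    interests = interests_from_history(history)
    scored = {name: len(tags & interests) for name, tags in adverts.items()}
    return sorted(scored, key=scored.get, reverse=True)

print(rank_adverts(browsing_history, adverts)[0])  # → Sweatshirt sale
```

This is segment-of-one targeting in spirit only: it predicts from behaviour rather than identifying the individual, which is exactly the distinction the paragraph above draws.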
It often seems that too large a proportion of technological advancements are developed commercially for advertising, but there are other applications for video analytics in a wider role. Developments in analytics that use CCTV footage can monitor and implement automated decisions relating to the movement of people, traffic and assets, detecting congestion at stations and shopping malls and even theft from art galleries. Fairly soon regulators will need to decide what the ethical limitations are; as far as the technology is concerned, 2054 isn’t looking too far away.