A is for “Attaching analytics…”

Extending the thinking – attaching data lakes to your data instances

My learned and experienced colleague recently wrote The Physics and Limits of Big Data – storage is not your issue, and I see value in extending his argument into entertainment.
 
Take his example – a massive data lake storing everything broadcast by an entertainment provider, whether online, satellite, or cable – the delivery method is irrelevant. You are faced with two challenges:
 
A big data storage issue, and a long-term big data analytics issue.
 
The former, as Steve Jones shows, is solvable – scary, but possible. The latter is harder.
 
Any series, episode, sports game, or film these days is subject to a battery of analysis – from classic Nielsen boxes to blogs, Facebook, Twitter, Bebo, and more.
 
A raft of thoughts, analysis, sentiment, and rants is available to producers and marketing teams to steer them at the point of content delivery and shortly after.
 
But what happens afterwards? The data is very transient – Facebook might not be here in five years (it is already showing its age at ten), users leave and remove their content, Twitter archives are heavily monetized, and blogs come and go.
 
How do you capture that information so that you can do not just immediate analytics, but also longer-term trend analysis by actor, studio, writing theme, and so on?
 
The point here is that yes, you need ways of storing very large amounts of content data. Yes, you need ways of capturing instant (first 72 hours) feedback. But you also need a way of capturing information for the long term – you need to attach historic data and insights to your content for future use.
 
Our thinking is that you can create an inexpensive data lake attached to your content – storing both structured data [e.g. Nielsen ratings] and unstructured data [blogs, tweets, sentiment], with all of it available by state, county, and even district – and attach tools on top to extract perspectives: today, tomorrow, and in five years' time when you are considering commissioning a new series.
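As a minimal sketch of that idea – every name, function, and the in-memory storage layout below are illustrative assumptions, not a reference to any real product – structured ratings and unstructured social feedback can sit side by side, partitioned by title and region, and remain queryable long after the original sources have disappeared:

```python
from collections import defaultdict

# Hypothetical content-attached data lake: each record ties a piece of
# feedback (structured ratings or unstructured sentiment) to a title and
# a geographic partition, so it can still be queried years later.
store = defaultdict(list)  # key: (title, state) -> list of records

def add_record(title, state, kind, payload):
    """kind: 'structured' (e.g. ratings) or 'unstructured' (e.g. tweets)."""
    store[(title, state)].append({"kind": kind, "payload": payload})

def query(title, state=None, kind=None):
    """Return all records for a title, optionally filtered by state and kind."""
    out = []
    for (t, s), records in store.items():
        if t != title or (state and s != state):
            continue
        out.extend(r for r in records if kind is None or r["kind"] == kind)
    return out

# Structured and unstructured feedback stored side by side:
add_record("Series A", "NY", "structured", {"rating_share": 4.2})
add_record("Series A", "NY", "unstructured", {"tweet": "loved the finale"})
add_record("Series A", "CA", "unstructured", {"blog": "slow start, strong cast"})
```

In a production setting the same partitioning scheme (title, then region) would typically map onto cheap object storage rather than memory; the point of the sketch is only that keeping both kinds of record under one keying scheme is what makes the five-years-later query possible at all.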
 
If you avoid throwing anything away, the value you don't yet recognize today lets you build a future-proof view of your intelligence – and provide the tooling to marketing, third parties, and distribution channels to maximize the impact of re-releases and new material.
 
You need content and the wider data lake attached – and you need to plan for it now…
