Thriving on Data #5 – Cognito Ergo Sum
The emergence of increasingly aware ‘cognitive computing’ implies a new era of technology and human collaboration. Artificial Intelligence moves one step further as it adopts ‘fuzzy’ human ways of absorbing and interpreting information, rather than following pre-defined rules. One of its main merits is the lack of friction between the individual seeking assistance and the technology providing it. Faster, more accurate decisions via ‘self-learning’ robotic software mean leaner operations, better insights and more business value. But don’t worry: a machine can only be so human, can’t it?
Is AI really elementary, my dear Watson?
Ray Kurzweil famously predicted that by 2029, computers will pass the ‘Turing Test’ – the point at which intelligent machine behavior becomes indistinguishable from that of a human. He first made that prediction back in 1999. He now predicts that by 2045, computers will be a billion times more powerful than all of the human brains on Earth combined. Now, that’s deep learning, isn’t it?
Systems such as IBM Watson are on the threshold of commercially viable processing – and indeed self-learning – of semantic language content. Whilst their progress has not been stellar thus far, their ability to digest unstructured information at millions of times the speed of human cognition, coupled with the emerging fields of cognitive and quantum computing, is narrowing the context gap at a faster rate than ever before.
We stand at a tipping point comparable with the last industrial age. The next evolution in computing has the potential to create machines that will know the answer to our questions before we ask them. This capability will be based on an insatiable ability to process enormous amounts of human and machine-generated log data in its raw and unstructured form to analytically derive context, meaning and perhaps most importantly of all, dynamic underlying relationships.
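The notion of deriving relationships from raw, unstructured data can be sketched in a few lines. Below is a minimal, illustrative co-occurrence count in standard-library Python; the sample feed and the capitalised-term heuristic are assumptions chosen for demonstration, not a description of how any real cognitive system works:

```python
from collections import Counter
from itertools import combinations
import re

def co_occurrences(text):
    """Count how often pairs of capitalised terms appear in the same
    sentence -- a crude stand-in for the 'dynamic underlying
    relationships' hidden in unstructured text."""
    pairs = Counter()
    for sentence in re.split(r"[.!?]", text):
        # Capitalised words serve as a naive proxy for named entities.
        terms = sorted(set(re.findall(r"\b[A-Z][a-z]+\b", sentence)))
        pairs.update(combinations(terms, 2))
    return pairs

# A toy 'event feed' (invented for illustration).
feed = ("Watson read Wikipedia. Watson won Jeopardy. "
        "Kurzweil predicts machines will pass the Turing test.")

for (a, b), n in co_occurrences(feed).most_common(3):
    print(f"{a} <-> {b}: {n}")
```

Scaled up to real event streams, it is exactly this kind of relationship mining – performed continuously and at machine speed – that the paragraph above envisages.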
A large proportion of our effort in the current digital drive to business agility – and indeed consumer and market responsiveness – is consumed by the ‘drowning’ field of integration. This torrent of interface logic attempts to structure relationships that change too frequently; such relationships are simply ‘unstructurable’.
What if in future, we left the analysis and interpretation of key data events inside and outside our organizations in the virtual hands of cognitive computers powered by the crowd in the cloud?
Instead of focusing on ‘obsessive integration syndrome’, today’s CIO would drive ‘federated intelligence’. IT would admit defeat in the fight for corporate control and structure of data, and subcontract this process to regulated machines that enrich core data with internal learning algorithms, adapting to the changing needs of the customer, employee, shareholder and regulator.
In 2011, IBM’s Watson competed on the US game show ‘Jeopardy!’ and won. What is astounding is that its knowledge was not hand-coded by engineers: Watson largely taught itself, by reading Wikipedia – all of it.
This cognitive reasoning, underpinned by semantic computing (or natural language processing), has the potential to automate patient diagnosis, research and development activities, product ideation cycles, financial and risk decisioning and supply-chain optimization, to name but a few candidates. It is even enabling ‘personality profiling’ driven by big data and psychology.
The future for doctors, pharmacists, actuaries, quants, product designers, data scientists, auditors, recruiters and pilots will be very different from today’s. They will be pushed ‘higher up the information supply-chain’ to interpret, govern and act on change, rather than merely participating in the journey.
A single GE turbine generates more event data in one day than the entire global Twitter feed. This ‘data dark matter’ cannot easily be interpreted using current structured, highly manual data-processing techniques. Meanwhile, the expanding sharing economy is pushing future data ownership from the CEO back to the consumer in the crowd.
A future emerges in which machines facilitate strategic, economic and political decisions and accelerate a global process of creation, ideation and rationalization. They will commoditize the acquisition, marshaling and interpretation of global thoughts, actions, events and sentiment from socially-enabled people, machines and sensors to enrich a globally accessible repository of open data.
So the ‘most human’ of decisions in future may not require a pulse at all.
Early cognitive, deep-learning and natural-language capabilities have already arrived. Our own ‘personal data Sherlock’ may await us in emerging technologies.
Future IT service models will drive cloud-based adoption of increasingly cognitive ‘pick & mix’ application stores and APIs that will be combined to filter, analyse and deliver intelligent semantic context against global data events in a form that makes sense to both the enterprise brand and employees.
We can now rent brain power by the hour and be in with the in-crowd. After all, why bother to continually integrate your disparate systems against this torrent of global event data when a cognitive ‘Enterprise Relationship Bus’ could summarize your daily status against potential risks and opportunities in direct linkage to your customers, products and services?
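The ‘pick & mix’ composition of cognitive services described above might be sketched as a simple pipeline. Everything here is a hypothetical stand-in: the stage names (`filter_noise`, `add_context`, `summarise`) and the behaviour of the ‘Enterprise Relationship Bus’ are illustrative assumptions, not real APIs:

```python
from typing import Callable

# Hypothetical stand-ins for rented cognitive services; in practice
# each would be a cloud API call rather than a local function.
def filter_noise(events):
    """Keep only the events flagged as relevant."""
    return [e for e in events if e.get("relevant")]

def add_context(events):
    """Enrich each event with (invented) semantic context."""
    return [{**e, "context": "supply-chain"} for e in events]

def summarise(events):
    """Condense enriched events into a daily status line."""
    return f"{len(events)} relevant event(s) today"

def pipeline(*stages: Callable):
    """Chain 'pick & mix' services into one callable brief."""
    def run(data):
        for stage in stages:
            data = stage(data)
        return data
    return run

daily_brief = pipeline(filter_noise, add_context, summarise)
print(daily_brief([{"relevant": True}, {"relevant": False}]))
# prints "1 relevant event(s) today"
```

The point of the sketch is the shape, not the stages: each rented capability is swappable, and the ‘bus’ is just the composition of whatever services the enterprise picks.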
There are tangible real-world examples in place today: from cognitive cooking to personal diet advice, from improved heart-disease diagnosis to faster global crisis response, and from biodiversity conservation to faster recruitment screening; the possibilities are endless.
Conscious technology, even more conscious enterprises: it may make the difference between just being there and a true digital existence.
Your expert: Manuel Sevilla
Part of Capgemini’s TechnoVision 2015 update series. See the overview here.