Data remains a hot topic, and these days even small and medium-sized businesses are collecting it. Data is the new oil, and businesses in every industry imaginable are rushing to get their hands on it. The issue is that we are often collecting more data than we can effectively analyse and process, leaving businesses in the dilemma of holding insights they cannot act on.
This is principally because we are rapidly approaching the physical limits of how small we can make computer chips. Moore’s law observes that the number of transistors (devices that switch or amplify electrical signals) on a semiconductor chip doubles roughly every two years – an observation that held true until around 2013. However, as even Gordon Moore himself said, there is a natural point of saturation unless there is a leap in the fundamental technology. Transistors are currently around 17 nanometres long, roughly 500 times smaller than a red blood cell. How much smaller can they get?
Scientists at MIT (Massachusetts Institute of Technology) and elsewhere have been theorising an alternative method of computing: quantum computing. It operates in an entirely different manner, exploiting quantum-mechanical effects such as superposition and entanglement. (Quantum tunnelling, another such effect, is in fact one of the obstacles to making transistors any smaller.)
If we want to explain quantum computing, we first need to understand what happens in a classic computer. The smallest piece of digital information in any computer is the bit, which is always either a one or a zero. Strings of these ones and zeros make up binary code, the foundation of all software and of complex files such as audio and video. Quantum computers instead use qubits. A qubit is slightly different: like a bit, it represents one unit of information, a one or a zero. The difference is that a qubit doesn’t have to be in just one of these states – it can exist in a combination of both. Put simply, while the qubit remains unobserved it exists as a blend of one and zero but, the instant you measure it, it collapses into one definite state, either a one or a zero. This combination of states is called superposition.
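The maths behind a single qubit can be sketched in a few lines of ordinary Python. This is a classical simulation for illustration only, not how real quantum hardware works: a qubit is modelled as a pair of amplitudes whose squares give the probabilities of measuring a zero or a one.

```python
import math
import random

def measure(alpha: float, beta: float) -> int:
    """Collapse a qubit with amplitudes (alpha, beta) into a definite 0 or 1.

    The probability of reading 0 is alpha**2 and of reading 1 is beta**2,
    so the squared amplitudes must sum to one.
    """
    assert math.isclose(alpha**2 + beta**2, 1.0), "amplitudes must be normalised"
    return 0 if random.random() < alpha**2 else 1

# A classic bit is always exactly 0 or 1. This qubit is an equal blend of
# both until it is measured.
alpha = beta = 1 / math.sqrt(2)   # equal superposition of 0 and 1
results = [measure(alpha, beta) for _ in range(1000)]
print(results.count(0), results.count(1))  # roughly 500 / 500
```

Each individual measurement gives a definite answer; the superposition only shows up in the statistics over many runs.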
Superposition allows for a significant boost to memory and processing speeds. Take 4 bits in a classic computer. These 4 bits have 16 possible combinations, from 0000 to 1111, but at any given moment they can hold only one of them.
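Those 16 combinations are easy to list, which makes the constraint concrete – a classical 4-bit register is always sitting in exactly one of these values:

```python
# All 16 possible states of 4 classical bits, as binary strings.
combinations = [format(n, "04b") for n in range(16)]
print(len(combinations))   # 16
print(combinations[:4])    # ['0000', '0001', '0010', '0011']
```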
However, 4 qubits could theoretically be in a superposition of all 16 combinations at once – loosely speaking, giving a quantum computer 16 values to work with in parallel. And the advantage compounds: n qubits span 2^n states, so every qubit added doubles the size of the state the machine can hold.
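A minimal sketch of this idea, again as a classical simulation rather than real quantum hardware: the state of a 4-qubit register is a vector of 2^4 = 16 amplitudes, and measuring the register collapses it to a single 4-bit outcome drawn from their squared probabilities.

```python
import random

n = 4
dim = 2 ** n                        # 16 basis states for 4 qubits
state = [1 / dim ** 0.5] * dim      # equal superposition over all 16

# Probabilities are the squared amplitudes; they must sum to 1.
probs = [a * a for a in state]
print(round(sum(probs), 6))         # 1.0

# Measurement collapses the register to one definite 4-bit outcome.
outcome = random.choices(range(dim), weights=probs)[0]
print(format(outcome, "04b"))
```

Notice that simulating n qubits this way needs memory for all 2^n amplitudes – which is exactly why real quantum hardware, rather than simulation, is so attractive.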
This has untold benefits for processing speeds and for the amount of data we can process at once. A jump in power of this kind will have dramatic and lasting effects on industries that rely on large-scale simulations – environmental science, for example, for future weather predictions, or healthcare, where researchers are trying to simulate the human brain. It could also be the driving force behind the deep learning that powers neural networks, bringing genuine artificial intelligence closer to reality.
One thing is for certain though – businesses will use this new technology to finally crack the big data dilemma, where companies collect more data about their consumers than they can analyse. Consulting companies, such as Capgemini, will be able to harness this processing power to help their clients analyse data quickly and gather critical insights in real time. They’ll also be able to use the technology to transform their clients’ digital architecture in profound new ways, for example through virtual machines. The use cases for this new technology are almost endless, and it will without a doubt shape the future of computing.
Rishi joined Capgemini’s Future of Technology capability in March 2018, having studied Strategic Marketing at Imperial College London. He recently finished work on the Lloyds IAM project and has now moved into the Applied Innovation Exchange for a three-month secondment as an innovation consultant. Prior to joining Capgemini, Rishi’s experience was primarily within start-ups, having worked for an augmented reality advertising firm before moving to start his own venture.