Computing Futures

You ain’t seen nothing yet

Digital transformation is just the beginning – a beguiling combination of bits, neurons and qubits will help push your business into a new age of computer-powered disruption.

We are approaching the end of an era. The existing strategies for creating more compute power for business technology systems are reaching their limits, yet demand for that power will continue to rise at an unprecedented rate. So what does this mean for your business?

Your company will need new hardware infrastructure to power ever-more sophisticated software systems, with ever-increasing data volumes. Neuromorphic computing and quantum computing systems will fill this void and are already moving from R&D labs to business environments.

Businesses must start preparing now for a Compute-Next reality that will leverage bits, neurons and qubits in a new hybrid platform. Those that adopt new architectures, applications, tools and skills fastest will have a significant advantage.

Benefits include better risk models in finance, faster drug discovery in pharma, more efficient flows of goods in logistics, faster development of new materials in manufacturing, and better consumer insights in retail. These benefits will come from the ability to build more complex mathematical models through the next generation of computing systems.

Bigger gains from even bigger thinking

For over 50 years, in line with Moore’s law, computational power has doubled roughly every two years. This trend has been a driving force of technological and social change, economic growth and overall increases in human wellbeing.

The continuing miniaturization of microprocessor architecture has driven this trend. Miniaturization reached the five-nanometer level in 2020 and, although one further step to three nanometers is envisioned, we are approaching the boundaries of what is physically possible.
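To make that compounding concrete, a quick back-of-the-envelope calculation in Python shows what doubling every two years adds up to over five decades:

```python
# Back-of-the-envelope illustration of Moore's law: computational power
# doubling roughly every two years, compounded over 50 years.
years = 50
doubling_period = 2  # years per doubling

doublings = years / doubling_period
growth_factor = 2 ** doublings

print(f"Doublings over {years} years: {doublings:.0f}")
print(f"Cumulative growth factor: {growth_factor:,.0f}x")
# Doublings over 50 years: 25
# Cumulative growth factor: 33,554,432x
```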

New approaches – such as high-performance cloud computing, neuromorphic computing and quantum computing – are actively being explored. These new approaches rely less on miniaturization and use different principles to create further gains.

Neuromorphic computing – which uses very-large-scale integration systems to mimic the structure of the human brain – is a natural fit for artificial intelligence, which in turn supports developments in a host of computing-intensive areas, such as the Internet of Things, Industry 4.0 and Digital Twins.
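As an illustration of the principle rather than any particular chip’s programming model, the sketch below simulates a leaky integrate-and-fire neuron, one of the simplest spiking-neuron models that neuromorphic hardware implements in silicon. All parameter values are illustrative.

```python
import numpy as np

def leaky_integrate_and_fire(input_current, dt=1.0, tau=20.0,
                             v_rest=0.0, v_threshold=1.0):
    """Simulate a single leaky integrate-and-fire neuron.

    The membrane potential decays toward v_rest with time constant tau,
    integrates the input current, and emits a spike (then resets)
    whenever it crosses v_threshold. Units and values are illustrative.
    """
    v = v_rest
    spikes = []
    for t, current in enumerate(input_current):
        # Leaky integration: decay toward rest, plus the driven input.
        v += dt * (-(v - v_rest) / tau + current)
        if v >= v_threshold:
            spikes.append(t)   # record the spike time
            v = v_rest         # reset after firing
    return spikes

# A constant input drives periodic spiking.
stimulus = np.full(100, 0.08)
print(leaky_integrate_and_fire(stimulus))
```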

Quantum computing, meanwhile, combines quantum physics, information theory and computer science to create a new field of computation – one that makes it possible to simulate processes through complex mathematical models that were previously considered infeasible.
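For a flavour of what computing with qubits means, the following sketch simulates the creation of a two-qubit entangled (Bell) state using plain linear algebra. Quantum hardware performs this natively; here numpy simply plays the role of the mathematical model.

```python
import numpy as np

# Single-qubit gates as 2x2 unitary matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: creates superposition
I = np.eye(2)

# Two-qubit CNOT gate: flips the second qubit when the first is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>, apply H to the first qubit, then CNOT: a Bell state.
state = np.array([1, 0, 0, 0], dtype=complex)
state = CNOT @ (np.kron(H, I) @ state)

# Measurement probabilities: only |00> and |11> remain possible.
for basis, amplitude in zip(["00", "01", "10", "11"], state):
    print(f"|{basis}>: p = {abs(amplitude)**2:.2f}")
# |00>: p = 0.50, |01>: p = 0.00, |10>: p = 0.00, |11>: p = 0.50
```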

Alongside these advances, classical computing will also continue to evolve, leveraging both neuromorphic and quantum computing to create a hybrid computer that can prepare, process and store ever-increasing amounts of data. This trend will be supported by advances in software abstraction that will allow us to distribute computing to the cloud.

Developing new ways to carry the load

We believe that the future of compute is not a choice between neuromorphic and quantum computing, but instead a hybrid combination. Businesses will apply the most suitable method of computation depending on the characteristics of the problem at hand.

In the majority of cases, this will be a combination of bits, neurons and qubits. Neuromorphic and quantum computing will work in synergy with classical super- and high-performance computing. Here, classical compute systems will provide storage, process data and orchestrate workloads between specialized hardware.
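No standard for this kind of orchestration exists yet, so the sketch below is purely hypothetical: it shows the shape such a routing layer might take, with a classical host inspecting a job’s characteristics and dispatching it to the most suitable backend.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    """Hypothetical description of a compute job's characteristics."""
    name: str
    is_combinatorial: bool   # e.g. optimization over an exponential search space
    is_pattern_based: bool   # e.g. learning/inference on sensory-style data

def choose_backend(job: Workload) -> str:
    """Route a job to bits, neurons or qubits based on its shape.

    A purely illustrative heuristic: real orchestration would also weigh
    problem size, accuracy needs and hardware availability.
    """
    if job.is_combinatorial:
        return "quantum"        # qubits: sampling/optimization workloads
    if job.is_pattern_based:
        return "neuromorphic"   # neurons: event-driven AI workloads
    return "classical"          # bits: storage, preparation, everything else

jobs = [
    Workload("portfolio-risk-optimization", True, False),
    Workload("sensor-stream-classification", False, True),
    Workload("nightly-etl-pipeline", False, False),
]
for job in jobs:
    print(f"{job.name} -> {choose_backend(job)}")
```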

In the next three years alone, we will enter the exascale computing era, where systems will be able to perform at least one exaflop – one quintillion calculations per second. This era will open an avenue into the exploitation of neuromorphic and then quantum computing.
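To put that figure in perspective, the arithmetic below compares an exascale system with an ordinary laptop; the laptop’s throughput of roughly 10^11 operations per second is an assumption for illustration.

```python
# One exaflop = 10**18 floating-point operations per second.
EXAFLOP = 10**18
LAPTOP_FLOPS = 10**11   # illustrative assumption for a modern laptop

workload_ops = 10**21   # a hypothetical workload: a thousand exa-operations

print(f"Exascale system: {workload_ops / EXAFLOP:,.0f} seconds")
print(f"Laptop:          {workload_ops / LAPTOP_FLOPS / (86_400 * 365):,.0f} years")
# Prints roughly: 1,000 seconds (about 17 minutes) for the exascale
# system, versus 317 years for the laptop.
```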

But the effective combination of bits, neurons and qubits will require a unified, cross-technology programming environment that helps developers get the most from their applications without having to deal with the complex specifics of each computational paradigm.

That means an alternative software stack is required to make the best use of future compute approaches. The stack needs to deal not only with the massive parallelization of processing threads, but also with the requirements of different processor architectures. We believe that new classes of middleware will help to prepare data and program algorithms.

Furthermore, hybrid computing – which spans the new era of bits, neurons and qubits – will require additional software to orchestrate computational loads. That includes tasks such as hardware control, data persistency and advanced job scheduling.
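Again as a hypothetical sketch rather than a description of any existing product, a minimal version of such an orchestrator could be built around a priority queue, with hooks where hardware control and data persistency would plug in.

```python
import heapq

class HybridScheduler:
    """Toy job scheduler for a hybrid compute platform (illustrative only).

    Jobs are queued with a priority and a target backend; dispatch pops
    the most urgent job and hands it to the matching hardware driver.
    """
    def __init__(self):
        self._queue = []
        self._counter = 0   # tie-breaker keeps insertion order stable

    def submit(self, priority: int, backend: str, payload: str):
        heapq.heappush(self._queue, (priority, self._counter, backend, payload))
        self._counter += 1

    def dispatch(self):
        while self._queue:
            priority, _, backend, payload = heapq.heappop(self._queue)
            # In a real system: persist job state, acquire the device,
            # translate the payload for that backend, collect results.
            print(f"[p{priority}] -> {backend}: {payload}")

scheduler = HybridScheduler()
scheduler.submit(2, "classical", "prepare training data")
scheduler.submit(1, "quantum", "sample optimization landscape")
scheduler.submit(3, "neuromorphic", "run inference on event stream")
scheduler.dispatch()
```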

Here’s something you’re never gonna forget

The technology stack is still developing, and successful application of this new era of computing will require expert knowledge of both the software and hardware layers. Taking advantage of these emerging technologies will not be straightforward. Business leaders should recognize that their organizations need to start investigating Computing Futures now. The technology change will be as disruptive as the IT revolution that transformed business in the early 1980s.

By using frameworks, patterns and building blocks developed in close cooperation with strategic technology partners, organizations can start to experiment now and build the skills to succeed with these emerging computing technologies.