Can quantum technology assist in the next COVID crisis? Part 3

Capgemini
2022-03-31

In a series of three articles, we zoom in on three potential applications of quantum computing in light of the COVID-19 pandemic. This third article focuses on modeling the evolution of the coronavirus as a machine learning application. We also draw an overall conclusion about the potential of quantum computing in complex situations like the COVID-19 pandemic.

It is better to prevent than to cure. Preventing a pandemic or epidemic consists of identifying the whereabouts and risks of zoonoses and predicting the spread of an existing outbreak. In doing so, organizations such as the One Health Institute at the School of Veterinary Medicine at the University of California, Davis are trying to gain a better understanding of the correlation between spillovers and animal-human interactions. Spillovers happen when pathogens carried by an animal (simply said, fluids containing infectious cells) reach a potential host (a human). Such a spillover can lead to a human-to-human transferable disease (such as COVID-19), but it does not have to, as is the case with rabies. The One Health Institute's PREDICT Project has identified 1,200 viruses belonging to families known to have the potential to infect people and cause epidemics, along with 40 risk factors for those viruses to spill over and spread between humans. Identifying these viruses and risk factors required about 170,000 samples from animals and people in about 30 countries. As PREDICT estimates that about 1.67 million viruses are yet to be discovered, challenges arise in both computing and sample collection.

Another important factor in preventing (or managing) an epidemic is predicting the spread of an outbreak. Such predictions have to deal with numerous factors, including human behavior, social conditions, and the environment. Their outcomes give insight into, for example, the expected number of cases and deaths, and policy makers base (preventive) measures on these numbers. However, combining this data and running different scenarios across multiple contexts to measure the economic and non-economic effects of those measures remains a challenge. Running predictions such as these is similar to the Netflix problem.

Predicting the evolution of the coronavirus is a machine learning application. Like predicting the evolution of the stock market or recommending the next movie on Netflix, it looks for meaningful patterns in the data. In stock market prediction, analysis of the covariance matrix is used to suggest new asset trades or to hedge portfolio items. In recommendation systems such as Netflix's, user behavior is categorized into different user groups by searching for meaningful features. Chances are that if person A has watched The Notebook, a recommendation of another romantic movie will be successful. In the case of the coronavirus, we face the same problem: with limited information, we are trying to find the features that best describe the spread and transmission of the virus. Using these features, we want to predict the number of fatalities, the number of new infections, and the locations of outbreak zones.

Such recommendation systems typically rely on principal component analysis (PCA). PCA is a machine learning technique for determining the features in a data set that have the most predictive value. With PCA, you can summarize the data in a small set of descriptive features, the largest principal components (in the Netflix example, this could be the genre of romantic dramas), and then use those for prediction.
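To make this concrete, here is a minimal sketch of PCA on a synthetic user-movie ratings matrix; the data and the choice of five components are illustrative assumptions, not Netflix's actual setup:

```python
# A minimal PCA sketch on a synthetic ratings matrix (illustrative data only).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(1000, 50)).astype(float)  # 1,000 users x 50 movies

pca = PCA(n_components=5)               # keep only the 5 largest principal components
user_features = pca.fit_transform(ratings)

print(pca.explained_variance_ratio_)    # share of variance captured per component
print(user_features.shape)              # (1000, 5): each user summarized by 5 features
```

Each of the five components plays the role of a "describing feature" such as a genre preference, and downstream predictions work with these summaries instead of the raw ratings.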

Calculating the principal components is a mathematical exercise that relies on either singular value decomposition (SVD) or eigenvalue decomposition. Computing the SVD has a time complexity of O(n^3), where n is the size of the covariance matrix. For small datasets this is no problem, but for larger datasets it becomes intractable. Take, for example, the Netflix problem, where the datasets include tens of thousands of movies and millions of users, resulting in hundreds of millions of data points. Calculating all principal components exactly in this case becomes impossible.
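The cubic scaling can be seen in a toy benchmark that times a full SVD as the matrix size doubles. Exact timings depend on your hardware and BLAS library, but the roughly eightfold (2^3) growth per doubling is the point:

```python
# Timing a full SVD for growing n: O(n^3) work means each doubling of n
# multiplies the runtime by roughly 8.
import time
import numpy as np

rng = np.random.default_rng(1)
for n in (500, 1000, 2000):
    a = rng.standard_normal((n, n))
    t0 = time.perf_counter()
    np.linalg.svd(a)                    # full singular value decomposition
    print(f"n={n}: {time.perf_counter() - t0:.2f} s")
```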

Instead, an approximation is made: rather than determining all principal components, only the few most prominent ones are computed. Using the most prominent principal components, an elementary prediction can be made that reflects the preferences of broad user groups. However, for specialized predictions, more principal components must be included. For example, a prediction of a movie that is not only in your favorite genre but also stars your favorite actor might be more successful.
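A common classical workaround is a truncated, randomized decomposition that computes only the top k components. The sketch below, with illustrative sizes and component counts, contrasts a coarse model for broad user groups with a finer, more expensive one:

```python
# Truncated, randomized SVD: compute only the k most prominent components
# instead of the full decomposition (sizes here are illustrative).
import numpy as np
from sklearn.decomposition import TruncatedSVD

rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(10_000, 2_000)).astype(np.float32)

# Coarse model: 10 components capture the preferences of broad user groups.
coarse = TruncatedSVD(n_components=10, algorithm="randomized").fit_transform(ratings)

# Specialized predictions need more components, at correspondingly higher cost.
fine = TruncatedSVD(n_components=100, algorithm="randomized").fit_transform(ratings)

print(coarse.shape, fine.shape)         # (10000, 10) (10000, 100)
```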

Quantum computers may be able to determine the principal components much more efficiently. Quantum principal component analysis (qPCA) exploits the inherent structure of a quantum state in a process called self-tomography to reveal information about the eigenvectors corresponding to large eigenvalues. This process can be used to encode information about a recommendation model in a quantum state, and the principal components can then be found as the eigenvectors with the largest eigenvalues.
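qPCA itself requires quantum hardware, but the correspondence it relies on can be illustrated classically: a normalized covariance matrix has exactly the form of a density matrix (positive semi-definite with unit trace), and the principal components are its eigenvectors with the largest eigenvalues. A minimal sketch, with synthetic data:

```python
# Classical illustration of the mapping qPCA exploits: a normalized
# covariance matrix is a valid density matrix, and its dominant
# eigenvectors are the principal components.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((500, 8))       # 500 samples, 8 features
x -= x.mean(axis=0)                     # center the data

cov = x.T @ x
rho = cov / np.trace(cov)               # unit trace: a valid "density matrix"

eigvals, eigvecs = np.linalg.eigh(rho)  # eigh, since rho is symmetric
order = np.argsort(eigvals)[::-1]
top_two = eigvecs[:, order[:2]]         # eigenvectors with the 2 largest eigenvalues
print(eigvals[order])                   # sorted "principal component" weights
```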

Often, the bottleneck for quantum algorithms is the transformation of classical data into quantum data. However, a qPCA-based algorithm does not require access to the full dataset. Instead, it efficiently sub-samples the dataset, performing function calls only when needed. In a Grover-type operation, the dataset is sampled without the need for quantum RAM or an expensive transformation of classical data into quantum data.

Altogether, qPCA may provide an exponential speed-up over classical algorithms. Using qPCA, the internal structure of the data is revealed in a small number of holistic variables with high predictive value. In Markovian systems, where the most recent events have the most predictive value, the principal components need to be recalculated regularly; in these systems, qPCA may have a significant impact. In the case of the stock market, this means risk may be reduced by identifying relevant correlations. For Netflix, it means better suggestions for new series or movies.
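As a sketch of what regular recalculation means in a Markovian setting, the hypothetical pipeline below re-estimates the principal components over a sliding window of the most recent observations; the window size and data are illustrative stand-ins:

```python
# Sliding-window PCA: in a Markovian setting only recent data matters,
# so the principal components are re-estimated for every new window.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
stream = rng.standard_normal((5000, 20))   # stand-in for daily epidemic indicators
window = 250                               # only the latest 250 observations matter

for t in range(window, stream.shape[0] + 1, 250):
    recent = stream[t - window:t]          # sliding window of the newest data
    pca = PCA(n_components=3).fit(recent)  # redo PCA on this window only
    # pca.components_ would feed the downstream prediction model here
```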

The evolution of the coronavirus is a typical Markovian process. In the past few months, the news seemed to change the perspective on a daily basis. At the same time, accurate predictions of the number of fatalities or of required ICU beds became ever more important. For these predictions, where regular and precise calculation of the principal components is required, qPCA may have a significant impact.

Conclusion

Looking at the impact that COVID-19 is having on society, the economy, and healthcare, we can envision future use cases for quantum computing in vaccine development, in optimization solutions, and in identifying and managing the spread of viruses.

As the COVID-19 crisis drags on, its societal and financial effects are accumulating. For example, US GDP plummeted at an annualized rate of about 30 percent in the second quarter of 2020, and COVID-19 contributed to pushing an additional 12 million people below the extreme poverty threshold in 2020. This leads us to presume that investing in any solution that could shorten the next pandemic is worthwhile.

The challenge with aligning and allocating investments lies in the still largely unknown roadmap to clear uses of quantum computing. Many use cases are still to be defined, and quantum's full potential is only expected in ten years' time. Nevertheless, there is value to be realized with quantum computing in a shorter period, as NISQ (noisy intermediate-scale quantum) computers might already speed up some computations. Furthermore, both the number and stability of qubits and the efficiency of quantum algorithms are improving at a rapid rate. It is therefore to be expected that clear business cases will be presented within three years. Because quantum computers are a natural fit for quantum chemistry, we may expect a quantum advantage to be realized in this domain first.

Even though quantum computing's added value is still a couple of years down the line, we should prepare for it now. Without dismissing the great technological developments made before and during the current pandemic, applying quantum computing will likely require different skillsets than classical computing. This stems from the fact that the very foundation of quantum computing is different, so the layers built on that foundation, such as programming languages, will be different too. Implementing middleware, infrastructure, and development tools will be complex and time-intensive, and the necessary skills will be hard to find. Companies would be smart to support early quantum enthusiasts and encourage them to create awareness and explore quantum use cases. In the long run, we will need a variety of profiles, including quantum algorithm experts, developers, testers, hardware engineers, and business developers. If quantum computers are to become mainstream in ten years, students should enroll today.

Download TechnoVision and get a handy guide to pivot in these challenging times.

This article is the concluding part of a series of three, co-authored by Renate Wolters and Julian van Velzen.