Monte Carlo: is this quantum computing’s killer app?

Camille de Valk
16 Dec 2022

As the quantum computing revolution unfolds, companies, start-ups, and academia are racing to find the killer use case.

Among the strongest contenders are quantum computing Monte Carlo (QCMC) simulations. Over the past few years, the pace of development has certainly accelerated, and we have seen breakthroughs, both in hardware and software, that bring a quantum advantage for finance ever closer.

  • Roadmaps for hardware development have been defined and indicate that a quantum advantage is an estimated 2–5 years away. See, for example, IBM and IonQ, who both mention 2025 as the year we can expect the first quantum advantage.
  • End-to-end hardware requirements have been estimated for complex derivatives pricing at a T-depth of 50 million and 8k qubits. Although this is beyond the reach of current devices, simple derivatives might be feasible with a gate depth of around 1k for one sample. These numbers indicate that initial applications could be around the corner and put a full-blown advantage on the roadmap. Do note, however, that these simple derivatives can also be efficiently priced by a classical computer.
  • Advances in algorithmic development continue to reduce the required gate depth and number of qubits. Examples are variational data loaders and iterative amplitude estimation (IAE), a simplified variant of amplitude estimation. For the “simple derivatives,” the IAE algorithm can run with around 10k gates, as opposed to 100k gates for 100 samples with full amplitude estimation.
  • There is an increasing focus on data orchestration, pipelines, and pre-processing, readying organizations for adoption. Also, financial institutions worldwide are setting up teams that work on QCMC implementation.

All these developments beg the question: what is the actual potential of quantum computing Monte Carlo? And should the financial services sector be looking into it sooner rather than later?

Monte Carlo simulations are used extensively in the financial services sector to simulate the behavior of stochastic processes. For some problems, analytical models (such as the Black-Scholes equation) are available that allow the solution to be calculated at any moment in time. For many other problems, no such analytical model exists. Instead, the behavior of financial products can be simulated by starting with a portfolio and then simulating the market behavior.

Here are two important examples:

  • Derivatives pricing: Derivatives – financial products that are derived from underlying assets – include options, futures contracts, and swaps. The underlying assets are modelled as stochastic variables that behave according to some distribution function. To price derivatives, the behavior of the underlying assets has to be modelled.
  • Risk management: To evaluate the risk of a portfolio, for example of interest rates or loans, simulations are performed that model the behavior of the assets in order to discover the losses on the complete portfolio. Stress tests can be implemented to evaluate the performance of the portfolio under specified scenarios, or reverse stress tests can be carried out to discover scenarios that lead to a catastrophic portfolio performance.
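As a concrete (and deliberately simplified) illustration of the derivatives-pricing case, here is a plain-Python Monte Carlo pricer for a European call option under geometric Brownian motion. All parameters and the payoff are illustrative choices, not taken from any of the estimates above:

```python
import math
import random

# Toy Monte Carlo pricer for a European call under geometric Brownian
# motion. All parameters below are illustrative, not from the article.
S0, K, r, sigma, T = 100.0, 105.0, 0.02, 0.2, 1.0

def mc_call_price(n_samples, seed=42):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        z = rng.gauss(0.0, 1.0)
        # One-step simulation of the terminal asset price (exact for GBM).
        s_t = S0 * math.exp((r - 0.5 * sigma ** 2) * T + sigma * math.sqrt(T) * z)
        total += max(s_t - K, 0.0)               # call payoff
    return math.exp(-r * T) * total / n_samples  # discounted average

print(f"estimated price: {mc_call_price(100_000):.3f}")
```

With 100,000 samples the estimate lands close to the Black-Scholes value for these parameters (roughly 6.7); the catch, as discussed below, is how quickly the required sample count grows with the desired accuracy.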

Classical Monte Carlo simulations require on the order of (1/ε)^2 samples, where ‘ε’ is the desired accuracy (the width of the confidence interval). For large cases, this easily becomes prohibitive: at an accuracy of 10^(-5), on the order of ten billion samples are required. Even if workloads are parallelized on large clusters, this might not be feasible within an acceptable runtime or at an acceptable cost. Take, for example, the start of the Covid-19 crisis. Some risk models looking at the impact of Covid on worldwide economies would almost certainly have taken months to build and run, and it is likely that before completion the stock market would have dropped 20%, making the modelling irrelevant.
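The 1/√N convergence behind that sample count is easy to demonstrate with a toy integrand (a generic sketch, not one of the risk models discussed here):

```python
import math
import random

# Empirical check of Monte Carlo's O(1/sqrt(N)) convergence: estimate
# E[Z^2] = 1 for a standard normal Z and watch the error shrink as the
# sample count grows (a toy integrand, not a market model).
def mc_error(n, seed=0):
    rng = random.Random(seed)
    estimate = sum(rng.gauss(0, 1) ** 2 for _ in range(n)) / n
    return abs(estimate - 1.0)

# Quadrupling the work only halves the error; reaching eps = 1e-5 this
# way would take on the order of (1/1e-5)^2 = ten billion samples.
for n in (1_000, 16_000, 256_000):
    print(f"{n:>7} samples: error ~ {mc_error(n):.4f}")
```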

Quantum computing Monte Carlo promises, in theory, a quadratic speedup over classical systems: instead of the (1/ε)^2 samples needed classically, on the order of (1/ε) iterations on a quantum computer would attain the same accuracy. This means that large risk models that take months to complete may become feasible within just hours.
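The gap between the two scalings is easy to put into numbers (constant factors, circuit depth, and the data-loading overhead discussed below are all ignored here):

```python
# Iterations needed to reach target accuracy eps: classical Monte Carlo
# scales as (1/eps)^2, quantum amplitude estimation as 1/eps.
# Constant factors and per-iteration cost are deliberately ignored.
for eps in (1e-3, 1e-5):
    classical = round((1 / eps) ** 2)
    quantum = round(1 / eps)
    print(f"eps = {eps:g}: ~{classical:,} classical vs ~{quantum:,} quantum")
```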

Unfortunately, it’s never as easy as it seems! Although sampling on quantum computers is quadratically faster, a large overhead could completely erase any quantum speedup. In practice, expressing a market model as quantum data seems extremely difficult. There are a few workarounds for this problem, such as the data loaders announced by QCWare, or a variational procedure published by IBM, but it remains to be seen whether these work well on real problems.
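To see why loading data is the sticking point, consider amplitude encoding, a standard way to represent a probability distribution on a quantum computer: a distribution discretized on 2^n grid points becomes a state whose amplitudes are the square roots of the probabilities. The sketch below only builds that amplitude vector classically, for a made-up bell-shaped distribution; actually preparing the corresponding state on hardware is exactly the expensive step that data loaders and variational procedures try to cheapen.

```python
import math

# Amplitude encoding sketch: a discretized distribution p(x) becomes a
# quantum state with amplitudes sqrt(p(x)). Here we only compute the
# amplitude vector classically, for a made-up bell-shaped distribution
# on 2**n_qubits grid points.
n_qubits = 3
grid = range(2 ** n_qubits)
weights = [math.exp(-((x - 3.5) ** 2) / 4) for x in grid]  # toy bell shape
norm = sum(weights)
probs = [w / norm for w in weights]
amplitudes = [math.sqrt(p) for p in probs]

# A valid quantum state is normalized: the squared amplitudes sum to 1.
print(sum(a * a for a in amplitudes))
```

Even this toy state has 2^n amplitudes, and a generic circuit that prepares all of them exactly grows exponentially with n, which is what could eat the quadratic speedup.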

However, if quantum hardware and software continue to develop at their current pace, we can expect some very interesting and valuable uses for quantum Monte Carlo applications. A business case can easily be made: if QCMC improves risk management simulations, then the reserved capital required by compliance regulations could be reduced, freeing up capital that can be used in many other ways.

Furthermore, the derivatives market in Europe alone accounts for a notional €244 trillion. Even a slightly inaccurate evaluation of this market could produce a large offset from its actual value, which in turn could lead to instability and risk. Given the huge potential for derivatives pricing and risk management, the benefit of significant and deterministic speedups, and an industry that is fully geared up to benefit from quantum, QCMC seems to be one of the killer applications.

However, before QCMC works successfully in production, a lot of work remains to be done. As in any application, proper data pipelines need to be implemented first. The time series required for risk management need to be checked for stationarity, sampling frequency, and time period. If policy shifts to daily risk management, data streams also have to be kept up to date. And if a quantum advantage is to be benchmarked, its classical counterpart must be benchmarked too. Additional necessary developments, such as building the required infrastructure (given the hybrid cloud nature of quantum applications), its relation to compliance regulations, and security considerations, are still in their early stages.

Given the huge potential of quantum computing Monte Carlo, a number of pioneering financial services companies have already picked it up; Wells Fargo, Goldman Sachs, JP Morgan Chase, and HSBC are well established in their research into using QCMC or its subroutines. These front runners will certainly not be late to the quantum party, and they will be expecting to see benefits from these exploratory POCs and early implementations, likely in the near future.

Deploying algorithms in productionized workflows is not easy, and it is even more difficult when the technology stack is fundamentally different. But these challenges aside, if the sector as a whole wants to benefit from quantum technology, now is the time to get curious and start assessing this potential killer app.

First published January 2021; updated Nov 2022
Authors: Camille de Valk and Julian van Velzen

Camille de Valk

Quantum Finance Specialist
As a physicist leading research at Capgemini’s Quantum Lab, Camille specializes in applying physics to real-world problems, particularly in the realm of quantum computing. His work focuses on finding applications in optimization with neutral atoms quantum computers, aiming to accelerate the use of near-term quantum computers. Camille’s background in tectonophysics research at a Dutch bank has taught him the value of applying physics in various contexts. He uses metaphors and interactive demonstrations to help non-physicists understand complex scientific concepts. Camille’s ultimate goal is to make quantum computing accessible to the general public.