Quantum computing promises to solve a range of problems that today's conventional computers cannot. The technology needed to fully realize that potential has yet to be built, but research labs and leading technology companies continue to push the field forward and are producing remarkable results. So what should companies and institutions be doing now to prepare for the quantum computing era?
Despite the relentless advance of Moore’s law and complementary advances in software over the last half-century, there are still many problems that today’s computers can’t solve. Some problems simply await the next generation of semiconductors in order to become commercially important. But others will likely remain beyond the reach of classical computers forever. It is the prospect of finally solving these “classically intractable” problems which motivates potential providers and users at the dawn of the quantum computing era.
Their enthusiasm is not misplaced. But, despite its enormous potential, assessing the timing and real-world impact of quantum computing has proven more difficult than for any of the other transformational technologies highlighted in our book, Ride the Wave. Fortunately, the fog is beginning to clear as breakthroughs emerge from research labs around the world.
In the coming decades, Boston Consulting Group (or BCG) expects end-users of quantum computing to realize gains in the form of both cost savings and revenue opportunities of up to $850 billion annually. These gains will accrue first to firms in industries with complex simulation and optimization requirements. The way forward is likely to involve “a slow build” over the next few years, reaching a relatively modest $2 billion to $5 billion a year by 2024. But then value creation will accelerate rapidly as the technology and its commercial applications mature.
The best estimates agree that the first payoffs will begin four or five years in the future and that “the big quantum jackpot” probably lies more than ten years out. That raises the question, “If quantum computing’s transformative value is at least five to ten years away, why are enterprises considering investments now?”
The simple answer is that this is a radical technology which presents formidable ramp-up challenges, even for companies which already possess advanced supercomputing capabilities. Both quantum programming and the underlying “quantum tech stack” bear little resemblance to their classical counterparts, although the two technologies are likely to work quite closely together. Therefore, early adopters stand to gain expertise, visibility into technological gaps and key intellectual property that will put them at a structural advantage when quantum computing gains commercial traction.
More importantly, experts believe that progress toward maturity in quantum computing will not follow a smooth continuous curve. Instead, quantum computing is likely to experience breakthroughs sporadically. Companies that have invested to integrate quantum computing into the workflow are far more likely to be in a position to capitalize on their experience and the leads they open up will be difficult for others to close.
This will confer a substantial advantage in industries in which classically intractable computational problems lead to bottlenecks and missed revenue opportunities.
The assessment of future business value begins with the question of what kinds of problems quantum computers can solve more efficiently than binary machines. There is no simple answer, but two indicators are the size and complexity of the calculations that need to be done.
Consider drug discovery, for example. For scientists trying to design a compound that will attach itself to and modify a target disease pathway, the critical first step is to determine the electronic structure of the molecule. But modeling the structure of a molecule of an everyday drug such as penicillin, which has 41 atoms at ground state, would require a classical computer with some 10^86 bits; that involves more transistors than there are atoms in the observable universe. Such a machine is a physical impossibility. For quantum computers, however, this type of simulation is well within the realm of possibility, requiring a processor with just 286 quantum bits, or qubits. The best opportunities for maximizing the impact of quantum computers seem to lie in:
- Materials Design,
- Drug Discovery,
- Financial Services,
- Computational Fluid Dynamics,
- Transportation and Logistics,
- Energy, and
- Meteorology.
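The penicillin figures above are consistent with a quick back-of-the-envelope check: an n-qubit register encodes 2^n complex amplitudes, so a 286-qubit processor spans roughly as many values as the ~10^86 bits quoted for the classical simulation. A few lines of Python make this concrete:

```python
import math

n_qubits = 286
# An n-qubit register encodes 2**n complex amplitudes; take log10 to
# compare with the ~10^86 bits quoted for the classical simulation
decimal_digits = n_qubits * math.log10(2)
print(f"2^{n_qubits} is about 10^{decimal_digits:.0f}")  # about 10^86
```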
Given this trend, we offer the following forecast for your consideration.
First, the full potential of quantum computing will be realized in three phases over roughly three decades.
The period through 2025 will be characterized by so-called Noisy Intermediate-Scale Quantum (or NISQ) devices: machines built before error-corrected quantum computers become commercially available, operating tens to hundreds of noisy qubits. Despite relatively high error rates, these devices will become increasingly capable of performing useful, discrete functions. They will most likely be used to exploit quantum heuristics, somewhat analogous to conventional neural networks.
Companies including IBM, Google, and Rigetti are anticipating technological breakthroughs in error mitigation techniques which will enable NISQ devices to produce the first quantum-advantaged experimental discoveries in simulation and combinatorial optimization.
Next, sometime between 2025 and 2035, quantum computers are expected to achieve so-called Broad Quantum Advantage, which will yield superior performance in a wide range of tasks that have real industrial significance. Quantum advantage denotes a measurable edge over classical machines on a given task; it is a more modest claim than “quantum supremacy,” which implies decisively outperforming classical computers.
This phase will deliver a genuine quantum-leap over the speed, cost and quality possible with conventional binary machines. This era will require overcoming significant technical hurdles in “error correction” and other areas, as well as continuing increases in the power and reliability of quantum processors. In preparation for this phase, companies such as Zapata Computing are betting that quantum-advantaged molecular simulation will drive not only significant cost savings but the development of better products that reach the market sooner.
Ultimately, a third phase called Full-Scale Fault Tolerance is expected to arrive during the 2040s. A fault-tolerant quantum computer keeps computing reliably even as individual qubits fail, using error correction to detect and repair faults on the fly. Achieving full-scale fault tolerance will require makers of quantum technology to overcome additional technical constraints, including problems related to scale and stability. But once they arrive, fault-tolerant quantum computers will transform a broad array of industries. They have the potential to vastly reduce trial and error and improve automation in the specialty-chemicals market, enable tail-event defensive trading and risk-driven high-frequency trading strategies in finance, and turbocharge the productivity of in silico drug discovery (running candidate-drug experiments computationally rather than in living subjects), which has major implications for personalized medicine.
Second, across multiple industries, quantum computing will increase incremental operating income by up to $850 billion a year by 2050.
Assuming a P/E ratio of 15, quantum computing could contribute nearly $13 trillion to stock market valuations by 2050. According to the Boston Consulting Group, this payoff will be almost evenly split between incremental annual revenue streams and recurring cost efficiencies.
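The market-value figure follows directly from the income estimate: multiplying BCG's $850 billion upper bound on annual operating income by the assumed price-to-earnings multiple of 15 gives the "nearly $13 trillion" quoted above.

```python
annual_operating_income = 850e9  # BCG's upper estimate, USD per year
pe_ratio = 15                    # assumed price-to-earnings multiple
market_value = annual_operating_income * pe_ratio
print(f"${market_value / 1e12:.2f} trillion")  # $12.75 trillion
```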
Third, the biggest winners will be those who identify opportunities as early as possible and prepare to exploit them as soon as the technology arrives.
That raises the question, “What can companies do today to get ready?” According to BCG, a good first step is performing a diagnostic assessment to determine the potential impact of quantum computing on your industry and then, if appropriate, developing a partnership strategy for capturing the value that can be created. The first part of the diagnostic involves a self-assessment of your company’s opportunities and challenges and its use of computing resources, ideally involving people from R&D and other functions. The key questions to ask include at least these four:
1. Are we currently spending a lot of money or other resources to tackle problems using high-performance computers? If so, are these efforts yielding low impact, delayed, or piecemeal results that appear to “leave value on the table?”
2. Does the presumed difficulty of solving simulation or optimization problems prevent us from trying high-performance computing or other computational solutions?
3. Are we spending resources on inefficient trial-and-error alternatives, such as wet-lab experiments or physical prototyping? And,
4. Are any of the major problems we need to solve rooted in quantum-advantaged problem archetypes including combinatorial optimization, differential equations, linear algebra, or factorization?
If the answer to any of these questions is yes, the next step is to perform a “quantum value diagnostic.” This starts by assessing where quantum computing could have an early or outsized impact on discrete “pain points” in particular industries. Behind each pain point, you’ll find a bottleneck for which there may be multiple solutions or a latent pool of income that can be tapped in many ways.
Therefore, mapping opportunities must include solutions rooted in other technologies - such as machine learning - that may arrive on the scene sooner, at lower cost, or may be integrated more easily into existing workflows. Establishing a valuation over time for quantum computing in a given industry or for a given firm will require gathering and synthesizing expertise from a number of sources. These sources should include:
- Industry business leaders who can attest to the business value of addressing a given pain point;
- Industry technical experts who can assess the limits of current and future nonquantum solutions to each pain point; and
- Quantum computing experts who can confirm that quantum computers will be able to solve the problem and when.
Using this methodology, BCG has sized up the impact of quantum advantage for a number of sectors, with an emphasis on early opportunities.
Consider the results.
Materials Design and Drug Discovery
On the face of things, no two fields of R&D more naturally lend themselves to quantum advantage than materials design and drug discovery. Even if some experts dispute whether quantum computers will have an advantage in modeling the properties of quantum systems, there is no question that the shortcomings of classical computers limit R&D in these areas. Materials design, in particular, is a slow lab-based process characterized by trial and error.
According to R&D Magazine, for specialty materials alone, global firms spend upwards of $40 billion a year on candidate material selection, material synthesis, and performance testing. Improvements to this workflow will yield not only cost savings through efficiencies in design and reduced time to market but also revenue uplift through net new materials and enhancements to existing materials. The benefits of design improvements yielding optimal synthetic routes would, in all likelihood, flow downstream, affecting the estimated $460 billion spent annually on industrial synthesis.
The biggest benefit quantum computing offers is the potential for simulation, which for many materials requires computing power that binary machines do not possess. Reducing trial-and-error lab processes and accelerating the discovery of new materials is only possible if materials scientists can derive higher-level spectral, thermodynamic, and other properties from ground-state energy levels described by the Schrödinger equation.
The problem is that none of today’s approximate methods - from Hartree-Fock to density functional theory - can account for the quantized nature of the electromagnetic field. Current computational approximations apply only to a subset of materials in which interactions between electrons can effectively be ignored or easily approximated, and there remains a well-defined set of problems that need simulation-based solutions; outsized rewards can be expected to accrue to the companies that manage to solve them first. These problems include simulations of:
- strongly correlated electron systems (for high-temperature superconductors),
- manganites with colossal magnetoresistance (needed for high-efficiency data storage and transfer),
- multiferroics (for high-absorbency solar panels), and
- high-density electrochemical systems (for lithium-air batteries).
All of the major players in quantum computing, including IBM, Google, and Microsoft, have recently established partnerships or offerings in materials science and chemistry. Google’s partnership with Volkswagen, for example, is aimed at simulations related to high-performance batteries as well as other materials. Microsoft released a new chemical simulation library developed in collaboration with Pacific Northwest National Laboratory. And IBM, having run the largest-ever molecular simulation on a quantum computer in 2017, released an end-to-end stack for quantum chemistry in 2018.
Potential end-users of the technology are embracing these efforts. One researcher at a leading global materials manufacturer believes that quantum computing “will be able to make a quality improvement on classical simulations in less than five years,” during which period value to end-users approaching some $500 million is expected to come in the form of design efficiencies (measured in terms of reduced expenditures across the R&D workflow).
As error correction enables functional simulations of more complex materials, “you’ll start to unlock new materials and it won’t just be about efficiency anymore,” a professor of chemistry reported. During the period of broad quantum advantage, BCG estimates that upwards of $5 billion to $15 billion a year in value (which they measure in terms of increased R&D productivity) will accrue to end-users, principally through the development of new and enhanced materials.
Once full-scale fault-tolerant quantum computers become available, the value could reach the range of $30 billion to $60 billion a year, principally through new materials and extensions of in-market patent life as time-to-market is reduced. As the head of business development at a major materials manufacturer stated, “If unknown chemical relationships are unlocked, the current specialty chemical market [currently $51 billion in operating income annually] could double.”
Quantum advantage in drug discovery will be later to arrive given the maturity of existing simulation methods for “established” small molecules. Nonetheless, in the long run, as quantum computers unlock simulation capabilities for molecules of increasing size and complexity, experts believe that drug discovery will be among the most valuable of all industrial applications.
In terms of cost savings, the drug discovery workflow is expected to become more efficient, with in silico modeling increasingly replacing expensive in vitro and in vivo screening. But there is good reason to believe that there will be major top-line implications as well. Experts expect more powerful simulations not only to promote the discovery of new drugs but also to generate replacement value over today’s generics as larger molecules produce drugs with fewer side-effects.
Between reducing the $35 billion in annual R&D spending on drug discovery and boosting the $920 billion in yearly branded pharmaceutical revenues, quantum computing is expected to yield $35 billion to $75 billion in annual operating income once companies have access to fault-tolerant machines.
Financial Services
In recent history, few if any industries have been faster to adopt vanguard technologies than financial services. There is good reason to believe that the industry will quickly ramp up investments in quantum computing, which can be expected to address a clearly defined set of simulation and optimization problems - in particular, portfolio optimization in the short term and risk analytics in the long term. Investment money has already started to flow to startups, with Goldman Sachs and Fidelity investing in full-stack companies such as D-Wave, while RBS and Citigroup have invested in software players such as 1QBit and QC Ware.
Discussions with quantitative investors about the pain points in portfolio optimization, arbitrage strategy, and trading costs make it easy to understand why. While investors use classical computers for all these problems today, the capabilities of these machines are limited - not so much by the number of assets or the number of constraints introduced into the model as by the types of constraints.
For example, adding noncontinuous, nonconvex functions such as interest rate yield curves, trading lots, buyin thresholds, and transaction costs to investment models makes the optimization “surface” so complex that classical optimizers often crash, simply take too long to compute, or, worse yet, mistake a local optimum for the global optimum. To get around this problem, analysts often simplify or exclude such constraints, sacrificing the fidelity of the calculation for reliability and speed. Such trade-offs, many experts believe, would be unnecessary with quantum combinatorial optimization.
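The failure mode described here, a classical optimizer settling into a local optimum, is easy to reproduce. The sketch below is plain Python with a made-up one-dimensional "double well" objective standing in for a realistic portfolio surface; it shows how the answer a gradient-based optimizer returns depends entirely on where it starts.

```python
def objective(x):
    # Nonconvex "double well": two minima, one global (near -1.30),
    # one merely local (near +1.13)
    return x**4 - 3 * x**2 + x

def gradient_descent(x, lr=0.01, steps=2000):
    # Follow the negative gradient of the objective from starting point x
    for _ in range(steps):
        x -= lr * (4 * x**3 - 6 * x + 1)  # derivative of the objective
    return x

trapped = gradient_descent(1.5)    # settles into the local minimum
optimum = gradient_descent(-1.5)   # settles into the global minimum
assert objective(optimum) < objective(trapped)
```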
Exploiting the probability amplitudes of quantum states is expected to dramatically accelerate portfolio optimization, enabling a full complement of realistic constraints and reducing portfolio turnover and transaction costs; according to the head of portfolio risk at one major U.S. bank, this equals as much as 2% to 3% of assets under management.
BCG calculates that income gains from quantum portfolio optimization should reach $200 million to $500 million in the next three to five years and accelerate swiftly with the advent of enhanced error correction during the period of broad quantum advantage. The resulting improvements in risk analytics and forecasting should drive value creation beyond $5 billion a year.
As the brute-force Monte Carlo simulations used for risk assessment today give way to more powerful “quantum walk algorithms,” faster simulations will give banks more time to react to negative market risk (resulting in returns of as much as 12 basis points). The expected benefits include better intraday risk analytics for banks and near-real time risk assessment for quantitative hedge funds.
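For context on what those brute-force simulations look like, here is a deliberately minimal classical Monte Carlo sketch: it estimates one-day 95% value-at-risk for a single position under a normal-returns assumption. Every parameter is hypothetical; a production risk engine simulates whole correlated portfolios across many scenarios, which is where the month-long run times come from.

```python
import random

random.seed(7)  # reproducible draws

# Hypothetical single position: $1M exposure, ~2% daily return volatility
position, mu, sigma = 1_000_000, 0.0005, 0.02

# Simulate 100,000 one-day P&L outcomes and sort from worst to best
pnl = sorted(position * random.gauss(mu, sigma) for _ in range(100_000))

# 95% VaR: the loss exceeded on only 5% of simulated days
var_95 = -pnl[int(0.05 * len(pnl))]
print(f"1-day 95% VaR is about ${var_95:,.0f}")
```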
One former quantitative analyst at a leading U.S. hedge fund complained, “Brute-force Monte Carlo simulations for economic spikes and disasters take a whole month to run.” Bankers and hedge fund managers hope that, with the kind of “whole-market simulations” theoretically possible on full-scale fault-tolerant quantum computers, they will be able to better predict black-swan events and even develop risk-driven high-frequency trading.
“Moving risk management from positioning defensively to an offensive trading strategy is a whole new paradigm,” noted a former trader at a U.S. hedge fund. Coupled with enhanced model accuracy and positioning against extreme tail events, reductions in capital reserves (by as much as 15% in some estimates) will position quantum computing to deliver $40 billion to $70 billion in annual operating income to banks and other financial services companies as the technology matures.
Computational Fluid Dynamics
Computational fluid dynamics (or CFD), which involves simulating the precise flow of liquids and gases under changing conditions on a computer, is a costly but critical process for many companies in a range of industries. Spending on simulation software by companies using CFD to design airplanes, spacecraft, cars, medical devices, and wind turbines exceeded $4 billion in 2017, but the costs that weigh most heavily on decision-makers in these industries are those related to expensive trial-and-error testing such as wind tunnel and wing flex tests.
These direct costs, together with the revenue potential of energy-optimized design, have many experts excited by the prospect of introducing quantum simulation into the workflow. The governing equations behind CFD, known as the Navier-Stokes equations, are nonlinear partial differential equations and thus a natural fit for quantum computing.
The first bottleneck in the CFD workflow is an optimization problem in the preprocessing stage that precedes any fluid dynamics algorithm. Because of the computational complexity involved in these algorithms, designers create a mesh to simulate the surface of an airplane wing or other object. The mesh is composed of geometric primitives whose vertices form a constellation of nodes. Most classical optimizers limit the number of nodes in a mesh that can be simulated efficiently to about one billion.
This forces the designer into a tradeoff between how fine-grained and how large a simulated surface can be. Quantum optimization is expected to relieve the designer of that constraint so that bigger pieces of the puzzle can be solved, more accurately. Improving this preprocessing stage of the design process is expected to lead to operating-income gains of between $1 billion and $2 billion across industries through reduced costs and faster revenue realization.
As quantum computers mature, the benefits of improved mesh optimization are expected to be surpassed by those from accelerated and improved simulations. As with mesh optimization, the tradeoff in fluid simulations is between speed and accuracy. For large simulations with more than 100 million cells, today’s run times can be weeks long, even using very powerful supercomputers.
And that is with the use of simplifying heuristics, such as “approximate turbulence models.” During the period of broad quantum advantage, experts believe that quantum simulation could enable designers to reduce the number of heuristics required to run Navier-Stokes solvers in manageable time periods, resulting in the replacement of expensive physical testing with accurate moving-ground aerodynamic models, unsteady aerodynamics, and turbulent-flow simulations.
The benefits to end-users in terms of cost reductions are expected to start at $1 billion to $2 billion a year during this period. With full-scale fault tolerance, BCG says value creation could as much as triple; at that point, experts anticipate that quantum linear solvers will unlock predictive simulations that not only obviate physical testing requirements but also lead to product improvements (such as improved fuel economy) and manufacturing yield optimization. CFD value creation in the phase of full-scale fault tolerance is expected to range from $19 billion to $37 billion a year in added operating income.
Other Industries
During the NISQ era, more than 40% of the value created in quantum computing is expected to come from materials design, drug discovery, financial services, and applications related to CFD. But applications in other industries will show early promise as well. Examples include:
Transportation and Logistics. Using quantum computers to address inveterate optimization challenges (such as the traveling salesman problem and the minimum spanning tree problem) is expected to lead to efficiencies in route optimization, fleet management, network scheduling, and supply chain optimization.
Energy. With the era of easy-to-find oil and gas coming to an end, companies are increasingly reliant on wave-based geophysical processing to locate new drilling sites. Quantum computing could not only accelerate the discovery process but also contribute to drilling optimizations for both greenfield and brownfield operations. And,
Meteorology. Many experts believe that quantum simulation will improve large-scale weather and climate forecasting technologies, which would not only enable earlier storm and severe weather warnings but also bring speed and accuracy gains to industries that depend on weather-sensitive pricing and trading strategies.
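The combinatorial character of the routing problems mentioned under Transportation and Logistics is easy to see in code. The brute-force solver below (with invented stop coordinates, purely for illustration) is exact for a handful of stops, but the search space grows factorially, which is why classical heuristics, and eventually quantum optimization, matter at fleet scale.

```python
from itertools import permutations
from math import dist, factorial

# Five hypothetical delivery stops as (x, y) coordinates
stops = [(0, 0), (2, 1), (5, 0), (3, 4), (1, 3)]

def tour_length(order):
    # Length of the closed tour visiting the stops in the given order
    return sum(dist(stops[a], stops[b])
               for a, b in zip(order, order[1:] + order[:1]))

# Exhaustive search over all orderings: exact, but O(n!) in the stop count
best = min(permutations(range(len(stops))), key=tour_length)

print(f"shortest tour: {best}, length {tour_length(best):.2f}")
print(f"candidate tours for 50 stops: {factorial(50):.3e}")
```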
And if quantum computing becomes integrated into machine learning workflows, the list of affected industries would expand dramatically, with salient applications wherever
- predictive capabilities (such as supervised learning and deep learning),
- principal component analysis (for dimension reduction), and
- clustering analysis (for anomaly detection)
provide an advantage. While experts are divided on the timing of quantum computing’s impact on machine learning, the stakes are so high that many of the leading players are already putting significant resources into it today, with promising early results.
For example, in conjunction with researchers from Oxford and MIT, a group from IBM recently proposed a set of methods for optimizing and accelerating support vector machines, which are applicable to a wide range of classification problems but have fallen out of favor in recent years because they quickly become inefficient as the number of predictor variables rises and the feature space expands. The eventual role of quantum computing in machine learning is still being defined, but early theoretical work, at least for optimizing current methods in linear algebra and support vector machines, shows promise.
Resource List:
1. Boston Consulting Group. May 13, 2019. Matt Langione, Corban Tillemann-Dick, Amit Kumar, and Vikas Taneja. Where Will Quantum Computers Create Value - and When?
2. Boston Consulting Group. October 6, 2020. Jean-François Bobier, Jean-Michel Binefa, Matt Langione, and Amit Kumar. It's Time for Financial Institutions to Place Their Quantum Bets.