A Deeper Dive into Near-Term Quantum & Applications in Drug Discovery


Introduction

Development in quantum computing is only speeding up, and since my last dive into the space at the beginning of last year, I’ve met many more researchers, customers, and companies working in the space. This has only deepened my focus on near-term quantum, so in this piece I narrow in on:

  • Hybrid quantum-classical algorithms and their promising applications to drug development
  • The kinds of business models massive, venture-scale companies will use to employ near-term quantum

(via https://www.youtube.com/watch?v=-UlxHPIEVqA)

Near-term Quantum Approaches

To briefly summarize my last quantum piece: today, the majority of quantum computing hardware is at a stage roughly comparable to the vacuum tube or transistor stage of classical computing. It’s as if we’re currently trying to invent the integrated circuit and scale up. For comparison, the integrated circuit largely represented scalability in computing – something quantum is still struggling to achieve.

However, a few different near-term algorithms have emerged from progress in our understanding of quantum physics and quantum computing. I’ll cover: algorithms that rely on more effective simulation of quantum systems on classical computers, and algorithms that use both quantum and classical hardware to play to each platform’s strengths.

Quantum Simulation

Simulating a quantum computer on a classical computer gets exponentially harder as the number of qubits grows. This returns us to the reason quantum computers are helpful in the first place: they can perform certain types of computation more efficiently than classical computers. So the potential of quantum simulation on classical computers is capped.
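To make that scaling concrete, here’s a small sketch (mine, not from any particular simulator) of the memory needed just to store a dense n-qubit state vector, assuming 16 bytes per complex amplitude:

```python
# Memory needed to store the full state vector of an n-qubit system.
# A state vector holds 2**n complex amplitudes; at 16 bytes each
# (complex128), the requirement doubles with every qubit added.

def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Bytes required for a dense n-qubit state vector."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 30, 50):
    print(f"{n} qubits: {state_vector_bytes(n):,} bytes")
# 10 qubits fit in ~16 KiB; 30 qubits already need ~16 GiB;
# 50 qubits would need ~16 PiB, far beyond any single machine.
```

This is why even the best classical simulators top out at a few dozen qubits for exact state-vector simulation, and why the approach is capped.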

At a high level, quantum simulation is valuable because of condensed matter physics. By simulating a quantum system, we can understand properties of liquids and solids (the domain of condensed matter physics), which are generally quantum in nature. Subsets of condensed matter physics, like molecular mechanics, are often employed when modeling in synthetic chemistry.

Algorithms on Classical Systems

There’s a gradient of true “quantumness” across these approaches. Density functional theory (DFT), for example, has been around since the 1970s and employs quantum mechanical modeling to understand the electronic structure of molecules, which leaves it on the lesser end of “quantumness.” DFT is best employed with molecules of middling complexity where high computational accuracy is not necessary.

DARPA also issued a Quantum-Inspired Classical Computing Program in November of 2021, which is an interesting indication of the DoD’s interest in the potential of this technology.

Hybrid algorithms

While quantum simulation or quantum-inspired algorithms on classical hardware employ some novel quantum research, hybrid algorithms take us a step further. These algorithms are cleverly sliced to run partially on classical hardware and partially on quantum hardware. The goal is typically to take advantage of the efficiency of quantum hardware and the scale and stability of classical hardware, then synthesize the results.

Toy model of a hybrid Quantum-classical NN classifier from Guillaume Verdon’s University of Waterloo Lecture 22

The key to understanding the “quantumness” of hybrid quantum-classical algorithms more broadly is to understand what, if anything, is being done by a quantum processing unit. This deeper use of quantum is still rare across most industries, especially as applied to drug development.

Variational quantum eigensolvers (VQEs) are the oldest example of a hybrid quantum-classical algorithm and the most popular type of variational quantum algorithm (VQA). Since their introduction, VQAs have expanded in scope and have been broadly grouped under the name quantum neural networks (QNNs). In a VQE, the preparation of the ansatz (a first guess, or starting point) and the measurement of the expected value of the Hamiltonian are carried out on the quantum hardware. The classical computer then optimizes the parameters for the next ansatz, and the loop repeats. This can be used to find the ground state of a molecule.
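As a rough illustration of that loop, here’s a toy VQE in pure NumPy, with the “quantum” steps (state preparation and measurement) simulated classically; the single-qubit Hamiltonian, ansatz, and learning rate are made up for the example:

```python
import numpy as np

# Hypothetical single-qubit Hamiltonian for illustration.
# Its exact ground energy is -sqrt(1.25) ~= -1.118.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def ansatz(theta):
    """State preparation Ry(theta)|0> -- run on quantum hardware in a real VQE."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta):
    """Expected value <psi|H|psi> -- estimated from measurements on hardware."""
    psi = ansatz(theta)
    return psi @ H @ psi

# Classical outer loop: finite-difference gradient descent on the parameter.
theta, lr, eps = 0.1, 0.4, 1e-6
for _ in range(200):
    grad = (energy(theta + eps) - energy(theta - eps)) / (2 * eps)
    theta -= lr * grad

print(round(energy(theta), 3))   # -1.118, the exact ground energy
```

In a real VQE, `ansatz` and `energy` run on a QPU and the expectation value is estimated from repeated measurements; only the parameter update stays classical.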

Hybrid Hardware

There are also hybrid hardware approaches, which would take a whole other blog post to appropriately delve into. Dell, for example, has publicized a partnership with IonQ that reportedly pairs Dell’s proprietary classical hardware with IonQ’s simulation capabilities as well as their quantum processing unit. This type of partnership between hardware companies is increasingly likely as new quantum companies deepen their focus and talent acquisition in a narrower area of hardware or simulation.

It’s also worth mentioning quantum annealers here. They are not full quantum computers (they cannot run Shor’s algorithm, for example), but they are currently in use and helpful for some problem areas in drug discovery.

Annealing works to find a global minimum of a problem within a given set of candidate solutions, and quantum annealing can occasionally outperform thermal (classical) annealing. Menten AI, for example, which develops peptide and protein therapeutics using machine learning and quantum computing, has published their theory on designing peptides with a quantum annealer using their qPacker algorithm, which you can read a breakdown of here.
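For intuition, here’s a minimal sketch of the kind of QUBO (quadratic unconstrained binary optimization) problem an annealer accepts, solved with classical thermal annealing instead; the matrix and cooling schedule are invented for illustration, and this is not Menten’s qPacker formulation:

```python
import itertools
import math
import random

# A tiny QUBO instance of the kind submitted to a quantum annealer:
# minimize x^T Q x over binary vectors x. Q is invented for illustration.
Q = [[-2, 1, 1],
     [0, -2, 1],
     [0, 0, -2]]
N = 3

def qubo_energy(x):
    return sum(Q[i][j] * x[i] * x[j] for i in range(N) for j in range(N))

def thermal_anneal(steps=5000, t0=2.0, seed=0):
    """Classical simulated annealing: accept uphill moves with probability
    exp(-delta/T), cooling T toward zero over the run."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(N)]
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9       # linear cooling schedule
        y = x[:]
        y[rng.randrange(N)] ^= 1              # propose a single bit flip
        delta = qubo_energy(y) - qubo_energy(x)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x = y
    # Greedy cleanup so the run always finishes at a local minimum.
    improved = True
    while improved:
        improved = False
        for i in range(N):
            y = x[:]
            y[i] ^= 1
            if qubo_energy(y) < qubo_energy(x):
                x, improved = y, True
    return x

# This instance is small enough to brute-force for comparison -- something
# that isn't possible for the problem sizes a real annealer targets.
best = min(itertools.product([0, 1], repeat=N), key=qubo_energy)
print(qubo_energy(thermal_anneal()), qubo_energy(list(best)))  # -3 -3
```

A quantum annealer takes the same QUBO but explores the energy landscape via quantum fluctuations (tunneling) rather than thermal ones, which is where the occasional advantage comes from.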

Applications in the Drug Discovery Process

There are many different areas of quantum theory being researched for applications in the drug discovery process, but we can only scratch the surface of potential applications because the problem space of computational biology is so vast. The near-term applications share a few features, principally their ability to simulate complex systems, especially particular aspects of molecular systems. We are nowhere near being able to develop a generalized simulation platform even for a category like enzymes; you need fairly specific expertise in a particular type of enzyme, for example, and then work from there.

  (A breakdown of the broader CADD process from BCG)

Across the entire computer-aided drug discovery process, different portions have already been unevenly affected by classical computation research, which is still progressing rapidly. Any developments in near-term quantum applied to this process will need to take into account recent updates to classical algorithms and breakthroughs in broader research.

The ripest areas for NISQ-era innovation are largely within structure-based drug design, but occasionally elsewhere (including binding site identification, pharmacophore modeling, de novo protein modeling, conformational search, docking, and statistical modeling).

Areas like broader virtual screening, molecular dynamics / free energy perturbation (FEP), ligand-based virtual screening, library design, quantitative structure–activity relationship (QSAR) modeling, pKa prediction, membrane permeability, sequence alignment/homology modeling, and crystal structure prediction are too complex for near-term quantum and hopefully will be tackled further down the line, once quantum computers are much more stable.

Some hybrid quantum-classical approaches will likely find ways to improve structure-based drug design of small molecules, but most ligand-based approaches, and the north-star goal of an end-to-end in silico drug discovery process, are much farther off.

Market predictions and opportunities

There are large-scale use cases for near-term quantum algorithms, but what will the company that uses them look like? There are a few different ways to think about it.

First of all, we at Compound have believed for a while that algorithms are not a long-term moat unless you have some proprietary method of gathering data that is defensible for a long time. Instead, proprietary customer understanding, and the product it helps you build, are the defensible long-term moats built on the backs of these algorithms.

This is key to the initial GTM of near-term quantum algorithm companies. They must build up defensibility by expanding their software and feature sets further and further into the materials or drug discovery process, based on proprietary knowledge of how their algorithms uniquely enable discovery.

Existing Companies & Acquisitions

There are a number of companies tackling this today, some coming from a more quantum-native approach, and others from an existing specialization in HPC/computation-parallelization software or hardware for the industry, now looking to push into the frontier by exploring near-term quantum algorithms or simulation development.

It can be very hard to tell what the actual proprietary approach of companies in the space is, especially since many of the companies operate in semi-stealth, but I’ll list a few based on the claims they make and some digging into technical papers they’ve published: 

  1. Kuano AI – Developing optimized enzyme inhibitors with DFT, alongside quantum algorithm development. 
  2. Menten AI – Developing peptide and protein therapeutics using machine learning and quantum computing algorithms they’re developing (including the qPacker algorithm mentioned above) 
  3. PolarisQB – Developing quantum-inspired computing to increase the speed of protein targeting
  4. ProteinQure – Developing a platform to aid the design of protein-based drugs, employing machine learning, quantum algorithms, and TPUs, and looking to build specialized circuits.
  5. QSimulate – Developing a number of tech stacks, including for drug discovery, focusing on quantum simulation for covalent inhibitor design, simulations of ligand-protein interactions for small molecule drug screening, and the Fermionic Quantum Emulator, a software framework for emulating the behavior of quantum computers in fermionic simulations. 
  6. Riverlane – Developing a quantum simulation engine for materials, drugs, and catalysis development more broadly. 
  7. XtalPi – A drug discovery firm with proprietary software combining quantum physics-based and AI-powered drug R&D 

It is also notable that many of these companies often use DFT simulations (neither especially new nor particularly accurate, as I mentioned above) and will advertise them as quantum-inspired.

One notable quantum-chemistry acquisition thus far has been Rahko AI, a quantum ML/algorithms company focused on drug discovery, which had just completed a seed round of funding when it was acquired by Odyssey Therapeutics, an immunotherapy company, around January 6, 2022. Rahko was largely running a lower-code VQE platform and POCs with pharma companies that were eager to test their technology.

Ideal Company Structure: Schrodinger

To understand the ideal company structure for a massive, venture-scale company utilizing any of the near-term quantum developments I mentioned above, Schrodinger is one great example to look at. 

Schrodinger was founded in 1990, develops software for computational chemistry, and has a pipeline of collaborative & internal drug discovery programs. While traditional discovery allows for the synthesis of 1,000 new molecules a year, Schrodinger offers synthesis and evaluation of tens of billions of molecules every week.

Their revenue comes from two places: licensing software, and downstream upside through royalties and milestone payments if drugs they’ve been involved in creating are successful. Their sales cycles with pharma often last 9-12 months because of testing and evaluation periods, contract negotiation, and budgeting. This timeline can be challenging, but the relationships are incredibly sticky once formed. For example, 79% of their revenue last quarter was attributed to $4.4 million from their contract with Bristol Myers Squibb (BMS). At their own cost, they will spend up to 4 years developing molecules for BMS to commercialize, but may then gain up to $2.7 billion in total milestones across all potential targets if any of the molecules they helped develop are successfully commercialized.

In many ways they’re a perfect company analog for the near-term quantum computational chemistry company we’re looking for. Because of that, you might be asking: why do we believe there will be new venture-scale entrants, rather than legacy players developing internal quantum teams and employing these algorithms?

Talent Constraints

The answer is largely: talent constraints. Talent is a very real bottleneck on the success of these companies today, given the deep background needed across a variety of academic fields in addition to quantum, plus the software and product development experience ideal for scaling up a software platform of any sort. This is why we see companies like Zapata Computing, Qureca, and many more announcing online training courses. Some quantum “ecosystem” companies like qBraid are even attempting to capture potential hires as early as high school by pulling them into specialized courses to capture long-term value.

Note on Moats 

We believe incredibly strongly that algorithms are generally only short-term economic moats, and that the consequential product built from the greater customer understanding these algorithms enable is the longer-term moat. This has played out in machine learning many times over, and it follows logically that the same will be true for quantum. Although much of this piece has been an overview of different technological approaches, this all falls second to targeting early customers that give you the ability to productionize these algorithms and produce value that eats further and further into a customer’s process over time.

That being said, one other thing worth mentioning is the false signal that some pharma partnerships can provide. Companies applying newer quantum techniques to drug discovery are generally looking for a signal from pharma customers that there is willingness to purchase and use their software platform because it is valuable internally at scale for drug discovery. However, many of these teams fall into pharma partnerships that are ultimately an IP vacuum, with smaller internal pharma research departments that would almost never convert into a bigger contract (rather than a high-signal indicator of how valuable a new approach might be to the broader pharma company). It’s important to approach these smaller initial research partnerships with some curiosity and skepticism about what realistically might prompt them to convert into larger contracts.

Concluding Thoughts

In summary, some hybrid quantum-classical approaches (across both algorithms and hybrid hardware) will likely find ways to improve structure-based drug design of small molecules, and serve as an initial technical wedge to build a Schrodinger-like SaaS product that is used by pharma companies.

As always — feel free to tweet or message me questions, thoughts, disagreements, or pitches on twitter or at nicolewilliams@compound.vc.


Appendix & Educational Resources

Lastly, I want to provide a few educational resources that really helped me along the way, since I think well-made explainers are a true art form and should be shared with as many people as possible!