
Will quantum computing ever be available off the shelf?

It goes without saying that quantum computing is complex. But people buy very complex things through simple transactions every day. After all, few smartphone buyers know how their devices work. Even a small bar of soap makes it to the shelf only after its raw materials are extracted, refined, manufactured, packaged, shipped and stored.

The question is: Will the complexity of quantum computing be contained to the extent that end users can buy it “ready”? Answer: It depends on what you mean by ready.

If you are picturing something like a MacBook Quantum that you could find on the shelves at Best Buy, that is unlikely to ever be the case. For most organizations, owning a quantum computer in the workplace may never be worthwhile. Unless you are a large, well-funded organization in a unique position to benefit from exclusive access to a quantum device, such as a government entity or a major financial institution, quantum computing resources will always be more practical to access via the cloud.

On the other hand, if you envision a digital marketplace for quantum-powered applications, that could pay off in the next two to five years. But just because you can download some quantum software doesn’t mean it will immediately provide an advantage over classical computing alone, or even that it is useful at all.

Some readers may have already seen this with "turnkey" AI solutions. While there are many commercial AI solutions available on the market, they do not provide an advantage without some level of customization. For example, consider how similar all of the chatbots you come across while browsing the web are. These chatbots have become table stakes, not advantages.

In general, the less customization a turnkey solution requires, the less likely it is to offer a capability that a competitor cannot just as easily install. Each organization has its own unique set of data, IT infrastructure, teams, and problems to solve. Any useful algorithm will need to be tailored to that unique environment to make an impact. This is true for artificial intelligence, and even more true for quantum computing.

Quantum applications require dedicated expertise

Now, some quantum use cases will be more amenable to off-the-shelf applications than others. There are already many ready-made classical optimization solvers, such as Gurobi and CPLEX, and it is not a stretch to imagine quantum-powered versions in the future. Although optimization use cases vary widely, they can all be mapped to well-known mathematical formulations, such as mixed-integer programming. However, it still takes a domain expert to understand which variables or constraints should be prioritized. It also takes a technical expert to map enterprise problems onto mathematical problems that a software solution can solve, and then tune the software for optimal performance.
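
To make that mapping concrete, here is a minimal sketch of a toy project-selection problem expressed as a mixed-integer program using the open-source PuLP library. The projects, values, costs and budget below are invented for illustration; a quantum-powered solver would sit behind a formulation much like this one.

```python
# A toy mixed-integer program: choose which of three projects to fund
# to maximize expected value without exceeding a fixed budget.
# All numbers here are invented for illustration.
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary, PULP_CBC_CMD

values = {"project_a": 12, "project_b": 9, "project_c": 7}   # expected value
costs = {"project_a": 8, "project_b": 5, "project_c": 4}     # cost to fund
budget = 10

model = LpProblem("portfolio_selection", LpMaximize)

# One binary decision variable per project: fund it (1) or not (0).
x = {p: LpVariable(p, cat=LpBinary) for p in values}

# Objective: maximize the total expected value of the funded projects.
model += lpSum(values[p] * x[p] for p in values)

# Constraint: total cost must stay within the budget.
model += lpSum(costs[p] * x[p] for p in costs) <= budget

model.solve(PULP_CBC_CMD(msg=False))
print({p: int(x[p].value()) for p in values})
```

Those few lines are the easy part; deciding which variables, objectives and constraints actually reflect the business is where the expertise comes in.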

Any benefit from off-the-shelf quantum software will depend on having a dedicated team that can adapt the software to the organization’s unique problems. This includes both quantum computing experts and experts who deeply understand the business problems. It may seem like you can wait until the technology is fully mature to start hiring quantum talent, but unfortunately the talent pool is shrinking quickly. In our recent study of enterprise quantum adoption, we found that 69 percent of companies are beginning their path toward quantum adoption, and 51 percent of those organizations are already beginning to assemble their quantum teams. If you wait too long, the smartest minds will be gone.

You’ll also want to nurture relationships with outside advisors. The executives surveyed agreed: 96 percent said they couldn’t successfully embrace quantum computing without outside help. External consultants can save time and energy by helping identify use cases, anticipate obstacles, and build the software infrastructure you’ll need to make effective use of quantum computing.

Building quantum computing infrastructure

Quantum computing will never exist in a vacuum. To add value, quantum computing components must be seamlessly integrated with the rest of the enterprise’s technology stack, including HPC clusters, ETL operations, data warehouses, S3 buckets, security policies and more. Data will need to be processed by classical computers both before and after it runs through quantum algorithms.
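
As a purely illustrative sketch of that classical-quantum sandwich, the Python skeleton below shows where the classical pre- and post-processing steps sit around a quantum call. Every function name here (load_from_warehouse, encode_problem, run_quantum_routine, postprocess) is a hypothetical placeholder, not any vendor's actual API.

```python
# Hypothetical hybrid workflow: classical pre-processing, a quantum step,
# then classical post-processing. Every function is a placeholder standing
# in for real ETL, problem-encoding, and analysis code.

def load_from_warehouse(query: str) -> list:
    """Classical step: pull and clean records from the data warehouse."""
    ...

def encode_problem(records: list):
    """Classical step: map business data onto a form a quantum device accepts."""
    ...

def run_quantum_routine(problem):
    """Quantum step: submit the encoded problem to a cloud quantum backend."""
    ...

def postprocess(raw_result) -> dict:
    """Classical step: decode measurement results into a business-readable answer."""
    ...

def pipeline(query: str) -> dict:
    records = load_from_warehouse(query)   # ETL / data warehouse (classical)
    problem = encode_problem(records)      # pre-processing (classical)
    raw = run_quantum_routine(problem)     # execution (quantum, via the cloud)
    return postprocess(raw)                # post-processing (classical)
```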

This infrastructure matters: any acceleration from quantum computing can easily be wiped out by mundane problems such as poorly organized data storage and suboptimal ETL operations. Expecting a quantum algorithm to deliver an advantage with poor classical infrastructure around it is like expecting a flight to save you time when you don’t have a car to get you to and from the airport.

Infrastructure issues already show up in many current machine learning (ML) use cases. There may be plenty of off-the-shelf tools available, but any useful application of machine learning will ultimately be unique to the intent of the model and the data used to train it. You need a streamlined process for preparing and cleaning the data, making sure the data complies with privacy and governance policies, tracking and correcting model drift, and of course making sure the model does what you want it to do.
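
As a small illustration of what that streamlined preparation can mean in practice, here is a minimal scikit-learn sketch that bundles cleaning, scaling and a model into a single reproducible pipeline. The synthetic data, missing-value rate and model choice are all invented for illustration.

```python
# Minimal illustrative ML pipeline: imputation, scaling and a classifier
# bundled into one object, so the same preprocessing runs identically in
# training, testing and production. All data here is synthetic.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # toy label
X[rng.random(X.shape) < 0.05] = np.nan          # simulate messy, missing values

model = Pipeline([
    ("impute", SimpleImputer(strategy="median")),  # clean missing values
    ("scale", StandardScaler()),                   # normalize features
    ("clf", LogisticRegression()),                 # the actual model
])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model.fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")
```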

As enterprise ML users know, maintaining these applications is an ongoing process. Ideally, you’ll have a development environment for prototyping, a staging environment for testing, and a production environment for scaling the model to enterprise use, leveraging HPC and cloud resources. The complexity of building and deploying ML applications in production has given rise to an entire discipline, MLOps (also referred to as AIOps), to manage it.

The complexity only multiplies when quantum computing is added, which will require a similar "QuantumOps" process to manage it and make quantum applications useful in production. Quantum devices are evolving rapidly, and to keep up you’ll need a way to benchmark the performance of new quantum backends as they emerge, to make sure you have the best configuration for your problem. The last thing you want is to invest millions in developing a quantum application, only for a new hardware or software component to render your work obsolete. It will be critical to have an environment that gives you the flexibility to fine-tune your models, experiment with different configurations, track and compare changes, and iterate quickly.
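
As a rough sketch of what that kind of benchmarking can look like, the snippet below uses Qiskit to transpile the same small circuit against two hypothetical device layouts and compares the resulting depth and two-qubit gate counts. The circuit and coupling maps are invented for illustration; a real evaluation would also weigh error rates, queue times and cost.

```python
# Illustrative benchmark: transpile one circuit for two hypothetical device
# topologies and compare circuit depth and two-qubit gate count.
from qiskit import QuantumCircuit, transpile
from qiskit.transpiler import CouplingMap

# A small example circuit (a 5-qubit GHZ state).
qc = QuantumCircuit(5)
qc.h(0)
for i in range(4):
    qc.cx(i, i + 1)

# Two made-up device layouts to compare: a line and a ring of 5 qubits.
candidate_layouts = {
    "linear_device": CouplingMap.from_line(5),
    "ring_device": CouplingMap.from_ring(5),
}

for name, coupling in candidate_layouts.items():
    compiled = transpile(
        qc,
        coupling_map=coupling,
        basis_gates=["rz", "sx", "x", "cx"],
        optimization_level=3,
    )
    print(
        f"{name}: depth={compiled.depth()}, "
        f"cx_count={compiled.count_ops().get('cx', 0)}"
    )
```

Re-running the same comparison as new backends are announced gives an early, inexpensive signal about whether a hardware change helps or hurts your specific workloads.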

A turnkey future?

In the future, quantum computing may be as invisible as the processor powering the device you’re reading this on. Quantum applications may be as easy to access as a web browser or an email client.

But accessible is not the same as useful.

To gain any useful advantage from quantum computing, you need to lay the groundwork now, building the team and the infrastructure required. Although fault-tolerant quantum machines are still a long way off, organizations can build their own workflows ahead of time and swap in these more powerful back-end machines once they come online.

Ultimately, each company will face unique challenges that require unique quantum applications. Applications may look similar across businesses, but any quantum advantage will depend on adapting the quantum application to the needs and capabilities of the business. This stands in direct contrast to the idea of an off-the-shelf quantum solution, as appealing as that might sound.


Jonathan Romero Fontalvo is a founder and Director of Professional Services at Zapata Computing.