Quantum Machine Learning: A Roadmap for NISQ Era and Beyond

With classical machine learning approaching its practical limits and computing requirements rising with the advent of big data and artificial intelligence, quantum computing has become the need of the hour. However, dealing with the complexities of quantum machine learning is a challenge that many companies are trying to overcome. Although the technology is still at a nascent stage, it is worthwhile for all industries to begin exploring the potential of quantum artificial intelligence and quantum machine learning and to develop a road map for customized use cases.

Current Research on Quantum Machine Learning

Prominent groups currently working on quantum machine learning include Dr. Amit Ray's Compassionate AI Lab, Dr. Maria Schuld's team at Xanadu, D-Wave Systems Inc. in Canada, and the NASA Quantum Artificial Intelligence Laboratory.

Most classical machine learning algorithms are based on serial processing: each iteration depends on feedback from the previous one. Quantum machine learning algorithms, by contrast, can exploit quantum parallelism. Research in quantum-enhanced reinforcement learning, quantum neural networks, quantum k-nearest neighbors, quantum Bayesian networks, quantum learning theory, and quantum support vector machines is growing fast.
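The contrast can be sketched in a few lines. This is a minimal illustration, not real quantum hardware: the classical loop below is inherently sequential, while a simulated n-qubit register placed in uniform superposition lets a single oracle (here an assumed parity-phase oracle, chosen purely for illustration) act on all 2^n amplitudes in one linear operation.

```python
import numpy as np

# Classical iteration: each step depends on feedback from the previous one.
def classical_descent(grad, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):          # inherently serial loop
        x = x - lr * grad(x)
    return x

# Quantum-style parallelism (simulated with state vectors): Hadamard gates
# put n qubits into an equal superposition of all 2^n basis states.
def uniform_superposition(n):
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    state = np.array([1.0])
    for _ in range(n):
        state = np.kron(state, H @ np.array([1.0, 0.0]))
    return state

n = 3
state = uniform_superposition(n)
# A diagonal phase oracle flips the sign of odd-parity basis states;
# one matrix product updates all 2^n amplitudes simultaneously.
oracle = np.diag([(-1) ** bin(i).count("1") for i in range(2 ** n)])
state = oracle @ state
print(np.round(state * np.sqrt(2 ** n)))   # +/-1 parity pattern over all inputs
```

Of course, reading out all those amplitudes is not free; extracting an answer from the superposition is exactly where quantum algorithm design gets hard.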

Figure: Quantum Bayesian Network model

On the hardware side, researchers still can't tell which qubit technology (superconducting loops, trapped ions, neutral atoms, or quantum dots) works best. For one, they haven't settled on clear metrics for comparing different devices. Quantum scientists such as Dr. Amit Ray are building road maps toward 1,000-qubit quantum computers. Researchers such as John Preskill are developing the concept of Noisy Intermediate-Scale Quantum (NISQ) technology, which will be available in the near future. NISQ devices will be useful tools for exploring many-body quantum physics and quantum machine learning problems. Quantum machine learning research is now also focused on combinatorial optimization tasks such as the traveling salesman problem.
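To make the traveling salesman connection concrete, here is a toy sketch of the standard approach used on annealing-style quantum hardware: encode the tour as binary variables x[i, t] (city i visited at step t), add penalty terms for the one-city-per-step and one-step-per-city constraints, and seek the minimum-energy bitstring. The 3-city distance matrix and penalty weight below are illustrative assumptions, and the minimization is brute-forced classically here, where an annealer would sample it.

```python
import itertools
import numpy as np

D = np.array([[0, 1, 4],
              [1, 0, 2],
              [4, 2, 0]])          # symmetric distance matrix, 3 cities
n = 3
P = 10.0                           # penalty weight, assumed large enough

def energy(bits):
    x = bits.reshape(n, n)         # x[i, t] = 1 if city i is visited at step t
    e = 0.0
    for t in range(n):             # tour length, wrapping back to the start
        for i in range(n):
            for j in range(n):
                e += D[i, j] * x[i, t] * x[j, (t + 1) % n]
    # constraint penalties: each step hosts one city, each city appears once
    e += P * ((x.sum(axis=0) - 1) ** 2).sum()
    e += P * ((x.sum(axis=1) - 1) ** 2).sum()
    return e

# Brute-force search over all 2^(n*n) = 512 bitstrings (a quantum annealer
# would instead sample low-energy states of this same objective).
best = min((np.array(b) for b in itertools.product([0, 1], repeat=n * n)),
           key=energy)
print(best.reshape(n, n))
print("tour length:", energy(best))
```

With three cities every valid tour traverses all three edges, so the minimum energy is the full cycle length (7 here) and the winning bitstring is a permutation matrix, i.e. a valid tour.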


Quantum Theory and Quantum Machine Learning

Quantum computing refers to the use of quantum mechanical phenomena such as superposition and entanglement to perform computation. In quantum physics, energy comes in indivisible packets called quanta. Quanta behave very differently from macroscopic matter: particles can behave like waves, and waves can behave as though they were particles.
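The two phenomena named above can be shown in a minimal state-vector simulation (plain NumPy, no quantum SDK assumed): a Hadamard gate puts one qubit into superposition, and a CNOT then entangles it with a second qubit, producing the Bell state (|00> + |11>)/sqrt(2).

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # control = qubit 0,
                 [0, 1, 0, 0],                 # target  = qubit 1
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

psi = np.array([1.0, 0.0, 0.0, 0.0])           # start in |00>
psi = np.kron(H, I) @ psi                      # superposition on qubit 0
psi = CNOT @ psi                               # entangle the two qubits
print(np.round(psi, 3))                        # [0.707 0.    0.    0.707]
```

The result assigns equal amplitude to |00> and |11> and none to the mixed terms: measuring either qubit instantly fixes the other, which is precisely the entanglement that quantum algorithms exploit.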

Quantum theory is incredibly successful, explaining the microscopic world with great accuracy, from the behavior of subatomic particles to chemical reactions to solid-state electronics. There is not a single experimental finding challenging its predictions, and ever more quantum phenomena are exploited in technology, including interferometric sensing and quantum cryptography.

Current Quantum Computing Scenario

The current quantum computing ecosystem has evolved considerably and can be categorized into end-to-end providers (such as IBM, Google, Rigetti, Microsoft, and Alibaba), hardware and system players (such as Intel, IonQ, and QuTech), software and service players (such as 1QBit, QC Ware, Zapata Computing, and CQC), and specialists (such as Q-CTRL, QubitLogic, and Silicon Quantum Computing). This indicates that simultaneous efforts are being made on many fronts of building a quantum computer.


Future Roadmap for Quantum Machine Learning

As quantum machine learning has a steep learning curve and building in-house competence is time-consuming, big companies need to start working toward adopting QML as soon as possible, if they have not already. The promise of quantum artificial intelligence, quantum neural networks, quantum k-nearest neighbors, and quantum learning theory is being realized quickly as quantum hardware matures. However, quantum noise is still a major obstacle for quantum machine learning.

The benefits of quantum machine learning are huge. The era of noisy quantum computers will last at least five to ten years, depending on when researchers can successfully implement error correction. But with the promise of new funding, you can expect an early collapse of the wave function within the next two years.

Lucas Munds: Psychologist, neuroscience therapy advocate, breakthrough technology lover, machine learning expert. Better Humans — Better Society — I Love You.
