Quantum techniques could revolutionise IT

Issued by University of KwaZulu-Natal
KwaZulu-Natal, Jun 26, 2020

A team of scientists from data cybernetics in Germany, the Korea Advanced Institute of Science and Technology (KAIST) and UKZN's Centre for Quantum Technology has published research in "npj Quantum Information" on the successful application of a machine learning tool, known as the kernel method, to quantum computers, formally establishing a link that could enhance a variety of machine learning methods.

Quantum computing, which exploits quantum phenomena to solve some computational problems far faster than classical computing, promises to enhance machine learning tasks such as pattern recognition in the analysis of extremely large datasets.

In classical supervised machine learning, pattern recognition techniques learn from labelled data to predict categories for new input data, aiming for the best possible results while using minimal computational resources. Despite their computational limitations, these techniques have been applied successfully in science and industry.

Vectors, the numerical representations of objects in pattern recognition and machine learning, are collected in what is called a feature space. Kernel methods, an important tool in pattern analysis, refer to a similarity measure of the data that corresponds to an inner product in the feature space. However, classical classifiers relying on kernel methods become limited when the feature space is large and the kernel functions are computationally expensive to evaluate.
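
As a rough illustration of the classical idea, the sketch below computes a kernel (Gram) matrix for a toy dataset using a Gaussian (RBF) kernel; the kernel function, data and parameters here are illustrative assumptions, not the specific choices made in the paper.

    import numpy as np

    def rbf_kernel(x1, x2, gamma=1.0):
        # Gaussian (RBF) kernel: a similarity measure that corresponds to an
        # inner product in an implicit, very high-dimensional feature space.
        return np.exp(-gamma * np.linalg.norm(x1 - x2) ** 2)

    # Toy dataset (illustrative values only)
    X = np.array([[0.0, 1.0], [1.0, 0.5], [0.9, 0.4]])

    # Kernel (Gram) matrix: pairwise similarities between all data points
    K = np.array([[rbf_kernel(a, b) for b in X] for a in X])
    print(K)  # entries near 1.0 indicate very similar points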

The UKZN, KAIST and data cybernetics team further proposed a quantum binary classifier in which the raw data is either classical data encoded into quantum states via a quantum feature map or intrinsically quantum data, and the classification is based on a kernel function that measures the closeness of the test data to the training data.
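
A minimal classical analogue of such a distance-based classifier is sketched below: each training point's label is weighted by its kernel similarity to the test point, and the sign of the total decides the class. The kernel, data and decision rule shown are simplifying assumptions for illustration, not the authors' quantum circuit.

    import numpy as np

    def rbf_kernel(x1, x2, gamma=1.0):
        return np.exp(-gamma * np.linalg.norm(x1 - x2) ** 2)

    def kernel_classifier(x_test, X_train, y_train, gamma=1.0):
        # Weigh each training label by its kernel similarity to the test
        # point; the sign of the total decides the predicted class.
        score = sum(y * rbf_kernel(x_test, x, gamma) for x, y in zip(X_train, y_train))
        return 1 if score >= 0 else -1

    # Illustrative training set with labels -1 and +1
    X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
    y_train = np.array([-1, -1, +1, +1])

    print(kernel_classifier(np.array([0.8, 0.9]), X_train, y_train))  # -> 1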

Dr Daniel Park of KAIST explained that the kernel of the distance-based quantum classifier is based on quantum state fidelity, a natural measure of similarity in the quantum domain, and said the quantum kernel can be tailored systematically with a quantum circuit, making it an excellent candidate for real-world applications.
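
The sketch below illustrates the idea of a fidelity-based quantum kernel using a hypothetical single-qubit angle-encoding feature map; this feature map is an assumption chosen for simplicity, not the tailored circuit described in the paper.

    import numpy as np

    def feature_map(x):
        # Hypothetical single-qubit angle encoding: x -> cos(x)|0> + sin(x)|1>
        return np.array([np.cos(x), np.sin(x)], dtype=complex)

    def fidelity_kernel(x1, x2):
        # Quantum kernel as state fidelity |<phi(x1)|phi(x2)>|^2
        return abs(np.vdot(feature_map(x1), feature_map(x2))) ** 2

    print(fidelity_kernel(0.3, 0.3))        # 1.0: identical encoded states
    print(fidelity_kernel(0.0, np.pi / 2))  # ~0.0: orthogonal encoded states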

The swap-test classifier developed by these researchers will be practically useful for machine learning tasks involving a small number of training data and a large feature space. The 'swap test' is an elementary operation in quantum computing that calculates the state overlap (or fidelity) of two quantum states. Regardless of the number or size of the data, the protocol requires only a constant number of repetitions, and the team's novel approach means labelled training data can be densely packed into a quantum state and then compared to the test data.
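
For readers curious about how a swap test yields the overlap, the following statevector simulation of the standard textbook construction (single-qubit inputs, one ancilla) reproduces the relation that the ancilla is measured in |0> with probability 1/2 + |<a|b>|^2 / 2; it is a generic illustration, not the researchers' full classifier circuit.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

    def cswap():
        # Controlled-SWAP on three qubits with ordering |control, a, b>
        U = np.zeros((8, 8))
        for c in range(2):
            for a in range(2):
                for b in range(2):
                    src = (c << 2) | (a << 1) | b
                    dst = ((c << 2) | (b << 1) | a) if c == 1 else src  # swap a, b when control is 1
                    U[dst, src] = 1
        return U

    def swap_test(psi_a, psi_b):
        # Returns the probability of measuring the ancilla in |0>,
        # which equals 1/2 + |<a|b>|^2 / 2.
        state = np.kron(np.array([1, 0], dtype=complex), np.kron(psi_a, psi_b))
        state = np.kron(H, np.eye(4)) @ state   # Hadamard on the ancilla
        state = cswap() @ state                 # controlled swap of |a> and |b>
        state = np.kron(H, np.eye(4)) @ state   # second Hadamard on the ancilla
        return np.sum(np.abs(state[:4]) ** 2)   # outcomes with ancilla = 0

    a = np.array([1, 0], dtype=complex)               # |0>
    b = np.array([1, 1], dtype=complex) / np.sqrt(2)  # |+>
    print(swap_test(a, b))                            # 0.75
    print(0.5 + 0.5 * abs(np.vdot(a, b)) ** 2)        # same value, computed directly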

"I am delighted to see this great improvement, which makes it possible to apply kernel methods to their full extent," said Professor Francesco Petruccione. He noted the key advantage of the swap-test classifier in comparison to earlier classification algorithms is that the state fidelity of two quantum states includes the imaginary parts of the probability amplitudes, which enables use of the full feature Hilbert vector space.

To demonstrate the usefulness of the classification protocol, independent researcher Dr Carsten Blank implemented the classifier and compared classical simulations that include a realistic noise model with a proof-of-principle experiment on the IBM Quantum Experience cloud platform.

"In September 2019, the results were suddenly astonishingly good, proving that our ideas are valid and that the performance of devices is rapidly improving. This is a promising sign that the field is progressing," said Blank.

Dr Park and Dr June-Koo Kevin Rhee from KAIST, with Prof Petruccione, found that a technique called quantum forking can be used to implement the classifier.

"This makes it possible to start the protocol from scratch when the labelled training data and the test data state are all in a product state," said Rhee. "An application to this might be classification of quantum states that we, as humans, don't have a priori knowledge about."

"We demonstrated that our classifier is equivalent to measuring the expectation value of a Helstrom operator, from which the well-known optimal quantum state discrimination can be derived," said Park, calling the finding surprising as it links the optimal state discrimination to state classification and motivates more research.