The basic idea of quantum computing is surprisingly similar to that of kernel methods in machine learning, namely to efficiently perform computations in an intractably large Hilbert space.
Researchers Maria Schuld and Nathan Killoran from Canadian quantum computing firm Xanadu explore some theoretical foundations of this link and show how it opens up a new avenue for the design of quantum machine learning algorithms.
“The Hilbert space is the place where the states that describe quantum systems live, and it is a very large place indeed,” explained Schuld in a Medium post. For a 50-qubit quantum computer, that means something like a quadrillion-dimensional space; for a single mode of a continuous-variable quantum computer, the Hilbert space has an infinite number of dimensions.
Kernels are functions that compute a distance measure between two data points, for example between two images or text documents. Machine learning models such as support vector machines are built on kernels (applications include financial time series forecasting).
It turns out that every kernel is related to a large — and sometimes infinite-dimensional — feature space. Computing the distance measure of two data points is equivalent to embedding these data points into the feature space and computing the inner product of the embedded vectors.
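This equivalence, the so-called kernel trick, can be checked directly for a simple polynomial kernel. The sketch below (a classical illustration, not taken from the paper) shows that evaluating the kernel on raw inputs gives the same number as explicitly embedding the points and taking an inner product:

```python
import numpy as np

# Kernel trick for the homogeneous degree-2 polynomial kernel k(x, y) = (x . y)^2.
# phi maps a 2-d point into the 3-d space of degree-2 monomials, and
# k(x, y) equals the inner product <phi(x), phi(y)> in that feature space.

def phi(v):
    """Explicit degree-2 feature map for a 2-d input (x1, x2)."""
    x1, x2 = v
    return np.array([x1 * x1, np.sqrt(2) * x1 * x2, x2 * x2])

def kernel(x, y):
    """The same quantity computed directly, without ever building phi."""
    return np.dot(x, y) ** 2

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

direct = kernel(x, y)              # (1*3 + 2*0.5)^2 = 16.0
embedded = np.dot(phi(x), phi(y))  # same value, via the feature space
print(direct, embedded)
```

The point of the trick is that the left-hand computation never touches the (possibly huge) feature space; the quantum proposal replaces it with a Hilbert-space inner product that a classical computer cannot evaluate efficiently.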
In a sense, this is the opposite of neural networks, where data are compressed to extract a few features. Here, data are effectively ‘blown up’, which can make them easier to analyze.
Schuld and Killoran discuss two approaches for building a quantum model for classification.
In the first approach, the quantum device estimates inner products of quantum states to compute a classically intractable kernel. This kernel can be fed into any classical kernel method such as a support vector machine.
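The division of labor in this first approach can be sketched classically. In the snippet below, a Gaussian kernel stands in for the quantum-estimated inner product, and a kernel perceptron stands in for the downstream classical method (the paper itself uses a support vector machine); only the Gram matrix would come from the quantum device:

```python
import numpy as np

# Hedged sketch of approach one: a quantum device would estimate the kernel
# values k(x, y); here a classical Gaussian kernel is an assumed stand-in.
# Any kernel method can then consume the precomputed Gram matrix -- this
# example uses a kernel perceptron for brevity, not the authors' SVM.

def kernel(x, y, gamma=1.0):
    # Stand-in for the quantum-estimated inner product of embedded states.
    return np.exp(-gamma * np.sum((x - y) ** 2))

# Toy two-class data in 2-d.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([-1, -1, 1, 1])

# Precompute the Gram matrix once: the only "quantum" subroutine needed.
K = np.array([[kernel(a, b) for b in X] for a in X])

# Kernel perceptron: learn one dual coefficient per training point.
alpha = np.zeros(len(X))
for _ in range(20):
    for i in range(len(X)):
        if y[i] * np.dot(alpha * y, K[:, i]) <= 0:
            alpha[i] += 1.0

def predict(x):
    s = sum(alpha[j] * y[j] * kernel(X[j], x) for j in range(len(X)))
    return 1 if s >= 0 else -1

print([predict(x) for x in X])
```

Because the classical learner only ever sees kernel values, swapping the stand-in function for calls to quantum hardware leaves the rest of the pipeline unchanged.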
In the second approach, they use a variational quantum circuit as a linear model that classifies data explicitly in Hilbert space. They illustrate these ideas with a feature map based on squeezing in a continuous-variable system and visualize the working principle with two-dimensional mini-benchmark datasets.
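The working principle of the second approach can be shown with a deliberately tiny classical analogue. In the sketch below (assumed for illustration; the paper's feature map is based on squeezing, and the weights would be learned by a variational circuit), a hand-set linear model in a polynomial feature space separates XOR-type data that no linear model could separate in the original input space:

```python
import numpy as np

# Minimal sketch of approach two: classify explicitly in feature space with
# a linear model. The feature map below is an assumed classical stand-in for
# the squeezing-based embedding, and the weight vector is hand-set rather
# than trained by a variational circuit.

def feature_map(x):
    # Embed (x1, x2) into 3-d feature space, adding an interaction term.
    return np.array([x[0], x[1], x[0] * x[1]])

X = np.array([[1, 1], [-1, -1], [1, -1], [-1, 1]], dtype=float)
labels = [1, 1, -1, -1]  # XOR pattern: not linearly separable in 2-d

w = np.array([0.0, 0.0, 1.0])  # linear decision boundary in feature space

preds = [1 if np.dot(w, feature_map(x)) >= 0 else -1 for x in X]
print(preds)
```

A hyperplane in the feature space corresponds to a curved decision boundary in the input space, which is exactly what the paper's two-dimensional mini-benchmarks visualize.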