
📰News:
- Determinable and interpretable network representation for link prediction
- Azure Quantum Credits Program propels quantum innovation and exploration for researchers, educators, and students
- View from India: Where next for quantum computing?
📽Videos:
- Quantum Machine Learning Explained
- Qiskit Fall Fest CIC-IPN Mexico 2022 - Quantum Machine Learning
- Quantum Machine Learning Neuroimaging for Alzheimer’s Disease
- MLBBQ: Quantum Machine Learning by Pavel Popov
- What is a Qubit, and how do Quantum Computers work with Q#
📗Papers:
Representation Theory for Geometric Quantum Machine Learning
Michael Ragone, Paolo Braccia, Quynh T. Nguyen, Louis Schatzki, Patrick J. Coles, Frederic Sauvage, Martin Larocca, M. Cerezo
Oct 17 2022
quant-ph cs.LG math.RT stat.ML
arXiv:2210.07980v1
Recent advances in classical machine learning have shown that creating models with inductive biases encoding the symmetries of a problem can greatly improve performance. Importation of these ideas, combined with an existing rich body of work at the nexus of quantum theory and symmetry, has given rise to the field of Geometric Quantum Machine Learning (GQML). Following the success of its classical counterpart, it is reasonable to expect that GQML will play a crucial role in developing problem-specific and quantum-aware models capable of achieving a computational advantage. Despite the simplicity of the main idea of GQML — create architectures respecting the symmetries of the data — its practical implementation requires a significant amount of knowledge of group representation theory. We present an introduction to representation theory tools from the optics of quantum learning, driven by key examples involving discrete and continuous groups. These examples are sewn together by an exposition outlining the formal capture of GQML symmetries via “label invariance under the action of a group representation”, a brief (but rigorous) tour through finite and compact Lie group representation theory, a reexamination of ubiquitous tools like Haar integration and twirling, and an overview of some successful strategies for detecting symmetries.
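The abstract's mention of Haar integration and twirling is concrete enough to sketch in code. Below is a minimal numpy illustration (our own toy example, not taken from the paper) of twirling an operator over a discrete group, here Z_2 represented on one qubit by {I, X}: operators that respect the symmetry are left unchanged, while symmetry-breaking components are averaged away.

```python
# Minimal sketch: twirling over a finite group (assumed toy example, not the paper's code)
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
representation = [I, X]  # unitary representation of Z_2 on one qubit

def twirl(op, rep):
    """Average U op U^dagger over all group elements (finite-group 'Haar integration')."""
    return sum(U @ op @ U.conj().T for U in rep) / len(rep)

Z = np.array([[1, 0], [0, -1]], dtype=complex)
print(twirl(Z, representation))  # Z anticommutes with X, so its twirl vanishes
print(twirl(X, representation))  # X is invariant, so its twirl is X itself
```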
Quantum Event Learning and Gentle Random Measurements
Adam Bene Watts, John Bostanci
arXiv:2210.09155v1
We prove the expected disturbance caused to a quantum system by a sequence of randomly ordered two-outcome projective measurements is upper bounded by the square root of the probability that at least one measurement in the sequence accepts. We call this bound the Gentle Random Measurement Lemma. We also extend the techniques used to prove this lemma to develop protocols for problems in which we are given sample access to an unknown state ρ and asked to estimate properties of the accepting probabilities Tr[M_i ρ] of a set of measurements {M_1, M_2, …, M_m}. We call these types of problems Quantum Event Learning Problems. In particular, we show randomly ordering projective measurements solves the Quantum OR problem, answering an open question of Aaronson. We also give a Quantum OR protocol which works on non-projective measurements and which outperforms both the random measurement protocol analyzed in this paper and the protocol of Harrow, Lin, and Montanaro. However, this protocol requires a more complicated type of measurement, which we call a Blended Measurement. When the total (summed) accepting probability of unlikely events is bounded, we show the random and blended measurement Quantum OR protocols developed in this paper can also be used to find a measurement M_i such that Tr[M_i ρ] is large. We call the problem of finding such a measurement Quantum Event Finding. Finally, we show Blended Measurements also give a sample-efficient protocol for Quantum Mean Estimation: a problem in which the goal is to estimate the average accepting probability of a set of measurements on an unknown state.
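To make the random-ordering idea tangible, here is a small numpy sketch (our own illustration with a toy two-outcome example; the paper's analysis is far more general): projective measurements are applied in a random order, the state is collapsed after each rejection, and the protocol reports whether any measurement accepted.

```python
# Sketch of randomly ordered two-outcome projective measurements (assumed toy example)
import numpy as np

rng = np.random.default_rng(0)

def random_measurement_protocol(rho, projectors):
    """Apply the projectors in random order; return True if any measurement accepts."""
    for idx in rng.permutation(len(projectors)):
        P = projectors[idx]
        p_accept = float(np.real(np.trace(P @ rho)))
        if rng.random() < p_accept:
            return True                                # event observed
        # rejection: collapse onto the complementary outcome and renormalise
        Q = np.eye(rho.shape[0]) - P
        rho = Q @ rho @ Q / max(1.0 - p_accept, 1e-12)
    return False

# toy example: rho = |0><0| probed by projectors onto |0> and |+>
ket0 = np.array([1, 0], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
projectors = [np.outer(ket0, ket0.conj()), np.outer(plus, plus.conj())]
rho = np.outer(ket0, ket0.conj())
print(np.mean([random_measurement_protocol(rho, projectors) for _ in range(2000)]))
```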
Theory for Equivariant Quantum Neural Networks
Quynh T. Nguyen, Louis Schatzki, Paolo Braccia, Michael Ragone, Patrick J. Coles, Frederic Sauvage, Martin Larocca, M. Cerezo
Oct 18 2022
arXiv:2210.08566v1
Most currently used quantum neural network architectures have little-to-no inductive biases, leading to trainability and generalization issues. Inspired by a similar problem, recent breakthroughs in classical machine learning address this crux by creating models encoding the symmetries of the learning task. This is materialized through the usage of equivariant neural networks whose action commutes with that of the symmetry. In this work, we import these ideas to the quantum realm by presenting a general theoretical framework to understand, classify, design and implement equivariant quantum neural networks. As a special implementation, we show how standard quantum convolutional neural networks (QCNN) can be generalized to group-equivariant QCNNs where both the convolutional and pooling layers are equivariant under the relevant symmetry group. Our framework can be readily applied to virtually all areas of quantum machine learning, and provides hope to alleviate central challenges such as barren plateaus, poor local minima, and sample complexity.
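As a toy illustration of what "equivariant" means here (our own two-qubit example assuming a Z_2 bit-flip symmetry, not a construction from the paper), one can check whether a candidate layer generator commutes with the symmetry representation; a layer generated by such an operator then commutes with the symmetry action.

```python
# Sketch: testing equivariance of QNN generators under a Z_2 symmetry (assumed example)
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I = np.eye(2, dtype=complex)

def kron(*ops):
    out = np.array([[1]], dtype=complex)
    for op in ops:
        out = np.kron(out, op)
    return out

symmetry = kron(X, X)                       # representation of the nontrivial Z_2 element
equivariant_gen = kron(X, I) + kron(I, X)   # commutes with X (x) X
generic_gen = kron(Z, I)                    # does not

def commutes(A, B, tol=1e-10):
    return np.allclose(A @ B - B @ A, 0, atol=tol)

print(commutes(symmetry, equivariant_gen))  # True  -> exp(-i*theta*G) is equivariant
print(commutes(symmetry, generic_gen))      # False -> this layer breaks the symmetry
```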
Theoretical Guarantees for Permutation-Equivariant Quantum Neural Networks
Louis Schatzki, Martin Larocca, Frederic Sauvage, M. Cerezo
Oct 19 2022
quant-ph cs.LG stat.ML
arXiv:2210.09974v1
Despite the great promise of quantum machine learning models, there are several challenges one must overcome before unlocking their full potential. For instance, models based on quantum neural networks (QNNs) can suffer from excessive local minima and barren plateaus in their training landscapes. Recently, the nascent field of geometric quantum machine learning (GQML) has emerged as a potential solution to some of those issues. The key insight of GQML is that one should design architectures, such as equivariant QNNs, encoding the symmetries of the problem at hand. Here, we focus on problems with permutation symmetry (i.e., the symmetric group S_n), and show how to build S_n-equivariant QNNs. We provide an analytical study of their performance, proving that they do not suffer from barren plateaus, quickly reach overparametrization, and can generalize well from small amounts of data. To verify our results, we perform numerical simulations for a graph state classification task. Our work provides the first theoretical guarantees for equivariant QNNs, thus indicating the extreme power and potential of GQML.
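A quick way to see the kind of object the paper studies is to build a permutation-symmetric generator, such as the sum of single-qubit X operators, and verify that it commutes with every qubit transposition. The sketch below (our own check for n = 3, not the authors' code) does exactly that with numpy.

```python
# Sketch: an S_n-equivariant generator for n = 3 qubits (assumed toy verification)
import numpy as np
from itertools import combinations

n = 3
X = np.array([[0, 1], [1, 0]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def op_on(site_op, site, n):
    """Embed a single-qubit operator on a given site of an n-qubit register."""
    out = np.array([[1]], dtype=complex)
    for k in range(n):
        out = np.kron(out, site_op if k == site else I2)
    return out

def swap(i, j, n):
    """Unitary representation of the transposition (i j) on n qubits."""
    dim = 2 ** n
    S = np.zeros((dim, dim))
    for b in range(dim):
        bits = [(b >> (n - 1 - k)) & 1 for k in range(n)]
        bits[i], bits[j] = bits[j], bits[i]
        S[int("".join(map(str, bits)), 2), b] = 1
    return S

G = sum(op_on(X, i, n) for i in range(n))   # permutation-symmetric generator sum_i X_i
print(all(np.allclose(swap(i, j, n) @ G, G @ swap(i, j, n))
          for i, j in combinations(range(n), 2)))   # True
```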
Investigating Quantum Many-Body Systems with Tensor Networks, Machine Learning and Quantum Computers
Oct 21 2022
quant-ph
arXiv:2210.11130v1
We perform quantum simulation on classical and quantum computers and set up a machine learning framework in which we can map out phase diagrams of known and unknown quantum many-body systems in an unsupervised fashion. The classical simulations are done with state-of-the-art tensor network methods in one and two spatial dimensions. For one-dimensional systems, we utilize matrix product states (MPS) that have many practical advantages and can be optimized using the efficient density matrix renormalization group (DMRG) algorithm. The data for two-dimensional systems is obtained from projected entangled pair states (PEPS) optimized via imaginary time evolution. Data in the form of observables, entanglement spectra, or parts of the state vectors from these simulations is then fed into a deep learning (DL) pipeline where we perform anomaly detection to map out the phase diagram. We extend this notion to quantum computers and introduce quantum variational anomaly detection. Here, we first simulate the ground state and then process it in a quantum machine learning (QML) manner. Both simulation and QML routines are performed on the same device, which we demonstrate both in classical simulation and on a physical quantum computer hosted by IBM.
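For readers new to the tensor-network side, the matrix product state idea is easy to demonstrate at toy scale. The following numpy sketch (our own example, not the authors' code) splits a random four-site state vector into MPS tensors by successive SVDs and contracts them back to verify the decomposition.

```python
# Sketch: exact MPS decomposition of a small state vector by repeated SVDs (assumed toy)
import numpy as np

n = 4                                          # number of qubits / spins
psi = np.random.randn(2 ** n) + 1j * np.random.randn(2 ** n)
psi /= np.linalg.norm(psi)

tensors, rest = [], psi.reshape(1, -1)
for _ in range(n - 1):
    chi, remaining = rest.shape
    rest = rest.reshape(chi * 2, remaining // 2)
    U, S, Vh = np.linalg.svd(rest, full_matrices=False)
    tensors.append(U.reshape(chi, 2, -1))      # site tensor with physical dimension 2
    rest = np.diag(S) @ Vh                     # remainder, passed on to the next site
tensors.append(rest.reshape(-1, 2, 1))

# contract the MPS back together and confirm it reproduces the original state
recon = tensors[0]
for T in tensors[1:]:
    recon = np.tensordot(recon, T, axes=([-1], [0]))
print(np.allclose(recon.reshape(-1), psi))     # True
```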
Escaping barren plateaus in approximate quantum compiling
Niall F. Robertson, Albert Akhriev, Jiri Vala, Sergiy Zhuk
Oct 18 2022
quant-ph
arXiv:2210.09191v1
Quantum compilation provides a method to translate quantum algorithms at a high level of abstraction into their implementations as quantum circuits on real hardware. One approach to quantum compiling is to design a parameterised circuit and to use techniques from optimisation to find the parameters that minimise the distance between the parameterised circuit and the target circuit of interest. While promising, such an approach typically runs into the obstacle of barren plateaus, i.e. large regions of parameter space in which the gradient vanishes. A number of recent works focusing on so-called quantum-assisted quantum compiling have developed new techniques to induce gradients in some particular cases. Here we develop and implement a set of related techniques such that they can be applied to classically assisted quantum compiling. We consider both approximate state preparation and approximate circuit preparation and show that, in both cases, we can significantly improve convergence with the approach developed in this work.
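The cost function at the heart of this kind of variational compiling is simple to write down. Below is a deliberately tiny single-qubit sketch (our own toy, not the paper's method, which targets much larger circuits where barren plateaus actually appear): a parameterised RZ gate is tuned to match a target rotation by minimising a Hilbert-Schmidt-style infidelity with finite-difference gradient descent.

```python
# Sketch: variational compiling of a single-qubit rotation (assumed toy example)
import numpy as np

def rz(theta):
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

target = rz(1.2)                                   # the "circuit of interest"

def cost(theta):
    # Hilbert-Schmidt-style distance: 1 - |Tr(V(theta)^dagger U)| / dim
    return 1 - abs(np.trace(rz(theta).conj().T @ target)) / 2

theta, lr, eps = 0.0, 0.5, 1e-6
for _ in range(200):
    grad = (cost(theta + eps) - cost(theta - eps)) / (2 * eps)  # finite-difference gradient
    theta -= lr * grad
print(theta, cost(theta))                          # theta -> ~1.2, cost -> ~0
```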
OpenQAOA — An SDK for QAOA
Vishal Sharma, Nur Shahidee Bin Saharan, Shao-Hen Chiew, Ezequiel Ignacio Rodríguez Chiacchio, Leonardo Disilvestro, Tommaso Federico Demarie, Ewan Munro
Oct 18 2022
quant-ph
arXiv:2210.08695v1
We introduce OpenQAOA, a Python open-source multi-backend Software Development Kit to create, customise, and execute the Quantum Approximate Optimisation Algorithm (QAOA) on Noisy Intermediate-Scale Quantum (NISQ) devices and simulators. OpenQAOA facilitates the creation of QAOA workflows, removing the more tedious and repetitive aspects of implementing variational quantum algorithms. It standardises and automates tasks such as circuit creation across different backends, ansatz parametrisation, the optimisation loop, the formatting of results, and extensions of QAOA such as Recursive QAOA. OpenQAOA is designed to simplify and enhance research on QAOA, providing a robust and consistent framework for experimentation with, and deployment of, the algorithm and its variations. Importantly, a heavy emphasis is placed on the provision of tools to enable QAOA computations at the scale of hundreds or thousands of qubits.
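To show what such a workflow automates under the hood, here is a self-contained numpy sketch of a depth-1 QAOA circuit for MaxCut on a triangle graph. This is our own illustration and deliberately does not use the OpenQAOA API; see the package documentation for the real interface.

```python
# Sketch: depth-1 QAOA expectation for MaxCut on a triangle (assumed toy, not OpenQAOA)
import numpy as np

edges, n = [(0, 1), (1, 2), (0, 2)], 3
dim = 2 ** n

# diagonal MaxCut cost: number of cut edges for each computational basis string
def bit(b, i):
    return (b >> (n - 1 - i)) & 1
cost = np.array([sum(bit(b, i) != bit(b, j) for i, j in edges) for b in range(dim)])

X = np.array([[0, 1], [1, 0]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def qaoa_expectation(gamma, beta):
    state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)   # |+>^n initial state
    state = np.exp(-1j * gamma * cost) * state               # cost layer e^{-i*gamma*C}
    rx = np.cos(beta) * I2 - 1j * np.sin(beta) * X           # e^{-i*beta*X} on one qubit
    mixer = np.array([[1]], dtype=complex)
    for _ in range(n):
        mixer = np.kron(mixer, rx)                            # mixer layer e^{-i*beta*B}
    state = mixer @ state
    return float(np.real(state.conj() @ (cost * state)))     # expected cut value

print(qaoa_expectation(gamma=0.6, beta=0.4))                  # optimum for a triangle is 2
```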
General Classification of Entanglement Using Machine Learning
Oct 17 2022
quant-ph
arXiv:2210.07711v1
A classification of multipartite entanglement in qubit systems is introduced for pure and mixed states. The classification is based on the robustness of the entanglement against the partial trace operation. We then use current machine learning and deep learning techniques to automatically classify random states of two, three, and four qubits without needing to compute the amount of each type of entanglement in every run; this is done only during the learning process. The technique shows high, near-perfect accuracy for pure states. As expected, this accuracy drops somewhat when dealing with mixed states and when the number of parties increases.
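As a flavour of the approach, here is a rough two-qubit, pure-state sketch under our own assumptions (the paper handles mixed states and up to four qubits, and its labelling scheme differs): Haar-random states are labelled entangled, tensor products of random single-qubit states are labelled separable, and a small classifier is trained on the raw amplitudes. Accuracy will fall well short of the paper's figures since, among other things, the global phase is left in the features.

```python
# Sketch: toy ML classification of separable vs. entangled two-qubit pure states
# (assumed setup, not the paper's dataset or labelling)
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)

def haar_state(dim):
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

def sample(separable):
    psi = np.kron(haar_state(2), haar_state(2)) if separable else haar_state(4)
    return np.concatenate([psi.real, psi.imag])   # raw amplitudes as features

X = np.array([sample(separable=(i % 2 == 0)) for i in range(4000)])
y = np.array([i % 2 for i in range(4000)])        # 1 = separable, 0 = (generically) entangled

clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
clf.fit(X[:3000], y[:3000])
print(clf.score(X[3000:], y[3000:]))              # held-out accuracy of the toy classifier
```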