The adoption of machine learning (ML) into theoretical physics comes on the heels of an explosion of industry progress that started in 2012. Since that time, computer scientists have demonstrated that learning algorithms, those designed to respond and adapt to new data, provide an exceptionally powerful platform for tackling many difficult tasks in image recognition, natural language comprehension, game playing and more. These algorithms have now conquered benchmarks previously thought to be decades away due to their mathematical complexity. In the last several years, researchers at Perimeter have begun to apply machine learning algorithms to a new set of problems, including condensed matter physics, quantum information, numerical relativity, quantum gravity and astrophysics.
We propose to generalise classical maximum likelihood learning to density matrices. As the objective function, we propose a quantum likelihood that is related to the cross entropy between density matrices. We apply this learning criterion to the quantum Boltzmann machine (QBM), previously proposed by Amin et al. We demonstrate, for the first time, learning a quantum Hamiltonian from quantum statistics using this approach. For the anti-ferromagnetic Heisenberg and XYZ models, we recover the true ground-state wave function and Hamiltonian.
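As a rough illustration of the objective described above, the sketch below computes a quantum likelihood of the form Tr(η log ρ), the quantum analogue of the classical cross entropy, between a target density matrix η and a model density matrix ρ. This is a minimal classical (NumPy) sketch under my own assumptions about the precise functional form; the function name and the `eps` regularisation are illustrative, not taken from the paper.

```python
import numpy as np

def quantum_log_likelihood(eta, rho, eps=1e-12):
    """Return Tr(eta log rho): the (negated) quantum cross entropy
    between a target density matrix eta and a model density matrix rho.
    Eigenvalues of rho are clipped at eps to keep the logarithm finite."""
    vals, vecs = np.linalg.eigh(rho)          # rho is Hermitian
    vals = np.clip(vals, eps, None)           # regularise zero eigenvalues
    log_rho = vecs @ np.diag(np.log(vals)) @ vecs.conj().T
    return float(np.real(np.trace(eta @ log_rho)))
```

By the quantum analogue of Gibbs' inequality, this quantity is maximised (for fixed η) when ρ = η, which is what makes it a sensible training objective for a model density matrix.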
Short-depth algorithms are crucial for reducing computational error on near-term quantum computers, for which decoherence and gate infidelity remain important issues. Here we present a machine-learning-inspired approach for discovering such algorithms. We apply our method to a ubiquitous primitive: computing the overlap Tr(rho*sigma) between two quantum states rho and sigma. The standard algorithm for this task, known as the Swap Test, is used in many applications such as quantum support vector machines and, when specialized to rho=sigma, quantifies the second-order Rényi entanglement.
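To make the primitive concrete, the sketch below classically computes the overlap Tr(ρσ) that the Swap Test estimates, together with the ancilla statistics of the standard Swap Test, which returns outcome 0 with probability (1 + Tr(ρσ))/2. This is a minimal reference sketch, not the short-depth circuit discovered by the method above; function names are my own.

```python
import numpy as np

def overlap(rho, sigma):
    """Overlap Tr(rho sigma) between two density matrices.
    For rho == sigma this is the purity Tr(rho^2), which determines
    the second-order Renyi entropy S_2 = -log Tr(rho^2)."""
    return float(np.real(np.trace(rho @ sigma)))

def swap_test_p0(rho, sigma):
    """Probability of measuring 0 on the ancilla in the Swap Test,
    P(0) = (1 + Tr(rho sigma)) / 2."""
    return 0.5 * (1.0 + overlap(rho, sigma))
```

For example, two identical pure states give P(0) = 1, while orthogonal pure states give P(0) = 1/2; estimating P(0) from repeated runs therefore yields an estimate of Tr(ρσ).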