Jonas Nüßlein, Thomas Gabor, Claudia Linnhoff-Popien, Sebastian Feld
Quadratic Unconstrained Binary Optimization (QUBO) can be seen as a generic language for optimization problems. QUBOs attract particular attention since they can be solved with quantum hardware, like quantum annealers or quantum gate computers running QAOA. In this paper, we present two novel QUBO formulations for k-SAT and Hamiltonian Cycles that scale significantly better than existing approaches. For k-SAT we reduce the growth of the QUBO matrix from O(k) to O(log(k)). For Hamiltonian Cycles the matrix no longer grows quadratically in the number of nodes, as in current formulations, but linearly in the number of edges and logarithmically in the number of nodes.
We present these two formulations not as mathematical expressions, as most QUBO formulations are, but as meta-algorithms, which facilitates the design of more complex QUBO formulations and allows their easy reuse as building blocks within larger ones.
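For context, a minimal sketch of how a single clause is conventionally encoded as a QUBO penalty (the standard linear-size construction, not the paper's O(log(k)) formulation; the toy matrix below is our own example): the penalty is zero exactly for assignments satisfying (x1 OR x2).

```python
import itertools
import numpy as np

# Conventional QUBO penalty for the clause (x1 OR x2):
# E(x) = 1 - x1 - x2 + x1*x2, which is 0 iff the clause is satisfied.
Q = np.array([[-1.0, 1.0],
              [0.0, -1.0]])
offset = 1.0  # constant term kept outside the matrix

def energy(x):
    x = np.asarray(x)
    return float(x @ Q @ x) + offset

for bits in itertools.product([0, 1], repeat=2):
    print(bits, energy(bits))  # penalty 1 only for (0, 0)
```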
Thomas Gabor, Michael Lachner, Nico Kraus, Christoph Roch, Jonas Stein, Daniel Ratke, Claudia Linnhoff-Popien
Based on the quantum-assisted genetic algorithm (QAGA) [11] and related approaches, we introduce several modifications of QAGA to search for more promising solvers for (at least) graph coloring problems, knapsack problems, Boolean satisfiability problems, and an equal combination of these three. We empirically test the efficiency of these algorithmic changes on a purely classical version of the algorithm (the simulated-annealing-assisted genetic algorithm, SAGA) and verify the benefit of selected modifications when using quantum annealing hardware. Our results point towards an inherent benefit of a simpler and more flexible algorithm design.
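As a rough illustration of the SAGA idea (our sketch under simplifying assumptions, with a stand-in one-max fitness instead of the paper's benchmark problems): a genetic algorithm whose mutation step is a short simulated-annealing run on each offspring.

```python
import math
import random

N, POP, GENS = 32, 20, 50

def fitness(bits):
    return sum(bits)  # stand-in objective (one-max)

def sa_mutate(bits, steps=30, t0=2.0):
    # Simulated annealing used as the mutation operator.
    cur, f_cur = bits[:], fitness(bits)
    for s in range(steps):
        t = t0 * (1 - s / steps) + 1e-9
        cand = cur[:]
        cand[random.randrange(N)] ^= 1    # flip one bit
        f_cand = fitness(cand)
        if f_cand >= f_cur or random.random() < math.exp((f_cand - f_cur) / t):
            cur, f_cur = cand, f_cand
    return cur

pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]              # elitist selection
    children = []
    while len(parents) + len(children) < POP:
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, N)
        children.append(sa_mutate(a[:cut] + b[cut:]))  # crossover, then SA
    pop = parents + children
print(max(map(fitness, pop)))
```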
M. Friedrich, C. Roch, S. Feld, C. Hahn, and P. Fayolle
CSG trees are an intuitive yet powerful technique for representing geometry as a combination of Boolean set-operations and geometric primitives. In general, infinitely many trees describe the same 3D solid. However, some trees are optimal regarding the number of operations used, their shape, or other attributes, such as their suitability for intuitive, human-controlled editing. In this paper, we present a systematic comparison of newly developed and existing tree optimization methods and propose a flexible processing pipeline with a focus on tree editability. The pipeline uses a redundancy removal and decomposition stage for complexity reduction and different (meta-)heuristics for the remaining tree optimization. We also introduce a new quantitative measure for CSG tree editability and show how it can be used as a constraint in the optimization process.
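To make the redundancy notion concrete, a tiny sketch (ours, in 2D for brevity) of two CSG trees describing the same solid, where the absorption law A ∪ (A ∩ B) = A lets an optimisation pass shrink the tree:

```python
# CSG as point-membership predicates over implicit primitives.
def disc(cx, cy, r):
    return lambda x, y: (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2

def union(a, b):      return lambda x, y: a(x, y) or b(x, y)
def intersect(a, b):  return lambda x, y: a(x, y) and b(x, y)

A, B = disc(0, 0, 1), disc(2, 0, 1)
t1 = union(A, intersect(A, B))  # redundant subtree: A U (A n B)
t2 = A                          # equivalent tree with zero operations

# Both trees classify every sample point identically.
pts = [(x / 2, y / 2) for x in range(-6, 7) for y in range(-6, 7)]
print(all(t1(x, y) == t2(x, y) for x, y in pts))  # True
```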
28th International Conference on Computer Graphics, Visualization and Computer Vision (WSCG)
We introduce Q-Nash, a quantum annealing algorithm for the NP-complete problem of finding pure Nash equilibria in graphical games. The algorithm consists of two phases. The first phase determines all combinations of best response strategies for each player using classical computation. The second phase finds pure Nash equilibria using a quantum annealing device by mapping the computed combinations to a quadratic unconstrained binary optimization formulation based on the Set Cover problem. We empirically evaluate Q-Nash on D-Wave’s Quantum Annealer 2000Q using different graphical game topologies. The results with respect to solution quality and computing time are compared to a Brute Force algorithm and the Iterated Best Response heuristic.
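A toy version of the classical first phase (our illustration on a hypothetical 2x2 bimatrix game; in the paper, the best-response combinations would then be handed to the annealer):

```python
import itertools

# payoff[(r, c)] = (row player's payoff, column player's payoff)
payoff = {
    (0, 0): (3, 2), (0, 1): (0, 0),
    (1, 0): (1, 1), (1, 1): (2, 3),
}
strategies = (0, 1)

def is_pure_nash(r, c):
    # Each player's strategy must be a best response to the other's.
    row_ok = payoff[(r, c)][0] == max(payoff[(rr, c)][0] for rr in strategies)
    col_ok = payoff[(r, c)][1] == max(payoff[(r, cc)][1] for cc in strategies)
    return row_ok and col_ok

print([p for p in itertools.product(strategies, repeat=2) if is_pure_nash(*p)])
# -> [(0, 0), (1, 1)]; Q-Nash instead encodes the intersection of the
#    best-response combinations as a Set-Cover-style QUBO.
```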
Published in 20th International Conference on Computational Science (ICCS 2020), 2020, p. 12. doi:10.1007/978-3-030-50433-5_38
S. Feld, C. Roch, K. Geirhos, and T. Gabor
Archetypes are those extreme values of a data set that can jointly represent all other data points. They often have descriptive meanings and can thus contribute to the understanding of the data. Such archetypes are identified using archetypal analysis, and all data points are then represented as convex combinations of them. In this work, archetypal analysis is linked with quantum annealing. For both steps, i.e. the determination of archetypes and the assignment of data points, we derive a QUBO formulation which is solved on D-Wave’s 2000Q Quantum Annealer. For selected data sets (toy and iris), our quantum-annealing-based approach achieves results similar to those of the original R package archetypes.
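As a flavour of the kind of QUBO involved (a generic one-hot assignment sketch of ours, not the paper's exact derivation): assigning a data point to exactly one of k candidate archetypes via a quadratic penalty.

```python
import numpy as np

dists = np.array([0.2, 1.5, 0.9])  # hypothetical point-to-archetype distances
k, lam = len(dists), 5.0

# Minimise sum_j d_j x_j + lam * (sum_j x_j - 1)^2; expanding the penalty
# puts d_j - lam on the diagonal and lam on every off-diagonal entry.
Q = np.diag(dists) + lam * (np.ones((k, k)) - 2 * np.eye(k))

best = min(np.ndindex(*(2,) * k),
           key=lambda x: np.array(x) @ Q @ np.array(x))
print(best)  # -> (1, 0, 0): the closest archetype is chosen
```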
28th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2020)
S. Feld, C. Roch, T. Gabor, M. To, and C. Linnhoff-Popien
Dynamic Time Warping (DTW) is a distance measure that calculates the distance between two time series. It is often used for the recognition of handwriting or spoken language. The metaheuristic Quantum Annealing (QA) can be used to solve combinatorial optimization problems; similar to Simulated Annealing, it seeks a global minimum of a target function. In order to use specialized QA hardware, the problem to be optimized needs to be translated into a Quadratic Unconstrained Binary Optimization (QUBO) problem. In this paper we investigate whether the DTW distance measure can be transferred into a QUBO formulation. The motivation is twofold: the hope for accelerated execution once QA hardware scales up, and the aspiration of gaining benefits from quantum effects that are not available in the classical calculation. In principle, we find that it is possible to transform DTW into a QUBO formulation suitable for execution on QA hardware. Moreover, the algorithm returns not only the minimum total distance between two sequences but also the corresponding warping path. However, several difficulties make manual intervention necessary.
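For reference, the classical DTW dynamic program that the QUBO formulation replaces (a standard textbook sketch, not code from the paper):

```python
import numpy as np

def dtw(a, b):
    # D[i, j] = minimal accumulated cost of aligning a[:i] with b[:j].
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

print(dtw([0, 1, 2, 3], [0, 1, 1, 2, 3]))  # -> 0.0: the sequences warp onto each other
```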
IEEE 5th International Conference on Computer and Communication Systems (ICCCS 2020)
Irmengard Sax, Sebastian Feld, Sebastian Zielinski, Thomas Gabor, Claudia Linnhoff-Popien, Wolfgang Mauerer
Many problems of industrial interest are NP-complete and quickly exhaust the resources of computational devices with increasing input sizes. Quantum annealers (QA) are physical devices that aim at this class of problems by exploiting quantum mechanical properties of nature. However, they compete with efficient heuristics and probabilistic or randomised algorithms on classical machines that allow for finding approximate solutions to large NP-complete problems. While first implementations of QA have become commercially available, their practical benefits are far from fully explored. To the best of our knowledge, approximation techniques have not yet received substantial attention. In this paper, we explore how approximate versions of problems, of varying degree, can be systematically constructed for quantum annealer programs, and how this influences result quality and the handling of larger problem instances on a given set of qubits. We illustrate various approximation techniques, on both simulations and real QA hardware, on several seminal problems, and interpret the results to contribute towards a better understanding of the real-world power and limitations of current and future quantum computing.
Published by ACM, New York, NY, USA, 9 pages
S. Feld, M. Friedrich, and C. Linnhoff-Popien
The compression of geometry data is an important aspect of bandwidth-efficient data transfer for distributed 3D computer vision applications. We propose a quantum-enabled lossy 3D point cloud compression pipeline based on the constructive solid geometry (CSG) model representation. Key parts of the pipeline are mapped to NP-complete problems for which efficient Ising formulations suitable for execution on a quantum annealer exist. We describe existing Ising formulations for the maximum clique search problem and the smallest exact cover problem, both of which are important building blocks of the proposed compression pipeline. Additionally, we discuss the properties of the overall pipeline regarding result optimality and the described Ising formulations.
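The maximum-clique building block follows the standard QUBO pattern (a sketch of the usual Lucas-style formulation with a toy graph of our choosing, not the paper's exact matrix): reward selected vertices and penalise any selected non-adjacent pair.

```python
import itertools
import numpy as np

edges = {(0, 1), (0, 2), (1, 2), (2, 3)}     # toy graph; maximum clique {0, 1, 2}
n, A, B = 4, 1.0, 2.0                        # B > A so penalties dominate rewards

Q = -A * np.eye(n)
for i, j in itertools.combinations(range(n), 2):
    if (i, j) not in edges and (j, i) not in edges:
        Q[i, j] = Q[j, i] = B / 2            # each non-adjacent pair costs B

best = min(np.ndindex(*(2,) * n), key=lambda x: np.array(x) @ Q @ np.array(x))
print(best)  # -> (1, 1, 1, 0): the maximum clique
```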
IEEE Workshop on Quantum Communications and Information Technology 2018 (IEEE QCIT 2018), 2018, pp. 1-6
I. Sax, S. Feld, S. Zielinski, T. Gabor, C. Linnhoff-Popien, and W. Mauerer
Many industrially relevant problems can be deterministically solved by computers in principle, but are intractable in practice, as the seminal P/NP dichotomy of complexity theory and Cobham’s thesis testify. For the many NP-complete problems, industry needs to resort to heuristics or approximation algorithms. For approximation algorithms, there is a more refined classification into complexity classes that goes beyond the simple P/NP dichotomy. As is well known, approximation classes form a hierarchy, that is, FPTAS ⊆ PTAS ⊆ APX ⊆ NPO. This classification gives a more realistic notion of complexity but, unless unexpected breakthroughs happen for fundamental problems like P = NP or related questions, there is no known efficient algorithm that can solve such problems exactly on a realistic computer. Therefore, new ways of computation are sought. Recently, considerable hope was placed on the possible computational powers of quantum computers and quantum annealing (QA) in particular. However, the precise benefits of such a drastic shift in hardware are still uncharted territory to a good extent. Firstly, the exact relations between classical and quantum complexity classes pose many open questions, and secondly, technical details of formulating and implementing quantum algorithms play a crucial role in real-world applications. Guided by the hierarchy of classical optimisation complexity classes, we discuss how to map problems of each class to a quantum annealer. Those problems are the Minimum Multiprocessor Scheduling (MMS) problem, the Minimum Vertex Cover (MVC) problem and the Maximum Independent Set (MIS) problem. We experimentally investigate if and how the degree of approximability influences implementation and run-time performance. Our experiments indicate a discrepancy between classical approximation complexity and QA behaviour: MIS and MVC, members of APX and PTAS, respectively, exhibit better solution quality on a QA than MMS, which is in FPTAS, even despite the use of preprocessing for the latter. This leads to the hypothesis that traditional classifications do not immediately extend to the quantum annealing domain, at least when the properties of real-world devices are taken into account. A structural reason why FPTAS problems do not show good solution quality might be the use of inequalities in the problem description of the FPTAS problems. Formulating those inequalities on quantum hardware (mostly done by formulating a Quadratic Unconstrained Binary Optimisation (QUBO) problem in the form of a matrix) requires a lot of hardware space, which makes finding an optimal solution more difficult. Reducing the density of a QUBO is possible by appropriately pruning QUBO matrices. For the problems considered in our evaluation, we find that the achievable solution quality on a real-world machine is unexpectedly robust against pruning, often up to ratios as high as 50% or more. Since quantum annealers are probabilistic machines by design, the loss in solution quality is only of subordinate relevance, especially considering that the pruning of QUBO matrices allows for solving larger problem instances on hardware of a given capacity. We quantitatively discuss the interplay between these factors.
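A sketch of the pruning step as we read it (our illustration; the thresholding strategy and ratio are assumptions): zero out the smallest-magnitude off-diagonal couplings until the target fraction of the matrix is removed.

```python
import numpy as np

def prune(Q, ratio=0.5):
    # Zero the off-diagonal entries whose magnitude falls below the
    # ratio-quantile, reducing QUBO density while keeping strong couplings.
    Qp = Q.copy()
    diag = np.eye(len(Qp), dtype=bool)
    thresh = np.quantile(np.abs(Qp[~diag]), ratio)
    Qp[(np.abs(Qp) < thresh) & ~diag] = 0.0
    return Qp

rng = np.random.default_rng(0)
Q = rng.normal(size=(6, 6))
Q = (Q + Q.T) / 2                       # symmetric QUBO matrix
print(np.count_nonzero(prune(Q, 0.5)))  # roughly half the couplings removed
```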
1st International Symposium on Applied Artificial Intelligence (ISAAI’19)
Hybrid quantum-classical algorithms such as the Quantum Approximate Optimization Algorithm (QAOA) are considered one of the most encouraging approaches for taking advantage of near-term quantum computers in practical applications. Such algorithms are usually implemented in a variational form, combining a classical optimization method with a quantum machine to find good solutions to an optimization problem. The solution quality of QAOA depends to a high degree on the parameters chosen by the classical optimizer at each iteration. However, the solution landscape of those parameters is highly multi-dimensional and contains many low-quality local optima. In this study we apply a Cross-Entropy method to shape this landscape, which allows the classical optimizer to find better parameters more easily and hence results in improved performance. We empirically demonstrate that this approach can reach significantly better solution quality for the Knapsack Problem.
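A minimal cross-entropy loop on a stand-in objective (our sketch; in the paper the objective would be the QAOA expectation value returned by the quantum machine for the parameters (gamma, beta)):

```python
import numpy as np

def cross_entropy(objective, dim, iters=30, pop=50, elite=10, seed=0):
    # Sample parameters from a Gaussian, keep the elite fraction,
    # refit mean and std to the elites, repeat (minimisation).
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.ones(dim)
    for _ in range(iters):
        samples = rng.normal(mu, sigma, size=(pop, dim))
        scores = np.array([objective(s) for s in samples])
        best = samples[np.argsort(scores)[:elite]]
        mu, sigma = best.mean(axis=0), best.std(axis=0) + 1e-6
    return mu

# Stand-in multimodal landscape mimicking a rugged QAOA parameter surface.
f = lambda p: np.sum(p ** 2) + np.sin(5 * p).sum()
print(cross_entropy(f, dim=2))
```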
Accepted for publication, arXiv preprint arXiv:2003.05292 (2020)