Quantum Machine Learning - Peter Wittek

Quantum Machine Learning (eBook)

What Quantum Computing Means to Data Mining

(Author)

eBook Download: PDF | EPUB
2014 | 1st edition
176 pages
Elsevier Science (publisher)
978-0-12-801099-0 (ISBN)
System requirements
71.95 incl. VAT
  • Download available immediately
Quantum Machine Learning bridges the gap between abstract developments in quantum computing and applied research on machine learning. Paring down the complexity of the disciplines involved, it focuses on providing a synthesis that explains the most important machine learning algorithms in a quantum framework. Theoretical advances in quantum computing are hard to follow for computer scientists, and sometimes even for researchers involved in the field. The lack of a step-by-step guide hampers the broader understanding of this emergent interdisciplinary body of research. Quantum Machine Learning sets the scene for a deeper understanding of the subject for readers of different backgrounds. The author has carefully constructed a clear comparison of classical learning algorithms and their quantum counterparts, thus making differences in computational complexity and learning performance apparent. This book synthesizes a broad array of research into a manageable and concise presentation, with practical examples and applications.
- Bridges the gap between abstract developments in quantum computing and applied research on machine learning
- Provides the theoretical minimum of machine learning, quantum mechanics, and quantum computing
- Gives step-by-step guidance toward a broader understanding of this emergent interdisciplinary body of research

Peter Wittek received his PhD in Computer Science from the National University of Singapore, and he also holds an MSc in Mathematics. He is interested in interdisciplinary synergies, such as scalable learning algorithms on supercomputers, computational methods in quantum simulations, and quantum machine learning. He collaborated on these topics during research stints to various institutions, including the Indian Institute of Science, Barcelona Supercomputing Center, Bangor University, Tsinghua University, the Centre for Quantum Technologies, and the Institute of Photonic Sciences. He has been involved in major EU research projects, and obtained several academic and industry grants.

Front Cover 4
Quantum Machine Learning: What Quantum Computing Means to Data Mining 4
Copyright 5
Contents 6
Preface 10
Notations 12
Part One Fundamental Concepts 14
Chapter 1: Introduction 16 (Sections 1.1-1.5)
Chapter 2: Machine Learning 24 (Sections 2.1-2.7)
Chapter 3: Quantum Mechanics 38 (Sections 3.1-3.9)
Chapter 4: 54 (Sections 4.1-4.7)
Part Two Classical Learning Algorithms 68
Chapter 5: 70 (Sections 5.1-5.5)
Chapter 6: 76 (Sections 6.1-6.5)
Chapter 7: 86 (Sections 7.1-7.9)
Chapter 8: 98 (Sections 8.1-8.4)
Chapter 9: 102 (Sections 9.1-9.4)
Part Three Quantum Computing and Machine Learning 110
Chapter 10: 112 (Sections 10.1-10.8)
Chapter 11: 122 (Sections 11.1-11.4)
Chapter 12: 132 (Sections 12.1-12.4)
Chapter 13: 138 (Sections 13.1-13.7)
Chapter 14: 152 (Sections 14.1-14.8)
Bibliography 166

Chapter 1: Introduction


Abstract


Why should we look at quantum computing in machine learning? Apart from a speedup and increased storage capacity, quantum computing has further benefits for machine learning algorithms. Learning models lie at the core of data mining, a complex process of extracting meaningful information from large volumes of data. We expect a machine learning model to generalize well beyond a limited training collection. On classical computers, we are constrained by convexity conditions or heuristics to keep computational time under control. Quantum computers do not suffer from these issues in optimization problems; hence, we can achieve better generalization performance. Classical computers excel at other tasks; hence, a heterogeneous model is likely to prevail. Dozens of approaches have already been published on quantum machine learning—we briefly overview the major ones. We further mention classical algorithms that borrow metaphors from quantum mechanics; this is also a prolific area of research.

Keywords

Quantum computing

Machine learning

Data mining

Optimization

Convexity

Nonconvex problems

Quantum speedup

The quest of machine learning is ambitious: the discipline seeks to understand what learning is, and studies how algorithms approximate learning. Quantum machine learning takes these ambitions a step further: quantum computing enlists the help of nature at a subatomic level to aid the learning process.

Machine learning is based on minimizing a constrained multivariate function, and these algorithms are at the core of data mining and data visualization techniques. The result of the optimization is a decision function that maps input points to output points. While this view on machine learning is simplistic, and exceptions are countless, some form of optimization is always central to learning theory.
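This view can be made concrete with a minimal sketch (a hypothetical toy example, not from the book): learning a linear decision function by gradient descent on a squared-error objective.

```python
import numpy as np

# Toy training set: inputs x and targets y from a noisy line y = 2x + 0.5.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 50)
y = 2.0 * x + 0.5 + rng.normal(0, 0.1, 50)

# Learning as optimization: minimize the mean squared error over (w, b).
w, b = 0.0, 0.0
for _ in range(500):
    err = w * x + b - y
    w -= 0.1 * np.mean(err * x)   # gradient of the objective w.r.t. w
    b -= 0.1 * np.mean(err)       # gradient of the objective w.r.t. b

# The result of the optimization is a decision function mapping inputs to outputs.
decision = lambda x_new: w * x_new + b
print(w, b)
```

The learned parameters recover the generating line up to noise, illustrating how an optimization procedure yields the decision function.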

The idea of using quantum mechanics for computations stems from simulating such systems. Feynman (1982) noted that simulating quantum systems on classical computers becomes unfeasible as soon as the system size increases, whereas quantum particles would not suffer from similar constraints. Deutsch (1985) generalized the idea. He noted that quantum computers are universal Turing machines, and that quantum parallelism implies that certain probabilistic tasks can be performed faster than by any classical means.

Today, quantum information has three main specializations: quantum computing, quantum information theory, and quantum cryptography (Fuchs, 2002, p. 49). We are not concerned with quantum cryptography, which primarily deals with secure exchange of information. Quantum information theory studies the storage and transmission of information encoded in quantum states; we rely on some concepts such as quantum channels and quantum process tomography. Our primary focus, however, is quantum computing, the field of inquiry that uses quantum phenomena such as superposition, entanglement, and interference to operate on data represented by quantum states.

Algorithms of importance emerged a decade after the first proposals of quantum computing appeared. Shor (1997) introduced a method to factorize integers exponentially faster, and Grover (1996) presented an algorithm to find an element in an unordered data set quadratically faster than the classical limit. One would have expected a slew of new quantum algorithms after these pioneering articles, but the task proved hard (Bacon and van Dam, 2010). Part of the reason is that we now expect a quantum algorithm to be faster: we see no value in a quantum algorithm with the same computational complexity as a known classical one. Furthermore, even with these spectacular speedups, there is evidence that quantum computers cannot solve NP-complete problems in subexponential time (Bennett et al., 1997).
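The quadratic gap of Grover's algorithm can be made tangible with a back-of-the-envelope query count. The sketch below uses the standard formulas for expected classical probes and the optimal number of Grover iterations; it is an arithmetic illustration, not a simulation of the algorithm.

```python
import math

def classical_expected_queries(n):
    # A linear scan finds a uniformly placed item after ~n/2 probes on average.
    return n / 2

def grover_iterations(n):
    # Optimal number of Grover iterations for one marked item among n.
    return math.ceil(math.pi / 4 * math.sqrt(n))

for n in (10**3, 10**6, 10**9):
    print(n, classical_expected_queries(n), grover_iterations(n))
```

For a million items, the classical scan needs about half a million probes on average, while Grover's algorithm needs fewer than eight hundred iterations.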

While universal quantum computers remain out of reach, small-scale experiments implementing a few qubits are operational. In addition, quantum computers restricted to domain problems are becoming feasible. For instance, experimental validation of combinatorial optimization on over 500 binary variables on an adiabatic quantum computer showed considerable speedup over optimized classical implementations (McGeoch and Wang, 2013). The result is controversial, however (Rønnow et al., 2014).

Recent advances in quantum information theory indicate that machine learning may benefit from various paradigms of the field. For instance, adiabatic quantum computing finds the minimum of a multivariate function by a controlled physical process using the adiabatic theorem (Farhi et al., 2000). The function is translated to a physical description, the Hamiltonian operator of a quantum system. Then, a system with a simple Hamiltonian is prepared and initialized to the ground state, the lowest energy state a quantum system can occupy. Finally, the simple Hamiltonian is evolved to the target Hamiltonian, and, by the adiabatic theorem, the system remains in the ground state. At the end of the process, the solution is read out from the system, and we obtain the global optimum for the function in question.
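The steps above can be sketched numerically in a toy two-level system (made-up matrices, purely illustrative; real adiabatic hardware evolves a physical system rather than diagonalizing matrices).

```python
import numpy as np

# Simple Hamiltonian H0 whose ground state, (|0> + |1>)/sqrt(2), is easy to prepare.
H0 = -np.array([[0.0, 1.0], [1.0, 0.0]])
# Target Hamiltonian H1 encoding the function to minimize on its diagonal:
# f(0) = 3, f(1) = 1, so the minimizer is index 1.
H1 = np.diag([3.0, 1.0])

# Slowly interpolate from H0 to H1; by the adiabatic theorem a slow enough
# physical evolution keeps the system in the ground state of H(s).
for s in np.linspace(0.0, 1.0, 5):
    H = (1 - s) * H0 + s * H1
    energies, states = np.linalg.eigh(H)   # eigenvalues in ascending order
    print(f"s={s:.2f}  ground energy={energies[0]:.3f}")

# At s = 1 the ground state concentrates on the minimizer of f.
_, states = np.linalg.eigh(H1)
best = int(np.argmax(np.abs(states[:, 0])))
print("minimizer index:", best)
```

Reading out the final ground state recovers the global minimum of the encoded function, which is the essence of the adiabatic approach to optimization.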

While more and more articles that explore the intersection of quantum computing and machine learning are being published, the field is fragmented, as was already noted over a decade ago (Bonner and Freivalds, 2002). This should not come as a surprise: machine learning itself is a diverse and fragmented field of inquiry. We attempt to identify common algorithms and trends, and observe the subtle interplay between faster execution and improved performance in machine learning by quantum computing.

As an example of this interplay, consider convexity: it is often considered a virtue in machine learning. Convex optimization problems do not get stuck in local extrema, they reach a global optimum, and they are not sensitive to initial conditions. Furthermore, convex methods have easy-to-understand analytical characteristics, and theoretical bounds on convergence and other properties are easier to derive. Nonconvex optimization, on the other hand, is a forte of quantum methods. Algorithms on classical hardware rely on gradient descent or similar iterative methods, which may stall in local optima far from the global one. Quantum algorithms approach the optimum through an entirely different, more physical process, and they are not bound by convexity restrictions. Nonconvexity, in turn, has great advantages for learning: sparser models ensure better generalization performance, and nonconvex objective functions are less sensitive to noise and outliers. For this reason, numerous approaches and heuristics exist for nonconvex optimization on classical hardware, and these problems might prove easier and faster to solve by quantum computing.
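A tiny numerical sketch (hypothetical objective, not from the book) shows the initialization sensitivity that makes nonconvex optimization hard classically: gradient descent on a quartic with two local minima lands in a different minimum depending on where it starts.

```python
# Nonconvex objective with two local minima and its derivative.
f  = lambda x: x**4 - 3 * x**2 + x
df = lambda x: 4 * x**3 - 6 * x + 1

def descend(x, lr=0.01, steps=2000):
    # Plain gradient descent: follow the negative gradient from x.
    for _ in range(steps):
        x -= lr * df(x)
    return x

left, right = descend(-2.0), descend(2.0)
# Different starting points converge to different local minima;
# only one of them is the global optimum.
print(left, f(left))
print(right, f(right))
```

Starting at -2 finds the deeper (global) minimum, while starting at 2 gets stuck in the shallower one: exactly the failure mode convexity rules out.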

As in the case of computational complexity, we can establish limits on the performance of quantum learning compared with its classical flavor. Quantum learning is not more powerful than classical learning from an information-theoretic perspective, up to polynomial factors (Servedio and Gortler, 2004). On the other hand, there are clear computational advantages: certain concept classes are polynomial-time exact-learnable from quantum membership queries, but they are not polynomial-time learnable from classical membership queries (Servedio and Gortler, 2004). Moreover, some quantum machine learning algorithms take time logarithmic in both the number of vectors and their dimension, an exponential speedup over classical algorithms, but at the price of requiring both quantum input and quantum output (Lloyd et al., 2013a).

1.1 Learning Theory and Data Mining


Machine learning revolves around algorithms, model complexity, and computational complexity. Data mining is a related field, but its focus is different. The goal is similar, identifying patterns in large data sets, but aside from the raw analysis, data mining encompasses a broader spectrum of data-processing steps. Thus, data mining borrows methods from statistics and algorithms from machine learning, information retrieval, visualization, and distributed computing, but it also relies on concepts familiar from databases and data management. In some contexts, data mining includes any form of large-scale information processing.

In this way, data mining is more applied than machine learning. It is closer to what practitioners would find useful. Data may come from any number of sources: business, science, engineering, sensor networks, medical applications, spatial information, and surveillance, to mention just a few. Making sense of the data deluge is the primary target of data mining.

Data mining is a natural step in the evolution of information systems. Early database systems allowed the storing and querying of data, but analytic functionality was limited. As databases grew, a need for automatic analysis emerged. At the same time, the amount of unstructured information—text, images, video, music—exploded. Data mining is meant to fill the role of analyzing and understanding both structured and unstructured data collections, whether they are in databases or stored in some other form.

Machine learning often takes a restricted view of data: algorithms assume either a geometric perspective, treating data instances as vectors, or a probabilistic one, where data instances are multivariate random variables. Data mining involves the preprocessing steps that extract these views from raw data.
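The two views can be contrasted on the same toy data (hypothetical values, for illustration only): the geometric view compares instances as vectors, while the probabilistic view summarizes them as samples of a multivariate random variable.

```python
import numpy as np

# Three data instances with three features each (made-up toy data).
X = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [2.0, 1.0, 0.0]])

# Geometric view: instances are vectors compared by distances or angles.
dist = np.linalg.norm(X[0] - X[1])

# Probabilistic view: instances are realizations of a multivariate random
# variable, summarized by estimated moments.
mean = X.mean(axis=0)
cov = np.cov(X, rowvar=False)

print(dist, mean)
```

Both views feed the same downstream learning algorithms; data mining is concerned with producing these representations from raw sources in the first place.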

For instance, in text mining, that is, data mining aimed at unstructured text documents, the initial step builds a vector space from the documents. This step starts...

Publication date (per publisher) September 10, 2014
Language English
Subject areas Computer Science › Theory / Studies › Artificial Intelligence / Robotics
Natural Sciences › Physics / Astronomy › Quantum Physics
Technology
ISBN-10 0-12-801099-1 / 0128010991
ISBN-13 978-0-12-801099-0 / 9780128010990
PDF (Adobe DRM)
Size: 3.2 MB

Copy protection: Adobe DRM
Adobe DRM is a copy-protection scheme intended to protect the eBook from misuse. The eBook is authorized to your personal Adobe ID at download; you can then read it only on devices registered to that Adobe ID.

File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly suited to technical books with columns, tables, and figures. A PDF can be displayed on almost all devices, but it is only of limited use on small displays (smartphone, eReader).

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You need an Adobe ID and the free Adobe Digital Editions software. We advise against using the OverDrive Media Console, as it frequently causes problems with Adobe DRM.
eReader: This eBook can be read on (almost) all eBook readers, but it is not compatible with the Amazon Kindle.
Smartphone/tablet: Whether Apple or Android, you can read this eBook. You need an Adobe ID and a free app.

Buying eBooks from abroad
For tax-law reasons we can sell eBooks only within Germany and Switzerland; regrettably, we cannot fulfill eBook orders from other countries.

EPUB (Adobe DRM)
Size: 4.8 MB

Copy protection: Adobe DRM

File format: EPUB (Electronic Publication)
EPUB is an open standard for eBooks, particularly suited to fiction and general non-fiction. The text reflows dynamically to the display and font size, so EPUB is also well suited to mobile reading devices.

