Machine Learning under Resource Constraints / Machine Learning under Resource Constraints - Fundamentals

Book | Softcover
XIV, 491 pages
2022
De Gruyter (publisher)
978-3-11-078593-7 (ISBN)
87.95 incl. VAT
Machine learning has been part of Artificial Intelligence since its beginning. After all, only a perfect being could show intelligent behavior without learning; all others, be they humans or machines, need to learn in order to enhance their capabilities. In the eighties of the last century, learning from examples and modeling human learning strategies were investigated in concert. The formal statistical basis of many learning methods was put forward later and is still an integral part of machine learning. Neural networks have always been in the toolbox of methods. Integrating all the pre-processing, exploitation of kernel functions, and transformation steps of a machine learning process into the architecture of a deep neural network increased the performance of this model type considerably.

Modern machine learning is challenged on the one hand by the amount of data and on the other hand by the demand for real-time inference. This leads to an interest in computing architectures and modern processors. For a long time, machine learning research could take the von Neumann architecture for granted: all algorithms were designed for the classical CPU, and issues of implementation on a particular architecture were ignored. This is no longer possible. The time for investigating machine learning and computational architecture independently is over. Computing architecture has experienced a similarly rampant development, from mainframes and personal computers in the last century to very large compute clusters on the one hand and ubiquitous computing by embedded systems in the Internet of Things on the other. The sensors of cyber-physical systems produce huge amounts of streaming data which need to be stored and analyzed, and their actuators need to react in real time. This clearly establishes a close connection with machine learning.
Cyber-physical systems and systems in the Internet of Things consist of diverse components that are heterogeneous in both hardware and software. Modern multi-core systems, graphics processors, memory technologies, and hardware-software co-design offer opportunities for better implementations of machine learning models. Together, machine learning and embedded systems now form a field of research that tackles leading-edge problems in machine learning, algorithm engineering, and embedded systems. Machine learning today needs to make the resource demands of learning and inference meet the resource constraints of the computer architectures and platforms in use. A large variety of algorithms for the same learning method and, moreover, diverse implementations of an algorithm for particular computing architectures optimize learning with respect to resource efficiency while keeping some guarantees of accuracy. The trade-off between decreased energy consumption and an increased error rate, to give just one example, needs to be shown theoretically both for training a model and for model inference. Pruning and quantization are ways of reducing the resource requirements by either compressing or approximating the model. In addition to memory and energy consumption, timeliness is an important issue, since many embedded systems are integrated into larger products that interact with the physical world. If results are delivered too late, they may have become useless; as a consequence, real-time guarantees are needed for such systems. To efficiently utilize the available resources, e.g., processing power, memory, and accelerators, with respect to response time, energy consumption, and power dissipation, different scheduling algorithms and resource management strategies need to be developed. This book series addresses machine learning under resource constraints as well as the application of the described methods in various domains of science and engineering.
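To make the quantization idea concrete: the following is a minimal, hypothetical sketch in plain Python (an illustration chosen for this summary, not the book's specific method). Symmetric 8-bit quantization maps each floating-point weight to an int8 value via a single scale factor, cutting memory per weight from 4 or 8 bytes to 1 at the cost of a bounded rounding error.

```python
# Minimal sketch of symmetric post-training 8-bit quantization
# (illustrative assumption, not taken from the book).

def quantize_int8(weights):
    """Map a list of floats to the int8 range [-128, 127] with one scale factor."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Approximately reconstruct the original floats."""
    return [v * scale for v in q]

weights = [0.42, -1.3, 0.005, 0.77, -0.21]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# Rounding introduces an error of at most scale / 2 per weight --
# exactly the accuracy-vs-resources trade-off discussed above.
max_err = max(abs(w - r) for w, r in zip(weights, restored))
assert max_err <= scale / 2 + 1e-12
```

Pruning is complementary: instead of shrinking each weight's representation, it removes weights (or whole structures) whose contribution is negligible, so the two techniques are often combined.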
Turning big data into smart data requires many steps of data analysis: methods for extracting and selecting features, filtering and cleaning the data, joining heterogeneous sources, aggregating the data, and learning predictions need to scale up. The algorithms are challenged on the one hand by high-throughput data and gigantic data sets, as in astrophysics, and on the other hand by high dimensionality, as in genetic data. Resource constraints are given by the relation between the demands of processing the data and the capacity of the computing machinery; the resources are runtime, memory, communication, and energy. Novel machine learning algorithms are optimized with regard to minimal resource consumption. Moreover, learned predictions are applied to program executions in order to save resources.

The three books have the following subtopics:

Volume 1: Machine Learning under Resource Constraints - Fundamentals
Volume 2: Machine Learning and Physics under Resource Constraints - Discovery
Volume 3: Machine Learning under Resource Constraints - Applications

Volume 1 establishes the foundations of this new field (Machine Learning under Resource Constraints). It goes through all the steps from data collection, their summary and clustering, to the different aspects of resource-aware learning, i.e., hardware, memory, energy, and communication awareness. Several machine learning methods are inspected with respect to their resource requirements and how to enhance their scalability on diverse computing architectures, ranging from embedded systems to large computing clusters.

Katharina Morik received her doctorate from the University of Hamburg in 1981 and her habilitation from the TU Berlin in 1988. In 1991, she established the chair of Artificial Intelligence at TU Dortmund University. She is a pioneer of machine learning, contributing substantially to inductive logic programming, support vector machines, and probabilistic graphical models. In 2011, she acquired the Collaborative Research Center SFB 876 "Providing Information by Resource-Constrained Data Analysis", of which she is the spokesperson. The center brings machine learning and computing architectures together so that machine learning models may be executed or even trained on resource-restricted devices. It consists of 12 projects and a graduate school for more than 50 Ph.D. students. She is a spokesperson of the Competence Center for Machine Learning Rhein Ruhr (ML2R) and coordinator of the German competence centers for AI. She is the author of more than 200 publications in prestigious journals and conferences. She was a founding member, Program Chair, and Vice Chair of the IEEE International Conference on Data Mining (ICDM), and is a member of the steering committee of ECML PKDD, of which she was also Program Chair. Together with Volker Markl, Katharina Morik heads the working group "Technological Pioneers" of the platform "Learning Systems and Data Science" of the BMBF. Prof. Morik has been a member of the Academy of Technical Sciences since 2015 and of the North Rhine-Westphalian Academy of Sciences and Arts since 2016. She was named a Fellow of the German Informatics Society (GI e.V.) in 2019.

Dr. Peter Marwedel studied physics at the University of Kiel, Germany, and received his PhD in physics in 1974. As a post-doc, he published some of the first papers on high-level synthesis and retargetable compilation in the context of the MIMOLA hardware description language. In 1987, his habilitation thesis in computer science was accepted. He has been a professor of computer engineering at TU Dortmund since 1989, and he chairs ICD, a local spin-off of TU Dortmund. His research interests include design automation for embedded systems, in particular the generation of efficient embedded software, with a focus on energy efficiency and timing predictability. Dr. Marwedel has published papers on energy-efficient and timing-predictable software, including compiler-supported use of scratchpad memories. He is the author of one of the few textbooks on embedded systems; the book is complemented by videos available on YouTube and by publicly available slides. Since 2011, he has served as the vice-chair of the Collaborative Research Center SFB 876, which aims at resource-efficient analysis of large data sets. Dr. Marwedel is an IEEE Fellow. He received the EDAA Lifetime Achievement Award in 2013 and the ESWEEK Lifetime Achievement Award in 2014.

Series: De Gruyter STEM
Machine Learning under Resource Constraints; Volume 1
Additional info: 200 b/w and 100 color ill., 50 b/w and 50 color tbl.
Place of publication: Berlin/Boston
Language: English
Dimensions: 170 x 240 mm
Weight: 843 g
Subject area: Computer Science · Databases · Data Warehouse / Data Mining
Keywords: Artificial Intelligence • Big Data • Big Data and Machine Learning • Cyber-Physical Systems • Data Mining for Ubiquitous System Software • Embedded Systems • Embedded Systems and Machine Learning • Highly Distributed Data • Machine Learning • Machine Learning for Knowledge Discovery • Machine Learning in High-Energy Physics • ML on Small Devices • Resource-Aware Machine Learning • Resource-Constrained Data Analysis
ISBN-10 3-11-078593-5 / 3110785935
ISBN-13 978-3-11-078593-7 / 9783110785937
Condition: New
More from this subject area
Datenanalyse für Künstliche Intelligenz

by Jürgen Cleve; Uwe Lämmel

Book | Softcover (2024)
De Gruyter Oldenbourg (publisher)
74.95
Daten importieren, bereinigen, umformen und visualisieren

by Hadley Wickham; Mine Çetinkaya-Rundel …

Book | Softcover (2024)
O'Reilly (publisher)
54.90