Statistical Field Theory for Neural Networks - Moritz Helias, David Dahmen

Book | Softcover
XVII, 203 pages
2020 | 1st ed. 2020
Springer International Publishing (publisher)
978-3-030-46443-1 (ISBN)
80,24 incl. VAT

This book presents a self-contained introduction to techniques from field theory applied to stochastic and collective dynamics in neuronal networks. These powerful analytical techniques, well established in other fields of physics, are the basis of current developments and offer solutions to pressing open problems in theoretical neuroscience and machine learning. They enable a systematic and quantitative understanding of the dynamics in recurrent and stochastic neuronal networks.

This book is intended for physicists, mathematicians, and computer scientists. It is designed for self-study by researchers entering the field, or as the main text for a one-semester course at the advanced undergraduate or graduate level. The theoretical concepts are developed systematically from the very beginning, requiring only basic knowledge of analysis and linear algebra.

Moritz Helias is group leader at the Jülich Research Centre and assistant professor in the department of physics of RWTH Aachen University, Germany. He obtained his diploma in theoretical solid state physics at the University of Hamburg and his PhD in computational neuroscience at the University of Freiburg, Germany. Post-doctoral positions at RIKEN (Wako, Japan) and the Jülich Research Centre followed. His main research interests are neuronal network dynamics and function, and their quantitative analysis with tools from statistical physics and field theory.

David Dahmen is a post-doctoral researcher at the Institute of Neuroscience and Medicine at the Jülich Research Centre, Germany. He obtained his Master's degree in physics from RWTH Aachen University, Germany, working on effective field theory approaches to particle physics. Afterwards he moved to the field of computational neuroscience, where he received his PhD in 2017. His research comprises modeling, analysis, and simulation of recurrent neuronal networks, with special focus on the development and knowledge transfer of mathematical tools and simulation concepts. His main interests are field-theoretic methods for random neural networks, correlations in recurrent networks, and modeling of the local field potential.

Contents:
Introduction
Probabilities, moments, cumulants
Gaussian distribution and Wick's theorem
Perturbation expansion
Linked cluster theorem
Functional preliminaries
Functional formulation of stochastic differential equations
Ornstein-Uhlenbeck process: The free Gaussian theory
Perturbation theory for stochastic differential equations
Dynamic mean-field theory for random networks
Vertex generating function
Application: TAP approximation
Expansion of cumulants into tree diagrams of vertex functions
Loopwise expansion of the effective action - Tree level
Loopwise expansion in the MSRDJ formalism
Nomenclature
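One chapter treats the Ornstein-Uhlenbeck process as the free Gaussian theory. As a quick illustration of the kind of stochastic dynamics covered (this sketch is not from the book; the function name `simulate_ou` and all parameter values are illustrative assumptions), the process dX = -theta*X dt + sigma dW can be simulated with a simple Euler-Maruyama scheme:

```python
import numpy as np

def simulate_ou(theta=1.0, sigma=0.5, x0=0.0, dt=1e-3,
                n_steps=20000, n_paths=2000, seed=0):
    """Euler-Maruyama simulation of dX = -theta*X dt + sigma dW.

    Returns the final state of n_paths independent trajectories.
    """
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, x0, dtype=float)
    sqrt_dt = np.sqrt(dt)
    for _ in range(n_steps):
        # deterministic drift toward zero plus Gaussian noise increment
        x += -theta * x * dt + sigma * sqrt_dt * rng.standard_normal(n_paths)
    return x

final = simulate_ou()
# The stationary variance of the OU process is sigma^2 / (2*theta),
# here 0.125; the empirical variance of `final` should be close to it.
print(final.mean(), final.var())
```

After 20 time units with theta = 1, the ensemble is well equilibrated, so the sample mean is near zero and the sample variance near the stationary value sigma^2/(2*theta).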

Publication date
Series: Lecture Notes in Physics
Additional info: XVII, 203 p. 127 illus., 5 illus. in color.
Place of publication: Cham
Language: English
Dimensions: 155 x 235 mm
Weight: 349 g
Subject area: Natural Sciences > Physics / Astronomy > Theoretical Physics
Natural Sciences > Physics / Astronomy > Thermodynamics
Keywords: Chaotic Network Dynamics • Correlated neuronal activity • Diagrammatic techniques • Dynamic mean-field theory • neuronal networks • Statistical Physics
ISBN-10: 3-030-46443-1 / 3030464431
ISBN-13: 978-3-030-46443-1 / 9783030464431
Condition: New