Generalized Normalizing Flows via Markov Chains
2023
Cambridge University Press (publisher)
978-1-009-33100-5 (ISBN)
Normalizing flows, diffusion normalizing flows and variational autoencoders are powerful generative models. This Element provides a unified framework to handle these approaches via Markov chains. The authors consider stochastic normalizing flows as a pair of Markov chains fulfilling certain properties, and show how many state-of-the-art models for data generation fit into this framework. Indeed, numerical simulations show that including stochastic layers improves the expressivity of the network and allows for generating multimodal distributions from unimodal ones. The Markov-chain point of view enables the coupling of deterministic layers, such as invertible neural networks, with stochastic layers, such as Metropolis-Hastings layers, Langevin layers, variational autoencoders and diffusion normalizing flows, in a mathematically sound way. The authors' framework establishes a useful mathematical tool for combining the various approaches.
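To illustrate the Markov-chain view described above, here is a minimal Python sketch that alternates a deterministic invertible layer with a stochastic Langevin layer on a toy two-dimensional bimodal target. All names (`coupling_layer`, `langevin_layer`, `log_target`), the affine coupling, and the unadjusted Langevin steps are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a stochastic normalizing flow as a Markov chain on a toy
# 2-D target. Layer choices are illustrative, not taken from the Element.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    """Unnormalized log-density of a toy bimodal target (mixture of two Gaussians)."""
    a = -0.5 * np.sum((x - 2.0) ** 2, axis=-1)
    b = -0.5 * np.sum((x + 2.0) ** 2, axis=-1)
    return np.logaddexp(a, b)

def coupling_layer(x, scale=0.5, shift=1.0):
    """Deterministic invertible layer: affine coupling acting on the second coordinate."""
    y = x.copy()
    y[:, 1] = x[:, 1] * np.exp(scale * np.tanh(x[:, 0])) + shift * np.tanh(x[:, 0])
    return y

def langevin_layer(x, log_density, step=0.05, n_steps=10):
    """Stochastic layer: a few unadjusted Langevin steps towards log_density."""
    eps = 1e-4
    for _ in range(n_steps):
        # Finite-difference gradient of the log-density, coordinate by coordinate.
        grad = np.zeros_like(x)
        for d in range(x.shape[1]):
            e = np.zeros(x.shape[1])
            e[d] = eps
            grad[:, d] = (log_density(x + e) - log_density(x - e)) / (2 * eps)
        x = x + step * grad + np.sqrt(2 * step) * rng.standard_normal(x.shape)
    return x

# Forward Markov chain: latent samples -> deterministic layer -> stochastic layer.
z = rng.standard_normal((1000, 2))      # unimodal latent distribution
x = coupling_layer(z)                   # deterministic, invertible transport
x = langevin_layer(x, log_target)       # stochastic refinement towards the target

# The stochastic layer lets samples spread over both modes of the target,
# which a single deterministic map from a unimodal latent struggles to do.
print("fraction of samples in the positive mode:", np.mean(x[:, 0] > 0))
```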
1. Introduction; 2. Preliminaries; 3. Normalizing Flows; 4. Stochastic Normalizing Flows; 5. Stochastic Layers; 6. Conditional Generative Modeling; 7. Numerical Results; 8. Conclusions and Open Questions; References.
| Publication date | 23.01.2023 |
|---|---|
| Series | Elements in Non-local Data Interactions: Foundations and Applications |
| Additional information | Worked examples or exercises |
| Place of publication | Cambridge |
| Language | English |
| Dimensions | 152 x 229 mm |
| Weight | 94 g |
| Subject area | Mathematics / Computer Science ► Mathematics ► Probability / Combinatorics |
| ISBN-10 | 1-009-33100-0 / 1009331000 |
| ISBN-13 | 978-1-009-33100-5 / 9781009331005 |
| Condition | New |