Information-Theoretic Aspects of Neural Networks
1999
CRC Press Inc (publisher)
978-0-8493-3198-5 (ISBN)
Intended for engineers and computer scientists working in the field of artificial neural networks, this book presents insight as well as different perspectives on information-processing as it relates to real and artificial networks. It provides alternative strategies for designing and understanding complex neural networks.
Information theory, as applied to neural networks, generally encompasses parametric entities and conceptual bases pertinent to memory and information storage, information-theoretic cost functions, and neurocybernetics and self-organization. Existing studies cover the entropy and cybernetic aspects of neural information only sparsely.
Information-Theoretic Aspects of Neural Networks cohesively explores this burgeoning discipline, covering topics such as:
Shannon information and information dynamics
neural complexity as an information processing system
memory and information storage in the interconnected neural web
extremum (maximum and minimum) information entropy
neural network training
non-conventional, statistical distance-measures for neural network optimizations
symmetric and asymmetric characteristics of information-theoretic error-metrics
algorithmic complexity based representation of neural information-theoretic parameters
genetic algorithms versus neural information
dynamics of neurocybernetics viewed in the information-theoretic plane
nonlinear, information-theoretic transfer function of the neural cellular units
statistical mechanics, neural networks, and information theory
semiotic framework of neural information processing and neural information flow
fuzzy information and neural networks
neural dynamics conceived through fuzzy information parameters
neural information flow dynamics
informatics of neural stochastic resonance
Information-Theoretic Aspects of Neural Networks acts as an exceptional resource for engineers, scientists, and computer scientists working in the field of artificial neural networks as well as biologists applying the concepts of communication theory and protocols to the functioning of the brain. The information in this book explores new avenues in the field and creates a common platform for analyzing the neural complex as well as artificial neural networks.
P. S. Neelakanta
Contents:
Introduction
Neural Complex: A Nonlinear CI System?
Neural Complex vis-a-vis Statistical Mechanics, Entropy, Thermodynamics and Information Theory
Neural Communication and Control in Information-Theoretic Plane
Neural Complexity: An Algorithmic Representation
Neural Information Dynamics
Semiotic Framework of Neural Information Processing
Genetic Algorithmic Based Depiction of Neural Information
Epilogue
Appendix
Publication date (per publisher) | 30.3.1999 |
---|---|
Additional info | 14 tables, black and white |
Place of publication | Bosa Roca |
Language | English |
Dimensions | 156 x 234 mm |
Weight | 743 g |
Subject area | Computer Science ► Theory / Studies ► Artificial Intelligence / Robotics |
Engineering ► Communications Engineering | |
ISBN-10 | 0-8493-3198-6 / 0849331986 |
ISBN-13 | 978-0-8493-3198-5 / 9780849331985 |
Condition | New |