Formal Theories of Information
Springer Berlin (publisher)
978-3-642-00658-6 (ISBN)
It is commonly assumed that computers process information. But what is information? In a technical, important, but nevertheless rather narrow sense, Shannon's information theory gives a first answer to this question. This theory focuses on measuring the information content of a message. Essentially, this measure is the reduction of uncertainty obtained by receiving a message. The uncertainty of a situation of ignorance, in turn, is measured by entropy. This theory has had an immense impact on the technology of information storage, data compression, information transmission and coding, and it is still a very active domain of research.

Shannon's theory has also attracted much interest in a more philosophical look at information, although it was readily remarked that it is only a "syntactic" theory of information and neglects "semantic" issues. Several attempts have been made in philosophy to give information theory a semantic flavor, but still mostly based on, or at least linked to, Shannon's theory. Approaches to semantic information theory also very often make use of formal logic. Thereby, information is linked to reasoning, deduction and inference, as well as to decision making.

Further, entropy and related measures were soon found to have important connotations with regard to statistical inference. Surely, statistical data and observations represent information, information about unknown, hidden parameters. Thus a whole branch of statistics developed around concepts of Shannon's information theory or derived from them. Also, some measures appropriate for statistics, like Fisher's information, were proposed.
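To make the reduction-of-uncertainty idea described above concrete, the following minimal sketch (not taken from the book; the die scenario and all numbers are purely illustrative) computes Shannon entropy before and after receiving a message and takes the difference as the information gained.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(p) = -sum(p_i * log2(p_i))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Prior uncertainty about a fair eight-sided die: 3 bits.
prior = [1 / 8] * 8

# Suppose a message reveals the outcome is one of two equally likely faces:
# 1 bit of uncertainty remains.
posterior = [1 / 2, 1 / 2]

# The information conveyed by the message is the reduction in uncertainty.
gain = entropy(prior) - entropy(posterior)

print(f"H(prior)      = {entropy(prior):.2f} bits")      # 3.00
print(f"H(posterior)  = {entropy(posterior):.2f} bits")  # 1.00
print(f"information gained = {gain:.2f} bits")           # 2.00
```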
Philosophical Reflections
Philosophical Conceptions of Information
The Syntactical Approach
Information Theory, Relative Entropy and Statistics
Information: The Algorithmic Paradigm
The Semantical Approach
Information Algebra
Uncertain Information
Comparing Questions and Answers: A Bit of Logic, a Bit of Language, and Some Bits of Information
Channels: From Logic to Probability
Beyond the Semantical Approach
Modeling Real Reasoning
Philosophical Conclusions
One or Many Concepts of Information?
From the reviews:
"This new anthology on formal theories of information is based upon research presented at the May 2006 Muenchenwiler seminar of the Information and Knowledge research groups of the computer science departments of the universities of Bern, Fribourg, and Neuchatel. ... This is probably the clearest account of algorithmic information theory that one will come across. ... Formal theories of information and their philosophical analysis are being developed right now, and this is what makes a volume of this quality so welcome." (Sebastian Sequoiah-Grayson, Minds and Machines, Vol. 22, 2012)
Publication date (per publisher) | 22.4.2009 |
Series | Lecture Notes in Computer Science / Theoretical Computer Science and General Issues |
Additional information | VII, 269 p. |
Place of publication | Berlin |
Language | English |
Dimensions | 155 x 235 mm |
Weight | 428 g |
Subject area | Computer Science ► Theory / Studies ► Cryptology |
Keywords | Algebra • algorithmic depths • Boolean algebra • Chaitin • Church's thesis • Entropy • Gödel • Hardcover, Softcover / Computer Science, EDP • Information • information algebra • Information Theory • Kolmogorov complexity • Logic • Probability • Reasoning • semantic information • Semantics • Shannon • Solomonoff • Statistics • Syntax • Turing • Uncertainty |
ISBN-10 | 3-642-00658-2 / 3642006582 |
ISBN-13 | 978-3-642-00658-6 / 9783642006586 |
Condition | New |