Projection Matrices, Generalized Inverse Matrices, and Singular Value Decomposition (eBook)
XII, 236 pages
Springer New York (publisher)
978-1-4419-9887-3 (ISBN)
Aside from distribution theory, projections and the singular value decomposition (SVD) are the two most important concepts for understanding the basic mechanism of multivariate analysis. The former underlies the least squares estimation in regression analysis, which is essentially a projection of one subspace onto another, and the latter underlies principal component analysis, which seeks to find a subspace that captures the largest variability in the original space.
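As a concrete illustration of these two statements, the short NumPy sketch below (illustrative only; the random matrices X and y and all variable names are our own, not taken from the book) verifies that the least squares fit equals the orthogonal projection of y onto the column space of X, and that the leading right singular vectors of the centered data span the principal component subspace:

```python
import numpy as np

# Illustrative data (not from the book): a full-column-rank design matrix X
# and a response vector y.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
y = rng.standard_normal(20)

# Least squares as projection: the fitted values are P @ y, where
# P = X (X'X)^{-1} X' is the orthogonal projector onto the column space of X.
P = X @ np.linalg.inv(X.T @ X) @ X.T
beta = np.linalg.lstsq(X, y, rcond=None)[0]
assert np.allclose(P @ y, X @ beta)                    # same fitted values
assert np.allclose(P @ P, P) and np.allclose(P, P.T)   # idempotent and symmetric

# PCA via SVD: the leading right singular vectors of the centered data matrix
# span the subspace capturing the largest variability.
Z = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T                      # coordinates on the first two principal axes
explained = s[:2] ** 2 / np.sum(s ** 2)    # proportion of variance captured
print(explained)
```

The explicit projector P and np.linalg.lstsq produce the same fitted values, which is precisely the sense in which least squares estimation "is" a projection of one subspace onto another.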
This book is about projections and SVD. A thorough discussion of generalized inverse (g-inverse) matrices is also given because they are closely related to projections. The book provides systematic and in-depth accounts of these concepts from a unified viewpoint of linear transformations in finite-dimensional vector spaces. More specifically, it shows that projection matrices (projectors) and g-inverse matrices can be defined in various ways so that a vector space is decomposed into a direct-sum of (disjoint) subspaces. Projection Matrices, Generalized Inverse Matrices, and Singular Value Decomposition will be useful for researchers, practitioners, and students in applied mathematics, statistics, engineering, behaviormetrics, and other fields.
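One way to make the link between g-inverses, projectors, and direct-sum decompositions concrete is the small NumPy sketch below (again illustrative only; the rank-deficient matrix A and the tolerance choice are our own assumptions, not material from the book). It builds the Moore-Penrose inverse from the SVD, checks the four Penrose conditions, and verifies that A A^+ and I - A A^+ are complementary orthogonal projectors:

```python
import numpy as np

# Illustrative data (not from the book): a 6 x 4 matrix of rank 2,
# so an ordinary inverse does not exist.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 4))

# Moore-Penrose g-inverse via the SVD: invert only the nonzero singular values.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
tol = max(A.shape) * np.finfo(float).eps * s.max()
s_inv = np.zeros_like(s)
s_inv[s > tol] = 1.0 / s[s > tol]
A_pinv = Vt.T @ np.diag(s_inv) @ U.T
assert np.allclose(A_pinv, np.linalg.pinv(A))

# The four Penrose conditions characterizing the Moore-Penrose inverse.
assert np.allclose(A @ A_pinv @ A, A)
assert np.allclose(A_pinv @ A @ A_pinv, A_pinv)
assert np.allclose((A @ A_pinv).T, A @ A_pinv)
assert np.allclose((A_pinv @ A).T, A_pinv @ A)

# A A^+ is the orthogonal projector onto the column space of A; together with
# I - A A^+ it decomposes the 6-dimensional space into a direct sum of two
# disjoint (here orthogonal) subspaces.
P_col = A @ A_pinv
Q = np.eye(A.shape[0]) - P_col
assert np.allclose(P_col @ P_col, P_col)
assert np.allclose(P_col @ Q, np.zeros_like(P_col))
```

A weaker g-inverse satisfying only A A^- A = A would still yield an idempotent matrix A A^-, but in general an oblique rather than orthogonal projector, which is one of the distinctions the book develops in detail.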
Haruo Yanai is an educational psychologist and epidemiologist specializing in educational assessment and statistics. While developing an aptitude test as part of his doctoral dissertation at the University of Tokyo, he began his pioneering work on unifying various methods of multivariate analysis using projectors. This work culminated in his widely acclaimed book 'The Foundations of Multivariate Analysis' (Wiley Eastern, 1982), written with Takeuchi and Mukherjee. He has held a professorial position in the Research Division at the National Center for University Entrance Examinations and is currently a Professor of Statistics at St. Luke's College of Nursing in Tokyo. He is a former President of the Behaviormetric Society and is currently President of the Japan Testing Society.
Kei Takeuchi is a mathematical statistician with a strong background in economics. He was a Professor of Statistics in the Faculty of Economics at the University of Tokyo and, after retirement, in the Faculty of International Studies at Meiji Gakuin University (he is now emeritus at both universities). His main research fields include the theory of mathematical statistics, especially the asymptotic theory of estimation, and multivariate analysis. He has published many papers and books on these subjects in both Japanese and English, as well as articles on the Japanese economy and the impact of science and technology on the economy. He is a former President of the Japan Statistical Society and Chairman of the Statistical Commission of Japan.
Yoshio Takane earned his Ph.D. in quantitative psychology from the University of North Carolina in 1977. Since then he has been a Professor of Psychology at McGill University, specializing in quantitative methodology. He has developed a number of techniques for data analysis, such as nonlinear multivariate analysis (MVA), maximum likelihood multidimensional scaling, latent variable models, methods for contingency table analysis, constrained principal component analysis and other structured MVA, and the matrix theory associated with these developments. He has published widely in such journals as Psychometrika and Linear Algebra and Its Applications. He is a former President of the Psychometric Society.
Preface 6
Contents 10
Chapter 1 Fundamentals of Linear Algebra 14
1.1 Vectors and Matrices 14
1.1.1 Vectors 14
1.1.2 Matrices 16
1.2 Vector Spaces and Subspaces 19
1.3 Linear Transformations 24
1.4 Eigenvalues and Eigenvectors 29
1.5 Vector and Matrix Derivatives 32
1.6 Exercises for Chapter 1 35
Chapter 2 Projection Matrices 38
2.1 Definition 38
2.2 Orthogonal Projection Matrices 43
2.3 Subspaces and Projection Matrices 46
2.3.1 Decomposition into a direct-sum of disjoint subspaces 46
2.3.2 Decomposition into nondisjoint subspaces 52
2.3.3 Commutative projectors 54
2.3.4 Noncommutative projectors 57
2.4 Norm of Projection Vectors 59
2.5 Matrix Norm and Projection Matrices 62
2.6 General Form of Projection Matrices 65
2.7 Exercises for Chapter 2 66
Chapter 3 Generalized Inverse Matrices 68
3.1 Definition through Linear Transformations 68
3.2 General Properties 72
3.2.1 Properties of generalized inverse matrices 72
3.2.2 Representation of subspaces by generalized inverses 74
3.2.3 Generalized inverses and linear equations 77
3.2.4 Generalized inverses of partitioned square matrices 80
3.3 A Variety of Generalized Inverse Matrices 83
3.3.1 Reflexive generalized inverse matrices 84
3.3.2 Minimum norm generalized inverse matrices 86
3.3.3 Least squares generalized inverse matrices 89
3.3.4 The Moore-Penrose generalized inverse matrix 92
3.4 Exercises for Chapter 3 98
Chapter 4 Explicit Representations 100
4.1 Projection Matrices 100
4.2 Decompositions of Projection Matrices 107
4.3 The Method of Least Squares 111
4.4 Extended Definitions 114
4.4.1 A generalized form of least squares g-inverse 116
4.4.2 A generalized form of minimum norm g-inverse 119
4.4.3 A generalized form of the Moore-Penrose inverse 124
4.4.4 Optimal g-inverses 131
4.5 Exercises for Chapter 4 133
Chapter 5 Singular Value Decomposition (SVD) 137
5.1 Definition through Linear Transformations 137
5.2 SVD and Projectors 146
5.3 SVD and Generalized Inverse Matrices 150
5.4 Some Properties of Singular Values 152
5.5 Exercises for Chapter 5 160
Chapter 6 Various Applications 162
6.1 Linear Regression Analysis 162
6.1.1 The method of least squares and multiple regression analysis 162
6.1.2 Multiple correlation coefficients and their partitions 165
6.1.3 The Gauss-Markov model 167
6.2 Analysis of Variance 172
6.2.1 One-way design 172
6.2.2 Two-way design 175
6.2.3 Three-way design 177
6.2.4 Cochran's theorem 179
6.3 Multivariate Analysis 182
6.3.1 Canonical correlation analysis 183
6.3.2 Canonical discriminant analysis 189
6.3.3 Principal component analysis 193
6.3.4 Distance and projection matrices 200
6.4 Linear Simultaneous Equations 206
6.4.1 QR decomposition by the Gram-Schmidt orthogonalization method 206
6.4.2 QR decomposition by the Householder transformation 208
6.4.3 Decomposition by projectors 211
6.5 Exercises for Chapter 6 212
Chapter 7 Answers to Exercises 215
7.1 Chapter 1 215
7.2 Chapter 2 218
7.3 Chapter 3 220
7.4 Chapter 4 224
7.5 Chapter 5 230
7.6 Chapter 6 233
References 239
Index 243
Published (per publisher) | 6 April 2011
---|---
Series | Statistics for Social and Behavioral Sciences
Additional information | XII, 236 p.
Place of publication | New York
Language | English
Subject areas | Mathematics / Computer Science ► Mathematics ► Algebra
 | Mathematics / Computer Science ► Mathematics ► Statistics
 | Mathematics / Computer Science ► Mathematics ► Probability / Combinatorics
 | Medicine / Pharmacy ► General / Reference
 | Technology
Keywords | g-inverse matrices • linear transformations • multivariate analysis • projections • singular value decomposition
ISBN-10 | 1-4419-9887-X / 144199887X
ISBN-13 | 978-1-4419-9887-3 / 9781441998873
Size: 3.9 MB
DRM: digital watermark
This eBook contains a digital watermark and is therefore personalized to you. If the eBook is improperly passed on to third parties, it can be traced back to the source.
File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly well suited to technical books with columns, tables, and figures. A PDF can be displayed on almost all devices, but it is only of limited use on small screens (smartphone, eReader).
System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You will need a PDF viewer, e.g., Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read on (almost) all eBook readers. However, it is not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You will need a PDF viewer, e.g., the free Adobe Digital Editions app.
Buying eBooks from abroad
For tax law reasons, we can only sell eBooks within Germany and Switzerland. Unfortunately, we cannot fulfill eBook orders from other countries.