This textbook for graduate and advanced undergraduate students presents the theory of matrix algebra for statistical applications, explores various types of matrices encountered in statistics, and covers numerical linear algebra. Matrix algebra is one of the most important areas of mathematics in data science and in statistical theory, and the second edition of this very popular textbook provides essential updates and comprehensive coverage of these critical topics.
Part I offers a self-contained description of the aspects of matrix algebra most relevant to applications in statistics. It begins with fundamental concepts of vectors and vector spaces; covers basic algebraic properties of matrices and analytic properties of vectors and matrices in multivariate calculus; and concludes with a discussion of operations on matrices in solving linear systems and in eigenanalysis. Part II considers various types of matrices encountered in statistics, such as projection matrices and positive definite matrices, describes the special properties of those matrices, and presents various applications of matrix theory in statistics, including linear models, multivariate analysis, and stochastic processes. Part III covers numerical linear algebra, one of the most important subjects in the field of statistical computing. It begins with a discussion of the basics of numerical computations and goes on to describe accurate and efficient algorithms for factoring matrices, solving linear systems of equations, and extracting eigenvalues and eigenvectors.
Although the book is not tied to any particular software system, it describes and gives examples of the use of modern computer software for numerical linear algebra. This part is essentially self-contained, although it assumes some ability to program in Fortran or C and/or the ability to use R or Matlab.
The first two parts of the text are ideal for a course in matrix algebra for statistics students or as a supplementary text for various courses in linear models or multivariate statistics. The third part is ideal for use as a text for a course in statistical computing or as a supplementary text for various courses that emphasize computations.
New to this edition
- 100 pages of additional material
- 30 more exercises (186 exercises overall)
- Added discussion of vectors and matrices with complex elements
- Additional material on statistical applications
- Extensive and reader-friendly cross-references and index

About the Author
James E. Gentle, PhD, is University Professor of Computational Statistics at George Mason University. He is a Fellow of the American Statistical Association (ASA) and of the American Association for the Advancement of Science. Professor Gentle has held several national offices in the ASA and has served as editor and associate editor of journals of the ASA as well as for other journals in statistics and computing. He is author of Random Number Generation and Monte Carlo Methods (Springer, 2003) and Computational Statistics (Springer, 2009).
Back Cover
Matrix algebra is one of the most important areas of mathematics for data analysis and for statistical theory. This much-needed work presents the relevant aspects of the theory of matrix algebra for applications in statistics. It moves on to consider the various types of matrices encountered in statistics, such as projection matrices and positive definite matrices, and describes the special properties of those matrices. Finally, it covers numerical linear algebra, beginning with a discussion of the basics of numerical computations, and following up with accurate and efficient algorithms for factoring matrices, solving linear systems of equations, and extracting eigenvalues and eigenvectors.
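Solving a linear system via a triangular factorization is one of the core tasks named above. As a purely illustrative sketch (not an algorithm taken from the book), the following pure-Python function solves a small system Ax = b by Gaussian elimination with partial pivoting followed by back substitution; production codes such as those discussed in Part III use carefully tuned library implementations instead.

```python
def solve(A, b):
    """Solve Ax = b for a square matrix A (given as a list of rows)
    by Gaussian elimination with partial pivoting."""
    n = len(A)
    # Form the augmented matrix [A | b], copying so the inputs are not mutated.
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        # Partial pivoting: move the largest remaining pivot into row k
        # to improve numerical stability.
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        # Eliminate entries below the pivot.
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    # Back substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

# Example: 2x + y = 3, x + 3y = 5 has the solution x = 0.8, y = 1.4.
print(solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0]))
```

The pivoting step is what distinguishes a numerically robust elimination from the textbook row-reduction procedure; without it, a small pivot can amplify rounding error catastrophically.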
Contents
Part I Linear Algebra
1 Basic Vector/Matrix Structure and Notation
1.1 Vectors
1.2 Arrays
1.3 Matrices
1.4 Representation of Data
2 Vectors and Vector Spaces
2.1 Operations on Vectors
2.1.1 Linear Combinations and Linear Independence
2.1.2 Vector Spaces and Spaces of Vectors
2.1.3 Basis Sets for Vector Spaces
2.1.4 Inner Products
2.1.5 Norms
2.1.6 Normalized Vectors
2.1.7 Metrics and Distances
2.1.8 Orthogonal Vectors and Orthogonal Vector Spaces
2.1.9 The "One Vector"
2.2 Cartesian Coordinates and Geometrical Properties of Vectors
2.2.1 Cartesian Geometry
2.2.2 Projections
2.2.3 Angles between Vectors
2.2.4 Orthogonalization Transformations; Gram-Schmidt
2.2.5 Orthonormal Basis Sets
2.2.6 Approximation of Vectors
2.2.7 Flats, Affine Spaces, and Hyperplanes
2.2.8 Cones
2.2.9 Cross Products in IR³
2.3 Centered Vectors and Variances and Covariances of Vectors
2.3.1 The Mean and Centered Vectors
2.3.2 The Standard Deviation, the Variance, and Scaled Vectors
2.3.3 Covariances and Correlations between Vectors
Exercises
3 Basic Properties of Matrices
3.1 Basic Definitions and Notation
3.1.1 Matrix Shaping Operators
3.1.2 Partitioned Matrices
3.1.3 Matrix Addition
3.1.4 Scalar-Valued Operators on Square Matrices: The Trace
3.1.5 Scalar-Valued Operators on Square Matrices: The Determinant
3.2 Multiplication of Matrices and Multiplication of Vectors and Matrices
3.2.1 Matrix Multiplication (Cayley)
3.2.2 Multiplication of Matrices with Special Patterns
3.2.3 Elementary Operations on Matrices
3.2.4 The Trace of a Cayley Product that Is Square
3.2.5 The Determinant of a Cayley Product of Square Matrices
3.2.6 Multiplication of Matrices and Vectors
3.2.7 Outer Products
3.2.8 Bilinear and Quadratic Forms; Definiteness
3.2.9 Anisometric Spaces
3.2.10 Other Kinds of Matrix Multiplication
3.3 Matrix Rank and the Inverse of a Matrix
3.3.1 The Rank of Partitioned Matrices, Products of Matrices, and Sums of Matrices
3.3.2 Full Rank Partitioning
3.3.3 Full Rank Matrices and Matrix Inverses
3.3.4 Full Rank Factorization
3.3.5 Equivalent Matrices
3.3.6 Multiplication by Full Rank Matrices
3.3.7 Gramian Matrices: Products of the Form AᵀA
3.3.8 A Lower Bound on the Rank of a Matrix Product
3.3.9 Determinants of Inverses
3.3.10 Inverses of Products and Sums of Nonsingular Matrices
3.3.11 Inverses of Matrices with Special Forms
3.3.12 Determining the Rank of a Matrix
3.4 More on Partitioned Square Matrices: The Schur Complement
3.4.1 Inverses of Partitioned Matrices
3.4.2 Determinants of Partitioned Matrices
3.5 Linear Systems of Equations
3.5.1 Solutions of Linear Systems
3.5.2 Null Space: The Orthogonal Complement
3.6 Generalized Inverses
3.6.1 Special Generalized Inverses; The Moore-Penrose Inverse
3.6.2 Generalized Inverses of Products and Sums of Matrices
3.6.3 Generalized Inverses of Partitioned Matrices
3.7 Orthogonality
3.8 Eigenanalysis; Canonical Factorizations
3.8.1 Basic Properties of Eigenvalues and Eigenvectors
3.8.2 The Characteristic Polynomial
3.8.3 The Spectrum
3.8.4 Similarity Transformations
3.8.5 Schur Factorization
3.8.6 Similar Canonical Factorization; Diagonalizable Matrices
3.8.7 Properties of Diagonalizable Matrices
3.8.8 Eigenanalysis of Symmetric Matrices
3.8.9 Positive Definite and Nonnegative Definite Matrices
3.8.10 Generalized Eigenvalues and Eigenvectors
3.8.11 Singular Values and the Singular Value Decomposition (SVD)
3…