A self-contained introduction to matrix analysis theory and
applications in the field of statistics

Comprehensive in scope, Matrix Algebra for Linear Models
offers a succinct summary of matrix theory and its
applications to statistics, especially linear models. The book
provides a unified presentation of the mathematical properties and
statistical applications of the matrices used to define and
manipulate data.

Written for theoretical and applied statisticians, the book
uses numerous numerical examples to illustrate key ideas,
methods, and techniques crucial to understanding the application
of matrix algebra in linear models. Matrix Algebra for
Linear Models expertly balances concepts and methods, allowing
for a side-by-side presentation of matrix theory and its linear
model applications. In addition to concise summaries of each topic,
the book also features:

* Methods of deriving results from the properties of eigenvalues
and the singular value decomposition

* Solutions to matrix optimization problems for obtaining more
efficient biased estimators for parameters in linear regression
models

* A section on the generalized singular value decomposition

* Multiple chapter exercises with selected answers to enhance
understanding of the presented material

Matrix Algebra for Linear Models is an ideal textbook for
advanced undergraduate and graduate-level courses on statistics,
matrices, and linear algebra. The book is also an excellent
reference for statisticians, engineers, economists, and readers
interested in the linear statistical model.



About the Author

MARVIN H. J. GRUBER, PhD, is Professor Emeritus in the School of Mathematical Sciences at Rochester Institute of Technology. He has authored several books and journal articles in his areas of research interest, which include improving the efficiency of regression estimators. Dr. Gruber is a member of the American Mathematical Society and the American Statistical Association.


Contents

Preface xiii

Acknowledgments xv

Part I Basic Ideas about Matrices and Systems of Linear Equations 1

Section 1 What Matrices Are and Some Basic Operations with Them 3

1.1 Introduction 3

1.2 What Are Matrices and Why Are They Interesting to a Statistician? 3

1.3 Matrix Notation, Addition, and Multiplication 6

1.4 Summary 10

Exercises 10

Section 2 Determinants and Solving a System of Equations 14

2.1 Introduction 14

2.2 Definition of and Formulae for Expanding Determinants 14

2.3 Some Computational Tricks for the Evaluation of Determinants 16

2.4 Solution to Linear Equations Using Determinants 18

2.5 Gauss Elimination 22

2.6 Summary 27

Exercises 27

Section 3 The Inverse of a Matrix 30

3.1 Introduction 30

3.2 The Adjoint Method of Finding the Inverse of a Matrix 30

3.3 Using Elementary Row Operations 31

3.4 Using the Matrix Inverse to Solve a System of Equations 33

3.5 Partitioned Matrices and Their Inverses 34

3.6 Finding the Least Square Estimator 38

3.7 Summary 44

Exercises 44

Section 4 Special Matrices and Facts about Matrices That Will Be Used in the Sequel 47

4.1 Introduction 47

4.2 Matrices of the Form aI_n + bJ_n 47

4.3 Orthogonal Matrices 49

4.4 Direct Product of Matrices 52

4.5 An Important Property of Determinants 53

4.6 The Trace of a Matrix 56

4.7 Matrix Differentiation 57

4.8 The Least Square Estimator Again 62

4.9 Summary 62

Exercises 63

Section 5 Vector Spaces 66

5.1 Introduction 66

5.2 What is a Vector Space? 66

5.3 The Dimension of a Vector Space 68

5.4 Inner Product Spaces 70

5.5 Linear Transformations 73

5.6 Summary 76

Exercises 76

Section 6 The Rank of a Matrix and Solutions to Systems of Equations 79

6.1 Introduction 79

6.2 The Rank of a Matrix 79

6.3 Solving Systems of Equations with Coefficient Matrix of Less than Full Rank 84

6.4 Summary 87

Exercises 87

Part II Eigenvalues, the Singular Value Decomposition, and Principal Components 91

Section 7 Finding the Eigenvalues of a Matrix 93

7.1 Introduction 93

7.2 Eigenvalues and Eigenvectors of a Matrix 93

7.3 Nonnegative Definite Matrices 101

7.4 Summary 104

Exercises 105

Section 8 The Eigenvalues and Eigenvectors of Special Matrices 108

8.1 Introduction 108

8.2 Orthogonal, Nonsingular, and Idempotent Matrices 109

8.3 The Cayley-Hamilton Theorem 112

8.4 The Relationship between the Trace, the Determinant, and the Eigenvalues of a Matrix 114

8.5 The Eigenvalues and Eigenvectors of the Kronecker Product of Two Matrices 116

8.6 The Eigenvalues and the Eigenvectors of a Matrix of the Form aI + bJ 117

8.7 The Loewner Ordering 119

8.8 Summary 121

Exercises 122

Section 9 The Singular Value Decomposition (SVD) 124

9.1 Introduction 124

9.2 The Existence of the SVD 125

9.3 Uses and Examples of the SVD 127

9.4 Summary 134

Exercises 134

Section 10 Applications of the Singular Value Decomposition 137

10.1 Introduction 137

10.2 Reparameterization of a Non-full-Rank Model to a Full-Rank Model 137

10.3 Principal Components 141

10.4 The Multicollinearity Problem 143

10.5 Summary 144

Exercises 145

Section 11 Relative Eigenvalues and Generalizations of the Singular Value Decomposition 146

11.1 Introduction 146

11.2 Relative E...

Title
Matrix Algebra for Linear Models
EAN
9781118800416
ISBN
978-1-118-80041-6
Format
E-book (PDF)
Publication date
December 2, 2013
Digital copy protection
Adobe DRM
File size
3.87 MB
Number of pages
392
Year
2013
Language
English