An Introduction to Envelopes: Dimension Reduction for Efficient Estimation in Multivariate Statistics

R. Dennis Cook

ISBN: 978-1-119-42296-9

Sep 2018

320 pages

$120.99

Description

Written by the leading expert in the field, this text reviews the major new developments in envelope models and methods 

An Introduction to Envelopes provides an overview of the theory and methods of envelopes, a class of procedures for increasing efficiency in multivariate analyses without altering traditional objectives. The author offers a balance between foundations and methodology by integrating illustrative examples that show how envelopes can be used in practice. He discusses how to use envelopes to target selected coefficients and explores predictor envelopes and their connection with partial least squares regression. The book reveals the potential for envelope methodology to improve estimation of a multivariate mean.
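The efficiency idea behind envelopes can be illustrated with a small simulation. This is only a sketch, not the book's estimator: here the material subspace (spanned by a matrix `Gamma`) is treated as known, whereas envelope methods estimate it from the data. Projecting the responses onto that subspace discards immaterial variation and shrinks the error of the coefficient estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, u = 100, 3, 1                 # samples, responses, envelope dimension
Gamma = np.zeros((r, u))
Gamma[0, 0] = 1.0                   # known material direction (illustrative assumption)
eta = np.array([[2.0]])             # coordinates of beta relative to Gamma
beta = Gamma @ eta                  # true r x 1 coefficient vector
P = Gamma @ Gamma.T                 # projection onto the (known) envelope

mse_ols, mse_env = 0.0, 0.0
for _ in range(300):
    x = rng.normal(size=(n, 1))
    # material noise is small; immaterial noise is large
    eps = np.concatenate([rng.normal(scale=1.0, size=(n, u)),
                          rng.normal(scale=10.0, size=(n, r - u))], axis=1)
    Y = x @ beta.T + eps
    b_ols = np.linalg.lstsq(x, Y, rcond=None)[0].T   # r x 1 OLS estimate
    b_env = P @ b_ols                                # discard immaterial variation
    mse_ols += np.sum((b_ols - beta) ** 2)
    mse_env += np.sum((b_env - beta) ** 2)

print(mse_env < mse_ols)   # projecting onto the envelope reduces estimation error
```

Because the immaterial directions carry most of the noise but none of the signal, the projected estimator's accumulated squared error is far smaller than ordinary least squares', which is the gain envelope methodology formalizes and makes estimable.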

The text also includes information on how envelopes can be used in generalized linear models, regressions with a matrix-valued response, and reviews work on sparse and Bayesian response envelopes. In addition, the text explores relationships between envelopes and other dimension reduction methods, including canonical correlations, reduced-rank regression, supervised singular value decomposition, sufficient dimension reduction, principal components, and principal fitted components. This important resource: 

•    Offers a text written by the leading expert in this field

•    Describes groundbreaking work that puts the focus on this burgeoning area of study

•    Covers the important new developments in the field and highlights the most important directions

•    Discusses the underlying mathematics and linear algebra

•    Includes an online companion site with both R and Matlab support

Written for researchers and graduate students in multivariate analysis and dimension reduction, as well as practitioners interested in statistical methodology, An Introduction to Envelopes is the first book devoted to the theory and methods of envelopes.

Preface

Acknowledgements

Notation and Definitions

Chapter 1 Response Envelopes

1.1 The multivariate linear model

1.1.1 Partitioned models and added variable plots

1.1.2 Alternative model forms

1.2 Envelope model for response reduction

1.3 Illustrations

1.3.1 A schematic example

1.3.2 Compound symmetry

1.3.3 Wheat protein: Introductory illustration

1.3.4 Cattle weights: Initial fit

1.4 More on the envelope model

1.4.1 Relationship with sufficiency

1.4.2 Parameter count

1.4.3 Potential gains

1.5 Maximum likelihood estimation

1.5.1 Derivation

1.5.2 Cattle weights: Variation of the X-variant parts of Y

1.5.3 Insights into Ê(B)

1.5.4 Scaling the responses

1.6 Asymptotic distributions

1.7 Fitted values and predictions

1.8 Testing the responses

1.8.1 Test development

1.8.2 Testing individual responses

1.8.3 Testing containment only

1.9 Non-normal errors

1.10 Selecting the envelope dimension, u

1.10.1 Selection methods

1.10.2 Inferring about rank(β)

1.10.3 Asymptotic considerations

1.10.4 Overestimation versus underestimation of u

1.10.5 Cattle weights: Influence of u

1.11 Bootstrap and uncertainty in the envelope dimension

1.11.1 Bootstrap for envelope models

1.11.2 Wheat protein: Bootstrap and asymptotic standard errors, u fixed

1.11.3 Cattle weights: Bootstrapping u

1.11.4 Bootstrap smoothing

1.11.5 Cattle data: Bootstrap smoothing

Chapter 2 Illustrative Analyses using Response Envelopes

2.1 Wheat protein: Full data

2.2 Berkeley Guidance Study

2.3 Banknotes

2.4

2.5 Australian Institute of Sport: Response Envelopes

2.6 Air pollution

2.7 Multivariate bioassay

2.8 Brain volumes

2.9 Reducing lead levels in children

Chapter 3 Partial Response Envelopes

3.1 Partial envelope model

3.2 Estimation

3.2.1 Asymptotic distribution of β̂1

3.2.2 Selecting u1

3.3 Illustrations

3.3.1 Cattle weight: Incorporating basal weight

3.3.2 Men's urine

3.4 Partial envelopes for prediction

3.4.1 Rationale

3.4.2 Pulp fibers: Partial envelopes and prediction

3.5 Reducing part of the response

Chapter 4 Predictor Envelopes

4.1 Model formulations

4.1.1 Linear predictor reduction

4.1.2 Latent variable formulation of partial least squares regression

4.1.3 Potential advantages

4.2 SIMPLS

4.2.1 SIMPLS algorithm

4.2.2 SIMPLS when n < p

4.3 Likelihood-based predictor envelopes

4.3.1 Estimation

4.3.2 Comparisons with SIMPLS and principal component regression

4.3.3 Asymptotic properties

4.3.4 Fitted values and prediction

4.3.5 Choice of dimension

4.3.6 Relevant components

4.4 Illustrations

4.4.1 Expository example, continued

4.4.2 Australian Institute of Sport: Predictor envelopes

4.4.3 Wheat protein: Predicting protein content

4.4.4 Mussels’ muscles: Predictor envelopes

4.4.5 Meat properties

4.5 Simultaneous Predictor-Response Envelopes

4.5.1 Model formulation

4.5.2 Potential gain

4.5.3 Estimation

Chapter 5 Enveloping Multivariate Means

5.1 Enveloping a single mean

5.1.1 Envelope structure

5.1.2 Envelope model

5.1.3 Estimation

5.1.4 Minneapolis schools

5.1.5 Functional data

5.2 Enveloping multiple means with heteroscedastic errors

5.2.1 Heteroscedastic envelopes

5.2.2 Estimation

5.2.3 Cattle weights: Heteroscedastic envelope fit

5.3 Extension to heteroscedastic regressions

Chapter 6 Envelope Algorithms

6.1 Likelihood-based envelope estimation

6.2 Starting values

6.2.1 Choosing the starting value from the eigenvectors of M̂

6.2.2 Choosing the starting value from the eigenvectors of M̂ + Û

6.2.3 Summary

6.3 A Non-Grassmann algorithm for estimating E_M(U)

6.4 Sequential likelihood-based envelope estimation

6.4.1 The 1D algorithm

6.4.2 Envelope component screening

6.5 Sequential moment-based envelope estimation

6.5.1 Basic algorithm

6.5.2 Krylov matrices and dim(U) = 1

6.5.3 Variations on the basic algorithm

Chapter 7 Envelope Extensions

7.1 Envelopes for vector-valued parameters

7.1.1 Illustrations

7.1.2 Estimation based on a complete likelihood

7.2 Envelopes for matrix-valued parameters

7.3 Envelopes for matrix-valued responses

7.3.1 Initial modeling

7.3.2 Models with Kronecker structure

7.3.3 Envelope models with Kronecker structure

7.4 Spatial envelopes

7.5 Sparse response envelopes

7.5.1 Sparse response envelopes when r ≪ n

7.5.2 Cattle weights and brain volumes: Sparse fits

7.5.3 Sparse envelopes when r > n

7.6 Bayesian response envelopes

Chapter 8 Inner and Scaled Envelopes

8.1 Inner Envelopes

8.1.1 Definition and properties of inner envelopes

8.1.2 Inner response envelopes

8.1.3 Maximum likelihood estimators

8.1.4 Race times: Inner envelopes

8.2 Scaled response envelopes

8.2.1 Scaled response model

8.2.2 Estimation

8.2.3 Race times: Scaled response envelopes

8.3 Scaled predictor envelopes

8.3.1 Scaled predictor model

8.3.2 Estimation

8.3.3 Scaled SIMPLS algorithm

Chapter 9 Connections and Adaptations

9.1 Canonical correlations

9.1.1 Construction of canonical variates and correlations

9.1.2 Derivation of canonical variates

9.1.3 Connection to envelopes

9.2 Reduced-rank regression

9.2.1 Reduced-rank model and estimation

9.2.2 Contrasts with envelopes

9.2.3 Reduced-rank response envelopes

9.2.4 Reduced-rank predictor envelopes

9.3 Supervised singular value decomposition

9.4 Sufficient dimension reduction

9.5 Sliced inverse regression

9.5.1 SIR methodology

9.5.2 Mussels’ muscles: Sliced inverse regression

9.5.3 The “Envelope method”

9.5.4 Envelopes and SIR

9.6 Dimension reduction for the conditional mean

9.6.1 Estimating one vector in S_E(Y|X)

9.6.2 Estimating S_E(Y|X)

9.7 Functional envelopes for SDR

9.7.1 Functional SDR

9.7.2 Functional predictor envelopes

9.8 Comparing covariance matrices

9.8.1 SDR for covariance matrices

9.8.2 Connections with envelopes

9.8.3 Illustrations

9.8.4 SDR for means and covariance matrices

9.9 Principal components

9.9.1 Introduction

9.9.2 Random latent variables

9.9.3 Fixed latent variables and isotropic errors

9.9.4 Numerical illustrations

9.10 Principal fitted components

9.10.1 Isotropic errors, Σ_X|Y = σ²I_p

9.10.2 Anisotropic errors, Σ_X|Y > 0

9.10.3 Non-normal errors and the choice of f

9.10.4 High-dimensional PFC

Appendix A Envelope Algebra

A.1 Invariant and reducing subspaces

A.2 M-Envelopes

A.3 Relationships between envelopes

A.3.1 Invariance and equivariance

A.3.2 Direct sums of envelopes

A.3.3 Coordinate reduction

A.4 Kronecker products, vec and vech

A.5 Commutation, expansion and contraction matrices

A.6 Derivatives

A.6.1 Derivatives for η, Ω, and Ω0

A.6.2 Derivatives with respect to Γ

A.6.3 Derivatives of Grassmann objective functions

A.7 Miscellaneous results

A.8 Matrix normal distribution

A.9 Literature notes

Appendix B Proofs for Envelope Algorithms

B.1 The 1D algorithm, Section 6.4

B.2 Sequential moment-based algorithm, Section 6.5

Appendix C Grassmann Manifold Optimization

C.1 Gradient algorithm

C.2 Construction of B

C.3 Construction of exp{δA(B)}

C.4 Starting and Stopping

Index