Tutorial: Methodologies for sufficient dimension reduction in regression

  • Yoo, Jae Keun (Department of Statistics, Ewha Womans University)
  • Received: 2016.01.22
  • Accepted: 2016.02.13
  • Published: 2016.03.31

Abstract

In this paper, as a sequel to the first tutorial, we discuss sufficient dimension reduction methodologies used to estimate the central subspace (sliced inverse regression, sliced average variance estimation), the central mean subspace (ordinary least squares, principal Hessian directions, iterative Hessian transformation), and the central $k^{th}$-moment subspace (covariance method). Large-sample tests to determine the structural dimensions of the three target subspaces are derived for most of the methodologies; in addition, a permutation test, which does not require large-sample distributions, is introduced and can be applied to all of the methodologies discussed in the paper. Theoretical relationships among the sufficient dimension reduction methodologies are also investigated, and a real data analysis is presented for illustration. Finally, a seeded dimension reduction approach is introduced so that the methodologies can be applied to large p, small n regressions.
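To make the first of these methodologies concrete, the following is a minimal sketch of sliced inverse regression (Li, 1991): standardize the predictors, slice on the response, and eigen-decompose the weighted covariance of the slice means. The function and argument names (sir_directions, n_slices, d) are illustrative only and are not from the paper; the sketch assumes a nonsingular predictor covariance.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, d=2):
    """Illustrative sketch of sliced inverse regression (SIR).

    Returns d estimated basis directions of the central subspace,
    obtained from the spectral decomposition of the SIR kernel matrix
    built from within-slice means of the standardized predictors.
    """
    n, p = X.shape

    # Standardize predictors: Z = (X - mean) Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)          # Sigma assumed positive definite
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt

    # Slice the response into roughly equal-sized groups
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # SIR kernel: weighted covariance of the slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Leading eigenvectors of M span the central subspace in the Z-scale;
    # back-transform to the original predictor scale
    _, vecs = np.linalg.eigh(M)
    eta = vecs[:, ::-1][:, :d]
    return Sigma_inv_sqrt @ eta
```

In practice the structural dimension d would not be fixed in advance but chosen by the large-sample or permutation tests discussed in the paper.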

Keywords

References

  1. Bura E and Cook RD (2001). Extending sliced inverse regression: the weighted chi-squared test, Journal of the American Statistical Association, 96, 996-1003. https://doi.org/10.1198/016214501753208979
  2. Cook RD (1998a). Regression Graphics: Ideas for Studying Regressions through Graphics, Wiley, New York.
  3. Cook RD (1998b). Principal Hessian directions revisited, Journal of the American Statistical Association, 93, 84-94. https://doi.org/10.1080/01621459.1998.10474090
  4. Cook RD and Critchley F (2000). Identifying regression outliers and mixtures graphically, Journal of the American Statistical Association, 95, 781-794. https://doi.org/10.1080/01621459.2000.10474270
  5. Cook RD and Li B (2002). Dimension reduction for the conditional mean in regression, Annals of Statistics, 30, 455-474. https://doi.org/10.1214/aos/1021379861
  6. Cook RD and Li B (2004). Determining the dimension of iterative Hessian transformation, Annals of Statistics, 32, 2501-2531. https://doi.org/10.1214/009053604000000661
  7. Cook RD, Li B, and Chiaromonte F (2007). Dimension reduction in regression without matrix inversion, Biometrika, 94, 569-584. https://doi.org/10.1093/biomet/asm038
  8. Cook RD and Weisberg S (1991). Comment: Sliced inverse regression for dimension reduction by KC Li, Journal of the American Statistical Association, 86, 328-332.
  9. Cook RD and Zhang X (2014). Fused estimators of the central subspace in sufficient dimension reduction, Journal of the American Statistical Association, 109, 815-827. https://doi.org/10.1080/01621459.2013.866563
  10. Li KC (1991). Sliced inverse regression for dimension reduction, Journal of the American Statistical Association, 86, 316-327. https://doi.org/10.1080/01621459.1991.10475035
  11. Li KC (1992). On principal Hessian directions for data visualization and dimension reduction: another application of Stein's lemma, Journal of the American Statistical Association, 87, 1025-1039. https://doi.org/10.1080/01621459.1992.10476258
  12. Shao Y, Cook RD, and Weisberg S (2007). Marginal tests with sliced average variance estimation, Biometrika, 94, 285-296. https://doi.org/10.1093/biomet/asm021
  13. Stein CM (1981). Estimation of the mean of a multivariate normal distribution, Annals of Statistics, 9, 1135-1151. https://doi.org/10.1214/aos/1176345632
  14. Ye Z and Weiss RE (2003). Using the bootstrap to select one of a new class of dimension reduction methods, Journal of the American Statistical Association, 98, 968-979. https://doi.org/10.1198/016214503000000927
  15. Yin X and Cook RD (2002). Dimension reduction for the conditional kth moment in regression, Journal of the Royal Statistical Society, Series B, 64, 159-175. https://doi.org/10.1111/1467-9868.00330
  16. Yoo JK (2013a). Advances in seeded dimension reduction: bootstrap criteria and extensions, Computational Statistics & Data Analysis, 60, 70-79. https://doi.org/10.1016/j.csda.2012.10.003
  17. Yoo JK (2013b). Chi-squared tests in kth-moment sufficient dimension reduction, Journal of Statistical Computation and Simulation, 83, 191-201. https://doi.org/10.1080/00949655.2011.635304
  18. Yoo JK (2016). Tutorial: Dimension reduction in regression with a notion of sufficiency, Communications for Statistical Applications and Methods, 23, 93-103. https://doi.org/10.5351/CSAM.2016.23.2.093

Cited by

  1. Dimension reduction for right-censored survival regression: transformation approach, vol. 23, no. 3, 2016, https://doi.org/10.5351/CSAM.2016.23.3.259
  2. Intensive numerical studies of optimal sufficient dimension reduction with singularity, vol. 24, no. 3, 2017, https://doi.org/10.5351/CSAM.2017.24.3.303