Intensive comparison of semi-parametric and non-parametric dimension reduction methods in forward regression

  • Shin, Minju (Department of Statistics, Ewha Womans University)
  • Yoo, Jae Keun (Department of Statistics, Ewha Womans University)
  • Received : 2022.06.02
  • Accepted : 2022.07.22
  • Published : 2022.09.30

Abstract

Principal fitted components (PFC) is a semi-parametric sufficient dimension reduction (SDR) method originally proposed by Cook (2007). According to Cook (2007), PFC is connected to other common non-parametric SDR methods, but the connection is limited to sliced inverse regression (Li, 1991) and ordinary least squares. Since there has been no direct comparison of the two approaches across various forward regressions to date, practical guidance on choosing between them is needed for statistical practitioners. To fill this practical need, in this paper we newly derive a connection of PFC to covariance methods (Yin and Cook, 2002), one of the most popular families of SDR methods. Intensive numerical studies are also conducted to closely examine and compare the estimation performances of the semi-parametric and non-parametric SDR methods in various forward regressions. The findings from the numerical studies are confirmed in a real data example.
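For context, the population quantities linked by these connections can be sketched in standard SDR notation; the display below is an illustrative summary using common conventions ($\Sigma_x$ for the predictor covariance, $f_y$ for a user-chosen vector of functions of $y$), not a reproduction of the paper's own formulation.

\[
Y \perp\!\!\!\perp X \mid B^{\top}X, \qquad \mathcal{S}_{Y|X} = \mathrm{span}(B)\ \ \text{(central subspace)},
\]
\[
\text{SIR (Li, 1991)}:\ \Sigma_x^{-1}\,\mathrm{Cov}\{E(X \mid Y)\}, \qquad
\text{OLS}:\ \Sigma_x^{-1}\,\mathrm{Cov}(X, Y),
\]
\[
\text{PFC (Cook, 2007), isotropic case}:\ X_y = \mu + \Gamma \beta f_y + \varepsilon,\quad
\varepsilon \sim N(0, \sigma^2 I_p),\ \ \Gamma \in \mathbb{R}^{p \times d},\ \ \mathrm{span}(\Gamma) = \mathcal{S}_{Y|X}.
\]

The covariance methods of Yin and Cook (2002) instead target the conditional $k$th moment of $Y$ given $X$; relating that family to PFC is the new connection derived in this paper.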

Keywords

Acknowledgements

For Minju Shin and Jae Keun Yoo, this work was supported by the MSIT (Ministry of Science and ICT), Korea, under the High-Potential Individuals Global Training Program (RS-2022-00154879) supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation). For Jae Keun Yoo, this work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Korean Ministry of Education (NRF-2021R1F1A1059844).

References

  1. Cook RD (1998). Regression Graphics, John Wiley & Sons, New York.
  2. Cook RD (2007). Fisher lecture: dimension reduction in regression, Statistical Science, 22, 1-26. https://doi.org/10.1214/088342306000000682
  3. Cook RD and Forzani L (2009). Principal fitted components for dimension reduction in regression, Statistical Science, 23, 485-501.
  4. Cook RD, Li B, and Chiaromonte F (2007). Dimension reduction in regression without matrix inversion, Biometrika, 94, 569-584. https://doi.org/10.1093/biomet/asm038
  5. Cook RD and Weisberg S (1991). Comment: Sliced inverse regression for dimension reduction by KC Li, Journal of the American Statistical Association, 86, 328-332.
  6. Cook RD and Weisberg S (1994). An Introduction to Regression Graphics, John Wiley & Sons, New York.
  7. Cook RD and Zhang X (2014). Fused estimators of the central subspace in sufficient dimension reduction, Journal of the American Statistical Association, 109, 815-827. https://doi.org/10.1080/01621459.2013.866563
  8. Fearn T (1983). A misuse of ridge regression in the calibration of a near infrared reflectance instrument, Journal of the Royal Statistical Society: Series C (Applied Statistics), 32, 73-79. https://doi.org/10.2307/2348045
  9. Li KC (1991). Sliced inverse regression for dimension reduction, Journal of the American Statistical Association, 86, 316-327. https://doi.org/10.1080/01621459.1991.10475035
  10. Li KC (1992). On principal Hessian directions for data visualization and dimension reduction: another application of Stein's lemma, Journal of the American Statistical Association, 87, 1025-1039. https://doi.org/10.1080/01621459.1992.10476258
  11. Li L and Yin X (2008). Sliced inverse regression with regularizations, Biometrics, 64, 124-131. https://doi.org/10.1111/j.1541-0420.2007.00836.x
  12. Yin X and Cook RD (2002). Dimension reduction for the conditional kth moment in regression, Journal of the Royal Statistical Society: Series B, 64, 159-175. https://doi.org/10.1111/1467-9868.00330
  13. Yoo JK (2016). Tutorial: Methodologies for sufficient dimension reduction in regression, Communications for Statistical Applications and Methods, 23, 95-117.
  14. Yoo JK (2018a). Response dimension reduction: model-based approach, Statistics, 52, 409-425. https://doi.org/10.1080/02331888.2017.1410152
  15. Yoo JK (2018b). Partial least squares fusing unsupervised learning, Chemometrics and Intelligent Laboratory Systems, 175, 82-86. https://doi.org/10.1016/j.chemolab.2017.12.016