• Title/Summary/Keyword: inverse regression


Classification Using Sliced Inverse Regression and Sliced Average Variance Estimation

  • Lee, Hakbae
    • Communications for Statistical Applications and Methods, v.11 no.2, pp.275-285, 2004
  • We explore classification analysis using graphical methods, such as sliced inverse regression (SIR) and sliced average variance estimation (SAVE), based on dimension reduction. Useful information for classification analysis is obtained from SIR and SAVE through dimension reduction. Two examples are illustrated, and the classification rates of SIR and SAVE are compared with those of discriminant analysis and logistic regression.
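
To make the two methods concrete, here is a minimal NumPy sketch of the SIR and SAVE kernel matrices when the response is a class label, so that each class plays the role of a slice. The function names and setup are illustrative, not taken from the paper, and a nonsingular predictor covariance is assumed.

```python
# A minimal sketch: SIR and SAVE kernel matrices for a categorical
# response, each class serving as one slice. NumPy only.
import numpy as np

def sir_save_directions(X, y, d=2):
    """Top-d SIR and SAVE directions for class labels y."""
    n, p = X.shape
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T   # cov^{-1/2}
    Z = (X - X.mean(axis=0)) @ inv_sqrt                   # standardized
    M_sir, M_save = np.zeros((p, p)), np.zeros((p, p))
    for c in np.unique(y):
        Zc = Z[y == c]
        ph = len(Zc) / n                                  # slice proportion
        mh = Zc.mean(axis=0)                              # slice mean
        M_sir += ph * np.outer(mh, mh)
        A = np.eye(p) - np.cov(Zc, rowvar=False)
        M_save += ph * (A @ A)                            # (I - V_h)^2
    top = lambda M: inv_sqrt @ np.linalg.eigh(M)[1][:, ::-1][:, :d]
    return top(M_sir), top(M_save)
```

In the reduced space spanned by these directions, any off-the-shelf classifier (discriminant analysis, logistic regression) can then be applied, which is the comparison the paper reports.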

Iterative projection of sliced inverse regression with fused approach

  • Han, Hyoseon;Cho, Youyoung;Yoo, Jae Keun
    • Communications for Statistical Applications and Methods, v.28 no.2, pp.205-215, 2021
  • Sufficient dimension reduction is a useful dimension reduction tool in regression, and sliced inverse regression (Li, 1991) is one of the most popular sufficient dimension reduction methodologies. In spite of its popularity, it is known to be sensitive to the number of slices. To overcome this shortcoming, the so-called fused sliced inverse regression was proposed by Cook and Zhang (2014). Unfortunately, neither of the two existing methods applies directly to large p-small n regression, where dimension reduction is most desperately needed. In this paper, we newly propose seeded sliced inverse regression and seeded fused sliced inverse regression to overcome this deficit by adopting the iterative projection approach (Cook et al., 2007). Numerical studies are presented to examine their asymptotic estimation behavior, and real data analysis confirms their practical usefulness in high-dimensional data analysis.
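
The abstract's key ingredient for p > n is the iterative projection (seeded) idea. The sketch below shows one common reading of it: never invert the sample covariance, but accumulate a Krylov sequence built from a seed matrix of slice covariances and solve in that low-dimensional span. This is an assumption-laden illustration; the paper's exact estimator may differ.

```python
# A rough sketch of the seeded / iterative-projection idea for p > n:
# accumulate {nu, S nu, S^2 nu, ...} from the seed nu = cov(X, slice
# indicators) instead of forming S^{-1}.
import numpy as np

def seeded_sir(X, y, n_slices=5, n_iter=3):
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = (Xc.T @ Xc) / n                              # sample covariance
    edges = np.quantile(y, np.linspace(0, 1, n_slices + 1))
    sid = np.clip(np.searchsorted(edges, y, side="right") - 1,
                  0, n_slices - 1)                   # slice membership
    # Seed: column h is p_h * (mean of centered X within slice h).
    nu = np.column_stack([np.mean(sid == h) * Xc[sid == h].mean(axis=0)
                          for h in range(n_slices)])
    blocks, B = [nu], nu
    for _ in range(n_iter - 1):                      # Krylov accumulation
        B = S @ B
        blocks.append(B)
    R = np.column_stack(blocks)                      # p x (n_iter * n_slices)
    # Projection of S^{-1} nu onto span(R) without ever forming S^{-1}.
    return R @ np.linalg.pinv(R.T @ S @ R) @ (R.T @ nu)
```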

Fused sliced inverse regression in survival analysis

  • Yoo, Jae Keun
    • Communications for Statistical Applications and Methods, v.24 no.5, pp.533-541, 2017
  • Sufficient dimension reduction (SDR) replaces the original p-dimensional predictors with a lower-dimensional, linearly transformed predictor. Sliced inverse regression (SIR) has the longest history among SDR methodologies and remains the most popular. The critical weakness of SIR is its known sensitivity to the number of slices. Recently, fused sliced inverse regression was developed to overcome this deficit; it combines SIR kernel matrices constructed from various choices of the number of slices. In this paper, fused sliced inverse regression and SIR are compared to show that the former has a practical advantage over the latter in survival regression. Numerical studies confirm this, and a real data example is presented.
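
The fusing step itself is simple to state: compute the SIR kernel for several candidate slice numbers and combine them, so that no single slicing choice drives the result. A minimal sketch, assuming equal-count slicing on the observed response; for survival data one would slice on the observed times, and the paper's handling of censoring is not reproduced here.

```python
# A minimal sketch of fused SIR: sum SIR kernels over a grid of slice
# numbers. Names are illustrative.
import numpy as np

def sir_kernel(Z, y, n_slices):
    """SIR kernel sum_h p_h m_h m_h' on standardized predictors Z."""
    n, p = Z.shape
    M = np.zeros((p, p))
    for chunk in np.array_split(np.argsort(y), n_slices):
        mh = Z[chunk].mean(axis=0)
        M += (len(chunk) / n) * np.outer(mh, mh)
    return M

def fused_sir(X, y, slice_grid=(2, 4, 8, 16), d=1):
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - X.mean(axis=0)) @ inv_sqrt
    M = sum(sir_kernel(Z, y, H) for H in slice_grid)   # fuse over H
    top = np.linalg.eigh(M)[1][:, ::-1][:, :d]
    return inv_sqrt @ top        # directions on the original X scale
```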

A study on the multivariate sliced inverse regression (다변량 분할 역회귀모형에 관한 연구)

  • 이용구;이덕기
    • The Korean Journal of Applied Statistics, v.10 no.2, pp.293-308, 1997
  • Sliced inverse regression is a method for reducing the dimension of the explanatory variable X without going through any parametric or nonparametric model-fitting process. The method exploits the simplicity of the inverse view of regression; that is, instead of regressing the univariate output variable y against the multivariate X, we regress X against y. In this article, we propose bivariate sliced inverse regression, which regresses the multivariate X against the bivariate output variables $y_1, y_2$. Bivariate sliced inverse regression estimates the e.d.r. directions satisfying two generalized regression models simultaneously. For its application, we decompose the output variable y into two variables: one obtained by projecting y onto the column space of X, and the other, r, obtained by projecting y onto the space orthogonal to the column space of X; the e.d.r. directions of the generalized regression model are then estimated by utilizing the two variables simultaneously. As a result, bivariate sliced inverse regression, by considering the variables y and r simultaneously, estimates the e.d.r. directions efficiently and stably when the regression model is linear, quadratic, or nonlinear.

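A hedged sketch of the decomposition described in the abstract above: y is split into its least-squares fit on X and the orthogonal residual, and a bivariate SIR kernel is built by cross-slicing on the two components. All names are illustrative.

```python
# Bivariate SIR sketch: slice jointly on (fitted part, residual part) of y.
import numpy as np

def bivariate_sir(X, y, slices_per_axis=3, d=2):
    n, p = X.shape
    Xc, yc = X - X.mean(axis=0), y - y.mean()
    beta, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
    y_fit = Xc @ beta              # projection onto the column space of X
    r = yc - y_fit                 # component orthogonal to it
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    def labels(v):                 # equal-probability slice labels for v
        q = np.quantile(v, np.linspace(0, 1, slices_per_axis + 1))
        return np.clip(np.searchsorted(q, v, side="right") - 1,
                       0, slices_per_axis - 1)
    cell = labels(y_fit) * slices_per_axis + labels(r)   # 2-D grid cells
    M = np.zeros((p, p))
    for c in np.unique(cell):
        Zc = Z[cell == c]
        M += (len(Zc) / n) * np.outer(Zc.mean(axis=0), Zc.mean(axis=0))
    return inv_sqrt @ np.linalg.eigh(M)[1][:, ::-1][:, :d]
```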

On robustness in dimension determination in fused sliced inverse regression

  • Yoo, Jae Keun;Cho, Yoo Na
    • Communications for Statistical Applications and Methods, v.25 no.5, pp.513-521, 2018
  • The goal of sufficient dimension reduction (SDR) is to replace the original p-dimensional predictors with a lower-dimensional, linearly transformed predictor. Sliced inverse regression (SIR) (Li, Journal of the American Statistical Association, 86, 316-342, 1991) is one of the most popular SDR methods because of its applicability and simple implementation in practice. However, SIR may yield different dimension reduction results for different numbers of slices, which, despite its popularity, is a clear deficit. To overcome this, fused sliced inverse regression was recently proposed. That study shows that the dimension-reduced predictors are robust to the number of slices, but it does not investigate how robust the dimension determination is. This paper suggests a permutation dimension determination for fused sliced inverse regression, which is compared with SIR to investigate robustness to the number of slices in dimension determination. Numerical studies confirm the robustness, and a real data example is presented.
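
Permutation-based dimension determination generally works as follows: the test statistic aggregates the kernel's smallest eigenvalues, and the null hypothesis "dimension = m" is mimicked by permuting the predictor coordinates beyond the first m estimated directions. The sketch below follows this generic recipe from the SDR literature, not necessarily the paper's exact procedure.

```python
# Generic permutation p-value for H0: structural dimension = m.
import numpy as np

def perm_dimension_pvalue(Z, y, kernel_fn, m, n_perm=200, seed=None):
    """kernel_fn(Z, y) -> p x p SIR-type kernel on standardized Z."""
    rng = np.random.default_rng(seed)
    n, p = Z.shape
    lam, V = np.linalg.eigh(kernel_fn(Z, y))
    lam, V = lam[::-1], V[:, ::-1]              # descending order
    stat = n * lam[m:].sum()                    # signal beyond m directions
    C = Z @ V                                   # eigen-basis coordinates
    exceed = 0
    for _ in range(n_perm):
        Cp = C.copy()
        Cp[:, m:] = Cp[rng.permutation(n), m:]  # break the link beyond m
        lam_p = np.linalg.eigvalsh(kernel_fn(Cp @ V.T, y))[::-1]
        exceed += (n * lam_p[m:].sum()) >= stat
    return exceed / n_perm
```

In use, m is increased from 0 until the p-value is no longer small; for the fused version, `kernel_fn` would return a sum of SIR kernels over several slice numbers, which is exactly where the robustness question arises.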

Evaluation of mental and physical load using inverse regression on sinus arrhythmia scores

  • Lee, Dhong-H.;Park, Kyung-S.
    • Journal of the Ergonomics Society of Korea, v.6 no.1, pp.3-8, 1987
  • This paper develops a statistical model that estimates the mental and physical loads of light work from sinus arrhythmia (SA) scores. During the experiments, various levels of mental and physical load (scored by information processing and finger tapping rates, respectively) were imposed on subjects, and SA scores were measured. Two methods were used to develop the workload estimation model. One is the algebraic inverse function of a multivariate regression equation in which mental and physical loads are the independent variables and SA scores are the dependent variables. The other is statistical multivariate inverse regression. Of the two methods, the inverse function resulted in a larger mean square error in predicting mental and physical loads. Hence, the inverse regression model is recommended for precise workload estimation.

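The two calibration routes compared in the abstract above can be reproduced in a few lines: fit the forward regression and invert it algebraically, versus regress the loads directly on the SA scores. The data below are simulated, since the original measurements are not available here, and the linear map is an arbitrary stand-in.

```python
# Synthetic contrast of "inverse function" vs "inverse regression".
import numpy as np

rng = np.random.default_rng(0)
n = 200
load = rng.uniform(0, 1, size=(n, 2))            # (mental, physical) load
A = np.array([[1.5, -0.8], [0.4, 1.1]])          # toy true mapping
sa = load @ A.T + 0.1 * rng.standard_normal((n, 2))   # two SA scores

add_icpt = lambda M: np.hstack([np.ones((len(M), 1)), M])

# Route 1: regress SA on load, then invert the fitted linear map.
B, *_ = np.linalg.lstsq(add_icpt(load), sa, rcond=None)
icpt, slope = B[0], B[1:]
load_hat1 = (sa - icpt) @ np.linalg.inv(slope)

# Route 2: inverse regression -- regress load directly on SA.
C, *_ = np.linalg.lstsq(add_icpt(sa), load, rcond=None)
load_hat2 = add_icpt(sa) @ C

for name, est in [("inverse function  ", load_hat1),
                  ("inverse regression", load_hat2)]:
    print(name, "MSE:", np.mean((est - load) ** 2))
```

On data like these, the inverse regression route typically yields the smaller prediction error, matching the paper's recommendation, because it minimizes error in the quantity actually being predicted.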

Variable Selection in Sliced Inverse Regression Using Generalized Eigenvalue Problem with Penalties

  • Park, Chong-Sun
    • Communications for Statistical Applications and Methods, v.14 no.1, pp.215-227, 2007
  • A variable selection algorithm for sliced inverse regression (SIR) using penalty functions is proposed. We note that SIR models can be expressed as generalized eigenvalue decompositions and incorporate penalty functions into them. We found from a small simulation that the HARD penalty function seems to be the best at preserving the original directions compared with other well-known penalty functions. It also turned out to be effective in forcing the coefficient estimates of irrelevant predictors to zero in regression analysis. Results from illustrative examples with simulated and real data sets are provided.
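
SIR's generalized eigenvalue formulation, M v = λ Σ v, is the hook for the penalties. Below is a minimal sketch of the HARD-thresholding mechanism applied to the leading eigenvectors; the paper's penalized estimation is iterative and more refined than this, so treat the sketch as conveying the idea only.

```python
# Generalized eigenproblem M v = lambda * Sigma v, then a HARD threshold
# on the loadings: small coefficients are set exactly to zero.
import numpy as np
from scipy.linalg import eigh   # generalized symmetric eigensolver

def hard_thresholded_sir(M, Sigma, d=1, threshold=0.1):
    vals, vecs = eigh(M, Sigma)          # ascending eigenvalues
    V = vecs[:, ::-1][:, :d]             # top-d directions
    V = V / np.abs(V).max(axis=0)        # scale loadings for comparability
    V[np.abs(V) < threshold] = 0.0       # HARD penalty: kill small loadings
    return V
```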

A Short Note on Empirical Penalty Term Study of BIC in K-means Clustering Inverse Regression

  • Ahn, Ji-Hyun;Yoo, Jae-Keun
    • Communications for Statistical Applications and Methods, v.18 no.3, pp.267-275, 2011
  • According to recent studies, the Bayesian information criterion (BIC) has been proposed to determine the structural dimension of the central subspace through sliced inverse regression (SIR) with high-dimensional predictors. The BIC may also be useful in K-means clustering inverse regression (KIR) with high-dimensional predictors. However, the direct application of the BIC to KIR may be problematic, because the slicing scheme in SIR is not the same as the clustering scheme in KIR. In this paper, we present empirical studies of the penalty term of the BIC in KIR to identify the most appropriate one. Numerical studies and a real data analysis are presented.
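
Since choosing the penalty term is the paper's subject, a sketch with a pluggable penalty is apt: slices come from k-means clustering of the response (the KIR idea), and the structural dimension maximizes explained eigenvalue mass minus the penalty. The default penalty below is a generic placeholder, not the paper's recommendation.

```python
# KIR kernel with a BIC-type dimension criterion; penalty is pluggable.
import numpy as np
from sklearn.cluster import KMeans

def kir_dimension(Z, Y, n_clusters=5,
                  penalty=lambda d, n, p: d * np.log(n) / n):
    n, p = Z.shape
    Y2 = Y.reshape(-1, 1) if Y.ndim == 1 else Y
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(Y2)
    M = np.zeros((p, p))
    for c in range(n_clusters):          # cluster plays the role of a slice
        Zc = Z[labels == c]
        M += (len(Zc) / n) * np.outer(Zc.mean(axis=0), Zc.mean(axis=0))
    lam = np.sort(np.linalg.eigvalsh(M))[::-1]
    crit = [lam[:d].sum() - penalty(d, n, p) for d in range(p + 1)]
    return int(np.argmax(crit))          # BIC-type dimension estimate
```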

Fused inverse regression with multi-dimensional responses

  • Cho, Youyoung;Han, Hyoseon;Yoo, Jae Keun
    • Communications for Statistical Applications and Methods, v.28 no.3, pp.267-279, 2021
  • Regression with multi-dimensional responses is quite common nowadays in the so-called big data era. In such regression, to relieve the curse of dimensionality arising from the high dimension of the responses, dimension reduction of the predictors is essential in analysis. Sufficient dimension reduction provides effective tools for this reduction, but there are few sufficient dimension reduction methodologies for multivariate regression. To fill this gap, we newly propose two fused slice-based inverse regression methods. The proposed approaches are robust to the number of clusters or slices and improve estimation over existing methods by fusing many kernel matrices. Numerical studies are presented and compared with existing methods. Real data analysis confirms the practical usefulness of the proposed methods.
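
With a multi-dimensional response, slicing is naturally replaced by clustering of the response vectors, and fusing sums the kernels over a grid of cluster counts, as in the fused SIR sketch earlier. A hedged sketch, where Y is an n × q response matrix and Z the standardized predictors; names are illustrative.

```python
# Fused kernel for multivariate responses: k-means clusters as slices,
# summed over several cluster counts.
import numpy as np
from sklearn.cluster import KMeans

def fused_multiresponse_kernel(Z, Y, cluster_grid=(3, 5, 7, 9)):
    n, p = Z.shape
    M = np.zeros((p, p))
    for k in cluster_grid:
        labels = KMeans(n_clusters=k, n_init=10).fit_predict(Y)
        for c in range(k):
            Zc = Z[labels == c]
            M += (len(Zc) / n) * np.outer(Zc.mean(axis=0), Zc.mean(axis=0))
    return M      # leading eigenvectors give the fused directions
```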

A random forest-regression-based inverse-modeling evolutionary algorithm using uniform reference points

  • Gholamnezhad, Pezhman;Broumandnia, Ali;Seydi, Vahid
    • ETRI Journal, v.44 no.5, pp.805-815, 2022
  • Model-based evolutionary algorithms are divided into three groups: estimation of distribution algorithms, inverse modeling, and surrogate modeling. Existing inverse modeling is mainly applied to multi-objective optimization problems and is not suitable for many-objective optimization problems. Some inverse-model techniques have been introduced, such as the inverse model of a multi-objective evolutionary algorithm, which maps from the Pareto front (PF) to the Pareto solutions over the nondominated solutions using a random grouping method and Gaussian processes. However, some of the most efficient inverse models might be eliminated during this procedure. There are also challenges such as the presence of many local PFs and the generation of poor solutions when the population has no evident regularity. This paper proposes inverse modeling using random forest regression and uniform reference points, mapping all nondominated solutions from the objective space to the decision space to solve many-objective optimization problems. The proposed algorithm is evaluated using a benchmark test suite for evolutionary algorithms. The results show an improvement in diversity and convergence performance (quality indicators).
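
The core mapping in the abstract, from objective space back to decision space, is easy to sketch with scikit-learn's RandomForestRegressor, which supports multi-output targets. The front shape, reference points, and dimensions below are toy stand-ins, not the paper's benchmark suite.

```python
# Toy illustration of the objective-to-decision inverse model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
# Nondominated archive: decision vectors and their objective values.
decisions = rng.uniform(0, 1, size=(100, 10))
objectives = np.column_stack([decisions[:, 0],
                              1 - np.sqrt(decisions[:, 0])])  # toy front

# Inverse model: objective space -> decision space (multi-output RF).
inv_model = RandomForestRegressor(n_estimators=200).fit(objectives, decisions)

# Uniformly spread reference points along the toy front, mapped back
# to candidate decision vectors (the "offspring" of the inverse model).
f1 = np.linspace(0, 1, 20)
ref_points = np.column_stack([f1, 1 - np.sqrt(f1)])
offspring = inv_model.predict(ref_points)
print(offspring.shape)   # (20, 10)
```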