• Title/Summary/Keyword: sufficient statistics


A concise overview of principal support vector machines and its generalization

  • Jungmin Shin; Seung Jun Shin
    • Communications for Statistical Applications and Methods, v.31 no.2, pp.235-246, 2024
  • In high-dimensional data analysis, sufficient dimension reduction (SDR) has been considered as an attractive tool for reducing the dimensionality of predictors while preserving regression information. The principal support vector machine (PSVM) (Li et al., 2011) offers a unified approach for both linear and nonlinear SDR. This article comprehensively explores a variety of SDR methods based on the PSVM, which we call principal machines (PM) for SDR. The PM achieves SDR by solving a sequence of convex optimizations akin to popular supervised learning methods, such as the support vector machine, logistic regression, and quantile regression, to name a few. This makes the PM straightforward to handle and extend in both theoretical and computational aspects, as we will see throughout this article.

Tutorial: Methodologies for sufficient dimension reduction in regression

  • Yoo, Jae Keun
    • Communications for Statistical Applications and Methods, v.23 no.2, pp.105-117, 2016
  • In the paper, as a sequel to the first tutorial, we discuss sufficient dimension reduction methodologies used to estimate the central subspace (sliced inverse regression, sliced average variance estimation), the central mean subspace (ordinary least squares, principal Hessian directions, iterative Hessian transformation), and the central $k^{th}$-moment subspace (covariance method). Large-sample tests to determine the structural dimensions of the three target subspaces are derived for most of the methodologies; in addition, a permutation test, which does not require large-sample distributions, is introduced and can be applied to all the methodologies discussed in the paper. Theoretical relationships among the sufficient dimension reduction methodologies are also investigated, and real data analysis is presented for illustration. A seeded dimension reduction approach is then introduced so that the methodologies apply to large p, small n regressions.
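
Sliced inverse regression (SIR), the first method listed in this abstract, fits in a few lines of NumPy. The sketch below is a generic illustration of the textbook algorithm, not code from the tutorial; the function name and the toy model are my own:

```python
import numpy as np

def sir_directions(X, y, n_slices=5, d=1):
    """Minimal SIR sketch: standardize X, slice on y, and take the top
    eigenvectors of the weighted covariance of the slice means."""
    n, p = X.shape
    # Standardize the predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt

    # Weighted outer products of the within-slice means of Z
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Top-d eigenvectors, mapped back to the original predictor scale
    w, v = np.linalg.eigh(M)
    beta = inv_sqrt @ v[:, ::-1][:, :d]
    return beta / np.linalg.norm(beta, axis=0)

# Toy check: y depends on X only through the direction (1, 1, 0, 0, 0)
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5))
y = X[:, 0] + X[:, 1] + 0.1 * rng.normal(size=2000)
b_hat = sir_directions(X, y, n_slices=10, d=1)
```

With a monotone link as above, the estimated column of `b_hat` should align closely with the true direction; symmetric links are the classical failure case that motivates SAVE.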

Tutorial: Dimension reduction in regression with a notion of sufficiency

  • Yoo, Jae Keun
    • Communications for Statistical Applications and Methods, v.23 no.2, pp.93-103, 2016
  • In the paper, we discuss dimension reduction of the predictors $\mathbf{X} \in \mathbb{R}^p$ in a regression of $Y \mid \mathbf{X}$ with a notion of sufficiency called sufficient dimension reduction. In sufficient dimension reduction, the original predictors $\mathbf{X}$ are replaced by a lower-dimensional linear projection without loss of information on selected aspects of the conditional distribution. Depending on the aspects, the central subspace, the central mean subspace and the central $k^{th}$-moment subspace are defined and investigated as the primary objects of interest. The relationships among the three subspaces and their behavior under non-singular transformations of $\mathbf{X}$ are then studied. We discuss two conditions that guarantee the existence of the three subspaces, constraining the marginal distribution of $\mathbf{X}$ and the conditional distribution of $Y \mid \mathbf{X}$, respectively. A general approach to estimating the subspaces is also introduced, along with an explanation of the conditions commonly assumed in most sufficient dimension reduction methodologies.
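
The three target subspaces in this abstract correspond to conditional statements that are standard in the SDR literature. A sketch, using $B$ for a $p \times d$ matrix whose columns span the subspace (my notation, following the general literature rather than the paper verbatim):

```latex
% Central subspace: the whole conditional distribution is preserved
Y \perp\!\!\!\perp \mathbf{X} \mid B^{\top}\mathbf{X}
% Central mean subspace: only the conditional mean is preserved
E(Y \mid \mathbf{X}) = E(Y \mid B^{\top}\mathbf{X})
% Central k-th moment subspace: the first k conditional moments are preserved
E(Y^{j} \mid \mathbf{X}) = E(Y^{j} \mid B^{\top}\mathbf{X}), \qquad j = 1, \ldots, k
```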

On Minimal Sufficient Statistics

  • Nabeya, Seiji
    • Journal of the Korean Statistical Society, v.10, pp.83-90, 1981
  • Let $(X, \mathcal{A})$ be a measurable space, i.e., $X$ is a non-empty set and $\mathcal{A}$ is a $\sigma$-field of subsets of $X$. Let $\Omega$ be a parameter space and $P$ a family of probability measures $P_\theta$, $\theta \in \Omega$, defined on $(X, \mathcal{A})$.


On the Condition of Tightness for Fuzzy Random Variables

  • Joo, Sang-Yeol
    • Proceedings of the Korean Reliability Society Conference, 2002.06a, pp.303-303, 2002
  • We obtain a necessary and sufficient condition for tightness of a sequence of random variables in the space of fuzzy sets with compact support in $\mathbb{R}$.


CONFORMAL HEMI-SLANT SUBMERSION FROM KENMOTSU MANIFOLD

  • Mohammad Shuaib; Tanveer Fatima
    • Honam Mathematical Journal, v.45 no.2, pp.248-268, 2023
  • As a generalization of conformal semi-invariant, conformal slant, and conformal semi-slant submersions, in this paper we study conformal hemi-slant submersions from a Kenmotsu manifold onto a Riemannian manifold. Necessary and sufficient conditions for the distributions to be integrable and totally geodesic are discussed. Moreover, we obtain a sufficient condition for a conformal hemi-slant submersion to be a homothetic map. The condition for the total manifold of the submersion to be a twisted product is studied, followed by other decomposition theorems.

Case study: application of fused sliced average variance estimation to near-infrared spectroscopy of biscuit dough data

  • Um, Hye Yeon; Won, Sungmin; An, Hyoin; Yoo, Jae Keun
    • The Korean Journal of Applied Statistics, v.31 no.6, pp.835-842, 2018
  • The so-called sliced average variance estimation (SAVE) is a popular methodology in the sufficient dimension reduction literature. SAVE is sensitive to the number of slices in practice. To overcome this, a fused SAVE (FSAVE) was recently proposed, combining the kernel matrices obtained from various numbers of slices. In the paper, we consider practical applications of FSAVE to large p, small n data. For this, near-infrared spectroscopy of biscuit dough data is analyzed. In this case study, the usefulness of FSAVE in high-dimensional data analysis is confirmed by showing that the results by FSAVE are superior to existing analysis results.

FORWARD ORDER LAW FOR THE GENERALIZED INVERSES OF MULTIPLE MATRIX PRODUCT

  • Xiong, Zhipin; Zheng, Bing
    • Journal of applied mathematics & informatics, v.25 no.1_2, pp.415-424, 2007
  • Generalized inverses have many important applications in theoretical research and numerical computation, and have therefore been studied by many authors. In this paper we obtain some necessary and sufficient conditions for the forward order law for the {1}-inverse of the multiple matrix product $A = A_1 A_2 \cdots A_n$ by using the maximal rank of the generalized Schur complement.
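
The defining property of a {1}-inverse ($AGA = A$) is easy to check numerically. The snippet below is a generic illustration of why the forward order law needs conditions such as those in this paper; it does not implement the paper's rank criteria, and the test matrices are my own:

```python
import numpy as np

def is_one_inverse(A, G, tol=1e-8):
    """G is a {1}-inverse of A exactly when A @ G @ A == A."""
    return bool(np.allclose(A @ G @ A, A, atol=tol))

rng = np.random.default_rng(1)

# The Moore-Penrose pseudoinverse is always a {1}-inverse,
# even for a rectangular (here 4 x 6) matrix.
A = rng.normal(size=(4, 6))
assert is_one_inverse(A, np.linalg.pinv(A))

# Forward order law: is pinv(A1) @ pinv(A2) a {1}-inverse of A1 @ A2?
# Two well-conditioned but non-commuting square matrices:
A1 = np.eye(4) + 0.1 * rng.normal(size=(4, 4))
A2 = np.eye(4) + 0.1 * rng.normal(size=(4, 4))
fwd_generic = is_one_inverse(A1 @ A2, np.linalg.pinv(A1) @ np.linalg.pinv(A2))

# A special case where the law does hold: A1 = A2 invertible, so that
# pinv(A1) @ pinv(A1) = (A1 @ A1)^{-1}.
fwd_equal = is_one_inverse(A1 @ A1, np.linalg.pinv(A1) @ np.linalg.pinv(A1))
```

For generic non-commuting factors `fwd_generic` is false, while `fwd_equal` holds; characterizing exactly when the forward product qualifies is the subject of the paper's rank conditions.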

Investigating SIR, DOC and SAVE for the Polychotomous Response

  • Lee, Hak-Bae; Lee, Hee-Min
    • Communications for Statistical Applications and Methods, v.19 no.3, pp.501-506, 2012
  • This paper investigates the central subspaces related to SIR, DOC and SAVE when the response takes more than two values. The subspaces constructed by SIR, DOC and SAVE are investigated and compared; the SAVE paradigm is the most comprehensive. In addition, SAVE coincides with the central subspace when the conditional distribution of the predictors given the response is normally distributed.