• Title/Summary/Keyword: Projection Pursuit type statistics


BOOTSTRAP TESTS FOR THE EQUALITY OF DISTRIBUTIONS

  • Ping, Jing
    • Journal of applied mathematics & informatics
    • /
    • v.7 no.2
    • /
    • pp.467-482
    • /
    • 2000
  • Testing the equality of two and of k distributions has long been an interesting issue in statistical inference. To overcome the sparseness of data points in high-dimensional space and to deal with general cases, we suggest several projection pursuit type statistics. Some results on the limiting distributions of the statistics are obtained, and some properties of the bootstrap approximation are investigated. Furthermore, for computational reasons, an approximation of the statistics based on the number-theoretic method is applied. Several simulation experiments are performed.
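The abstract's exact statistics are not given here, so the following is only a minimal sketch of the general projection-pursuit idea it describes: project both high-dimensional samples onto many one-dimensional directions, take the worst-case two-sample Kolmogorov-Smirnov statistic over those directions, and calibrate it by resampling. The function names and all parameter choices (random directions, permutation resampling) are illustrative assumptions, not the paper's method.

```python
import numpy as np

def ks_stat(x, y):
    # Two-sample Kolmogorov-Smirnov statistic for 1-d samples:
    # max gap between the two empirical CDFs.
    allv = np.sort(np.concatenate([x, y]))
    cdf_x = np.searchsorted(np.sort(x), allv, side="right") / len(x)
    cdf_y = np.searchsorted(np.sort(y), allv, side="right") / len(y)
    return np.max(np.abs(cdf_x - cdf_y))

def projection_test(X, Y, n_dirs=50, n_boot=200, seed=0):
    # Projection-pursuit style test (illustrative sketch): the statistic
    # is the max KS distance over random unit directions, and its null
    # distribution is approximated by permuting the pooled sample.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    dirs = rng.normal(size=(n_dirs, d))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)

    def max_ks(A, B):
        return max(ks_stat(A @ u, B @ u) for u in dirs)

    T_obs = max_ks(X, Y)
    Z = np.vstack([X, Y])
    n = len(X)
    count = 0
    for _ in range(n_boot):
        perm = rng.permutation(len(Z))
        if max_ks(Z[perm[:n]], Z[perm[n:]]) >= T_obs:
            count += 1
    return T_obs, (count + 1) / (n_boot + 1)
```

Projecting to one dimension sidesteps the sparseness of points in high-dimensional space, which is exactly the motivation the abstract states.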

ON TESTING FOR HOMOGENEITY OF THE COVARIANCE MATRICES

  • Zhang, Xiao-Ning;Jing, Ping;Ji, Xiao-Ming
    • Journal of applied mathematics & informatics
    • /
    • v.8 no.2
    • /
    • pp.361-370
    • /
    • 2001
  • Testing the equality of the covariance matrices of k populations has long been an interesting issue in statistical inference. To overcome the sparseness of data points in high-dimensional space and to deal with general cases, we suggest several projection pursuit type statistics. Some results on the limiting distributions of the statistics are obtained, and some properties of the bootstrap approximation are investigated. Furthermore, for computational reasons, an approximation of the statistics based on the number-theoretic method is adopted. Several simulation experiments are performed.
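For covariance homogeneity the same projection trick reduces a matrix comparison to comparing variances of one-dimensional projections. The sketch below is an illustrative assumption, not the paper's statistic: it takes, over candidate directions, the largest spread of log-variances across the k groups, and calibrates it by permutation resampling.

```python
import numpy as np

def cov_homogeneity_stat(samples, dirs):
    # For each direction u, Cov equality implies equal variances of the
    # projections S @ u; the statistic is the worst-case (max over dirs)
    # range of log-variances across the k groups.
    stats = []
    for u in dirs:
        logv = [np.log(np.var(S @ u, ddof=1)) for S in samples]
        stats.append(max(logv) - min(logv))
    return max(stats)

def cov_permutation_test(samples, n_dirs=30, n_boot=200, seed=0):
    # Illustrative permutation approximation of the null distribution
    # (assumes the groups share a common mean for the sketch).
    rng = np.random.default_rng(seed)
    d = samples[0].shape[1]
    dirs = rng.normal(size=(n_dirs, d))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    T_obs = cov_homogeneity_stat(samples, dirs)
    sizes = [len(S) for S in samples]
    Z = np.vstack(samples)
    count = 0
    for _ in range(n_boot):
        perm = rng.permutation(len(Z))
        parts, start = [], 0
        for n in sizes:
            parts.append(Z[perm[start:start + n]])
            start += n
        if cov_homogeneity_stat(parts, dirs) >= T_obs:
            count += 1
    return T_obs, (count + 1) / (n_boot + 1)
```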

A study on high dimensional large-scale data visualization (고차원 대용량 자료의 시각화에 대한 고찰)

  • Lee, Eun-Kyung;Hwang, Nayoung;Lee, Yoondong
    • The Korean Journal of Applied Statistics
    • /
    • v.29 no.6
    • /
    • pp.1061-1075
    • /
    • 2016
  • In this paper, we discuss various methods to visualize high-dimensional large-scale data and review some issues associated with visualizing this type of data. High-dimensional data can be presented in a 2-dimensional space with a few selected important variables. We can visualize more variables through various aesthetic attributes of the graphics, or use the projection pursuit method to find an interesting low-dimensional view. For large-scale data, we discuss jittering and alpha blending, which alleviate the problem of overlapping points. We also review the R packages tabplot and scagnostics, as well as other R packages for interactive web-based visualization.
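Jittering, as reviewed in this paper, is simple enough to sketch: add small uniform noise to each coordinate so that coincident points separate visually. This is a minimal illustration in Python/NumPy (the paper's own examples are in R); the default noise amount, 1% of the data range, is a common heuristic and an assumption here, not the paper's choice.

```python
import numpy as np

def jitter(x, amount=None, seed=0):
    # Perturb each value by Uniform(-amount, amount) so overlapping
    # points become distinguishable in a scatter plot.
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    if amount is None:
        span = x.max() - x.min()
        amount = 0.01 * span if span > 0 else 0.01  # heuristic default
    return x + rng.uniform(-amount, amount, size=x.shape)
```

Alpha blending is the complementary fix: drawing each point semi-transparently (e.g. `plt.scatter(jitter(x), jitter(y), alpha=0.2)` in matplotlib) so dense regions appear darker instead of as a solid blob.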

LMS and LTS-type Alternatives to Classical Principal Component Analysis

  • Huh, Myung-Hoe;Lee, Yong-Goo
    • Communications for Statistical Applications and Methods
    • /
    • v.13 no.2
    • /
    • pp.233-241
    • /
    • 2006
  • Classical principal component analysis (PCA) can be formulated as finding the linear subspace that best accommodates multidimensional data points, in the sense that the sum of squared residual distances is minimized. As alternatives to such an LS (least squares) fitting approach, we produce LMS (least median of squares) and LTS (least trimmed squares)-type PCA by minimizing the median of squared residual distances and the trimmed sum of squares, in a fashion similar to Rousseeuw's (1984) alternative approaches to LS linear regression. The proposed methods adopt the data-driven optimization algorithm of Croux and Ruiz-Gazen (1996, 2005), which is conceptually simple and computationally practical. Numerical examples are given.
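The Croux and Ruiz-Gazen idea cited above can be sketched for the first component: instead of searching all directions, use the centered observations themselves as candidate directions and score each by a robust measure of projected spread. The sketch below uses the MAD as that robust scale; the paper instead plugs in LMS/LTS-type criteria, so treat this only as an illustration of the data-driven search, not the authors' estimator.

```python
import numpy as np

def first_robust_pc(X):
    # Croux & Ruiz-Gazen style projection pursuit (sketch):
    # candidate directions = normalized, median-centered observations;
    # pick the one maximizing a robust scale (here, MAD) of projections.
    X = np.asarray(X, dtype=float)
    Xc = X - np.median(X, axis=0)
    norms = np.linalg.norm(Xc, axis=1)
    mask = norms > 1e-12
    cands = Xc[mask] / norms[mask][:, None]

    def mad(z):
        return np.median(np.abs(z - np.median(z)))

    scores = [mad(Xc @ u) for u in cands]
    return cands[int(np.argmax(scores))]
```

Restricting the search to n data-driven candidates is what makes the approach "computationally practical": the cost is O(n^2 p) rather than an optimization over the whole unit sphere.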

A Study on Automatic Learning of Weight Decay Neural Network (가중치감소 신경망의 자동학습에 관한 연구)

  • Hwang, Chang-Ha;Na, Eun-Young;Seok, Kyung-Ha
    • Journal of the Korean Data and Information Science Society
    • /
    • v.12 no.2
    • /
    • pp.1-10
    • /
    • 2001
  • Neural networks are increasingly being seen as an addition to the statistics toolkit, to be considered alongside both classical and modern statistical methods. Neural networks are usually useful for classification and function estimation. In this paper we concentrate on function estimation using neural networks with a weight decay factor. The use of weight decay seems both to help the optimization process and to avoid overfitting. In this type of neural network, deciding the number of hidden nodes, the weight decay parameter, and the number of learning iterations is very important; this is called the optimization of weight decay neural networks. In this paper we propose an automatic optimization based on genetic algorithms. Moreover, we compare the weight decay neural network learned by the automatic optimization with an ordinary neural network, projection pursuit regression, and support vector machines.
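Weight decay itself is easy to make concrete: it adds an L2 penalty on the weights, so each gradient step shrinks the weights toward zero in addition to reducing the fitting error. The sketch below trains a one-hidden-layer regression net by full-batch gradient descent with that penalty; the architecture, learning rate, and decay value are illustrative assumptions, and the genetic-algorithm hyperparameter search described in the abstract is omitted.

```python
import numpy as np

def train_wd_net(X, y, n_hidden=8, decay=1e-3, lr=0.05, epochs=2000, seed=0):
    # One-hidden-layer tanh network for regression, trained on
    # loss = MSE + decay * ||W||^2 (weight decay on both layers).
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)              # hidden activations
        err = (H @ W2 + b2) - y[:, None]      # residuals
        # Gradients of MSE plus the weight-decay term (decay * W).
        gW2 = H.T @ err / len(X) + decay * W2
        gb2 = err.mean(axis=0)
        dH = (err @ W2.T) * (1.0 - H ** 2)    # backprop through tanh
        gW1 = X.T @ dH / len(X) + decay * W1
        gb1 = dH.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    mse = float(np.mean((pred - y[:, None]) ** 2))
    return (W1, b1, W2, b2), mse
```

The decay term biases the fit toward smoother functions, which is why the abstract notes it both aids optimization and guards against overfitting.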
