• Title/Summary/Keyword: statistical approach

Bayesian Analysis for Multiple Change-point hazard Rate Models

  • Jeong, Kwangmo
    • Communications for Statistical Applications and Methods
    • /
    • v.6 no.3
    • /
    • pp.801-812
    • /
    • 1999
  • Change-point hazard rate models arise, for example, in applying "burn-in" techniques to screen defective items and in studying times until undesirable side effects occur in clinical trials. Sometimes in screening defectives it might be sensible to model two stages of burn-in. In a clinical trial there might be an initial hazard rate for a side effect which, after a period of time, changes to an intermediate hazard rate before settling into a long-term hazard rate. In this paper we consider the multiple change-point hazard rate model. The classical approach's asymptotics can be poor for the small to moderate sample sizes often encountered in practice. We propose a Bayesian approach that avoids asymptotics and provides more reliable inference conditional only upon the data actually observed. The Bayesian models can be fitted using simulation methods, and model comparison is made using recently developed Bayesian model selection criteria. The methodology is applied to generated data and to the Lawless (1982) failure times of electrical insulation.

  • PDF
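
The piecewise-constant hazard underlying a multiple change-point model can be sketched as follows; the change points and rates below are illustrative only, not values from the paper.

```python
import numpy as np

# Illustrative two-change-point hazard: an initial high rate, an
# intermediate rate, and a long-term rate (all values made up).
TAUS = [0.0, 2.0, 5.0]   # segment start times; change points at t=2 and t=5
LAMS = [1.0, 0.3, 0.1]   # hazard rate on each segment

def hazard(t, taus=TAUS, lams=LAMS):
    """Piecewise-constant hazard rate at time t."""
    k = np.searchsorted(taus, t, side="right") - 1
    return lams[k]

def cum_hazard(t, taus=TAUS, lams=LAMS):
    """H(t): integrate the step-function hazard segment by segment."""
    bounds = list(taus) + [float("inf")]
    H = 0.0
    for k, lam in enumerate(lams):
        H += lam * max(0.0, min(t, bounds[k + 1]) - bounds[k])
    return H

def survival(t):
    """S(t) = exp(-H(t)), the probability of surviving past time t."""
    return np.exp(-cum_hazard(t))
```

The likelihood for observed failure times follows from `hazard` and `survival` in the usual way; the paper's Bayesian fit then places priors on the change points and rates.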

DISCRIMINATION OF IN-ORDINAL STATE IN ROOM TEMPERATURE BASED ON STATISTICAL ANALYSIS

  • Takanashi, Ken-ichi;Daisuke Kozeki;Yoshiyuki Matsubara
    • Proceedings of the Korea Institute of Fire Science and Engineering Conference
    • /
    • 1997.11a
    • /
    • pp.484-491
    • /
    • 1997
  • In this paper, an approach to determining the in-ordinal condition of a room, based on multivariate analysis, is proposed. In this approach, the distance of a state from the ordinal condition is evaluated by the Mahalanobis distance. The temperature changes of a room were measured, and their statistical characteristics, such as the distribution type, the mean value, and the standard deviation, are studied. The applicability of the method to fire detection is also investigated.

  • PDF
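
The Mahalanobis-distance screening described above can be sketched in a few lines; the sensor layout and temperature values here are hypothetical stand-ins for the paper's measurements.

```python
import numpy as np

# Hypothetical "ordinal condition" baseline: temperature readings (deg C)
# from three room sensors under normal operation.
rng = np.random.default_rng(0)
normal = rng.normal(loc=[21.0, 21.5, 22.0], scale=0.4, size=(200, 3))

mean = normal.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

def mahalanobis(x):
    """Distance of a sensor state x from the ordinal (normal) condition."""
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

# A reading far above the baseline yields a large distance, which can be
# thresholded as a fire-detection signal.
near = mahalanobis(np.array([21.0, 21.4, 22.1]))   # close to ordinal state
far = mahalanobis(np.array([35.0, 34.0, 36.0]))    # clearly in-ordinal
```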

Abnormal Crowd Behavior Detection Using Heuristic Search and Motion Awareness

  • Usman, Imran;Albesher, Abdulaziz A.
    • International Journal of Computer Science & Network Security
    • /
    • v.21 no.4
    • /
    • pp.131-139
    • /
    • 2021
  • Anomaly detection is currently a primary concern of administrative authorities. Suspicious-activity identification is shifting from a human operator to machine-assisted monitoring, in order to assist the operator and react quickly to unexpected incidents. These automatic surveillance systems face many challenges due to the intrinsically complex characteristics of video sequences and foreground human motion patterns. In this paper, we propose a novel approach to detecting anomalous human activity using a hybrid of a statistical model and Genetic Programming. A feature set of local motion patterns is generated from the video data by a statistical model in an unsupervised way. This feature set is fed to an enhanced Genetic Programming based classifier to classify normal and abnormal patterns. Experiments are performed on publicly available benchmark datasets under different real-life scenarios. Results show that the proposed methodology can detect and locate anomalous activity in real time, with accuracy exceeding that of the existing state of the art in anomalous-activity detection.

Parameter estimation of an extended inverse power Lomax distribution with Type I right censored data

  • Hassan, Amal S.;Nassr, Said G.
    • Communications for Statistical Applications and Methods
    • /
    • v.28 no.2
    • /
    • pp.99-118
    • /
    • 2021
  • In this paper, we introduce an extended form of the inverse power Lomax model via the Marshall-Olkin approach, which we call the Marshall-Olkin inverse power Lomax (MOIPL) distribution. The four-parameter MOIPL distribution is very flexible and contains several existing and new models as special cases. Key properties of the MOIPL distribution are established. Maximum likelihood estimators and approximate confidence intervals are considered under Type I censored samples, and the maximum likelihood estimates are evaluated in a simulation study. Bayesian estimators, as well as Bayesian credible intervals under a symmetric loss function, are obtained via a Markov chain Monte Carlo (MCMC) approach. Finally, the flexibility of the new model is demonstrated on two real data sets; the MOIPL model provides closer fits than several competing models under the selected criteria.
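
The Marshall-Olkin construction used above can be sketched generically: given a base survival function F̄, the extended family has survival Ḡ(x) = αF̄(x) / (1 − (1 − α)F̄(x)). The baseline below is an ordinary Lomax survival used only as a stand-in; the paper's actual baseline is the inverse power Lomax, not reproduced here.

```python
def mo_survival(sbar, alpha):
    """Marshall-Olkin transform of a base survival function sbar:
    G_bar(x) = alpha * F_bar(x) / (1 - (1 - alpha) * F_bar(x))."""
    def g(x):
        f = sbar(x)
        return alpha * f / (1.0 - (1.0 - alpha) * f)
    return g

# Stand-in baseline: an ordinary Lomax survival function (illustrative
# shape/scale), NOT the paper's inverse power Lomax.
def lomax_sf(x, shape=2.0, scale=1.0):
    return (1.0 + x / scale) ** (-shape)

g = mo_survival(lomax_sf, alpha=0.5)
```

Note that α = 1 recovers the baseline exactly, which is why the family contains the original model as a special case.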

Iterative projection of sliced inverse regression with fused approach

  • Han, Hyoseon;Cho, Youyoung;Yoo, Jae Keun
    • Communications for Statistical Applications and Methods
    • /
    • v.28 no.2
    • /
    • pp.205-215
    • /
    • 2021
  • Sufficient dimension reduction is a useful dimension reduction tool in regression, and sliced inverse regression (Li, 1991) is one of the most popular sufficient dimension reduction methodologies. Despite its popularity, it is known to be sensitive to the number of slices. To overcome this shortcoming, the so-called fused sliced inverse regression was proposed by Cook and Zhang (2014). Unfortunately, neither of the two existing methods applies directly to large p-small n regression, where dimension reduction is needed most. In this paper, we propose seeded sliced inverse regression and seeded fused sliced inverse regression to overcome this deficit by adopting an iterative projection approach (Cook et al., 2007). Numerical studies examine their asymptotic estimation behaviors, and real data analysis confirms their practical usefulness in high-dimensional data analysis.
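
For orientation, vanilla sliced inverse regression (the Li, 1991 method the paper builds on, not the seeded or fused variants it proposes) can be sketched as: standardize X, average it within slices of sorted y, and eigendecompose the covariance of the slice means.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Basic sliced inverse regression (Li, 1991)."""
    n, p = X.shape
    mu = X.mean(axis=0)
    # Whiten X with the inverse square root of its covariance.
    vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    Z = (X - mu) @ inv_sqrt
    # Slice on sorted y and accumulate weighted slice-mean outer products.
    slices = np.array_split(np.argsort(y), n_slices)
    M = sum(len(s) / n * np.outer(Z[s].mean(axis=0), Z[s].mean(axis=0))
            for s in slices)
    # Leading eigenvectors of M, mapped back to the original scale.
    w, v = np.linalg.eigh(M)
    return inv_sqrt @ v[:, ::-1][:, :n_dirs]

# Toy check: y depends on X only through its first coordinate.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))
y = X[:, 0] + 0.1 * rng.normal(size=500)
b = sir_directions(X, y).ravel()
b = b / np.linalg.norm(b)   # recovered direction, close to (1, 0, 0, 0)
```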

Methods and Sample Size Effect Evaluation for Wafer Level Statistical Bin Limits Determination with Poisson Distributions (포아송 분포를 가정한 Wafer 수준 Statistical Bin Limits 결정방법과 표본크기 효과에 대한 평가)

  • Park, Sung-Min;Kim, Young-Sig
    • IE interfaces
    • /
    • v.17 no.1
    • /
    • pp.1-12
    • /
    • 2004
  • In the modern semiconductor device manufacturing industry, statistical bin limits on wafer-level test bin data are used to minimize the value added to defective products and to protect end customers from potential quality and reliability excursions. Most wafer-level test bin data show skewed distributions. By Monte Carlo simulation, this paper evaluates methods and the sample size effect in the determination of statistical bin limits. In the simulation, wafer-level test bin data are assumed to follow the Poisson distribution, so typical shapes of the data distribution can be specified in terms of the distribution's parameter. The study examines three methods: 1) a percentile-based method; 2) data transformation; and 3) Poisson model fitting. The mean square error is adopted as the performance measure for each simulation scenario, and a case study is presented. Results show that the percentile- and transformation-based methods give more stable statistical bin limits for the real dataset. However, with highly skewed distributions, the transformation-based method should be used with caution in determining statistical bin limits. When the data are well fitted by a particular probability distribution, the model-fitting approach can be used. As for the sample size effect, the mean square error appears to decrease exponentially with sample size.
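
The percentile-based method above can be sketched as setting the limit at a high percentile of a historical baseline of per-wafer bin counts; the Poisson mean and the 99th-percentile choice below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def percentile_bin_limit(bin_counts, pct=99.0):
    """Percentile-based statistical bin limit: wafers whose test-bin count
    exceeds the pct-th percentile of the baseline are flagged."""
    return float(np.percentile(bin_counts, pct))

# Simulated baseline: per-wafer counts for one fail bin, Poisson-distributed
# as assumed in the paper (mean chosen here for illustration).
rng = np.random.default_rng(7)
baseline = rng.poisson(lam=4.0, size=1000)
limit = percentile_bin_limit(baseline, 99.0)
```

In the paper's simulation, limits computed this way from samples of varying size are compared against the true distribution quantile via mean square error.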

A Maximum Likelihood Approach to Edge Detection (Maximum Likelihood 기법을 이용한 Edge 검출)

  • Cho, Moon;Park, Rae-Hong
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.11 no.1
    • /
    • pp.73-84
    • /
    • 1986
  • A statistical method is proposed which estimates an edge, one of the basic features in image understanding. Conventional edge detection techniques perform well for a deterministic signal, but are not satisfactory for a statistical signal. In this paper, we use the likelihood function, which takes account of the statistical properties of a signal, and derive a decision function from it. We propose a maximum likelihood edge detection technique that estimates the edge point which maximizes this decision function. We apply the technique to statistical signals generated with a random number generator. Simulations show that the statistical edge detection technique gives satisfactory results. The technique is extended to two-dimensional images, where edges are found with good accuracy.

  • PDF
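
A one-dimensional version of maximum-likelihood edge detection can be sketched as follows: for a step edge in Gaussian noise, the ML estimates of the two segment means are the sample averages, so maximizing the likelihood over the edge location reduces to minimizing the pooled residual sum of squares. (This is the standard textbook reduction, not necessarily the exact decision function derived in the paper.)

```python
import numpy as np

def ml_edge(signal):
    """Maximum-likelihood step-edge location for a 1-D signal in Gaussian
    noise: the best split point minimizes the pooled residual sum of
    squares about the two segment means."""
    x = np.asarray(signal, dtype=float)
    best_k, best_rss = None, np.inf
    for k in range(1, len(x)):
        left, right = x[:k], x[k:]
        rss = (((left - left.mean()) ** 2).sum()
               + ((right - right.mean()) ** 2).sum())
        if rss < best_rss:
            best_k, best_rss = k, rss
    return best_k

# Noisy unit step at index 30, generated with a random number generator
# as in the paper's simulations.
rng = np.random.default_rng(3)
sig = np.concatenate([np.zeros(30), np.ones(30)]) + 0.1 * rng.normal(size=60)
```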

Using GA based Input Selection Method for Artificial Neural Network Modeling Application to Bankruptcy Prediction (유전자 알고리즘을 활용한 인공신경망 모형 최적입력변수의 선정 : 부도예측 모형을 중심으로)

  • 홍승현;신경식
    • Proceedings of the Korea Inteligent Information System Society Conference
    • /
    • 1999.10a
    • /
    • pp.365-373
    • /
    • 1999
  • Recently, numerous studies have demonstrated that artificial intelligence techniques such as neural networks can be an alternative methodology for classification problems to which traditional statistical methods have long been applied. In building a neural network model, the selection of independent and dependent variables should be approached with great care and treated as part of the model construction process. Irrespective of the efficiency of a learning procedure in terms of convergence, generalization and stability, the ultimate performance of the estimator depends on the relevance of the selected input variables and the quality of the data used. Approaches developed in statistics, such as correlation analysis and stepwise selection, are often very useful, but they may not be optimal for the development of neural network models. In this paper, we propose a genetic algorithm approach to finding an optimal or near-optimal set of input variables for neural network modeling. The proposed approach is demonstrated by application to bankruptcy prediction modeling. Our experimental results show that this approach significantly increases the overall classification accuracy rate.

  • PDF
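
GA-based input selection of the kind described above can be sketched with binary chromosomes, one bit per candidate input. For brevity the sketch scores each mask with a least-squares fit plus a per-input penalty rather than training a neural network; the data, penalty, and GA settings are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic data: only the first two of eight candidate inputs matter.
X = rng.normal(size=(200, 8))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.1 * rng.normal(size=200)

def fitness(mask):
    """Score a binary input mask: least-squares fit on the selected inputs
    minus a small penalty per input (a stand-in for the neural-network
    fitness the paper would use)."""
    if not mask.any():
        return -np.inf
    Xs = X[:, mask.astype(bool)]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    return -((y - Xs @ beta) ** 2).sum() - 0.5 * mask.sum()

def ga_select(n_feat=8, pop=30, gens=40, p_mut=0.1):
    """Plain GA: tournament selection, uniform crossover, bit-flip mutation."""
    P = rng.integers(0, 2, size=(pop, n_feat))
    for _ in range(gens):
        scores = np.array([fitness(c) for c in P])
        new = []
        for _ in range(pop):
            a, b = rng.integers(0, pop, size=2)
            p1 = P[a] if scores[a] >= scores[b] else P[b]
            a, b = rng.integers(0, pop, size=2)
            p2 = P[a] if scores[a] >= scores[b] else P[b]
            mix = rng.integers(0, 2, size=n_feat)
            child = np.where(mix == 1, p1, p2)
            child = np.where(rng.random(n_feat) < p_mut, 1 - child, child)
            new.append(child)
        P = np.array(new)
    scores = np.array([fitness(c) for c in P])
    return P[scores.argmax()]

best = ga_select()   # mask that should include the two informative inputs
```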

A GA-based Rule Extraction for Bankruptcy Prediction Modeling (유전자 알고리즘을 활용한 부실예측모형의 구축)

  • Shin, Kyung-shik
    • Journal of Intelligence and Information Systems
    • /
    • v.7 no.2
    • /
    • pp.83-93
    • /
    • 2001
  • Prediction of corporate failure using past financial data is a well-documented topic. Early studies of bankruptcy prediction used statistical techniques such as multiple discriminant analysis, logit and probit. Recently, however, numerous studies have demonstrated that artificial intelligence techniques such as neural networks (NNs) can be an alternative methodology for classification problems to which traditional statistical methods have long been applied. Although numerous theoretical and experimental studies have reported the usefulness of neural networks in classification studies, there is a major drawback in building and using such models: the user cannot readily comprehend the final rules that the neural network models acquire. In this study we propose a genetic algorithms (GAs) approach and illustrate how GAs can be applied to corporate failure prediction modeling. An advantage the GA approach offers is that it can extract rules that are easy for users to understand, as in expert systems. The preliminary results show that the rule extraction approach using GAs for bankruptcy prediction modeling is promising.

  • PDF

Model selection algorithm in Gaussian process regression for computer experiments

  • Lee, Youngsaeng;Park, Jeong-Soo
    • Communications for Statistical Applications and Methods
    • /
    • v.24 no.4
    • /
    • pp.383-396
    • /
    • 2017
  • The model in our approach assumes that computer responses are a realization of a Gaussian process superimposed on a regression model, called a Gaussian process regression model (GPRM). Selecting a subset of variables, or building a good reduced model, is an important process in classical regression for identifying variables influential on the responses and for further analysis such as prediction or classification; one reason to select variables for prediction is to prevent over-fitting or under-fitting to the data. The same reasoning and approach apply to the GPRM, yet only a few works on variable selection in the GPRM exist. In this paper, we propose a new algorithm to build a good prediction model among GPRMs. It is a post-processing step for the algorithm that includes the Welch method suggested by previous researchers. The proposed algorithm selects the non-zero regression coefficients (β's) using forward and backward methods along with a Lasso-guided approach; during this process, the covariance parameters (θ's) pre-selected by the Welch algorithm are held fixed. We illustrate the superiority of the proposed models over the Welch method and non-selection models using four test functions and one real data example. Future extensions are also discussed.
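
The GP interpolation at the core of a GPRM can be illustrated with a minimal zero-mean predictor; the regression mean term and the Welch/Lasso selection steps of the paper are omitted, and the squared-exponential kernel and toy response below are assumptions for illustration.

```python
import numpy as np

def sq_exp_kernel(A, B, length=1.0, var=1.0):
    """Squared-exponential covariance, a common choice in GP regression."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / length ** 2)

def gp_predict(Xtr, ytr, Xte, noise=1e-6):
    """Posterior mean of a zero-mean GP conditioned on training data; the
    tiny noise term is numerical jitter for a deterministic computer code."""
    K = sq_exp_kernel(Xtr, Xtr) + noise * np.eye(len(Xtr))
    Ks = sq_exp_kernel(Xte, Xtr)
    return Ks @ np.linalg.solve(K, ytr)

# Toy "computer experiment": deterministic response y = sin(x).
Xtr = np.linspace(0, 2 * np.pi, 12)[:, None]
ytr = np.sin(Xtr).ravel()
Xte = np.array([[1.0], [4.0]])
pred = gp_predict(Xtr, ytr, Xte)   # near sin(1) and sin(4)
```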