• Title/Summary/Keyword: Interval models

Search results: 563

A Study on the Nonlinear Dynamics of PR Interval Variability Using Surrogate Data

  • Lee, J.M.;Park, K.S.;Shin, I.S.
    • Proceedings of the KOSOMBE Conference
    • /
    • v.1996 no.05
    • /
    • pp.27-30
    • /
    • 1996
  • PR interval variability has been proposed as a noninvasive tool for investigating the autonomic nervous system, as well as heart rate variability. The goal of this paper is to determine whether PR interval variability is generated from deterministic nonlinear dynamics. The data used in this study are 24-hour Holter ECGs of 20 healthy adults. We developed an automatic PR interval measurement algorithm and tested it using the MIT ECG databases. General discriminants of nonlinear dynamics, such as correlation dimension and phase space reconstruction, are used. Surrogate data are generated from simpler linear models to have statistical characteristics similar to those of the original data. The nonlinear discriminants are applied to both data sets and compared for any significant differences. It was concluded that PR interval variability shows nonlinear characteristics.

  • PDF
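
The surrogate-data test described in the abstract above can be illustrated with a minimal stdlib-only sketch. This is an assumption-laden toy (hypothetical data; a shuffled surrogate is the simplest surrogate type, whereas the paper generates surrogates from linear models, and a real analysis would use correlation dimension rather than the simple statistic below):

```python
import random

def shuffled_surrogate(x, seed=0):
    """Simplest surrogate series: a random permutation keeps the value
    distribution of the original data but destroys temporal structure."""
    rng = random.Random(seed)
    s = list(x)
    rng.shuffle(s)
    return s

def time_asymmetry(x):
    """A simple nonlinear discriminant: mean cubed first difference.
    It is near zero in expectation for time-reversible (e.g. linear
    Gaussian) series, so a large value hints at nonlinear dynamics."""
    diffs = [(b - a) ** 3 for a, b in zip(x, x[1:])]
    return sum(diffs) / len(diffs)
```

The test compares the discriminant on the original series against its distribution over many surrogates; a value outside that distribution is evidence against the linear null hypothesis.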

Neural Network Modeling for Software Reliability Prediction of Grouped Failure Data (그룹 고장 데이터의 소프트웨어 신뢰성 예측에 관한 신경망 모델)

  • Lee, Sang-Un;Park, Yeong-Mok;Park, Soo-Jin;Park, Jae-Heung
    • The Transactions of the Korea Information Processing Society
    • /
    • v.7 no.12
    • /
    • pp.3821-3828
    • /
    • 2000
  • Many software projects collect grouped failure data (failures in some fixed or variable time interval) rather than individual failure times or failure-count data during the testing or operational phase. This paper presents neural network (NN) modeling that is able to predict cumulative failures at a variable future time for grouped failure data. An NN's predictive ability can be affected by what it learns and by its learning sequence. Eleven training regimes that represent the input-output of the NN are considered. The best training regimes are selected based on the next-step average relative prediction error (AE) and normalized AE (NAE). The suggested NN models are compared with other well-known NN models and statistical software reliability growth models (SRGMs) in order to evaluate performance. Experimental results show that the NN model with variable time interval information is necessary in order to predict cumulative failures in the variable future time interval.

  • PDF
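
The input-output representation for grouped failure data described above can be sketched in a few lines. This is an illustrative helper with hypothetical data (the paper's eleven training regimes differ in exactly which of these quantities are presented to the network):

```python
def to_training_pairs(grouped):
    """Turn grouped failure records (interval length, failures observed
    in that interval) into (cumulative test time, cumulative failures)
    pairs -- the kind of input/output a training regime using variable
    time-interval information would present to the network."""
    pairs, t, n = [], 0.0, 0
    for dt, k in grouped:
        t += dt
        n += k
        pairs.append((t, n))
    return pairs

# Hypothetical grouped data: variable-length test intervals (hours)
# paired with the number of failures observed in each interval.
grouped = [(8, 5), (8, 3), (16, 6), (24, 4)]
```

A network trained on such pairs can then be queried at an arbitrary future cumulative time to predict cumulative failures.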

Asymptotics in Transformed ARMA Models

  • Yeo, In-Kwon
    • Communications for Statistical Applications and Methods
    • /
    • v.18 no.1
    • /
    • pp.71-77
    • /
    • 2011
  • In this paper, asymptotic results are investigated when a parametric transformation is applied to ARMA models. Conditions are determined that ensure the strong consistency and asymptotic normality of the maximum likelihood estimators and the correct coverage probability of the forecast interval obtained by the transformation and back-transformation approach.
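
The transformation and back-transformation approach can be sketched concretely. The snippet below assumes a Box-Cox transformation as a representative parametric family (the paper treats a general transformation): a forecast interval is built on the transformed scale and its endpoints are mapped back, and because the transformation is monotone the interval's coverage probability carries over to the original scale.

```python
import math

def box_cox(y, lam):
    """Box-Cox transformation (lam = 0 gives the log transform)."""
    return math.log(y) if lam == 0 else (y ** lam - 1) / lam

def inv_box_cox(z, lam):
    """Inverse Box-Cox transformation."""
    return math.exp(z) if lam == 0 else (lam * z + 1) ** (1 / lam)

def back_transformed_interval(z_hat, se, lam, z_crit=1.96):
    """95% forecast interval built on the transformed scale
    (point forecast z_hat, standard error se) and mapped back to the
    original scale by applying the inverse transform to the endpoints."""
    lo, hi = z_hat - z_crit * se, z_hat + z_crit * se
    return inv_box_cox(lo, lam), inv_box_cox(hi, lam)
```

Note that back-transforming the endpoints preserves coverage for the interval, even though back-transforming the point forecast alone would introduce bias.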

Investigations into Coarsening Continuous Variables

  • Jeong, Dong-Myeong;Kim, Jay-J.
    • The Korean Journal of Applied Statistics
    • /
    • v.23 no.2
    • /
    • pp.325-333
    • /
    • 2010
  • Protection against disclosure of survey respondents' identifiable and/or sensitive information is a prerequisite for statistical agencies that release microdata files from their sample surveys. Coarsening is one of the popular methods for protecting the confidentiality of the data. Grouped data can be released in the form of microdata or tabular data. Instead of releasing the data in tabular form only, having microdata available to the public with interval codes and their representative values greatly enhances the utility of the data. It allows researchers to compute covariances between the variables, build statistical models, or run a variety of statistical tests on the data. It may be conjectured that the variance of the interval data is lower than that of the ungrouped data, in the sense that the coarsened data do not retain the within-interval variance. This conjecture is investigated using the uniform and triangular distributions. Traditionally, the midpoint is used to represent all the values in an interval. This approach implicitly assumes that the data are uniformly distributed within each interval. However, this assumption may not hold, especially in the last interval of economic data. In this paper, we use three distributional assumptions - uniform, Pareto, and lognormal - in the last interval and use either the midpoint or the median for the other intervals for the wage and food cost variables of Statistics Korea's 2006 Household Income and Expenditure Survey (HIES) data, and compare these approaches in terms of the first two moments.
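
The variance conjecture above can be checked numerically. The sketch below uses hypothetical data and conditional-mean representatives rather than midpoints (a simplifying assumption, not the paper's method): with group means, the law of total variance makes the variance loss exactly the average within-interval variance.

```python
def pop_variance(xs):
    """Population variance (divide by n)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def coarsen_to_group_means(values, edges):
    """Replace each value by the mean of its interval, with intervals
    given by consecutive cut points in `edges`."""
    def bin_of(v):
        for i in range(len(edges) - 1):
            if edges[i] <= v < edges[i + 1]:
                return i
        return len(edges) - 2  # put boundary values in the last bin
    groups = {}
    for v in values:
        groups.setdefault(bin_of(v), []).append(v)
    means = {g: sum(vs) / len(vs) for g, vs in groups.items()}
    return [means[bin_of(v)] for v in values]
```

With midpoint or median representatives the decomposition is only approximate, which is why the choice of representative value in the last, highly skewed interval matters.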

Structural design of Optimized Interval Type-2 FCM Based RBFNN : Focused on Modeling and Pattern Classifier (최적화된 Interval Type-2 FCM based RBFNN 구조 설계 : 모델링과 패턴분류기를 중심으로)

  • Kim, Eun-Hu;Song, Chan-Seok;Oh, Sung-Kwun;Kim, Hyun-Ki
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.66 no.4
    • /
    • pp.692-700
    • /
    • 2017
  • In this paper, we propose the structural design of an Interval Type-2 FCM-based RBFNN. The proposed model consists of three modules: condition, conclusion, and inference parts. In the condition part, Interval Type-2 FCM clustering, which extends FCM clustering, is used. In the conclusion part, the parameter coefficients of the consequent part are estimated through LSE (Least Squares Estimation) and WLSE (Weighted Least Squares Estimation). In the inference part, the final model outputs are acquired by a fuzzy inference method from the linear combination of the consequent polynomials and the activation levels obtained through Interval Type-2 FCM. Additionally, several parameters of the proposed model are identified by using differential evolution. Final model outputs obtained on benchmark data are shown and compared with the performance of previously studied models. The proposed algorithm is evaluated using the Iris and Vehicle data for pattern classification. For validation of regression modeling performance, experiments are carried out using the MPG and Boston Housing data.
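
The condition part can be illustrated with a minimal sketch of how an interval type-2 membership arises. A common construction, assumed here for illustration (the paper's exact design may differ), uses two fuzzifiers m1 < m2 and takes the envelope of the two resulting type-1 FCM memberships:

```python
def fcm_memberships(dists, m):
    """Type-1 FCM membership of one sample to each cluster center,
    given its (nonzero) distances `dists` to the centers."""
    p = 2 / (m - 1)
    return [1.0 / sum((dists[i] / dists[k]) ** p for k in range(len(dists)))
            for i in range(len(dists))]

def interval_type2_memberships(dists, m1=1.5, m2=2.5):
    """Interval [lower, upper] membership obtained as the envelope of
    the type-1 memberships under the two fuzzifiers m1 < m2."""
    u1 = fcm_memberships(dists, m1)
    u2 = fcm_memberships(dists, m2)
    return [(min(a, b), max(a, b)) for a, b in zip(u1, u2)]
```

The width of each resulting interval reflects the uncertainty about the fuzzifier, which the inference part then propagates into the model output.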

Estimation of the Actual Working Time by Interval Linear Regression Models with Constraint Conditions (제약부 구간 선형 회귀모델에 의한 실동시간의 견적)

  • Hwang, S. G.;Seo, Y. J.
    • Journal of the Korean Operations Research and Management Science Society
    • /
    • v.14 no.2
    • /
    • pp.105-114
    • /
    • 1989
  • The actual working time of jobs, in general, differs from their standard time. In this paper, in order to analyze the actual working time of each job in production, we use the total production amount and the necessary total working time. The method for analyzing the actual working time is as follows: we propose an interval regression analysis for obtaining an interval linear regression model with constraint conditions on the interval parameters. The merits of this method are: 1) it is easy to obtain an interval linear model by solving the LP problem to which the formulation of the proposed regression analysis reduces, and 2) it is easy to add constraint conditions on the interval parameters, which encode a sort of expert knowledge. As an application, the actual working time of jobs and the number of workers in a future plan are estimated from real data obtained from the operation of a processing line in a heavy industry company. The result of the proposed method is that the actual working time and the number of workers can be estimated as intervals by the interval regression model.

  • PDF
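
The LP to which the proposed interval regression reduces can be written down explicitly. The following is a Tanaka-style possibilistic sketch consistent with the merits listed above (the paper's exact formulation may differ): for data $(x_j, y_j)$, the interval coefficients have centers $a$ and nonnegative spreads $c$, and every observation must be covered by the model's interval output.

```latex
\min_{a,\; c \ge 0} \;\; \sum_{j=1}^{n} c^{\top}\lvert x_j \rvert
\quad \text{subject to} \quad
a^{\top} x_j + c^{\top}\lvert x_j \rvert \;\ge\; y_j,
\qquad
a^{\top} x_j - c^{\top}\lvert x_j \rvert \;\le\; y_j,
\qquad j = 1, \dots, n.
```

Expert-knowledge constraints on the interval parameters (for example, bounds on individual $a_i$ or $c_i$) are simply additional linear constraints appended to this program.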

First- and Second-best Pricing in Stable Dynamic Models (안정동력학 모형에서 최선 통행료 및 차선 통행료)

  • Park, Koo-Hyun
    • Journal of the Korean Operations Research and Management Science Society
    • /
    • v.34 no.4
    • /
    • pp.123-138
    • /
    • 2009
  • This study examined first- and second-best pricing by stable dynamics in congested transportation networks. Stable dynamics, suggested by Nesterov and de Palma (2003), is a model that describes and provides a stable state of congestion in urban transportation networks. First-best pricing in user equilibrium models brings the user equilibrium to the system equilibrium by tolling the difference between the marginal social cost and the marginal private cost on each link. Nevertheless, second-best pricing, which levies the toll on some, but not all, links, is the relevant one from a practical point of view. In comparison with the user equilibrium model, the stable dynamic model provides a solution equivalent to the system equilibrium when attention is restricted to link flows. Therefore the toll interval on each link that maintains the system equilibrium is more meaningful than a single first-best toll. In addition, second-best pricing in stable dynamic models coincides with first-best pricing, since the toll interval is given separately for each link. As an effect of congestion pricing in stable dynamic models, the inefficiency of a network containing inefficient Braess links can be removed by levying a toll on the Braess links. We present a numerical example applied to a network with 6 nodes and 9 links, including 2 Braess links.
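
The Braess effect mentioned above can be reproduced on the classic four-node textbook instance (not the paper's 6-node, 9-link network): with 4000 drivers and two symmetric routes of cost x/100 + 45, a free shortcut draws every driver onto it and raises everyone's travel time, so tolling or closing the shortcut restores the better equilibrium.

```python
def cost_without_shortcut(demand):
    """Equilibrium without the Braess link: by symmetry, drivers split
    evenly over the two routes, each costing x/100 + 45."""
    x = demand / 2
    return x / 100 + 45

def cost_with_free_shortcut(demand):
    """With a zero-cost shortcut, the route x/100 -> shortcut -> x/100
    is the unique equilibrium choice: with all 4000 drivers on it the
    route costs 80, while deviating to either original route would cost
    4000/100 + 45 = 85, so no driver benefits from switching."""
    return demand / 100 + demand / 100
```

At demand 4000 the shortcut raises the equilibrium travel time from 65 to 80, which is exactly the inefficiency a toll on the Braess link removes.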

Warranty cost analysis for multi-component systems with imperfect repair

  • Park, Minjae
    • International Journal of Reliability and Applications
    • /
    • v.15 no.1
    • /
    • pp.51-64
    • /
    • 2014
  • This paper develops a warranty cost model for complex systems with imperfect repair within a warranty period, addressing the practical case in which the first inter-failure interval is longer than any other inter-failure interval. If repair is imperfect, the product is in its best condition before the first failure. After an imperfect repair, the other inter-failure intervals, which are described by renewal processes, are stochastically smaller than the first inter-failure interval. Based on this idea, we suggest the failure-interval-failure-criterion model, in which we consider two random variables, X and Y, where X represents failure intervals and Y represents the failure criterion. We also obtain the distribution of the number of failures and conduct the warranty cost analysis. We investigate different types of warranty cost models, reliabilities, and other measures for various systems, including series-parallel configurations. Several numerical examples are discussed to demonstrate the applicability of the methodologies derived in the paper.

  • PDF
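
The key idea above, a first inter-failure interval that is stochastically larger than the later ones, can be sketched with a Monte Carlo toy. The exponential gap distributions and the parameter values below are illustrative assumptions, not the paper's model:

```python
import random

def failures_in_warranty(w, first_mean, later_mean, rng):
    """Count failures in the warranty window [0, w] for a delayed
    renewal process: the first gap has a larger mean, and the gaps
    after each imperfect repair have a smaller mean."""
    t = rng.expovariate(1 / first_mean)  # first, longer gap
    n = 0
    while t <= w:
        n += 1
        t += rng.expovariate(1 / later_mean)  # post-repair gaps
    return n

def expected_warranty_cost(w, first_mean, later_mean, cost_per_repair,
                           runs=10000, seed=0):
    """Monte Carlo estimate of the expected warranty cost."""
    rng = random.Random(seed)
    total = sum(failures_in_warranty(w, first_mean, later_mean, rng)
                for _ in range(runs))
    return cost_per_repair * total / runs
```

Replacing the simulated failure count with the derived distribution of the number of failures is what turns this sketch into the closed-form analysis of the paper.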

A Variable Precision Rough Set Model for Interval Data (구간 데이터를 위한 가변정밀도 러프집합 모형)

  • Kim, Kyeong-Taek
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.34 no.2
    • /
    • pp.30-34
    • /
    • 2011
  • Variable precision rough set models have been successfully applied to problems whose domains are discrete values. However, there are many situations where discrete data are not available. For problems with interval values, no variable precision rough set model has been proposed. In this paper, we propose a variable precision rough set model for interval values in which classification errors are allowed when determining whether two intervals are the same. To build the model, we define the equivalence class, upper approximation, lower approximation, and boundary region. Then, we check whether each of the 11 characteristics of approximation that hold in Pawlak's rough set model remains valid for the proposed model.
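
The ingredients above can be sketched in a few lines. The choices below are illustrative assumptions, not the paper's definitions: Jaccard overlap as the interval-sameness measure with an error tolerance, and an inclusion-degree test with precision beta for the approximations:

```python
def overlap_ratio(a, b):
    """Jaccard-style agreement of intervals a = (lo, hi) and b:
    length of intersection over length of union."""
    inter = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
    union = max(a[1], b[1]) - min(a[0], b[0])
    return inter / union if union > 0 else 1.0

def approximations(objs, target, beta=0.8, same=0.5):
    """Variable-precision approximations of `target` (a set of object
    names) over `objs` (name -> interval): x joins the lower
    approximation if at least beta of its tolerance class lies in the
    target, and the upper approximation if more than 1 - beta does."""
    lower, upper = set(), set()
    for x in objs:
        cls = [y for y in objs if overlap_ratio(objs[x], objs[y]) >= same]
        frac = sum(1 for y in cls if y in target) / len(cls)
        if frac >= beta:
            lower.add(x)
        if frac > 1 - beta:
            upper.add(x)
    return lower, upper
```

For beta > 0.5 the lower approximation is contained in the upper one, and the boundary region is their set difference.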

Analysis of Yield Model Using Defect Density Function of DOU (Defects of One Unit) (DOU 결점 밀도분포를 이용한 수율 모형 분석)

  • Choi, Sung-Woon
    • Proceedings of the Safety Management and Science Conference
    • /
    • 2010.11a
    • /
    • pp.551-557
    • /
    • 2010
  • The research proposes hypergeometric, binomial, and Poisson yield models for defectives and defects. The paper also presents hypothesis tests, confidence intervals, and control charts for DPU (Defects Per Unit) and DPO (Defects Per Opportunity). In particular, the study considers the analysis of compound Poisson yield models using various DOU density distributions.

  • PDF
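
The simple Poisson yield model referenced above admits a one-line closed form, shown here with an illustrative DPU computation (a standard result; the paper's compound Poisson models generalize it by mixing over a DOU density):

```python
import math

def dpu(defects, units):
    """Defects Per Unit estimated from inspection counts."""
    return defects / units

def poisson_yield(dpu_value):
    """First-time yield under a Poisson defect model: the probability
    that a unit carries zero defects, P(X = 0) = exp(-DPU)."""
    return math.exp(-dpu_value)
```

For example, 25 defects found across 100 units give DPU = 0.25 and a predicted first-time yield of exp(-0.25), about 77.9%.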