• Title/Summary/Keyword: Markov chain model


Development of Daily Rainfall Simulation Model Based on Homogeneous Hidden Markov Chain (동질성 Hidden Markov Chain 모형을 이용한 일강수량 모의기법 개발)

  • Kwon, Hyun-Han;Kim, Tae Jeong;Hwang, Seok-Hwan;Kim, Tae-Woong
    • KSCE Journal of Civil and Environmental Engineering Research / v.33 no.5 / pp.1861-1870 / 2013
  • Climate change-driven increases in hydrological variability have been widely acknowledged over the past decades. In this regard, rainfall simulation techniques are being applied in many countries to account for the increased variability. This study proposed a homogeneous Hidden Markov Chain (HMM) model designed to recognize rather complex rainfall patterns with discrete hidden states and underlying distribution characteristics represented by a mixture probability density function. The proposed approach was applied to the Seoul and Jeonju stations to verify the model's performance. Statistical moments (e.g., mean, variance, skewness, and kurtosis) derived from daily and seasonal rainfall were compared with observations. The proposed HMM showed better performance in reproducing the underlying distribution characteristics; in particular, it reproduced extremes much better than the existing Markov chain model. The proposed HMM could therefore be used to provide inputs for evaluating long-term runoff and design floods.
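To make the abstract's idea concrete, here is a minimal sketch of a rainfall simulator driven by discrete hidden states whose wet-day amounts follow state-dependent distributions. The two-state chain, the gamma emissions, and all parameter values are illustrative assumptions, not the authors' fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical homogeneous HMM: hidden state 0 = "dry regime", 1 = "wet regime".
P = np.array([[0.8, 0.2],          # hidden-state transition matrix (illustrative)
              [0.4, 0.6]])
p_wet_day = np.array([0.1, 0.7])   # P(rain > 0 | hidden state)
gamma_shape = np.array([0.7, 1.2]) # per-state gamma parameters for wet-day amounts
gamma_scale = np.array([4.0, 12.0])

def simulate_daily_rainfall(n_days, state=0):
    """Simulate a daily rainfall series from the hidden-state chain."""
    rain = np.zeros(n_days)
    for t in range(n_days):
        state = rng.choice(2, p=P[state])        # evolve the hidden state
        if rng.random() < p_wet_day[state]:      # wet/dry occurrence
            rain[t] = rng.gamma(gamma_shape[state], gamma_scale[state])
    return rain

series = simulate_daily_rainfall(365 * 30)
print(series.mean(), series.var())  # moments to compare against observations
```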

Development of Statistical Downscaling Model Using Nonstationary Markov Chain (비정상성 Markov Chain Model을 이용한 통계학적 Downscaling 기법 개발)

  • Kwon, Hyun-Han;Kim, Byung-Sik
    • Journal of Korea Water Resources Association / v.42 no.3 / pp.213-225 / 2009
  • A stationary Markov chain model is a stochastic process with the Markov property: given the present state, future states are independent of the past states. The Markov chain model has been widely used as a main tool in water resources design. A main assumption of the stationary Markov model is that its statistical properties remain the same for all times; hence, the stationary Markov chain model basically cannot account for changes in the mean or variance. A primary objective of this study is therefore to develop a model that can make use of exogenous variables. Regression-based link functions are employed to dynamically update the model parameters given the exogenous variables, and the model parameters are estimated by canonical correlation analysis. The proposed model is applied to the daily rainfall series at Seoul station, comprising 46 years of data from 1961 to 2006. The model is able to reproduce daily and seasonal characteristics simultaneously. Therefore, the proposed model can be used as a short- or mid-term prediction tool if elaborate GCM forecasts are used as predictors. The nonstationary Markov chain model can also be applied to climate change studies if GCM-based climate change scenarios are provided as inputs.
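The abstract says transition parameters are updated dynamically through regression-based link functions driven by exogenous variables. The sketch below shows one common formulation of that idea, a logistic link on the wet/dry transition probabilities of a two-state chain; the coefficients and the predictor series are placeholders, and the authors' actual canonical-correlation estimation is not reproduced here.

```python
import numpy as np

def transition_probs(x, beta):
    """Logistic link: map an exogenous predictor x (e.g. a GCM forecast anomaly)
    to the transition probabilities of a wet/dry Markov chain.
    beta is a (2, 2) array of (intercept, slope) coefficients per previous state."""
    logits = beta[:, 0] + beta[:, 1] * x
    p = 1.0 / (1.0 + np.exp(-logits))
    return np.array([[1 - p[0], p[0]],   # dry -> {dry, wet}
                     [1 - p[1], p[1]]])  # wet -> {dry, wet}

rng = np.random.default_rng(1)
beta = np.array([[-2.0, 0.8],            # illustrative coefficients, not fitted values
                 [ 0.5, 0.6]])

state, occurrence = 0, []
for x_t in rng.normal(size=365):         # stand-in exogenous series
    P_t = transition_probs(x_t, beta)    # parameters change from day to day
    state = rng.choice(2, p=P_t[state])
    occurrence.append(state)
```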

A Study of Image Target Tracking Using ITS in an Occluding Environment (표적이 일시적으로 가려지는 환경에서 ITS 기법을 이용한 영상 표적 추적 알고리듬 연구)

  • Kim, Yong;Song, Taek-Lyul
    • Journal of Institute of Control, Robotics and Systems / v.19 no.4 / pp.306-314 / 2013
  • Automatic tracking in a cluttered environment requires track initiation and maintenance, and the track existence probability of a true track is maintained by the Markov Chain Two model of target existence propagation. Unlike the Markov Chain One model, the Markov Chain Two model comprises three hypotheses about the target existence event: the target exists and is detectable, the target exists but is non-detectable because of occlusion, and the target does not exist and is non-detectable. In this paper we present a multi-scan single-target tracking algorithm based on target existence, called the Integrated Track Splitting (ITS) algorithm with the Markov Chain Two model, for imaging sensors.
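As a small illustration of the three-hypothesis existence model described above, the sketch propagates an existence probability vector between scans with a 3x3 transition matrix. The matrix entries and the initial vector are illustrative only and are not the paper's ITS parameters.

```python
import numpy as np

# Hypotheses: 0 = exists & detectable, 1 = exists & occluded (non-detectable),
#             2 = does not exist. Transition values are illustrative only.
PI = np.array([[0.90, 0.08, 0.02],
               [0.30, 0.65, 0.05],
               [0.00, 0.00, 1.00]])

def propagate_existence(p, n_scans=1):
    """Propagate the target-existence probability vector over n scans."""
    for _ in range(n_scans):
        p = p @ PI
    return p

p0 = np.array([0.7, 0.2, 0.1])
print(propagate_existence(p0, n_scans=5))
```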

The Probabilistic Analysis of Fatigue Damage Accumulation Behavior Using Markov Chain Model in CFRP Composites (Markov Chain Model을 이용한 CFRP 복합재료의 피로손상누적거동에 대한 확률적 해석)

  • Kim, Do-Sik;Kim, In-Bai;Kim, Jung-Kyu
    • Transactions of the Korean Society of Mechanical Engineers A / v.20 no.4 / pp.1241-1250 / 1996
  • The fatigue cumulative damage characteristics and fatigue life of 8-harness satin woven CFRP composites with a circular hole under constant-amplitude and 2-level block loading are estimated by a stochastic Markov chain model. It is found in this study that the fatigue damage accumulation behavior is very random and that, under constant-amplitude fatigue loading, the fatigue damage accumulates in two regions. Under constant-amplitude loading, the mean number of cycles to a specified damage state predicted by the Markov chain model shows good agreement with the test results, and the predicted distribution of the fatigue cumulative damage is similar to the test results. The fatigue life predictions under 2-level block loading by the revised Markov chain model fit the test results better than those of the 2-parameter Weibull distribution function using the percent failure rule.
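Damage-accumulation Markov chain models of this kind are typically formulated as a unidirectional chain over discrete damage states with an absorbing failure state, from which the mean number of cycles to a given damage state follows directly. The sketch below shows that generic computation with illustrative stay probabilities; it is not the paper's fitted model.

```python
import numpy as np

# Unidirectional damage chain: in each cycle block the specimen either stays in
# its damage state or advances to the next one (illustrative probabilities).
r = np.array([0.995, 0.990, 0.980, 0.950])  # stay probability per transient state
n = len(r)
P = np.zeros((n + 1, n + 1))
for i, ri in enumerate(r):
    P[i, i], P[i, i + 1] = ri, 1 - ri
P[n, n] = 1.0                               # absorbing failure state

# Mean cycles to absorption from the fundamental matrix N = (I - Q)^-1.
Q = P[:n, :n]
N = np.linalg.inv(np.eye(n) - Q)
mean_cycles = N.sum(axis=1)
print(mean_cycles[0])  # expected number of steps from the undamaged state
```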

Study on Demand Estimation of Agricultural Machinery by Using Logistic Curve Function and Markov Chain Model (로지스틱함수법 및 Markov 전이모형법을 이용한 농업기계의 수요예측에 관한 연구)

  • Yun Y. D.
    • Journal of Biosystems Engineering / v.29 no.5 s.106 / pp.441-450 / 2004
  • This study was performed to estimate mid- and long-term demand for tractors, rice transplanters, combines, and grain dryers by using a logistic curve function and a Markov chain model. A field survey was done to determine some parameters for the logistic curve function and the Markov chain model. The ceiling values of tractors and combines for the logistic curve function analysis were 209,280 and 85,607, respectively. Based on the logistic curve function analysis, the total number of tractors increased slightly during the period analysed, and new demand for combines was found to be zero. The Markov chain analysis was carried out with two scenarios. Under scenario 1 (rice price down 10% and the current government support policy), new demand for tractors decreased gradually to 700 units in the year 2012; for combines, new demand was zero. Regardless of the scenario, replacement demand increased slightly after 2003 and then decreased after a certain time. The logistic curve function and Markov chain model analyses showed similar trends in the increase and decrease of the total numbers of tractors and combines; however, the difference in the numbers of tractors and combines between the two analyses grew as time passed.
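A rough sketch of the two ingredients named in the abstract: a logistic growth curve for the total stock (new demand is its year-over-year increase) and a simple age-class transition view of scrappage that generates replacement demand. All parameters except the tractor ceiling quoted in the abstract are invented for illustration, and the paper's actual Markov transition specification may differ.

```python
import numpy as np

def logistic_total(t, K, a, b):
    """Logistic curve for the total machine stock in year t (ceiling K)."""
    return K / (1.0 + a * np.exp(-b * t))

# Illustrative parameters (the paper's fitted ceiling for tractors was 209,280).
years = np.arange(15)
stock = logistic_total(years, K=209_280, a=0.5, b=0.15)
new_demand = np.diff(stock)                 # growth of the stock = new demand

# Age-class transition view of replacement: each year a machine either stays in
# service or is scrapped and replaced (illustrative scrap probabilities).
scrap = np.array([0.02, 0.05, 0.12, 0.30])  # scrap probability by age class
by_age = np.array([60_000.0, 50_000.0, 40_000.0, 30_000.0])
replacement_demand = float(by_age @ scrap)
print(new_demand[:3], replacement_demand)
```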

Sensitivity of Conditions for Lumping Finite Markov Chains

  • Suh, Moon-Taek
    • Journal of the Military Operations Research Society of Korea / v.11 no.1 / pp.111-129 / 1985
  • Markov chains with large transition probability matrices occur in many applications such as manpower models. Under certain conditions the state space of a stationary discrete-parameter finite Markov chain may be partitioned into subsets, each of which may be treated as a single state of a smaller chain that retains the Markov property. Such a chain is said to be 'lumpable' and the resulting lumped chain is a special case of more general functions of Markov chains. There are several reasons why one might wish to lump. First, there may be analytical benefits, including relative simplicity of the reduced model and development of a new model which inherits known or assumed strong properties of the original model (the Markov property). Second, there may be statistical benefits, such as increased robustness of the smaller chain as well as improved estimates of transition probabilities. Finally, the identification of lumps may provide new insights about the process under investigation.
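The condition referred to here is strong lumpability: for every block of the partition, the probability of jumping into that block must be the same for all states within any given block. A minimal numerical check of that condition (the example matrix and partition are invented for illustration):

```python
import numpy as np

def is_lumpable(P, partition, tol=1e-9):
    """Check strong lumpability: for each block B and each state i, the row sum
    sum_{j in B} P[i, j] must depend only on the block containing i."""
    for block in partition:
        into_block = P[:, block].sum(axis=1)     # P(i -> block) for every state i
        for src in partition:
            if np.ptp(into_block[src]) > tol:    # must be constant within src block
                return False
    return True

P = np.array([[0.5, 0.3, 0.2],
              [0.4, 0.4, 0.2],
              [0.1, 0.1, 0.8]])
print(is_lumpable(P, [[0, 1], [2]]))   # states 0 and 1 can be lumped here
```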


A Markov Chain Representation of Statistical Process Monitoring Procedure under an ARIMA(0,1,1) Model (ARIMA(0,1,1)모형에서 통계적 공정탐색절차의 MARKOV연쇄 표현)

  • 박창순
    • The Korean Journal of Applied Statistics / v.16 no.1 / pp.71-85 / 2003
  • In the economic design of a process control procedure where quality is measured at certain time intervals, the procedure's properties are difficult to derive because of the discreteness of the measurement intervals. In this paper a Markov chain representation of the process monitoring procedure is developed and used to derive its properties when the process follows an ARIMA(0,1,1) model, which is designed to describe the effects of noise and of a special cause in the process cycle. The properties of the Markov chain depend on the transition matrix, which is determined by the control procedure and the process distribution. The derived Markov chain representation can be adapted to many different types of control procedures and process distributions by obtaining the corresponding transition matrix.
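In a Markov chain representation of a monitoring procedure, the chart statistic is discretized into states and run-length properties follow from the transition matrix restricted to the non-signal states. The sketch below computes the average run length in that generic way; the matrix is a made-up discretization, not the ARIMA(0,1,1)-specific matrix derived in the paper.

```python
import numpy as np

def average_run_length(Q):
    """Average run length from the Markov chain representation of a control chart:
    Q is the transition matrix over the in-control (non-signal) states, and
    ARL is the first entry of (I - Q)^-1 applied to a vector of ones."""
    n = Q.shape[0]
    arl = np.linalg.solve(np.eye(n) - Q, np.ones(n))
    return arl[0]

# Illustrative 3-state discretization of the monitoring statistic; the signal
# state is absorbing and therefore omitted from Q.
Q = np.array([[0.90, 0.07, 0.01],
              [0.10, 0.80, 0.05],
              [0.02, 0.10, 0.70]])
print(average_run_length(Q))
```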

A Probabilistic Analysis for Fatigue Cumulative Damage and Fatigue Life in CFRP Composites Containing a Circular Hole (원공을 가진 CFRP 복합재료의 피로누적손상 및 피로수명에 대한 확률적 해석)

  • 김정규;김도식
    • Transactions of the Korean Society of Mechanical Engineers / v.19 no.8 / pp.1915-1926 / 1995
  • The fatigue characteristics of 8-harness satin woven CFRP composites with a circular hole are experimentally investigated under constant-amplitude tension-tension loading. It is found in this study that the fatigue damage accumulation behavior is very random and history-independent, and that the fatigue cumulative damage is linearly related to the mean number of cycles to a specified damage state. From these results, the fatigue characteristics of CFRP composites are shown to satisfy the basic assumptions of Markov chain theory, and the parameters of the Markov chain model can be determined from only the mean and variance of the fatigue lives. The distribution of fatigue cumulative damage predicted by the Markov chain model shows good agreement with the test results. For the fatigue life distribution, the Markov chain model achieves accuracy similar to that of the 2-parameter Weibull distribution function.
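One standard way to determine the parameters of such a chain from only the mean and variance of fatigue lives is the method of moments for a stationary chain with identical damage states, where the time to failure is a sum of geometric sojourn times. The sketch below shows that calculation under that assumption; it is my illustration of the general idea, not necessarily the authors' exact estimator, and the inputs are invented.

```python
def fit_stationary_chain(mean_life, var_life):
    """Method-of-moments sketch: with k identical damage states, each left with
    probability (1 - r) per cycle block, the life is a sum of k geometric
    variables with mean k/(1-r) and variance k*r/(1-r)**2. Invert those two
    moment equations for r and k."""
    r = var_life / (mean_life + var_life)
    k = max(int(round(mean_life * (1.0 - r))), 1)
    return r, k

# Illustrative sample moments of fatigue life (cycles), not test data.
r, k = fit_stationary_chain(mean_life=1.2e5, var_life=4.0e8)
print(r, k)   # per-block stay probability and number of damage states
```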

Bayesian Analysis of Binary Non-homogeneous Markov Chain with Two Different Time Dependent Structures

  • Sung, Min-Je
    • Management Science and Financial Engineering / v.12 no.2 / pp.19-35 / 2006
  • We use a hierarchical Bayesian approach to describe the transition probabilities of a binary nonhomogeneous Markov chain. The Markov chain is used to describe the transition behavior of emotionally disturbed children in a treatment program. The effects of covariates on the transition probabilities are assessed using a logit link function. To describe the time evolution of the transition probabilities, we consider two modeling strategies: the first is based on the concept of exchangeability, whereas the second is based on a first-order Markov property. The deviance information criterion (DIC) is used to compare the models with the two different time-dependent structures. Inference is carried out using Markov chain Monte Carlo techniques. The developed methodology is applied to real data.
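The likelihood piece implied by the abstract, a binary chain whose transition probabilities depend on covariates through a logit link, can be sketched as below. The coefficient values, covariate series, and observed sequence are placeholders, and the hierarchical priors and MCMC sampler of the paper are not reproduced.

```python
import numpy as np

def transition_prob(x_t, beta):
    """Logit link for a binary chain: element s is P(state 1 at time t | previous
    state s, covariate x_t). beta[s] = (intercept, slope) for previous state s."""
    logits = beta[:, 0] + beta[:, 1] * x_t
    return 1.0 / (1.0 + np.exp(-logits))

def log_likelihood(states, X, beta):
    """Log-likelihood of an observed binary sequence under the logit-link chain."""
    ll = 0.0
    for t in range(1, len(states)):
        p1 = transition_prob(X[t], beta)[states[t - 1]]
        ll += np.log(p1 if states[t] == 1 else 1.0 - p1)
    return ll

beta = np.array([[-1.0, 0.5], [0.8, 0.3]])   # illustrative coefficients
states = np.array([0, 0, 1, 1, 0, 1])        # illustrative observed sequence
X = np.linspace(-1, 1, len(states))          # illustrative covariate series
print(log_likelihood(states, X, beta))
```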

A study on Classification of Insider threat using Markov Chain Model

  • Kim, Dong-Wook;Hong, Sung-Sam;Han, Myung-Mook
    • KSII Transactions on Internet and Information Systems (TIIS) / v.12 no.4 / pp.1887-1898 / 2018
  • In this paper, a method to classify insider threat activity is introduced. Insider threat detection identifies anomalous activity in the procedures performed by users in an organization; when a value deviating from the overall behavior appears, it is classified as an insider threat. To address this, a Markov chain model is employed. The Markov chain model predicts the next state through a random variable conditioned on the previous event; likewise, a user's current activity can be predicted from the previous activity. We define the changes of state by transition probabilities and detect insider threat anomalies through the values of the resulting probability variables. We use the properties of Markov chains to represent the behavior of a user over time and to classify which state it belongs to. Sequential data sets were generated according to the influence of n occurrences of the Markov attribute and classified with machine learning algorithms. In the experiment, only 15% of the CERT insider threat dataset was used, and the classifiers achieved 97% accuracy except for Naive Bayes. As a result of our research, it was confirmed that the Markov chain model can classify insider threats and can be fully utilized for user behavior classification.
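A minimal sketch of the core mechanism described above: estimate a transition matrix over user actions from normal activity sequences, then score new sequences by their average transition log-probability, with unusually low scores flagged as anomalous. The action vocabulary, smoothing, and example sequences are hypothetical and the paper's feature generation and classifiers are not reproduced.

```python
import numpy as np

ACTIONS = ["logon", "file", "email", "http", "logoff"]   # hypothetical action set
IDX = {a: i for i, a in enumerate(ACTIONS)}

def estimate_transitions(sequences, n_states=len(ACTIONS), alpha=1.0):
    """Estimate a transition matrix from action sequences, with add-alpha
    smoothing so unseen transitions keep a nonzero probability."""
    counts = np.full((n_states, n_states), alpha)
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[IDX[a], IDX[b]] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def sequence_score(seq, P):
    """Average log-probability of a sequence; unusually low scores flag anomalies."""
    logs = [np.log(P[IDX[a], IDX[b]]) for a, b in zip(seq, seq[1:])]
    return float(np.mean(logs))

normal = [["logon", "http", "email", "logoff"],
          ["logon", "file", "http", "logoff"]]
P = estimate_transitions(normal)
print(sequence_score(["logon", "file", "file", "email"], P))
```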