• Title/Summary/Keyword: Markov transition probability

A study of guiding probability applied markov-chain (Markov 연쇄를 적용한 확률지도연구)

  • Lee Tae-Gyu
    • The Mathematical Education
    • /
    • v.25 no.1
    • /
    • pp.1-8
    • /
    • 1986
  • A Markov chain is commonly described as a special case of a stochastic process, namely a discrete-time process whose transition probabilities do not change. There are two ways to present transition probabilities in matrix form: the first arranges them in a rectangular array; the second shows their magnitudes and directions in a transition diagram. In this paper, I examine the existence of the transition probability matrix of a Markov chain applied to such problems. This is not only fundamental to the study of chains but can also be applied to irregular problems involving changes of flow and to statistical facts, with the expectation of use as a model of air expansion in physics.
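
As a minimal illustration of the matrix representation mentioned above, the Python sketch below builds a transition matrix as a rectangular array for a hypothetical three-state chain (the states and numbers are illustrative, not from the paper), checks the row-sum property every transition matrix must satisfy, and forms an n-step transition matrix by matrix powers.

```python
import numpy as np

# Hypothetical 3-state Markov chain; rows are "from" states, columns are "to" states.
states = ["sunny", "cloudy", "rainy"]
P = np.array([
    [0.7, 0.2, 0.1],   # transitions out of "sunny"
    [0.3, 0.4, 0.3],   # transitions out of "cloudy"
    [0.2, 0.5, 0.3],   # transitions out of "rainy"
])

# Every row of a transition matrix must sum to 1 (a stochastic matrix).
assert np.allclose(P.sum(axis=1), 1.0)

# The n-step transition probabilities are given by the matrix power P^n.
P3 = np.linalg.matrix_power(P, 3)
print(P3[0])  # distribution after 3 steps, starting from "sunny"
```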

SOME LIMIT PROPERTIES OF RANDOM TRANSITION PROBABILITY FOR SECOND-ORDER NONHOMOGENEOUS MARKOV CHAINS ON GENERALIZED GAMBLING SYSTEM INDEXED BY A DOUBLE ROOTED TREE

  • Wang, Kangkang;Zong, Decai
    • Journal of applied mathematics & informatics
    • /
    • v.30 no.3_4
    • /
    • pp.541-553
    • /
    • 2012
  • In this paper, we study some limit properties of the harmonic mean of the random transition probabilities of a second-order nonhomogeneous Markov chain on the generalized gambling system indexed by a tree, by constructing a nonnegative martingale. As a corollary, we obtain the corresponding property of the harmonic mean and the arithmetic mean of the random transition probabilities of a second-order nonhomogeneous Markov chain indexed by a double rooted tree.
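
For orientation only (a hedged restatement in my own notation, not the paper's exact definitions): along a realized path $x_0, x_1, \ldots, x_n$ of a second-order chain, the arithmetic and harmonic means of the random transition probabilities take the form

$$A_n = \frac{1}{n-1}\sum_{k=2}^{n} P\bigl(X_k = x_k \mid X_{k-1} = x_{k-1}, X_{k-2} = x_{k-2}\bigr), \qquad
H_n = \frac{n-1}{\displaystyle\sum_{k=2}^{n} P\bigl(X_k = x_k \mid X_{k-1} = x_{k-1}, X_{k-2} = x_{k-2}\bigr)^{-1}},$$

and in the nonhomogeneous, tree-indexed setting of the paper the sum runs over the vertices of the double rooted tree with level-dependent transition kernels.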

The Bus Delay Time Prediction Using Markov Chain (Markov Chain을 이용한 버스지체시간 예측)

  • Lee, Seung-Hun;Moon, Byeong-Sup;Park, Bum-Jin
    • The Journal of The Korea Institute of Intelligent Transport Systems
    • /
    • v.8 no.3
    • /
    • pp.1-10
    • /
    • 2009
  • Bus delay time arises from traffic conditions and is an important factor in predicting bus arrival time. In this paper, transition probability matrices between bus stops are constructed using a Markov chain, and bus delay time is predicted with them. The results confirm, through a paired-samples t-test, that the assumption of identical transition probabilities between stops is acceptable, and, by using bus interval (headway) times, the study overcomes the limitation of existing studies for the case where no scheduled arrival time is available for each stop. It is therefore possible to predict bus arrival time with a Markov chain.
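
A minimal sketch of the core step described above, under the simplifying assumption the paper tests (the same transition matrix applies between all stop pairs); the delay states and run data below are hypothetical, not the paper's.

```python
import numpy as np

# Hypothetical delay states between consecutive stops: 0 = early, 1 = on time, 2 = late.
# Each row is one bus run, recorded as its delay state at successive stops.
runs = [
    [1, 1, 2, 2, 1],
    [0, 1, 1, 2, 2],
    [1, 2, 2, 2, 1],
]

n_states = 3
counts = np.zeros((n_states, n_states))
for run in runs:
    for a, b in zip(run[:-1], run[1:]):
        counts[a, b] += 1          # count observed stop-to-stop transitions

P = counts / counts.sum(axis=1, keepdims=True)   # row-normalize into a transition matrix

# Predicted distribution of the delay state two stops ahead, given "late" now.
print(np.linalg.matrix_power(P, 2)[2])
```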

LIMIT THEOREMS FOR MARKOV PROCESSES GENERATED BY ITERATIONS OF RANDOM MAPS

  • Lee, Oe-Sook
    • Journal of the Korean Mathematical Society
    • /
    • v.33 no.4
    • /
    • pp.983-992
    • /
    • 1996
  • Let $p(x, dy)$ be a transition probability function on $(S, \rho)$, where $S$ is a complete separable metric space. Then a Markov process $X_n$ which has $p(x, dy)$ as its transition probability may be generated by random iterations of the form $X_{n+1} = f(X_n, \varepsilon_{n+1})$, where $\{\varepsilon_n\}$ is a sequence of independent and identically distributed random variables (see, e.g., Kifer (1986), Bhattacharya and Waymire (1990)).
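
The iteration scheme in the abstract can be simulated directly; the sketch below uses an illustrative affine map $f(x, \varepsilon) = a x + \varepsilon$ with i.i.d. Gaussian noise (my choice of $f$, not the paper's), which generates a Markov process whose transition probability plays the role of $p(x, dy)$.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x, eps, a=0.5):
    """One illustrative random map: an affine contraction plus noise."""
    return a * x + eps

# Generate X_{n+1} = f(X_n, eps_{n+1}) with eps_n i.i.d. N(0, 1).
n_steps = 10_000
X = np.empty(n_steps)
X[0] = 0.0
for n in range(n_steps - 1):
    X[n + 1] = f(X[n], rng.standard_normal())

# For |a| < 1 this chain has a stationary distribution, here N(0, 1 / (1 - a**2)).
print(X[1000:].mean(), X[1000:].var())
```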

TWO-SIDED ESTIMATES FOR TRANSITION PROBABILITIES OF SYMMETRIC MARKOV CHAINS ON ℤd

  • Zhi-He Chen
    • Journal of the Korean Mathematical Society
    • /
    • v.60 no.3
    • /
    • pp.537-564
    • /
    • 2023
  • In this paper, we are mainly concerned with two-sided estimates for transition probabilities of symmetric Markov chains on $\mathbb{Z}^d$ whose one-step transition probability is comparable to $|x - y|^{-d}\phi_j(|x - y|)^{-1}$, with $\phi_j$ being a positive regularly varying function on $[1, \infty)$ with index $\alpha \in [2, \infty)$. For upper bounds, we directly apply the comparison idea and the Davies method, which considerably improves the existing arguments in the literature; for lower bounds, the relation with the corresponding continuous-time symmetric Markov chains is fully used. In particular, our results answer one open question mentioned in the paper by Murugan and Saloff-Coste (2015).
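
For readability, the comparability condition quoted above can be displayed as (a restatement only; the two-sided heat kernel bounds themselves are the paper's results and are not reproduced here)

$$p(x, y) \asymp \frac{1}{|x - y|^{d}\,\phi_j(|x - y|)}, \qquad x \neq y \in \mathbb{Z}^d,$$

where $\phi_j$ is positive and regularly varying on $[1, \infty)$ with index $\alpha \in [2, \infty)$; the model case to keep in mind is $\phi_j(r) = r^{\alpha}$.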

A computation method of reliability for preprocessing filters in the fire control system using Markov process and state transition probability matrix (Markov process 및 상태천이확률 행렬 계산을 통한 사격통제장치 전처리필터 신뢰성 산출 기법)

  • Kim, Jae-Hun;Lyou, Joon
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.2 no.2
    • /
    • pp.131-139
    • /
    • 1999
  • An easy and efficient method is proposed for computing the reliability of the preprocessing filters in a fire control system when the sensor data are frequently unreliable depending on the operating environment. The filter states are modeled as a Markov process, the false-alarm and detection probabilities of each filter state are computed under a given sensor failure probability, and the state transition probability matrix is then obtained. It is shown that two important indices, the state probability distribution and the error variance, can easily be derived for a reliability assessment of the given sensor fusion system.
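
The paper's filter and sensor model is not reproduced here; the sketch below only shows the generic final step implied by the abstract, computing the stationary state probability distribution of a given state transition probability matrix. The matrix values are hypothetical placeholders.

```python
import numpy as np

# Hypothetical 3-state filter model transition matrix (rows sum to 1); in the paper
# the entries depend on sensor failure, false-alarm and detection probabilities.
P = np.array([
    [0.90, 0.08, 0.02],
    [0.20, 0.70, 0.10],
    [0.10, 0.30, 0.60],
])

# The stationary distribution pi solves pi @ P = pi with pi summing to 1,
# i.e. it is the left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()
print(pi)
```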

Performance Analysis of Wireless Communication System with FSMC Model in Nakagami-m Fading Channel (Nakagami-m 페이딩 채널에서 FSMC 모델에 의한 무선 통신시스템의 성능 분석)

  • 조용범;노재성;조성준
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.8 no.5
    • /
    • pp.1010-1019
    • /
    • 2004
  • In this paper, we represent the Nakagami-m fading channel as a Finite-State Markov Channel (FSMC) and analyze the performance of a wireless communication system under varying fading channel conditions. In the FSMC model, the received SNR is divided into finite intervals, and these intervals form the states of a Markov chain. Each state is modeled by a BSC, and the transition probabilities depend on the physical characteristics of the channel. The steady-state probability and average symbol error rate of each state, together with the transition probabilities, are derived by numerical analysis, and the FSMC model is formed from these values. We found that various fading channels can be represented by an FSMC by changing the state-transition index. In a fast-fading environment, in which the state-transition index is large, the channel can be viewed as an i.i.d. channel; on the contrary, in a slow-fading channel, where the state-transition index is small, the channel can be represented by a simple FSMC model in which transitions occur only between adjacent states. We also applied the proposed FSMC model to analyze the coding gain of a random error-correcting code on various fading channels via computer simulation.
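
A minimal sketch of the state-partition step described above: under Nakagami-m fading the instantaneous SNR is gamma distributed, so the steady-state probability of each FSMC state is the probability that the SNR falls in that state's interval. The thresholds, m, and average SNR below are illustrative; the transition probabilities in the paper additionally involve level-crossing rates, which are not reproduced here.

```python
import numpy as np
from scipy.special import gammainc   # regularized lower incomplete gamma function

m = 2.0          # Nakagami-m fading parameter (illustrative)
avg_snr = 10.0   # average received SNR, linear scale (illustrative)

def snr_cdf(x):
    """CDF of the instantaneous SNR under Nakagami-m fading (gamma distributed)."""
    return gammainc(m, m * np.asarray(x, dtype=float) / avg_snr)

# Partition the SNR axis into intervals; each interval is one FSMC state.
edges = np.array([0.0, 2.0, 5.0, 10.0, 20.0])   # finite interval edges (illustrative)
cdf = np.append(snr_cdf(edges), 1.0)            # CDF at the edges, plus its value at infinity

# Steady-state probability of each state = P(SNR falls in that interval).
steady_state = np.diff(cdf)
print(steady_state, steady_state.sum())
```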

Bayesian Conjugate Analysis for Transition Probabilities of Non-Homogeneous Markov Chain: A Survey

  • Sung, Minje
    • Communications for Statistical Applications and Methods
    • /
    • v.21 no.2
    • /
    • pp.135-145
    • /
    • 2014
  • The present study surveys a Bayesian modeling structure for inference about the transition probabilities of a Markov chain. The motivation for the study came from data showing the transitional behavior of emotionally disturbed children undergoing a residential treatment program. A Dirichlet distribution was used as the prior for the multinomial distribution. The analysis with real data was implemented in the WinBUGS programming environment. The performance of the model was compared with that of alternative approaches.
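
The conjugate structure described above can be sketched without WinBUGS: each row of the transition matrix is a multinomial parameter with a Dirichlet prior, so its posterior is again Dirichlet with the prior parameters incremented by the observed transition counts. The prior values and counts below are illustrative, not the paper's data.

```python
import numpy as np

n_states = 3
alpha_prior = np.ones((n_states, n_states))   # Dirichlet(1, ..., 1) prior for each row

# Hypothetical observed transition counts (row i -> column j).
counts = np.array([
    [12,  3,  1],
    [ 4, 20,  6],
    [ 2,  5,  9],
])

# Conjugacy: the posterior of row i is Dirichlet(alpha_prior[i] + counts[i]).
alpha_post = alpha_prior + counts

# Posterior mean of the transition probabilities.
P_post_mean = alpha_post / alpha_post.sum(axis=1, keepdims=True)

# Posterior draws of the whole transition matrix, e.g. for credible intervals.
rng = np.random.default_rng(1)
samples = np.stack([rng.dirichlet(row, size=1000) for row in alpha_post], axis=1)
print(P_post_mean)
```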

Forecasting Probability of Precipitation Using Markov Logistic Regression Model

  • Park, Jeong-Soo;Kim, Yun-Seon
    • Communications for Statistical Applications and Methods
    • /
    • v.14 no.1
    • /
    • pp.1-9
    • /
    • 2007
  • A three-state Markov logistic regression model is suggested to forecast the probability of tomorrow's precipitation based on the current meteorological situation. The suggested model turns out to be better than a Markov regression model in terms of the mean squared forecasting error for rainfall data from the Seoul area.
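
A minimal sketch of the modeling idea (my reading of it, on synthetic data with a single simplified covariate): a Markov logistic regression treats tomorrow's precipitation state as a categorical response whose predictors include today's state, so the fitted model yields state-dependent transition probabilities.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

# Synthetic data: today's precipitation state (0 = dry, 1 = light, 2 = heavy)
# and one illustrative meteorological covariate (relative humidity, %).
today_state = rng.integers(0, 3, size=n)
humidity = rng.uniform(20, 100, size=n)
tomorrow_state = np.clip(
    today_state + (humidity > 70).astype(int) - rng.integers(0, 2, size=n), 0, 2
)

# Design matrix: dummy-coded today's state plus the covariate.
X = np.column_stack([today_state == 1, today_state == 2, humidity]).astype(float)

# Multinomial (softmax) logistic regression over the three states.
model = LogisticRegression(max_iter=1000).fit(X, tomorrow_state)

# Forecast probabilities of tomorrow's three states given today's situation
# (today heavy precipitation, humidity 85%).
print(model.predict_proba([[0.0, 1.0, 85.0]]))
```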

On Weak Convergence of Some Rescaled Transition Probabilities of a Higher Order Stationary Markov Chain

  • Yun, Seok-Hoon
    • Journal of the Korean Statistical Society
    • /
    • v.25 no.3
    • /
    • pp.313-336
    • /
    • 1996
  • In this paper we consider weak convergence of some rescaled transition probabilities of a real-valued, $k$-th order ($k \geq 1$) stationary Markov chain. Under the assumption that the joint distribution of $k + 1$ consecutive variables belongs to the domain of attraction of a multivariate extreme value distribution, the paper gives a sufficient condition for the weak convergence and characterizes the limiting distribution via the multivariate extreme value distribution.
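
For reference, the domain-of-attraction assumption can be written in the standard form (the paper's exact normalization may differ): with $F$ the joint distribution of $k + 1$ consecutive variables, one assumes there exist normalizing vectors $a_n > 0$ and $b_n$ such that

$$F^{\,n}(a_n x + b_n) \longrightarrow G(x) \quad \text{as } n \to \infty,$$

for some multivariate extreme value distribution $G$; the limiting law of the rescaled transition probabilities is then characterized through $G$.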
