Abstract
An equipment replacement policy cannot be defined with certainty, because the physical state of any technological system cannot be foreseen. This paper presents a Markov Decision Process (MDP) model for army equipment that is subject to uncertain deterioration and, ultimately, to failure. The components of the MDP model are defined as follows: i) the state is the age of the equipment; ii) the actions are 'keep' and 'replace'; iii) the cost is the expected cost per unit time associated with the 'keep' and 'replace' actions; iv) the transition probabilities are derived from a Weibull distribution. Using the MDP model, we determine the optimal replacement policy for an army equipment replacement problem.
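The model described above can be sketched with value iteration: states are equipment ages, the actions are 'keep' and 'replace', and the per-period failure probability comes from the Weibull survival function. This is a minimal illustration only; the Weibull shape/scale, cost values, discount factor, and maximum age below are hypothetical placeholders, not the paper's parameters.

```python
import math

# Hypothetical parameters for illustration (not taken from the paper).
BETA, ETA = 2.0, 10.0             # Weibull shape and scale (years)
C_REPLACE, C_FAIL = 100.0, 250.0  # planned replacement cost vs. failure cost
C_OPERATE = 10.0                  # per-period operating cost
GAMMA = 0.95                      # discount factor
MAX_AGE = 30                      # replacement is forced at this age

def survival(t: float) -> float:
    """Weibull survival function S(t) = exp(-(t/eta)^beta)."""
    return math.exp(-((t / ETA) ** BETA))

def fail_prob(age: int) -> float:
    """P(failure in the next period | survived to `age`), from the Weibull hazard."""
    return 1.0 - survival(age + 1) / survival(age)

def value_iteration(eps: float = 1e-9):
    """Return (V, policy): discounted cost-to-go and optimal action per age."""
    V = [0.0] * (MAX_AGE + 1)
    while True:
        V_new, policy = [], []
        for age in range(MAX_AGE + 1):
            # 'replace': pay the replacement cost; the equipment restarts as new (age 0).
            replace = C_REPLACE + GAMMA * V[0]
            if age == MAX_AGE:
                V_new.append(replace)
                policy.append("replace")
                continue
            q = fail_prob(age)
            # 'keep': pay the operating cost; a failure incurs the failure cost
            # and a restart at age 0, otherwise the equipment survives to age + 1.
            keep = (C_OPERATE
                    + q * (C_FAIL + GAMMA * V[0])
                    + (1.0 - q) * GAMMA * V[age + 1])
            if keep <= replace:
                V_new.append(keep)
                policy.append("keep")
            else:
                V_new.append(replace)
                policy.append("replace")
        if max(abs(a - b) for a, b in zip(V, V_new)) < eps:
            return V_new, policy
        V = V_new

V, policy = value_iteration()
threshold = policy.index("replace")  # control-limit age: replace at this age or older
```

Because the Weibull hazard with shape beta > 1 is increasing, the optimal policy here takes a control-limit form: keep the equipment while it is young and replace once it reaches a threshold age.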