Abstract
In this paper, we present simulation results on the phase offset performance, in holdover mode, of a clock that is normally disciplined within a GPS Disciplined Oscillator (GPSDO). In the Time Interval Error (TIE) model, we include a time error term caused by environmental temperature variation, since the frequency offset and drift induced by temperature changes are among the most important contributors to clock phase error. For the simulation, we employ the Maximum Time Interval Error (MTIE) as the performance metric, with the frequency offset and drift estimated by an Unbiased Finite Impulse Response (UFIR) filter with a ladder algorithm. We assume that the noise in the GPS measurement is white Gaussian with zero mean and 1 ns standard deviation, and that the temperature varies linearly with a slope of $1^{\circ}\mathrm{C}$ per hour. From the simulation results, two observations were made. First, with a temperature estimation error of less than 3 % and a temperature compensation period of less than 900 seconds, the CDMA2000 phase synchronization requirement of under 10 $\mu$s can be met for a holdover time of more than 40,000 seconds when an OCXO (Oven Controlled Crystal Oscillator) clock is employed. Second, to meet the LTE-TDD requirement of under 1.5 $\mu$s for a holdover time of more than 10,000 seconds, a temperature estimation error below 3 % and a compensation period below 500 seconds must be maintained when a Rubidium clock is adopted.
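For reference, the sketch below gives common textbook forms of the clock time error and of the MTIE metric named above; the symbols $x_0$, $y_0$, $D$, $K_T$, and $\Delta T$ are illustrative assumptions, and the exact model terms used in the simulation are defined in the body of the paper.

% Illustrative (assumed) standard forms, not necessarily the exact model of this paper:
% x(t): clock time error; MTIE(tau): maximum peak-to-peak TIE over any window of length tau
\begin{align*}
  x(t) &= x_0 + y_0\,t + \tfrac{1}{2}D\,t^2 + \int_0^{t} K_T\,\Delta T(s)\,\mathrm{d}s + v(t),\\
  \mathrm{MTIE}(\tau) &= \max_{0 \le t_0 \le T-\tau}
    \Bigl[\, \max_{t_0 \le t \le t_0+\tau} x(t) \;-\; \min_{t_0 \le t \le t_0+\tau} x(t) \Bigr],
\end{align*}

where $x_0$ is the initial time offset, $y_0$ the fractional frequency offset, $D$ the linear frequency drift, $K_T\,\Delta T(s)$ an illustrative temperature-induced fractional frequency deviation, $v(t)$ the measurement noise, $\tau$ the observation interval, and $T$ the total measurement period.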