Time Discretization of the Nonlinear System with Variable Time-delayed Input using a Taylor Series Expansion

  • Choi, Hyung-Jo (Department of Mechatronics Engineering, Chonbuk National University)
  • Chong, Kil-To (Department of Electronics & Information Engineering, Chonbuk National University)
  • Published: 2005.06.02

Abstract

This paper proposes a new discretization method for nonlinear systems based on a Taylor series expansion and the zero-order hold (ZOH) assumption. The method is applied to the sampled-data representation of a nonlinear system with an input time delay. The delayed input is time-varying, its amplitude is bounded, and the maximum delay is assumed to be two sampling periods. The mathematical expressions of the discretization method are presented, and the performance of the algorithm is evaluated on several examples. A 'hybrid' discretization scheme, which combines the 'scaling and squaring' technique with the Taylor method, is also proposed for use under very low sampling rates. Computer simulations show that the proposed algorithm accurately discretizes a nonlinear system with a variable time-delayed input.
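As a rough illustration of the kind of technique the abstract describes, the sketch below builds the one-step map for a scalar system x' = f(x, u) from a truncated Taylor series under a zero-order hold. It is a minimal sketch of the standard Taylor-Lie discretization under stated assumptions, not the paper's exact formulation; the function name taylor_discretize and the example system are illustrative only.

```python
# Minimal sketch of Taylor-series discretization under a zero-order hold:
#   x_{k+1} ~= x_k + sum_{l=1}^{N} A_l(x_k, u_k) * T^l / l!,
# with A_1 = f and the Lie-derivative recursion A_{l+1} = (dA_l/dx) * f.
# Names and the example system are illustrative, not from the paper.
import sympy as sp

x, u, T = sp.symbols('x u T')

def taylor_discretize(f, order=3):
    """Return the order-N Taylor one-step map for x' = f(x, u),
    with u held constant over the sampling period T (ZOH)."""
    A = f          # A_1 = f(x, u)
    step = x       # start from x_k
    for l in range(1, order + 1):
        step += A * T**l / sp.factorial(l)
        A = sp.diff(A, x) * f  # A_{l+1} = (dA_l/dx) * f
    return sp.expand(step)

# Example: bilinear system x' = -x + x*u
print(taylor_discretize(-x + x*u, order=3))
```

For the delayed-input case the abstract considers, a delay of up to two sampling periods means the held input can switch value inside a sampling interval, so the one-step map would presumably be composed piecewise over the sub-intervals on which the delayed input is constant; the 'hybrid' variant would, in the spirit of scaling and squaring, evaluate the Taylor map at a reduced step T/2^s and compose it 2^s times when the sampling rate is very low.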

Keywords