• Title/Summary/Keyword: Recurrent set

THE RECURRENT HYPERCYCLICITY CRITERION FOR TRANSLATION C0-SEMIGROUPS ON COMPLEX SECTORS

  • Yuxia Liang;Zhi-Yuan Xu;Ze-Hua Zhou
    • Bulletin of the Korean Mathematical Society
    • /
    • v.60 no.2
    • /
    • pp.293-305
    • /
    • 2023
  • Let {Tt}t∈∆ be the translation semigroup with a sector ∆ ⊂ ℂ as index set. The recurrent hypercyclicity criterion (RHCC) for the C0-semigroup {Tt}t∈∆ is established, and equivalent conditions ensuring that {Tt}t∈∆ satisfies the RHCC on weighted spaces of p-integrable functions and of continuous functions are presented. In particular, every chaotic semigroup {Tt}t∈∆ satisfies the RHCC.
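
For orientation, the translation semigroup in question acts by shifting the argument along the sector; a standard formulation (the precise weighted function spaces are as defined in the paper) is

$$ (T_t f)(z) = f(z + t), \qquad z, t \in \Delta, $$

acting, for example, on a weighted space $L^p_\rho(\Delta)$ of $p$-integrable functions with an admissible weight $\rho$; hypercyclicity and recurrence are then formulated with respect to the norm topology of that space.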

THE SET OF RECURRENT POINTS OF A CONTINUOUS SELF-MAP ON AN INTERVAL AND STRONG CHAOS

  • Wang, Lidong;Liao, Gongfu;Chu, Zhenyan;Duan, Xiaodong
    • Journal of applied mathematics & informatics
    • /
    • v.14 no.1_2
    • /
    • pp.277-288
    • /
    • 2004
  • In this paper, we discuss a continuous self-map of an interval and the existence of an uncountable strongly chaotic set. It is proved that if a continuous self-map of an interval has positive topological entropy, then it has an uncountable strongly chaotic set in which every point is recurrent but not almost periodic.
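
For reference, the distinction between recurrence and almost periodicity used above is the standard one. Writing $N(x,U) = \{\, n \ge 1 : f^n(x) \in U \,\}$ for the set of return times of $x$ to a neighborhood $U$,

$$ x \ \text{is recurrent} \iff N(x,U) \ \text{is infinite for every open } U \ni x, $$
$$ x \ \text{is almost periodic} \iff N(x,U) \ \text{is syndetic (has bounded gaps) for every open } U \ni x. $$

So the theorem produces points whose return times are infinite but do not have bounded gaps.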

SHADOWABLE POINTS FOR FINITELY GENERATED GROUP ACTIONS

  • Kim, Sang Jin;Lee, Keonhee
    • Journal of the Chungcheong Mathematical Society
    • /
    • v.31 no.4
    • /
    • pp.411-420
    • /
    • 2018
  • In this paper we introduce the notion of shadowable points for finitely generated group actions on compact metric spaces and prove that the set of shadowable points is an invariant Borel set and that, if the chain recurrent set is contained in the set of shadowable points, then it coincides with the nonwandering set. Moreover, an action $T \in Act(G, X)$ has the shadowing property if and only if every point is shadowable.
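
For context, in the single-map case the definitions run as follows (the paper's version replaces iterates of one map by the action of a finitely generated group, but the shape of the definitions is the same):

$$ d(f(x_i), x_{i+1}) < \delta \ \ (\delta\text{-pseudo-orbit}), \qquad \exists\, y:\ d(f^i(y), x_i) < \varepsilon \ \text{for all } i \ \ (\varepsilon\text{-shadowing}), $$

and a point $x$ is shadowable if for every $\varepsilon > 0$ there is $\delta > 0$ such that every $\delta$-pseudo-orbit with $x_0 = x$ is $\varepsilon$-shadowed. The shadowing property is the case where every point is shadowable, which is the equivalence recorded in the last sentence of the abstract.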

RECURRENT PATTERNS IN DST TIME SERIES

  • Kim, Hee-Jeong;Lee, Dae-Young;Choe, Won-Gyu
    • Journal of Astronomy and Space Sciences
    • /
    • v.20 no.2
    • /
    • pp.101-108
    • /
    • 2003
  • This study reports an approach for classifying magnetic storms into recurrent patterns. A storm event is defined as a local minimum of the Dst index. Analysis of the Dst index for the period 1957 through 2000 demonstrates that a large portion of the storm events can be classified into a set of recurrent patterns. In our approach, the classification is performed by seeking a categorization that minimizes a thermodynamic free energy, defined as the sum of the classification errors and the entropy; the error is calculated as the squared sum of the value differences between events. The classification depends on a noise parameter T that represents the strength of the intrinsic error in the observation and classification process. The classification results would be applicable in space weather forecasting.
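
The free-energy classification described in this abstract is close in spirit to deterministic-annealing clustering: soft assignments trade squared classification error against entropy at a noise level T. Below is a minimal, hypothetical sketch of that idea in Python; the function names, initialization, and synthetic profiles are illustrative and are not the authors' code or data.

```python
import numpy as np

def soft_classify(events, centers, T, n_iter=100):
    """Deterministic-annealing-style soft clustering.

    events  : (N, L) array, each row an event profile around a storm minimum
    centers : (K, L) array of initial pattern prototypes
    T       : noise/temperature parameter trading error against entropy
    """
    for _ in range(n_iter):
        # squared-error "classification error" between events and prototypes
        d = ((events[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)   # (N, K)
        # Gibbs assignment: minimizes <error> - T * entropy for fixed prototypes
        logits = -d / max(T, 1e-12)
        logits -= logits.max(axis=1, keepdims=True)
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)                                   # (N, K)
        # re-estimate prototypes as probability-weighted means
        centers = (p.T @ events) / p.sum(axis=0)[:, None]
    return p, centers

# toy usage with synthetic "Dst-like" profiles (illustrative, not observational data)
rng = np.random.default_rng(0)
events = np.concatenate([rng.normal(-50, 5, (30, 24)), rng.normal(-150, 10, (20, 24))])
p, patterns = soft_classify(events, events[rng.choice(len(events), 3, replace=False)], T=50.0)
```

Lowering T makes the assignments harder and shrinks the entropy term, mirroring the role of the noise parameter described in the abstract.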

EEG Signal Prediction by using State Feedback Real-Time Recurrent Neural Network (상태피드백 실시간 회귀 신경회망을 이용한 EEG 신호 예측)

  • Kim, Taek-Soo
    • The Transactions of the Korean Institute of Electrical Engineers D
    • /
    • v.51 no.1
    • /
    • pp.39-42
    • /
    • 2002
  • To model EEG signals, which have nonstationary and nonlinear dynamic characteristics, this paper proposes a state feedback real-time recurrent neural network model. The network is given a memory structure in the states of its hidden layers, so that it has arbitrary dynamics and can deal with time-varying input through its own temporal operation. For the model test, the Mackey-Glass time series is used as a nonlinear dynamic system, and the model is applied to the prediction of three types of EEG: alpha wave, beta wave, and epileptic EEG. Experimental results show that the proposed model performs better than the other neural network models compared in this paper, in terms of convergence speed during learning and normalized mean square error on the test data set.
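
Since the model is benchmarked on the Mackey-Glass series and scored by normalized mean square error, a small sketch of that test setup may help. This is a generic Euler integration of the Mackey-Glass delay equation, the NMSE metric, and a generic recurrent step with hidden-state feedback; it is not the authors' state-feedback network, and all names are illustrative.

```python
import numpy as np

def mackey_glass(n, tau=17, beta=0.2, gamma=0.1, p=10, dt=1.0, x0=1.2):
    """Euler integration of dx/dt = beta*x(t-tau)/(1 + x(t-tau)^p) - gamma*x(t)."""
    hist = int(tau / dt)
    x = np.full(n + hist, x0)
    for t in range(hist, n + hist - 1):
        x_tau = x[t - hist]
        x[t + 1] = x[t] + dt * (beta * x_tau / (1.0 + x_tau ** p) - gamma * x[t])
    return x[hist:]

def nmse(y_true, y_pred):
    """Normalized mean square error used to score one-step predictions."""
    return np.mean((y_true - y_pred) ** 2) / np.var(y_true)

def recurrent_step(x_t, h_prev, W_in, W_rec, W_out):
    """One step of a generic recurrent predictor: the hidden state feeds back as memory."""
    h_t = np.tanh(W_in @ np.atleast_1d(x_t) + W_rec @ h_prev)
    return W_out @ h_t, h_t

series = mackey_glass(2000)
# naive persistence baseline: predict x(t+1) = x(t)
print("persistence-baseline NMSE:", nmse(series[1:], series[:-1]))
```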

Control of Chaos Dynamics in Jordan Recurrent Neural Networks

  • Jin, Sang-Ho;Kenichi, Abe
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2001.10a
    • /
    • pp.43.1-43
    • /
    • 2001
  • We propose two methods for controlling the Lyapunov exponents of Jordan-type recurrent neural networks. Both methods are formulated as gradient-based learning. The first method is derived strictly from the definition of the Lyapunov exponents, which are represented by the state transitions of the recurrent network. It can control the complete set of exponents, called the Lyapunov spectrum; however, it is computationally expensive because of its inherently recursive way of calculating the changes of the network parameters. This recursive calculation also causes unstable control when at least one of the exponents is positive, such as the largest Lyapunov exponent in recurrent networks with chaotic dynamics. To improve stability in the chaotic situation, we propose a non-recursive formulation by approximating ...
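
The abstract discusses Lyapunov exponents "represented by the state transition" of the recurrent network. As background, here is a minimal sketch of how the largest exponent of a generic recurrent map h_{t+1} = tanh(W h_t + b) can be estimated from the orbit Jacobians; it is not the Jordan-network control law from the paper, and all names and parameter values are illustrative.

```python
import numpy as np

def largest_lyapunov(W, b, h0, n_steps=5000, n_discard=500):
    """Estimate the largest Lyapunov exponent of h_{t+1} = tanh(W h_t + b)
    by evolving a tangent vector with the orbit Jacobians and renormalizing it."""
    h = h0.copy()
    v = np.random.default_rng(1).normal(size=h.shape)
    v /= np.linalg.norm(v)
    acc = 0.0
    for t in range(n_steps):
        h_next = np.tanh(W @ h + b)
        J = (1.0 - h_next ** 2)[:, None] * W   # Jacobian of the tanh map at this step
        v = J @ v
        norm = np.linalg.norm(v)
        v /= norm
        if t >= n_discard:                     # skip the transient before averaging
            acc += np.log(norm)
        h = h_next
    return acc / (n_steps - n_discard)

# illustrative use: a random recurrent weight matrix with gain large enough to be chaotic
rng = np.random.default_rng(0)
n = 20
W = rng.normal(scale=1.5 / np.sqrt(n), size=(n, n))
print("largest Lyapunov exponent ~", largest_lyapunov(W, np.zeros(n), rng.normal(size=n)))
```

A gradient-based controller in the spirit of the abstract would then adjust W so that such an estimate moves toward a target value.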

Bayesian Neural Network with Recurrent Architecture for Time Series Prediction

  • Hong, Chan-Young;Park, Jung-Hun;Yoon, Tae-Sung;Park, Jin-Bae
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2004.08a
    • /
    • pp.631-634
    • /
    • 2004
  • In this paper, a Bayesian recurrent neural network (BRNN) is proposed to predict time series data. Among the various traditional prediction methodologies, neural network methods are considered more effective for non-linear and non-stationary time series data. A neural network predictor requires a proper learning strategy to adjust the network weights, and one needs to account for the non-linear and non-stationary evolution of those weights. The Bayesian neural network in this paper estimates not a single set of weights but the probability distributions of the weights. In other words, we set the weight vector as the state vector of a state-space model and estimate its probability distribution in accordance with Bayesian inference. This approach makes it possible to obtain a more exact estimate of the weights. Moreover, in terms of network architecture, the recurrent feedback structure is known to be superior to the feedforward structure for time series prediction. Therefore, the recurrent network with Bayesian inference, which we call the BRNN, is expected to show higher performance than a normal neural network. To verify the performance of the proposed method, time series data are numerically generated and the neural network predictor is applied to them. As a result, the BRNN is shown to give better prediction results than a common feedforward Bayesian neural network.
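
The abstract describes treating the weight vector as the state vector of a state-space model and estimating its distribution by Bayesian inference. One common concrete realization of that idea is an extended-Kalman-filter style update of the weights; the sketch below assumes that reading, and the function f, the noise levels Q and R, and the numerical Jacobian are illustrative assumptions rather than details from the paper.

```python
import numpy as np

def ekf_weight_update(w, P, x, y, f, Q=1e-4, R=1e-2, eps=1e-5):
    """One Bayesian (EKF-style) update of a network weight vector w with covariance P.

    x is the input, y the observed scalar target, and f(w, x) the network prediction.
    """
    n = len(w)
    P = P + Q * np.eye(n)                               # random-walk time update of the weight state
    # numerical Jacobian of the prediction with respect to the weights
    H = np.array([(f(w + eps * e, x) - f(w, x)) / eps for e in np.eye(n)])
    S = float(H @ P @ H) + R                            # innovation variance
    K = P @ H / S                                       # Kalman gain
    w = w + K * (y - f(w, x))                           # posterior mean of the weights
    P = P - np.outer(K, H @ P)                          # posterior covariance
    return w, P

# illustrative use with a linear "network" f(w, x) = w @ x
rng = np.random.default_rng(0)
w, P = np.zeros(3), np.eye(3)
for _ in range(200):
    x = rng.normal(size=3)
    y = np.array([0.5, -1.0, 2.0]) @ x + 0.1 * rng.normal()
    w, P = ekf_weight_update(w, P, x, y, lambda w, x: w @ x)
print(w)  # should approach [0.5, -1.0, 2.0]
```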

Analysis of Multi-variate Recurrent Fall Risk Factors in Elderly People Using Residential Assessment Instrument-Home Care - Comparisons between Single and Recurrent Fallers - (RAI-HC를 이용한 노인의 다면적 재낙상 위험요인 분석 -1회 낙상자와 재낙상자 비교-)

  • Yoo, In-Young
    • Journal of Korean Academy of Nursing
    • /
    • v.41 no.1
    • /
    • pp.119-128
    • /
    • 2011
  • Purpose: This study was done to determine the risk factors for recurrent fallers (2+falls) compared to single fallers. Methods: Participants were 104 community-dwelling people 65 yr of age or older. The data were collected from June 1, 2008 to June 30, 2009 using the Residential Assessment Instrument-Home Care. Results: Over the past 90 days, 55.7% of the 104 participants fell once, and 44.2% experienced recurrent falls (2+falls). In comparison of recurrent fallers with single fallers, there were significant differences in scores on the following factors: gender ($X^2$=4.22, p=.040), age ($X^2$=5.74, p=.017), educational level ($X^2$=5.22, p=.022), living arrangements ($X^2$=35.02, p<.001), cardiovascular diseases ($X^2$=17.10, p<.001), hypertension ($X^2$=4.43, p=.035), diabetes mellitus ($X^2$=4.44, p=.035), glaucoma ($X^2$=13.95, p<.001), Minimal Data Set (MDS)-Pain (t=-2.56, p=.012), fear of falling ($X^2$=4.08, p=.034), reduced vision (t=-3.06, p=.003), MDS-activity of daily living (t=3.46, p=.001), MDS-Instrumental Activities of daily living (t=3.24, p=.002), cognition (MDS-Cognition Performance Scale) (t=3.40, p=.001), and 'difficulties entering and leaving the house' ($X^2$=4.53, p=.033). Conclusion: It is important to assess the risk factors for recurrent falls and develop differentiated strategies that will help prevent recurrent falls. Additionally, utilizing a standardized tool, such as RAI-HC, would help health professionals assess multi-variate fall risk factors to facilitate comparisons of different community care settings.
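
The group comparisons reported above rely on chi-square tests for categorical risk factors and t tests for the MDS scale scores. A minimal sketch of that kind of comparison is given below; the counts and scores are made up for illustration and are not the study data.

```python
import numpy as np
from scipy.stats import chi2_contingency, ttest_ind

# hypothetical 2x2 table: rows = single vs recurrent fallers, columns = factor present/absent
table = np.array([[20, 38],   # single fallers (illustrative counts)
                  [28, 18]])  # recurrent fallers
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}")

# continuous scores (e.g. an ADL-type scale) compared with an independent-samples t test
rng = np.random.default_rng(0)
single, recurrent = rng.normal(2.0, 1.0, 58), rng.normal(2.7, 1.0, 46)
t, p_t = ttest_ind(single, recurrent)
print(f"t={t:.2f}, p={p_t:.3f}")
```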

C1-STABLE INVERSE SHADOWING CHAIN COMPONENTS FOR GENERIC DIFFEOMORPHISMS

  • Lee, Man-Seob
    • Communications of the Korean Mathematical Society
    • /
    • v.24 no.1
    • /
    • pp.127-144
    • /
    • 2009
  • Let f be a diffeomorphism of a compact $C^{\infty}$ manifold, and let p be a hyperbolic periodic point of f. In this paper we introduce the notion of $C^1$-stable inverse shadowing for a closed f-invariant set, and prove that (i) the chain recurrent set $\mathcal{R}(f)$ of f has the $C^1$-stable inverse shadowing property if and only if f satisfies both Axiom A and the no-cycle condition, and (ii) $C^1$-generically, the chain component $C_f(p)$ of f associated to p is hyperbolic if and only if $C_f(p)$ has the $C^1$-stable inverse shadowing property.
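
For reference, the chain recurrent set appearing in (i) is the standard one:

$$ \mathcal{R}(f) = \{\, x : \text{for every } \varepsilon > 0 \text{ there exist } x = x_0, x_1, \ldots, x_n = x,\ n \ge 1,\ \text{with } d(f(x_i), x_{i+1}) < \varepsilon \,\}, $$

so $\mathcal{R}(f)$ contains the nonwandering set and, in particular, every chain component $C_f(p)$.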

THE STRUCTURE OF ALMOST REGULAR SEMIGROUPS

  • Chae, Younki;Lim, Yongdo
    • Bulletin of the Korean Mathematical Society
    • /
    • v.31 no.2
    • /
    • pp.187-192
    • /
    • 1994
  • The author extended the small properties of topological semilattices to those of regular semigroups [3]. In this paper, it is shown that a semigroup $S$ is almost regular if and only if $\overline{RL} = \overline{R \cap L}$ for every right ideal $R$ and every left ideal $L$ of $S$. Moreover, it is shown that the Bohr compactification of an almost regular semigroup is regular. Throughout, a semigroup will mean a topological semigroup, that is, a Hausdorff space together with a continuous associative multiplication. For a semigroup $S$, we denote by $E(S)$ the set of all idempotents of $S$. An element $x$ of a semigroup $S$ is called regular if and only if $x \in xSx$. A semigroup $S$ is termed regular if every element of $S$ is regular. If $x \in S$ is regular, then there exists an element $y \in S$ such that $x = xyx$ and $y = yxy$ ($y$ is called an inverse of $x$). If $y$ is an inverse of $x$, then $xy$ and $yx$ are both idempotents, but they are not always equal. A semigroup $S$ is termed recurrent (or almost pointwise periodic) at $x \in S$ if and only if for any open set $U$ about $x$ there is an integer $p > 1$ such that $x^p \in U$. $S$ is said to be recurrent (or almost periodic) if and only if $S$ is recurrent at every $x \in S$. It is known that if $x \in S$ is recurrent and $\Gamma(x) = \overline{\{x, x^2, \ldots\}}$ is compact, then $\Gamma(x)$ is a subgroup of $S$ and hence $x$ is a regular element of $S$.
