• Title/Summary/Keyword: Global minima


Improving the Training Performance of Neural Networks by using Hybrid Algorithm (하이브리드 알고리즘을 이용한 신경망의 학습성능 개선)

  • Kim, Weon-Ook;Cho, Yong-Hyun;Kim, Young-Il;Kang, In-Ku
    • The Transactions of the Korea Information Processing Society
    • /
    • v.4 no.11
    • /
    • pp.2769-2779
    • /
    • 1997
  • This paper proposes an efficient method for improving the training performance of neural networks using a hybrid of the conjugate gradient backpropagation algorithm and the dynamic tunneling backpropagation algorithm. The conjugate gradient backpropagation algorithm, a fast gradient algorithm, is applied for high-speed optimization. The dynamic tunneling backpropagation algorithm, a deterministic method based on the tunneling phenomenon, is applied for global optimization. When the conjugate gradient backpropagation algorithm converges to a local minimum, a new initial point for escaping it is estimated by the dynamic tunneling backpropagation algorithm. The proposed method has been applied to parity checking and pattern classification. The simulation results show that the proposed method outperforms both the gradient descent backpropagation algorithm and a hybrid of gradient descent and dynamic tunneling backpropagation, and that the new algorithm converges to the global minimum more often than the gradient descent backpropagation algorithm.

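The alternation described above, a fast local phase followed by a tunneling phase that supplies a fresh starting point, can be sketched as follows. This is a minimal illustration, not the paper's algorithm: plain gradient descent stands in for conjugate gradient backpropagation, accept-if-lower random sampling stands in for deterministic dynamic tunneling, and the double-well test function and all constants are assumptions.

```python
import numpy as np

def local_descent(f, grad, x, lr=1e-3, steps=2000):
    """Stand-in for the conjugate-gradient phase: plain gradient descent."""
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

def tunnel(f, x_star, scale=1.0, tries=200, rng=None):
    """Toy 'tunneling' phase: sample until a point at least as low as the
    current local minimum is found, to restart the descent from there."""
    rng = rng or np.random.default_rng(0)
    f_star = f(x_star)
    for _ in range(tries):
        cand = x_star + scale * rng.standard_normal(x_star.shape)
        if f(cand) < f_star:
            return cand
    return None  # no better basin found; accept x_star as the answer

def hybrid_minimize(f, grad, x0, rounds=5):
    x = np.asarray(x0, dtype=float)
    for _ in range(rounds):
        x = local_descent(f, grad, x)
        nxt = tunnel(f, x)
        if nxt is None:
            break
        x = nxt
    return x

# Example: a tilted 1-D double well with its global minimum near x = -1.
f = lambda x: (x[0] ** 2 - 1) ** 2 + 0.3 * (x[0] + 1)
g = lambda x: np.array([4 * x[0] * (x[0] ** 2 - 1) + 0.3])
x = hybrid_minimize(f, g, [1.0])  # starts in the basin of the worse minimum
```

Starting near the shallower minimum at x = +1, the tunneling phase finds a lower point in the left basin and the second descent settles near the global minimum.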
PROBLEMS IN INVERSE SCATTERING: ILL-POSEDNESS, RESOLUTION, LOCAL MINIMA, AND UNIQUENESS

  • Ra, Jung-Woong
    • Communications of the Korean Mathematical Society
    • /
    • v.16 no.3
    • /
    • pp.445-458
    • /
    • 2001
  • The shape and material distribution of a scatterer may be obtained from its scattered fields by iterative inversion in the spectral domain. The ill-posedness, resolution, and uniqueness of the inversion are the key problems and are inter-related. The ill-posedness is shown to be caused by the evanescent modes, which carry and exponentially amplify the measurement errors in the back-propagation of the measured scattered fields. By filtering out all evanescent modes in the cost functional, defined as the squared difference between the measured and calculated spatial spectra of the scattered fields from the iteratively chosen medium parameters of the scatterer, one may regularize the ill-posedness of the inversion at the expense of resolution. Many local minima of the cost functional exist for the inversion of large, high-contrast scatterers, and a hybrid algorithm combining the genetic algorithm and the Levenberg-Marquardt algorithm is shown to find the global minimum efficiently. The resolution of the reconstruction obtained by keeping all propagating modes and filtering out the evanescent modes for regularization is 0.5 wavelength. Super-resolution may be obtained by keeping the evanescent modes when the measurement error is small and the measurement distance is near.

Searching a global optimum by stochastic perturbation in error back-propagation algorithm (오류 역전파 학습에서 확률적 가중치 교란에 의한 전역적 최적해의 탐색)

  • 김삼근;민창우;김명원
    • Journal of the Korean Institute of Telematics and Electronics C
    • /
    • v.35C no.3
    • /
    • pp.79-89
    • /
    • 1998
  • The Error Back-Propagation (EBP) algorithm is widely applied to train a multi-layer perceptron, a neural network model frequently used to solve complex problems such as pattern recognition, adaptive control, and global optimization. However, EBP is basically a gradient descent method, which may get stuck in a local minimum and fail to find the globally optimal solution. Moreover, a multi-layer perceptron lacks a systematic way to determine the network structure appropriate for a given problem; the number of hidden nodes is usually determined by trial and error. In this paper, we propose a new algorithm to train a multi-layer perceptron efficiently. Our algorithm uses stochastic perturbation in the weight space to escape from local minima in multi-layer perceptron learning: it probabilistically re-initializes the weights associated with hidden nodes whenever EBP learning gets stuck in a local minimum. The addition of new hidden nodes can also be viewed as a special case of stochastic perturbation. Using stochastic perturbation, we can solve the local-minima problem and the network-structure design problem in a unified way. The results of our experiments with several benchmark test problems, including the parity problem, the two-spirals problem, and the credit-screening data, show that our algorithm is very efficient.

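The perturbation rule described above can be sketched on a toy problem. This is a minimal illustration assuming XOR data, a fixed hidden-layer size, and ad hoc stall thresholds; the paper's algorithm also grows the network by adding hidden nodes, which is not shown here.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)  # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

H = 4  # hidden nodes (fixed here; the paper can also add nodes)
W1, b1 = rng.standard_normal((2, H)), np.zeros(H)
W2, b2 = rng.standard_normal((H, 1)), np.zeros(1)

def loss():
    out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return float(np.mean((out - y) ** 2))

lr, loss0 = 0.5, loss()
best, prev = loss0, loss0
for epoch in range(8000):
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)   # output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)    # hidden-layer delta
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)
    cur = loss()
    best = min(best, cur)
    # Stochastic perturbation: if learning has stalled above a target
    # error, re-initialize one randomly chosen hidden node's weights.
    if epoch % 500 == 499:
        if abs(prev - cur) < 1e-5 and cur > 0.01:
            j = rng.integers(H)
            W1[:, j] = rng.standard_normal(2)
            W2[j, 0] = rng.standard_normal()
        prev = cur
```

The check every 500 epochs detects a plateau; re-initializing a single hidden node perturbs only part of the weight space, so progress already made by the other nodes is largely preserved.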
DO THE OBSERVED RELATIONS OF THE GLOBAL SEISMIC PARAMETERS DEPEND ON THE MAGNETIC ACTIVITY LEVEL?

  • Kim, Ki-Beom;Chang, Heon-Young
    • Journal of The Korean Astronomical Society
    • /
    • v.54 no.4
    • /
    • pp.121-128
    • /
    • 2021
  • It is known that the global asteroseismic parameters, as well as the stellar acoustic mode parameters, vary with stellar magnetic activity. Some solar-like stars whose variations are thought to be induced by magnetic activity, however, show mode frequencies changing with different magnitude and phase, unlike what is expected for the Sun. It is therefore important to find out whether the expected relations are consistently manifested regardless of the phase of the stellar magnetic cycle, since observations are apt to cover only part of a complete activity cycle unless they span several decades. Here, we explore whether the observed relations of the global seismic parameters hold regardless of the phase of the stellar magnetic cycle, even if observations cover only part of it. For this purpose, by analyzing photometric Sun-as-a-star data from 1996 to 2019 covering solar cycles 23 and 24, we compare correlations of the global asteroseismic parameters and magnetic proxies for four separate intervals of the solar cycle: solar minima ±2 years, solar minima +4 years, solar maxima ±2 years, and solar maxima +4 years. We have found that the photometric magnetic activity proxy, Sph, is an effective proxy for solar magnetic activity regardless of the phase of the solar cycle. The amplitude of the mode envelope correlates negatively with solar magnetic activity regardless of the phase of the solar cycle. However, the relations between the central frequency of the envelope and the envelope width are sensitive to the phase of the stellar magnetic cycle.

Compression of Image Data Using Neural Networks based on Conjugate Gradient Algorithm and Dynamic Tunneling System

  • Cho, Yong-Hyun;Kim, Weon-Ook;Bang, Man-Sik;Kim, Young-il
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 1998.06a
    • /
    • pp.740-749
    • /
    • 1998
  • This paper proposes compression of image data using neural networks based on the conjugate gradient method and a dynamic tunneling system. The conjugate gradient method is applied for high-speed optimization. The dynamic tunneling algorithm, a deterministic method based on the tunneling phenomenon, is applied for global optimization. When the conjugate gradient method converges to a local minimum, a new initial point for escaping it is estimated by the dynamic tunneling system. The proposed method has been applied to the compression of 12 × 12-pixel image data. The simulation results show that the proposed network has better learning performance than one using conventional BP as the learning algorithm.

Likelihood search method with variable division search

  • Koga, Masaru;Hirasawa, Kotaro;Murata, Junichi;Ohbayashi, Masanao
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 1995.10a
    • /
    • pp.14-17
    • /
    • 1995
  • Various methods and techniques have been proposed for solving optimization problems, and they have been applied to many practical problems. However, these methods have drawbacks, such as falling into local minima or slow convergence to optimal points. In this paper, the Likelihood Search Method (L.S.M.) is proposed for searching for a global optimum systematically and effectively in a single framework, rather than as a combination of different methods. The L.S.M. is a kind of random search method (R.S.M.) and thus can escape local minima. However, its exploitation of gradient information makes the L.S.M. superior in convergence speed to the commonly used R.S.M.

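Since the abstract gives no update equations, the sketch below is only one plausible reading of the method: a random search whose candidates are biased by a gradient step and accepted only when they lower the cost. The test function, step sizes, and acceptance rule are all assumptions for illustration.

```python
import numpy as np

def likelihood_search(f, grad, x0, steps=400, lr=0.1, sigma=1.0, seed=0):
    """Gradient-biased random search (an assumed reading of the abstract,
    not the paper's actual update rule): each candidate is a gradient-descent
    step plus Gaussian noise, accepted only if it lowers the cost."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, float)
    fx = f(x)
    for _ in range(steps):
        cand = x - lr * grad(x) + sigma * rng.standard_normal(x.shape)
        fc = f(cand)
        if fc < fx:          # greedy acceptance keeps the search monotone
            x, fx = cand, fc
    return x

# 1-D Griewank-style test function with many local minima:
f = lambda x: x[0] ** 2 / 4000.0 - np.cos(x[0]) + 1.0
g = lambda x: np.array([x[0] / 2000.0 + np.sin(x[0])])
x = likelihood_search(f, g, [10.0])
```

The noise term lets the search jump out of a local basin, while the gradient bias keeps it converging faster than a purely random search, which is the trade-off the abstract emphasizes.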
A Study for Global Optimization Using Dynamic Encoding Algorithm for Searches

  • Kim, Nam-Geun;Kim, Jong-Wook;Kim, Sang-Woo
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2004.08a
    • /
    • pp.857-862
    • /
    • 2004
  • This paper analyzes properties of the recently developed nonlinear optimization method, Dynamic Encoding Algorithm for Searches (DEAS) [1]. DEAS locates local minima with binary strings (or binary matrices for multi-dimensional problems) by iterating two operators: bisectional search (BSS) and unidirectional search (UDS). BSS lengthens a binary string by one digit (i.e., zero or one), while UDS increments or decrements a binary string without changing its length. Owing to these search routines, DEAS retains an optimization capability that combines special features of several conventional optimization methods. In this paper, a special feature of BSS and UDS in DEAS is analyzed. In addition, an effective global search strategy is established using information from DEAS. The effectiveness of the proposed global search strategy is validated on well-known benchmark functions.

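The BSS/UDS iteration can be sketched for a one-dimensional cost function as follows. The decoding convention (binary fraction plus a half-cell offset) and the search schedule are assumptions for illustration; the paper's actual routines differ in detail.

```python
def decode(bits, lo=0.0, hi=1.0):
    """Map a binary string to a point in [lo, hi] (midpoint convention assumed)."""
    x = sum(b / 2 ** (i + 1) for i, b in enumerate(bits))
    return lo + (hi - lo) * (x + 2 ** -(len(bits) + 1))

def bss(bits, f):
    """Bisectional search: append one digit (0 or 1), keep the better refinement."""
    a, b = bits + [0], bits + [1]
    return a if f(decode(a)) <= f(decode(b)) else b

def uds(bits, f):
    """Unidirectional search: increment/decrement the string at fixed length,
    moving as long as the cost keeps improving."""
    def val(bs):
        return int("".join(map(str, bs)), 2)
    def to_bits(v, n):
        return [int(c) for c in format(v, f"0{n}b")]
    n, best = len(bits), bits
    for step in (+1, -1):
        cur = best
        while True:
            v = val(cur) + step
            if v < 0 or v >= 2 ** n:
                break
            cand = to_bits(v, n)
            if f(decode(cand)) < f(decode(best)):
                best = cand
                cur = cand
            else:
                break
    return best

def deas(f, depth=12):
    bits = []
    for _ in range(depth):
        bits = uds(bss(bits, f), f)
    return decode(bits)

# 1-D example: minimize f(x) = (x - 0.3)^2 on [0, 1]
xm = deas(lambda x: (x - 0.3) ** 2)
```

Each BSS step halves the cell width, so after `depth` rounds the answer is resolved to roughly 2^-depth of the search interval.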
A study of global minimization analysis of the Langevin competitive learning neural network based on the contraction condition and its application to handwritten numeral recognition (축합조건의 분석을 통한 Langevine 경쟁 학습 신경회로망의 대역 최소화 근사 해석과 필기체 숫자 인식에 관한 연구)

  • 석진욱;조성원;최경삼
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 1996.10b
    • /
    • pp.466-469
    • /
    • 1996
  • In this paper, we present a global minimization condition through an informal analysis of the Langevin competitive learning neural network. From the viewpoint of stochastic processes, it is important that competitive learning guarantees an optimal solution for pattern recognition. By analyzing the Fokker-Planck equation for the proposed neural network, we show that if the energy function has a special pseudo-convexity, Langevin competitive learning can find the global minima. Experimental results for pattern recognition of handwritten numeral data indicate the superiority of the proposed algorithm.

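A minimal sketch of competitive learning with a Langevin-type noise term follows. The winner-take-all rule, the annealing schedule, and all constants are assumptions for illustration, not the paper's formulation; the noise plays the role of the stochastic term whose stationary behavior the Fokker-Planck analysis studies.

```python
import numpy as np

def langevin_competitive(data, k=2, epochs=60, lr=0.05, T0=0.5, seed=0):
    """Winner-take-all competitive learning with annealed Gaussian
    (Langevin-type) noise added to the winning prototype's update."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((k, data.shape[1]))  # prototype vectors
    for epoch in range(epochs):
        T = T0 / (1 + epoch)  # annealing schedule (assumed)
        for x in rng.permutation(data):
            j = np.argmin(((w - x) ** 2).sum(1))          # winner index
            noise = np.sqrt(2 * lr * T) * rng.standard_normal(w.shape[1])
            w[j] += lr * (x - w[j]) + noise               # Langevin update
    return w

# Two well-separated 2-D clusters; prototypes should settle near their means.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal([0, 0], 0.1, (50, 2)),
                  rng.normal([3, 3], 0.1, (50, 2))])
w = langevin_competitive(data)
```

Early in training the temperature T is high, so a prototype stuck in a poor configuration can be shaken loose; as T decays the update reduces to plain competitive learning and the prototypes settle.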
On the Global Convergence of Univariate Dynamic Encoding Algorithm for Searches (uDEAS)

  • Kim, Jong-Wook;Kim, Tae-Gyu;Choi, Joon-Young;Kim, Sang-Woo
    • International Journal of Control, Automation, and Systems
    • /
    • v.6 no.4
    • /
    • pp.571-582
    • /
    • 2008
  • This paper analyzes the global convergence of the univariate dynamic encoding algorithm for searches (uDEAS) and provides an application to function optimization. uDEAS is a more advanced optimization method than its predecessor in terms of the number of neighborhood points, and this improvement should be validated through mathematical analysis for further research and application. Since uDEAS can be categorized as a generating set search method, a framework also established recently, the global convergence property of uDEAS is proved in the context of direct search methods. To show the strong performance of uDEAS, it and other direct search methods are used to locate the global minima of four 30-dimensional benchmark functions. The proof of global convergence and the successful optimization results show that uDEAS is a reliable and effective global optimization method.

A Novel Global Minimum Search Algorithm based on the Geodesic of Classical Dynamics Lagrangian (고전 역학의 라그랑지안을 이용한 미분 기하학적 global minimum 탐색 알고리즘)

  • Kim, Joon-Shik;O, Jang-Min;Kim, Jong-Chan;Zhang, Byoung-Tak
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2006.10a
    • /
    • pp.39-42
    • /
    • 2006
  • In neural networks, learning is implemented by reducing an error. The risk function in parameter space can then be represented as a multi-minima potential, and our goal is to obtain the coordinates of the global-minimum weights. Previous work includes Attouch et al.'s method based on a damped-oscillator equation and Qian's derivation of the momentum and learning parameters of steepest descent from a critically damped oscillator. Building on these two studies, we obtain an adaptive steepest-descent learning method by applying the geodesic, the shortest path on a manifold, to the Lagrangian of Newtonian mechanics. We examine the performance of this new method on the Rosenbrock and Griewank potentials.

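Qian's critically-damped-oscillator correspondence mentioned in the abstract maps onto the classical "heavy ball" momentum method. The sketch below is that generic method on the Rosenbrock potential, not the paper's adaptive geodesic update; the constants lr, mu, and steps are assumptions.

```python
import numpy as np

def rosenbrock(x):
    return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

def rosenbrock_grad(x):
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
        200.0 * (x[1] - x[0] ** 2),
    ])

def heavy_ball(grad, x0, lr=1e-4, mu=0.9, steps=80000):
    """Discretized damped oscillator ('heavy ball'): steepest descent with a
    momentum term, x_{k+1} = x_k + mu (x_k - x_{k-1}) - lr grad(x_k)."""
    x = np.asarray(x0, float)
    v = np.zeros_like(x)
    for _ in range(steps):
        v = mu * v - lr * grad(x)
        x = x + v
    return x

x0 = np.array([-1.2, 1.0])  # standard Rosenbrock starting point
x = heavy_ball(rosenbrock_grad, x0)
```

The momentum term corresponds to the inertial part of the oscillator equation: it lets the iterate coast along the flat, curved valley of the Rosenbrock function where plain steepest descent crawls.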