• Title/Abstract/Keyword: Global optimization method

Search results: 618

One Dimensional Optimization using Learning Network

  • Chung, Taishn;Bien, Zeungnam
    • 한국지능시스템학회:학술대회논문집
    • /
    • 한국퍼지및지능시스템학회 1995년도 추계학술대회 학술발표 논문집
    • /
    • pp.33-39
    • /
    • 1995
  • A one-dimensional optimization problem is considered; we propose a method to find the global minimum of a one-dimensional function using no gradient information but only a finite number of input-output samples. We construct a learning network that has good learning capability and whose global maximum (or minimum) can be calculated with a simple computation. By training this network to approximate the given function with minimal samples, we can obtain the global minimum of the function. We verify this method using some typical examples.
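A minimal sketch of the idea (not the authors' learning network): approximate the function from a finite set of samples with a smooth surrogate, then take the surrogate's global minimizer as the estimate. The cubic-spline surrogate, the test function f, and the grid search below are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def f(x):                                       # illustrative multimodal test function
    return np.sin(3 * x) + 0.3 * x ** 2

# finite number of input-output samples, no gradient information
x_samples = np.linspace(-3.0, 3.0, 15)
y_samples = f(x_samples)

surrogate = CubicSpline(x_samples, y_samples)   # stand-in for the learning network

# global minimum of the surrogate, found on a dense grid
grid = np.linspace(-3.0, 3.0, 10001)
x_star = grid[np.argmin(surrogate(grid))]
print(f"estimated global minimizer: {x_star:.3f}, f(x*) = {f(x_star):.3f}")
```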


곡선부에서 차륜 마모 저감을 위한 차륜답면 형상 설계 (Design of Wheel Profile to Reduce Wear of Railway Wheel)

  • 최하영;이동형;송창용;이종수
    • 한국정밀공학회지
    • /
    • Vol. 29, No. 6
    • /
    • pp.607-612
    • /
    • 2012
  • Wheel-flange wear occurs on sharp curves of the rail. This paper proposes a procedure for the optimum design of a wheel profile in which flange wear is reduced by improving the interaction between wheel and rail. The suitability of an optimization method for a design problem depends mainly on the characteristics of the design space. This paper compares a local optimization method with a global one, using the sensitivity of the objective function to the design variables, to find out which optimization method is more appropriate for minimizing wheel-flange wear. The wheel profile is created with a piecewise cubic Hermite interpolating polynomial, and the dynamic performance is analyzed with the railway dynamics program VAMPIRE. The optimization results verify that a global optimization method such as a genetic algorithm is more suitable for wheel profile optimization than the local SQP (Sequential Quadratic Programming) method when empirical knowledge of a good initial design is lacking.
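A hedged illustration of the global-versus-local comparison described above, with stand-ins for the actual study: a generic multimodal objective replaces the VAMPIRE-based wear objective, scipy's differential_evolution stands in for the genetic algorithm, and SLSQP stands in for SQP.

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

def objective(x):                      # multimodal stand-in for the flange-wear objective
    return np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)) + 10 * x.size   # Rastrigin

bounds = [(-5.12, 5.12)] * 4
x0 = np.array([4.0, -4.0, 4.0, -4.0])  # poor initial guess (no empirical knowledge)

local = minimize(objective, x0, method="SLSQP", bounds=bounds)        # local SQP-type search
global_ = differential_evolution(objective, bounds, seed=0)           # GA-like global search

print("SLSQP:", local.fun)     # typically stuck in a local minimum near x0
print("DE   :", global_.fun)   # typically close to the global minimum 0
```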

CONVERGENCE OF THE NONMONOTONE PERRY-SHANNO METHOD FOR UNCONSTRAINED OPTIMIZATION

  • Ou, Yigui;Ma, Wei
    • Journal of applied mathematics & informatics
    • /
    • Vol. 30, No. 5-6
    • /
    • pp.971-980
    • /
    • 2012
  • In this paper, a method equipped with a new form of nonmonotone line search technique is proposed, which can be regarded as a generalization of the Perry-Shanno memoryless quasi-Newton method. Under some reasonable conditions, the global convergence of the proposed method is proven. Numerical tests show its efficiency.
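The sketch below shows only the nonmonotone (Grippo-type) line-search acceptance rule inside a plain gradient method; the Perry-Shanno memoryless quasi-Newton direction itself is not implemented, and the test function is an arbitrary choice.

```python
import numpy as np

def nonmonotone_gradient_descent(f, grad, x0, M=5, c=1e-4, max_iter=200):
    """Steepest descent with a nonmonotone Armijo rule: accept a step if
    f(x + a*d) <= max of the last M function values + c * a * g'd."""
    x = np.asarray(x0, dtype=float)
    history = [f(x)]                       # recent function values for the nonmonotone rule
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < 1e-8:
            break
        d = -g                             # descent direction (Perry-Shanno would modify this)
        a = 1.0
        f_ref = max(history[-M:])          # nonmonotone reference value
        while f(x + a * d) > f_ref + c * a * g.dot(d):
            a *= 0.5
        x = x + a * d
        history.append(f(x))
    return x

# Himmelblau's function as a small nonconvex test problem
f = lambda x: (x[0] ** 2 + x[1] - 11) ** 2 + (x[0] + x[1] ** 2 - 7) ** 2
grad = lambda x: np.array([4 * x[0] * (x[0] ** 2 + x[1] - 11) + 2 * (x[0] + x[1] ** 2 - 7),
                           2 * (x[0] ** 2 + x[1] - 11) + 4 * x[1] * (x[0] + x[1] ** 2 - 7)])
print(nonmonotone_gradient_descent(f, grad, [0.0, 0.0]))   # converges to one of the minima
```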

함수 근사화를 위한 방사 기저함수 네트워크의 전역 최적화 기법 (A Global Optimization Method of Radial Basis Function Networks for Function Approximation)

  • 이종석;박철훈
    • 정보처리학회논문지B
    • /
    • Vol. 14B, No. 5
    • /
    • pp.377-382
    • /
    • 2007
  • This paper proposes a learning algorithm that optimizes the parameters of a radial basis function network over the entire parameter space. Existing learning algorithms perform only local optimization, so their performance is limited and the final result depends heavily on the initial network parameter values. The hybrid simulated annealing method proposed in this paper combines the global search capability of simulated annealing with the local optimization capability of gradient-based learning algorithms, so that solutions can be found over the whole parameter space. Applying the proposed method to function approximation problems, we show that it finds network parameters with better training and generalization performance than existing learning algorithms and greatly reduces the influence of the initial parameter values.
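A rough sketch of the hybrid global/local idea under stated assumptions: scipy's dual_annealing (generalized simulated annealing with built-in local refinement) stands in for the authors' hybrid simulated annealing, and a tiny RBF network with four Gaussian units is fit to a toy 1-D target.

```python
import numpy as np
from scipy.optimize import dual_annealing

# toy 1-D target for function approximation
x = np.linspace(-2, 2, 80)
y = np.sinc(2 * x)

def rbf_net(params, x, n_centers=4):
    """RBF network: sum_i w_i * exp(-((x - c_i) / s_i)^2); params = [w, c, s]."""
    w, c, s = np.split(np.asarray(params), 3)
    return sum(w[i] * np.exp(-((x - c[i]) / s[i]) ** 2) for i in range(n_centers))

def mse(params):
    return np.mean((rbf_net(params, x) - y) ** 2)

n = 4
bounds = [(-2, 2)] * n + [(-2, 2)] * n + [(0.05, 2)] * n   # weights, centers, widths
result = dual_annealing(mse, bounds, seed=0, maxiter=300)  # global + local hybrid search
print("training MSE:", result.fun)
```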

크리깅 방법에 의한 방열판 최적설계 (Optimal Design of a Heat Sink Using the Kriging Method)

  • 류제선;류근호;박경우
    • 대한기계학회논문집B
    • /
    • Vol. 29, No. 10
    • /
    • pp.1139-1147
    • /
    • 2005
  • The shape optimization of a plate-fin type heat sink with a vortex generator is performed numerically to minimize the pressure loss subject to a desired maximum temperature. Evaluating the performance function in fluid/thermal systems generally requires a large computational cost, so global approximate optimization techniques have been introduced into the optimization of such systems. In this study, the Kriging method is used together with computational fluid dynamics (CFD) to obtain the optimal solutions. The results show that when the temperature rise is limited to less than 40 K, the optimal design variables are $B_1 = 2.44\;mm$, $B_2 = 2.09\;mm$, and $t = 7.58\;mm$. The Kriging method reduces the computational time to about 1/6 of that of the SQP method, which validates its efficiency.
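A hedged sketch of surrogate-based (Kriging) optimization: a cheap analytic function replaces the CFD evaluation of pressure loss, and scikit-learn's GaussianProcessRegressor plays the role of the Kriging model; names such as expensive_simulation and the two design variables are illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from scipy.optimize import minimize

def expensive_simulation(x):              # stand-in for a CFD evaluation of pressure loss
    return (x[0] - 2.4) ** 2 + (x[1] - 2.1) ** 2 + 0.1 * np.sin(5 * x[0])

# small design of experiments over the two design variables
rng = np.random.default_rng(0)
X = rng.uniform(1.0, 4.0, size=(12, 2))
y = np.array([expensive_simulation(xi) for xi in X])

kriging = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True).fit(X, y)

# optimize the cheap Kriging prediction instead of the expensive simulation
pred = lambda x: kriging.predict(x.reshape(1, -1))[0]
opt = minimize(pred, x0=np.array([2.5, 2.5]), bounds=[(1, 4), (1, 4)])
print("surrogate optimum:", opt.x, "predicted value:", opt.fun)
```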

유전자 알고리즘을 이용한 뼈대구조물의 이산최적화 (Discrete Optimization of Plane Frame Structures Using Genetic Algorithms)

  • 김봉익;권중현
    • 한국해양공학회지
    • /
    • Vol. 16, No. 4
    • /
    • pp.25-31
    • /
    • 2002
  • This paper addresses the optimum design of plane framed structures with discrete variables. Global search algorithms for this problem include Genetic Algorithms (GAs), Simulated Annealing (SA), Shuffled Complex Evolution (SCE), and hybrid methods (GAs-SA, GAs-SCE). GAs and SA are heuristic search algorithms and effective tools for finding global solutions to discrete optimization problems. In particular, GAs are known to find the global optimum or a near-global optimum. In this paper, reinforced concrete plane frames with rectangular sections and steel plane frames with W-sections are used for the discrete design optimization. These structures are designed under stress constraints. The robustness and effectiveness of genetic algorithms are demonstrated through several examples.
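A bare-bones GA over a discrete catalog of cross-sections, with a penalty for a made-up stress constraint; the member forces, lengths, section list, and allowable stress are placeholder assumptions, not the RC/W-section frame models used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
sections = np.array([500, 800, 1200, 1800, 2500, 3400])   # discrete section areas (mm^2)
forces = np.array([120e3, 90e3, 60e3, 150e3])              # member forces (N), illustrative
lengths = np.array([3.0, 3.0, 4.0, 4.0])                   # member lengths (m)
allowable = 160.0                                           # allowable stress (MPa)

def fitness(indiv):
    areas = sections[indiv]
    weight = np.sum(areas * lengths)                        # proportional to material volume
    stress = forces / areas                                 # MPa, since N / mm^2
    penalty = 1e5 * np.sum(np.maximum(0, stress - allowable))
    return weight + penalty

pop = rng.integers(0, len(sections), size=(40, 4))          # population of index chromosomes
for gen in range(100):
    f = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(f)][:20]                       # truncation selection
    children = parents[rng.integers(0, 20, 40)].copy()
    cut = rng.integers(1, 4)
    children[:, cut:] = parents[rng.integers(0, 20, 40)][:, cut:]     # one-point crossover
    mutate = rng.random(children.shape) < 0.05
    children[mutate] = rng.integers(0, len(sections), mutate.sum())   # random mutation
    children[0] = parents[0]                                # elitism: keep the current best
    pop = children

best = pop[np.argmin([fitness(ind) for ind in pop])]
print("best sections (mm^2):", sections[best])
```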

적응 분할법에 기반한 유전 알고리즘 및 그 응용에 관한 연구 (A Study on Adaptive Partitioning-based Genetic Algorithms and Its Applications)

  • 한창욱
    • 융합신호처리학회논문지
    • /
    • Vol. 13, No. 4
    • /
    • pp.207-210
    • /
    • 2012
  • Genetic algorithms are highly effective probability-based optimization techniques, but they suffer from premature convergence to local solutions and slow convergence to the global solution. To alleviate these drawbacks, this paper proposes a genetic algorithm based on adaptive partitioning. The adaptive partitioning method, which helps the genetic algorithm find the global solution effectively, adaptively partitions the search space to reduce the complexity of the optimization. This adaptive partitioning becomes more effective as the complexity of the search space increases. The proposed method is applied to the optimization of test functions and to the design optimization of a fuzzy controller for inverted pendulum control, demonstrating its effectiveness.
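A loose, hedged reading of adaptive partitioning (not the paper's specific scheme): a simple real-coded GA whose search box is periodically halved around the best individual, so that later generations concentrate on the promising sub-region of the search space.

```python
import numpy as np

rng = np.random.default_rng(2)
f = lambda x: np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)) + 10 * x.size   # Rastrigin

lo, hi = np.full(2, -5.12), np.full(2, 5.12)       # current search box
pop = rng.uniform(lo, hi, size=(30, 2))

for gen in range(60):
    fit = np.array([f(p) for p in pop])
    best = pop[np.argmin(fit)]
    if gen % 10 == 9:                               # adaptive partitioning step:
        width = (hi - lo) / 2                       # halve the box, keeping (per dimension)
        lo = np.maximum(lo, best - width / 2)       # the part that contains the best point
        hi = np.minimum(hi, best + width / 2)
    parents = pop[np.argsort(fit)][:10]
    children = parents[rng.integers(0, 10, 30)] + rng.normal(0, 0.1 * (hi - lo), (30, 2))
    pop = np.clip(children, lo, hi)
    pop[0] = best                                   # elitism

print("best point:", pop[0], "value:", f(pop[0]))
```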

A CLASS OF NONMONOTONE SPECTRAL MEMORY GRADIENT METHOD

  • Yu, Zhensheng;Zang, Jinsong;Liu, Jingzhao
    • 대한수학회지
    • /
    • Vol. 47, No. 1
    • /
    • pp.63-70
    • /
    • 2010
  • In this paper, we develop a nonmonotone spectral memory gradient method for unconstrained optimization, in which the spectral stepsize and a class of memory gradient directions are combined efficiently. Global convergence is obtained by using a nonmonotone line search strategy, and numerical tests are given to show the efficiency of the proposed algorithm.
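The sketch below isolates the spectral (Barzilai-Borwein) stepsize ingredient on a convex quadratic; the memory-gradient direction and the nonmonotone safeguard of the paper are omitted, and the quadratic test problem is an arbitrary choice.

```python
import numpy as np

def bb_gradient(grad, x0, iters=100):
    """Spectral (Barzilai-Borwein) gradient method: the stepsize
    alpha_k = s'_ {k-1} s_{k-1} / s'_{k-1} y_{k-1} approximates the inverse Hessian scale."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-2                                # initial stepsize
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y_ = x_new - x, g_new - g
        if abs(s.dot(y_)) < 1e-12:
            break
        alpha = s.dot(s) / s.dot(y_)            # BB1 spectral stepsize
        x, g = x_new, g_new
        if np.linalg.norm(g) < 1e-8:
            break
    return x

A = np.array([[3.0, 0.5], [0.5, 1.0]])          # convex quadratic 0.5 * x'Ax
grad = lambda x: A.dot(x)
print(bb_gradient(grad, [2.0, -3.0]))           # converges to the minimizer at the origin
```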

Optimal design of truss structures using a new optimization algorithm based on global sensitivity analysis

  • Kaveh, A.;Mahdavi, V.R.
    • Structural Engineering and Mechanics
    • /
    • Vol. 60, No. 6
    • /
    • pp.1093-1117
    • /
    • 2016
  • Global sensitivity analysis (GSA) has been widely used to investigate the sensitivity of a model output with respect to its input parameters. In this paper a new single-solution search optimization algorithm is developed based on GSA and applied to the size optimization of truss structures. In this method the search space of the optimization is determined using the sensitivity indicators of the variables. Unlike common meta-heuristic algorithms, where all the variables are changed simultaneously during the optimization process, in this approach the sensitive variables of the solution are iteratively changed more rapidly than the less sensitive ones in the search space. Comparisons of the present results with those of some previous population-based meta-heuristic algorithms demonstrate its capability, especially for decreasing the number of fitness function evaluations, in solving the presented benchmark problems.
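A hedged illustration of sensitivity-guided single-solution search: crude one-at-a-time variance indicators (not proper GSA indices) decide how often each variable is perturbed in a greedy random search; the objective function and step sizes are arbitrary examples, not the truss-sizing problem.

```python
import numpy as np

rng = np.random.default_rng(3)
f = lambda x: 10 * x[0] ** 2 + x[1] ** 2 + 0.1 * x[2] ** 2     # x[0] is the most sensitive

# crude one-at-a-time sensitivity indicators
x = rng.uniform(-5, 5, 3)
sens = np.array([np.var([f(x + d * np.eye(3)[i]) for d in np.linspace(-1, 1, 11)])
                 for i in range(3)])
prob = sens / sens.sum()                                       # sensitive variables move more often

best, best_val = x.copy(), f(x)
for step in range(2000):
    i = rng.choice(3, p=prob)                                  # pick a variable by sensitivity
    trial = best.copy()
    trial[i] += rng.normal(0, 0.5)
    if f(trial) < best_val:                                    # single-solution (greedy) search
        best, best_val = trial, f(trial)

print("best:", best, "value:", best_val)
```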

확률적 근사법과 역전파 알고리즘을 이용한 다층 신경망의 학습성능 개선 (Improving the Training Performance of Multilayer Neural Network by Using Stochastic Approximation and Backpropagation Algorithm)

  • 조용현;최흥문
    • 전자공학회논문지B
    • /
    • Vol. 31B, No. 4
    • /
    • pp.145-154
    • /
    • 1994
  • This paper proposes an efficient method for improving the training performance of a multilayer neural network by using a hybrid of stochastic approximation and the backpropagation algorithm. An approximate initial point for fast global optimization is first estimated by applying stochastic approximation, and then the backpropagation algorithm, a fast gradient descent method, is applied for high-speed global optimization. Further speed-up of training is achieved by adjusting the training parameters of the output and hidden layers adaptively to the standard deviation of the neuron outputs of each layer. The proposed method has been applied to parity checking and pattern classification, and the simulation results show that its performance is superior to that of backpropagation, Baba's MROM, and Sun's method with randomized initial point settings. The results of adaptively adjusting the training parameters show that the proposed method further improves the convergence speed by about 20% in training.
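A two-phase illustration under stated assumptions: a coarse random search stands in for the stochastic-approximation phase and picks a promising initial weight vector, followed by plain backpropagation (gradient descent) on the 2-bit parity (XOR) problem; the paper's adaptive per-layer learning-rate rule is not reproduced, and the 2-2-1 network size is an assumption.

```python
import numpy as np

rng = np.random.default_rng(4)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)     # 2-bit parity (XOR)
T = np.array([[0], [1], [1], [0]], float)

def unpack(w):                                             # 2-2-1 network weights
    return w[:4].reshape(2, 2), w[4:6], w[6:8].reshape(2, 1), w[8:9]

def forward(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    y = 1 / (1 + np.exp(-(h @ W2 + b2)))
    return h, y

def loss(w):
    return float(np.mean((forward(w)[1] - T) ** 2))

# Phase 1: coarse stochastic search for a promising initial weight vector
w = min((rng.normal(0, 2, 9) for _ in range(300)), key=loss)

# Phase 2: backpropagation (plain gradient descent) from that initial point
lr = 0.5
for _ in range(3000):
    W1, b1, W2, b2 = unpack(w)
    h, y = forward(w)
    dy = 2 * (y - T) / len(X) * y * (1 - y)                # gradient at the output layer
    dh = (dy @ W2.T) * (1 - h ** 2)                        # backprop through the tanh layer
    grad = np.concatenate([(X.T @ dh).ravel(), dh.sum(0),
                           (h.T @ dy).ravel(), dy.sum(0)])
    w = w - lr * grad

print("loss:", loss(w), "outputs:", forward(w)[1].ravel().round(2))
```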
