• Title/Summary/Keyword: Backpropagation

591 search results

Futures Price Prediction Using Backpropagation Neural Network (신경망을 이용한 선물가 예측)

  • 김성환;이상훈;김기태
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2003.04c
    • /
    • pp.467-469
    • /
    • 2003
  • In this paper, we present a system for predicting KOSPI 200 futures prices. Using historical data, the system learns trading patterns and their changes, as well as the market's price and volume patterns, and forecasts future futures prices. We experimented with the L2K system, which uses a backpropagation neural network as its learning algorithm, and tested various input data and changes in the training data to construct an optimal network and improve prediction accuracy. (An illustrative code sketch follows this entry.)

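The abstract above does not describe the L2K system's network layout, so the following is only a minimal sketch of the general idea under assumed settings (a synthetic price series in place of KOSPI 200 futures data, a window of 10 past values, 16 hidden units): a one-hidden-layer network trained by plain backpropagation to predict the next value of a series from a sliding window of past values.

```python
# Minimal sketch, not the paper's L2K system: a small backpropagation network
# trained on sliding windows of a synthetic price series to predict the next value.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic random-walk "price" series stands in for the KOSPI 200 futures data.
prices = np.cumsum(rng.normal(0, 1, 500)) + 100.0
prices = (prices - prices.mean()) / prices.std()           # normalise

window = 10                                                 # past observations per sample
X = np.array([prices[i:i + window] for i in range(len(prices) - window)])
y = prices[window:].reshape(-1, 1)                          # next-step target

# One hidden layer of tanh units, linear output, trained by plain gradient descent.
n_hidden = 16
W1 = rng.normal(0, 0.1, (window, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, 1));      b2 = np.zeros(1)
lr = 0.01

for epoch in range(2000):
    h = np.tanh(X @ W1 + b1)                    # forward pass
    pred = h @ W2 + b2
    err = pred - y                              # error signal for squared loss
    dW2 = h.T @ err / len(X); db2 = err.mean(0)        # backward pass
    dh = (err @ W2.T) * (1 - h ** 2)
    dW1 = X.T @ dh / len(X);  db1 = dh.mean(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final MSE:", float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)))
```

The paper additionally varies the input and training data to find a network configuration that improves accuracy; grid-searching `window` and `n_hidden` would play that role in this sketch.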

Development of a neural network with fuzzy preprocessor (퍼지 전처리기를 가진 신경회로망 모델의 개발)

  • 조성원;최경삼;황인호
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 1993.10a
    • /
    • pp.718-723
    • /
    • 1993
  • In this paper, we propose a neural network with a fuzzy preprocessor, not only to improve classification accuracy but also to classify objects whose attribute values do not have clear boundaries. The fuzzy input signal representation scheme is included as a preprocessing module: it transforms both imprecise input in linguistic form and precisely stated numerical input into multidimensional numerical values. The transformed input is then processed in the postprocessing module. The experimental results indicate the superiority of the backpropagation network with the fuzzy preprocessor over the conventional backpropagation network. (A sketch of the fuzzification step follows this entry.)

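The abstract describes the preprocessing step only in general terms, so this is a minimal sketch under assumed triangular membership functions: a crisp number or a linguistic label is mapped to a vector of membership degrees, and those vectors would then be fed to an ordinary backpropagation classifier.

```python
# Sketch of the fuzzy-preprocessor idea: both precise numbers and imprecise
# linguistic labels become multidimensional membership vectors.
import numpy as np

# Assumed fuzzy sets over a [0, 1]-scaled attribute (not the paper's definitions).
SETS = {"low": (-0.5, 0.0, 0.5), "medium": (0.0, 0.5, 1.0), "high": (0.5, 1.0, 1.5)}

def membership(x, a, b, c):
    """Triangular membership: rises linearly a->b, falls b->c, zero outside [a, c]."""
    return float(np.interp(x, [a, b, c], [0.0, 1.0, 0.0]))

def fuzzify(value):
    """Precisely stated numerical input -> membership degrees (low, medium, high)."""
    return np.array([membership(value, *abc) for abc in SETS.values()])

def fuzzify_linguistic(label):
    """Imprecise input in linguistic form -> full membership in the named set."""
    return np.array([1.0 if name == label else 0.0 for name in SETS])

print(fuzzify(0.7))                # approximately [0.  0.6  0.4]
print(fuzzify_linguistic("low"))   # [1. 0. 0.]
# The per-attribute vectors are concatenated and passed to a conventional
# backpropagation network, as in the paper's postprocessing module.
```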

Backpropagation Classification of Statistically

  • Kim, Sungmo;Kim, Byungwhan
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2002.10a
    • /
    • pp.46.2-46
    • /
    • 2002
  • Plasma processing plays a crucial role in fabricating integrated circuits (ICs). To manufacture ICs cost-effectively, there is an increasing demand for computer models that predict plasma properties for unknown process inputs. Physical models are limited in prediction accuracy since they rely on many assumptions. Expensive computation time is another hindrance that prevents their widespread use at manufacturing sites. To circumvent these difficulties inherent in physical models, neural networks have been used to learn nonlinear plasma data [1]. Among many types of networks, the backpropagation neural network (BPNN) is the most widely used architecture. Many training variables are...


Nonlinear System Modeling Based on Multi-Backpropagation Neural Network (다중 역전파 신경회로망을 이용한 비선형 시스템의 모델링)

  • Baeg, Jae-Huyk;Lee, Jung-Moon
    • Journal of Industrial Technology
    • /
    • v.16
    • /
    • pp.197-205
    • /
    • 1996
  • In this paper, we propose a new neural architecture synthesized from a combination of structures known as the MRCCN (Multi-resolution Radial-basis Competitive and Cooperative Network) and the BPN (Backpropagation Network). The proposed neural network improves the learning speed of the MRCCN and the mapping capability of the BPN. The ability and effectiveness of the proposed architecture in identifying a nonlinear dynamic system are demonstrated by computer simulation.


Pattern Recognition Using BP Learning Algorithm of Multiple Valued Logic Neural Network (다치 신경 망의 BP 학습 알고리즘을 이용한 패턴 인식)

  • 김두완;정환묵
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2002.12a
    • /
    • pp.502-505
    • /
    • 2002
  • This paper proposes a method that applies the BP (backpropagation) learning algorithm of a multiple-valued logic (MVL) neural network to pattern recognition. By using an MVL neural network for pattern recognition, the time and memory space required by the network can be minimized, and the possibility of adapting to environmental changes is presented. The MVL neural network is constructed on the basis of multiple-valued logic functions: inputs are converted with literal functions, outputs are obtained using MIN and MAX operations, and partial derivatives of the multiple-valued logic expressions are used for learning.

The Performance Improvement of Backpropagation Algorithm using the Gain Variable of Activation Function (활성화 함수의 이득 가변화를 이용한 역전파 알고리즘의 성능개선)

  • Chung, Sung-Boo;Lee, Hyun-Kwan;Eom, Ki-Hwan
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.38 no.6
    • /
    • pp.26-37
    • /
    • 2001
  • To address several problems of general backpropagation, we propose a method that uses a fuzzy logic system to automatically tune the gain of the activation function in backpropagation. First, we show that changing the gain of the sigmoid function is equivalent to changing the learning rate, the weights, and the biases. The inputs of the fuzzy logic system are the sensitivity of the error with respect to the last layer and the mean sensitivity of the error with respect to the hidden layer, and the output is the gain of the sigmoid function. To verify the effectiveness of the proposed method, we performed simulations on the parity problem, function approximation, and pattern recognition. The results show that the proposed method considerably improves performance compared with general backpropagation. (A numerical check of the gain equivalence follows this entry.)

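The equivalence the abstract relies on is easy to verify numerically: applying a gain g inside the sigmoid is the same as scaling the incoming weights and bias by g. This small check is a sketch of that identity only; the fuzzy tuner itself is not reproduced here because its rule base is not given in the abstract.

```python
# Numerical check: sigmoid(g * (w.x + b)) == sigmoid((g*w).x + g*b)
import numpy as np

def sigmoid(z, gain=1.0):
    return 1.0 / (1.0 + np.exp(-gain * z))

rng = np.random.default_rng(1)
x, w, b, g = rng.normal(size=5), rng.normal(size=5), 0.3, 2.5

out_gain     = sigmoid(w @ x + b, gain=g)     # gain applied to the activation
out_rescaled = sigmoid((g * w) @ x + g * b)   # gain folded into weights and bias
print(np.isclose(out_gain, out_rescaled))     # True
```

This is why tuning the gain can stand in for tuning the learning rate, weights, and biases together, which is what the fuzzy logic system exploits.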

Analyzing Factors Contributing to Research Performance using Backpropagation Neural Network and Support Vector Machine

  • Ermatita, Ermatita;Sanmorino, Ahmad;Samsuryadi, Samsuryadi;Rini, Dian Palupi
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.16 no.1
    • /
    • pp.153-172
    • /
    • 2022
  • In this study, the authors analyze factors contributing to research performance using a Backpropagation Neural Network and a Support Vector Machine. The analysis of factors contributing to lecturer research performance starts with defining the features. The next stage is to collect datasets based on the defined features and to transform the raw dataset into data ready for processing. After the data is transformed, features are selected; before selection, the target feature, research performance, is determined. Feature selection uses Chi-square scores (U) and Pearson correlation coefficients (CM) and yields eight factors contributing to lecturer research performance: Scientific Papers (U: 154.38, CM: 0.79), Number of Citations (U: 95.86, CM: 0.70), Conference (U: 68.67, CM: 0.57), Grade (U: 10.13, CM: 0.29), Grant (U: 35.40, CM: 0.36), IPR (U: 19.81, CM: 0.27), Qualification (U: 2.57, CM: 0.26), and Grant Awardee (U: 2.66, CM: 0.26). To analyze the factors, two data mining classifiers were used, a Backpropagation Neural Network (BPNN) and a Support Vector Machine (SVM), with accuracy scores of 95 percent for the BPNN and 92 percent for the SVM. The essence of this analysis is not to find the highest accuracy score, but rather to determine whether the factors pass the test phase with the expected results. The findings reveal which factors have a significant impact on research performance and which do not. (An illustrative sketch of the feature-scoring step follows this entry.)
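
The study's dataset is not reproducible from the abstract, so the following sketch only illustrates the feature-scoring step on placeholder data: Chi-square scores (U) and absolute Pearson correlations (CM) computed for each candidate factor against the target.

```python
# Illustrative feature scoring on synthetic data; feature names and values are
# placeholders, not the study's lecturer dataset.
import numpy as np
from sklearn.feature_selection import chi2

rng = np.random.default_rng(0)
features = ["scientific_papers", "citations", "conference", "grant"]     # assumed subset
X = rng.integers(0, 20, size=(200, len(features))).astype(float)         # non-negative counts
y = (X[:, 0] + 0.5 * X[:, 1] > 15).astype(int)                           # proxy target: research performance

U, _ = chi2(X, y)                                                        # Chi-square score per feature
CM = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])  # Pearson correlation magnitude

for name, u, cm in zip(features, U, CM):
    print(f"{name:18s}  U: {u:7.2f}  CM: {cm:.2f}")
```

In the study, the factors retained this way are then fed to BPNN and SVM classifiers for validation.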

A Study on the Cutting Characteristics and Detection of the Abnormal Tool State in Hard Turning (고경도강 선삭시 절삭특성 및 공구 이상상태 검출에 관한 연구)

  • Lee S.J.;Shin H.G.;Kim M.H.;Kim J.T.;Lee H.K.;Kim T.Y.
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2005.06a
    • /
    • pp.452-455
    • /
    • 2005
  • The cutting characteristics of hardened steel machined with a PCBN tool are investigated with respect to workpiece surface roughness, cutting force, and tool flank wear measured by a vision system. Backpropagation neural networks (BPNNs) were used for the detection of tool wear. The neural network consisted of three layers: input, hidden, and output. The input vectors comprised spindle rotational speed, feed rate, vision-measured flank wear, and thrust force signals; the output was the tool wear state, classified as either usable or failed. Hard turning experiments with various spindle rotational speeds and feed rates were carried out. The learning process was performed effectively by utilizing backpropagation. The detection of abnormal states using BPNNs achieved 96.4% reliability even when the spindle rotational speed and feed rate were changed. (An illustrative classifier sketch follows this entry.)

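The paper's experimental data is not available from the abstract, so this is only a sketch of the classification setup it describes: the four input signals (spindle speed, feed rate, vision-measured flank wear, thrust force) mapped to a usable/failure label by a backpropagation-trained network. The data and the labelling rule below are synthetic placeholders, and the paper's 96.4% figure refers to its own experiments, not to this toy setup.

```python
# Sketch of a BPNN tool-state classifier on synthetic signals (not the paper's data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 400
X = np.column_stack([
    rng.uniform(500, 2000, n),    # spindle rotational speed [rpm]
    rng.uniform(0.05, 0.3, n),    # feed rate [mm/rev]
    rng.uniform(0.0, 0.4, n),     # flank wear from the vision system [mm]
    rng.uniform(50, 300, n),      # thrust force [N]
])
y = (X[:, 2] > 0.25).astype(int)  # toy labelling rule: heavy flank wear -> failure

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(10,), max_iter=3000, random_state=0))
clf.fit(X_tr, y_tr)                                   # trained by backpropagation
print("held-out accuracy:", clf.score(X_te, y_te))
```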

Genetic Algorithm with the Local Fine-Tuning Mechanism (유전자 알고리즘을 위한 지역적 미세 조정 메카니즘)

  • 임영희
    • Korean Journal of Cognitive Science
    • /
    • v.4 no.2
    • /
    • pp.181-200
    • /
    • 1994
  • In the learning phase of a multilayer feedforward neural network, the backpropagation algorithm suffers from problems such as local minima, learning paralysis, and slow learning speed. To overcome these problems, the genetic algorithm has been used as the learning method for multilayer feedforward neural networks instead of the backpropagation algorithm. However, because the genetic algorithm has no mechanism for the fine-tuned local search that the backpropagation method provides, it takes more time to converge to a globally optimal solution. In this paper, we propose a new GA-BP method that adds fine-tuned local search to the genetic algorithm: the GA-BP method uses gradient descent as one of the genetic algorithm's operators, alongside mutation and crossover. To show the efficiency of the proposed method, we applied it to the 3-parity bit problem and analyzed the results. (A minimal sketch of the GA-BP idea follows this entry.)
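
The abstract fixes the idea (gradient descent used as an additional genetic operator) but not the implementation details, so the following is a minimal sketch under assumed settings (population size, mutation scale, a 3-4-1 network for the 3-parity problem): each individual encodes the network weights, and a few backpropagation steps fine-tune every individual before selection, crossover, and mutation.

```python
# Sketch of a GA-BP hybrid on the 3-parity problem: a GA over flattened network
# weights, with a local gradient-descent (backpropagation) fine-tuning operator.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[int(b) for b in f"{i:03b}"] for i in range(8)], float)  # all 3-bit inputs
y = (X.sum(1) % 2).reshape(-1, 1)                                      # parity target

H = 4                                    # hidden units (assumed)
DIM = 3 * H + H + H + 1                  # W1, b1, W2, b2 flattened

def unpack(v):
    W1 = v[:3 * H].reshape(3, H); b1 = v[3 * H:4 * H]
    W2 = v[4 * H:5 * H].reshape(H, 1); b2 = v[5 * H:]
    return W1, b1, W2, b2

def loss_and_grad(v):
    W1, b1, W2, b2 = unpack(v)
    h = np.tanh(X @ W1 + b1)
    o = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    dz2 = (o - y) * o * (1 - o)                      # backprop through loss and sigmoid
    dz1 = (dz2 @ W2.T) * (1 - h ** 2)                # backprop through tanh layer
    grad = np.concatenate([(X.T @ dz1).ravel(), dz1.sum(0),
                           (h.T @ dz2).ravel(), dz2.sum(0)])
    return float(np.mean((o - y) ** 2)), grad

def fine_tune(v, steps=20, lr=0.5):
    """The extra GA operator: a short run of plain gradient descent."""
    for _ in range(steps):
        v = v - lr * loss_and_grad(v)[1]
    return v

pop = rng.normal(0, 1, (30, DIM))
for gen in range(40):
    pop = np.array([fine_tune(v) for v in pop])            # local fine-tuning step
    fit = np.array([loss_and_grad(v)[0] for v in pop])
    order = np.argsort(fit)                                 # lower MSE is fitter
    parents, best = pop[order[:10]], pop[order[0]]
    children = [best]                                       # keep the elite
    while len(children) < len(pop):
        a, b = parents[rng.integers(10, size=2)]
        mask = rng.random(DIM) < 0.5                        # uniform crossover
        children.append(np.where(mask, a, b) + rng.normal(0, 0.1, DIM))  # mutation
    pop = np.array(children)

print("best final MSE:", min(loss_and_grad(v)[0] for v in pop))
```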

A Study on Fuzzy Wavelet Neural Network System Based on ANFIS Applying Bell Type Fuzzy Membership Function (벨형 퍼지 소속함수를 적용한 ANFIS 기반 퍼지 웨이브렛 신경망 시스템의 연구)

  • 변오성;조수형;문성용
    • Journal of the Institute of Electronics Engineers of Korea TE
    • /
    • v.39 no.4
    • /
    • pp.363-369
    • /
    • 2002
  • In this paper, we improve the learning approximation of arbitrary nonlinear functions using a wavelet neural network based on the Adaptive Neuro-Fuzzy Inference System (ANFIS) and the multi-resolution analysis (MRA) of the wavelet transform. The ANFIS structure is composed of bell-type fuzzy membership functions, and the wavelet neural network structure is composed of a forward algorithm and the backpropagation neural network algorithm. The wavelet decomposition uses a single scale, and the backpropagation algorithm is used to train the ANFIS-based wavelet neural network. On one- and two-dimensional functions, the ANFIS-based wavelet neural network using wavelet translation parameter learning and bell-type membership functions is confirmed to reduce the number of wavelet bases and improve convergence speed compared with the conventional algorithm. (A sketch of the bell-type membership function follows this entry.)
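
The only component that can be illustrated directly from the abstract is the bell-type membership function used in the ANFIS premise layer; its standard generalized-bell form is sketched below with arbitrary example parameters.

```python
# Generalized bell membership function (bell-type fuzzy membership) as used in
# ANFIS premises; the parameter values below are arbitrary examples.
import numpy as np

def gbell(x, a, b, c):
    """mu(x) = 1 / (1 + |(x - c) / a| ** (2 * b)); a = width, b = shoulder slope, c = centre."""
    return 1.0 / (1.0 + np.abs((x - c) / a) ** (2 * b))

x = np.linspace(-2.0, 2.0, 9)
print(np.round(gbell(x, a=1.0, b=2.0, c=0.0), 3))
# In standard ANFIS, a, b and c are premise parameters tuned by gradient descent
# (backpropagation), which is the learning scheme the paper also applies to its
# wavelet network.
```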