Title/Summary/Keyword: Input Data

A study on removal of unnecessary input variables using multiple external association rule (다중외적연관성규칙을 이용한 불필요한 입력변수 제거에 관한 연구)

  • Cho, Kwang-Hyun;Park, Hee-Chang
    • Journal of the Korean Data and Information Science Society
    • /
    • v.22 no.5
    • /
    • pp.877-884
    • /
    • 2011
  • The decision tree is a representative data mining algorithm used in many domains such as retail target marketing, fraud detection, data reduction, variable screening, and category merging. The method is most useful in classification problems and for making predictions about a target group after dividing it into several smaller groups. When we build a decision tree model with a large number of input variables, the resulting trees become complex and difficult to explore and analyze. Moreover, an apparent association often exists between input variables because of external variables, even when there is no intrinsic association. In this paper, we study a method for removing unnecessary input variables using multiple external association rules, and we apply the method to actual data to assess its efficiency.
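As a minimal illustration of the kind of spurious association described above (hypothetical data, not the authors' method), the sketch below computes the lift of the rule X→Y on the full data and within each level of an external variable Z; the association disappears once Z is held fixed:

```python
def confidence(rows, ante, cons):
    # confidence of the rule ante -> cons
    a = [r for r in rows if ante(r)]
    return sum(1 for r in a if cons(r)) / len(a) if a else 0.0

def lift(rows, ante, cons):
    # lift > 1 signals association; lift = 1 signals independence
    base = sum(1 for r in rows if cons(r)) / len(rows)
    return confidence(rows, ante, cons) / base

# Each row is (X, Y, Z); within each Z stratum, X and Y are independent.
rows  = [(1, 1, 0)] * 64 + [(1, 0, 0)] * 16 + [(0, 1, 0)] * 16 + [(0, 0, 0)] * 4
rows += [(1, 1, 1)] * 4  + [(1, 0, 1)] * 16 + [(0, 1, 1)] * 16 + [(0, 0, 1)] * 64

X = lambda r: r[0] == 1
Y = lambda r: r[1] == 1

print(lift(rows, X, Y))                        # 1.36: X and Y look associated
for z in (0, 1):
    stratum = [r for r in rows if r[2] == z]
    print(lift(stratum, X, Y))                 # 1.0: no association given Z
```

In this construction X and Y are each driven by Z alone, so the overall lift of 1.36 is entirely induced by the external variable.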

Cable damage identification of cable-stayed bridge using multi-layer perceptron and graph neural network

  • Pham, Van-Thanh;Jang, Yun;Park, Jong-Woong;Kim, Dong-Joo;Kim, Seung-Eock
    • Steel and Composite Structures
    • /
    • v.44 no.2
    • /
    • pp.241-254
    • /
    • 2022
  • The cables in a cable-stayed bridge are critical load-carrying parts. Potential damage to the cables should be identified early to prevent disasters. In this study, an efficient deep learning model is proposed for the damage identification of cables using both a multi-layer perceptron (MLP) and a graph neural network (GNN). Datasets are first generated using the practical advanced analysis program (PAAP), a robust program for modeling and analyzing bridge structures at low computational cost. The model based on the MLP and GNN can capture complex nonlinear correlations between the vibration characteristics in the input data and the cable system damage in the output data. Multiple hidden layers with an activation function are used in the MLP to expand the original input vector of the limited measurement data into a complete output vector that preserves sufficient information for constructing the graph in the GNN. Using the gated recurrent unit and set2set model, the GNN maps the formed graph feature to the output cable damage through several update steps and provides the damage results to both classification and regression outputs. The model is fine-tuned on the original input data using Adam optimization of the final objective function. A case study of an actual cable-stayed bridge was considered to evaluate the model performance. The results demonstrate that the proposed model provides high accuracy (over 90%) in classification and satisfactory correlation coefficients (over 0.98) in regression, and is a robust approach for obtaining effective identification results from a limited quantity of input data.
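The MLP expansion step the abstract describes can be sketched in a few lines; this is a minimal illustration with random weights and made-up layer sizes, not the trained model from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

def mlp_expand(x, sizes):
    # widen a short measurement vector through successive hidden layers,
    # each a random linear map followed by a ReLU activation (untrained sketch)
    for n_out in sizes:
        w = rng.normal(0.0, 1.0, (n_out, x.size)) / np.sqrt(x.size)
        x = np.maximum(0.0, w @ x)
    return x

measurements = rng.normal(0.0, 1.0, 8)     # limited sensor readings
expanded = mlp_expand(measurements, [32, 64])
print(expanded.shape)                      # (64,): enough features for a graph
```

In the paper the weights would be learned end to end; the point here is only the shape of the transformation, 8 measured values expanded into a 64-dimensional feature vector for graph construction.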

Stochastic Multiple Input-Output Model for Extension and Prediction of Monthly Runoff Series (월유출량계열의 확장과 예측을 위한 추계학적 다중 입출력모형)

  • 박상우;전병호
    • Water for future
    • /
    • v.28 no.1
    • /
    • pp.81-90
    • /
    • 1995
  • This study attempts to develop a stochastic system model for the extension and prediction of monthly runoff series in river basins where the observed runoff data are insufficient even though long-term hydrometeorological records exist. For this purpose, univariate models of the seasonal ARIMA type are derived from time series analysis of monthly runoff, monthly precipitation, and monthly evaporation data with trend and periodicity. Also, a causal model with a multiple input-single output relationship, which takes monthly precipitation and monthly evaporation as input variables and monthly runoff as the output variable, is built by cross-correlation analysis of each series. The performance of the univariate model and the multiple input-output model was examined by comparing the historical and the generated monthly runoff series. The results reveal that the multiple input-output model leads to improved accuracy and a wider range of applicability when extension and prediction of monthly runoff series are required.
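A minimal sketch of a multiple input-single output fit of this kind, with precipitation and evaporation driving runoff, can be written as a least-squares regression; the seasonal data and coefficients below are synthetic, not the paper's stochastic model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120                                    # ten years of monthly data
months = np.arange(n)

# synthetic seasonal inputs (hypothetical magnitudes)
precip = 100 + 50 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, n)
evap   = 40 + 20 * np.cos(2 * np.pi * months / 12) + rng.normal(0, 2, n)

# synthetic output: runoff responds positively to rain, negatively to evaporation
runoff = 0.6 * precip - 0.3 * evap + rng.normal(0, 3, n)

# multiple input-single output fit by ordinary least squares
X = np.column_stack([precip, evap, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, runoff, rcond=None)
print(coef[:2])                            # close to the true [0.6, -0.3]
```

The fitted model can then be driven by the longer precipitation and evaporation records to extend the short runoff series, which is the use case the abstract targets.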


Generation of the Input Profile for Fatigue Vibration Testing in MAST System (자동차부품(시트,도어) 6축 진동 재현을 위한 가진 프로파일 생성 기법)

  • Kim, Chan-Jung;Beak, Gyoung-Won;Lee, Bong-Hyun
    • Proceedings of the Korean Society for Noise and Vibration Engineering Conference
    • /
    • 2005.05a
    • /
    • pp.413-418
    • /
    • 2005
  • Vibration testing using the MAST (Multi-Axial Simulation Table) is more reliable than the conventional testing process focused on one-directional vibration. Such testing is made possible by an advanced control algorithm and hardware support, so that most of the operation is conducted automatically by the MAST system itself, except for the input information derived from the measured data. This means the reliability of the vibration test depends on the input profile more than ever before. In this paper, an optimization algorithm based on the energy method is introduced so that the best combination of candidate input PSD data can be constructed. The algorithm also yields timing information, so that a vibration fatigue test is possible for any measured signal of interest. The real road test is conducted over short intervals containing some rough roads, and the candidate input PSD is obtained from an additional road in the proving ground. The testing targets an electronically operated door and seat.
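As a hedged sketch of the kind of input PSD data involved, the snippet below estimates a power spectral density with a plain averaged periodogram on a synthetic road signal; it illustrates the data, not the paper's energy-method optimization:

```python
import numpy as np

def psd(signal, fs, nseg=8):
    # averaged, Hann-windowed periodogram: split the record into segments,
    # window each, and average the scaled magnitude spectra
    segs = np.array_split(signal, nseg)
    n = min(len(s) for s in segs)
    win = np.hanning(n)
    spectra = [np.abs(np.fft.rfft(s[:n] * win)) ** 2 for s in segs]
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    return freqs, np.mean(spectra, axis=0) / (fs * np.sum(win ** 2))

fs = 1024.0                                # hypothetical sampling rate
t = np.arange(0, 8, 1 / fs)
# synthetic "measured road signal": a 30 Hz tone buried in noise
road = np.sin(2 * np.pi * 30 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)

freqs, p = psd(road, fs)
print(freqs[np.argmax(p)])                 # spectral peak near 30 Hz
```

PSD estimates of this form, computed for each candidate road interval, are the raw material the optimization combines into a single excitation profile.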


Robust Parameter Design Based on Back Propagation Neural Network (인공신경망을 이용한 로버스트설계에 관한 연구)

  • Arungpadang, Tritiya R.;Kim, Young Jin
    • Korean Management Science Review
    • /
    • v.29 no.3
    • /
    • pp.81-89
    • /
    • 2012
  • Since it was introduced by Vining and Myers in 1990, the dual response approach based on response surface methodology has been widely investigated and adopted for the purpose of robust design. By estimating the mean and variance responses separately, the dual response approach can take advantage of optimization modeling to find optimal settings of the input factors. Because it explicitly assumes a functional relationship between the responses and the input factors, however, it may not work well when the behavior of the responses is poorly represented, and a sufficient number of experiments is required to improve the precision of the estimates. This study proposes an alternative to the dual response approach that requires no additional experiments. An artificial neural network is applied to model the relationships between the responses and the input factors: the mean and variance responses correspond to output nodes, while the input factors are used for the input nodes. After training, validating, and testing the neural network with empirical process data, artificial data based on the network can be generated and used to estimate the response functions without performing real experiments. A drug formulation example from the pharmaceutical industry is investigated to demonstrate the procedure and the applicability of the proposed approach.

System Identification by Adjusted Least Squares Method (ALS법에 의한 시스템동정)

  • Lee, Dong-Cheol;Bae, Jong-Il;Chung, Hwung-Hwan;Jo, Bong-Hwan
    • Proceedings of the KIEE Conference
    • /
    • 2002.07d
    • /
    • pp.2216-2218
    • /
    • 2002
  • System identification measures the output of a controlled system in the presence of an adequate input and estimates a mathematical model on the basis of the input-output data. In system identification, the true parameter values can be estimated by the least squares method when the input and output carry no observation noise, and by the total least squares method when the input and output are corrupted by observation noise. Recently, the adjusted least squares method has been suggested as a consistent estimation method for system identification when the output, but not the input, is corrupted by observation noise. In this paper, we develop the adjusted least squares method from the least squares method and confirm its efficiency by comparing the estimation results with the generated data in computer simulations.
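The contrast between ordinary and total least squares under observation noise can be sketched as follows; this illustrates the errors-in-variables setting on synthetic static data, not the authors' ALS derivation:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000
a_true = 2.0
x_true = rng.normal(0, 1, n)

x = x_true + rng.normal(0, 0.5, n)             # input observed with noise
y = a_true * x_true + rng.normal(0, 0.5, n)    # output observed with noise

# ordinary least squares: biased toward zero when the regressor is noisy
a_ols = (x @ y) / (x @ x)

# total least squares: smallest right singular vector of [x | y]
_, _, vt = np.linalg.svd(np.column_stack([x, y]), full_matrices=False)
v = vt[-1]
a_tls = -v[0] / v[1]

print(round(a_ols, 2), round(a_tls, 2))        # OLS near 1.6, TLS near 2.0
```

With equal noise on both channels, OLS attenuates the slope (here toward 2/1.25 = 1.6) while TLS recovers it; the adjusted least squares method of the paper addresses the intermediate case where only the output is noisy.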


Using Data Gloves for control of the 3-Dimensional postprocessing software (Data Glove를 이용한 3차원 데이터 후처리 소프트웨어의 제어)

  • Kim K. Y.;Kim B. S.
    • Proceedings of the Korean Society of Computational Fluids Engineering Conference (한국전산유체공학회 학술대회논문집)
    • /
    • 2004.03a
    • /
    • pp.56-61
    • /
    • 2004
  • As the size and dimensionality of target problems in computational engineering, including CFD, grow larger, a more efficient and flexible data visualization environment is needed in terms of both software and hardware. Even though it is still manageable to use a mouse to control 3-dimensional data visualization, it is beneficial to use a 3-D input device for 3-D visualization. The 'Data Glove' is one of the best 3-D input devices, because human hands are the best tools for understanding 3-D space. Signals coming from the 'Data Glove' are analog and very sensitive to finger motions, so we decided to use a digital filter. This paper describes our experience with, and the benefits of, using a data glove to control 3-dimensional postprocessing software.
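A minimal sketch of such digital filtering is a first-order low-pass (exponential moving average); this particular filter is chosen here for illustration and is not taken from the paper:

```python
def lowpass(samples, alpha=0.2):
    # first-order IIR low-pass: y[k] = alpha*x[k] + (1-alpha)*y[k-1];
    # smaller alpha suppresses jitter more at the cost of added lag
    out, y = [], samples[0]
    for s in samples:
        y = alpha * s + (1 - alpha) * y
        out.append(y)
    return out

# jittery normalized finger-bend reading (made-up values)
raw = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
print(lowpass(raw))
```

The filtered output swings far less than the raw alternation, which is the behavior needed before mapping glove readings to smooth camera or object motion.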


Automated Enterprise Data Model by Formulating Requirements

  • Lee, Sang-Won
    • Journal of Information Technology Applications and Management
    • /
    • v.16 no.4
    • /
    • pp.263-283
    • /
    • 2009
  • Although some CASE tools supported conceptual data design, they required too much preliminary knowledge from users to learn how to use and handle them. In addition, a number of studies on conceptual data design with natural language limited users to passive participation, conforming to messages predefined by the CASE tools. As an alternative to these traditional CASE tools, we propose an automated data design tool called ERDF (ERD Formulator), so that even ordinary users, not necessarily data modelers, can formulate an ERD from business requirements. We first introduce NSM as the standard methodology. We also design the structure of ERDF, comprising a main controller, input controller, operation controller, regulation controller, schema controller, and output controller. We then define conceptual domains and basic operations to lay down schema operations, as well as sentence rules to handle input sentences in natural language. To obtain an ERD that is faithful to the business requirements, we lay out a supplementary design for dialogue and for confirming soundness and completeness.


CONTROL OF A 3-DIMENSIONAL POSTPROCESSING SOFTWARE USING DATA GLOVES FOR IMMERSIVE ENVIRONMENT (몰입 환경을 위한 3차원 데이터 후처리 소프트웨어의 데이터 글로브에 의한 제어 구현)

  • Kim K.Y.;Kim B.S.
    • Journal of computational fluids engineering
    • /
    • v.11 no.2 s.33
    • /
    • pp.56-61
    • /
    • 2006
  • As the size and dimensionality of target problems in computational engineering, including CFD, grow larger, a more efficient and flexible data visualization environment is needed in terms of both software and hardware. Even though it is still manageable to use a mouse to control 3-dimensional data visualization, it is beneficial to use a 3-D input device for 3-D visualization. The 'Data Glove' is one of the best 3-D input devices, because human hands are the best tools for understanding 3-D space and manipulating 3-D objects. Signals coming from the 'Data Glove' are analog and very sensitive to finger motions, so signal filtering with a digital filter is applied. This paper describes our experience with, and the benefits of, using data gloves to control 3-dimensional postprocessing software.

The Study on Application of Data Gathering for the site and Statistical analysis process (초기 데이터 분석 로드맵을 적용한 사례 연구)

  • Choi, Eun-Hyang;Ree, Sang-Bok
    • Proceedings of the Korean Society for Quality Management Conference
    • /
    • 2010.04a
    • /
    • pp.226-234
    • /
    • 2010
  • In this paper, we present a process that removes data errors before statistical analysis. If field data are not examined even for basic validity, the resulting statistical information cannot be trusted. Since statistical results are produced from the data input to the analysis process, the input data should be free of errors. We study the application of a statistical analysis roadmap that organizes the basic theory and the approach to the initial exploratory data phase, an essential step before statistical analysis, so that it can be applied more readily on site. In this way, access to statistical analysis is enhanced and the reliability of the analysis results is secured by conducting correct statistical analysis.
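One common validity screen of the kind such a roadmap includes can be sketched as follows; the 1.5×IQR fence rule and the data are illustrative assumptions, as the abstract does not specify a particular rule:

```python
def iqr_outliers(values):
    # flag values outside the Tukey fences [Q1 - 1.5*IQR, Q3 + 1.5*IQR]
    xs = sorted(values)

    def quantile(p):
        # linear interpolation between order statistics
        k = p * (len(xs) - 1)
        i = int(k)
        j = min(i + 1, len(xs) - 1)
        return xs[i] + (k - i) * (xs[j] - xs[i])

    q1, q3 = quantile(0.25), quantile(0.75)
    lo = q1 - 1.5 * (q3 - q1)
    hi = q3 + 1.5 * (q3 - q1)
    return [v for v in values if v < lo or v > hi]

# hypothetical field measurements; 99.0 is a data-entry mistake
data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 10.1, 99.0]
print(iqr_outliers(data))   # [99.0]
```

Flagged values would then be inspected and corrected or removed before the statistical analysis proper, which is the point of the initial data exploration phase.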
