• Title/Summary/Keyword: model reduction error


Structural Design of the Outer Tie Rod for an Electrical Vehicle (전기 자동차용 아우터 타이로드의 구조설계)

  • Seo, Bu-Kyo;Kim, Jong-Kyu;Lee, Dong-Jin;Seo, Sun-Min;Lee, Kwon-Hee;Park, Young-Chul
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.14 no.9
    • /
    • pp.4171-4177
    • /
    • 2013
  • The outer tie rod is lighter than most other chassis parts, but its weight and part count are trending upward as vehicle performance improves. Weight reduction is therefore essential for improving vehicle fuel efficiency. This research performed finite element analysis to investigate the structural performance of the outer tie rod for an electric vehicle, as a preliminary study toward a lightweight design. The weight of the outer tie rod was optimized by adopting a steel material and applying a trial-and-error method; strength with respect to durability and buckling must be considered in the structural design of an outer tie rod. Furthermore, meta model-based optimization was applied to obtain the lightweight design, leading to a 9 % weight reduction.
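
As a rough illustration of meta model-based weight optimization, the sketch below fits a quadratic response surface to a handful of sampled "analysis" results and then optimizes on the cheap surrogate. The stress formula, allowable limit, and diameter range are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical "expensive" FEA response: stress as a function of rod diameter d (mm).
# In a real workflow each call would be a finite element run; here it is a stand-in formula.
def fea_stress(d):
    return 4000.0 / d**2 + 5.0  # MPa, illustrative only

# 1) Sample a few design points (the DOE step of meta-model optimization).
d_samples = np.array([10.0, 12.0, 14.0, 16.0, 18.0])
s_samples = np.array([fea_stress(d) for d in d_samples])

# 2) Fit a quadratic response surface (the meta-model) to the samples.
coeffs = np.polyfit(d_samples, s_samples, 2)
meta = np.poly1d(coeffs)

# 3) Optimize on the cheap meta-model: smallest diameter whose predicted
#    stress stays under an allowable limit (the durability constraint).
d_grid = np.linspace(10.0, 18.0, 801)
allowable = 30.0  # MPa, assumed limit
feasible = d_grid[meta(d_grid) <= allowable]
d_opt = feasible.min()

# Weight scales with cross-section area ~ d^2; compare against a baseline d = 18 mm.
reduction = 1.0 - (d_opt / 18.0) ** 2
print(f"optimal diameter {d_opt:.2f} mm, weight reduction {reduction:.1%}")
```

The point of the surrogate is that the grid search touches only the fitted polynomial; the expensive analysis is called just five times.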

Reconstruction and Deconvolution of X-Ray Backscatter Data Using Adaptive Filter (적응필터를 이용한 적층 복합재료에서의 역산란 X-Ray 신호처리 및 복원)

  • Kim, Noh-Yu
    • Journal of the Korean Society for Nondestructive Testing
    • /
    • v.20 no.6
    • /
    • pp.545-554
    • /
    • 2000
  • The Compton X-ray backscatter technique has been used to quantitatively assess impact damage in quasi-isotropic laminated composites and to obtain a cross-sectional profile of impact-damaged laminates from the density variation of the cross section. An adaptive filter is applied to the Compton backscattering data for reconstruction and for reduction of noise from many sources, including quantum noise, especially when the SNR (signal-to-noise ratio) of the image is relatively low. A nonlinear reconstruction model is also proposed to overcome distortion of the Compton backscatter image due to attenuation effects, beam hardening, and irregular distributions of the fibers and the matrix in composites. Delaminations masked or distorted by the first few delaminations near the front surface are detected and characterized in both width and location by applying an error minimization algorithm.
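
A minimal sketch of adaptive filtering for noise reduction in the spirit of the paper, using a normalized LMS predictor on a synthetic density profile. The signal, noise level, and filter settings are assumptions, not the paper's actual data or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "backscatter" profile: a smooth density step corrupted by noise.
n = 500
clean = np.where(np.arange(n) < 250, 1.0, 0.4)
noisy = clean + 0.2 * rng.standard_normal(n)

# Normalized LMS adaptive filter used as an adaptive smoother: predict each
# sample from the previous `taps` samples and nudge the weights by the error.
taps, mu, eps = 8, 0.5, 1e-6
w = np.zeros(taps)
out = np.zeros(n)
for i in range(taps, n):
    x = noisy[i - taps:i]
    y = w @ x
    e = noisy[i] - y              # prediction error drives the update
    w += mu * e * x / (x @ x + eps)  # normalized LMS update rule
    out[i] = y

# The adaptive output should track the clean profile more closely than the raw data.
err_raw = np.mean((noisy[taps:] - clean[taps:]) ** 2)
err_lms = np.mean((out[taps:] - clean[taps:]) ** 2)
print(err_raw, err_lms)
```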


The Analysis of the Effect of Fiscal Decentralization on Economic Growth: Centering The U. S. (재정분권화가 경제성장에 미치는 영향에 관한 실증연구: 미국의 경우를 중심으로)

  • Choi, Won Ick
    • International Area Studies Review
    • /
    • v.16 no.3
    • /
    • pp.77-97
    • /
    • 2012
  • Many studies of the effect of fiscal decentralization on a country's economic growth use the traditional OLS method, so the estimated coefficients suffer serious problems including inconsistency and bias. Because these studies use the data intactly, the so-called "spurious regression" phenomenon arises, which causes a fundamental fallacy. This research applies unit root and cointegration tests and then estimates the United States' economic time series using a VECM. The analysis of the effect of state-level fiscal decentralization on economic growth shows two long-term equilibriums; during short-term dynamic adjustment, fiscal decentralization and economic growth move in the same or different directions. In the forecast, GDP increases steeply and then, from 2015, gently, while the fiscal decentralization index shows a general downward trend and then decreases slowly. At the local level there are also two long-term equilibriums, and during short-term dynamic adjustment fiscal decentralization and economic growth likewise move in the same or different directions. Impulse response analysis shows a clearly negative effect of fiscal decentralization on economic growth.
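
The "spurious regression" problem the authors avoid can be demonstrated in a few lines: regressing one random walk on another, independent one often yields a deceptively high R², while regressing the differenced (stationary) series does not. The series below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Two completely independent random walks (unit-root series).
x = np.cumsum(rng.standard_normal(n))
y = np.cumsum(rng.standard_normal(n))

def r_squared(x, y):
    # OLS of y on x with intercept; return the coefficient of determination.
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

r2_levels = r_squared(x, y)                    # often spuriously "significant"
r2_diffs = r_squared(np.diff(x), np.diff(y))   # near zero, as it should be
print(r2_levels, r2_diffs)
```

This is why the unit root and cointegration tests come first: only after establishing the integration order does a levels relationship (or a VECM) carry meaning.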

Vibration Data Denoising and Performance Comparison Using Denoising Auto Encoder Method (Denoising Auto Encoder 기법을 활용한 진동 데이터 전처리 및 성능비교)

  • Jang, Jun-gyo;Noh, Chun-myoung;Kim, Sung-soo;Lee, Soon-sup;Lee, Jae-chul
    • Journal of the Korean Society of Marine Environment & Safety
    • /
    • v.27 no.7
    • /
    • pp.1088-1097
    • /
    • 2021
  • Vibration data from mechanical equipment inevitably contain noise, which adversely affects equipment maintenance. Accordingly, the performance of a learning model depends on how effectively the noise is removed from the data. In this study, noise was removed using the Denoising Auto Encoder (DAE) technique, which requires no separate feature-extraction step when preprocessing time-series data. Its performance was compared with that of the Wavelet Transform, which is widely used for machine signal processing, by calculating the failure detection rate; for a more accurate comparison, the F-1 Score, a classification performance criterion, was also calculated. Failure data were detected using the One-Class SVM technique. The performance comparison revealed that the DAE technique outperformed the Wavelet Transform in terms of failure diagnosis and error rate.
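
A minimal denoising auto encoder can be sketched in plain NumPy: a small encoder-decoder network trained to map noisy segments to clean ones. The toy sine "vibration" data and network sizes below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "vibration" dataset: noisy sine segments; the targets are the clean segments.
n_samples, dim = 256, 32
t = np.linspace(0, 2 * np.pi, dim)
phases = rng.uniform(0, 2 * np.pi, n_samples)
clean = np.sin(t[None, :] + phases[:, None])
noisy = clean + 0.3 * rng.standard_normal((n_samples, dim))

# One-hidden-layer denoising auto encoder: noisy input -> clean reconstruction.
hidden, lr = 16, 0.05
W1 = 0.1 * rng.standard_normal((dim, hidden)); b1 = np.zeros(hidden)
W2 = 0.1 * rng.standard_normal((hidden, dim)); b2 = np.zeros(dim)

for epoch in range(1000):
    h = np.tanh(noisy @ W1 + b1)      # encoder
    out = h @ W2 + b2                 # decoder (linear output)
    err = out - clean                 # gradient of the MSE w.r.t. the output
    # Backpropagation through the two layers.
    gW2 = h.T @ err / n_samples; gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)  # tanh derivative
    gW1 = noisy.T @ dh / n_samples; gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

denoised = np.tanh(noisy @ W1 + b1) @ W2 + b2
mse_before = np.mean((noisy - clean) ** 2)
mse_after = np.mean((denoised - clean) ** 2)
print(mse_before, mse_after)
```

Note the key property the abstract emphasizes: no hand-crafted feature extraction is involved; the bottleneck itself learns what structure to keep.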

Damage localization and quantification of a truss bridge using PCA and convolutional neural network

  • Jiajia, Hao;Xinqun, Zhu;Yang, Yu;Chunwei, Zhang;Jianchun, Li
    • Smart Structures and Systems
    • /
    • v.30 no.6
    • /
    • pp.673-686
    • /
    • 2022
  • Deep learning algorithms for Structural Health Monitoring (SHM) have been attracting the interest of researchers and engineers. These algorithms commonly use loss functions and evaluation indices, such as the mean square error (MSE), that were not originally designed for SHM problems. An updated loss function specifically constructed for deep-learning-based structural damage detection is proposed in this study. By tuning the coefficients of the loss function, the weights for damage localization and quantification can be adapted to the real situation, so the network avoids unnecessary iterations on damage localization and focuses on damage severity identification. To prove the efficiency of the proposed method, structural damage detection using convolutional neural networks (CNNs) was conducted on a truss bridge model. Results showed that the validation curve with the updated loss function converged faster than with the traditional MSE. Data augmentation was conducted to improve the anti-noise ability of the method. To reduce the training time, the normalized modal strain energy change (NMSEC) was extracted and principal component analysis (PCA) was adopted for dimension reduction; the training time was reduced by 90% while the damage identification accuracy increased slightly. The effect of different modes and elements on the training dataset was also analyzed. The proposed method greatly improves structural damage detection in both training time and detection accuracy.
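
The idea of a two-term loss with tunable weights for localization versus severity can be sketched as follows; the weight values, element counts, and scores are hypothetical, not the paper's actual formulation.

```python
import numpy as np

def damage_loss(pred_loc, true_loc, pred_sev, true_sev, w_loc=0.2, w_sev=0.8):
    """Hypothetical two-term loss in the spirit of the paper: a localization
    term (which element is damaged) plus a severity term, with tunable weights."""
    loc_err = np.mean((pred_loc - true_loc) ** 2)  # e.g. one-hot element scores
    sev_err = np.mean((pred_sev - true_sev) ** 2)  # stiffness-reduction ratio
    return w_loc * loc_err + w_sev * sev_err

# Example: element 3 of 6 is damaged with 30 % stiffness loss; the "network"
# predicts element scores and a severity of 0.25.
true_loc = np.eye(6)[3]
pred_loc = np.array([0.05, 0.05, 0.1, 0.7, 0.05, 0.05])
loss_balanced = damage_loss(pred_loc, true_loc, 0.25, 0.30)
loss_sev_heavy = damage_loss(pred_loc, true_loc, 0.25, 0.30, w_loc=0.05, w_sev=0.95)
print(loss_balanced, loss_sev_heavy)
```

Shifting weight toward the severity term changes which residual dominates the gradient, which is exactly the tuning knob the abstract describes.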

Simulation Analysis of Control Variates Method Using Stratified sampling (층화추출에 의한 통제변수의 시뮬레이션 성과분석)

  • Kwon, Chi-Myung;Kim, Seong-Yeon;Hwang, Sung-Won
    • Journal of the Korea Society for Simulation
    • /
    • v.19 no.1
    • /
    • pp.133-141
    • /
    • 2010
  • This research suggests a unified scheme that combines stratified sampling with the control variates method to improve the efficiency of parameter estimation in simulation experiments. We utilize standardized concomitant variables defined during the course of the simulation runs. We first use these concomitant variables to counteract the unknown error of the response by the method of control variates, then use a concomitant variable not included in the controlled response to stratify the response into appropriate strata, reducing the variation of the controlled response further. When the covariance between the response and the set of control variates is known, we identify the simulation efficiency of the suggested method. We conjecture that its efficiency is better than that achieved by applying either control variates or stratified sampling alone, and we investigate the efficiency gain through simulation on a selected model.
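
The combination of a control variate with stratified sampling can be illustrated on a textbook example, estimating E[exp(U)] for U ~ Uniform(0,1) with U itself as the control variate (known mean 0.5). This is a generic demonstration, not the paper's specific model.

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_strata = 10_000, 10

# Target: estimate E[exp(U)], U ~ Uniform(0,1); the exact value is e - 1.
def controlled(u):
    y = np.exp(u)
    c = u                                    # control variate with known mean 0.5
    beta = np.cov(y, c)[0, 1] / np.var(c)    # estimated optimal CV coefficient
    return y - beta * (c - 0.5)              # controlled response

# Crude Monte Carlo draws vs. stratified draws (equal samples per stratum cell).
u_crude = rng.uniform(0, 1, n)
u_strat = (np.arange(n) % n_strata + rng.uniform(0, 1, n)) / n_strata

est_crude = np.exp(u_crude).mean()           # plain Monte Carlo
est_cv = controlled(u_crude).mean()          # control variate only
est_cv_strat = controlled(u_strat).mean()    # control variate + stratification

exact = np.e - 1
print(abs(est_crude - exact), abs(est_cv - exact), abs(est_cv_strat - exact))
```

The control variate removes the part of the error correlated with U, and stratification then reduces the remaining variation of the controlled response, mirroring the two-stage scheme in the abstract.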

Correlation Extraction from KOSHA to enable the Development of Computer Vision based Risks Recognition System

  • Khan, Numan;Kim, Youjin;Lee, Doyeop;Tran, Si Van-Tien;Park, Chansik
    • International conference on construction engineering and project management
    • /
    • 2020.12a
    • /
    • pp.87-95
    • /
    • 2020
  • Generally, occupational safety, and construction safety in particular, is an intricate phenomenon. Industry professionals have devoted vital attention to enforcing Occupational Safety and Health (OSH) over the last three decades to enhance safety management in construction. Despite the efforts of safety professionals and government agencies, current safety management still relies on manual inspections, which are infrequent, time-consuming, and prone to error. Extensive research has been carried out to deal with the high fatality rates confronting the construction industry; sensor systems, visualization-based technologies, and tracking techniques have been deployed by researchers over the last decade. Recently, computer vision has attracted significant attention in the construction industry worldwide. However, the literature reveals the narrow scope of computer vision technology for safety management, so broader-scope research on safety monitoring is desired to attain fully automatic job-site monitoring. In this regard, the development of a broader-scope computer vision-based risk recognition system that detects correlations between construction entities is inevitable. For this purpose, a detailed analysis was conducted and the rules that depict the correlations (positive and negative) between construction entities were extracted. The deep-learning-based Mask R-CNN algorithm is applied to train the model, and as a proof of concept a prototype was developed based on real scenarios. The proposed approach is expected to enhance the effectiveness of safety inspection and reduce the burden on safety managers. It is anticipated that this approach may reduce injuries and fatalities by enforcing exactly the relevant safety rules, and will contribute to enhancing overall safety management and monitoring performance.
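
The correlation rules (positive and negative co-occurrence between construction entities) could feed a simple checker like the sketch below. The rule set and class names are invented for illustration; they are not the actual rules extracted from KOSHA.

```python
# A minimal sketch of rule-based correlation checking between detected entities;
# the rules and class labels here are hypothetical examples.
RULES = {
    ("worker", "hardhat"): "positive",               # a worker should co-occur with a hardhat
    ("worker", "excavator_swing_zone"): "negative",  # a worker should NOT co-occur with this
}

def check_frame(detections):
    """detections: set of class labels found in one frame by the vision model."""
    violations = []
    for (a, b), kind in RULES.items():
        if kind == "positive" and a in detections and b not in detections:
            violations.append(f"missing '{b}' for '{a}'")
        if kind == "negative" and a in detections and b in detections:
            violations.append(f"'{a}' inside '{b}'")
    return violations

print(check_frame({"worker"}))
print(check_frame({"worker", "hardhat", "excavator_swing_zone"}))
```

In the paper's pipeline the `detections` set would come from the Mask R-CNN output per frame; the rule table is what the KOSHA analysis contributes.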


Optimal Release Problems based on a Stochastic Differential Equation Model Under the Distributed Software Development Environments (분산 소프트웨어 개발환경에 대한 확률 미분 방정식 모델을 이용한 최적 배포 문제)

  • Lee Jae-Ki;Nam Sang-Sik
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.31 no.7A
    • /
    • pp.649-658
    • /
    • 2006
  • Recently, software development has adopted new approaches in various forms: client-server systems and web programming, object-oriented concepts, and distributed development over network environments. These distributed development technologies and the growth of object-oriented methodology spread software quality improvements, raise software productivity, and reduce development effort. We further consider distributed software development across many workstations. In this paper, we discuss the optimal release problem based on a stochastic differential equation (SDE) model for distributed software development environments. In the past, software reliability was estimated only roughly from the software development process, approached through reliability estimation as testing progressed. Here, we determine the optimal release time by two methods: first, an SRGM with an error counting model in the fault detection phase based on an NHPP; second, modeling fault detection as a continuous random variable via an SDE. We then determine the optimal release time as the minimum-cost point given the detected failure data and debugging fault data during the system test and operational phases. In particular, we discuss the limitation of reliability considering the total software cost probability distribution.
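
For the NHPP branch, the optimal release time under a classical cost model can be sketched as follows, using the exponential mean-value function m(t) = a(1 - e^(-bt)). The cost coefficients and parameters are assumed for illustration, not taken from the paper.

```python
import numpy as np

# Exponential NHPP mean-value function m(t) = a(1 - exp(-b t)); the parameters
# a (total faults) and b (detection rate) are assumed values.
a, b = 100.0, 0.05
c1, c2, c3 = 1.0, 5.0, 2.0  # fix-in-test, fix-in-operation, test cost per unit time

def m(t):
    return a * (1.0 - np.exp(-b * t))

def total_cost(t):
    # Faults fixed during test + residual faults fixed in operation + testing effort.
    return c1 * m(t) + c2 * (a - m(t)) + c3 * t

t = np.linspace(0.0, 200.0, 20001)
t_star = t[np.argmin(total_cost(t))]

# Closed form for comparison: release when m'(t*) = c3 / (c2 - c1).
t_closed = np.log(a * b * (c2 - c1) / b / (c3 / b)) / b  # simplifies to ln(ab(c2-c1)/c3)/b
t_closed = np.log(a * b * (c2 - c1) / c3) / b
print(t_star, t_closed)
```

The grid minimum and the first-order condition agree: testing continues exactly until the marginal cost of one more time unit exceeds the expected saving from catching faults before operation.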

A Study on the Reduction of Non-Point Source Pollution loads from Small Agricultural Watershed by Applying Surface Covering Scenario using HSPF Model (HSPF 모델을 이용한 지표피복 시나리오 적용에 따른 농촌 소유역에서의 비점원오염 저감연구)

  • Jung, Chung-Gil;Park, Jong-Yoon;Kim, Sang-Ho;Kim, Seong-Joon
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2012.05a
    • /
    • pp.103-103
    • /
    • 2012
  • In this study, the average runoff reduction rate obtained by applying surface-covering BMP (Best Management Practices) scenarios at a test plot ($1276.6m^2$) was applied to the HSPF model to evaluate the non-point source pollution reduction effect at the watershed scale. The Byeolmi stream watershed ($1.21km^2$) was selected, and weather and terrain data were constructed as model inputs: weather data from the Suwon, Yangpyeong, and Icheon stations, a 2 m DEM (Digital Elevation Model), and a land-use map. For the land-use map, a QuickBird image of 1 May 2006 was used; based on the land-cover classification systems of the Ministry of Environment, the Ministry of Construction and Transportation, and the USGS, together with field surveys, the items of precision agricultural information extractable from the QuickBird image were determined, and a precision land-use map with 21 land-use categories was constructed from the orthorectified image by the on-screen digitizing method. From monitored data, a stage-discharge rating curve and pollution load curves were derived, and HSPF modeling was carried out for the period 8 June to 31 October 2011. Monthly statistics showed an RMSE (Root Mean Square Error) of 1.15~1.76 mm/day, $R^2$ of 0.62~0.78, and Nash-Sutcliffe model efficiency (NSE) of 0.62~0.76, indicating that the simulated values were significantly related to the observations; the $R^2$ values for sediment, T-N, and T-P were 0.72, 0.62, and 0.63, respectively. Based on the event-based rice-straw surface-covering scenario at the test plot, which gave about a 10 % reduction in average runoff from upland fields and an actual average non-point source pollution reduction of 89.7 %~99.4 %, the infiltration parameter (INFILT) in HSPF was adjusted to 16.0 mm/hr so that average runoff decreased by about 10 %, representing the infiltration effect of the surface cover. As a result, the average reduction rates of sediment, T-N, and T-P were 87.2 %, 28.5 %, and 85.1 %, respectively, close to the actual average reduction of 89.7 %~99.4 % at the test plot. These results show that the surface-covering (infiltration straw mat) effect, represented by adjusting infiltration, yielded reduction efficiencies above 80 % for sediment and T-P, whereas T-N showed a low reduction rate of about 30 %.
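
The goodness-of-fit statistics quoted above (RMSE, $R^2$, NSE) are straightforward to compute; below is a sketch with illustrative daily-runoff values, not the study's data.

```python
import numpy as np

def rmse(obs, sim):
    # Root mean square error, in the units of the series (e.g. mm/day).
    return np.sqrt(np.mean((obs - sim) ** 2))

def nse(obs, sim):
    # Nash-Sutcliffe efficiency: 1 minus residual variance over observed variance;
    # 1.0 is a perfect fit, 0.0 means no better than the observed mean.
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Illustrative observed/simulated runoff (mm/day).
obs = np.array([0.5, 1.2, 3.4, 2.1, 0.9, 0.4, 5.6, 2.2])
sim = np.array([0.6, 1.0, 3.0, 2.5, 1.1, 0.5, 5.0, 2.0])

print(rmse(obs, sim), nse(obs, sim))
```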


Development of a Simplified Model for Estimating CO2 Emissions: Focused on Asphalt Pavement (CO2 배출량 추정을 위한 간략 모델 개발: 아스팔트 포장을 중심으로)

  • Kim, Kyu-Yeon;Kim, Sung-Keun
    • Land and Housing Review
    • /
    • v.12 no.2
    • /
    • pp.109-120
    • /
    • 2021
  • Global warming due to increased carbon dioxide is perceived as one of the factors threatening the future, and efforts are being made to reduce carbon dioxide emissions in every industry around the world. In particular, environmental loads and impacts over the life cycle of SOC structures and buildings have been quantitatively assessed through Life Cycle Assessment (LCA). However, the construction sector has faced difficulty with quantitative assessment for several reasons: 1) the LCI DB is not fully established; 2) the life cycle is very long; 3) building structures are unique. LCA therefore takes enormous effort and time. Rather than estimating carbon emissions with high accuracy, this study presents a simplified estimation model that allows owners or designers to estimate carbon dioxide emissions easily, with little effort, when rapid, rough decisions on environmental load reduction must be made. The study performs LCA using data from 25 road construction projects across the country, then derives a simplified carbon estimation model (SLCA) through multiple regression analysis and compares its estimates against values from a full LCA. The comparison shows an error rate of less than 5% for 16 of the road projects.
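
A simplified estimation model of this kind boils down to multiple linear regression on project-level features. The sketch below fits one on synthetic "LCA" data for 25 hypothetical projects; the feature names, coefficients, and noise level are invented, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical project features: road length (km) and pavement area (1000 m^2).
# The "true" emissions relation is illustrative; real coefficients would come
# from LCA results on the 25 projects.
n = 25
length = rng.uniform(2, 20, n)
area = length * rng.uniform(8, 12, n)
co2_lca = 120.0 * length + 3.5 * area + rng.normal(0, 30, n)  # t-CO2, with noise

# Fit the simplified model by multiple linear regression.
X = np.column_stack([np.ones(n), length, area])
beta, *_ = np.linalg.lstsq(X, co2_lca, rcond=None)
co2_pred = X @ beta

# Error rate of the simplified model against the "full LCA" values.
err_rate = np.abs(co2_pred - co2_lca) / co2_lca
print(beta, err_rate.mean())
```

Once fitted, the model reduces a full LCA run to a dot product over a few design quantities known at the planning stage, which is the trade the study describes: speed over precision.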