• Title/Summary/Keyword: Test Validation


Development of the Algorithm for Optimizing Wavelength Selection in Multiple Linear Regression

  • Hoeil Chung
    • Near Infrared Analysis
    • /
    • v.1 no.1
    • /
    • pp.1-7
    • /
    • 2000
  • A convenient algorithm for optimizing wavelength selection in multiple linear regression (MLR) has been developed. MOP (MLR Optimization Program) tests all possible MLR calibration models in a given spectral range and finds an optimal MLR model with external validation capability. MOP generates calibration models from all possible combinations of wavelengths and simultaneously calculates the SEC (Standard Error of Calibration) and SEV (Standard Error of Validation) by predicting samples in a validation data set. From the determined SEC and SEV, it then calculates another parameter, SAD (the Sum of SEC, SEV, and the Absolute Difference between SEC and SEV: SEC + SEV + |SEC - SEV|). SAD is a useful parameter for finding an optimal calibration model without over-fitting, since it simultaneously evaluates the SEC, the SEV, and the difference in error between calibration and validation. The calibration model with the smallest SAD value is chosen as the optimum because its calibration and validation errors are both minimal and similar in scale. To evaluate the capability of MOP, the determination of benzene content in unleaded gasoline was examined. MOP successfully found the optimal calibration model and showed better calibration and independent prediction performance than conventional MLR calibration.
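The SAD criterion described above is simple to implement. The following minimal sketch (illustrative names only, not MOP's actual code) exhaustively scores wavelength subsets with ordinary least squares:

```python
from itertools import combinations

import numpy as np

def sad_select(X_cal, y_cal, X_val, y_val, n_wavelengths):
    """Test every wavelength combination and return the MLR model with
    the smallest SAD = SEC + SEV + |SEC - SEV|."""
    best = None
    for combo in combinations(range(X_cal.shape[1]), n_wavelengths):
        # Ordinary least squares on the selected wavelengths (with intercept)
        A = np.column_stack([np.ones(len(X_cal)), X_cal[:, combo]])
        coef, *_ = np.linalg.lstsq(A, y_cal, rcond=None)
        # SEC: standard error over the calibration set
        sec = np.sqrt(np.mean((A @ coef - y_cal) ** 2))
        # SEV: standard error when predicting the external validation set
        Av = np.column_stack([np.ones(len(X_val)), X_val[:, combo]])
        sev = np.sqrt(np.mean((Av @ coef - y_val) ** 2))
        sad = sec + sev + abs(sec - sev)
        if best is None or sad < best[0]:
            best = (sad, combo, coef)
    return best
```

Because SAD penalizes both large errors and a large gap between calibration and validation error, a model that fits the calibration set well but validates poorly is rejected.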

Automatic RF Input Power Level Control Methodology for SAR Measurement Validation

  • Kim, Ki-Hwea;Choi, Dong-Geun;Gimm, Yoon-Myoung
    • Journal of Electromagnetic Engineering and Science
    • /
    • v.15 no.3
    • /
    • pp.181-184
    • /
    • 2015
  • The radiofrequency fields radiated to human bodies from hand-held and body-mounted wireless communication devices are evaluated by measuring the specific absorption rate (SAR). The uncertainty of system validation and probe calibration in SAR measurement depends on the variation of the RF power used for the validation and calibration. In existing laboratory systems, the RF input power for system validation or probe calibration is controlled manually during the test process. Consequently, a long time is required to reach the stable power needed for low-uncertainty testing. The standard uncertainty due to this power drift is typically 2.89%, obtained by applying IEC 62209 under normal operating conditions. This paper proposes the principle of the Automatic Input Power Level Control System (AIPLC), which controls the equipment by a program that maintains a stable input power level. AIPLC reduces the power drift to less than ±1.16 dB, which reduces the standard uncertainty of the power drift to 0.67%.
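The closed-loop idea behind an automatic input-power level controller can be sketched as follows. This is a hypothetical proportional loop; `measure_dbm` and `set_level_dbm` stand in for real instrument drivers and are not from the paper:

```python
def stabilize_power(measure_dbm, set_level_dbm, target_dbm,
                    tol_db=0.05, max_iter=50):
    """Iteratively adjust the source output until the measured forward
    power is within tol_db of the target (simple closed-loop control)."""
    level = target_dbm
    for _ in range(max_iter):
        set_level_dbm(level)
        error = target_dbm - measure_dbm()
        if abs(error) <= tol_db:
            return level
        level += error  # proportional correction, working in dB
    raise RuntimeError("power did not settle within tolerance")
```

Because the correction is applied in dB, a fixed path loss between source and sensor is compensated in a couple of iterations instead of requiring a slow manual adjustment.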

A Cross-Validation of a Seismic Vulnerability Assessment Model: Application to the 9.12 Gyeongju (2016) and Pohang (2017) Earthquakes (지진 취약성 평가 모델 교차검증: 경주(2016)와 포항(2017) 지진을 대상으로)

  • Han, Jihye;Kim, Jinsoo
    • Korean Journal of Remote Sensing
    • /
    • v.37 no.3
    • /
    • pp.649-655
    • /
    • 2021
  • This study aims to cross-validate the performance of an optimal seismic vulnerability assessment model by applying the model, developed in previous studies of Gyeongju, to another region. The test area was Pohang City, the site of the 2017 Pohang earthquake, and the dataset was built with the same influencing factors and earthquake-damaged buildings as in the previous studies. The validation dataset was built via random sampling, and the prediction accuracy was derived by applying it to a random forest (RF) model trained on Gyeongju. The success and prediction accuracies of the model in Gyeongju were 100% and 94.9%, respectively, while applying the Pohang validation dataset yielded a prediction accuracy of 70.4%.
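A cross-region validation of this kind can be sketched with a generic random-forest classifier. This is illustrative only; the paper's actual features, labels, and model settings are not reproduced here:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def cross_region_validate(X_train, y_train, X_test_same, y_test_same,
                          X_other, y_other):
    """Fit a random forest on one region and report its accuracy both on
    a held-out set from that region and on a different region."""
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    return (model.score(X_test_same, y_test_same),  # same-region accuracy
            model.score(X_other, y_other))          # transfer accuracy
```

The gap between the two returned accuracies is exactly what the study measures: how much a model tuned to Gyeongju degrades when transferred to Pohang.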

ASUSD nuclear data sensitivity and uncertainty program package: Validation on fusion and fission benchmark experiments

  • Kos, Bor;Cufar, Aljaz;Kodeli, Ivan A.
    • Nuclear Engineering and Technology
    • /
    • v.53 no.7
    • /
    • pp.2151-2161
    • /
    • 2021
  • Nuclear data (ND) sensitivity and uncertainty (S/U) quantification in shielding applications is performed using deterministic and probabilistic approaches. In this paper the validation of the newly developed deterministic program package ASUSD (ADVANTG + SUSD3D) is presented. ASUSD was developed with the aim of automating the process of ND S/U while retaining the computational efficiency of the deterministic approach to ND S/U analysis. The paper includes a detailed description of each of the programs contained within ASUSD, the computational workflow and validation results. ASUSD was validated on two shielding benchmark experiments from the Shielding Integral Benchmark Archive and Database (SINBAD) - the fission relevant ASPIS Iron 88 experiment and the fusion relevant Frascati Neutron Generator (FNG) Helium Cooled Pebble Bed (HCPB) Test Blanket Module (TBM) mock-up experiment. The validation process was performed in two stages. Firstly, the Denovo discrete ordinates transport solver was validated as a standalone solver. Secondly, the ASUSD program package as a whole was validated as a ND S/U analysis tool. Both stages of the validation process yielded excellent results, with a maximum difference of 17% in final uncertainties due to ND between ASUSD and the stochastic ND S/U approach. Based on these results, ASUSD has proven to be a user friendly and computationally efficient tool for deterministic ND S/U analysis of shielding geometries.
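Deterministic ND S/U tools of this kind typically combine sensitivity coefficients with covariance data through the first-order "sandwich rule". A minimal sketch (not ASUSD's actual code; symbols are the standard textbook ones):

```python
import numpy as np

def sandwich_uncertainty(S, cov):
    """First-order propagation of nuclear-data covariances through
    sensitivity coefficients: std(R) = sqrt(S^T C S), the 'sandwich rule'.

    S   : vector of relative sensitivity coefficients of the response
    cov : relative covariance matrix of the underlying nuclear data
    """
    S = np.asarray(S, dtype=float)
    return float(np.sqrt(S @ np.asarray(cov, dtype=float) @ S))
```

The deterministic route computes `S` once from forward and adjoint transport solutions, which is why it stays cheap compared with stochastic sampling of the nuclear data.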

Uncertainty Analysis and Improvement of an Altitude Test Facility for Small Jet Engines

  • Jun, Yong-Min;Yang, In-Young;Kim, Chun-Taek;Yang, Soo-Seok;Lee, Dae-Sung
    • International Journal of Aeronautical and Space Sciences
    • /
    • v.5 no.1
    • /
    • pp.46-56
    • /
    • 2004
  • The verification and improvement of the measurement uncertainty have been performed in the altitude test facility for small gas turbine engines, which was built at the Korea Aerospace Research Institute (KARI) in October 1999. The test is performed with a single-spool turbojet engine at several flight conditions. This paper discusses the evaluation and validation process for the measurement uncertainty improvements used in the altitude test facility. The evaluation process, defined as tests before the facility modification, shows that the major contributors to the measurement uncertainty are the flow-meter discharge coefficient, the inlet static and total pressures, the cell pressure, and the fuel flow rate. The measurement uncertainty is focused on the primary parameters of engine performance, such as airflow rate, thrust, and specific fuel consumption (SFC). The validation process, defined as tests after the facility modification, shows that the measurement uncertainty at sea-level condition is improved to an acceptable level through the facility modification. At altitude test conditions, the measurement uncertainties are not improved as much as at the sea-level condition.
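Independent uncertainty contributors such as those listed above are conventionally combined by root-sum-square (the ISO GUM rule for uncorrelated inputs). A minimal sketch, not the paper's actual uncertainty budget:

```python
import math

def combined_standard_uncertainty(contributions):
    """Root-sum-square combination of independent standard uncertainty
    contributions, e.g. flow-meter discharge coefficient, inlet static
    and total pressures, cell pressure, and fuel flow rate."""
    return math.sqrt(sum(u ** 2 for u in contributions))
```

The quadratic combination explains why the largest contributor dominates: halving a small term barely moves the total, which is why the facility modification targeted the dominant sources first.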

Header Data Interpreting S/W Design for MSC (Multi-Spectral Camera) Image Data

  • Kong Jong-Pil;Heo Haeng-Pal;Kim YoungSun;Park Jong-Euk;Youn Heong-Sik
    • Proceedings of the KSRS Conference
    • /
    • 2004.10a
    • /
    • pp.436-439
    • /
    • 2004
  • Output data streams of the MSC contain flags, Headers, and image data according to the established protocols and data formats. In particular, the Header added to each data line contains a line sync, a line counter, and ancillary data, which consist of an ancillary identification bit and one ancillary data byte. This information is used by the ground station to calculate the geographic coordinates of the image and to obtain the on-board time and several EOS (Electro-Optical Subsystem) parameters in effect at the time of imaging. Therefore, the EGSE (Electrical Ground Supporting Equipment) used for testing the MSC must interpret and display this Header information correctly, following the protocols. This paper describes the design of the Header data processing module in the EOS-EGSE. This module provides users with various test functions, such as header validation, ancillary block validation, and line-counter and in-line-counter validation checks, which allow convenient and fast tests on imagery data.
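Header checks of the kind the module performs (line sync and line-counter continuity) can be sketched as follows. The sync pattern and field layout here are illustrative assumptions, not the MSC's actual format:

```python
SYNC = b"\xff\x00"  # illustrative line-sync pattern, not the real MSC value

def validate_headers(lines):
    """Check each line's sync word and that line counters increment by one.
    Returns a list of (line_index, error_message) tuples; empty means valid."""
    errors = []
    prev_counter = None
    for i, line in enumerate(lines):
        if line[:2] != SYNC:
            errors.append((i, "bad line sync"))
            continue
        counter = int.from_bytes(line[2:4], "big")
        if prev_counter is not None and counter != prev_counter + 1:
            errors.append((i, "line counter discontinuity"))
        prev_counter = counter
    return errors
```

Running such checks over a whole image dump is what makes EGSE-side validation fast: a counter discontinuity pinpoints a dropped line without inspecting pixel data.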


A Study on the Design and Validation Methodology of Communication Protocols Using International Communication Standard Languages (국제 통신 표준 언어를 이용한 통신 프로토콜 설계 및 검증 방법론 연구)

  • Ro, Cheul-Woo
    • The Journal of Korean Association of Computer Education
    • /
    • v.5 no.4
    • /
    • pp.31-42
    • /
    • 2002
  • In this paper, we set up a development methodology for communication protocols, together with a concrete design concept for how to define and use the PDU, SDU, SAP, and service primitives, using SDL, which is recommended by ITU-T, and other international standard languages such as ASN.1, MSC, and TTCN. The methodology covers the SDL design of the extended Inres protocol, a well-known example protocol; insertion of ASN.1 messages for the transport of bit strings; generation of MSCs for validation of the design specification; generation of test cases in TTCN from the validation; and performance of conformance tests.
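A design specification like the one validated here reduces to a state machine against which message traces are replayed. The toy transition table below is illustrative only; the paper's actual example is the extended Inres protocol specified in SDL:

```python
# Transitions of a toy connection-oriented protocol (hypothetical states
# and primitives, loosely in the style of Inres service primitives)
TRANSITIONS = {
    ("disconnected", "CONreq"): "connecting",
    ("connecting", "CONconf"): "connected",
    ("connected", "DATreq"): "connected",
    ("connected", "DISreq"): "disconnected",
}

def validate_trace(events, state="disconnected"):
    """Replay a message trace (as an MSC would show it) against the state
    machine; return the final state, or None if the trace is invalid."""
    for ev in events:
        key = (state, ev)
        if key not in TRANSITIONS:
            return None
        state = TRANSITIONS[key]
    return state
```

This is the essence of MSC-based validation: every scenario generated from the design must replay to a legal final state, and TTCN test cases are derived from the same traces.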


An Error Assessment of the Kriging Based Approximation Model Using a Mean Square Error (평균제곱오차를 이용한 크리깅 근사모델의 오차 평가)

  • Ju Byeong-Hyeon;Cho Tae-Min;Jung Do-Hyun;Lee Byung-Chai
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.30 no.8 s.251
    • /
    • pp.923-930
    • /
    • 2006
  • A kriging model is an approximation model used as a surrogate for a computationally expensive analysis or simulation. Although it has various advantages, it is difficult to assess the accuracy of the approximated model. It is generally known that, unlike a response surface method, the mean square error (MSE) obtained from a kriging model cannot provide statistically exact error bounds, so cross validation is mainly used instead. But cross validation also involves many uncertainties, and it cannot be used when a maximum error over the given region is required. To solve this problem, we first propose a modified mean square error that can account for relative errors. Using the modified MSE, we develop a strategy of adding a new sample at the location where the MSE is largest when the MSE is used to assess the kriging model. Finally, we offer guidelines for the use of the MSE obtained from the kriging model. Four test problems show that the proposed strategy is a proper method for assessing the accuracy of the kriging model. Based on the results of the four test problems, a convergence coefficient of 0.01 is recommended for exact function approximation.
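The strategy of repeatedly adding a sample where the surrogate's error estimate is largest can be sketched generically. The `fit` and `mse` callables are placeholders for a real kriging predictor and its MSE, not the paper's implementation; the 0.01 stopping tolerance echoes the convergence coefficient recommended in the abstract:

```python
import numpy as np

def add_samples_by_mse(fit, mse, candidates, X, y, func, tol=0.01, max_add=20):
    """Sequentially add the candidate point where the surrogate's MSE is
    largest, refit, and stop once the maximum MSE falls below tol."""
    X, y = list(X), list(y)
    for _ in range(max_add):
        model = fit(np.array(X), np.array(y))
        worst = max(candidates, key=lambda c: mse(model, c))
        if mse(model, worst) < tol:
            break  # error estimate converged everywhere
        X.append(worst)
        y.append(func(worst))  # evaluate the expensive function once more
    else:
        model = fit(np.array(X), np.array(y))  # refit with the last point
    return np.array(X), np.array(y), model
```

Each iteration spends exactly one expensive evaluation at the point of greatest estimated error, which is what makes MSE-driven infill efficient compared with space-filling refinement.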

A Study on the Application of Risk Management for Medical Device Software Test (의료기기 소프트웨어 테스트 위험관리 적용 방안 연구)

  • Kim, S.H.;Lee, Jong-Rok;Jeong, Dong-Hun;Park, Hui-Byeong
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2012.10a
    • /
    • pp.495-497
    • /
    • 2012
  • This study develops a method for applying risk management to medical device software testing. First, the software validation and risk management status of medical device manufacturers was analyzed through questionnaires. Second, the differences between black-box testing and white-box testing were analyzed by comparison. Third, after analyzing the potential of software analysis tools, factors derived from code were quantified. Finally, a framework was built so that the FMEA (Failure Mode and Effect Analysis) technique can be applied to the medical device risk management process. This supports manufacturers that find it difficult to build software validation and risk management processes under the medical device GMP (Good Manufacturing Practice).
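FMEA quantifies each failure mode with a risk priority number. A minimal sketch of the standard S × O × D rating; the paper's actual rating scales are not given, so the 1-10 range here is the conventional one:

```python
def risk_priority_number(severity, occurrence, detection):
    """Classic FMEA risk priority number: RPN = S x O x D,
    with each factor conventionally rated on a 1-10 scale."""
    for v in (severity, occurrence, detection):
        if not 1 <= v <= 10:
            raise ValueError("ratings must be between 1 and 10")
    return severity * occurrence * detection
```

Failure modes are then ranked by RPN so that test effort and mitigations target the highest-risk software faults first.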
