• Title/Abstract/Keyword: Data quality control

Search results: 3,205 items (processing time: 0.031 s)

Development of Smart City IoT Data Quality Indicators and Prioritization Focusing on Structured Sensing Data

  • 양현모;한규보;이정훈
    • 한국빅데이터학회지 / Vol. 6, No. 1 / pp. 161-178 / 2021
  • 'Big data' has grown in importance to the point of being called 'the oil of the 21st century.' For IoT data generated and collected in smart cities, data quality is tied to the quality of public services, so quality management demands careful attention. However, the data quality indicators proposed by ISO/IEC and various domestic and international bodies are limited to the 'user' perspective. To address this limitation, this study derives supplier-centered indicators and their priorities. Three categories and thirteen supplier-centered smart city IoT data quality indicators were derived; AHP analysis was then used to prioritize the indicator categories and individual indicators, and the validity of each indicator was examined. By presenting the basic requirements that sensor data must satisfy, the study can help individuals or companies that collect, aggregate, and deliver sensor data improve sensor data quality, and performing quality management based on the indicator priorities can make quality management work more efficient.
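A minimal sketch of the AHP prioritization step described above, assuming a hypothetical 3×3 pairwise comparison matrix over the three indicator categories (the paper's actual expert judgments are not given here); it uses the row geometric-mean approximation of Saaty's eigenvector weights:

```python
import numpy as np

def ahp_weights(pairwise):
    """Approximate AHP priority weights via the row geometric-mean method."""
    A = np.asarray(pairwise, dtype=float)
    gm = A.prod(axis=1) ** (1.0 / A.shape[0])  # row geometric means
    return gm / gm.sum()                        # normalize so weights sum to 1

def consistency_ratio(pairwise):
    """Saaty consistency ratio; CR < 0.1 is conventionally acceptable."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    w = ahp_weights(A)
    lam_max = float((A @ w / w).mean())         # estimate of principal eigenvalue
    ci = (lam_max - n) / (n - 1)                # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index
    return ci / ri

# Hypothetical judgments: category 1 moderately-to-strongly preferred
M = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
weights = ahp_weights(M)
cr = consistency_ratio(M)
```

The normalized weight vector gives the priority ordering of the categories; a CR above 0.1 would signal inconsistent judgments that should be revisited.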

Estimation of Qualities and Inference of Operating Conditions for Optimization of Wafer Fabrication Using Artificial Intelligent Methods

  • Bae, Hyeon;Kim, Sung-Shin;Woo, Kwang-Bang
    • 제어로봇시스템학회:학술대회논문집 / 2005 ICCAS / pp. 1101-1106 / 2005
  • The purpose of this study was to develop a process management system for ingot fabrication and ingot quality; the ingot is the first material manufactured in wafer production. Operating data (trace parameters) were collected on-line, but quality data (measurement parameters) were measured by sampling inspection, and the quality parameters were used to evaluate quality. Preprocessing was therefore necessary to extract useful information from the quality data. First, statistical methods were employed for data generation; then models were built on the generated data to improve model performance. The function of the models is to predict the quality corresponding to the control parameters. A dynamic polynomial neural network (DPNN) was used to model the ingot fabrication data.
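The paper's DPNN is not reproduced here; as a simplified stand-in, the sketch below fits an ordinary least-squares polynomial model that, like the DPNN, predicts a quality value from trace (operating) parameters, on synthetic noise-free data:

```python
import numpy as np

# Synthetic stand-in for ingot fabrication data: two trace parameters and a
# quality value generated from a known quadratic relationship.
rng = np.random.default_rng(0)
x1, x2 = rng.uniform(0, 1, 200), rng.uniform(0, 1, 200)
quality = 2.0 + 1.5 * x1 - 0.8 * x2 + 0.5 * x1 * x2   # synthetic ground truth

# Design matrix of polynomial terms (1, x1, x2, x1*x2)
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, quality, rcond=None)     # least-squares fit

pred = X @ coef                                        # predicted quality
rmse = float(np.sqrt(np.mean((pred - quality) ** 2)))
```

Because the data are noise-free and the model contains the true terms, the fit recovers the generating coefficients exactly; real quality prediction would of course involve noise, model selection, and validation.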


Study on Correlation-based Feature Selection in an Automatic Quality Inspection System Using Support Vector Machine (SVM)

  • 송동환;오영광;김남훈
    • 대한산업공학회지 / Vol. 42, No. 6 / pp. 370-376 / 2016
  • Manufacturing data analysis and its applications are gaining popularity in various industries. Despite rapid advances in big data analysis technology, however, the manufacturing quality data monitored by automated inspection systems are sometimes not reliable enough because of the complex patterns of product quality. In this study, we therefore aim to define the level of trustworthiness of an automated quality inspection system and to improve the reliability of its quality inspection data. Using correlation analysis and feature selection, this paper presents a method for improving inspection accuracy and efficiency in an SVM-based automatic product quality inspection system, applied to thermal image data in an auto parts manufacturing case. The proposed method is implemented in the sealer dispensing process of automobile manufacturing and verified by analyzing the optimal feature selection from the quality analysis results.
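A minimal sketch of correlation-based feature selection of the kind described, using synthetic features in place of the paper's thermal-image features; the selected subset would then be fed to an SVM classifier:

```python
import numpy as np

def select_by_correlation(X, y, k):
    """Rank features by |Pearson correlation| with the label; keep the top k."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    corr = (Xc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Xc ** 2).sum(axis=0)) * np.sqrt((yc ** 2).sum()))
    order = np.argsort(-np.abs(corr))  # strongest correlation first
    return order[:k], corr

# Synthetic stand-in data: feature 0 tracks the pass/fail label closely,
# feature 2 is weakly related, feature 1 is pure noise.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, 300).astype(float)
X = np.column_stack([
    y + 0.1 * rng.normal(size=300),   # strongly correlated with the label
    rng.normal(size=300),             # uninformative noise
    0.6 * y + rng.normal(size=300),   # weakly correlated
])
selected, corr = select_by_correlation(X, y, k=2)
```

Dropping the low-correlation feature shrinks the SVM's input dimension, which is the accuracy/efficiency gain the abstract refers to.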

Assessment and Validation of the Reliability of a Quality Measurement System for Wireless Data Services

  • 최동환;박석천
    • 한국IT서비스학회지 / Vol. 11, No. 1 / pp. 239-246 / 2012
  • As the number of users of mobile broadband services such as WiBro (a wireless-internet-based technology) and HSDPA (a cellular-phone-based technology) grows, guaranteeing effective quality for users has become an issue, yet quality control of wireless networks remains incomplete. Quality control therefore requires a system for continuous performance measurement of wireless data services, together with verification and evaluation of its reliability. In this paper, a wireless data service quality measurement system was designed and implemented. To verify the reliability of the implemented measurement device, the same test environment as an existing network performance measurement program was built and a comparative performance evaluation was carried out. Analysis of the quality measurement results confirmed that the designed and implemented system operated accurately.

Application and System Establishment on Quality Control of Flowing Concrete

  • 김규용;길배수;한장현;주지현;박선규;한승구;조성기;김무한
    • 한국콘크리트학회:학술대회논문집 / 1999 Fall Conference (10th Anniversary of Founding) / pp. 801-804 / 1999
  • Interest in the workability and quality control of concrete is increasing with industrialization, in order to improve the quality of concrete structures. The aim of this study is therefore to evaluate the quality of flowing concrete and to systematize its quality control by analyzing quality control data for pumped flowing concrete through a basic quality control system for the construction industry.


Quality Control and Assurance of Eddy Covariance Data at the Two KoFlux Sites

  • 권효정;박성빈;강민석;유재일;렌민 유안;김준
    • 한국농림기상학회지 / Vol. 9, No. 4 / pp. 260-267 / 2007
  • This research note introduces methods for quality control and assurance of eddy covariance data observed at the two KoFlux sites, the Gwangneung forest and the Haenam farmland. Data quality control was carried out through an eight-step procedure based on micrometeorological and statistical analyses, and the data were labeled with five quality grades. The programs used for quality control and the final quality-assured data can be downloaded from the KoFlux website (http://www.koflux.org/).

A Design of Control Chart for Fraction Nonconforming Using Fuzzy Data

  • 김계완;서현수;윤덕균
    • 품질경영학회지 / Vol. 32, No. 2 / pp. 191-200 / 2004
  • The conventional p chart is not adequate when there are large amounts of data and it is difficult to classify products as conforming or nonconforming because the binary classification is ambiguous. A new control chart is therefore needed that represents such vague situations efficiently. This study presents a method for performing arithmetic operations on fuzzy data represented as fuzzy sets, applying fuzzy set theory, and designs a new control chart based on a term set for classification and the membership functions associated with it.
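One simple reading of the idea, assuming each inspected item receives a membership degree in the fuzzy set "nonconforming" (the data below are invented, and the paper's actual term-set construction is not reproduced), is to chart the mean membership degree per subgroup against classical 3-sigma p-chart limits:

```python
import numpy as np

def p_chart_limits(p_bar, n):
    """3-sigma control limits for a p chart with subgroup size n."""
    sigma = np.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - 3 * sigma), p_bar + 3 * sigma

# Hypothetical fuzzy inspection data: each item gets a membership degree in
# "nonconforming" (0 = clearly conforming, 1 = clearly nonconforming).
subgroups = [
    [0.0, 0.2, 0.0, 1.0, 0.0, 0.1, 0.0, 0.0, 0.3, 0.0],
    [0.0, 0.0, 0.5, 0.0, 0.0, 0.9, 0.0, 0.2, 0.0, 0.0],
    [1.0, 0.8, 0.0, 0.0, 0.6, 0.0, 0.4, 0.0, 0.0, 0.0],
]
# Fuzzy fraction nonconforming per subgroup: the mean membership degree
p_j = [float(np.mean(g)) for g in subgroups]
p_bar = float(np.mean(p_j))
lcl, ucl = p_chart_limits(p_bar, n=10)
out_of_control = [p for p in p_j if not (lcl <= p <= ucl)]
```

Items with membership 0.2 or 0.5 contribute partially to the plotted statistic instead of being forced into a 0/1 decision, which is exactly the ambiguity the abstract says the crisp p chart cannot represent.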

Quality Analysis for the Data Distribution Service of the Real-time Integrated Railway Safety Monitoring and Control System

  • 김상암;김선우
    • 한국도시철도학회논문집 / Vol. 6, No. 4 / pp. 351-361 / 2018
  • To satisfy the data transmission quality requirements of a real-time integrated railway safety monitoring system, this paper presents an experimental analysis of the Data Distribution Service (DDS) Quality of Service (QoS) policies, defined in the OMG DDS standard, that control network transmission quality. The real-time integrated railway safety monitoring and control system collects and analyzes data from various sensors in the railway domain to predict and prevent potential railway accident risks. To transmit the collected data accurately, reliably, and in real time, data transmission quality must be guaranteed. The experimental results show that DDS QoS can be used to guarantee accurate and reliable real-time data transmission for railway safety monitoring and control.
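The transmission-quality claim above is ultimately empirical; a minimal sketch of the kind of log analysis behind such a QoS experiment (this is not a DDS API call, and the timestamps are invented) computes the delivery ratio and latency statistics one would compare across, e.g., RELIABLE vs BEST_EFFORT settings:

```python
import statistics

# Hypothetical timestamped send/receive logs for sensor samples (sample id
# -> time in seconds); sample 4 was lost in transit.
sent = {1: 0.000, 2: 0.010, 3: 0.020, 4: 0.030, 5: 0.040}
received = {1: 0.004, 2: 0.013, 3: 0.026, 5: 0.045}

delivered = sorted(set(sent) & set(received))
delivery_ratio = len(delivered) / len(sent)            # fraction delivered
latencies = [received[i] - sent[i] for i in delivered]  # per-sample latency
mean_latency = statistics.mean(latencies)
max_latency = max(latencies)
```

A RELIABLE QoS setting would be expected to push the delivery ratio toward 1.0 at some latency cost, which is the trade-off such experiments quantify.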

Statistical Analysis of Quality Control Data of Blood Components

  • 김종암;서동희;권소영;오영철;임채승;장충훈;김순덕
    • 대한임상검사과학회지 / Vol. 36, No. 1 / pp. 19-26 / 2004
  • As domestic use of blood components increases, quality control of blood components is necessary to supply good products. The purpose of this study is to provide a production index for good products by comparing the accuracy and validity of the distributions of quality control data. Means, standard deviations, 95% confidence intervals, and the degree of normality were calculated by univariate procedures; the monthly means of each item at each blood center were compared by analysis of variance (ANOVA), and where the means differed, Duncan's multiple range test was used to confirm the difference. Finally, the accuracy and validity of the quality data were assessed with contingency table tests. The quality data of the five blood centers followed a normal distribution and fell within an acceptable range. For each center, the monthly means of hematocrit (Hct), platelet count (PLT), and pH were not significantly different, except for Hct at center C, PLT at centers B and D, and pH at center A. The quality data for each item were graded into six quality levels. The comparative analysis showed that the monthly mean Hct at centers C and E was significantly higher than at centers D, B, and A, and that the monthly mean PLT at center A and the pH at center C were significantly higher than at the other centers. In terms of accuracy and validity of the quality control data, center C for Hct, center A for PLT, and center C for pH performed best. Center C was the most satisfactory and stable in quality control of blood components; if its quality control methods were adopted by the other blood centers, their level of blood component preparation would improve.
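The ANOVA comparison at the heart of the method can be sketched with hypothetical monthly Hct means (the study's actual measurements are not reproduced here); the F statistic compares between-center to within-center variance:

```python
import numpy as np

def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group vs within-group variance."""
    all_data = np.concatenate([np.asarray(g, dtype=float) for g in groups])
    grand_mean = all_data.mean()
    k, n = len(groups), len(all_data)
    ss_between = sum(len(g) * (np.mean(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(((np.asarray(g) - np.mean(g)) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical monthly Hct means (%) at three blood centers; center C is
# noticeably higher, mimicking the kind of difference the study reports.
center_a = [39.8, 40.1, 40.0, 39.9, 40.2]
center_b = [40.0, 40.3, 39.9, 40.1, 40.2]
center_c = [41.0, 41.2, 40.9, 41.1, 41.3]
f_stat = one_way_anova_f([center_a, center_b, center_c])
```

A large F (here well above the critical value for 2 and 12 degrees of freedom) indicates a significant between-center difference, after which a post-hoc test such as Duncan's identifies which centers differ.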


Development and Assessment of a Real-Time Quality Control Algorithm for PM10 Data Observed by a Continuous Ambient Particulate Monitor

  • 김선영;이희춘;류상범
    • 대기 / Vol. 26, No. 4 / pp. 541-551 / 2016
  • A real-time quality control algorithm has been developed for PM10 concentrations measured by a Continuous Ambient Particulate Monitor (FH62C14, Thermo Fisher Scientific Inc.). The algorithm consists of five main procedures. The first step is a valid-value check: values must lie within the acceptable range, and values outside the instrument's detection limits (below 0 μg m⁻³ or above 5,000 μg m⁻³) are eliminated as unrealistic. The second step is a valid-error check: whenever an unusual condition occurs, the instrument saves an error code, and any value carrying an error code is eliminated. The third step is a persistence check, which requires a minimum variability of the data over a certain period; if the PM10 data have not varied over the past 60 minutes by more than the specified limit (0 μg m⁻³), the current 5-minute value fails the check. The fourth step is a time-continuity check, which eliminates gross outliers. The last step is a spike check on the time series, in which outlier detection is based on a double-difference time series using the median. After the quality control procedure, flags indicating normal or abnormal are added to the raw data. The algorithm was applied to PM10 data for an Asian dust case and a non-Asian dust case at the Seoul site, and to a dataset for the period 2013~2014 at 26 sites in Korea.
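The QC chain above can be sketched as follows; the valid-range and persistence thresholds follow the abstract, the spike test uses a median-based double difference as described but with an illustrative threshold, and the error-code and time-continuity steps are omitted for brevity:

```python
import numpy as np

GOOD, BAD = 0, 1

def qc_pm10(values, window=12, spike_limit=100.0):
    """Flag a series of 5-minute PM10 values (window=12 spans 60 minutes)."""
    v = np.asarray(values, dtype=float)
    flags = np.full(v.shape, GOOD)

    # Step 1: valid-value check -- detection limits 0..5,000 ug/m3
    flags[(v < 0) | (v > 5000)] = BAD

    # Step 3: persistence check -- no variability over the past 60 minutes
    for i in range(window, len(v)):
        if np.ptp(v[i - window:i + 1]) == 0:
            flags[i] = BAD

    # Step 5: spike check via a median-based double difference
    for i in range(1, len(v) - 1):
        dd = abs(v[i] - np.median([v[i - 1], v[i], v[i + 1]]))
        if dd > spike_limit:
            flags[i] = BAD
    return flags
```

For example, a series containing a 6,000 μg m⁻³ reading and an isolated jump to 250 μg m⁻³ gets both values flagged abnormal, while a long run of identical values trips the persistence check.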