• Title/Summary/Keyword: Process Data Analysis


Accuracy Analysis of Online GPS Data Processing Service (온라인 GPS 자료처리 서비스의 정확도분석)

  • Kong, Joon-Mook;Park, Joon-Kyu;Lee, Choi-Gu;Lee, Young-Wook
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.28 no.1 / pp.13-20 / 2010
  • Currently, GPS data processing software produces different results depending on the user's skill and the software used. Moreover, considerable time and effort are required for a general user, rather than a specialist, to use GPS data processing software. On the other hand, online GPS data processing services have the merit of carrying out GPS data processing without technical effort or time. In this study, observation data from permanent GPS sites of the NGII (National Geographic Information Institute) were processed by online GPS data processing services, and a utilization assessment was performed by comparing the results with the coordinates published by the NGII in order to analyze positional accuracy. Ten permanent GPS sites of the NGII, including Suwon, which is registered with the IGS (International GNSS Service), were selected, and their GPS observation data were processed by AUSPOS and CSRS-PPP.

Development of Reliability Analysis Procedures for Repairable Systems with Interval Failure Time Data and a Related Case Study (구간 고장 데이터가 주어진 수리가능 시스템의 신뢰도 분석절차 개발 및 사례연구)

  • Cho, Cha-Hyun;Yum, Bong-Jin
    • Journal of the Korea Institute of Military Science and Technology / v.14 no.5 / pp.859-870 / 2011
  • The purpose of this paper is to develop reliability analysis procedures for repairable systems with interval failure time data and to apply them in assessing the storage reliability of a subsystem of a certain type of guided missile. In the procedures, the interval failure time data are converted to pseudo failure times using the uniform random generation, mid-point, or equispaced-intervals method. Analytic trend tests such as the Laplace, Lewis-Robinson, and pair-wise comparison nonparametric tests are then used to determine whether the failure process follows a renewal or non-renewal process. Monte Carlo simulation experiments are conducted to compare the three conversion methods in terms of the statistical performance of each trend test when the underlying process is homogeneous Poisson, renewal, or non-homogeneous Poisson. The simulation results show that the uniform random generation method is the best of the three. These results are applied to actual field data collected for a subsystem of a certain type of guided missile to identify its failure process and to estimate its mean time to failure and annual mean repair cost.
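The interval-to-pseudo-failure-time conversion and the Laplace trend test described in this abstract can be sketched as follows; this is a minimal illustration assuming a time-truncated observation window, and the interval data, seed, and function names are assumptions, not taken from the paper:

```python
import math
import random

def midpoint_times(intervals):
    """Mid-point method: replace each censoring interval (lo, hi]
    with its midpoint as a pseudo failure time."""
    return [0.5 * (lo + hi) for lo, hi in intervals]

def uniform_times(intervals, seed=0):
    """Uniform random generation method: draw each pseudo failure
    time uniformly within its censoring interval."""
    rng = random.Random(seed)
    return [rng.uniform(lo, hi) for lo, hi in intervals]

def laplace_statistic(times, T):
    """Laplace trend test statistic for failure times observed over
    (0, T].  Values near 0 are consistent with a homogeneous Poisson
    process; large |U| suggests a trend (non-renewal behaviour)."""
    n = len(times)
    return (sum(times) / n - T / 2) / (T * math.sqrt(1.0 / (12 * n)))

# Hypothetical interval failure data (failures known only to
# 6-month calendar windows) over a 48-month observation period.
intervals = [(0, 6), (6, 12), (12, 18), (30, 36), (42, 48)]
u_mid = laplace_statistic(midpoint_times(intervals), 48.0)
u_rnd = laplace_statistic(uniform_times(intervals), 48.0)
```

Comparing `u_mid` and `u_rnd` against normal quantiles (e.g. ±1.96 at the 5% level) gives the trend decision; the paper's Monte Carlo comparison of the three conversion methods goes well beyond this sketch.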

Development of Quality Improvement Process based on the Maintenance Data of Weapon Systems (무기체계 정비 데이터를 활용한 품질 개선 프로세스 개발)

  • Kim, HunGil;Kwon, SeMin;Cho, KyoungHo;Sung, Si-Il
    • Journal of Korean Society for Quality Management / v.43 no.4 / pp.499-510 / 2015
  • Purpose: This paper treats the improvement of the quality and reliability of military weapon systems based on maintenance data. Methods: The proposed data integration and refinement methods are used to obtain component reliability information and to find frequently failed components via Pareto analysis. Based on the reliability information and the component failure frequencies, the target components for quality improvement are determined and improved through multiple methods, such as engineering changes, special meetings, additional training, and revised maintenance manuals. Results: Based on the proposed process, we find several components that need to be improved in order to enhance quality and reliability. Conclusion: A process is developed for improving the quality and reliability of weapon systems. This process can be adopted for various weapon systems to enhance quality and reliability as well as reduce military spending.
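The Pareto selection of improvement targets described in the Methods section can be sketched as follows; the component names and counts are hypothetical, not from the paper:

```python
def pareto_targets(failure_counts, cutoff=0.8):
    """Rank components by failure frequency and return the smallest
    leading group that accounts for at least the given share of all
    failures (the classic 80/20 Pareto selection)."""
    total = sum(failure_counts.values())
    ranked = sorted(failure_counts.items(), key=lambda kv: kv[1], reverse=True)
    targets, cum = [], 0
    for name, count in ranked:
        targets.append(name)
        cum += count
        if cum / total >= cutoff:
            break
    return targets

# Hypothetical maintenance-record failure counts per component.
counts = {"pump": 40, "valve": 25, "seal": 20, "sensor": 10, "cable": 5}
print(pareto_targets(counts))  # → ['pump', 'valve', 'seal']
```

The components returned here would be the candidates for engineering changes or revised maintenance manuals in the paper's improvement step.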

Development of a method for securing the operator's situation awareness from manipulation attacks on NPP process data

  • Lee, Chanyoung;Song, Jae Gu;Lee, Cheol Kwon;Seong, Poong Hyun
    • Nuclear Engineering and Technology / v.54 no.6 / pp.2011-2022 / 2022
  • According to the defense-in-depth concept, not only a preventive strategy but also an integrated cyberattack response strategy for NPPs should be established. However, there are limitations in terms of responding to penetrations, and the existing EOPs are insufficient for responding to intentional disruptions. In this study, we focus on manipulative attacks on process data. Based on an analysis of the related attack vectors and possible attack scenarios, we adopt the Kalman filter to detect process anomalies that can be caused by manipulations of process data. To compensate for these manipulations and secure MCR operators' situational awareness, we modify the Kalman filter such that it can filter out the effects of the manipulations adaptively. A case study was conducted using a hardware-in-the-loop system. The results indicated that the developed method can be used to verify whether the displayed safety-related state data are reliable and to implement the required safety response actions.
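The innovation (residual) test behind such Kalman-filter-based anomaly detection can be sketched for a single scalar process variable; the constant-state model, gate width, and readings below are illustrative assumptions, not the paper's NPP model:

```python
class ScalarKalman:
    """Minimal 1-D Kalman filter: constant-state model x_k = x_{k-1} + w,
    measurement z_k = x_k + v.  The normalized innovation is used to
    flag measurements that deviate from the model prediction."""
    def __init__(self, x0, p0, q, r):
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def step(self, z, gate=3.0):
        self.p += self.q              # predict (state itself is constant)
        s = self.p + self.r           # innovation variance
        nu = z - self.x               # innovation
        anomalous = abs(nu) > gate * s ** 0.5
        if not anomalous:             # adaptively skip suspect readings
            k = self.p / s
            self.x += k * nu
            self.p *= (1 - k)
        return self.x, anomalous

kf = ScalarKalman(x0=100.0, p0=1.0, q=0.01, r=0.25)
readings = [100.1, 99.9, 100.2, 150.0, 100.0]  # 150.0 mimics a manipulation
flags = [kf.step(z)[1] for z in readings]
```

A reading whose innovation exceeds the 3-sigma gate is flagged and excluded from the update, which is one simple way to keep the displayed estimate from being dragged along by manipulated process data.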

A Monitoring System for Functional Input Data in Multi-phase Semiconductor Manufacturing Process (다단계 반도체 제조공정에서 함수적 입력 데이터를 위한 모니터링 시스템)

  • Jang, Dong-Yoon;Bae, Suk-Joo
    • Journal of Korean Institute of Industrial Engineers / v.36 no.3 / pp.154-163 / 2010
  • Process monitoring of output variables affecting final performance has mainly been executed in semiconductor manufacturing. However, even early detection of the causes of output variation cannot completely prevent yield loss, because a number of wafers must be re-processed or discarded after detection. Semiconductor manufacturers have therefore paid more attention to monitoring process inputs, to prevent yield loss by detecting process change-points early. In this paper, we propose a method to efficiently monitor functional input variables in a multi-phase semiconductor manufacturing process. Input variables measured in the multi-phase process tend to have a functional structure. After pre-processing these functional input data, change-point analysis is applied to the pre-processed data set. If process variation occurs, the key variables contributing to it are selected using a contribution plot, for monitoring efficiency. To evaluate the proposed monitoring method, we used a real data set from a semiconductor manufacturing process. The experiment shows that the proposed method outperforms previous output-monitoring methods in terms of fault detection and process monitoring.
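The change-point analysis step can be illustrated with a minimal least-squares mean-shift detector; the trace below is a hypothetical pre-processed input signal, not the paper's wafer data:

```python
def change_point(xs):
    """Single change-point estimate for a mean shift: choose the split
    index that minimizes the total within-segment sum of squared
    deviations (equivalent to a least-squares two-segment fit)."""
    def sse(seg):
        m = sum(seg) / len(seg)
        return sum((x - m) ** 2 for x in seg)
    best_k, best_cost = None, float("inf")
    for k in range(1, len(xs)):
        cost = sse(xs[:k]) + sse(xs[k:])
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

# Hypothetical pre-processed input trace with a mean shift at index 5.
trace = [1.0, 1.1, 0.9, 1.0, 1.05, 3.0, 3.1, 2.9, 3.05, 3.0]
print(change_point(trace))  # → 5
```

In the multi-phase functional setting of the paper, this scalar test would be applied after the functional pre-processing step, with contribution plots used afterwards to attribute the detected shift to specific input variables.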

Air-Data Estimation for Air-Breathing Hypersonic Vehicles

  • Kang, Bryan-Heejin
    • Transactions on Control, Automation and Systems Engineering / v.1 no.1 / pp.75-86 / 1999
  • An air-data estimator for generic air-breathing hypersonic vehicles (AHSVs) is developed and demonstrated with an example vehicle configuration. The AHSV air-data estimation strategy emphasizes improving the accuracy of the angle-of-attack estimate to the degree necessitated by the stringent operational requirements of air-breathing propulsion. The resulting estimation problem involves a highly nonlinear diffusion process (propagation); consequently, significant distortion of the a posteriori conditional density is suspected. A simulation-based statistical analysis tool is developed to characterize the nonlinear diffusion process. The statistical analysis results indicate that the diffusion process preserves the symmetry and unimodality of the initial probability density of the state variables, and provide a basis for the applicability of an Extended Kalman Filter (EKF). An EKF is designed for the AHSV air-data system, and its air-data estimation capabilities are demonstrated.
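The linearization at the heart of an EKF can be illustrated with a scalar toy problem; the quadratic measurement model below is purely illustrative and unrelated to the actual AHSV air-data equations:

```python
def ekf_update(x, p, z, h, h_jac, r):
    """One EKF measurement update for a scalar state: the nonlinear
    measurement function h is linearized (via its Jacobian) at the
    current estimate before the standard Kalman gain is applied."""
    H = h_jac(x)
    s = H * p * H + r            # innovation variance
    k = p * H / s                # Kalman gain
    x_new = x + k * (z - h(x))   # correct with the nonlinear residual
    p_new = (1 - k * H) * p
    return x_new, p_new

# Toy nonlinear measurement z = x**2 (illustrative only).
h = lambda x: x * x
h_jac = lambda x: 2 * x
x, p = 2.0, 1.0
for z in [9.2, 8.8, 9.1]:        # noisy observations of x**2, true x = 3
    x, p = ekf_update(x, p, z, h, h_jac, 0.5)
```

The paper's contribution is precisely the statistical check that the propagated density stays symmetric and unimodal, which is what justifies trusting this kind of first-order linearization.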


The Total Ranking Method from Multi-Categorized Voting Data Based on Analytic Hierarchy Process

  • Ogawa, Masaru;Ishii, Hiroaki
    • Industrial Engineering and Management Systems / v.1 no.1 / pp.93-98 / 2002
  • It is important to evaluate the performance of candidates mathematically from various aspects and to reflect it in decision making. In decision making, we judge candidates in two steps: classification of objects, and comparison of objects or candidates with plural elements. In the former step, the Analytic Hierarchy Process (AHP) is a useful method for evaluating candidates from plural viewpoints; in the latter step, Data Envelopment Analysis (DEA) is a useful method for evaluating candidates with plural categorized data. In fact, each candidate has plural elements, and it has become ever more important in the IT society to evaluate from various aspects. We therefore propose a new procedure complementing AHP with DEA.
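The AHP step, deriving priority weights from a pairwise comparison matrix, can be sketched as follows; the 3x3 matrix is a made-up consistent example, not from the paper:

```python
def ahp_priorities(A, iters=50):
    """Approximate the AHP priority vector as the principal
    eigenvector of the pairwise comparison matrix A, computed by
    power iteration and normalized to sum to 1."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Hypothetical reciprocal comparison matrix for 3 criteria
# (consistent: entry (i, j) equals w_i / w_j).
A = [[1,     3,     5],
     [1 / 3, 1,     5 / 3],
     [1 / 5, 3 / 5, 1]]
w = ahp_priorities(A)
```

For this consistent matrix the weights come out proportional to 1 : 1/3 : 1/5; in a real AHP application one would also compute a consistency ratio before trusting the weights, a check omitted from this sketch.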

Design of Data Fusion and Data Processing Model According to Industrial Types (산업유형별 데이터융합과 데이터처리 모델의 설계)

  • Jeong, Min-Seung;Jin, Seon-A;Cho, Woo-Hyun
    • KIPS Transactions on Software and Data Engineering / v.6 no.2 / pp.67-76 / 2017
  • Industrial sites in various fields generate large amounts of correlated data in combination. A variety of data can be collected from each type of industrial process, but the associations between processes cannot be integrated with one another. In existing industrial practice, when a problem occurs in the work process, the set values of the molding condition table are entered by the operator as arbitrary values. In this paper, we design a fusion and analysis processing model for the data collected for each industrial type. For process manufacturing industries such as the prediction case (automobile connectors), master data such as the standard molding condition table are compared with the production history files collected during the manufacturing process, aiming at improved corporate earnings: the failure rate is reduced by replacing the operator's arbitrary values with a new, digitized molding condition table, and new patterns are analyzed and reinterpreted for various malfunction factors and exceptions, leading to increased productivity, process improvement, and cost savings. Various data analyses and model validations can be designed. In addition, the objectivity, consistency, and optimization of the manufacturing process are secured through analyzed and verified standard set values, and optimization (standard-setting) techniques fitting each industry type can be supported through various pattern types.

Die Design of Hot Extrusion for Hexagonal Insert (Hexagonal 인서트용 열간압출 금형설계)

  • 권혁홍;이정로
    • Transactions of the Korean Society of Machine Tool Engineers / v.11 no.1 / pp.32-37 / 2002
  • The use of hexagonal ceramic inserts for copper extrusion dies offers significant technical and economic advantages over other forms of manufacture. In this paper the data on the loading of the tools is determined from a commercial FEM package as the contact stress distribution on the die-workpiece interface and as temperature distributions in the die. This data can be processed as load input data for a finite element die-stress analysis. Process simulation and stress analysis are thus combined during the design and a data exchange program has been developed that enables optimal design of the dies taking into account the elastic deflections generated in shrink fitting the die inserts and that caused by the stresses generated in the process. The stress analysis of the dies is used to determine the stress conditions on the ceramic insert by considering contact and interference effects under both mechanical and thermal loads.

Machine Learning Methodology for Management of Shipbuilding Master Data

  • Jeong, Ju Hyeon;Woo, Jong Hun;Park, JungGoo
    • International Journal of Naval Architecture and Ocean Engineering / v.12 no.1 / pp.428-439 / 2020
  • The continuous development of information and communication technologies has resulted in an exponential increase in data. Consequently, technologies related to data analysis are growing in importance. The shipbuilding industry has high production uncertainty and variability, which has created an urgent need for data analysis techniques such as machine learning. In particular, the industry cannot effectively respond to changes in production-related standard time information systems, such as the basic cycle time and lead time. Improvement measures are necessary to enable the industry to respond swiftly to changes in the production environment. In this study, the lead times for fabrication, ship block assembly, spool fabrication, and painting were predicted using machine learning technology, in order to propose a new management method for the process lead time using a master data system for the time element in the production data. Data preprocessing was performed in various ways using R and Python, which are open-source programming languages, and process variables were selected considering their relationships with the lead time through correlation analysis and analysis of variables. Various machine learning, deep learning, and ensemble learning algorithms were applied to create the lead time prediction models. In addition, the applicability of the proposed machine learning methodology to standard work-hour prediction was verified by evaluating the prediction models using evaluation criteria such as the Mean Absolute Percentage Error (MAPE) and the Root Mean Squared Logarithmic Error (RMSLE).