• Title/Summary/Keyword: Process Data Analysis

Search Results: 9,332

The Study for Process Capability Analysis of Software Failure Interval Time (소프트웨어 고장 간격 시간에 대한 공정능력분석에 관한 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • Convergence Security Journal
    • /
    • v.7 no.2
    • /
    • pp.49-55
    • /
    • 2007
  • Software failure times presented in the literature exhibit constant, monotonically increasing, or monotonically decreasing patterns. For the data analysis of software reliability models, trend analysis is used as a data screening tool; its methods are the arithmetic mean test and the Laplace trend test. Trend analysis, however, offers only an outline of the data. To go beyond this, a new attempt is needed from the quality control side. In this paper, we discuss process capability analysis using process capability indices. Because software failure interval times take only nonnegative values, instead of a capability analysis that assumes a normal distribution, a capability analysis of the process distribution using the Box-Cox transformation is attempted. The software failure time data used for the process capability analysis is SS3; the results of the analysis are presented in Chapters 4 and 5, together with their practical use.

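The entry above applies a Box-Cox transformation before computing process capability indices on software failure interval times. A minimal Python sketch of that idea follows; the failure intervals and specification limits are illustrative placeholders (the paper itself uses the SS3 data set), not the study's actual figures.

```python
# Hedged sketch: Box-Cox transformation followed by a process capability
# calculation on software failure interval times. Data and spec limits
# below are illustrative only, not the SS3 values used in the paper.
import numpy as np
from scipy import stats

# Illustrative failure interval times (hours).
intervals = np.array([3.2, 7.5, 1.1, 12.4, 5.6, 9.8, 2.3, 15.0, 4.4, 6.7])

# Box-Cox requires strictly positive data; lambda is estimated by maximum likelihood.
transformed, lam = stats.boxcox(intervals)

# Hypothetical specification limits expressed on the transformed scale.
lsl, usl = transformed.min() - 1.0, transformed.max() + 1.0
mu, sigma = transformed.mean(), transformed.std(ddof=1)

cp = (usl - lsl) / (6 * sigma)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)
print(f"lambda={lam:.3f}  Cp={cp:.3f}  Cpk={cpk:.3f}")
```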

Analysis of a Repair Processes Using a Process Mining Tool (프로세스 마이닝 기법을 활용한 고장 수리 프로세스 분석)

  • Choi, Sang Hyun;Han, Kwan Hee;Lim, Gun Hoon
    • The Journal of the Korea Contents Association
    • /
    • v.13 no.4
    • /
    • pp.399-406
    • /
    • 2013
  • Recently, studies on process mining for creating and analyzing business process models from log data have received much attention from BPM (Business Process Management) researchers. Process mining is a method that extracts meaningful information and hidden rules from the event logs of enterprise information systems such as ERP and BPM. In this paper, repair processes of electronic devices are analyzed using ProM, a process mining tool. Based on this analysis, a method for finding major failure patterns through multi-dimensional data analysis, beyond simple statistics, is proposed. By using the proposed method, the reliability of electronic devices can be increased by providing the identified failure patterns to the design team.
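
The repair-process study above mines failure patterns from event logs with ProM, a GUI tool. As a hedged, ProM-free illustration of the multi-dimensional roll-up it describes, the pandas sketch below counts how often each (model, symptom, repair action) combination appears in a toy repair log; all column names and records are assumed for illustration.

```python
# Hedged sketch (not the paper's ProM workflow): a multi-dimensional
# roll-up of a repair event log to surface frequent failure patterns.
import pandas as pd

events = pd.DataFrame({
    "case_id":  ["C1", "C1", "C2", "C2", "C3", "C3"],
    "model":    ["TV-A", "TV-A", "TV-A", "TV-A", "TV-B", "TV-B"],
    "symptom":  ["no power", "no power", "no power", "no power", "no signal", "no signal"],
    "activity": ["receive", "replace PSU", "receive", "replace PSU", "receive", "reflash firmware"],
})

# Keep the final repair action of each case, then count pattern frequencies.
last_action = events.groupby("case_id").last().reset_index()
patterns = (last_action
            .groupby(["model", "symptom", "activity"])
            .size()
            .sort_values(ascending=False))
print(patterns)
```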

The Basic Study on the Pre-Process Development of Integrated System for the Structural Analysis of Space Frame (스페이스 프레임 구조 해석을 위한 통합 시스템의 전처리 과정 개발을 위한 기초 연구)

  • 권영환;정환목;석창목;김선희
    • Proceedings of the Computational Structural Engineering Institute Conference
    • /
    • 1999.10a
    • /
    • pp.378-386
    • /
    • 1999
  • The integrated system for the structural analysis of space frames is made up of 4 modules: a pre-process module, a structural analysis module, an optimum member design module, and a post-process module. The pre-process module in turn involves a data input module and a structure modeling module. This study develops an efficient modeling program as a basis for the development of the pre-process module. The modeling program generates the geometric information of the space frame and produces the input file format for structural analysis from only general input data. Using this program, users can model a space frame efficiently within a short time.

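The pre-process module described above generates the geometric information of a space frame from a small amount of general input data. The sketch below is a loose illustration of that kind of geometry generation for a double-layer grid; the parameter names and layout rule are assumptions, not the paper's actual input format.

```python
# Illustrative pre-processing step: generate node coordinates for a
# double-layer grid space frame from a few general parameters
# (grid counts, module size, depth). All names are assumptions.
def generate_space_frame_nodes(nx, ny, module, depth):
    """Return top- and bottom-chord node coordinates of a double-layer grid."""
    top = [(i * module, j * module, depth)
           for j in range(ny + 1) for i in range(nx + 1)]
    # Bottom-chord nodes sit at the centre of each top-layer module.
    bottom = [((i + 0.5) * module, (j + 0.5) * module, 0.0)
              for j in range(ny) for i in range(nx)]
    return top, bottom

top_nodes, bottom_nodes = generate_space_frame_nodes(nx=4, ny=3, module=2.0, depth=1.5)
print(len(top_nodes), len(bottom_nodes))  # 20 top nodes, 12 bottom nodes
```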

Analysis on Types of Golf Tourism After COVID-19 by using Big Data

  • Hyun Seok Kim;Munyeong Yun;Gi-Hwan Ryu
    • International Journal of Advanced Culture Technology
    • /
    • v.12 no.1
    • /
    • pp.270-275
    • /
    • 2024
  • Introduction: The purpose of this study is to analyze the types of golf tourism, inbound or outbound, using big data, and to see how the industry has shifted and what changes occurred in the golf industry during and after COVID-19. Method: Using Textom, a big data analysis tool, "golf tourism" and "COVID-19" were selected as keywords, and search frequency information from Naver and Daum was collected for one year, from 1st January 2023 to 31st December 2023; data preprocessing was conducted on this basis. For the suitability of the study and more accurate data, data not related to "golf tourism" were removed through a refining process, and similar keywords were grouped into the same keyword for analysis. As a result of the word refining process, the top 36 keywords with the highest relevance and search frequency were selected and applied to this study. These 36 keywords were subjected to TF-IDF analysis, visualization analysis using the Ucinet6 and NetDraw programs, network analysis between keywords, and cluster analysis between keywords through Concor analysis. Results: The big data analysis showed that the availability of overseas golf tourism is affecting inbound golf travel. "Golf", "Tourism", "Vietnam", and "Thailand" showed high frequencies, indicating that overseas golf tours are once again the prevailing trend.
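
The study above runs its keyword statistics through Textom, Ucinet6, NetDraw and Concor. As a rough Python analogue of the TF-IDF step only, the sketch below ranks terms by summed TF-IDF weight over a few placeholder documents; the documents are invented for illustration.

```python
# Python analogue of the TF-IDF ranking step (the study itself used Textom).
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "golf tourism vietnam package covid recovery",
    "golf tourism thailand resort booking",
    "domestic golf course reservation price increase",
]

vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(docs)

# Rank terms by their summed TF-IDF weight across the corpus.
weights = tfidf.sum(axis=0).A1
ranking = sorted(zip(vectorizer.get_feature_names_out(), weights),
                 key=lambda kv: kv[1], reverse=True)
print(ranking[:5])
```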

Workflow Process-Aware Data Cubes and Analysis (워크플로우 프로세스 기반 데이터 큐브 및 분석)

  • Jin, Min-hyuck;Kim, Kwang-hoon Pio
    • Journal of Internet Computing and Services
    • /
    • v.19 no.6
    • /
    • pp.83-89
    • /
    • 2018
  • In workflow process intelligence and systems, workflow process mining and analysis issues are becoming increasingly important. To improve the quality of workflow process intelligence, it is essential to provision an efficient and effective data center that stores workflow enactment event logs for workflow process mining and analytics. In this paper, we propose a three-dimensional process-aware data cube for organizing workflow enterprise data centers so that workflow process enactment event logs in the XES format are stored efficiently and effectively. As a validation step, we carry out an experimental process mining study to show how well the process-aware data cubes are suited to discovering workflow process patterns and their analytical knowledge, such as enacted proportions and enacted work transferences, from the workflow process enactment event histories. Finally, we confirm that it is feasible to discover the fundamental control-flow patterns of workflow processes through the implemented workflow process mining system based on the process-aware data cube.
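
The paper above organizes XES enactment event logs into a three-dimensional process-aware data cube. A minimal pandas sketch of that roll-up idea follows, grouping a toy event log along process, activity and performer dimensions; the column names and records are illustrative, and a real deployment would parse the XES log first.

```python
# Minimal sketch of a process-aware "cube": roll an enactment event log
# up along three dimensions (process, activity, performer).
import pandas as pd

log = pd.DataFrame({
    "process":   ["order", "order", "order", "claim", "claim"],
    "activity":  ["receive", "approve", "ship", "receive", "reject"],
    "performer": ["alice", "bob", "alice", "carol", "bob"],
})

cube = (log.groupby(["process", "activity", "performer"])
           .size()
           .rename("enactments"))

# Example roll-ups: enacted proportions per activity, work per performer.
print(cube.groupby(level="activity").sum() / cube.sum())
print(cube.groupby(level="performer").sum())
```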

A multivariate latent class profile analysis for longitudinal data with a latent group variable

  • Lee, Jung Wun;Chung, Hwan
    • Communications for Statistical Applications and Methods
    • /
    • v.27 no.1
    • /
    • pp.15-35
    • /
    • 2020
  • In behavioral research, significant attention has been paid to the stage-sequential process for multiple latent class variables. We explore the stage-sequential process of multiple latent class variables using multivariate latent class profile analysis (MLCPA). A latent profile variable, representing the stage-sequential process in MLCPA, is formed by a set of repeatedly measured categorical response variables. This paper proposes an extended MLCPA in order to explain the association between the latent profile variable and a latent group variable in the form of a two-dimensional contingency table. We applied the extended MLCPA to the National Longitudinal Survey of Youth 1997 (NLSY97) data to investigate the association between the developmental progression of depression and substance use behaviors among adolescents who experienced an authoritarian parenting style in their youth.
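
For orientation, MLCPA builds on the standard latent class mixture, in which a categorical response vector is modeled as a finite mixture over C latent classes. The generic form below uses conventional LCA notation and is not the paper's exact MLCPA specification:

\[
P(\mathbf{Y}=\mathbf{y}) = \sum_{c=1}^{C} \pi_c \prod_{j=1}^{J} \prod_{k=1}^{K_j} \rho_{jk \mid c}^{\,I(y_j = k)}
\]

where \(\pi_c\) are the class prevalences and \(\rho_{jk \mid c}\) are the item-response probabilities of category k on item j within class c.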

Stock Forecasting Using Prophet vs. LSTM Model Applying Time-Series Prediction

  • Alshara, Mohammed Ali
    • International Journal of Computer Science & Network Security
    • /
    • v.22 no.2
    • /
    • pp.185-192
    • /
    • 2022
  • Forecasting and time-series modelling play a vital role in the data analysis process, and time series are widely used in analytics and data science. Forecasting stock prices is a popular and important topic in financial and academic studies. The stock market is an unregulated setting for forecasting, due to the absence of essential rules for estimating or predicting a stock price. Predicting stock prices is therefore a challenging time-series problem. Machine learning offers many methods and applications instrumental in implementing stock price forecasting, such as technical analysis, fundamental analysis, time series analysis, and statistical analysis. This paper discusses implementing and researching stock price forecasting using the Prophet and LSTM models. This process and task are complex and involve uncertainty. Although stock prices can never be predicted exactly because of their ambiguous nature, this paper aims to apply the concepts of forecasting and data analysis to predict stocks.
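
The comparison above includes Prophet, whose API is compact enough to show in a few lines. The sketch below is a hedged illustration of the Prophet side only; the CSV path and original column names are placeholders, and Prophet expects its input columns to be named ds and y.

```python
# Hedged sketch of the Prophet half of the comparison; "stock_prices.csv"
# and the Date/Close column names are placeholders for real stock data.
import pandas as pd
from prophet import Prophet

df = pd.read_csv("stock_prices.csv")                 # placeholder path
df = df.rename(columns={"Date": "ds", "Close": "y"}) # Prophet's expected schema

model = Prophet(daily_seasonality=True)
model.fit(df)

future = model.make_future_dataframe(periods=30)     # forecast 30 days ahead
forecast = model.predict(future)
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())
```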

Measurement and Analysis Process Improvement Based on CMMI (CMMI 기반의 측정 및 분석 프로세스 개선)

  • Han, Hyuk-Soo;Do, Sung-Ryong
    • Journal of Information Technology Services
    • /
    • v.10 no.4
    • /
    • pp.229-242
    • /
    • 2011
  • Measurement and analysis activities are necessary for managing a software project. At a minimum, every project measures time and cost in order to figure out whether it will finish within its deadline. CMMI has a Measurement and Analysis process area at Maturity Level 2. In the Measurement and Analysis process, indicators for decision making in project management are defined, and the analysis procedure for the measurements that produce those indicators is specified. The way of collecting and storing the data is also planned. Establishing an efficient and effective measurement and analysis process in the organization, by improving the existing process, is very important for project success. In this paper, we provide a method for analyzing the measurement and analysis process and improving it based on the IDEAL model. It will support organizations that are trying to adopt CMMI in establishing a measurement and analysis process.

A Study on Unstructured text data Post-processing Methodology using Stopword Thesaurus (불용어 시소러스를 이용한 비정형 텍스트 데이터 후처리 방법론에 관한 연구)

  • Won-Jo Lee
    • The Journal of the Convergence on Culture Technology
    • /
    • v.9 no.6
    • /
    • pp.935-940
    • /
    • 2023
  • Most text data collected through web scraping for artificial intelligence and big data analysis is large and unstructured, so a purification process is required before big data analysis. The data become analyzable structured data through a heuristic pre-processing refining step and a post-processing machine refining step. In this study, in the post-processing machine refining step, a Korean dictionary and a stopword dictionary are used to extract vocabulary for the frequency analysis needed in word cloud analysis. In this step, a "user-defined stopword thesaurus" is applied to efficiently remove stopwords that were not removed. We propose a methodology for applying this thesaurus and examine the pros and cons of the proposed refining method through a case analysis in which the proposed "user-defined stopword thesaurus" technique, intended to complement the problems of the existing "stopword dictionary" method, is combined with R's word cloud technique. We present a comparative verification and suggest the effectiveness of the practical application of the proposed methodology.
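
The study above works in R, but the post-processing idea is tool-agnostic: after the generic stopword dictionary has run, a user-defined stopword thesaurus removes the leftovers before word frequencies are fed to the word cloud. The Python sketch below illustrates only that filtering step; the tokens and thesaurus entries are invented for illustration.

```python
# Python analogue (the study itself uses R's word cloud tooling) of applying
# a user-defined stopword thesaurus after tokenisation; tokens are illustrative.
from collections import Counter

tokens = ["데이터", "분석", "것", "수", "분석", "빅데이터", "등", "데이터"]

# User-defined stopword thesaurus: stopwords the generic dictionary missed.
stopword_thesaurus = {"것", "수", "등"}

filtered = [t for t in tokens if t not in stopword_thesaurus]
frequencies = Counter(filtered)          # input for a word-cloud renderer
print(frequencies.most_common())
```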

The study of a full cycle semi-automated business process re-engineering: A comprehensive framework

  • Lee, Sanghwa;Sutrisnowati, Riska A.;Won, Seokrae;Woo, Jong Seong;Bae, Hyerim
    • Journal of the Korea Society of Computer and Information
    • /
    • v.23 no.11
    • /
    • pp.103-109
    • /
    • 2018
  • This paper presents an idea and framework to automate the full cycle of business process management and re-engineering by integrating traditional business process management systems, process mining, data mining, machine learning, and simulation. We build our framework on a cloud-based platform so that various data sources can be incorporated. We design the system to be extensible, so that it is beneficial not only for practitioners of BPM but also for researchers, and the framework can serve as a test bed for researchers without the complication of system integration. The automation of the redesigning phase and the selection of a baseline process model for deployment are the two main contributions of this study. In the redesigning phase, we deal with both the analysis of the existing process model and what-if analysis on how to improve the process at the same time. Additionally, improving a business process must often be handled case by case, which requires a lot of trial and error and a huge amount of data. In selecting the baseline process model, we need to compare many probable routes of business execution and calculate the most efficient one with respect to production cost and execution time. We also discuss the challenges and limitations of the framework, including the system's adoptability, technical difficulties, and human factors.
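
As a loose illustration of the baseline-selection step described above, the sketch below scores a few candidate process routes on execution time and production cost and picks the lowest weighted score; the candidate figures and the weights are hypothetical, not part of the paper's framework.

```python
# Hypothetical sketch: rank candidate process routes by a weighted
# combination of execution time and production cost, then pick the best.
candidates = {
    "route_A": {"exec_time_h": 12.0, "cost": 340.0},
    "route_B": {"exec_time_h": 9.5,  "cost": 410.0},
    "route_C": {"exec_time_h": 15.0, "cost": 300.0},
}

W_TIME, W_COST = 0.6, 0.4   # hypothetical weights for the two criteria

def score(metrics):
    """Lower is better: weighted sum of execution time and cost."""
    return W_TIME * metrics["exec_time_h"] + W_COST * metrics["cost"]

baseline = min(candidates, key=lambda name: score(candidates[name]))
print(baseline, score(candidates[baseline]))
```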