• Title/Summary/Keyword: Process Data Analysis


IFCXML BASED AUTOMATIC DATA INPUT APPROACH FOR BUILDING ENERGY PERFORMANCE ANALYSIS

  • Ka-Ram Kim;Jung-Ho Yu
    • International conference on construction engineering and project management / 2013.01a / pp.173-180 / 2013
  • To analyze building energy consumption, a building description for building energy performance analysis (BEPA) is required, and entering the required data for the subject building is a basic step in the BEPA process. Since building information modeling (BIM) is applied in the construction industry, the required data for BEPA can be gathered from a single international standard file format such as IFCXML. However, in most BEPA processes the required data cannot be fully extracted from the IFCXML file, so the building description for BEPA must be created again. This paper proposes an IFCXML-based automatic data input approach for BEPA. After the required data for BEPA are defined, automatic data input for BEPA is implemented in a prototype system. To evaluate the proposed system, a common BIM file from the BuildingSMART website is used as a sample model. The system can increase the efficiency and reliability of the BEPA process, since the data input step is automated by directly using the IFCXML file.

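As a rough illustration of the kind of automatic data input the paper describes, the sketch below reads an ifcXML export with Python's standard xml.etree module and collects IfcSpace entries. It is a minimal sketch under assumed element and attribute names and file layout, not the paper's actual schema mapping or prototype.

```python
# Minimal sketch: extracting per-space input data from an ifcXML file.
# Element/attribute names below are illustrative assumptions; a real
# ifcXML schema uses IFC entity names such as IfcSpace and IfcWall.
import xml.etree.ElementTree as ET

def extract_spaces(path):
    """Collect per-space data that a BEPA tool might need."""
    tree = ET.parse(path)
    spaces = []
    # Match IfcSpace elements regardless of XML namespace prefix.
    for elem in tree.getroot().iter():
        if elem.tag.split('}')[-1] == 'IfcSpace':
            spaces.append({
                'id': elem.get('id'),
                'name': elem.get('Name'),
                'long_name': elem.get('LongName'),
            })
    return spaces

if __name__ == '__main__':
    for space in extract_spaces('sample_building.ifcxml'):
        print(space)
```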

A Container Orchestration System for Process Workloads

  • Jong-Sub Lee;Seok-Jae Moon
    • International Journal of Internet, Broadcasting and Communication / v.15 no.4 / pp.270-278 / 2023
  • We propose a container orchestration system for process workloads that combines big data and machine learning technologies to integrate enterprise process-centric workloads. The proposed system analyzes big data generated from industrial automation to identify hidden patterns and build machine learning prediction models. For each machine learning case, training data is loaded into a data store and preprocessed for model training. An appropriate model is then selected and trained on the training data, and evaluated against held-out test data. This step, called model construction, can be performed in a deployment framework. Additionally, a visual hierarchy is constructed to display prediction results and facilitate big data analysis. To evaluate parallel computation of PCA in the proposed system, the required cluster was built by creating multiple virtual machines in a big data cluster. The proposed system is modeled as layers of individual components that can be connected together; the advantage of this design is that components can be added, replaced, or reused without affecting the rest of the system.
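
A minimal sketch of the model-construction step described above (load, preprocess, train on training data, evaluate on test data), using scikit-learn; the file name, column layout, and the choice of a random-forest model are illustrative assumptions.

```python
# Sketch of the load/preprocess/train/evaluate loop from the abstract.
# The data file and column names are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

df = pd.read_csv('process_workload.csv')  # load from the data store
X = StandardScaler().fit_transform(df.drop(columns=['label']))  # preprocess
y = df['label']

# Hold out test data, fit the selected model, then evaluate it.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
model = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
print('test accuracy:', model.score(X_test, y_test))
```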

A Study on System Applications of e-CRM to Enforcement of consumer Service (e-Commerce 쇼핑몰의 소비자 서비스 강화를 위한 활용연구)

  • Kim Yeonjeong
    • Journal of the Korean Home Economics Association / v.43 no.3 s.205 / pp.1-10 / 2005
  • The purpose of this study was to investigate an enforcement strategy for consumer service marketing in an e-Commerce shopping mall. An e-CRM for a cosmetic e-Commerce shopping mall, a data warehousing (DW) component, data mining analysis of the DW, and web applications and strategies were developed for marketing of consumer service satisfaction. The major findings were as follows: an RFM analysis was used for consumer classification, which is a fundamental process of e-CRM application. The components of the DW were web sales data and consumer data fields. The visual process of consumer segmentation (the superior consumer class) for e-CRM solutions is presented. The association analysis algorithm of data mining, applied to up-selling and cross-selling, yields association rules. These e-CRM results apply web DB marketing and operating principles to a shopping mall. Therefore, the system applications of e-CRM to consumer services indicate a marketing strategy for consumer-oriented management.
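
For reference, the RFM (Recency, Frequency, Monetary) classification used as the fundamental e-CRM step can be sketched in a few lines of pandas; the column names and the quantile-based scoring below are illustrative assumptions, not the study's exact method.

```python
# Sketch of RFM consumer classification from web sales data.
# Column names and the 4-level quantile scoring are assumptions.
import pandas as pd

orders = pd.read_csv('web_sales.csv', parse_dates=['order_date'])
now = orders['order_date'].max()

rfm = orders.groupby('customer_id').agg(
    recency=('order_date', lambda d: (now - d.max()).days),
    frequency=('order_id', 'count'),
    monetary=('amount', 'sum'),
)
# Score each dimension 1-4 by quantile (4 = best).
rfm['R'] = pd.qcut(rfm['recency'], 4, labels=[4, 3, 2, 1])
rfm['F'] = pd.qcut(rfm['frequency'].rank(method='first'), 4, labels=[1, 2, 3, 4])
rfm['M'] = pd.qcut(rfm['monetary'], 4, labels=[1, 2, 3, 4])

# "Superior" consumers: top scores on all three dimensions.
superior = rfm[(rfm['R'] == 4) & (rfm['F'] == 4) & (rfm['M'] == 4)]
print(superior.head())
```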

Analysis of Freight Big Data using R-Language (화물 배차 빅데이터 분석)

  • Selvaraj, Suganya;Choi, Eunmi
    • Proceedings of the Korea Information Processing Society Conference / 2018.05a / pp.320-322 / 2018
  • Data analysis is the process of generating useful information by evaluating real-world raw data to support better decisions in business development. In freight transport logistics companies, the analysis of freight data is gaining considerable importance among users for making better decisions about freight cost reduction. In this study, we therefore used the R programming language to analyze freight data collected from a freight transport logistics company. The freight rate usually varies with the day of the week. We analyzed and visualized results such as frequency of cost vs. day, frequency of requested goods (in tons) vs. day, frequency of orders vs. day, and frequency of order status vs. day for the last year of freight data. These results are beneficial to users in the ordering process.
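
The paper performs this analysis in R; a pandas equivalent of the day-of-week frequency analysis is sketched below (Python is used for consistency with the other sketches in this listing), with the file and column names as assumptions.

```python
# Python analogue of the day-of-week frequency analysis done in R.
# File and column names are illustrative assumptions.
import pandas as pd
import matplotlib.pyplot as plt

freight = pd.read_csv('freight_orders.csv', parse_dates=['order_date'])
freight['day'] = freight['order_date'].dt.day_name()

days = ['Monday', 'Tuesday', 'Wednesday', 'Thursday', 'Friday',
        'Saturday', 'Sunday']
orders_per_day = freight['day'].value_counts().reindex(days)  # orders vs. day
cost_per_day = freight.groupby('day')['cost'].mean().reindex(days)

orders_per_day.plot(kind='bar', title='Orders per day of week')
plt.tight_layout()
plt.show()
print(cost_per_day)
```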

Comparative Policy Analysis on ICT Small and Medium-sized Venture Using Cognitive Map Analysis (인지지도를 활용한 ICT 중소벤처 지원정책 비교분석)

  • Park, Eunyub;Lee, Jung Mann
    • Journal of Information Technology Applications and Management / v.29 no.3 / pp.75-93 / 2022
  • The purpose of this study is to compare and analyze each government's ICT SME support policies in response to changes in the ICT ecosystem paradigm. In particular, the core policies and policy trends of the Moon administration are presented through keyword network analysis and cognitive map analysis. The results show that core technologies such as ICT (information and communication technology), AI (artificial intelligence), big data, and 5G, which have high betweenness centrality and closeness centrality values, are major keywords with high propagation power. The cognitive map analysis shows that the opportunity factors of the 4th industrial revolution are activated through the ICT infrastructure circulation process, the domestic market circulation process, and the global market circulation process. This study is meaningful in that it demonstrates cognitive map analysis and its utilization based on scientific analysis.
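
As background, betweenness and closeness centrality over a keyword network, as used in this study, can be computed with networkx; the toy edge list below is an illustrative assumption, not the paper's data.

```python
# Sketch: betweenness/closeness centrality on a keyword co-occurrence
# network. The edge list is a toy example, not the paper's data.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ('ICT', 'AI'), ('ICT', 'Big Data'), ('ICT', '5G'),
    ('AI', 'Big Data'), ('5G', 'Big Data'), ('AI', 'SME policy'),
])

betweenness = nx.betweenness_centrality(G)
closeness = nx.closeness_centrality(G)
for node in G:
    print(f'{node:10s} betweenness={betweenness[node]:.3f} '
          f'closeness={closeness[node]:.3f}')
```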

A Study on Process and Case of RAM Analysis in Ground Weapon System Using Field-Data (야전운용제원을 활용한 지상무기체계 RAM 분석 절차 및 사례연구)

  • Park, Gyeong-Mi
    • Journal of the Korea Academia-Industrial cooperation Society / v.20 no.5 / pp.485-491 / 2019
  • In this paper, we present a process and case study of RAM analysis using field data from a ground weapon system in operation in the army. To perform RAM analysis on field data, we propose a process of data collection, data refining and calibration, and RAM analysis. The RAM analysis was performed with the RAM verification and evaluation system developed by the Defense Agency for Technology and Quality. We enhance the objectivity and reliability of the resulting data, which reflect a variety of conditions: the operation and maintenance concept of the domestic ground weapon system, relevant regulations, and after-sales service data from the developer. The results are compared across the 2015 values, the 2018 values, and the development RAM values, and are validated through expert discussion. The study shows that the proposed method can effectively apply a field database, from setup to evaluation of RAM values, to various ground weapon systems.
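
For background, the core RAM quantities estimated from refined field data come down to a few standard formulas (MTBF, MTTR, inherent availability); the sketch below computes them from hypothetical field records, not the paper's actual data.

```python
# Sketch: basic RAM metrics from refined field data.
# The record values are illustrative assumptions.
operating_hours = 12_000.0   # total operating time across the fleet
failures = 48                # observed failures after data refining
total_repair_hours = 96.0    # cumulative corrective maintenance time

mtbf = operating_hours / failures      # mean time between failures
mttr = total_repair_hours / failures   # mean time to repair
availability = mtbf / (mtbf + mttr)    # inherent availability

print(f'MTBF = {mtbf:.1f} h, MTTR = {mttr:.1f} h, Ai = {availability:.4f}')
```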

A Tool to Support Personal Software Process (개인 소프트웨어 프로세스 지원을 위한 도구)

  • Shin, Hyun-Il;Jung, Kyoung-Hak;Song, Il-Sun;Choi, Ho-Jin;Baik, Jong-Moon
    • Journal of KIISE: Software and Applications / v.34 no.8 / pp.752-762 / 2007
  • The PSP (Personal Software Process) was developed to help developers make high-quality products by improving their personal processes. With the consistent measurement and analysis activities that the PSP suggests, developers can identify process deficiencies and make reliable estimates of effort and quality. However, due to the high overhead and context-switching problems of manual data recording, developers have difficulty collecting reliable data, which can lead to wrong analysis results. In addition, the paper-based process guides of the PSP are inconvenient for navigating process information and make it difficult to attach additional information. In this paper, we introduce a PSP support tool developed to handle these problems. The tool provides automated data collection facilities to help acquire reliable data, an EPG (Electronic Process Guide) for the PSP that provides easy access to and navigation of process information, and an experience repository that stores development experience as additional information about the process.
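
As a toy illustration of automated effort data collection (the manual-recording problem the tool's facilities address), a Python decorator can time a task and append it to a log without manual entry; the log format here is an assumption, not the tool's actual design.

```python
# Sketch: automated effort logging in the spirit of reducing the
# manual-recording overhead described above. The CSV log format
# is an illustrative assumption.
import csv
import functools
import time

def log_effort(logfile='psp_effort_log.csv'):
    """Decorator that records wall-clock effort for a task."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.time()
            try:
                return func(*args, **kwargs)
            finally:
                elapsed = time.time() - start
                with open(logfile, 'a', newline='') as f:
                    csv.writer(f).writerow([func.__name__, f'{elapsed:.1f}'])
        return wrapper
    return decorator

@log_effort()
def design_review():
    time.sleep(0.1)  # stand-in for real work

design_review()
```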

Information Modeling for Finite Element Analysis Using STEP (STEP을 이용한 유한요소해석 정보모델 구축)

  • Choi, Young;Cho, Seong-Wook;Kwon, Ki-Eak
    • Korean Journal of Computational Design and Engineering / v.3 no.1 / pp.48-56 / 1998
  • Finite element analysis (FEA) is very important in mechanical engineering design and analysis. The FEA process encompasses shape modeling, mesh generation, matrix solving, and post-processing. Some of these steps can be tightly integrated with current software architectures and data-sharing modes. However, complete integration of the FEA process itself, and its integration with manufacturing processes, is almost impossible in current practice. The barriers are inconsistent data formats and enterprise-wide software integration technology. In this research, an information model based on STEP AP209 was chosen for handling finite element analysis data. An international standard for FEA data can bridge the gap between design, analysis, and manufacturing processes. The STEP-based FEA system can be further integrated into a distributed software and database environment using CORBA technology. A prototype FEA system, DICESS, was implemented to verify the proposed concepts.

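STEP data is exchanged as ISO 10303-21 ("Part 21") text files containing instance lines such as #12=CARTESIAN_POINT('',(0.,0.,0.));. As a first, heavily simplified step toward reading AP209 analysis data, the sketch below counts entity types in such a file; real AP209 processing requires a full EXPRESS-schema-aware parser.

```python
# Minimal sketch: scan a STEP Part 21 file and count entity types.
# This only handles simple single-line instances; multi-line
# instances and the full grammar are out of scope.
import re
from collections import Counter

ENTITY = re.compile(r'#\d+\s*=\s*([A-Z0-9_]+)\s*\(')

def entity_histogram(path):
    counts = Counter()
    with open(path) as f:
        for line in f:
            m = ENTITY.match(line.strip())
            if m:
                counts[m.group(1)] += 1
    return counts

if __name__ == '__main__':
    for name, n in entity_histogram('model.stp').most_common(10):
        print(f'{name:30s} {n}')
```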

A Study on Searching Stabled EMI Shielding Effectiveness Measurement Point for Military Communication Shelter Using Support Vector Machine and Process Capability Analysis (서포트 벡터 머신과 공정능력분석을 이용한 군 통신 쉘터의 EMI 차폐효과 안정 포인트 탐색 연구)

  • Ku, Ki-Beom;Kwon, Jae-Wook;Jin, Hong-Sik
    • Journal of the Korea Academia-Industrial cooperation Society / v.20 no.2 / pp.321-328 / 2019
  • A military shelter for communication and information is necessary to optimize the integrated combat capability of weapon systems in network-centric warfare, so the shelter requires EMI shielding performance. This study examines stable measurement points for the EMI shielding effectiveness of a military shelter for communication and information. The measurement points were found by analyzing EMI shielding effectiveness measurement data with a data mining technique and process capability analysis. First, a support vector machine was used to separate measurement points with stable EMI shielding effectiveness according to a set condition. Second, the same process was conducted with process capability analysis. Finally, the results of the data mining technique were compared with those of the process capability analysis. As a result, 24 measurement points with stable EMI shielding effectiveness were found.
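
The two analyses combined in this study, an SVM separating stable from unstable measurement points and a process capability index (Cpk), can each be sketched briefly; the feature layout, stability labels, and specification limits below are toy assumptions.

```python
# Sketch: SVM classification of measurement points plus a Cpk
# computation for one point. Data layout and labels are assumptions.
import numpy as np
from sklearn.svm import SVC

# Features per measurement point (e.g., mean/std of shielding dB),
# with labels 1 = stable, 0 = unstable (toy data).
X = np.array([[82, 1.2], [79, 4.8], [85, 0.9], [76, 5.5], [83, 1.5]])
y = np.array([1, 0, 1, 0, 1])
clf = SVC(kernel='rbf').fit(X, y)
print('predicted stable?', clf.predict([[81, 1.1]]))

# Process capability index Cpk for one point's repeated measurements,
# against hypothetical spec limits for shielding effectiveness.
samples = np.array([81.5, 82.0, 81.8, 82.3, 81.9])
lsl, usl = 60.0, 100.0
mu, sigma = samples.mean(), samples.std(ddof=1)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)
print(f'Cpk = {cpk:.2f}')
```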

Synthesis of Human Body Shape for Given Body Sizes using 3D Body Scan Data (3차원 스캔 데이터를 이용하여 임의의 신체 치수에 대응하는 인체 형상 모델 생성 방법)

  • Jang, Tae-Ho;Baek, Seung-Yeob;Lee, Kun-Woo
    • Korean Journal of Computational Design and Engineering / v.14 no.6 / pp.364-373 / 2009
  • In this paper, we suggest a method for constructing a parameterized human body model with any required body sizes from 3D scan data. Thanks to well-developed 3D scanning technology, more detailed human body data are available, allowing precise human models to be generated, and much research in this field is performed with 3D scan data. However, previous approaches have limitations: they require too much time for hole-filling or for computing the model parameterization, and they often omit a verification step. To solve these problems, we first chose 125 suitable 3D scans from the 5th Korean body size survey (Size Korea) according to age, height, and weight. We then applied post-processing (feature point setting, RBF interpolation, and alignment) to parameterize the human model, and principal component analysis was applied to the post-processed data to obtain dominant shape parameters. These steps reduce processing time without loss of accuracy. Finally, we compared the results with the statistical data of Size Korea to verify the parameterized human model.
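
The shape-parameterization step, PCA over aligned and post-processed vertex data to extract dominant shape modes, can be sketched as follows; random arrays stand in for the 125 real scans, and the array shapes are assumptions.

```python
# Sketch: PCA over aligned body-scan vertices to extract dominant
# shape parameters. Random data stands in for the 125 real scans.
import numpy as np

n_scans, n_vertices = 125, 5000
# Each row: one scan's aligned vertices flattened to (x, y, z, ...).
shapes = np.random.rand(n_scans, n_vertices * 3)

mean_shape = shapes.mean(axis=0)
centered = shapes - mean_shape
# SVD of the centered data gives the principal shape modes.
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

k = 10                        # keep the first k dominant modes
modes = Vt[:k]                # (k, 3*n_vertices) shape basis
weights = centered @ modes.T  # per-scan shape parameters

# Reconstruct a body from one scan's shape parameters.
new_body = mean_shape + weights[0] @ modes
print(new_body.shape)
```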