• Title/Summary/Keyword: Data Analysis Module

Search results: 874 (processing time: 0.028 seconds)

The Design and Implementation of Parameter Extraction System for Analyzing Internet Using SNMP (SNMP를 이용한 인터넷 분석 파라미터 추출 시스템의 설계 및 구현)

  • Sin, Sang-Cheol;An, Seong-Jin;Jeong, Jin-Uk
    • The Transactions of the Korea Information Processing Society / v.6 no.3 / pp.710-721 / 1999
  • In this paper, we design and implement a parameter extraction system for analyzing the Internet using SNMP. The extraction system has two modules: a collection request module and an analysis request module. The collection request module generates a polling script that periodically collects management information from the managed systems. From the collected data, the analysis request module extracts analysis parameters for traffic flow analysis, interface traffic analysis, packet traffic analysis, and management traffic analysis. For management activity, we introduce a two-step analysis view. The first, the Summary-View, is used to find malfunctioning systems among all managed systems; the second, the Specific-View, analyzes a specific system using all of our analysis parameters. The extracted parameters can serve as indicators for line capacity planning, network redesign, and decisions on performance upgrades for network devices.

  • PDF
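The analysis parameters described above are derived from periodically polled SNMP counters. As a minimal sketch of the "interface traffic analysis" step, the snippet below turns two polls of the 32-bit `ifInOctets` counter into link utilization; the function and parameter names are illustrative, not from the paper.

```python
# Sketch of an interface traffic analysis step: given two periodic polls of
# the SNMP ifInOctets counter (a 32-bit wrapping counter), derive link
# utilization. Names here are illustrative assumptions.

COUNTER32_MAX = 2**32  # ifInOctets is a 32-bit counter that wraps to 0

def octets_delta(prev: int, curr: int) -> int:
    """Octets transferred between two polls, handling counter wraparound."""
    if curr >= prev:
        return curr - prev
    return COUNTER32_MAX - prev + curr

def utilization(prev: int, curr: int, interval_s: float, if_speed_bps: float) -> float:
    """Input utilization (0.0-1.0) of an interface over one polling interval."""
    bits = octets_delta(prev, curr) * 8
    return bits / (interval_s * if_speed_bps)

# Example: two polls 300 s apart on a 10 Mbit/s link
u = utilization(prev=1_000_000, curr=38_500_000, interval_s=300, if_speed_bps=10_000_000)
print(round(u, 3))  # 0.1
```

In a real polling script, `prev` and `curr` would come from successive SNMP GET requests against the same interface's `ifInOctets` object.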

A Development of a Precision Underwater Data Acquisition System (정밀수중자료획득 장치 개발)

  • Kim, Y.I.;Yoon, K.H.;Park, S.S.
    • Proceedings of the Korean Society of Marine Engineers Conference / 2006.06a / pp.213-214 / 2006
  • This paper describes a system that acquires several kinds of underwater information. The system is composed of a Sensor Interface Module (SIM), a Main Control Module (MCM), a Precision Sensor Driver (PSD), a Power Management Module (PMM), and a data analysis program.

  • PDF

Design and Implementation of MongoDB-based Unstructured Log Processing System over Cloud Computing Environment (클라우드 환경에서 MongoDB 기반의 비정형 로그 처리 시스템 설계 및 구현)

  • Kim, Myoungjin;Han, Seungho;Cui, Yun;Lee, Hanku
    • Journal of Internet Computing and Services / v.14 no.6 / pp.71-84 / 2013
  • Log data, which record the multitude of information created when operating computer systems, are utilized in many processes, from carrying out computer system inspection and process optimization to providing customized user optimization. In this paper, we propose a MongoDB-based unstructured log processing system in a cloud environment for processing the massive amount of log data of banks. Most of the log data generated during banking operations come from handling a client's business. Therefore, in order to gather, store, categorize, and analyze the log data generated while processing the client's business, a separate log data processing system needs to be established. However, the realization of flexible storage expansion functions for processing a massive amount of unstructured log data and executing a considerable number of functions to categorize and analyze the stored unstructured log data is difficult in existing computer environments. Thus, in this study, we use cloud computing technology to realize a cloud-based log data processing system for processing unstructured log data that are difficult to process using the existing computing infrastructure's analysis tools and management system. The proposed system uses the IaaS (Infrastructure as a Service) cloud environment to provide a flexible expansion of computing resources and includes the ability to flexibly expand resources such as storage space and memory under conditions such as extended storage or rapid increase in log data. Moreover, to overcome the processing limits of the existing analysis tool when a real-time analysis of the aggregated unstructured log data is required, the proposed system includes a Hadoop-based analysis module for quick and reliable parallel-distributed processing of the massive amount of log data. 
Furthermore, because the HDFS (Hadoop Distributed File System) stores data by generating copies of the block units of the aggregated log data, the proposed system offers automatic restore functions so that it can continue to operate after recovering from a malfunction. Finally, by establishing a distributed database using the NoSQL-based MongoDB, the proposed system provides methods for effectively processing unstructured log data. Relational databases such as MySQL have complex schemas that are inappropriate for processing unstructured log data; moreover, their strict schemas make it difficult to expand nodes when the stored data must be distributed across various nodes as the amount of data rapidly increases. NoSQL does not provide the complex computations that relational databases may provide, but it can easily expand the database through node dispersion when the amount of data increases rapidly; it is a non-relational database with a structure appropriate for processing unstructured data. NoSQL data models are usually classified as Key-Value, column-oriented, and document-oriented types. Of these, the proposed system uses MongoDB, a representative document-oriented database with a free schema structure. MongoDB is adopted because its flexible schema makes it easy to process unstructured log data, it facilitates node expansion when the amount of data grows rapidly, and it provides an Auto-Sharding function that automatically expands storage. The proposed system is composed of a log collector module, a log graph generator module, a MongoDB module, a Hadoop-based analysis module, and a MySQL module.
When the log data generated over the entire client business process of each bank are sent to the cloud server, the log collector module collects and classifies the data according to log type and distributes them to the MongoDB module and the MySQL module. The log graph generator module generates the log analysis results of the MongoDB module, the Hadoop-based analysis module, and the MySQL module per analysis time and type of the aggregated log data, and provides them to the user through a web interface. Log data that require real-time analysis are stored in the MySQL module and served in real time by the log graph generator module. The log data aggregated per unit time are stored in the MongoDB module and plotted in graphs according to the user's analysis conditions. The aggregated log data in the MongoDB module are processed in a parallel-distributed manner by the Hadoop-based analysis module. A comparative evaluation of log data insertion and query performance is carried out against a log data processing system that uses only MySQL, demonstrating the proposed system's superiority. Moreover, an optimal chunk size is confirmed through a MongoDB log data insertion performance evaluation over various chunk sizes.
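The log collector's first job, as described above, is to classify incoming records by type and route them either to the real-time store (MySQL in the paper) or the aggregate store (MongoDB). The sketch below shows that classify-and-route step in miniature; the record format and the "real-time" type names are illustrative assumptions, not the paper's schema.

```python
# Minimal sketch of a log collector's classify-and-route step: incoming bank
# log records are split between a real-time store and a batch/aggregate
# store. Record fields and type names are illustrative assumptions.

from collections import defaultdict

REALTIME_TYPES = {"transaction", "auth_failure"}  # hypothetical type names

def route_logs(records):
    """Group records into 'mysql' (real-time analysis) and 'mongodb' (batch)."""
    routed = defaultdict(list)
    for rec in records:
        target = "mysql" if rec["type"] in REALTIME_TYPES else "mongodb"
        routed[target].append(rec)
    return routed

logs = [
    {"type": "transaction", "msg": "transfer ok"},
    {"type": "batch_job", "msg": "nightly settlement"},
    {"type": "auth_failure", "msg": "bad PIN"},
]
routed = route_logs(logs)
print(len(routed["mysql"]), len(routed["mongodb"]))  # 2 1
```

In the full system, each routed batch would then be inserted into the corresponding database module rather than held in memory.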

A Study on the Real-Time Monitoring System of Wind Power in Jeju (제주지역 풍력발전량 실시간 감시 시스템 구축에 관한 연구)

  • Kim, Kyoung-Bo;Yang, Kyung-Bu;Park, Yun-Ho;Mun, Chang-Eun;Park, Jeong-Keun;Huh, Jong-Chul
    • Journal of the Korean Solar Energy Society / v.30 no.3 / pp.25-32 / 2010
  • A real-time monitoring system was developed for the transfer, receipt, backup, and analysis of wind power data at three wind farms (Hangwon, Hankyung, and Sungsan) in Jeju. For this monitoring system, a communication system analysis was performed, and a data collection and transmission module, a database, and a data analysis and management module were developed. These modules deal with mechanical, electrical, and environmental problems. In particular, the data analysis and management module automatically generates time-series graphics, which make raw-data analysis easier. The real-time monitoring system is also connected to a wind power forecasting system through the web, so that data can be transferred to the forecasting system's database.

Analysis of TCP packet by Protocol Analysis module Design (프로토콜 분석모듈 설계에 의한 TCP 패킷 분석)

  • Eom, Gum-Yong
    • Proceedings of the KIEE Conference / 2004.11c / pp.234-236 / 2004
  • The Transmission Control Protocol (TCP) is a core protocol of the Internet. TCP assumes relatively few transmission errors and was designed for wired environments. It uses a three-way handshake, flow control through the window size, transmission control through acknowledgments, and a sliding window for packet delivery. In this study, we designed a TCP packet analysis module to analyze TCP segments and extract accurate information about TCP. Using the designed module, we captured TCP traffic on the Internet and analyzed the composition of TCP segments. Through this, we could obtain accurate protocol-level information from the network.

  • PDF
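The core of a TCP analysis module like the one described above is decoding the fields of each captured segment. The sketch below unpacks the fixed 20-byte TCP header per the RFC 793 layout; the function name and returned dictionary are illustrative, not the paper's design.

```python
# Decoding the fixed 20-byte TCP header from a captured segment.
# Field layout follows RFC 793; the function name is illustrative.

import struct

def parse_tcp_header(segment: bytes) -> dict:
    """Decode the fixed part of a TCP header (first 20 bytes of a segment)."""
    (src, dst, seq, ack, offset_reserved, flags,
     window, checksum, urgent) = struct.unpack("!HHIIBBHHH", segment[:20])
    return {
        "src_port": src,
        "dst_port": dst,
        "seq": seq,
        "ack": ack,
        "data_offset": offset_reserved >> 4,   # header length in 32-bit words
        "flags": {
            "FIN": bool(flags & 0x01),
            "SYN": bool(flags & 0x02),
            "PSH": bool(flags & 0x08),
            "ACK": bool(flags & 0x10),
        },
        "window": window,
    }

# Example: a hand-built PSH+ACK segment header
hdr = struct.pack("!HHIIBBHHH", 80, 54321, 1000, 2000, 5 << 4, 0x18, 65535, 0, 0)
info = parse_tcp_header(hdr)
print(info["src_port"], info["flags"]["ACK"], info["flags"]["PSH"])  # 80 True True
```

A segment with a data offset greater than 5 carries TCP options between this fixed header and the payload.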

Development of an Unsteady Aerodynamic Analysis Module for Rotor Comprehensive Analysis Code

  • Lee, Joon-Bae;Yee, Kwan-Jung;Oh, Se-Jong;Kim, Do-Hyung
    • International Journal of Aeronautical and Space Sciences / v.10 no.2 / pp.23-33 / 2009
  • The inherent aeromechanical complexity of a rotor system necessitates a comprehensive analysis code for helicopter rotor systems. In the present study, an aerodynamic analysis module has been developed as part of a rotorcraft comprehensive analysis program. The aerodynamic analysis module is divided into an airload calculation routine and an inflow analysis routine. For airload calculation, a quasi-steady analysis model is employed based on the blade element method with corrections for unsteady aerodynamic effects. To take unsteady effects (body motion effects and dynamic stall) into account, the aerodynamic coefficients are corrected using the Leishman-Beddoes unsteady model. Various inflow and vortex wake models are implemented in the aerodynamic module to consider wake-induced inflow; specifically, linear inflow, dynamic inflow, prescribed wake, and free wake models are integrated into the present module. The aerodynamic characteristics of each method are compared and validated against available experimental data, such as Elliot's induced inflow distribution and the sectional normal force coefficients of the AH-1G. To validate the unsteady aerodynamic model, a 2-D unsteady model for the NACA0012 airfoil is validated against the aerodynamic coefficients from McAlister's experimental data.
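The airload routine described above rests on the blade element method: each spanwise element's local velocity and effective angle of attack yield a sectional load. The sketch below shows that quasi-steady step with the thin-airfoil lift slope (2π per radian) standing in for the Leishman-Beddoes-corrected coefficients of the paper; all names and numbers are illustrative.

```python
# Sketch of a blade element airload step: sectional lift per element from
# local velocity and effective angle of attack. The thin-airfoil lift slope
# is a simplifying assumption, not the paper's corrected model.

import math

def element_lift(rho, omega, r, dr, chord, theta, v_inflow):
    """Lift (N) on one blade element at radius r, quasi-steady model."""
    u_t = omega * r                    # tangential velocity at the element
    phi = math.atan2(v_inflow, u_t)    # inflow angle
    alpha = theta - phi                # effective angle of attack (rad)
    cl = 2.0 * math.pi * alpha         # thin-airfoil lift coefficient
    v2 = u_t**2 + v_inflow**2
    return 0.5 * rho * v2 * chord * cl * dr

# Example: one element of a hovering rotor blade
dL = element_lift(rho=1.225, omega=30.0, r=4.0, dr=0.1,
                  chord=0.4, theta=math.radians(8.0), v_inflow=10.0)
print(dL > 0)  # positive lift for positive effective angle of attack
```

Summing such elements along the span, with an inflow model supplying `v_inflow` at each station, gives the blade airload distribution.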

Biaxial Accelerometer-based Magnetic Compass Module Calibration and Analysis of Azimuth Computational Errors Caused by Accelerometer Errors (2 축 가속도계 기반 지자기 센서 모듈의 교정 및 가속도계 오차에 의한 방위각 계산 오차 분석)

  • Cho, Seong Yun
    • Journal of Institute of Control, Robotics and Systems / v.20 no.2 / pp.149-156 / 2014
  • A magnetic compass module must be calibrated accurately before use, and the calibration process must take magnetic dip into account if the module has tilt angles. For this, a calibration method for a magnetic compass module is explained. The tilt error of a magnetic compass module is generally compensated using a biaxial accelerometer. The accelerometer error causes a tilt angle calculation error, which in turn gives rise to an azimuth calculation error. For error property analysis, error equations are derived and simulations are performed; the simulation results verify the accuracy of the derived error equations. If a biaxial magnetic compass module is used instead of a triaxial one, the magnetic dip and the z-axis magnetic compass data must be estimated for tilt compensation. Lastly, estimation equations for the magnetic dip and the z-axis magnetic compass data are derived, and their performance is verified through simulation.
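The tilt compensation discussed above follows a standard pattern: derive roll and pitch from the accelerometer, rotate the magnetometer readings into the horizontal plane, and take the azimuth from the horizontal components. The sketch below uses one common axis/sign convention; these conventions vary between sensor modules and are assumptions here, not the paper's equations.

```python
# Sketch of tilt-compensated azimuth computation: roll/pitch from a biaxial
# accelerometer, then heading from magnetometer axes rotated level. Axis and
# sign conventions below are one common choice, assumed for illustration.

import math

def tilt_angles(ax, ay, g=9.81):
    """Roll and pitch (rad) from biaxial accelerometer outputs (m/s^2)."""
    pitch = math.asin(max(-1.0, min(1.0, ax / g)))
    roll = math.asin(max(-1.0, min(1.0, -ay / (g * math.cos(pitch)))))
    return roll, pitch

def azimuth(mx, my, mz, roll, pitch):
    """Tilt-compensated heading (deg). mz must come from a third magnetometer
    axis or, as in the biaxial case above, be estimated via magnetic dip."""
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.degrees(math.atan2(-yh, xh)) % 360.0

# Level module pointing at magnetic north: heading ~ 0 degrees
roll, pitch = tilt_angles(ax=0.0, ay=0.0)
print(round(azimuth(mx=1.0, my=0.0, mz=0.5, roll=roll, pitch=pitch), 1))
```

The paper's error analysis asks what happens when `ax` and `ay` carry sensor errors: the roll/pitch errors they induce propagate through `xh` and `yh` into the computed azimuth.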

BIM-Based Integrated Module for Apartment Environmental Performance and Energy Analysis (BIM기반 공동주택 환경성능 및 에너지 해석 시스템 통합 개발)

  • Suh, Hye-Soo;Lee, Soo-Hyun;Lim, Jae-Sang;Choi, Cheol-Ho
    • Journal of KIBIM / v.4 no.2 / pp.1-9 / 2014
  • As interest in green building has increased, the construction market has shifted toward BIM-based architecture, and BIM-based technologies have been developed in parallel. Consequently, environmental analysis software that utilizes BIM data has become essential. This study shows that a BIM-based integrated module provides objective analysis to support quick decision-making on a proposal. The integrated module creates a model from BIM data to analyze and report residential environment and energy consumption factors such as daylight, view, ventilation, and privacy, so that BIM technology can be applied practically from the schematic design stage.

3D Parametric Modeling of RC Piers and Development of Data Generation Module for a Structural Analysis with 3D Model of RC Piers (RC 교각의 3차원 매개변수 모델링 및 비선형 구조해석 입력 데이터 생성 모듈 구축)

  • Son, You-Jin;Shin, Won-Chul;Lee, Sang-Chul;Lee, Heon-Min;Shin, Hyun-Mock
    • Journal of KIBIM / v.3 no.3 / pp.19-28 / 2013
  • In Korean highway bridges, most piers are of the one-column or multi-column type. In this study, carried out as BIM adoption accelerates, a 3D parametric modeling method was adopted for modeling one-column and two-column piers, in order to support research on two-column piers subjected to bidirectional seismic loading. An interface module between the 3D pier model and structural analysis input data was also developed. The module can create the input data for nonlinear structural analysis, such as material and geometric properties and additional coefficients.
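An interface module of the kind described above essentially maps a small set of pier parameters onto an analysis input record. The sketch below illustrates that mapping; the parameter names and output layout are hypothetical and do not reproduce the paper's actual input format.

```python
# Sketch of a parametric-model-to-analysis-input mapping: a few pier
# parameters become a material/geometry input record for nonlinear
# structural analysis. All names and the output layout are illustrative.

import math

def pier_analysis_input(column_height, column_diameter, n_columns,
                        fck_mpa, fy_mpa, rebar_ratio):
    """Build a material/geometry input record for one RC pier."""
    area = math.pi * (column_diameter / 2.0) ** 2
    return {
        "geometry": {
            "n_columns": n_columns,
            "height_m": column_height,
            "section_area_m2": round(area, 4),
        },
        "material": {
            "concrete_fck_MPa": fck_mpa,
            "rebar_fy_MPa": fy_mpa,
            "rebar_ratio": rebar_ratio,
        },
    }

# Example: a two-column pier, the configuration targeted by the study
data = pier_analysis_input(column_height=8.0, column_diameter=1.5,
                           n_columns=2, fck_mpa=27.0, fy_mpa=400.0,
                           rebar_ratio=0.015)
print(data["geometry"]["n_columns"], data["geometry"]["section_area_m2"])
```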

Thermo-Mechanical Fatigue Analysis of Ribbon Wire/Ag Electrode Interfaces for PV Module

  • Park, No-Chang;Hong, Won-Sik;Han, Chang-Un;Kim, Dong-Hwan
    • Proceedings of the Materials Research Society of Korea Conference / 2011.05a / pp.48.1-48.1 / 2011
  • In this presentation, we monitored weather data such as global irradiance, ambient temperature, PV module temperature, relative humidity, and wind speed for two years in order to determine accelerated test conditions. We then determined the temperature limits of the accelerated test from the weather data and FEM analysis; the detailed procedures are summarized in this work. After analyzing outdoor stresses such as thermal stress, we identified the main failure modes and mechanisms of the PV module, especially at the solder joints of the ribbon wire. We measured material properties such as the thermal expansion coefficient to plan the accelerated test, designed the accelerated test based on the FEM analysis results, and carried out a thermal cycling test with a one-cell mini module for three months. We monitored changes in electrical performance (Voc, Isc, Pmax, etc.) every week and then analyzed the ribbon wire/electrode interfaces. Detailed results are summarized in this work.

  • PDF
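The thermal-cycling damage mechanism discussed above stems from the thermal expansion mismatch between the copper ribbon wire and the silicon cell, which strains the solder joint on every cycle. The snippet below gives a first-order elastic estimate of that stress; the material values are typical textbook numbers, not the paper's measured properties.

```python
# First-order sketch of the thermo-mechanical stress mechanism: CTE mismatch
# between ribbon wire and cell produces strain in the solder joint during
# each thermal cycle. Material values are typical, not the paper's data.

def thermal_mismatch_stress(e_pa, alpha_ribbon, alpha_cell, delta_t):
    """Elastic stress estimate (Pa) from fully constrained CTE-mismatch strain."""
    strain = (alpha_ribbon - alpha_cell) * delta_t
    return e_pa * strain

# Copper ribbon on silicon, 100 K thermal cycle, joint modulus ~40 GPa
sigma = thermal_mismatch_stress(e_pa=40e9,
                                alpha_ribbon=17e-6,   # copper, 1/K
                                alpha_cell=2.6e-6,    # silicon, 1/K
                                delta_t=100.0)
print(round(sigma / 1e6, 1))  # stress in MPa
```

An FEM analysis like the paper's refines this estimate by resolving how the strain actually distributes through the joint geometry rather than assuming full constraint.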