Title/Summary/Keyword: Event Data


Design of Optimized Type-2 Fuzzy RBFNN Echo Pattern Classifier Using Meteorological Radar Data (기상레이더를 이용한 최적화된 Type-2 퍼지 RBFNN 에코 패턴분류기 설계)

  • Song, Chan-Seok;Lee, Seung-Chul;Oh, Sung-Kwun
    • The Transactions of The Korean Institute of Electrical Engineers / v.64 no.6 / pp.922-934 / 2015
  • In this paper, classification between precipitation echo (PRE) and non-precipitation echo (N-PRE, including ground echo and clear echo) is carried out from weather radar data using a neuro-fuzzy algorithm. To distinguish PRE from N-PRE, input variables are built up through characteristic analysis of the radar data. First, an event classifier is designed as the first classification step to separate precipitation events from non-precipitation events, using RBFNN input variables such as DZ, frequency of DZ (DZ_FR), SDZ, frequency of SDZ (SDZ_FR), VGZ, and frequency of VGZ (VGZ_FR). After event classification, the non-precipitation echo remaining within precipitation events is removed by the echo classifier of the second step, which is built as Type-2 FCM-based RBFNNs. The parameters of the classification system are obtained by particle swarm optimization (PSO) for effective performance. The performance of the proposed echo classifier is compared with CZ. The proposed architecture, which uses the event classifier together with the Interval Type-2 FCM-based RBFNN echo classifier, shows superior output performance compared with a conventional echo classifier based on RBFNN.
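The RBFNN structure the abstract describes, a layer of Gaussian basis functions followed by a linear output whose sign selects the class, can be sketched as below. The centers, width, and weights here are hypothetical placeholders for values the paper learns via FCM clustering and PSO; this is a minimal illustration, not the paper's model.

```python
import math

def rbf_layer(x, centers, sigma):
    # Gaussian activation of each hidden unit around its center
    return [math.exp(-sum((xi - ci) ** 2 for xi, ci in zip(x, c)) / (2 * sigma ** 2))
            for c in centers]

def classify(x, centers, sigma, weights):
    # Linear output layer; the sign decides PRE (+1) vs N-PRE (-1)
    phi = rbf_layer(x, centers, sigma)
    score = sum(w * p for w, p in zip(weights, phi))
    return 1 if score >= 0 else -1

# Toy 2-D example with one center per class (illustrative values only)
centers = [(0.0, 0.0), (3.0, 3.0)]
weights = [1.0, -1.0]   # first center votes PRE, second votes N-PRE
print(classify((0.2, -0.1), centers, 1.0, weights))  # near first center -> 1
```

In the paper the input vector would instead hold the radar-derived features (DZ, DZ_FR, SDZ, and so on), and the Type-2 fuzzy extension replaces the crisp Gaussian memberships with interval-valued ones.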

EDF: An Interactive Tool for Event Log Generation for Enabling Process Mining in Small and Medium-sized Enterprises

  • Frans Prathama;Seokrae Won;Iq Reviessay Pulshashi;Riska Asriana Sutrisnowati
    • Journal of the Korea Society of Computer and Information / v.29 no.6 / pp.101-112 / 2024
  • In this paper, we present EDF (Event Data Factory), an interactive tool designed to assist event log generation for process mining. EDF integrates various data connectors to improve its capability to assist users in connecting to diverse data sources. Our tool employs low-code/no-code technology, along with graph-based visualization, to help non-expert users understand process flow and enhance the user experience. By utilizing metadata information, EDF allows users to efficiently generate an event log containing case, activity, and timestamp attributes. Through log quality metrics, our tool enables users to assess the generated event log quality. We implement EDF under a cloud-based architecture and run a performance evaluation. Our case study and results demonstrate the usability and applicability of EDF. Finally, an observational study confirms that EDF is easy to use and beneficial, expanding small and medium-sized enterprises' (SMEs) access to process mining applications.
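The core mapping EDF automates, choosing which source columns serve as case id, activity, and timestamp, can be sketched as follows. The column names and rows are hypothetical; EDF itself does this through data connectors and a no-code interface rather than hand-written code.

```python
from datetime import datetime

# Illustrative raw source rows (column names are made up for this sketch)
raw_rows = [
    {"order_no": "O-2", "step": "ship",    "ts": "2024-06-02 09:00"},
    {"order_no": "O-1", "step": "create",  "ts": "2024-06-01 10:00"},
    {"order_no": "O-1", "step": "approve", "ts": "2024-06-01 12:30"},
]

def to_event_log(rows, case_col, act_col, ts_col):
    # Map each raw row to the three mandatory event-log attributes
    log = [{"case": r[case_col], "activity": r[act_col],
            "timestamp": datetime.strptime(r[ts_col], "%Y-%m-%d %H:%M")}
           for r in rows]
    # Process-mining tools expect events ordered within each case
    return sorted(log, key=lambda e: (e["case"], e["timestamp"]))

log = to_event_log(raw_rows, "order_no", "step", "ts")
print([e["activity"] for e in log])  # ['create', 'approve', 'ship']
```

Log-quality checks like those EDF reports would then run over this structure, for example counting events with missing timestamps or cases with a single event.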

Proposing a New Approach for Detecting Malware Based on the Event Analysis Technique

  • Vu Ngoc Son
    • International Journal of Computer Science & Network Security / v.23 no.12 / pp.107-114 / 2023
  • Malware distribution is a dangerous attack technique that is difficult to detect and prevent. Current malware detection studies and proposals are typically based on two main methods: signature sets, and analysis of abnormal behaviors using machine learning or deep learning techniques. This paper proposes a method to detect malware on endpoints based on Event IDs using deep learning. Event IDs are malware behaviors tracked and collected in the endpoint operating system kernel. Malware detection based on Event IDs is a new research approach that has not yet been widely studied. To achieve this, the paper combines different data mining methods with deep learning algorithms; the data mining process is presented in detail in Section 2 of the paper.
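One common way to prepare Event-ID traces for a learning model, offered here as a hedged sketch rather than the paper's actual pipeline, is to encode each endpoint's trace as a normalized frequency vector over a fixed Event-ID vocabulary; a deep model then classifies these vectors. The Event IDs below are illustrative examples, not taken from the paper.

```python
from collections import Counter

VOCAB = [4624, 4688, 7045, 4104]   # example Windows Event IDs (illustrative)

def encode_trace(event_ids):
    # Normalized frequency of each vocabulary Event ID in the trace
    counts = Counter(event_ids)
    total = max(len(event_ids), 1)
    return [counts[eid] / total for eid in VOCAB]

vec = encode_trace([4688, 4688, 7045, 4624])
print(vec)  # [0.25, 0.5, 0.25, 0.0]
```

Sequence models (e.g. recurrent networks) would instead consume the ordered ID sequence directly; the frequency encoding is simply the most compact starting point.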

Prediction of EPB tunnelling performance for various grounds in Korea using discrete event simulation

  • Young Jin Shin;Jae Won Lee;Juhyi Yim;Han Byul Kang;Jae Hoon Jung;Jun Kyung Park
    • Geomechanics and Engineering / v.38 no.5 / pp.467-476 / 2024
  • This study investigates Tunnel Boring Machine (TBM) performance prediction using discrete event simulation, a technique whose stochastic nature is well suited to the complexity of TBM tunnelling activities. A new discrete event simulation model was developed in AnyLogic software and validated by comparing its results with actual performance data from the Daegok-Sosa railway project in Korea, where an Earth Pressure Balance (EPB) TBM was used. The results show that TBM performance can be predicted successfully, although this requires establishing a high-quality database covering geological formations, machine specifications, and operation settings. Additionally, this paper introduces a novel methodology for daily performance updates during construction using automated data processing techniques, enabling daily updates and predictions for ongoing projects and offering valuable insights for construction management. Overall, this study underlines the potential of discrete event simulation for predicting TBM performance, its applicability to other tunnelling projects, and the importance of continual database expansion for future model enhancements.
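The essence of a discrete event simulation is an event queue processed in time order. The toy model below, a loose sketch and not the paper's AnyLogic model, alternates excavation and ring-build events per ring, with durations drawn from assumed distributions standing in for the project database.

```python
import heapq
import random

def simulate_rings(n_rings, seed=1):
    # Event-queue simulation: each ring needs an excavate event then a build event
    random.seed(seed)
    clock, events = 0.0, []
    heapq.heappush(events, (0.0, "excavate", 0))
    completed = 0
    while events:
        clock, kind, ring = heapq.heappop(events)   # advance to next event
        if kind == "excavate":
            # assumed excavation duration distribution (illustrative units)
            heapq.heappush(events, (clock + random.uniform(0.5, 1.5), "build", ring))
        elif kind == "build":
            completed += 1
            if ring + 1 < n_rings:
                heapq.heappush(events, (clock + random.uniform(0.3, 0.8), "excavate", ring + 1))
    return completed, clock

rings, duration = simulate_rings(10)
print(rings)  # 10
```

A realistic model would add machine stoppages, shift calendars, and geology-dependent advance rates, which is exactly why the paper stresses the quality of the underlying database.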

Video Event Detection according to Generating of Semantic Unit based on Moving Object (객체 움직임의 의미적 단위 생성을 통한 비디오 이벤트 검출)

  • Shin, Ju-Hyun;Baek, Sun-Kyoung;Kim, Pan-Koo
    • Journal of Korea Multimedia Society / v.11 no.2 / pp.143-152 / 2008
  • Many investigators are studying methodologies for expressing events to support semantic retrieval of video data. However, most work still relies on annotation-based retrieval, in which annotations are defined for each data item, or on content-based retrieval using low-level features. We therefore propose a method that creates motion units and extracts events through these units, enabling more semantic retrieval than existing methods. First, we classify motions by event unit. Second, we define a semantic unit for each classified object motion. To use these for event extraction, we create rules that match the low-level features, from which semantic events can be retrieved at the level of a video shot. In an experiment extracting semantic events from video images, the method achieves a precision of approximately 80%.
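The two-level idea, low-level motion features mapped to semantic units by rules, then units combined into a shot-level event, can be sketched as below. The thresholds, unit names, and event rule are illustrative stand-ins for the paper's rules, not the actual rule set.

```python
def motion_unit(dist_start, dist_end, speed):
    # Map low-level features (distance to a reference object at shot start/end,
    # object speed) to a semantic motion unit; thresholds are assumed
    if speed < 0.1:
        return "stationary"
    return "approach" if dist_end < dist_start else "depart"

def shot_event(units):
    # A shot-level event is asserted over the sequence of semantic units
    if units[:2] == ["approach", "stationary"]:
        return "arrive"
    return "unknown"

units = [motion_unit(5.0, 1.0, 1.2), motion_unit(1.0, 1.0, 0.0)]
print(shot_event(units))  # arrive
```

The benefit over raw low-level matching is that retrieval queries can be phrased against the unit vocabulary ("arrive", "depart") instead of feature thresholds.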


Bivariate Data Analysis for the Lifetime and the Number of Indicative Events of a System

  • Lee, Sukhoon;Park, Heechang;Park, Raehyun
    • International Journal of Reliability and Applications / v.1 no.1 / pp.65-79 / 2000
  • This research considers a system with an ultimate terminal event, such as death, critical failure, or bankruptcy, together with indicative events (temporary malfunctions, special treatments, certain defaults) that frequently occur before the terminal event reaches the system. A model for the corresponding bivariate data of the system is investigated, explaining the situation in terms of two continuous variables instead of continuous-discrete variables, along with some other properties. An analysis is also carried out to evaluate the effect of intermediate observation of indicative-event occurrences, so that the results can suggest an intermediate observation schedule.


Development of a Portable Cardiac Event Recorder (휴대용 심전도 이벤트 기록기 개발)

  • Chun, H.G.;Kim, H.C.;Lee, C.Y.;Kim, I.Y.
    • Proceedings of the KOSOMBE Conference / v.1998 no.11 / pp.187-188 / 1998
  • A low-cost, low-power, portable cardiac event recorder was developed as a tether-free biological signal processor. Dual-channel ECG signals are sampled at 128 Hz with 12-bit resolution and continuously recorded in a circular buffer. When the event button is pressed, two minutes of data before and after the event are stored in 512 KB of SRAM; a total of 11 events can be recorded. Data can be transferred to a PC over the RS-232 protocol. The recorder operates for two months on a half-AA-size 3.6 V lithium battery, and the system size is $55\times55\times13[mm^3]$.
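The pre-event capture described above works because a circular buffer always holds the most recent window of samples, so data from *before* the button press is already in memory. A scaled-down sketch of that mechanism (the device's actual firmware and sizes differ; capacity here is tiny for illustration):

```python
class EventRecorder:
    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.idx = 0          # next write position
        self.filled = 0       # how many slots hold valid samples

    def sample(self, value):
        # Continuously overwrite the oldest sample
        self.buf[self.idx] = value
        self.idx = (self.idx + 1) % len(self.buf)
        self.filled = min(self.filled + 1, len(self.buf))

    def snapshot(self):
        # Oldest-to-newest view at the moment the event button is pressed
        if self.filled < len(self.buf):
            return self.buf[:self.filled]
        return self.buf[self.idx:] + self.buf[:self.idx]

rec = EventRecorder(4)
for v in range(6):        # samples 0..5; capacity 4 keeps only the last 4
    rec.sample(v)
print(rec.snapshot())     # [2, 3, 4, 5]
```

On the real device the capacity would correspond to 2 minutes at 128 Hz per channel, and on a button press the snapshot is copied to SRAM while sampling continues for the post-event window.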


Wide-Area SCADA System with Distributed Security Framework

  • Zhang, Yang;Chen, Jun-Liang
    • Journal of Communications and Networks / v.14 no.6 / pp.597-605 / 2012
  • As the smart grid approaches, wide-area supervisory control and data acquisition (SCADA) becomes more and more important. However, traditional SCADA systems do not meet the openness and distribution requirements of the smart grid: distributed SCADA services should be openly composable and secure. Event-driven methodology makes service collaborations more real-time and flexible, because event producers and consumers are decoupled in space, time, and control, which gives us an appropriate foundation. In this paper, our SCADA services are constructed and integrated based on distributed events. Unfortunately, an event-driven SCADA service does not know who consumes its events, and consumers do not know who produces them; in this environment, a SCADA service cannot directly control access because interactions are anonymous and multicast. We therefore propose a distributed security framework that protects not only service operations but also data contents in smart grid environments. Finally, a security implementation scheme is given for the SCADA services.
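The producer/consumer decoupling the paper builds on is the classic publish/subscribe pattern: publishers emit events by topic without knowing their subscribers, which is exactly why conventional point-to-point access control breaks down. A minimal sketch (class and topic names are illustrative, not the paper's SCADA API):

```python
from collections import defaultdict

class EventBroker:
    def __init__(self):
        self.subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subs[topic].append(handler)

    def publish(self, topic, event):
        # Multicast: every subscriber gets the event; the publisher
        # never learns who (or how many) received it
        for handler in self.subs[topic]:
            handler(event)

broker = EventBroker()
seen = []
broker.subscribe("grid/voltage", seen.append)
broker.subscribe("grid/voltage", lambda e: seen.append(("alarm", e)))
broker.publish("grid/voltage", 231.5)
print(seen)  # [231.5, ('alarm', 231.5)]
```

A security framework for this setting has to attach policy to topics and event contents rather than to caller identities, which motivates protecting data contents as well as operations.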

Joint HGLM approach for repeated measures and survival data

  • Ha, Il Do
    • Journal of the Korean Data and Information Science Society / v.27 no.4 / pp.1083-1090 / 2016
  • In clinical studies, different types of outcomes (e.g. repeated measures data and time-to-event data) are often observed for the same subject, and these data can be correlated. For example, a response variable of interest may be measured repeatedly over time on the same subject while, at the same time, an event time representing a terminating event is also recorded. Joint modelling using a shared random effect is useful for analyzing these data. Inference based on the marginal likelihood may involve evaluating analytically intractable integrals over the random-effect distributions. In this paper we propose a joint HGLM approach for analyzing such outcomes using the hierarchical generalized linear model (HGLM) method based on the h-likelihood (hierarchical likelihood), which avoids these integrals altogether. The proposed method is demonstrated through various numerical studies.
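A standard shared-random-effect formulation of such a joint model (notation chosen here for illustration, not taken from the paper) links the two outcomes through a common subject-level effect $v_i$:

```latex
\begin{aligned}
y_{ij} &= \mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + v_i + \varepsilon_{ij}
  && \text{(repeated measures for subject } i\text{)} \\
\lambda_i(t) &= \lambda_0(t)\exp\!\left(\mathbf{z}_i^{\top}\boldsymbol{\gamma} + \alpha v_i\right)
  && \text{(hazard of the terminating event)}
\end{aligned}
```

Here $\alpha$ measures the association between the longitudinal and survival outcomes. Marginal-likelihood inference would integrate over the distribution of $v_i$; the h-likelihood approach instead treats the $v_i$ as unknowns estimated jointly with the fixed effects, avoiding that integration.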

Review of statistical methods for survival analysis using genomic data

  • Lee, Seungyeoun;Lim, Heeju
    • Genomics & Informatics / v.17 no.4 / pp.41.1-41.12 / 2019
  • Survival analysis mainly deals with the time to an event such as death, onset of disease, or bankruptcy. Its common characteristic is "censored" data, in which the time to event cannot be completely observed but instead represents a lower bound; only the time to event or the censoring time, whichever occurs first, is observed. Many traditional statistical methods have been used effectively for analyzing survival data with censored observations. However, with the development of high-throughput technologies producing "omics" data, more advanced statistical methods, such as regularization, are required to construct predictive survival models from high-dimensional genomic data. Furthermore, machine learning approaches have been adapted to survival analysis to fit nonlinear and complex interaction effects between predictors and to achieve more accurate prediction of individual survival probabilities. Since most clinicians and medical researchers can now easily access statistical programs for analyzing survival data, a review article is helpful for understanding the statistical methods used in survival analysis. We review traditional survival methods and regularization methods with various penalty functions for the analysis of high-dimensional genomic data, and describe machine learning techniques that have been adapted to survival analysis.
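As one concrete instance of the regularization methods surveyed, the lasso-penalized Cox model maximizes the penalized partial log-likelihood (standard notation, shown here for orientation rather than as the review's own formulation):

```latex
\hat{\boldsymbol{\beta}} = \arg\max_{\boldsymbol{\beta}}
\left\{ \sum_{i:\,\delta_i = 1}
\left[ \mathbf{x}_i^{\top}\boldsymbol{\beta}
 - \log \sum_{j \in R(t_i)} \exp\!\left(\mathbf{x}_j^{\top}\boldsymbol{\beta}\right) \right]
 - \lambda \sum_{k=1}^{p} |\beta_k| \right\}
```

where $\delta_i$ is the event indicator (handling censoring), $R(t_i)$ is the risk set at event time $t_i$, and $\lambda$ controls how many of the $p$ genomic predictors retain nonzero coefficients; other penalty functions (ridge, elastic net, SCAD) replace the $\ell_1$ term.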