• Title/Summary/Keyword: 로그 생성 (log generation)

Search Results: 333

Recovery Method Using Extendable Hashing Based Log in A Shared-Nothing Spatial Database Cluster (비공유 공간 데이터베이스 클러스터에서 확장성 해싱 기반의 로그를 이용한 회복 기법)

  • 장일국;장용일;박순영;배해영
    • Proceedings of the Korean Association of Geographic Information Studies Conference / 2004.03a / pp.7-10 / 2004
  • Recovery techniques are considered very important for high availability in a shared-nothing spatial database cluster. In general, when a node failure occurs, the recovery technique of a database cluster creates a cluster log separately from the local log and performs the recovery process of the failed node based on it. However, existing techniques keep multiple pieces of update information for a single record, which increases the size of the cluster log and the transmission cost. As a result, the recovering node executes several unnecessary operations for a single record, which increases recovery time and the overall load on the system. In this paper, we propose a recovery technique that uses an extendible-hashing-based log in a shared-nothing spatial database cluster. In the proposed technique, the cluster log is organized by extendible hashing on the record key and consists of the record's change information and a pointer to the actual data. The extendible-hashing-based cluster log is smaller and cheaper to transmit, and the recovering node executes only a single update operation per record, enabling fast recovery. Therefore, the proposed technique performs efficient recovery processing using the extendible-hashing-based cluster log and improves system availability. (A small code sketch of the one-entry-per-record idea follows below.)

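The abstract above centers on keeping at most one cluster-log entry per record so that a recovering node replays a single update per record. The following is a minimal Python sketch of that idea under stated assumptions: the class and field names (ClusterLog, change_info, data_pointer) are illustrative, and a plain dict stands in for the extendible-hashing directory described in the paper.

```python
class ClusterLog:
    """Cluster log keyed by record key: keeps only the latest change per record.

    A dict stands in for the extendible-hashing directory; only the most
    recent change and a pointer to the live data are kept per record.
    """

    def __init__(self):
        self._entries = {}  # record_key -> (change_info, data_pointer)

    def record_update(self, record_key, change_info, data_pointer):
        # Overwrites any earlier entry, so the log holds one entry per record
        # instead of the full update history.
        self._entries[record_key] = (change_info, data_pointer)

    def replay(self, apply_update):
        # Recovery applies exactly one update operation per record.
        for record_key, (change_info, data_pointer) in self._entries.items():
            apply_update(record_key, change_info, data_pointer)


if __name__ == "__main__":
    log = ClusterLog()
    log.record_update("r1", {"geometry": "POINT(1 2)"}, data_pointer=0x10)
    log.record_update("r1", {"geometry": "POINT(3 4)"}, data_pointer=0x10)  # supersedes the first
    log.replay(lambda key, change, ptr: print(key, change, ptr))
```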

An Implementation of the Report View Generator using Program Performance Log Information (프로그램 성능 평가 로그 정보를 이용한 레포트 뷰 생성기 구현)

  • Cho Yong-Yoon;Yoo Chae-Woo
    • Journal of the Korea Society of Computer and Information / v.10 no.3 s.35 / pp.35-44 / 2005
  • A software developer can use a performance evaluation tool to increase development speed and improve software quality. However, the evaluation results that most performance evaluation tools offer are complicated strings, so a developer cannot intuitively understand what the results mean and must spend much time and effort analyzing them. In this paper, we propose a report view generator that transforms text-based software performance evaluation results into various graphic-based views. The proposed generator consists of a screen generator that creates a structured XML document from the text-based performance evaluation results and a log analyzer that produces various report views from the created XML evaluation document. Because the XML evaluation result document expresses the result information structured according to the performance evaluation items for software resources, it offers flexibility in providing and integrating the result information for those items. Through the suggested report view generator, developers can intuitively understand and analyze performance evaluation results of embedded software and can easily and quickly improve software quality and development efficiency. (A small XML-generation sketch follows below.)

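As a rough illustration of the first stage described above (turning text-based evaluation results into a structured XML document), the snippet below parses hypothetical "item=value" result lines with Python's standard xml.etree module. The tag names and input format are assumptions, not the authors' actual schema.

```python
import xml.etree.ElementTree as ET


def results_to_xml(lines):
    """Convert hypothetical 'item=value' performance-result lines to XML."""
    root = ET.Element("evaluation")
    for line in lines:
        name, _, value = line.partition("=")
        item = ET.SubElement(root, "item", attrib={"name": name.strip()})
        item.text = value.strip()
    return ET.tostring(root, encoding="unicode")


# Example input strings are made up for demonstration.
print(results_to_xml(["cpu_time=1.8ms", "heap_peak=512KB"]))
```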

CERES: A Log-based, Interactive Web Analytics System for Backbone Networks (CERES: 백본망 로그 기반 대화형 웹 분석 시스템)

  • Suh, Ilhyun;Chung, Yon Dohn
    • KIISE Transactions on Computing Practices / v.21 no.10 / pp.651-657 / 2015
  • The amount of web traffic has increased as a result of the rapid growth of the use of web-based applications. In order to obtain valuable information from web logs, we need to develop systems that can support interactive, flexible, and efficient ways to analyze and handle large amounts of data. In this paper, we present CERES, a log-based, interactive web analytics system for backbone networks. Since CERES focuses on analyzing web log records generated from backbone networks, it is possible to perform a web analysis from the perspective of a network. CERES is designed for deployment in a server cluster using the Hadoop Distributed File System (HDFS) as the underlying storage. We transform and store web log records from backbone networks into relations and then allow users to use a SQL-like language to analyze web log records in a flexible and interactive manner. In particular, we use the data cube technique to enable the efficient statistical analysis of web logs. The system provides users with a web-based, multi-modal user interface.
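The data cube analysis mentioned above could, for example, be expressed with Spark SQL's CUBE grouping over a web-log relation. The table name, column names, and storage path below are assumptions for illustration, not CERES's actual schema.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("weblog-cube").getOrCreate()

# Assumed relation: one row per HTTP request extracted from backbone traffic.
logs = spark.read.parquet("hdfs:///weblogs/requests")  # columns: host, status, bytes, ...
logs.createOrReplaceTempView("weblog")

# CUBE produces subtotals for every combination of host and status,
# which supports the kind of interactive, multidimensional statistics
# the abstract describes.
stats = spark.sql("""
    SELECT host, status, COUNT(*) AS hits, SUM(bytes) AS total_bytes
    FROM weblog
    GROUP BY CUBE (host, status)
""")
stats.show()
```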

Contextual-Bandit Based Log Level Setting for Video Wall Controller (Contextual Bandit에 기반한 비디오 월 컨트롤러의 로그레벨)

  • Kim, Sung-jin
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2022.05a / pp.633-635 / 2022
  • If an error occurs during operation of the video wall controller, the control system creates a log file and records the log. To minimize the load that logging places on the system, the log level is set so that as little as possible is recorded under normal operating conditions. When an error occurs, the log level is changed so that detailed logs are recorded for analyzing and responding to the cause of the error. This reduces work efficiency, and operator intervention is required to change the log level. In this paper, we propose a model that automatically sets the log level according to the operating situation using a contextual bandit. (A minimal bandit sketch follows below.)

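As a concrete sketch of the idea above, the following epsilon-greedy contextual bandit picks a log level from a discretized operating context. The context features, reward definition, and level names are assumptions made for illustration rather than the paper's actual design.

```python
import random
from collections import defaultdict

LOG_LEVELS = ["ERROR", "WARN", "INFO", "DEBUG"]  # the arms


class EpsilonGreedyLogLevel:
    """Epsilon-greedy contextual bandit over discretized operating contexts."""

    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = defaultdict(lambda: [0] * len(LOG_LEVELS))
        self.values = defaultdict(lambda: [0.0] * len(LOG_LEVELS))

    def choose(self, context):
        # Explore with probability epsilon, otherwise pick the best-known arm.
        if random.random() < self.epsilon:
            return random.randrange(len(LOG_LEVELS))
        values = self.values[context]
        return max(range(len(LOG_LEVELS)), key=values.__getitem__)

    def update(self, context, arm, reward):
        # Incremental mean of the observed reward for (context, arm).
        self.counts[context][arm] += 1
        n = self.counts[context][arm]
        self.values[context][arm] += (reward - self.values[context][arm]) / n


bandit = EpsilonGreedyLogLevel()
context = ("error_rate_high", "cpu_load_low")   # assumed operating situation
arm = bandit.choose(context)
# The reward could trade off diagnostic detail against logging overhead.
bandit.update(context, arm, reward=1.0)
print("chosen level:", LOG_LEVELS[arm])
```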

An Efficient Log Data Management Architecture for Big Data Processing in Cloud Computing Environments (클라우드 환경에서의 효율적인 빅 데이터 처리를 위한 로그 데이터 수집 아키텍처)

  • Kim, Julie;Bahn, Hyokyung
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.13 no.2 / pp.1-7 / 2013
  • Big data management is becoming increasingly important in both industry and the information science research community. One of the important categories of big data generated from software systems is log data. Log data is generally used to provide better services at various service providers and can also be used as information for qualification. This paper presents a big data management architecture specialized for log data. Specifically, it provides the aggregation of log messages sent from multiple clients and provides intelligent functionalities such as analyzing log data. The proposed architecture supports an asynchronous process in client-server architectures to prevent a potential bottleneck when accessing data. Accordingly, it does not affect client performance even though a remote data store is used. We implement the proposed architecture and show that it works well for processing big log data. All components are implemented based on open source software, and the developed prototypes are now publicly available.
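The asynchronous client-side behavior described above (writing to a remote store without blocking the client) can be sketched with a bounded queue and a background sender thread. The class name, message format, and send target below are assumptions, not the authors' actual implementation.

```python
import queue
import threading
import time


class AsyncLogSender:
    """Buffers log messages and ships them to a remote store off the caller's path."""

    def __init__(self, send_fn, max_buffer=10000):
        self._queue = queue.Queue(maxsize=max_buffer)
        self._send_fn = send_fn
        threading.Thread(target=self._drain, daemon=True).start()

    def log(self, message):
        # Non-blocking from the client's point of view; drops the message when
        # the buffer is full instead of stalling the application.
        try:
            self._queue.put_nowait(message)
        except queue.Full:
            pass

    def _drain(self):
        while True:
            self._send_fn(self._queue.get())  # e.g. a POST to the aggregation server


if __name__ == "__main__":
    sender = AsyncLogSender(send_fn=print)  # print stands in for the remote store
    sender.log("2013-05-01T12:00:00 INFO service started")
    time.sleep(0.1)  # give the background thread a moment to drain the queue
```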

Multi-Factor Authentication System based on Software Secure Card-on-Matching For Secure Login (안전한 로그인을 위한 소프트 보안카드 기반 다중 인증 시스템)

  • Lee, Hyung-Woo
    • The Journal of the Korea Contents Association / v.9 no.3 / pp.28-38 / 2009
  • The login process uses both ID and password information to authenticate a user and grant access privileges on a system. However, an attacker can obtain that ID and password information by using packet sniffing or key logger programs. This causes privacy problems, as the information can be used for hacking and network attacks on web servers and web e-mail systems. Therefore, a more secure and advanced authentication mechanism is required to enhance the authentication process of existing systems. In this paper, we propose a multi-factor authentication process that combines a software-based secure card system with the existing ID/password-based login system. The proposed mechanism uses a random number generated on the user's own handset together with biometric information. Therefore, we can provide a one-time password function on the web login system to authenticate the user in a multi-factor manner. The proposed scheme provides enhanced authentication and security because it is a multi-factor authentication mechanism that combines handset and biometric information with the web login system.
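The abstract does not spell out the paper's exact one-time password construction; as a generic illustration of deriving a one-time code from a handset-held secret plus a biometric-derived value, the sketch below uses a time-based HMAC from Python's standard library. The function name, key combination rule, and inputs are all assumptions.

```python
import hashlib
import hmac
import time


def one_time_code(card_secret: bytes, biometric_hash: bytes, interval: int = 30) -> str:
    """Derive a 6-digit one-time code from a soft-card secret and a biometric hash.

    Generic HMAC-based construction for illustration only; the paper's actual
    combination of handset random numbers and biometric information is not
    specified in the abstract.
    """
    counter = int(time.time() // interval).to_bytes(8, "big")
    digest = hmac.new(card_secret + biometric_hash, counter, hashlib.sha256).digest()
    return f"{int.from_bytes(digest[:4], 'big') % 1_000_000:06d}"


print(one_time_code(b"soft-card-secret", hashlib.sha256(b"fingerprint-template").digest()))
```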

Generating Multidimensional Random Tables (다차원 임의 분할표 생성)

  • Choi, Hyun-Jip
    • The Korean Journal of Applied Statistics / v.19 no.3 / pp.545-554 / 2006
  • We suggest a method for generating multidimensional random tables based on log-linear models. A linear combination approach by Lee (1997) is applied to obtain the joint distribution together with the well-known Pearson chi-squared statistic. The suggested method can generate completely associated joint distributions that have a fixed association among three variables, and it can be extended to tables of dimension higher than three.
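As a rough numeric illustration of sampling a contingency table whose cell probabilities follow a log-linear model, the snippet below builds a small three-way table with a crude three-factor association term and draws a multinomial sample with NumPy. The parameter values are arbitrary, and this is not Lee's (1997) linear combination approach.

```python
import numpy as np

rng = np.random.default_rng(0)

I, J, K = 2, 3, 2          # dimensions of the three-way table
lam_ijk = 0.5              # three-factor association term (arbitrary)

# Log-linear cell parameters: simple main effects plus a crude interaction term.
log_mu = np.zeros((I, J, K))
for i in range(I):
    for j in range(J):
        for k in range(K):
            log_mu[i, j, k] = 0.2 * i + 0.1 * j + 0.3 * k + lam_ijk * i * j * k

probs = np.exp(log_mu)
probs /= probs.sum()

# Draw one random table with a fixed grand total of 200 observations.
table = rng.multinomial(200, probs.ravel()).reshape(I, J, K)
print(table)
```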

Anomalous Pattern Analysis of Large-Scale Logs with Spark Cluster Environment

  • Sion Min;Youyang Kim;Byungchul Tak
    • Journal of the Korea Society of Computer and Information / v.29 no.3 / pp.127-136 / 2024
  • This study explores the correlation between system anomalies and large-scale logs within the Spark cluster environment. While research on anomaly detection using logs is growing, there remains a limitation in adequately leveraging logs from various components of the cluster and considering the relationship between anomalies and the system. Therefore, this paper analyzes the distribution of normal and abnormal logs and explores the potential for anomaly detection based on the occurrence of log templates. By employing Hadoop and Spark, normal and abnormal log data are generated, and through t-SNE and K-means clustering, templates of abnormal logs in anomalous situations are identified to comprehend anomalies. Ultimately, unique log templates occurring only during abnormal situations are identified, thereby presenting the potential for anomaly detection.
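The template clustering step described above can be sketched with scikit-learn: embed per-window log-template count vectors with t-SNE, cluster the embedding with K-means, and then look for templates that occur only in windows labeled abnormal. The synthetic feature construction below is an assumption for illustration, not the authors' pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.manifold import TSNE

rng = np.random.default_rng(1)

# Assumed input: per-window counts of log templates (rows = time windows).
normal = rng.poisson(3.0, size=(80, 20))
normal[:, 17:] = 0                                    # these templates never occur normally
abnormal = rng.poisson(3.0, size=(20, 20))
abnormal[:, 17:] = rng.poisson(6.0, size=(20, 3))     # ...but do occur under anomalies
X = np.vstack([normal, abnormal]).astype(float)

# 2-D embedding for inspection, then K-means on the embedded points.
emb = TSNE(n_components=2, perplexity=15, random_state=0).fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(emb)

# Templates that appear only in abnormal windows hint at anomaly signatures.
abnormal_only = (normal.sum(axis=0) == 0) & (abnormal.sum(axis=0) > 0)
print("cluster sizes:", np.bincount(labels))
print("abnormal-only templates:", np.flatnonzero(abnormal_only))
```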

Anomaly Intrusion Detection based on Clustering in Network Environment (클러스터링 기법을 활용한 네트워크 비정상행위 탐지)

  • 오상현;이원석
    • Proceedings of the Korea Institute of Information Security and Cryptology Conference / 2003.12a / pp.179-184 / 2003
  • Many studies have developed misuse detection techniques to detect computer intrusions. Recently, research on anomaly detection techniques has been carried out to improve on misuse detection. In this paper, we propose a new network anomaly detection technique that applies clustering. To analyze normal behavior from various viewpoints, several features are extracted from network logs, and normal behavior patterns are generated for each feature using a clustering algorithm. The proposed method condenses the normal behavior patterns, that is, the clusters, into compact profiles, and network logs collected by DARPA were used to evaluate its performance. (A small clustering sketch follows below.)

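A minimal version of the profile-building step described above (cluster the normal feature values, then flag observations far from every cluster center) might look like the following. The feature set, cluster count, and distance threshold are assumptions for illustration, not the authors' method.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

# Assumed features per connection record: e.g. duration, bytes sent, packet count.
normal_traffic = rng.normal(loc=[1.0, 200.0, 12.0], scale=[0.2, 30.0, 2.0], size=(500, 3))

# Condensed profile of normal behavior: a handful of cluster centroids in scaled space.
scaler = StandardScaler().fit(normal_traffic)
profile = KMeans(n_clusters=5, n_init=10, random_state=0).fit(scaler.transform(normal_traffic))


def is_anomalous(record, threshold=3.0):
    # Far from every centroid of the normal profile -> flag as anomalous.
    dists = np.linalg.norm(profile.cluster_centers_ - scaler.transform([record]), axis=1)
    return dists.min() > threshold


print(is_anomalous(np.array([1.1, 210.0, 11.0])))    # typical record -> False
print(is_anomalous(np.array([9.0, 5000.0, 300.0])))  # far from the profile -> True
```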

A Security Log Analysis System using Logstash based on Apache Elasticsearch (아파치 엘라스틱서치 기반 로그스태시를 이용한 보안로그 분석시스템)

  • Lee, Bong-Hwan;Yang, Dong-Min
    • Journal of the Korea Institute of Information and Communication Engineering / v.22 no.2 / pp.382-389 / 2018
  • Recently, cyber attacks have been causing serious damage to various information systems, and log data analysis can help resolve this problem. A security log analysis system makes it possible to cope with security risks properly by collecting, storing, and analyzing log data. In this paper, a security log analysis system is designed and implemented to analyze security log data using Logstash with Elasticsearch, a distributed search engine that can collect and process various types of log data. Kibana, an open source data visualization plugin for Elasticsearch, is used to generate log statistics and search reports and to visualize the results. The performance of the Elasticsearch-based security log analysis system is compared to an existing log analysis system that uses the Flume log collector, the Flume HDFS sink, and HBase. The experimental results show that the proposed system greatly reduces both database query processing time and log data analysis time compared to the existing Hadoop-based log analysis system.
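For reference, the kind of log statistics the abstract mentions could be pulled from Elasticsearch with its Python client (sketch written against the 8.x client API). The index name, field names, and host are assumptions for illustration; the pipeline in the paper itself runs through Logstash and Kibana.

```python
from elasticsearch import Elasticsearch

# Assumed local node and an index populated by a Logstash pipeline.
es = Elasticsearch("http://localhost:9200")

# Count security events per source IP over the last hour (field names assumed).
response = es.search(
    index="security-logs-*",
    size=0,
    query={"range": {"@timestamp": {"gte": "now-1h"}}},
    aggs={"by_source": {"terms": {"field": "src_ip", "size": 10}}},
)

for bucket in response["aggregations"]["by_source"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])
```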