• Title/Summary/Keyword: Map-Reduce


APPLICATION OF A LOGISTIC REGRESSION MODEL FOR LANDSLIDE SUSCEPTIBILITY MAPPING USING GIS AT JANGHUNG, KOREA

  • Saro, Lee; Choi, Jae-Won; Yu, Young-Tae
    • Proceedings of the Korean Association of Geographic Information Studies Conference / 2003.04a / pp.64-64 / 2003
  • The aim of this study is to apply and verify a logistic regression model at Janghung, Korea, using a Geographic Information System (GIS). Landslide locations were identified in the study area from interpretation of IRS satellite images and field surveys, and maps of the topography, soil type, forest cover, geology and land use were constructed into a spatial database. The factors that influence landslide occurrence, such as slope, aspect and curvature of topography, were calculated from the topographic database. Texture, material, drainage and effective soil thickness were extracted from the soil database, and type, diameter and density of forest were extracted from the forest database. Land use was classified from the Landsat TM satellite image. Using each factor's ratings, the logistic regression coefficients were overlaid for landslide susceptibility mapping. The landslide susceptibility map was then verified and compared using the existing landslide locations. The results can be used to reduce hazards associated with landslides, to manage landslides, and to plan land use and construction.
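
The abstract describes overlaying rated factor layers with logistic regression coefficients to produce a susceptibility map. A minimal sketch of that idea is given below; the layer names, coefficient values, and NumPy raster representation are illustrative assumptions, not the authors' data or fitted model.

```python
import numpy as np

def susceptibility_logistic(factors, coefficients, intercept):
    """Combine rasterized factor layers with logistic regression
    coefficients into a landslide susceptibility probability map.

    factors      : dict of factor name -> 2D NumPy array (rated values)
    coefficients : dict of factor name -> fitted regression coefficient
    intercept    : fitted intercept term
    """
    # Cell-by-cell linear combination of the factor layers.
    z = np.full(next(iter(factors.values())).shape, intercept, dtype=float)
    for name, layer in factors.items():
        z += coefficients[name] * layer
    # Logistic (sigmoid) link maps the index to a 0-1 probability.
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical toy example: two 2x2 factor layers with assumed coefficients.
factors = {
    "slope":  np.array([[10.0, 30.0], [20.0, 40.0]]),
    "aspect": np.array([[0.2, 0.8], [0.5, 0.9]]),
}
coefficients = {"slope": 0.05, "aspect": 1.2}
print(susceptibility_logistic(factors, coefficients, intercept=-3.0))
```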


APPLICATION OF A LIKELIHOOD RATIO MODEL FOR LANDSLIDE SUSCEPTIBILITY MAPPING USING GIS AT JANGHUNG, KOREA

  • Choi, Jae-Won; Lee, Saro; Yu, Young-Tae
    • Proceedings of the Korean Association of Geographic Information Studies Conference / 2003.04a / pp.63-63 / 2003
  • The aim of this study is to apply and verify a Bayesian probability model, the likelihood ratio statistical model, at Janghung, Korea, using a Geographic Information System (GIS). Landslide locations were identified in the study area from interpretation of IRS satellite images and field surveys, and maps of the topography, soil type, forest cover, geology and land use were constructed into a spatial database. The factors that influence landslide occurrence, such as slope, aspect and curvature of topography, were calculated from the topographic database. Texture, material, drainage and effective soil thickness were extracted from the soil database, and type, diameter and density of forest were extracted from the forest database. Land use was classified from the Landsat TM satellite image. Using each factor's ratings, the likelihood ratio coefficients were overlaid for landslide susceptibility mapping. The landslide susceptibility map was then verified and compared using the existing landslide locations. The results can be used to reduce hazards associated with landslides, to manage landslides, and to plan land use and construction.
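
The likelihood (frequency) ratio approach rates each class of each factor by comparing the share of landslides inside the class with the class's share of the whole area, then sums the ratios per cell. A hedged Python sketch of that computation follows; the class labels and arrays are hypothetical and not the paper's data.

```python
import numpy as np

def likelihood_ratios(factor_classes, landslide_mask):
    """Frequency/likelihood ratio per class of one factor layer.

    factor_classes : 2D integer array of class labels for one factor
    landslide_mask : 2D boolean array, True where landslides occurred
    """
    total_cells = factor_classes.size
    total_slides = landslide_mask.sum()
    ratios = {}
    for cls in np.unique(factor_classes):
        in_class = factor_classes == cls
        slide_share = landslide_mask[in_class].sum() / total_slides
        area_share = in_class.sum() / total_cells
        ratios[int(cls)] = slide_share / area_share
    return ratios

# Hypothetical toy layer: slope classes 1-3 and observed landslide cells.
slope_cls = np.array([[1, 1, 2], [2, 3, 3]])
slides = np.array([[False, False, True], [True, True, False]])
ratios = likelihood_ratios(slope_cls, slides)
# Susceptibility index: the class ratio at each cell (single factor here);
# with several factors the per-cell ratios would be summed.
index = np.vectorize(ratios.get)(slope_cls)
print(ratios, index, sep="\n")
```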


Integration of AutoCAD and Microsoft Excel for Forest Survey Application

  • Mamat, Mohd Rizuwan; Hamzah, Khali Aziz; Rashid, Muhammad Farid; Faidi, Mohd Azahari; Norizan, Azharizan Mohd
    • Journal of Forest and Environmental Science / v.29 no.4 / pp.307-313 / 2013
  • Forest survey consists of road survey, topographic survey, tree mapping survey, stream survey and ridge survey. Information from a forest survey is essential for preparing the base map used for forest harvesting planning and control. With current technology, data processing and mapping have shifted from the traditional hand-drawn method to computer systems, particularly Computer Aided Design (CAD). This gives great advantages to forest managers and logging operators. However, the duration of data processing and mapping can be further reduced by integrating CAD with other established software such as Microsoft Excel. This time study shows that there is a significant difference in data processing duration and efficiency when AutoCAD is used in combination with Microsoft Excel compared with using AutoCAD alone. The study shows that integrating AutoCAD and Microsoft Excel reduces the duration of data processing and mapping by 70% compared with using AutoCAD program alone.
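
The paper measures the benefit of the integration rather than spelling it out; one plausible form of such an integration, sketched below under assumptions not taken from the paper, reads surveyed coordinates from an Excel sheet with openpyxl and emits an AutoCAD script (.scr) that plots the traverse, turning the draughting step into a batch operation.

```python
from openpyxl import load_workbook

def excel_to_autocad_script(xlsx_path, script_path, sheet="Survey"):
    """Read X/Y coordinate columns from an Excel survey sheet and write
    an AutoCAD script that draws the traverse as a polyline.

    Assumes column A holds X (easting) and column B holds Y (northing),
    with a header row; these conventions are illustrative only.
    """
    wb = load_workbook(xlsx_path, read_only=True)
    ws = wb[sheet]
    points = [(row[0], row[1])
              for row in ws.iter_rows(min_row=2, values_only=True)
              if row[0] is not None and row[1] is not None]
    with open(script_path, "w") as scr:
        scr.write("_PLINE\n")           # start a polyline command
        for x, y in points:
            scr.write(f"{x},{y}\n")     # one vertex per line
        scr.write("\n")                 # blank line ends the PLINE command
    return len(points)

# Usage (hypothetical file names):
# excel_to_autocad_script("road_survey.xlsx", "road_survey.scr")
```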

A Study for properties of Mapping processes to 3D game modeling (3D 게임 MAP을 위한 MAPPING 방법 특성 연구)

  • Cho, Hyung-ik
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2012.10a / pp.447-449 / 2012
  • There are many essential elements in making realistic 3D games, one of the most important of which is mapping. Mapping can easily add details to 3D objects that would be impossible or difficult to achieve by modeling alone, and can reduce the file size needed to run the game engine. In theory, mapping projects a 2D image onto a 3D object exactly, but the two never match at a perfect 1:1 ratio because of their fundamentally different properties. To address this problem, various mapping methods have been developed and used. This paper compares and analyzes the characteristics, merits and demerits of these mapping methods and examines which method is the most efficient.
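
As a concrete illustration of projecting a 2D image onto 3D geometry, the sketch below computes planar-projection UV coordinates for a set of vertices; this is only one of the mapping methods the paper surveys, and the vertex data are hypothetical.

```python
def planar_uv(vertices):
    """Planar-projection UV mapping: drop the Z axis and normalize
    X and Y into the [0, 1] texture space.

    vertices : list of (x, y, z) tuples
    returns  : list of (u, v) texture coordinates
    """
    xs = [v[0] for v in vertices]
    ys = [v[1] for v in vertices]
    x_min, x_span = min(xs), (max(xs) - min(xs)) or 1.0
    y_min, y_span = min(ys), (max(ys) - min(ys)) or 1.0
    return [((x - x_min) / x_span, (y - y_min) / y_span)
            for x, y, _ in vertices]

# Hypothetical quad facing the camera: corners map to the texture corners.
quad = [(0.0, 0.0, 1.0), (2.0, 0.0, 1.0), (2.0, 1.0, 1.0), (0.0, 1.0, 1.0)]
print(planar_uv(quad))  # [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
```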


AAW-based Cell Image Segmentation Method (적응적 관심윈도우 기반의 세포영상 분할 기법)

  • Seo, Mi-Suk; Ko, Byoung-Chul; Nam, Jae-Yeal
    • The KIPS Transactions: Part B / v.14B no.2 / pp.99-106 / 2007
  • In this paper, we present an AAW (Adaptive Attention Window) based cell image segmentation method. For semantic AAW detection, we create an initial attention window by using a luminance map. The initial window is then reduced to the optimal size of the real ROI (Region of Interest) by using quad-tree segmentation. The purpose of the AAW is to remove the background and to reduce the processing time needed to segment the ROIs. Experimental results show that the proposed method segments one or more ROIs efficiently and gives segmentation results similar to human perception.
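
A hedged sketch of the two steps the abstract names, a luminance map followed by quad-tree shrinking of the attention window, is given below; the threshold, the stopping rule, and the NumPy representation are assumptions for illustration, not the authors' exact algorithm.

```python
import numpy as np

def luminance(rgb):
    """Per-pixel luminance of an RGB image (H x W x 3 array)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b

def shrink_attention_window(lum, window, min_size=8):
    """Recursively split the window into quadrants and keep the quadrant
    whose mean luminance deviates most from the global mean, until the
    window is small enough. Returns (top, left, bottom, right)."""
    top, left, bottom, right = window
    if min(bottom - top, right - left) <= min_size:
        return window
    mid_r, mid_c = (top + bottom) // 2, (left + right) // 2
    quads = [(top, left, mid_r, mid_c), (top, mid_c, mid_r, right),
             (mid_r, left, bottom, mid_c), (mid_r, mid_c, bottom, right)]
    global_mean = lum.mean()
    best = max(quads,
               key=lambda q: abs(lum[q[0]:q[2], q[1]:q[3]].mean() - global_mean))
    return shrink_attention_window(lum, best, min_size)

# Hypothetical 64x64 image with a bright "cell" in the lower-right corner.
img = np.zeros((64, 64, 3))
img[40:60, 40:60] = 255
print(shrink_attention_window(luminance(img), (0, 0, 64, 64)))
```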

Design and Implementation of Big Data Platform for Image Processing in Agriculture (농업 이미지 처리를 위한 빅테이터 플랫폼 설계 및 구현)

  • Nguyen, Van-Quyet; Nguyen, Sinh Ngoc; Vu, Duc Tiep; Kim, Kyungbaek
    • Proceedings of the Korea Information Processing Society Conference / 2016.10a / pp.50-53 / 2016
  • Image processing techniques play an increasingly important role in many aspects of our daily life. For example, they have been shown to improve agricultural productivity in a number of ways, such as plant pest detection or fruit grading. However, the massive quantities of images generated in real time by multiple devices, such as remote sensors monitoring plant growth, lead to big data challenges. Meanwhile, most current image processing systems are designed for small-scale, local computation and do not scale well to big data problems, given their large requirements for computational resources and storage. In this paper, we propose an IPABigData (Image Processing Algorithm BigData) platform that provides algorithms to support large-scale image processing in agriculture based on the Hadoop framework. Hadoop provides the MapReduce parallel computation model and the Hadoop Distributed File System (HDFS) module. It can also handle parallel pipelines, which are frequently used in image processing. In our experiment, we show that our platform outperforms a traditional system in an image segmentation scenario.
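
To make the MapReduce model concrete, below is a minimal Hadoop Streaming style mapper and reducer in Python that tallies segmented pixel classes per image; the input format (one `image_id<TAB>class_label` record per line) and the job itself are illustrative assumptions, not the IPABigData pipeline.

```python
#!/usr/bin/env python3
# Minimal Hadoop Streaming job: run as "job.py map" or "job.py reduce".
import sys

def mapper(stream):
    # Emit a count of 1 for every (image_id, class_label) record.
    for line in stream:
        image_id, class_label = line.rstrip("\n").split("\t")
        print(f"{image_id}:{class_label}\t1")   # tab-separated key/value

def reducer(stream):
    # Hadoop delivers records sorted by key, so counts for the same key
    # arrive consecutively and can be summed in one pass.
    current_key, total = None, 0
    for line in stream:
        key, value = line.rstrip("\n").split("\t")
        if key != current_key and current_key is not None:
            print(f"{current_key}\t{total}")
            total = 0
        current_key = key
        total += int(value)
    if current_key is not None:
        print(f"{current_key}\t{total}")

if __name__ == "__main__":
    role = sys.argv[1] if len(sys.argv) > 1 else "map"
    (mapper if role == "map" else reducer)(sys.stdin)
```

Submitted through the Hadoop Streaming jar with this script as both mapper and reducer, the same code runs unchanged on a single node or across a cluster, which is the scalability property the platform relies on.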

A Hybrid Probe Detection Model using FCM and Self-Adaptive Module (자가적응모듈과 퍼지인식도가 적용된 하이브리드 침입시도탐지모델)

  • Lee, Seyul
    • Journal of Korea Society of Digital Industry and Information Management / v.13 no.3 / pp.19-25 / 2017
  • Nowadays, networked computer systems play an increasingly important role in our society and its economy. They have become the targets of a wide array of malicious attacks that invariably turn into actual intrusions, which is why computer security has become an essential concern for network administrators. Recently, a number of detection/prevention system schemes have been proposed based on various technologies. However, the techniques applied in many systems are useful only for existing patterns of intrusion. Probe detection has therefore become a major security technology for detecting potential attacks. Probe detection needs to take into account a variety of factors and the relationships between them to reduce false negative and false positive errors, and new probe detection technology is needed that can find new probe patterns. In this paper, we propose a hybrid probe detection model using a Fuzzy Cognitive Map (FCM) and a Self-Adaptive Module (SAM) in dynamic environments such as Cloud and IoT. To verify the proposed method, experiments measuring the detection rate in dynamic environments and the possibility of countermeasures against intrusion were performed. The experimental results confirm a decrease in false detections and the possibility of countermeasures against intrusions.
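
The standard Fuzzy Cognitive Map inference step, on which an FCM-based detector rests, iterates concept activations through a signed weight matrix and a squashing function. A minimal sketch under assumed concepts and weights (not the paper's trained map) follows.

```python
import numpy as np

def fcm_step(activations, weights, lam=1.0):
    """One FCM inference step: each concept adds the weighted influence
    of the others, then a sigmoid squashes the result into [0, 1]."""
    raw = activations + activations @ weights
    return 1.0 / (1.0 + np.exp(-lam * raw))

def fcm_run(activations, weights, iterations=20):
    for _ in range(iterations):
        activations = fcm_step(activations, weights)
    return activations

# Hypothetical 3-concept map: [scan rate, failed logins, probe alert].
# A positive weight means the row concept drives the column concept up.
W = np.array([[0.0, 0.0, 0.7],
              [0.0, 0.0, 0.6],
              [0.0, 0.0, 0.0]])
initial = np.array([0.9, 0.8, 0.0])   # observed inputs, alert initially off
print(fcm_run(initial, W))            # the alert concept settles near "high"
```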

A Hot-Data Replication Scheme Based on Data Access Patterns for Enhancing Processing Speed of MapReduce (맵리듀스의 처리 속도 향상을 위한 데이터 접근 패턴에 따른 핫-데이터 복제 기법)

  • Son, Ingook; Ryu, Eunkyung; Park, Junho; Bok, Kyoungsoo; Yoo, Jaesoo
    • Proceedings of the Korea Contents Association Conference / 2013.05a / pp.11-12 / 2013
  • Recently, distributed storage and processing systems for handling and managing large-scale data have become increasingly important subjects of research and use. Hadoop is widely used as a representative distributed storage and processing system. In MapReduce, which runs on top of the Hadoop Distributed File System, tasks are assigned as close to the data as possible by taking data locality into account. However, in MapReduce data analysis, some data are requested frequently depending on the type of job. In such cases, the low locality of that data increases execution time and delays data transfer. In this paper, we propose a hot-data replication scheme based on data access patterns to improve the processing speed of MapReduce. The proposed scheme improves data locality, and consequently reduces job execution time, by applying a replica optimization algorithm to hot data with high access frequency according to the data access pattern. Compared with the existing scheme, the proposed scheme reduces data movement across all nodes and balances the distribution of access frequency. The performance evaluation shows that the access-frequency load is reduced by about 8% compared with the existing scheme.
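
A minimal sketch of the underlying idea, picking blocks whose access counts exceed a frequency threshold and raising their replication factor so more tasks can run locally, is shown below; the threshold, the scaling rule, and the block metadata structure are assumptions, not the paper's optimization algorithm.

```python
def plan_hot_replication(access_counts, base_replicas=3, max_replicas=10,
                         hot_ratio=2.0):
    """Decide a target replication factor per block from access counts.

    access_counts : dict of block_id -> number of accesses in a window
    A block is "hot" when its count exceeds hot_ratio times the mean;
    hot blocks get extra replicas proportional to how hot they are.
    """
    if not access_counts:
        return {}
    mean = sum(access_counts.values()) / len(access_counts)
    plan = {}
    for block, count in access_counts.items():
        if mean > 0 and count > hot_ratio * mean:
            extra = int(count / mean) - 1        # hotter blocks -> more copies
            plan[block] = min(base_replicas + extra, max_replicas)
        else:
            plan[block] = base_replicas
    return plan

# Hypothetical access log over one analysis job.
counts = {"blk_001": 4, "blk_002": 5, "blk_003": 40}
print(plan_hot_replication(counts))   # blk_003 is hot and gets extra replicas
```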


An Efficient Data Replacement Algorithm for Performance Optimization of MapReduce in Non-Dedicated Distributed Computing Environments (비-전용 분산 컴퓨팅 환경에서 맵-리듀스 처리 성능 최적화를 위한효율적인 데이터 재배치 알고리즘)

  • Ryu, Eunkyung; Son, Ingook; Park, Junho; Bok, Kyoungsoo; Yoo, Jaesoo
    • Proceedings of the Korea Contents Association Conference / 2013.05a / pp.39-40 / 2013
  • Recently, data has grown exponentially with the growth of social media and the increasing use of digital devices such as mobile devices. MapReduce has emerged as a representative framework for processing such large volumes of data. However, conventional MapReduce, which relies on an even data placement in dedicated distributed computing environments, is not suitable for non-dedicated distributed computing environments in which node availability differs. Data replacement algorithms optimized for non-dedicated environments have been proposed to address this, but running them requires a long relocation time and causes network overhead from unnecessary data transfers. In this paper, we propose an efficient data replacement algorithm to optimize MapReduce performance in non-dedicated distributed computing environments. The proposed scheme computes each node's share of data blocks based on a node availability analysis model and reduces network overhead by transferring blocks with the existing data placement taken into account. The performance evaluation shows that the ratio of relocated data blocks is reduced by about 75% compared with the existing scheme.
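
The scheme's core step, sizing each node's block share by its availability and then moving only the surplus blocks so the existing placement is reused, can be sketched roughly as below; the availability scores, the proportional rule, and the greedy transfer plan are illustrative assumptions rather than the paper's exact algorithm.

```python
def plan_rebalance(current_blocks, availability, total_blocks):
    """Plan block transfers toward an availability-proportional placement.

    current_blocks : dict node -> number of blocks currently stored
    availability   : dict node -> availability score (higher = more capable)
    total_blocks   : total number of blocks in the file
    Returns (targets, transfers), where transfers is a list of
    (source_node, destination_node, block_count) tuples.
    """
    total_avail = sum(availability.values())
    targets = {n: round(total_blocks * a / total_avail)
               for n, a in availability.items()}
    surplus = {n: current_blocks.get(n, 0) - t for n, t in targets.items()}
    senders = [(n, s) for n, s in surplus.items() if s > 0]
    receivers = [(n, -s) for n, s in surplus.items() if s < 0]
    transfers = []
    for src, give in senders:
        for i, (dst, need) in enumerate(receivers):
            if give == 0:
                break
            moved = min(give, need)
            if moved > 0:
                transfers.append((src, dst, moved))
                give -= moved
                receivers[i] = (dst, need - moved)
    return targets, transfers

# Hypothetical cluster: node C is the most available but holds few blocks.
current = {"A": 40, "B": 40, "C": 20}
avail = {"A": 0.2, "B": 0.3, "C": 0.5}
print(plan_rebalance(current, avail, total_blocks=100))
```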


A Case Study of a Six Sigma Project for Improving Assembly Line of Auto-Part Manufacturing Company (자동차 부품 제조업체의 조립라인 개선을 위한 6시그마 프로젝트 사례 연구)

  • Jung, Min-Young; Lee, Young-Nam; Hong, Sung-Hoon
    • Journal of Korean Society for Quality Management / v.38 no.3 / pp.439-448 / 2010
  • Since the Six Sigma strategy was first introduced at Motorola in 1987, it has been adopted as an important business strategy to strengthen the competitiveness of leading companies in the global competitive environment. This paper presents a Six Sigma project to reduce the cycle time of an assembly line in a medium-sized automotive part company. The project follows the structured DMAIC methodology, which consists of Define, Measure, Analyze, Improve, and Control. A CTQ is determined based on COPQ analysis, and a process map is used to identify process input variables. As a result of the project, two assembly lines are converted to cell production lines. Their cycle times fall to 55 sec./unit and 64 sec./unit from 64 sec./unit and 83 sec./unit at the beginning of the project, respectively.
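
For reference, the reported cycle times correspond to roughly the following relative reductions (a simple check derived from the abstract's figures, not a percentage stated by the authors):

$$\frac{64 - 55}{64} \approx 14.1\%, \qquad \frac{83 - 64}{83} \approx 22.9\%$$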