• Title/Summary/Keyword: 탐지 확률 (detection probability)

Search Result 263, Processing Time 0.026 seconds

Analysis of Code Block Reuse in Android Systems (안드로이드 시스템에서 코드 블록 재사용 분석)

  • Ho, Jun-Won;Choi, Nayeon;Song, Jiyeon;Kim, Seoyoung;Lee, Jinju;Cha, Boyeon;Jeong, Wonjee
    • Proceedings of the Korea Information Processing Society Conference / 2016.10a / pp.241-242 / 2016
  • Because of its open architecture, the Android system is exposed to a variety of attacks. In particular, it is vulnerable to app reuse attacks, in which the code of published apps is reused. In an Android app reuse attack, the attacker builds a malicious app by reusing useful code blocks of existing apps identified through reverse engineering. Various defense techniques have been proposed to counter such attacks. Whereas previously proposed techniques detect reuse attacks on the entire app code, this paper proposes an analysis technique for code block reuse within an app. The basic idea is to perform a mathematical analysis of reused code blocks using the birthday paradox. The analysis shows that the probability of identical code block reuse is affected by the proportion of reused code blocks among all code blocks and by the number of malicious apps participating in code block reuse.
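
The birthday-paradox analysis described above can be sketched numerically. A minimal illustration (the parameter values are hypothetical, not the paper's): the probability that at least two of `num_apps` malicious apps reuse the same code block from a pool of `num_blocks` candidates.

```python
def collision_probability(num_blocks: int, num_apps: int) -> float:
    """Birthday-paradox probability that at least two of `num_apps`
    apps reuse the same block out of `num_blocks` candidate blocks,
    assuming each app picks a block uniformly at random."""
    if num_apps > num_blocks:
        return 1.0  # pigeonhole: a collision is certain
    p_no_collision = 1.0
    for k in range(num_apps):
        p_no_collision *= (num_blocks - k) / num_blocks
    return 1.0 - p_no_collision

# The collision probability grows quickly with the number of
# participating apps relative to the size of the block pool.
print(collision_probability(365, 23))  # classic birthday setting, ~0.507
```

As the abstract notes, the resulting probability depends on both the size of the reusable-block pool and the number of apps drawing from it.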

A study on intra-pulse modulation recognition using feature parameters (특징인자를 활용한 펄스 내 변조 형태 식별방법에 관한 연구)

  • Yu, KiHun;Han, JinWoo;Park, ByungKoo;Lee, DongWon
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2013.10a / pp.754-756 / 2013
  • Modern electronic warfare receivers must cope with current radar technologies such as Low Probability of Intercept (LPI) radars, which are designed to avoid detection. LPI radars have intra-pulse modulation features that differ from existing radar signals, and these features require countermeasures such as signal confirmation and identification. Hence, this paper presents a study on intra-pulse modulation recognition. The proposed method automatically recognizes intra-pulse modulation types such as LFM and NLFM using classifiers built on features extracted from each intra-pulse modulation. Several simulations are conducted, and the results indicate the performance of the proposed method.
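
The paper's specific feature parameters are not reproduced above; as an illustrative stand-in, one widely used discriminator between LFM and NLFM is how well the instantaneous frequency fits a straight line. All signal parameters below are hypothetical.

```python
import numpy as np

def instantaneous_frequency(iq: np.ndarray, fs: float) -> np.ndarray:
    """Instantaneous frequency (Hz) from the derivative of the unwrapped
    phase of complex baseband samples."""
    phase = np.unwrap(np.angle(iq))
    return np.diff(phase) * fs / (2 * np.pi)

def if_linearity_feature(iq: np.ndarray, fs: float) -> float:
    """RMS residual of a linear fit to the instantaneous-frequency curve:
    near zero for LFM, larger for NLFM."""
    f_inst = instantaneous_frequency(iq, fs)
    t = np.arange(f_inst.size) / fs
    line = np.polyval(np.polyfit(t, f_inst, 1), t)
    return float(np.sqrt(np.mean((f_inst - line) ** 2)))

fs = 1e6
t = np.arange(0, 1e-3, 1 / fs)
lfm = np.exp(1j * np.pi * 2e8 * t ** 2)        # linear chirp, IF = 2e8 * t
nlfm = np.exp(1j * 2 * np.pi * 5e10 * t ** 3)  # cubic phase, IF = 1.5e11 * t**2

# The residual feature separates the two modulation types cleanly.
print(if_linearity_feature(lfm, fs), if_linearity_feature(nlfm, fs))
```

A classifier would threshold such features (possibly several of them) to assign the modulation type automatically.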

A Review on the DACS Design from the Perspective of Flight Performance Requirements (비행성능 요구 관점에서 DACS 형상 설계에 관한 고찰)

  • Park, Iksoo;Jin, Jungkun;Ha, Dongsung;Lim, Seongtaek
    • Proceedings of the Korean Society of Propulsion Engineers Conference / 2017.05a / pp.358-363 / 2017
  • A high intercept probability depends on optimization of the overall system, which consists of target detection, the tracking system, the missile system, and so on. To reduce the complexity of globally optimizing system performance, the relative dependencies of the sub-systems are simplified and the design parameters for the DACS configuration are identified. The conceptual design process is addressed based on the requirements for these design parameters, and a new methodology is suggested for higher performance.

Approximated Modeling Technique of Weibull Distributed Radar Clutter (Weibull 분포 레이더 클러터의 근사적 모델링 기법)

  • Nam, Chang-Ho;Ra, Sung-Woong
    • The Journal of Korean Institute of Electromagnetic Engineering and Science / v.23 no.7 / pp.822-830 / 2012
  • Clutter comprises all unwanted radar returns that affect the detection of targets. Radar clutter is characterized by its amplitude distribution, spectrum, etc., and is modelled with these characteristics in mind. In this paper, an approximation of the Weibull distribution function based on the uniform distribution function is suggested; the Weibull distribution function is used to model various clutters. This paper shows that data generated by the approximated solution satisfy the Weibull probability density function, and that the data generation time of the approximated solution is reduced by 20 % compared with that of the original Weibull probability density function.
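
The uniform-to-Weibull mapping underlying such a generator is inverse-transform sampling; a minimal sketch (the paper's specific approximation and its 20 % speed-up are not reproduced here):

```python
import math
import numpy as np

def weibull_from_uniform(n: int, shape: float, scale: float, seed=None) -> np.ndarray:
    """Generate Weibull-distributed clutter amplitudes by inverse-transform
    sampling: if U ~ Uniform(0, 1), then scale * (-ln U)**(1/shape) follows
    a Weibull(shape, scale) distribution."""
    rng = np.random.default_rng(seed)
    u = 1.0 - rng.random(n)  # in (0, 1], avoids log(0)
    return scale * (-np.log(u)) ** (1.0 / shape)

samples = weibull_from_uniform(100_000, shape=1.5, scale=2.0, seed=0)

# Sanity check: the sample mean should approach scale * Gamma(1 + 1/shape).
print(samples.mean(), 2.0 * math.gamma(1 + 1 / 1.5))
```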

Prewhitening Method for LFM Reverberation by Linear Dechirping (선형 Dechirping 기법을 이용한 LFM 잔향의 백색화 기법)

  • Choi, Byung-Woong;Kim, Jeong-Soo;Lee, Kyun-Kyung
    • The Journal of the Acoustical Society of Korea / v.26 no.3 / pp.129-135 / 2007
  • In this paper, we propose a prewhitening method for LFM reverberation to enhance the target signal. The proposed algorithm uses a dechirping method that inversely compensates the frequency chirp rate of the LFM and transforms the LFM reverberation so that it has a stationary frequency property in each data block. Also, using the left and right adjacent beam signals as reference signals, we model the frequency response of each data block with AR coefficients. From these coefficients, we implement an inverse filter and efficiently prewhiten the LFM reverberation of the center beam.
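
The dechirping step can be sketched as follows: multiplying the received signal by a conjugate reference chirp cancels the frequency sweep, leaving a stationary tone that a subsequent AR inverse filter can whiten. All signal parameters below are hypothetical.

```python
import numpy as np

fs = 100e3        # sample rate (Hz), assumed
chirp_rate = 2e6  # LFM chirp rate (Hz/s), assumed known from the transmit pulse
f0 = 5e3          # start frequency of the reverberation (Hz), assumed
t = np.arange(0, 10e-3, 1 / fs)

# Received LFM reverberation at complex baseband (illustrative, noise-free)
reverb = np.exp(1j * 2 * np.pi * (f0 * t + 0.5 * chirp_rate * t ** 2))

# Dechirping: inversely compensate the chirp rate with a conjugate chirp
dechirped = reverb * np.exp(-1j * np.pi * chirp_rate * t ** 2)

# After dechirping, the spectrum collapses to a single line at f0,
# i.e., the block is stationary and ready for AR-based prewhitening.
spectrum = np.abs(np.fft.fft(dechirped))
peak_freq = np.fft.fftfreq(t.size, 1 / fs)[spectrum.argmax()]
print(peak_freq)
```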

Landslide Susceptibility Mapping and Verification Using the GIS and Bayesian Probability Model in Boun (지리정보시스템(GIS) 및 베이지안 확률 기법을 이용한 보은지역의 산사태 취약성도 작성 및 검증)

  • Choi, Jae-Won;Lee, Sa-Ro;Min, Kyung-Duk;Woo, Ik
    • Economic and Environmental Geology / v.37 no.2 / pp.207-223 / 2004
  • The purpose of this study is to reveal the spatial relationships between landslides and a geospatial data set, to map landslide susceptibility using those relationships, and to verify the susceptibility map against landslide occurrence data for the Boun area in 1998. Landslide locations were detected from aerial photography and field survey, and topography, soil, forest, and land cover data sets were constructed as a spatial database using GIS. Various spatial parameters were used as landslide occurrence factors: slope, aspect, curvature, and type of topography; texture, material, drainage, and effective thickness of soil; type, age, diameter, and density of wood; lithology; distance from lineament; and land cover. To quantify the relationship between landslides and the geospatial database, the Bayesian weight-of-evidence method was applied, and the contrast value $W^{+} - W^{-}$ was calculated. The landslide susceptibility index was calculated by summing the contrast values, and landslide susceptibility maps were generated from the index. The landslide susceptibility map can be used to reduce associated hazards and to plan land use and construction.
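
The weight-of-evidence contrast can be computed from a 2x2 cross-tabulation of factor presence against landslide occurrence; the counts below are hypothetical, not the Boun data.

```python
import math

def weights_of_evidence(n_fl: int, n_f: int, n_l: int, n_0: int):
    """Weight-of-evidence for a binary factor vs. landslide occurrence.
    n_fl: cells with factor and landslide;  n_f: factor, no landslide;
    n_l: landslide, no factor;              n_0: neither."""
    w_plus = math.log((n_fl / (n_fl + n_l)) / (n_f / (n_f + n_0)))
    w_minus = math.log((n_l / (n_fl + n_l)) / (n_0 / (n_f + n_0)))
    return w_plus, w_minus, w_plus - w_minus  # contrast C = W+ - W-

w_plus, w_minus, contrast = weights_of_evidence(30, 100, 20, 850)
print(contrast)  # positive contrast: the factor is associated with landslides
```

The susceptibility index is then the sum of such contrast values over all factors present in a cell.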

Robust determination of control parameters in K chart with respect to data structures (데이터 구조에 강건한 K 관리도의 관리 모수 결정)

  • Park, Ingkeun;Lee, Sungim
    • Journal of the Korean Data and Information Science Society / v.26 no.6 / pp.1353-1366 / 2015
  • These days, the Shewhart control chart for evaluating process stability is widely used in various fields, but it relies on strict distributional assumptions. In real-life problems this assumption is often violated, since many quality characteristics follow non-normal distributions, and the problem is more serious for multivariate quality characteristics. To overcome this, many researchers have studied non-parametric control charts. Recently, the SVDD (Support Vector Data Description) control chart based on the RBF (Radial Basis Function) kernel, called the K-chart, which determines a description of the data region of the in-control process, has come into use in various fields. To apply the K-chart, however, the kernel parameter and other parameters must be predetermined, and their selection is important. For this, many researchers use a grid search to optimize the parameters, but it has problems such as choosing the search range and the cost in computation and time. In this paper, we study via simulation how the efficient parameter regions change as the data structure varies, propose a new, easy-to-use method for determining the parameters, and discuss a robust choice of parameters for various data structures. In addition, we apply the method to a real example and evaluate its performance.
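
A full SVDD requires solving a quadratic program; as a simplified stand-in for the K-chart statistic, the sketch below monitors each point's RBF kernel-space distance from the centroid of the in-control data, with the control limit taken from an empirical quantile. The data, `gamma`, and the quantile are all hypothetical choices, not the paper's method.

```python
import numpy as np

def rbf(x: np.ndarray, y: np.ndarray, gamma: float) -> np.ndarray:
    """RBF kernel matrix k(x_i, y_j) = exp(-gamma * ||x_i - y_j||^2)."""
    d2 = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * d2)

def center_distance(train: np.ndarray, test: np.ndarray, gamma: float) -> np.ndarray:
    """Squared kernel-space distance of each test point from the centroid of
    the training data -- a simplified stand-in for the SVDD radius (all
    points weighted equally; no quadratic-program solve)."""
    k_tx = rbf(test, train, gamma).mean(axis=1)
    k_xx = rbf(train, train, gamma).mean()
    return 1.0 - 2.0 * k_tx + k_xx  # k(z, z) = 1 for the RBF kernel

rng = np.random.default_rng(0)
in_control = rng.normal(0.0, 1.0, size=(200, 2))
gamma = 0.5  # the kernel parameter whose choice the paper studies

# Control limit: empirical 99th percentile of in-control distances
d_train = center_distance(in_control, in_control, gamma)
limit = np.quantile(d_train, 0.99)

# A shifted process should mostly fall outside the description
shifted = rng.normal(4.0, 1.0, size=(50, 2))
d_new = center_distance(in_control, shifted, gamma)
print((d_new > limit).mean())
```

The sensitivity of such a chart depends strongly on `gamma`, which is exactly why the paper studies how the good parameter region varies with the data structure.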

The Effect of Mean Brightness and Contrast of Digital Image on Detection of Watermark Noise (워터 마크 잡음 탐지에 미치는 디지털 영상의 밝기와 대비의 효과)

  • Kham Keetaek;Moon Ho-Seok;Yoo Hun-Woo;Chung Chan-Sup
    • Korean Journal of Cognitive Science / v.16 no.4 / pp.305-322 / 2005
  • Watermarking is a widely employed method for protecting the copyright of a digital image: the owner's unique image is embedded into the original image. A stronger level of watermark insertion helps the watermark survive extraction even after various distortions such as changes in image size or resolution. At the same time, its level should be moderated so as not to reach human visibility; finding a balance between these two is crucial in watermarking. In typical watermarking algorithms, a predefined watermark strength, computed from the physical difference between the original and embedded images, is applied to all images uniformly. However, the mean brightness or contrast of the surrounding image, rather than the absolute brightness of an object, can affect human sensitivity for object detection. In the present study, we examined whether the detectability of watermark noise is altered by image statistics: the mean brightness and contrast of the image. As a first step, we made nine fundamental images with varied brightness and contrast from the original image. For each fundamental image, the detectability of watermark noise was measured. The results showed that the watermark strength required for detection increased as the brightness and contrast of the fundamental image increased. We fitted the data to a regression line that can be used to estimate the watermark strength for a given image with a certain brightness and contrast. Although other factors must be considered before applying this formula directly to an actual watermarking algorithm, an adaptive watermarking algorithm could be built on it using image statistics such as brightness and contrast.
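
The regression described above can be reproduced schematically with a least-squares plane in brightness and contrast; the numbers below are synthetic stand-ins, not the paper's measurements.

```python
import numpy as np

# Hypothetical threshold measurements over a 3x3 brightness/contrast grid
# (synthetic and noise-free; the paper's actual data are not reproduced here).
brightness = np.array([60, 60, 60, 128, 128, 128, 200, 200, 200], dtype=float)
contrast = np.array([0.2, 0.5, 0.8, 0.2, 0.5, 0.8, 0.2, 0.5, 0.8])
threshold = 0.5 + 0.01 * brightness + 2.0 * contrast  # assumed true plane

# Least-squares fit: threshold ~ b0 + b1 * brightness + b2 * contrast
X = np.column_stack([np.ones_like(brightness), brightness, contrast])
coef, *_ = np.linalg.lstsq(X, threshold, rcond=None)
print(coef)
```

An adaptive scheme would then scale the embedded watermark for each image according to the plane's prediction for that image's statistics.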

Comparative Analysis of Effective RCS Prediction Methods for Chaff Clouds (효과적인 채프 구름의 RCS 예측 방법 비교 분석 연구)

  • Kim, Min;Lee, Myung-Jun;Lee, Seong-Hyeon;Park, Sung-ho;Kong, Young-Joo;Woo, Seon-Keol;Kim, Hong-Rak;Kim, Kyung-Tae
    • The Journal of Korean Institute of Electromagnetic Engineering and Science / v.29 no.3 / pp.233-240 / 2018
  • Radar cross section (RCS) analysis of chaff clouds is essential for the accurate detection and tracking of missile targets using radar. For this purpose, we compare the performance of two existing methods for predicting the RCS of chaff clouds. One method sums the RCS values of the individual chaff dipoles in a cloud, while the other predicts the RCS values using aerodynamic models based on the probability density function. To compare the two techniques more precisely, the RCS of a single-chaff computer-aided design model consisting of a half-wavelength dipole was calculated using the commercial electromagnetic analysis software FEKO 7.0, and the RCS values of chaff clouds were estimated via simulation. We verified that the method using the probability density distribution model can analyze the RCS of chaff clouds more efficiently.
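
The "sum of individual chaff RCS" idea can be sketched as follows. The orientation-averaged RCS of a resonant half-wave dipole is commonly quoted as roughly 0.15λ²; the wavelength, dipole count, and per-dipole spread below are all assumptions, not values from the paper.

```python
import numpy as np

WAVELENGTH = 0.03      # metres (10 GHz radar, assumed)
N_DIPOLES = 1_000_000  # dipoles in the cloud (assumed)
AVG_DIPOLE_RCS = 0.15 * WAVELENGTH ** 2  # commonly quoted orientation average

# Method 1: sum individual dipole RCS values (illustrative exponential
# spread around the average; not the paper's aerodynamic model).
rng = np.random.default_rng(1)
rcs_sum = (AVG_DIPOLE_RCS * rng.exponential(1.0, N_DIPOLES)).sum()

# Method 2: closed-form estimate, sigma ~ 0.15 * N * lambda^2
rcs_formula = 0.15 * N_DIPOLES * WAVELENGTH ** 2
print(rcs_sum, rcs_formula)
```

For large clouds the brute-force sum converges to the closed-form estimate, which is why a probability-density-based model can be the more efficient route.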

Estimating the Accuracy of Polygraph Test (폴리그라프 검사의 정확도 추정)

  • Jin-Sup Eom;Hyung-Ki Ji;Kwangbai Park
    • Korean Journal of Culture and Social Issue / v.14 no.4 / pp.1-18 / 2008
  • The present study examined the accuracy of polygraph tests through two statistical methods that do not require the ground truth to be known. One method evaluated accuracy from the rate of agreement between the polygraph test results of crime suspects and prosecutors' indictment decisions; these suspects were tested by the Prosecutors' Office of the Republic of Korea between 2000 and 2004. The other method estimated accuracy using latent class analysis based on the joint frequency distribution of polygraph results and indictments during 2006. Excluding cases that were 'inconclusive' on the polygraph test, the accuracy of the polygraph tests was .914 (SE=.004) for the 2000-2004 data and .885 (SE=.021) for the 2006 data. With 'inconclusive' cases included in the 2006 data, the latent class analysis showed accuracy in the range of .707 to .734 (SE=.027~.031), with false positives between .078 and .087 (SE=.019~.023) and false negatives between .029 and .078 (SE=.010~.023). The probability that the polygraph test correctly classifies subjects was in the range of .912 to .925 (SE=.013~.016) for those who lie, and .867 to .955 (SE=.011~.040) for those who tell the truth.
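
The agreement-rate calculation, with and without 'inconclusive' outcomes, is simple arithmetic; the counts below are hypothetical, not the Prosecutors' Office data.

```python
# Hypothetical counts of polygraph outcomes vs. indictment decisions
agree_deceptive = 420  # judged deceptive and indicted
agree_truthful = 480   # judged truthful and not indicted
disagree = 85          # polygraph result and indictment disagree
inconclusive = 115     # no polygraph decision reached

decided = agree_deceptive + agree_truthful + disagree
acc_excluding = (agree_deceptive + agree_truthful) / decided
acc_including = (agree_deceptive + agree_truthful) / (decided + inconclusive)
print(acc_excluding, acc_including)
```

Whether inconclusive cases count against accuracy is exactly the choice that separates the paper's ~.91 estimates from the ~.71-.73 latent-class range.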
