• Title/Summary/Keyword: Maximum Entropy Method

Accurate Detection of a Defective Area by Adopting a Divide and Conquer Strategy in Infrared Thermal Imaging Measurement

  • Jiangfei, Wang;Lihua, Yuan;Zhengguang, Zhu;Mingyuan, Yuan
    • Journal of the Korean Physical Society / v.73 no.11 / pp.1644-1649 / 2018
  • Aiming at infrared thermal images containing defects at different burial depths, we study a variety of threshold-based image segmentation algorithms with respect to their global search ability and their ability to locate the defect area accurately. First, the iterative thresholding method, the maximum entropy method, the minimum error method, the Otsu method and the minimum skewness method are applied to segmentation of the same infrared thermal image. The study shows that the maximum entropy method and the minimum error method have strong global search capability and can simultaneously extract defects at different depths. However, none of these five methods can accurately calculate the defect area at different depths. To solve this problem, we put forward a "divide and conquer" strategy: the infrared thermal image is divided into several local thermal maps, each containing only one defect, and the defect area is calculated after local image processing of the buried defects one by one. The results show that, under the "divide and conquer" strategy, the iterative threshold method and the Otsu method have the advantage of high precision and can accurately extract the areas of defects at different depths, with an error of less than 5%.
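
The maximum entropy (Kapur-style) thresholding that this entry compares against the Otsu and minimum error methods can be sketched as follows; this is a minimal NumPy version assuming an 8-bit grayscale thermal image, with a hypothetical synthetic image in the usage line:

```python
import numpy as np

def max_entropy_threshold(image):
    """Choose the threshold that maximizes the sum of the entropies of the
    background and foreground partitions of the histogram (Kapur-style)."""
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    p = hist.astype(float) / hist.sum()        # normalized gray-level histogram
    cdf = np.cumsum(p)                          # cumulative mass up to each level

    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        w0, w1 = cdf[t], 1.0 - cdf[t]
        if w0 <= 0 or w1 <= 0:
            continue
        p0 = p[: t + 1] / w0                    # class-conditional distributions
        p1 = p[t + 1:] / w1
        h0 = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))
        h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t

# Hypothetical usage on a synthetic "thermal" image:
img = np.clip(np.random.normal(120, 30, (64, 64)), 0, 255).astype(np.uint8)
t = max_entropy_threshold(img)
mask = img > t                                  # candidate defect region
```

Under the paper's "divide and conquer" strategy, the same routine would be applied to each single-defect sub-image rather than to the whole frame.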

Probability Distribution of Nonlinear Random Wave Heights Using Maximum Entropy Method (최대 엔트로피 방법을 이용한 비선형 불규칙 파고의 확률분포함수)

  • 안경모
    • Journal of Korean Society of Coastal and Ocean Engineers / v.10 no.4 / pp.204-210 / 1998
  • This paper presents the development of a probability density function applicable to wave heights (peak-to-trough excursions) in finite water depth, including shallow water. The probability distribution applicable to wave heights of a non-Gaussian random process is derived based on the maximum entropy method. When wave heights are limited by the breaking wave height (or water depth) and only the first and second moments of wave heights are given, the probability density function obtained is in closed form and is expressed in terms of wave parameters such as $H_m$ (mean wave height), $H_{rms}$ (root-mean-square wave height) and $H_b$ (breaking wave height). When moments of third order or higher are given, the system of nonlinear integral equations must be solved numerically using the Newton-Raphson method to obtain the parameters of the probability density function that maximizes the entropy function. The probability density function thus derived agrees very well with histograms of wave heights in finite water depth obtained during storms. The probability density function developed using the maximum entropy method appears to be useful in estimating extreme values and statistical properties of wave heights for the design of coastal structures.
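
The maximum entropy construction described here can be illustrated with a generic two-moment version on $[0, H_b]$; this is a sketch, not the paper's closed-form result, and the values of $H_m$, $H_{rms}$ and $H_b$ below are illustrative assumptions:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import fsolve

# Illustrative wave statistics (not from the paper): mean, rms and breaking height in metres.
H_m, H_rms, H_b = 1.0, 1.15, 3.0

def density(h, lam):
    # Unnormalized maximum entropy density on [0, H_b] under two moment constraints.
    return np.exp(-lam[0] * h - lam[1] * h**2)

def moment(k, lam):
    num = quad(lambda h: h**k * density(h, lam), 0.0, H_b)[0]
    den = quad(lambda h: density(h, lam), 0.0, H_b)[0]
    return num / den

def residuals(lam):
    # Match the first moment to H_m and the second to H_rms**2.
    return [moment(1, lam) - H_m, moment(2, lam) - H_rms**2]

lam = fsolve(residuals, x0=[0.1, 0.1])    # Newton-type root finding, as in the paper
Z = quad(lambda h: density(h, lam), 0.0, H_b)[0]
pdf = lambda h: density(h, lam) / Z       # normalized p(H) on [0, H_b]
```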

Intra-Sentence Segmentation using Maximum Entropy Model for Efficient Parsing of English Sentences (효율적인 영어 구문 분석을 위한 최대 엔트로피 모델에 의한 문장 분할)

  • Kim Sung-Dong
    • Journal of KIISE:Software and Applications / v.32 no.5 / pp.385-395 / 2005
  • Long-sentence analysis has been a critical problem in machine translation because of its high complexity. Intra-sentence segmentation methods have been proposed to reduce parsing complexity. This paper presents an intra-sentence segmentation method based on a maximum entropy probability model that increases the coverage and accuracy of segmentation. We construct the rules for choosing candidate segmentation positions by a learning method that uses the lexical context of the words tagged as segmentation positions, and we build a model that assigns a probability value to each candidate segmentation position. The lexical contexts are extracted from a corpus tagged with segmentation positions and are incorporated into the probability model. We construct training data from Wall Street Journal sentences and test intra-sentence segmentation on sentences from four different domains. The experiments show about 88% accuracy and about 98% coverage of segmentation. The proposed method also improves parsing efficiency by a factor of 4.8 in speed and 3.6 in space.
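
Since a maximum entropy model over indicator features is equivalent to (multinomial) logistic regression, the segmentation-position classifier can be sketched with scikit-learn as a stand-in for the paper's MaxEnt trainer; the context features and toy data below are illustrative, not the paper's feature set:

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

# Toy training examples: lexical context around a candidate position -> segment or not.
# The feature names (prev_word, next_word, prev_pos) are illustrative assumptions.
X_dicts = [
    {"prev_word": "said", "next_word": "that", "prev_pos": "VBD"},
    {"prev_word": "the",  "next_word": "company", "prev_pos": "DT"},
    {"prev_word": ",",    "next_word": "which", "prev_pos": ","},
]
y = [1, 0, 1]   # 1 = candidate position is a segmentation point

vec = DictVectorizer()
X = vec.fit_transform(X_dicts)

# A maximum entropy model corresponds to logistic regression over these features.
model = LogisticRegression(max_iter=1000).fit(X, y)

p_segment = model.predict_proba(
    vec.transform([{"prev_word": ",", "next_word": "which", "prev_pos": ","}])
)[0, 1]
```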

Rule-Based Classification Analysis Using Entropy Distribution (엔트로피 분포를 이용한 규칙기반 분류분석 연구)

  • Lee, Jung-Jin;Park, Hae-Ki
    • Communications for Statistical Applications and Methods / v.17 no.4 / pp.527-540 / 2010
  • Rule-based classification analysis is widely used in large-scale data mining because it is easy to understand and its algorithm is uncomplicated. In such classification analyses, a majority vote over rules or a weighted combination of rules using their supports is frequently used to combine rules. In this paper we propose a method of combining rules by using the multinomial distribution. An iterative proportional fitting algorithm is used to estimate the multinomial distribution that maximizes entropy subject to constraints on the rules' supports. Simulation experiments show that this method is competitive with other well-known classification models in the case of two similar populations.
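
A minimal sketch of the iterative proportional fitting step, which rescales a cell distribution until it matches each rule's support, is given below; the cells, rules and support values are illustrative, not taken from the paper:

```python
import numpy as np

# Toy setup: 4 joint cells (e.g., combinations of binary attributes) and two rules,
# each identified with the set of cells it covers and its observed support.
n_cells = 4
rules = [
    ([0, 1], 0.6),   # rule 1 covers cells 0 and 1 with support 0.6
    ([1, 2], 0.5),   # rule 2 covers cells 1 and 2 with support 0.5
]

p = np.full(n_cells, 1.0 / n_cells)   # start from the uniform (maximum entropy) distribution

for _ in range(200):                   # iterative proportional fitting
    for cells, support in rules:
        mass = p[cells].sum()
        if mass > 0:
            p[cells] *= support / mass                 # match the rule's support
            rest = [i for i in range(n_cells) if i not in cells]
            p[rest] *= (1.0 - support) / p[rest].sum() # keep the total mass at one

# p now approximates the entropy-maximizing multinomial consistent with the rule supports.
```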

Design of High Speed Binary Arithmetic Encoder for CABAC Encoder (CABAC 부호화기를 위한 고속 이진 산술 부호화기의 설계)

  • Park, Seungyong;Jo, Hyungu;Ryoo, Kwangki
    • Journal of the Korea Institute of Information and Communication Engineering / v.21 no.4 / pp.774-780 / 2017
  • This paper proposes an efficient binary arithmetic encoder hardware architecture for CABAC encoding, the entropy coding method used in the HEVC standard. Entropy coding removes statistical redundancy and supports a high compression ratio for images. However, the binary arithmetic encoder introduces delay in real-time processing, and parallel processing is difficult because of the high dependency between data. The proposed CABAC BAE hardware structure separates the renormalization step and processes the conventional iterative algorithm in parallel. The new scheme is designed as a four-stage pipeline that optimally shortens the critical path. The proposed CABAC BAE hardware architecture was designed in Verilog HDL and implemented in 65 nm technology. Its gate count is 8.07K and its maximum operating frequency is 769 MHz; it processes four bins per clock cycle. The maximum processing speed is increased by 26% over existing hardware architectures.
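
The serial renormalization loop that limits throughput, and that the paper's four-stage pipeline is designed to break up, can be seen in a schematic software model of a binary arithmetic encoder; this sketch ignores carry propagation and probability-state update and is not a bit-exact model of the HEVC CABAC BAE:

```python
class BinaryArithmeticEncoder:
    """Schematic range coder illustrating the bin-by-bin data dependency."""

    def __init__(self):
        self.low = 0
        self.range_ = 510          # CABAC-style 9-bit range register
        self.bits = []

    def encode_bin(self, bin_val, p_lps, lps_is_one):
        # Split the current range according to the LPS probability estimate.
        r_lps = max(1, int(self.range_ * p_lps))
        if bin_val == (1 if lps_is_one else 0):   # LPS path
            self.low += self.range_ - r_lps
            self.range_ = r_lps
        else:                                     # MPS path
            self.range_ -= r_lps
        self._renormalize()

    def _renormalize(self):
        # The data-dependent loop that serializes a naive implementation:
        # each iteration emits one bit and doubles the range until it is >= 256.
        # Carry handling is deliberately omitted in this sketch.
        while self.range_ < 256:
            self.bits.append((self.low >> 9) & 1)
            self.low = (self.low << 1) & 0x3FF
            self.range_ <<= 1

enc = BinaryArithmeticEncoder()
for b in [1, 0, 0, 1, 1]:
    enc.encode_bin(b, p_lps=0.2, lps_is_one=True)
```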

Adaptive local histogram modification method for dynamic range compression of infrared images

  • Joung, Jihye
    • Journal of the Korea Society of Computer and Information / v.24 no.6 / pp.73-80 / 2019
  • In this paper, we propose an effective dynamic range compression (DRC) method for infrared images. The histogram of an infrared image has a narrow dynamic range compared with that of a visible image, so applying an effective DRC algorithm is important for high-performance infrared image analysis. The proposed algorithm divides an infrared image into overlapped blocks and calculates the Shannon entropy of each block. It then classifies each block according to its entropy value and applies an adaptive histogram modification method to each overlapped block. An intensity mapping function is built from the result of the adaptive histogram modification, which uses the standard deviation and the maximum value of the histogram of the classified blocks. Finally, a Hanning window is applied to the overlapped blocks to reduce block artifacts. Experimental results show that the proposed method achieves better dynamic range compression than previous algorithms.
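
The entropy-based block classification step can be sketched as below; the block size, stride and entropy threshold are illustrative assumptions, and the subsequent histogram modification and Hanning-window blending are omitted:

```python
import numpy as np

def block_entropy(block, bins=256):
    """Shannon entropy (in bits) of a block's intensity histogram."""
    hist, _ = np.histogram(block, bins=bins, range=(0, bins))
    p = hist.astype(float) / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def classify_blocks(image, block=64, step=32, thresh=4.0):
    """Split an image into overlapped blocks and label each one as low- or
    high-entropy; the threshold value here is illustrative, not the paper's."""
    labels = []
    h, w = image.shape
    for y in range(0, h - block + 1, step):
        for x in range(0, w - block + 1, step):
            e = block_entropy(image[y:y + block, x:x + block])
            labels.append(((y, x), "high" if e >= thresh else "low"))
    return labels
```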

A Study on the Maximum Velocity and the Surface Velocity (최대유속과 표면유속에 관한 연구)

  • Choo, Tai Ho;Je, Sung Jin
    • Proceedings of the Korea Contents Association Conference / 2006.05a / pp.351-355 / 2006
  • The purpose of this study is to develop an efficient and useful discharge measurement equation that can easily calculate discharge using only the surface velocity in channels and rivers. The research results show: (1) natural rivers have a propensity to establish and maintain an equilibrium state that corresponds to a value of the entropy parameter M; (2) the velocity distribution estimated by the method using surface velocity was compared with that of an actual survey and shows fairly close agreement between the estimated and observed values; (3) equations for calculating the discharge using the surface velocity at the location of maximum velocity in a river section were developed and shown to be fairly acceptable. An entropy-based method for determining the discharge using only the surface velocity in rivers has thus been developed. The method presented is also efficient and applicable for estimating the discharge at high flows during the flood season, which were previously very difficult or impossible to measure for technical or theoretical reasons.
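
The entropy parameter M enters through Chiu's relation between the cross-sectional mean and maximum velocities, $\bar{u}/u_{max} = e^M/(e^M-1) - 1/M$. A minimal sketch, with illustrative numbers and the simplifying assumption that the measured surface velocity at the vertical of maximum velocity approximates $u_{max}$, is:

```python
import numpy as np

def phi(M):
    """Chiu's entropy relation: ratio of cross-sectional mean velocity to
    maximum velocity as a function of the entropy parameter M."""
    return np.exp(M) / (np.exp(M) - 1.0) - 1.0 / M

# Illustrative numbers (not from the paper): surface velocity measured at the
# vertical of maximum velocity, taken here as an approximation of u_max.
u_surface = 2.4          # m/s
area = 35.0              # cross-sectional flow area, m^2
M = 2.1                  # entropy parameter assumed constant for the reach

u_mean = phi(M) * u_surface
Q = u_mean * area        # estimated discharge, m^3/s
```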

Korean Sentence Classification System Using GloVe and Maximum Entropy Model (GloVe와 최대 엔트로피 모델을 이용한 한국어 문장 분류 시스템)

  • Park, IlNam;Choi, DongHyun;Shin, MyeongCheol;Kim, EungGyun
    • Annual Conference on Human and Language Technology / 2018.10a / pp.522-526 / 2018
  • This study proposes a lightweight sentence classification system that can run on low-cost computing power within a chatbot builder system in which a large number of chatbots may be created, and introduces a method that handles out-of-vocabulary words by generating sentence vectors with the word embedding technique GloVe and using them as additional features. Evaluated on a test corpus built in-house, the proposed method achieved up to 93.06% accuracy; compared with our in-house CNN model its accuracy is 2.5% lower, but it is 25 times faster in model training, uses 6 times less memory during training, and produces a model file 302 times smaller.
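
A minimal sketch of the GloVe-plus-maximum-entropy pipeline: averaged word vectors as sentence features feeding a logistic regression (MaxEnt) classifier. The embeddings, tokens and labels below are toy stand-ins; in the paper the GloVe sentence vector is one additional feature among others:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy vectors standing in for pre-trained GloVe embeddings; in practice these
# would be loaded from a GloVe file. Tokens, labels and dimensions are illustrative.
glove = {
    "날씨": np.array([0.2, 0.1, -0.3]),
    "알려줘": np.array([0.0, 0.4, 0.1]),
    "음악": np.array([-0.5, 0.2, 0.0]),
    "틀어줘": np.array([0.1, -0.1, 0.3]),
}

def sentence_vector(tokens, dim=3):
    vecs = [glove[t] for t in tokens if t in glove]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)   # OOV-safe average

X = np.vstack([
    sentence_vector(["날씨", "알려줘"]),
    sentence_vector(["음악", "틀어줘"]),
])
y = ["weather", "music"]

# Maximum entropy classifier (logistic regression) over the GloVe features.
clf = LogisticRegression(max_iter=1000).fit(X, y)
pred = clf.predict([sentence_vector(["날씨", "틀어줘"])])
```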

Application of Generalized Maximum Entropy Estimator to the Two-way Nested Error Component Model with Ill-Posed Data

  • Cheon, Soo-Young
    • Communications for Statistical Applications and Methods / v.16 no.4 / pp.659-667 / 2009
  • Recently, Song and Cheon (2006) and Cheon and Lim (2009) developed the generalized maximum entropy (GME) estimator to solve ill-posed problems for the regression coefficients in simple panel models. The models discussed consider individual effects and a spatial autoregressive disturbance effect. However, in many applications in economics the data may contain nested groupings. This paper considers a two-way error component model with nested groupings for ill-posed data and proposes a GME estimator of the unknown parameters. The performance of this estimator is compared with that of existing methods on simulated datasets. The results indicate that the GME method performs best, in terms of estimation quality, when the data are ill-posed.
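
A compact sketch of GME estimation for a plain linear model, as a simplified stand-in for the paper's two-way nested error component setting; the support points, sample size and data are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize

# GME for y = X beta + e: each coefficient and error is a convex combination of
# fixed support points, and the probability weights maximize entropy subject to the data.
rng = np.random.default_rng(0)
n, k = 30, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 0.5]) + rng.normal(scale=0.3, size=n)

z = np.array([-5.0, 0.0, 5.0])   # support points for each coefficient
v = np.array([-1.0, 0.0, 1.0])   # support points for each error term
m = len(z)

def unpack(theta):
    p = theta[:k * m].reshape(k, m)    # coefficient probabilities (rows sum to 1)
    w = theta[k * m:].reshape(n, m)    # error probabilities (rows sum to 1)
    return p, w

def neg_entropy(theta):
    t = np.clip(theta, 1e-12, 1.0)
    return np.sum(t * np.log(t))       # minimizing this maximizes entropy

def data_constraint(theta):            # y = X (Z p) + V w must hold
    p, w = unpack(theta)
    return y - (X @ (p @ z) + w @ v)

def adding_up(theta):                  # each probability vector sums to one
    p, w = unpack(theta)
    return np.concatenate([p.sum(axis=1) - 1.0, w.sum(axis=1) - 1.0])

theta0 = np.full((k + n) * m, 1.0 / m)
res = minimize(neg_entropy, theta0, method="SLSQP",
               bounds=[(1e-9, 1.0)] * theta0.size,
               constraints=[{"type": "eq", "fun": data_constraint},
                            {"type": "eq", "fun": adding_up}],
               options={"maxiter": 300})
beta_gme = unpack(res.x)[0] @ z        # GME point estimates of the coefficients
```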

Korean Noun Phrase Identification Using Maximum Entropy Method (최대 엔트로피 모델을 이용한 한국어 명사구 추출)

  • 강인호;전수영;김길창
    • Proceedings of the Korean Society for Cognitive Science Conference / 2000.06a / pp.127-132 / 2000
  • In this paper, we study a method for extracting noun phrases, including their modifiers, by exploiting the syntactic properties of case markers. Unlike previous approaches that used a contiguous sequence of morphemes as context information for noun phrase identification, we use as context the morphemes at the beginning and end of the noun phrase and around it, i.e., the modifying part and the head noun of the noun phrase. The various kinds of context information are combined into a single probability distribution by the Maximum Entropy Principle. The proposed noun phrase extraction method first obtains noun phrase grammar rules, expressed as part-of-speech sequences, from a corpus tagged with syntactic trees. Using these rules, noun phrase candidates adjacent to case markers are extracted. Each extracted candidate is assigned a probability of being interpreted as a noun phrase, based on the probability distribution obtained from the training corpus, and for each case marker the candidate with the highest probability is selected as the associated noun phrase. In tests with the proposed model, noun phrases containing an average of 4.5 phrases could be extracted.
