• Title/Summary/Keyword: Improved entropy

Compression and Enhancement of Medical Images Using Opposition Based Harmony Search Algorithm

  • Haridoss, Rekha; Punniyakodi, Samundiswary
    • Journal of Information Processing Systems / v.15 no.2 / pp.288-304 / 2019
  • The growth of telemedicine-based wireless transmission of medical images, such as magnetic resonance imaging (MRI) and computed tomography (CT), makes image compression a necessity. Over the years, transform-based and spatial-domain compression techniques have attracted much research and achieved good results, but at the cost of high computational complexity. To overcome this, optimization techniques have been combined with existing compression methods; however, these fail to preserve the original diagnostic content and cause artifacts at high compression ratios. In this paper, histogram-based multilevel thresholding (HMT) using entropy is combined with an optimization algorithm to compress medical images effectively. Because measuring the randomness of image pixel groups is time consuming and thus ill-suited to medical applications, this paper develops an HMT-based image compression scheme that uses the opposition-based improved harmony search algorithm (OIHSA) as the optimizer together with entropy. Further, the significant information in the medical images is enhanced through the proper selection of the entropy measure and the number of thresholds used to reconstruct the compressed image.
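
As a rough illustration of the entropy-driven thresholding at the core of this approach, the sketch below scores candidate threshold pairs by the total Shannon entropy of the histogram segments (Kapur's criterion). The Kapur objective and the brute-force search are our assumptions; the paper itself searches the threshold space with OIHSA.

```python
# Sketch: entropy-based multilevel thresholding of a gray-level histogram.
# Brute force over two thresholds stands in for the OIHSA optimizer.
import numpy as np

def kapur_entropy(hist, thresholds):
    """Sum of Shannon entropies of the histogram segments."""
    p = hist / hist.sum()
    bounds = [0] + list(thresholds) + [len(p)]
    total = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        seg = p[lo:hi]
        w = seg.sum()
        if w <= 0:
            continue
        q = seg[seg > 0] / w
        total += -np.sum(q * np.log(q))
    return total

def best_two_thresholds(image):
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    best, best_t = -np.inf, None
    for t1 in range(1, 255):
        for t2 in range(t1 + 1, 256):
            h = kapur_entropy(hist, (t1, t2))
            if h > best:
                best, best_t = h, (t1, t2)
    return best_t

img = np.random.randint(0, 256, (64, 64))
print(best_two_thresholds(img))
```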

Improving a Test for Normality Based on Kullback-Leibler Discrimination Information (쿨백-라이블러 판별정보에 기반을 둔 정규성 검정의 개선)

  • Choi, Byung-Jin
    • The Korean Journal of Applied Statistics / v.20 no.1 / pp.79-89 / 2007
  • A test for normality introduced by Arizono and Ohta (1989) is based on Kullback-Leibler discrimination information. The test statistic is derived from the discrimination information estimated using the sample entropy of Vasicek (1976) and the maximum likelihood estimator of the variance. However, these estimators are biased, so it is reasonable to use unbiased estimators to estimate the discrimination information accurately. In this paper, the Arizono-Ohta test for normality is improved: the derived test statistic is based on the bias-corrected entropy estimator and the uniformly minimum variance unbiased estimator of the variance. The properties of the improved KL test are investigated, and a Monte Carlo simulation is performed for power comparison.
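
A minimal sketch of the baseline statistic this paper improves upon, assuming the standard Vasicek window estimator and the MLE of the variance (the paper replaces both with bias-corrected/UMVU versions, which are not reproduced here):

```python
# Estimated KL divergence from the fitted normal; near 0 under normality.
import numpy as np

def vasicek_entropy(x, m):
    """Vasicek (1976) spacing estimator of sample entropy."""
    x = np.sort(x)
    n = len(x)
    upper = np.minimum(np.arange(n) + m, n - 1)  # clamp order statistics
    lower = np.maximum(np.arange(n) - m, 0)
    return np.mean(np.log(n * (x[upper] - x[lower]) / (2 * m)))

def kl_normality_stat(x, m=3):
    var_mle = np.var(x)  # MLE of the variance (ddof=0)
    h = vasicek_entropy(x, m)
    # cross-entropy of the fitted normal minus the entropy estimate
    return -h + 0.5 * np.log(2 * np.pi * var_mle) + 0.5

rng = np.random.default_rng(0)
print(kl_normality_stat(rng.normal(size=100)))       # small under normality
print(kl_normality_stat(rng.exponential(size=100)))  # larger otherwise
```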

Groundwater vulnerability assessment in the southern coastal sedimentary basin of Benin using DRASTIC, modified DRASTIC, Entropy Weight DRASTIC and AVI

  • Agossou, Amos; Yang, Jeong-Seok
    • Proceedings of the Korea Water Resources Association Conference / 2021.06a / pp.152-152 / 2021
  • The importance of groundwater has long been recognized, but its potential to become contaminated as a result of human activities has been recognized only recently. Before 1980, it was thought that soils served as filters, preventing harmful substances deposited at the surface from migrating into groundwater. Today it is known that soils have a finite capacity to protect groundwater, which can be contaminated from diverse sources. Assessing aquifer vulnerability to pollution is therefore essential for groundwater protection and management and for land-use planning. In this study, we used DRASTIC and AVI to assess groundwater vulnerability to contamination. The methods were applied to the southern coastal sedimentary basin of Benin, and the DRASTIC method was modified in two different steps. First, we added a land-use parameter to include the actual pollution sources (DRASTICLcLu); second, the classic DRASTIC weights were modified using Shannon's entropy (Entropy weight DRASTIC). The reliability of the applied approaches was verified using nitrate (NO3-) concentrations and by comparing the resulting vulnerability maps with previous research in the study area and elsewhere. The validation showed that adding the land-cover/land-use parameter to classic DRASTIC helps delineate the vulnerable areas of the basin more accurately, and that the entropy-based weight modification improved the method further, since Entropy weight DRASTICLcLu showed the highest correlation with nitrate concentration in the study basin. In summary, the entropy-based weighting reduced the uncertainty of human subjectivity in assigning weights and ratings in the standard DRASTIC.
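
The Shannon entropy weighting mentioned here is a standard multi-criteria technique: parameters whose ratings vary more across the basin carry more information and receive larger weights. A minimal sketch, with an invented toy rating matrix (rows are grid cells, columns are rated parameters):

```python
# Shannon entropy weighting of criteria (as used to rescale DRASTIC weights).
import numpy as np

def entropy_weights(ratings):
    r = np.asarray(ratings, dtype=float)
    p = r / r.sum(axis=0)                      # column-wise proportions
    k = 1.0 / np.log(r.shape[0])
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log(p), 0.0)
    e = -k * plogp.sum(axis=0)                 # entropy of each parameter
    d = 1.0 - e                                # degree of diversification
    return d / d.sum()                         # normalized weights

ratings = np.array([[7, 3, 9], [5, 3, 1], [9, 3, 5], [3, 3, 7]])
print(entropy_weights(ratings))  # the uniform column gets ~0 weight
```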

An Improved Level Set Method to Image Segmentation Based on Saliency

  • Wang, Yan; Xu, Xianfa
    • Journal of Information Processing Systems / v.15 no.1 / pp.7-21 / 2019
  • In order to improve the edge segmentation of level set image segmentation and avoid the influence of the initial contour on the level set method, a saliency level set image segmentation model based on local Renyi entropy is proposed. First, the saliency map of the original image is extracted by a saliency detection algorithm, and the outline of the saliency map is used to initialize the level set. Second, the local energy and edge energy of the image are obtained using local Renyi entropy and the Canny operator, respectively, and a new adaptive weight coefficient and boundary indicator function are constructed. Finally, the local binary fitting (LBF) energy model is introduced as an external energy term. Comparative experiments are conducted on different image databases, verifying the robustness of the proposed model for segmenting images with intensity inhomogeneity and complicated edges.
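
A minimal sketch of a local Renyi entropy map of the kind underlying the local energy term; the window size, bin count, and order alpha are our assumptions, and the level set evolution with the LBF energy is not shown:

```python
# Local Renyi entropy H_a = log(sum p^a) / (1 - a) over sliding windows.
import numpy as np

def local_renyi_entropy(img, win=7, alpha=2.0, bins=16):
    h = win // 2
    pad = np.pad(img, h, mode="reflect")
    out = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = pad[i:i + win, j:j + win]
            hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
            p = hist / hist.sum()
            p = p[p > 0]
            out[i, j] = np.log(np.sum(p ** alpha)) / (1.0 - alpha)
    return out

img = np.random.randint(0, 256, (32, 32))
print(local_renyi_entropy(img).mean())
```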

Anomaly-based Alzheimer's disease detection using entropy-based probability Positron Emission Tomography images

  • Husnu Baris Baydargil; Jangsik Park; Ibrahim Furkan Ince
    • ETRI Journal / v.46 no.3 / pp.513-525 / 2024
  • Deep neural networks trained on labeled medical data face major challenges owing to the economic costs of data acquisition through expensive medical imaging devices, expert labor for data annotation, and the large datasets needed for optimal model performance. The heterogeneity of diseases such as Alzheimer's disease further complicates deep learning, because test cases may differ substantially from the training data, possibly increasing the rate of false positives. We propose a reconstruction-based self-supervised anomaly detection model to overcome these challenges. It has a dual-subnetwork encoder that enhances feature encoding, augmented by skip connections to the decoder that improve gradient flow; this encoder captures local and global features to improve image reconstruction. In addition, we introduce an entropy-based image conversion method. Extensive evaluations show that the proposed model outperforms benchmark models in anomaly detection, and in classification using its encoder. Both supervised and unsupervised models show improved performance when trained with data preprocessed using the proposed image conversion method.
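
As a heavily simplified sketch of reconstruction-based anomaly detection in general, not the paper's dual-subnetwork architecture or its entropy-based conversion: an autoencoder is fit on normal scans only, and the per-image reconstruction error serves as the anomaly score. Requires PyTorch; the network below is a toy.

```python
import torch
import torch.nn as nn

class TinyAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),
                                 nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU())
        self.dec = nn.Sequential(nn.ConvTranspose2d(16, 8, 2, stride=2), nn.ReLU(),
                                 nn.ConvTranspose2d(8, 1, 2, stride=2))

    def forward(self, x):
        return self.dec(self.enc(x))

def anomaly_score(model, x):
    """Mean squared reconstruction error per image; high = anomalous."""
    with torch.no_grad():
        return ((model(x) - x) ** 2).mean(dim=(1, 2, 3))

model = TinyAE()  # untrained here; in practice, train on normal scans only
x = torch.rand(4, 1, 64, 64)  # stand-in for PET slices
print(anomaly_score(model, x))
```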

A New Method for Selecting Thresholding on Wavelet Packet Denoising for Speech Enhancement

  • Kim, I-jae; Kim, Hyoung-soo; Koh, Kwang-hyun; Yang, Sung-il; Y. Kwon
    • The Journal of the Acoustical Society of Korea / v.20 no.2E / pp.25-29 / 2001
  • In this paper, we propose a new method for selecting the threshold in wavelet packet denoising. The conventional median-based threshold selection is not efficient because it cannot recover unvoiced signals corrupted by noise. We therefore partition a noise-corrupted speech signal into pure-noise and voiced sections using autocorrelation and entropy, both of which reflect the disorder of noise. The new method yields an improved denoising effect: unvoiced signals in particular are reconstructed very well, and the SNR is improved.
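
A minimal sketch of the section-partitioning idea: frames with weak autocorrelation at nonzero lags and high histogram entropy are treated as noise-only. The specific features and thresholds below are illustrative assumptions, not the paper's values.

```python
import numpy as np

def frame_features(frame, bins=32):
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    ac_peak = np.max(ac[1:]) / (ac[0] + 1e-12)     # correlation at nonzero lags
    hist, _ = np.histogram(frame, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    ent = -np.sum(p * np.log2(p)) / np.log2(bins)  # normalized entropy
    return ac_peak, ent

def is_noise_only(frame, ac_th=0.5, ent_th=0.7):
    ac_peak, ent = frame_features(frame)
    return ac_peak < ac_th and ent > ent_th

rng = np.random.default_rng(1)
noise = rng.normal(size=256)
voiced = np.sin(2 * np.pi * 8 * np.arange(256) / 256) + 0.1 * rng.normal(size=256)
print(is_noise_only(noise), is_noise_only(voiced))  # expected: True False
```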

Improved FCM Algorithm using Entropy-based Weight and Intercluster (엔트로피 기반의 가중치와 분포크기를 이용한 향상된 FCM 알고리즘)

  • Kwak, Hyun-Wook; Oh, Jun-Taek; Sohn, Young-Ho; Kim, Wook-Hyun
    • Journal of the Institute of Electronics Engineers of Korea SP / v.43 no.4 s.310 / pp.1-8 / 2006
  • This paper proposes an improved FCM (fuzzy c-means) algorithm for gray images that uses an entropy-based weight and intercluster information. Fuzzy clustering methods have been used extensively in image segmentation because they extract feature information of a region, and most rely on the FCM algorithm. However, FCM is sensitive to noise, as it does not include spatial information, and it cannot correctly classify pixels according to the feature-based distributions of the clusters. To solve these problems, we add a weight and intercluster information to the traditional FCM algorithm: the weight is obtained from entropy information based on the number of a cluster's neighboring pixels, and a pixel's membership is assigned using information that considers the feature-based intercluster distribution. Experiments confirmed that the proposed method is more tolerant to noise and superior to existing methods.
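
A minimal sketch of a neighborhood-entropy weight of the kind described: the entropy of the current cluster labels in each pixel's 3x3 neighborhood is low where the neighborhood is homogeneous, giving a strong spatial weight there. How this weight enters the FCM membership update is the paper's contribution and is not reproduced here.

```python
import numpy as np

def neighborhood_entropy_weight(labels, n_clusters):
    h, w = labels.shape
    pad = np.pad(labels, 1, mode="edge")
    weight = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            window = pad[i:i + 3, j:j + 3].ravel()
            counts = np.bincount(window, minlength=n_clusters)
            p = counts / counts.sum()
            p = p[p > 0]
            ent = -np.sum(p * np.log(p))
            weight[i, j] = 1.0 - ent / np.log(n_clusters)  # 1 = homogeneous
    return weight

labels = np.random.randint(0, 3, (16, 16))
print(neighborhood_entropy_weight(labels, 3).mean())
```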

A Preprocessing Algorithm for Efficient Lossless Compression of Gray Scale Images

  • Kim, Sun-Ja; Hwang, Doh-Yeun; Yoo, Gi-Hyoung; You, Kang-Soo; Kwak, Hoon-Sung
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2005.06a / pp.2485-2489 / 2005
  • This paper introduces a new preprocessing scheme that replaces the original data of gray-scale images with particularly ordered data so that lossless compression performance can be improved. As a preprocessing technique for maximizing the performance of an entropy encoder, the proposed method converts the input image into a more compressible form. Before encoding the input image stream, the preprocessor counts co-occurrence frequencies of neighboring pixel pairs; it then replaces each pair of adjacent gray values with ordered numbers based on those co-occurrence frequencies. When the reordered image is compressed with an entropy encoder, a higher compression rate can be expected because of the enhanced statistical features of the input. We show that the lossless compression rate increases by up to 37.85% when comparing preprocessed against non-preprocessed image data compressed with entropy encoders such as Huffman and arithmetic coding.
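
A minimal sketch of one plausible reading of the pair-reordering idea: each pixel is replaced by the frequency rank of its value given its left neighbor, skewing the output stream toward small numbers for any entropy coder. The handling of the first column and the transmission of the table needed for decoding are assumptions left out here.

```python
import numpy as np

def rank_transform(img):
    counts = np.zeros((256, 256), dtype=np.int64)
    rows, cols = img.shape
    for r in range(rows):
        for c in range(1, cols):
            counts[img[r, c - 1], img[r, c]] += 1
    # rank[v][g] = position of gray g in the descending-frequency order
    # for left-neighbor context v
    rank = np.argsort(np.argsort(-counts, axis=1), axis=1)
    out = img.astype(np.int64).copy()
    for r in range(rows):
        for c in range(1, cols):
            out[r, c] = rank[img[r, c - 1], img[r, c]]
    return out  # first column left unchanged (an assumption)

img = np.random.randint(0, 8, (4, 16))
print(rank_transform(img))  # frequent pairs map to small ranks
```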

The Effectiveness of CRM Approach in Improving the Profitability of Korea Professional Baseball Industry Measured by Entropy of ID3 Decision Tree Algorithm

  • Oh, Se-Kyung; Gwak, Chung-Lee; Lee, Mi-Young
    • Journal of Information Technology Applications and Management / v.18 no.3 / pp.91-110 / 2011
  • The Korea professional baseball industry has grown to take the lion's share of the domestic sports industry, but it still does not break even. The purpose of this study is to examine the financial impact of adopting the customer relationship management (CRM) approach on the profitability of the Korea professional baseball industry, using entropy as computed in the ID3 decision tree algorithm as the measuring tool. We specify the five most important factors affecting spectator satisfaction based on the previous literature, perform a survey analysis, and calculate entropy values. We predict the change in revenues under CRM by checking spectators' willingness to pay more when the conditions of each factor are improved. We find that introducing CRM pays off significantly through enhancing the 'game content' and 'game promotion' factors among the five, and that intensively managing those two factors could raise the revenues of domestic professional baseball teams to 2.4 times or 2.1 times the current level, respectively. Strikingly, this improvement in total revenues is enough to make ends meet for domestic professional baseball teams, clearly demonstrating the effectiveness of the CRM approach in improving profitability.
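
The entropy measure borrowed from ID3 reduces to Shannon entropy of the target plus the information gain of each factor; a minimal sketch with invented survey-style data:

```python
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, target):
    """Entropy reduction in the target when split by the feature's values."""
    base = entropy(target)
    rem = 0.0
    for v in np.unique(feature):
        mask = feature == v
        rem += mask.mean() * entropy(target[mask])
    return base - rem

# Toy data: satisfaction with game content vs. willingness to pay more.
content = np.array(["hi", "hi", "lo", "lo", "hi", "lo"])
pay_more = np.array([1, 1, 0, 0, 1, 1])
print(information_gain(content, pay_more))
```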

A study on the estimation of potential yield for Korean west coast fisheries using the holistic production method (HPM) (통합생산량분석법에 의한 한국 서해 어획대상 잠재생산량 추정 연구)

  • KIM, Hyun-A; SEO, Yong-Il; CHA, Hyung Kee; KANG, Hee-Joong; ZHANG, Chang-Ik
    • Journal of the Korean Society of Fisheries and Ocean Technology / v.54 no.1 / pp.38-53 / 2018
  • The purpose of this study is to estimate the potential yield (PY) for Korean west coast fisheries using the holistic production method (HPM). HPM applies surplus production models to input data of catch and standardized fishing effort. We compared the parameter estimates of four surplus production models: the Fox model, the CYP model, the ASPIC model, and the maximum entropy model. The PY estimates ranged from 174,232 metric tons (mt) for the CYP model to 238,088 mt for the maximum entropy model. The maximum entropy model yielded the highest coefficient of determination ($R^2$), the lowest root mean square error (RMSE), and the lowest Theil's U statistic (U) for Korean west coast fisheries, showing a relatively better fit of the data and indicating that it is statistically more stable and accurate than the other models; its estimate is therefore regarded as the more reasonable estimate of PY. The quality of the input data should be improved in future PY studies to obtain more reliable estimates.
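
A minimal sketch of the Fox model fit, one of the four surplus production models compared: ln(CPUE) is regressed linearly on effort, and the maximum-yield quantity (the PY-type estimate here) follows from the fitted line. The catch and effort series below are invented for illustration.

```python
import numpy as np

effort = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5])       # standardized effort (toy)
catch = np.array([90., 120., 140., 150., 152., 148.])   # landings, mt (toy)
cpue = catch / effort

b, a = np.polyfit(effort, np.log(cpue), 1)   # ln(CPUE) = a + b * effort
e_msy = -1.0 / b                             # effort at maximum yield
msy = -np.exp(a - 1.0) / b                   # Fox-model maximum sustainable yield
print(f"E_MSY = {e_msy:.2f}, MSY = {msy:.1f} mt")
```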