• Title/Summary/Keyword: Equalization Noise


Magnetic Resonance Brain Image Contrast Enhancement Using Histogram Equalization Techniques (히스토그램 평형 기법을 이용한 자기 공명 두뇌 영상 콘트라스트 향상)

  • Ullah, Zahid;Lee, Su-Hyun
    • Proceedings of the Korean Society of Computer Information Conference / 2019.01a / pp.83-86 / 2019
  • Histogram equalization is extensively used for image contrast enhancement in a variety of applications because of its effectiveness and simplicity. Image enhancement is one of the most significant and challenging tasks in image research; its aim is to improve the visual appearance of an image. Many kinds of images, such as satellite, medical, and aerial images, suffer from noise and poor contrast, so it is important to remove the noise and improve the contrast. For this purpose, we first apply a median filter to the MR image, since the median filter removes noise while preserving edges effectively. We then apply an intensity transformation function to the filtered image to increase its contrast, and afterwards apply the histogram equalization (HE) technique. Simple histogram equalization over-enhances the brightness of the image, which can cause important information to be lost. Therefore, adaptive histogram equalization (AHE) and contrast-limited adaptive histogram equalization (CLAHE) are used to enhance the image without losing information.

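The global histogram equalization step described above can be sketched in a few lines of numpy (an illustrative re-implementation, not the authors' code; the function name and the synthetic low-contrast image are my own, and CLAHE would additionally tile the image and clip each local histogram):

```python
import numpy as np

def hist_equalize(img, levels=256):
    """Global histogram equalization: remap each gray level through the
    image's normalized cumulative histogram (CDF)."""
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = hist.cumsum() / img.size                 # normalized CDF in [0, 1]
    lut = np.round(cdf * (levels - 1)).astype(np.uint8)
    return lut[img]

# Synthetic low-contrast image: intensities squeezed into [100, 140]
rng = np.random.default_rng(0)
img = rng.integers(100, 141, size=(64, 64)).astype(np.uint8)
eq = hist_equalize(img)                            # contrast stretched toward full range
```

Because the CDF reaches 1 at the brightest occupied level, the output always uses the top of the intensity range, which is exactly the over-enhancement that AHE/CLAHE temper.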

Robust Histogram Equalization Using Compensated Probability Distribution

  • Kim, Sung-Tak;Kim, Hoi-Rin
    • MALSORI / v.55 / pp.131-142 / 2005
  • A mismatch between the training and test conditions often causes a drastic decrease in the performance of speech recognition systems. In this paper, non-linear transformation techniques based on histogram equalization in the acoustic feature space are studied for reducing this mismatch. The purpose of histogram equalization (HEQ) is to convert the probability distribution of test speech into the probability distribution of training speech. Conventional histogram equalization methods consider only the probability distribution of the test speech, but for noise-corrupted test speech that distribution is itself distorted. A transformation function obtained from such a distorted distribution may mis-transform the feature vectors, degrading the performance of histogram equalization. This paper therefore proposes a method of calculating a noise-removed probability distribution, based on the assumption that the CDF of noisy speech feature vectors consists of a speech-feature component and a noise-feature component, and this compensated probability distribution is used in the HEQ process. In the AURORA-2 framework, the proposed method reduced the error rate by over 44% in the clean training condition compared to the baseline system; in the multi-condition training case the proposed methods also outperform the baseline.

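The core of HEQ-based feature compensation, mapping each test feature through its empirical CDF into the training distribution, can be sketched as follows (a minimal order-statistic illustration with synthetic Gaussian features of my own choosing, not the paper's compensated-distribution method):

```python
import numpy as np

def heq_map(test_feat, train_feat):
    """Histogram equalization for feature compensation: replace each test
    value by the training-distribution quantile at its empirical test CDF."""
    order = np.argsort(test_feat)
    ranks = np.empty_like(order)
    ranks[order] = np.arange(len(test_feat))
    cdf = (ranks + 0.5) / len(test_feat)       # empirical CDF of the test data
    return np.quantile(train_feat, cdf)        # inverse CDF of the training data

rng = np.random.default_rng(1)
train = rng.normal(0.0, 1.0, 5000)             # "clean" training features
test = 0.5 * rng.normal(0.0, 1.0, 2000) + 2.0  # shifted/scaled "noisy" test features
mapped = heq_map(test, train)                  # distribution pulled back toward training
```

After mapping, the test features follow the training distribution regardless of the mismatch, which is precisely what makes HEQ attractive and what a distorted test CDF (the problem the paper attacks) breaks.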

Mixture Filtering Approaches to Blind Equalization Based on Estimation of Time-Varying and Multi-Path Channels

  • Lim, Jaechan
    • Journal of Communications and Networks / v.18 no.1 / pp.8-18 / 2016
  • In this paper, we propose a number of blind equalization approaches for time-varying and multi-path channels. The approaches employ a cost-reference particle filter (CRPF) as the symbol estimator, combined with either the least mean squares (LMS) algorithm, the recursive least squares (RLS) algorithm, or an $H_{\infty}$ filter (HF) as the channel estimator, the two being jointly employed in the strategy of "Rao-Blackwellization," also called "mixture filtering." The novel feature of the proposed approaches is that blind equalization is performed based on direct channel estimation with unknown noise statistics of the received signals and the channel state system, whereas the conventional method does not estimate the channel directly and the comparable Kalman mixture filtering approach requires the noise information to be known. Simulation results show that the proposed approaches estimate the transmitted symbols and the time-varying channel very effectively, and outperform the previously proposed approach that requires noise information.
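As one concrete instance of the channel-estimator component, an LMS update for an FIR channel looks like the sketch below (a standalone illustration on an assumed BPSK-like signal; in the paper the estimator runs jointly with the CRPF symbol estimator, which is not reproduced here):

```python
import numpy as np

def lms_channel_estimate(x, y, n_taps=3, mu=0.05):
    """LMS estimate of an FIR channel from known input x and noisy output y."""
    h = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]      # most recent samples first
        e = y[n] - h @ u                        # instantaneous prediction error
        h = h + mu * e * u                      # stochastic-gradient update
    return h

rng = np.random.default_rng(2)
true_h = np.array([1.0, 0.5, -0.2])             # assumed 3-tap channel
x = rng.choice([-1.0, 1.0], size=4000)          # BPSK-like training input
y = np.convolve(x, true_h)[:len(x)] + 0.01 * rng.normal(size=len(x))
h_hat = lms_channel_estimate(x, y)              # converges toward true_h
```

In the blind setting of the paper, the known input `x` is replaced by the CRPF's symbol estimates, and the RLS or $H_{\infty}$ variants trade complexity for faster or more robust tracking of time-varying taps.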

A study on 1 & 2 dimensional minimum mean-squared-error equalization for digital holographic data storage system (디지털 홀로그래픽 데이터 저장 시스템을 위한 1차원 및 2차원 최소 평균-제곱-에러 등화에 관한 연구)

  • 최안식;전영식;정종래;백운식
    • Korean Journal of Optics and Photonics / v.13 no.6 / pp.486-492 / 2002
  • In this paper, we present a 1- and 2-dimensional minimum mean-squared-error (MMSE) equalization scheme for a digital holographic data storage system, aimed at improving the bit-error rate (BER) and mitigating the inter-symbol interference (ISI) generated during the data storage and retrieval processes. For ten data pages retrieved from the holographic storage system, we showed experimentally that both BER and signal-to-noise ratio (SNR) were improved by adopting MMSE equalization.
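A 1-D MMSE linear equalizer for a known ISI channel reduces to a regularized least-squares (Wiener) solve; the sketch below uses an assumed channel, tap count, and SNR purely for illustration and is not the paper's setup:

```python
import numpy as np

def mmse_equalizer(h, snr_db, n_taps=11):
    """MMSE (Wiener) linear equalizer for a known FIR channel h:
    w = (H^T H + (1/SNR) I)^{-1} H^T d, with d a delayed unit impulse."""
    noise_var = 10 ** (-snr_db / 10)
    # Convolution matrix: (n_taps + len(h) - 1) x n_taps
    H = np.zeros((n_taps + len(h) - 1, n_taps))
    for i in range(n_taps):
        H[i:i + len(h), i] = h
    d = np.zeros(H.shape[0])
    d[H.shape[0] // 2] = 1.0                   # target: impulse at the center delay
    return np.linalg.solve(H.T @ H + noise_var * np.eye(n_taps), H.T @ d)

h = np.array([0.2, 1.0, 0.3])                  # assumed ISI (blur) channel
w = mmse_equalizer(h, snr_db=20)
combined = np.convolve(h, w)                   # approximates a delayed impulse
```

The 2-D case in the paper is the same idea with a pixel neighborhood instead of a tap-delay line: the equalizer kernel is solved from the 2-D point-spread function of the storage/retrieval optics.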

Lagged Cross-Correlation of Probability Density Functions and Application to Blind Equalization

  • Kim, Namyong;Kwon, Ki-Hyeon;You, Young-Hwan
    • Journal of Communications and Networks / v.14 no.5 / pp.540-545 / 2012
  • In this paper, the lagged cross-correlation of two probability density functions constructed by kernel density estimation is proposed, and adaptive filtering algorithms for supervised and unsupervised training that maximize the proposed function are introduced. Simulation results for blind equalization in multipath channels with impulsive and slowly varying direct-current (DC) bias noise show that the Gaussian kernel of the proposed algorithm cuts out the large errors caused by impulsive noise, and that the effect of the DC bias noise on the output can be effectively controlled by the lag $\tau$ intrinsically embedded in the proposed function.
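Constructing a PDF by Gaussian kernel density estimation, the building block of the proposed lagged cross-correlation, can be sketched as follows (grid, bandwidth, and data are my own illustrative choices):

```python
import numpy as np

def kde(x_grid, samples, sigma=0.3):
    """Parzen/kernel density estimate with a Gaussian kernel of width sigma."""
    d = x_grid[:, None] - samples[None, :]
    return np.exp(-d**2 / (2 * sigma**2)).mean(axis=1) / (sigma * np.sqrt(2 * np.pi))

rng = np.random.default_rng(5)
samples = rng.normal(0.0, 1.0, 2000)           # e.g. equalizer output samples
grid = np.linspace(-4, 4, 81)                  # evaluation grid, step 0.1
p_hat = kde(grid, samples)                     # smooth density estimate
```

The smoothness injected by the Gaussian kernel is what bounds the contribution of any single (possibly impulsive) sample to the correlation criterion.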

Evaluation and Comparison of Signal to Noise Ratio According to Histogram Equalization of Heart Shadow on Chest Image (흉부영상에서 평활화 시 심장저부 음영의 신호 대 잡음비 비교평가)

  • Kim, Ki-Won;Lee, Eul-Kyu;Jeong, Hoi-Woun;Son, Jin-Hyun;Kang, Byung-Sam;Kim, Hyun-Soo;Min, Jung-Whan
    • Journal of radiological science and technology / v.40 no.2 / pp.197-203 / 2017
  • The purpose of this study was to measure the signal-to-noise ratio (SNR) in a region of interest (ROI) over the heart shadow in chest images, with and without histogram equalization. We examined chest images of 87 patients at a university-affiliated hospital in Seoul, Korea; the SNR of each image was computed using ImageJ. Socio-demographic variables, per-image SNR, and 95% confidence intervals for differences in mean SNR were analyzed, and differences in SNR across equalization conditions were tested with an ANOVA in SPSS Statistics 21 at the 95% significance level (p < 0.05). The SNR distributions ranked, in order of quality: original chest image, original chest image heart shadow, equalized chest image, and equalized chest image heart shadow (p < 0.001). In conclusion, quantitative evaluation of the heart shadow on chest images can be used as an adjunct to histogram-equalized chest imaging.
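An ROI-based SNR measurement of the kind described above can be sketched as follows (the mean-over-standard-deviation definition is one common convention and an assumption on my part, as is the synthetic image; the paper used ImageJ on real radiographs):

```python
import numpy as np

def roi_snr(img, roi):
    """SNR of a rectangular region of interest: mean signal divided by the
    standard deviation within the same region."""
    r0, r1, c0, c1 = roi
    patch = img[r0:r1, c0:c1].astype(float)
    return patch.mean() / patch.std()

rng = np.random.default_rng(3)
img = 120.0 + 5.0 * rng.normal(size=(128, 128))   # synthetic uniform "shadow" region
snr = roi_snr(img, (32, 96, 32, 96))               # roughly 120 / 5 = 24
```

Comparing this quantity over the same anatomical ROI before and after equalization is exactly the study's measurement protocol.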

Performance of Noise-Predictive Turbo Equalization for PMR Channel (수직자기기록 채널에서 잡음 예측 터보 등화기의 성능)

  • Kim, Jin-Young;Lee, Jae-Jin
    • The Journal of Korean Institute of Communications and Information Sciences / v.33 no.10C / pp.758-763 / 2008
  • We introduce a noise-predictive turbo equalization scheme that uses a noise filter in the perpendicular magnetic recording (PMR) channel. The noise filter mitigates the colored noise in the high-density PMR channel. The channel detectors used are the SOVA (Soft Output Viterbi Algorithm) and the BCJR algorithm proposed by Bahl et al., and the outer decoder is an LDPC (Low-Density Parity-Check) code implemented with the sum-product algorithm. Two LDPC codes are tested: a 0.5 Kbyte (4336,4096) code with code rate 0.94 and a 1 Kbyte (8432,8192) code with code rate 0.97.
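The noise-predictive idea, fitting a linear predictor to the colored noise and subtracting the prediction to whiten it, can be sketched as follows (a standalone least-squares illustration on synthetic AR(1) noise of my own construction, not the detector-embedded filter of the paper):

```python
import numpy as np

def fit_noise_predictor(noise, n_taps=2):
    """Least-squares one-step linear predictor for colored noise samples."""
    X = np.column_stack(
        [noise[n_taps - 1 - k:len(noise) - 1 - k] for k in range(n_taps)])
    y = noise[n_taps:]
    p, *_ = np.linalg.lstsq(X, y, rcond=None)
    return p

rng = np.random.default_rng(6)
w = rng.normal(size=10000)
colored = np.zeros_like(w)
for n in range(1, len(w)):                      # AR(1) colored noise, coefficient 0.8
    colored[n] = 0.8 * colored[n - 1] + w[n]
p = fit_noise_predictor(colored)                # recovers roughly [0.8, 0.0]
pred = np.column_stack([colored[1:-1], colored[:-2]]) @ p
residual = colored[2:] - pred                   # whitened: variance drops markedly
```

Inside SOVA/BCJR, the same prediction is folded into the branch metrics so the detector effectively sees white rather than colored noise.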

Feature Compensation Combining SNR-Dependent Feature Reconstruction and Class Histogram Equalization

  • Suh, Young-Joo;Kim, Hoi-Rin
    • ETRI Journal / v.30 no.5 / pp.753-755 / 2008
  • In this letter, we propose a new histogram equalization technique for feature compensation in speech recognition under noisy environments. The proposed approach combines a signal-to-noise-ratio-dependent feature reconstruction method and the class histogram equalization technique to effectively reduce the acoustic mismatch present in noisy speech features. Experimental results from the Aurora 2 task confirm the superiority of the proposed approach for acoustic feature compensation.


Blind linear/nonlinear equalization for heavy noise-corrupted channels

  • Han, Soo-Whan;Park, Sung-Dae
    • Journal of information and communication convergence engineering / v.7 no.3 / pp.383-391 / 2009
  • In this paper, blind equalization using a modified fuzzy C-means algorithm with Gaussian weights (MFCM_GW) is applied to heavily noise-corrupted channels. The proposed algorithm can handle both linear and nonlinear channels because it searches for the optimal channel output states instead of estimating the channel parameters directly. In contrast to the common Euclidean distance used in fuzzy C-means (FCM), its search procedure exploits a Bayesian likelihood fitness function and a Gaussian-weighted partition matrix. The channel states selected by MFCM_GW remain close to the optimal set even when the channel is heavily corrupted by additive white Gaussian noise (AWGN). Simulation studies demonstrate that the proposed method outperforms existing genetic algorithm (GA) and conventional FCM based methods in terms of accuracy and speed.
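For reference, the baseline fuzzy C-means that MFCM_GW modifies can be sketched in a few lines (plain Euclidean FCM on scalar channel outputs with an assumed four-state constellation; the Bayesian-likelihood fitness and Gaussian weighting of the paper are not reproduced):

```python
import numpy as np

def fcm(data, n_clusters, m=2.0, n_iter=50):
    """Plain fuzzy C-means on 1-D data with Euclidean distance."""
    # Spread initial centers across the data range via quantiles
    centers = np.quantile(data, np.linspace(0.05, 0.95, n_clusters))
    for _ in range(n_iter):
        d = np.abs(data[None, :] - centers[:, None]) + 1e-12
        u = d ** (-2.0 / (m - 1.0))            # unnormalized fuzzy memberships
        u /= u.sum(axis=0)                      # each sample's memberships sum to 1
        um = u ** m
        centers = (um @ data) / um.sum(axis=1)  # membership-weighted centers
    return np.sort(centers)

# Noisy scalar channel outputs clustered around four states (e.g. 2-tap BPSK channel)
rng = np.random.default_rng(4)
states = np.array([-1.5, -0.5, 0.5, 1.5])
samples = np.concatenate([s + 0.05 * rng.normal(size=200) for s in states])
est = fcm(samples, 4)                           # recovers the channel output states
```

Recovering these output states (rather than the channel taps) is what lets the clustering approach cover nonlinear channels as well.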

A PDF-distance minimization algorithm for blind equalization for underwater communication channels with multipath and impulsive noise (다중경로와 임펄스 잡음이 있는 수중 통신 채널의 블라인드 등화를 위한 확률분포-거리 최소화 알고리듬)

  • Kim, Nam-Yong
    • Journal of the Korea Institute of Information and Communication Engineering / v.15 no.2 / pp.299-306 / 2011
  • In this paper, a blind adaptive equalization algorithm based on PDF-distance minimization and a set of delta functions is introduced, and its superior robustness against the impulsive noise and multipath characteristics of underwater communication channels is demonstrated. The conventional CMA based on MSE has been shown to be incapable of coping with impulsive noise, and the correntropy blind algorithm has also been shown to yield unsatisfactory performance for this task. In contrast, theoretical and simulation analysis show that the proposed algorithm deals effectively with both the impulsive noise and the multipath characteristics of underwater communication channels.
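Why a delta-function/PDF-matching criterion resists impulsive noise can be seen from its per-sample gradient, sketched here in a simplified single-sample form (the kernel width, constellation, and function name are my own assumptions, not the paper's exact algorithm):

```python
import numpy as np

def pdf_distance_gradient(y, u, symbols, sigma=0.5):
    """Gaussian-kernel gradient pulling equalizer output y toward the
    delta functions placed at the constellation points `symbols`."""
    diffs = y - symbols
    g = (diffs * np.exp(-diffs**2 / (2 * sigma**2))).sum()
    g /= sigma**2 * len(symbols)
    return g * u            # chain rule: output depends linearly on the tap vector

symbols = np.array([-1.0, 1.0])                       # BPSK constellation
u = np.ones(3)                                        # dummy equalizer input vector
g_near = pdf_distance_gradient(1.1, u, symbols)       # near a symbol: modest update
g_impulse = pdf_distance_gradient(25.0, u, symbols)   # impulsive outlier: update vanishes
```

Unlike the MSE gradient of CMA, which grows with the error, this Gaussian-weighted gradient decays to zero for large deviations, so impulsive-noise samples barely perturb the taps.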