• Title/Summary/Keyword: Kernel Density Function

Multi-focus Image Fusion Technique Based on Parzen-windows Estimates (Parzen 윈도우 추정에 기반한 다중 초점 이미지 융합 기법)

  • Atole, Ronnel R.;Park, Daechul
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.8 no.4
    • /
    • pp.75-88
    • /
    • 2008
  • This paper presents a spatial-level nonparametric multi-focus image fusion technique based on kernel estimates of the class-conditional probability density functions underlying the input image blocks. Image fusion is approached as a classification task whose posterior class probabilities, $P(w_i \mid B_i^{kl})$, are calculated from likelihood density functions estimated from the training patterns. For each of the $C$ input images $I_i$, the proposed method defines a class $w_i$ and forms the fused image $Z(k,l)$ from a decision map represented by a set of $P \times Q$ blocks $B_i^{kl}$ whose features maximize the discriminant function derived from the Bayesian decision principle. Performance of the proposed technique is evaluated in terms of RMSE and Mutual Information (MI) as output quality measures. The width of the kernel function, $\sigma$, was varied, and different kernels and block sizes were applied in the performance evaluation. The proposed scheme was tested with $C = 2$ and $C = 3$ input images, and the results exhibited good performance.
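
As a rough illustration of the Parzen-window idea in this abstract, the sketch below estimates 1-D class-conditional densities with a Gaussian kernel and applies the Bayes decision rule; the block feature values, sample data, and kernel width are hypothetical placeholders, not the authors' data or code.

```python
import numpy as np

def parzen_density(x, samples, sigma):
    """Gaussian Parzen-window estimate of p(x) from 1-D training samples."""
    diffs = (x - samples) / sigma
    return np.mean(np.exp(-0.5 * diffs**2) / (sigma * np.sqrt(2.0 * np.pi)))

def bayes_decide(x, class_samples, priors, sigma=0.5):
    """Pick the class w_i maximizing p(x | w_i) * P(w_i)."""
    scores = [parzen_density(x, s, sigma) * p for s, p in zip(class_samples, priors)]
    return int(np.argmax(scores))

# Toy example: two classes ("focused" vs "blurred") described by a scalar block feature
rng = np.random.default_rng(0)
focused = rng.normal(5.0, 1.0, 200)   # hypothetical feature values for focused blocks
blurred = rng.normal(2.0, 1.0, 200)   # hypothetical feature values for blurred blocks
print(bayes_decide(4.6, [focused, blurred], priors=[0.5, 0.5]))  # -> 0 (focused)
```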

A Study on the Characteristics of Cavitation Bubble Generation According to Venturi Nozzle Outlet Shape and Operating Conditions (벤츄리 노즐 출구 형상과 작동 조건에 따른 캐비테이션 기포 발생 특성 연구)

  • Changhoon Oh;Joon Hyun Kim;Jaeyong Sung
    • Journal of the Korean Society of Visualization
    • /
    • v.21 no.1
    • /
    • pp.94-102
    • /
    • 2023
  • Three design parameters were considered in this study: outlet nozzle angle (30°, 60°, 80°), neck length (1 mm, 3 mm), and flow rate (0.5, 0.6, 0.7, 0.8 L/min). A neck diameter of 0.5 mm induced cavitating flow in the venturi nozzle. A transparent secondary chamber was connected downstream of the nozzle exit to increase bubble duration and improve shape visibility. Bubble sizes were estimated by applying a Gaussian kernel function to identify bubbles in the acquired images. The bubble-size data were used to obtain the Sauter mean diameter and the probability density function for specific bubble-state conditions, and the degree of bubble generation as a function of bubble size was compared for each design variable. The bubble diameter increased as the flow rate increased, and the frequency of bubble generation was highest around 20 μm. For the same neck length, the smaller the cavitation number (CV), the larger the average bubble diameter. The generation frequency of smaller bubbles can be increased by this cavitation method by changing the outlet angle and neck length. However, if the flow rate is too large, the average bubble diameter tends to increase, so an appropriate flow rate should be selected.
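
A minimal sketch of the two quantities named in this abstract, the Sauter mean diameter and a Gaussian-kernel density of bubble sizes; the lognormal sample stands in for diameters extracted from images and is purely hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical bubble diameters (micrometres) extracted from the images
d = np.random.default_rng(1).lognormal(mean=3.0, sigma=0.4, size=500)

# Sauter mean diameter: D32 = sum(d^3) / sum(d^2)
d32 = np.sum(d**3) / np.sum(d**2)

# Gaussian-kernel estimate of the bubble-size probability density function
kde = stats.gaussian_kde(d)
grid = np.linspace(d.min(), d.max(), 200)
pdf = kde(grid)

print(f"Sauter mean diameter: {d32:.1f} um, modal size: {grid[pdf.argmax()]:.1f} um")
```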

A Study on Target Standardized Precipitation Index in Korea (한반도 목표 표준강수지수(SPI) 산정에 관한 연구)

  • Kim, Min-Seok;Moon, Young-Il
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.34 no.4
    • /
    • pp.1117-1123
    • /
    • 2014
  • Water is essential to plants, animals, and humans, and drought, a state of water shortage, is globally one of the most feared disasters. This study calculated a target standardized precipitation index on a regional basis for judging and preparing for drought in consideration of regional characteristics. First, SPI(3) values were calculated from more than 30 years of monthly rainfall data at 88 stations. Parametric and nonparametric frequency analyses, the latter using a boundary kernel density function, were then carried out on the annual minimum series extracted from the calculated SPI(3). A target return period of 30 years was set, and the target SPI was analyzed by region using Thiessen polygons based on the nonparametric frequency results. The analysis showed that drought severity and frequency differed markedly by region. The results of this study will contribute to national water resources planning and disaster prevention measures by providing a data foundation for judging and preparing for drought in Korea.
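
The SPI(3) computation referenced in the abstract can be sketched roughly as follows: 3-month rainfall accumulations are fitted with a gamma distribution and mapped to standard-normal quantiles. A full SPI implementation fits each calendar month separately, and the boundary-kernel frequency analysis is not reproduced here; the rainfall series is synthetic.

```python
import numpy as np
from scipy import stats

def spi3(monthly_precip):
    """Rough SPI(3): 3-month rolling totals -> gamma CDF -> standard-normal quantile."""
    p = np.asarray(monthly_precip, dtype=float)
    rolled = np.convolve(p, np.ones(3), mode="valid")      # 3-month accumulations
    a, loc, scale = stats.gamma.fit(rolled, floc=0)        # gamma fit with zero location
    cdf = stats.gamma.cdf(rolled, a, loc=loc, scale=scale)
    return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))    # SPI values

# Hypothetical 30 years of monthly rainfall (mm)
rain = np.random.default_rng(2).gamma(shape=2.0, scale=60.0, size=360)
print(spi3(rain)[:5])
```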

Long-term Wave Monitoring and Analysis Off the Coast of Sokcho (속초 연안의 장기 파랑관측 및 분석)

  • Jeong, Weon Mu;Ryu, Kyung-Ho;Cho, Hongyeon
    • Journal of Korean Society of Coastal and Ocean Engineers
    • /
    • v.27 no.4
    • /
    • pp.274-279
    • /
    • 2015
  • Wave data acquired over eleven years near Sokcho Harbor, located in the central area of the east coast, were analyzed using the spectral method and the wave-by-wave analysis method, and their major wave characteristics were examined. Significant wave heights were found to be high in winter and low in summer, and peak periods were likewise long in winter and short in summer. The maximum significant wave height observed was 8.95 m, caused by an East Sea twister. The distributions of both the significant wave heights and the peak periods were fitted better by the Kernel distribution function than by the Generalized Gamma and Generalized Extreme Value distribution functions. The wave data were also compiled into wave-height intervals for each month, and the cumulative occurrence rates of wave heights were calculated for use in design and construction works in the nearby coastal area.
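
A hedged sketch of the distribution comparison described in this abstract, fitting a kernel density, a Generalized Extreme Value, and a Generalized Gamma distribution to synthetic wave heights and scoring them against an empirical histogram; the data and the histogram-RMSE criterion are assumptions, not the paper's procedure.

```python
import numpy as np
from scipy import stats

# Hypothetical significant wave heights (m); real values would come from the records
hs = np.random.default_rng(3).weibull(1.6, size=2000) * 1.2

# Empirical density from a histogram, used here only as a rough reference
hist, edges = np.histogram(hs, bins=30, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
rmse = lambda fitted: np.sqrt(np.mean((fitted - hist) ** 2))

print("Kernel :", rmse(stats.gaussian_kde(hs)(centers)))
print("GEV    :", rmse(stats.genextreme.pdf(centers, *stats.genextreme.fit(hs))))
print("GGamma :", rmse(stats.gengamma.pdf(centers, *stats.gengamma.fit(hs, floc=0))))
```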

Analysis of the Long-term Wave Characteristics off the Coast of Daejin (대진 연안의 장기 파랑 특성 분석)

  • Jeong, Weon Mu;Cho, Hongyeon;Baek, Wondae
    • Journal of Korean Society of Coastal and Ocean Engineers
    • /
    • v.27 no.2
    • /
    • pp.142-147
    • /
    • 2015
  • Wave data acquired over seven years near Daejin Harbor, located in the north-central area of the east coast, were analyzed using the spectral method and the wave-by-wave analysis method, and their major wave characteristics were examined. Significant wave heights were found to be high in winter and low in summer, and peak periods were likewise long in winter and short in summer. The maximum significant wave height observed was 6.59 m, caused by Typhoon No. 1216 (SANBA). The distributions of both the significant wave heights and the peak periods were reproduced better by the Kernel distribution function than by the Generalized Gamma and Generalized Extreme Value distribution functions. In addition, the wave data were subdivided by month and wave-height level, and the cumulative occurrence rates were presented to aid design and construction works in the nearby coastal area.
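
The monthly wave-height compilation mentioned at the end of this abstract might look like the following: heights are binned into 0.5 m intervals per month and cumulative occurrence rates are computed; all numbers are synthetic.

```python
import numpy as np

# Hypothetical (month, significant wave height) observations
rng = np.random.default_rng(4)
months = rng.integers(1, 13, size=5000)
hs = rng.weibull(1.6, size=5000) * (1.0 + 0.5 * np.isin(months, [12, 1, 2]))  # rougher winters

bins = np.arange(0.0, 5.5, 0.5)                     # 0.5 m wave-height intervals
for m in range(1, 13):
    counts, _ = np.histogram(hs[months == m], bins=bins)
    cum_rate = np.cumsum(counts) / counts.sum()     # cumulative occurrence rate per month
    print(m, np.round(cum_rate, 2))
```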

Problems Occurred with Histogram and a Resolution

  • Park, Byeong Uk;Park, Hong Nae;Song, Moon Sup;Song, Jae Kee
    • Journal of Korean Society for Quality Management
    • /
    • v.18 no.2
    • /
    • pp.127-133
    • /
    • 1990
  • In this article, several problems inherent in the histogram estimate of an unknown probability density function are discussed, including the so-called sharp corners and the bin edge effect. A resolution of these problems with the histogram is then presented: the resulting estimate is the kernel density estimate, which is the most widely used by data analysts. One of the most recent and reliable data-based choices of the scale factor (bandwidth) of the estimate, which is known to be the most crucial ingredient, is also discussed.
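
A small sketch contrasting the two estimators discussed in the paper, a histogram and a kernel density estimate with a data-based bandwidth; Silverman's rule is used here only as a stand-in for the bandwidth selector the paper advocates.

```python
import numpy as np
from scipy import stats

x = np.random.default_rng(5).normal(size=400)

# Histogram estimate: depends on the bin origin and width, hence "sharp corners"
counts, edges = np.histogram(x, bins=20, density=True)

# Kernel density estimate with a data-based bandwidth (Silverman's rule here)
kde = stats.gaussian_kde(x, bw_method="silverman")
grid = np.linspace(x.min(), x.max(), 200)
print(kde(grid)[:5])
```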

Convergence Properties of a Spectral Density Estimator

  • Gyeong Hye Shin;Hae Kyung Kim
    • Communications for Statistical Applications and Methods
    • /
    • v.3 no.3
    • /
    • pp.271-282
    • /
    • 1996
  • This paper deals with the estimation of the power spectral density function of a time series. A kernel estimator based on local averaging is defined, and the rates of convergence of the pointwise, $L_2$-norm, and $L_\infty$-norm errors of the estimator are investigated for kernels satisfying suitable assumptions. Under appropriate regularity conditions, it is shown that the optimal rate of convergence is $N^{-r}$ in both the pointwise and $L_2$-norm senses, while $N^{r-1}(\log N)^{-r}$ is the optimal rate in the $L_\infty$-norm. Some examples are given to illustrate the application of the main results.
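
One concrete form of a local-average (kernel) spectral density estimator is a kernel-smoothed periodogram, sketched below under assumptions of our own (Gaussian weights, an AR(1) test series); it is not necessarily the estimator analyzed in the paper.

```python
import numpy as np

def smoothed_periodogram(x, bandwidth=0.05):
    """Kernel (local-average) spectral density estimate via a smoothed periodogram."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    freqs = np.fft.rfftfreq(n)                          # frequencies in cycles/sample
    pgram = np.abs(np.fft.rfft(x))**2 / (2 * np.pi * n)  # raw periodogram
    est = np.empty_like(pgram)
    for j, f in enumerate(freqs):
        w = np.exp(-0.5 * ((freqs - f) / bandwidth)**2)  # Gaussian kernel weights
        est[j] = np.sum(w * pgram) / np.sum(w)           # local average of the periodogram
    return freqs, est

# AR(1) example: the spectral density should be largest at low frequencies
rng = np.random.default_rng(6)
x = np.zeros(1024)
for t in range(1, 1024):
    x[t] = 0.7 * x[t - 1] + rng.normal()
f, s = smoothed_periodogram(x)
print(f[:3], s[:3])
```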

Bootstrap methods for long-memory processes: a review

  • Kim, Young Min;Kim, Yongku
    • Communications for Statistical Applications and Methods
    • /
    • v.24 no.1
    • /
    • pp.1-13
    • /
    • 2017
  • This manuscript summarizes advances in bootstrap methods for long-range dependent time series data. The stationary linear long-memory process, which is the target process for the time-domain and frequency-domain bootstrap methodologies in this review, is briefly described. We illustrate time-domain bootstraps under long-range dependence, namely moving and non-overlapping block bootstraps and the autoregressive-sieve bootstrap. In particular, block bootstrap methodologies need an adjustment factor when estimating the distribution of the sample mean, in contrast to their application to weakly dependent time series, whereas the autoregressive-sieve bootstrap needs no further modification to be applied to long memory. The frequency-domain bootstrap for Whittle estimation is presented using parametric spectral density estimates, because there is currently no nonparametric, kernel-based spectral density estimation method for the linear long-range dependent time process.
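
A minimal moving-block bootstrap of the sample mean, as one of the time-domain methods the review covers; the block length, the AR(1) stand-in for a long-memory series, and the omission of the long-range-dependence adjustment factor are all simplifications of our own.

```python
import numpy as np

def moving_block_bootstrap(x, block_len, n_boot=1000, rng=None):
    """Moving-block bootstrap replicates of the sample mean."""
    rng = rng or np.random.default_rng()
    x = np.asarray(x, float)
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    starts_max = n - block_len + 1
    means = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, starts_max, size=n_blocks)
        resampled = np.concatenate([x[s:s + block_len] for s in starts])[:n]
        means[b] = resampled.mean()
    return means

# Toy dependent series (an AR(1) stand-in; a FARIMA simulator would be needed
# to reproduce genuine long-range dependence)
rng = np.random.default_rng(7)
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = 0.9 * x[t - 1] + rng.normal()
reps = moving_block_bootstrap(x, block_len=50, rng=rng)
print(reps.std())   # bootstrap SD of the sample mean (before any LRD adjustment factor)
```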

Blind Signal Processing for Impulsive Noise Channels

  • Kim, Nam-Yong;Byun, Hyung-Gi;You, Young-Hwan;Kwon, Ki-Hyeon
    • Journal of Communications and Networks
    • /
    • v.14 no.1
    • /
    • pp.27-33
    • /
    • 2012
  • In this paper, a new blind signal processing scheme for equalization in fading and impulsive-noise channel environments is introduced, based on a probability density function matching method and a set of Dirac delta functions. The Gaussian kernel of the proposed blind algorithm has the effect of cutting out outliers in the difference between the desired level values and the impulse-corrupted outputs. The proposed algorithm is also relatively less sensitive to the channel eigenvalue ratio and has reduced computational complexity compared with the recently introduced correntropy algorithm. Accordingly, simulation results show that the proposed blind algorithm yields superior performance in multipath communication channels corrupted with impulsive noise.
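
The outlier-cutting effect of the Gaussian kernel mentioned in this abstract can be illustrated loosely as follows; the BPSK levels, the kernel width, and the similarity measure are illustrative assumptions, and this is not the authors' equalization algorithm.

```python
import numpy as np

def gaussian_kernel(u, sigma=1.0):
    return np.exp(-0.5 * (u / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

levels = np.array([-1.0, 1.0])                       # hypothetical BPSK desired levels
rng = np.random.default_rng(8)
clean = rng.choice(levels, 200) + 0.1 * rng.normal(size=200)
noisy = clean.copy()
noisy[::20] += 10.0                                   # inject impulsive outliers

def mse_to_levels(y):
    """Quadratic cost: dominated by the few impulsive outliers."""
    return np.mean(np.min((y[:, None] - levels[None, :]) ** 2, axis=1))

def kernel_match(y, sigma=0.5):
    """Kernel similarity to the nearest desired level; outliers lie far from
    every level, so the Gaussian kernel drives their contribution toward zero."""
    return np.mean(np.max(gaussian_kernel(y[:, None] - levels[None, :], sigma), axis=1))

print("MSE    clean vs impulsive:", mse_to_levels(clean), mse_to_levels(noisy))
print("Kernel clean vs impulsive:", kernel_match(clean), kernel_match(noisy))
```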

Uncertainty analysis of containment dose rate for core damage assessment in nuclear power plants

  • Wu, Guohua;Tong, Jiejuan;Gao, Yan;Zhang, Liguo;Zhao, Yunfei
    • Nuclear Engineering and Technology
    • /
    • v.50 no.5
    • /
    • pp.673-682
    • /
    • 2018
  • One of the most widely used methods to estimate core damage during a nuclear power plant accident is containment radiation measurement. The evolution of severe accidents is extremely complex, leading to uncertainty in the containment dose rate (CDR) and making it difficult to accurately determine core damage. This study proposes an uncertainty analysis of the CDR for core damage assessment. First, based on source term estimation, the Monte Carlo (MC) and point-kernel integration methods were used to estimate the probability density function of the CDR under different extents of core damage in accident scenarios with late containment failure. Second, the two methods were verified by comparing their results. The point-kernel integration results were more dispersed than the MC results, so the MC method was used for both quantitative and qualitative analyses. Quantitative analysis indicated a linear relationship, rather than the expected proportional relationship, between the CDR and the core damage fraction. The CDR distribution obeyed a logarithmic normal distribution in accidents with a small break in containment, but not in accidents with a large break in containment. A possible application of our analysis is a real-time core damage estimation program based on the CDR.
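
A toy point-kernel calculation of the uncollided photon flux at a detector from a discretized volume source (buildup factors and the flux-to-dose-rate conversion are omitted); the source strength, geometry, and attenuation coefficient are invented for illustration and are not the paper's source term.

```python
import numpy as np

def point_kernel_flux(source_points, strengths, detector, mu):
    """Point-kernel estimate: sum of attenuated inverse-square contributions
    from discretized source points (buildup factor omitted for brevity)."""
    r = np.linalg.norm(source_points - detector, axis=1)
    return np.sum(strengths * np.exp(-mu * r) / (4.0 * np.pi * r**2))

# Hypothetical source term: airborne activity spread uniformly over a volume
rng = np.random.default_rng(10)
pts = rng.uniform(-10.0, 10.0, size=(5000, 3))     # source point locations (m)
s = np.full(5000, 1.0e9 / 5000)                    # photon emission rate per point (1/s), assumed
mu = 0.01                                          # linear attenuation coefficient (1/m), assumed
flux = point_kernel_flux(pts, s, detector=np.zeros(3), mu=mu)
print(f"Uncollided photon flux at detector: {flux:.3e} photons/m^2/s")
```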