• Title/Summary/Keyword: Weighted scale

HDR image display combines weighted least square filtering with color appearance model

  • Piao, Meixian;Lee, Kyungjun;Jeong, Jechang
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2016.06a
    • /
    • pp.260-263
    • /
    • 2016
  • High dynamic range (HDR) imaging has recently become a hot topic in computer graphics. We present a progressive tone mapping algorithm based on a weighted least squares optimization framework. Our approach combines weighted least squares filtering with iCAM06 to display more perceptually faithful HDR images on conventional displays while avoiding visual halo artifacts. We decompose the HDR image into a base layer and a detail layer. The base layer, which carries the large-scale variation, is obtained by weighted least squares filtering and is then processed with the iCAM06 model, which adaptively compresses it according to the human visual system. Because only the base layer is compressed, contrast is reduced while fine detail is preserved. The result shows more perceptual color appearance and preserves fine detail while avoiding common artifacts.
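
The base/detail split described in this abstract is easy to prototype. The sketch below assumes a Farbman-style weighted least squares smoother operating on log luminance and solves (I + λL)u = g for the base layer; the iCAM06 color appearance step is omitted, and `lam`, `alpha`, and the compression factor are illustrative values, not the paper's.

```python
# Minimal sketch of a WLS base/detail decomposition (Farbman-style);
# the iCAM06 stage of the paper is not reproduced here.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def wls_base_layer(log_lum, lam=1.0, alpha=1.2, eps=1e-4):
    """Solve (I + lam * L) u = g, where L is a graph Laplacian whose
    weights shrink across strong gradients, so edges survive smoothing."""
    h, w = log_lum.shape
    n = h * w
    g = log_lum.ravel()
    # Smoothness weights: small where the log-luminance gradient is large.
    wy = 1.0 / (np.abs(np.diff(log_lum, axis=0)) ** alpha + eps)  # (h-1, w)
    wx = 1.0 / (np.abs(np.diff(log_lum, axis=1)) ** alpha + eps)  # (h, w-1)
    idx = np.arange(n).reshape(h, w)
    rows = np.concatenate([idx[:-1, :].ravel(), idx[:, :-1].ravel()])
    cols = np.concatenate([idx[1:, :].ravel(), idx[:, 1:].ravel()])
    vals = np.concatenate([wy.ravel(), wx.ravel()])
    W = sp.coo_matrix((vals, (rows, cols)), shape=(n, n))
    W = W + W.T                                        # symmetric weights
    L = sp.diags(np.asarray(W.sum(axis=1)).ravel()) - W
    base = spsolve((sp.identity(n) + lam * L).tocsc(), g).reshape(h, w)
    detail = log_lum - base            # detail layer is left untouched
    return base, detail

# Usage: compress only the base layer, then recombine (factor is illustrative).
# base, detail = wls_base_layer(np.log1p(hdr_luminance))
# ldr_log = 0.5 * base + detail
```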

2D Shape Recognition System Using Fuzzy Weighted Mean by Statistical Information

  • Woo, Young-Woon;Han, Soo-Whan
    • Proceedings of the Korean Society of Computer Information Conference
    • /
    • 2009.01a
    • /
    • pp.49-54
    • /
    • 2009
  • This paper introduces a fuzzy weighted mean method for a 2D shape recognition system. The bispectrum, based on the third-order cumulant, is applied to the contour sequence of each image to extract a feature vector. This bispectral feature vector, which is invariant to translation, rotation, and scale, represents a 2D planar image. To obtain the best performance, however, a proper criterion is needed for calculating the weights used in the fuzzy weighted mean. We therefore develop a new method that calculates the weights from the means and variances of the differences of feature values, normalized by the maximum such difference. In the experiments, recognition results with fifteen-dimensional bispectral feature vectors, extracted from 11,808 aircraft images based on eight different styles of reference images, are compared and analyzed.
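
A minimal sketch of classification by a weighted mean of per-feature differences. The paper's exact weight formula (built from means and variances of feature-value differences) is not reproduced; inverse-variance weights stand in for it, and the 15-dimensional bispectral vectors are assumed to be precomputed.

```python
# Hedged sketch: inverse-variance weights stand in for the paper's scheme.
import numpy as np

def fit_weights(train_feats):
    """train_feats: (n_samples, n_features). Down-weight noisy features."""
    var = train_feats.var(axis=0)
    w = 1.0 / (var + 1e-12)
    return w / w.sum()                    # normalized feature weights

def classify(x, class_refs, w):
    """Assign x to the class whose reference feature vector minimizes
    the weighted mean absolute feature difference."""
    scores = [np.sum(w * np.abs(x - r)) for r in class_refs]
    return int(np.argmin(scores))
```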

Automatic Vowel Onset Point Detection Based on Auditory Frequency Response (청각 주파수 응답에 기반한 자동 모음 개시 지점 탐지)

  • Zang, Xian;Kim, Hag-Tae;Chong, Kil-To
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.13 no.1
    • /
    • pp.333-342
    • /
    • 2012
  • This paper presents a vowel onset point (VOP) detection method based on the human auditory system. The method maps the "perceptual" Mel frequency scale onto the linear acoustic frequency axis and then builds a bank of triangular Mel-weighted filters that simulates the band-pass filtering of the human ear. This nonlinear critical-band filter bank greatly reduces the data dimensionality and suppresses the effect of harmonics, making the formants more prominent in the nonlinearly spaced Mel spectrum. The sum of the Mel-spectrum peak energies is extracted as the feature for each frame, and the instant at which this energy starts rising sharply, detected by convolving with a Gabor window, is taken as the VOP. On a single-word database containing 12 vowels articulated with different kinds of consonants, the experiments showed a good average detection rate of 72.73%, higher than other vowel detection methods based on short-time energy and zero-crossing rate.
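
The pipeline above maps directly to code. The sketch below builds triangular Mel-weighted filters, sums the largest Mel-spectrum peaks per frame, and enhances energy rises by convolving with a Gabor-like window; the frame sizes, number of peaks, and window parameters are illustrative guesses, not the paper's settings.

```python
# Sketch of the VOP pipeline: mel filter bank -> peak-energy sum -> Gabor.
import numpy as np

def mel_filterbank(n_filt, n_fft, sr):
    mel = lambda f: 2595.0 * np.log10(1.0 + f / 700.0)
    imel = lambda m: 700.0 * (10.0 ** (m / 2595.0) - 1.0)
    pts = imel(np.linspace(mel(0.0), mel(sr / 2.0), n_filt + 2))
    bins = np.floor((n_fft + 1) * pts / sr).astype(int)
    fb = np.zeros((n_filt, n_fft // 2 + 1))
    for i in range(n_filt):                 # triangular responses
        l, c, r = bins[i], bins[i + 1], bins[i + 2]
        fb[i, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
        fb[i, c:r] = (r - np.arange(c, r)) / max(r - c, 1)
    return fb

def vop_candidates(x, sr, frame=400, hop=160, n_filt=40, n_peaks=10):
    fb = mel_filterbank(n_filt, 512, sr)
    energies = []
    for s in range(0, len(x) - frame, hop):
        spec = np.abs(np.fft.rfft(x[s:s + frame] * np.hamming(frame), 512))
        energies.append(np.sort(fb @ spec)[-n_peaks:].sum())  # peak energy
    e = np.asarray(energies)
    t = np.linspace(-1, 1, 51)              # Gabor-like onset enhancer
    gabor = np.exp(-t ** 2 / 0.08) * np.sin(2 * np.pi * 2 * t)
    evidence = np.convolve(e, gabor, mode="same")
    return np.where((evidence[1:-1] > evidence[:-2]) &
                    (evidence[1:-1] > evidence[2:]))[0] + 1   # local maxima
```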

Frequency analysis of nonidentically distributed large-scale hydrometeorological extremes for South Korea

  • Lee, Taesam;Jeong, Changsam;Park, Taewoong
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2015.05a
    • /
    • pp.537-537
    • /
    • 2015
  • In recent decades, the independent and identically distributed (iid) assumption for extreme events has been shown to be invalid in many cases, because long-term climate variability resulting from phenomena such as Pacific decadal variability and the El Niño-Southern Oscillation may induce varying meteorological systems, such as persistent wet years and dry years. Therefore, in the current study we propose a new parameter estimation method for probability distribution models to more accurately predict the magnitude of future extreme events when the iid assumption is not adequate under large-scale climate variability. The proposed parameter estimation is based on a metaheuristic approach and is derived from the objective function of the rth power probability-weighted sum of observations in increasing order. In a simulation study, data generated from a combination of two distributions, gamma and generalized extreme value (GEV), were fitted with a GEV distribution. In addition, a case study examining the annual hourly maximum precipitation of all stations in South Korea was performed to evaluate the performance of the proposed approach. The results of the simulation study and the case study indicate that the proposed metaheuristic parameter estimation method is an effective alternative for accurately selecting the rth power when the iid assumption of extreme hydrometeorological events is not valid under large-scale climate variability. The maximum likelihood estimate is more accurate with a low mixing probability, and the probability-weighted moment method is a moderately effective option.
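
A hedged sketch of the general idea: a metaheuristic (scipy's differential evolution, standing in for the paper's algorithm) searches GEV parameters that minimize an rth-power probability-weighted misfit between the ordered sample and model quantiles. The paper's exact objective and plotting position may differ, and note that scipy's `genextreme` uses the opposite shape-sign convention from much of the hydrology literature.

```python
# Sketch only: the misfit function and search bounds are assumptions.
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import differential_evolution

def fit_gev_pw(sample, r=1.0):
    x = np.sort(sample)
    n = len(x)
    p = (np.arange(1, n + 1) - 0.35) / n      # a common plotting position

    def objective(theta):
        shape, loc, scale = theta
        q = genextreme.ppf(p, shape, loc=loc, scale=scale)
        return np.sum(p ** r * (x - q) ** 2)  # rth-power weighted misfit

    res = differential_evolution(
        objective,
        bounds=[(-0.5, 0.5), (x.min(), x.max()), (1e-3, 3 * x.std())],
        seed=0)
    return res.x   # (shape, loc, scale) in scipy's genextreme convention
```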

On the Effects of Plotting Positions to the Probability Weighted Moments Method for the Generalized Logistic Distribution

  • Kim, Myung-Suk
    • Communications for Statistical Applications and Methods
    • /
    • v.14 no.3
    • /
    • pp.561-576
    • /
    • 2007
  • Five plotting positions are applied to the computation of probability weighted moments (PWM) for the parameters of the generalized logistic distribution. Over a range of parameter values and finite sample sizes, the effects of the five plotting positions are investigated via Monte Carlo simulation. Our simulation results indicate that the Landwehr plotting position frequently yields smaller biases than the others in the location and scale parameter estimation, whereas the Weibull plotting position often causes larger biases. The plotting position (i - 0.35)/n appears to give smaller root mean square errors (RMSE) than the other plotting positions for negative shape parameter estimation in small samples. Compared with the maximum likelihood (ML) method in small samples, the PWM estimators do not appear to be better in the location and scale parameter estimation, showing larger RMSE. However, the PWM outperform the ML estimators in the shape parameter estimation when its magnitude is near zero. The sensitivity of right tail quantile estimation to the five plotting positions is also examined, but no plotting position is found to be clearly superior or inferior.
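
The plotting-position estimator of the PWMs is simple to state: b_r = (1/n) Σ p_i^r x_(i), with x_(i) the ordered sample and p_i the plotting position. The sketch below computes it for two positions mentioned in the study; the Landwehr position is often given in the literature as (i - 0.35)/n, though the abstract lists that formula separately, so the naming here is an assumption, and the inversion of b_0, b_1, b_2 to generalized logistic parameters is omitted.

```python
# b_r = (1/n) * sum_i p_i^r * x_(i); position names are assumptions.
import numpy as np

def pwm(sample, r, position="weibull"):
    x = np.sort(sample)
    n = len(x)
    i = np.arange(1, n + 1)
    p = {"weibull": i / (n + 1.0),          # classical Weibull position
         "i_minus_035": (i - 0.35) / n}[position]
    return np.mean(p ** r * x)              # plotting-position PWM estimate
```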

Analysis on Image Compression using Weighted Finite Automata (WFA를 이용한 이미지 압축 알고리즘에 대한 분석)

  • 엄준형;김태환
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2002.04a
    • /
    • pp.727-729
    • /
    • 2002
  • In this paper, we analyze two algorithms [2, 4] that describe grey-scale images by weighted finite automata (WFA). We also analyze the error between the original image and the image compressed using a WFA, and present the results. Specifically, we prove that the $\ell^2$-norm difference between the original image and the image reconstructed by the automaton found under a reconstruction tolerance $\delta$ is at most $\delta$.
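
For readers unfamiliar with WFA image coding, a grey value is recovered by walking the pixel's quadtree address through the automaton's transition matrices, between an initial and a final vector. The toy two-state automaton below is purely illustrative and is not one produced by either inference algorithm analyzed in the paper.

```python
# f(address) = init . A_{q1} ... A_{qk} . final  (quadtree addressing)
import numpy as np

def wfa_value(address, init, trans, final):
    """address: string over '0123' naming quadrants, coarse to fine."""
    v = init
    for q in address:
        v = v @ trans[int(q)]
    return float(v @ final)

# Toy two-state WFA with arbitrary illustrative matrices.
init  = np.array([0.5, 0.5])
final = np.array([1.0, 0.5])
trans = [np.array([[1.0, 0.0], [0.0, 0.5]]) for _ in range(4)]
print(wfa_value("0123", init, trans, final))
```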

Application of Bayesian Computational Techniques in Estimation of Posterior Distributional Properties of Lognormal Distribution

  • Begum, Mun-Ni;Ali, M. Masoom
    • Journal of the Korean Data and Information Science Society
    • /
    • v.15 no.1
    • /
    • pp.227-237
    • /
    • 2004
  • In this paper we present a Bayesian approach to estimating the location and scale parameters of the lognormal distribution using an iterative Gibbs sampling algorithm. We also present estimation of the location parameter by two non-iterative methods, importance sampling and the weighted bootstrap, assuming the scale parameter is known. The estimates from the non-iterative techniques do not depend on the specification of hyperparameters, which is optimal from the Bayesian point of view. The estimates obtained by the more sophisticated Gibbs sampler vary slightly with the choice of hyperparameters. The objective of this paper is to illustrate these tools in a simple setup, which may be essential in more complicated situations.
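
A minimal sketch of a Gibbs sampler for the lognormal parameters, working on the log of the data so the model is normal. Conjugate priors mu ~ N(m0, s0²) and sigma² ~ Inv-Gamma(a0, b0) are assumed here; the paper's exact prior specification is not reproduced.

```python
# Gibbs sampler sketch under assumed conjugate priors.
import numpy as np

def gibbs_lognormal(y, n_iter=5000, m0=0.0, s02=100.0, a0=2.0, b0=1.0,
                    seed=0):
    rng = np.random.default_rng(seed)
    x = np.log(y)                       # lognormal data -> normal log scale
    n, xbar = len(x), x.mean()
    mu, sig2 = xbar, x.var() + 1e-9
    draws = []
    for _ in range(n_iter):
        # mu | sigma^2, x : normal-normal conjugate update
        prec = 1.0 / s02 + n / sig2
        mu = rng.normal((m0 / s02 + n * xbar / sig2) / prec,
                        np.sqrt(1.0 / prec))
        # sigma^2 | mu, x : inverse-gamma, drawn via a gamma precision
        sig2 = 1.0 / rng.gamma(a0 + 0.5 * n,
                               1.0 / (b0 + 0.5 * np.sum((x - mu) ** 2)))
        draws.append((mu, sig2))
    return np.asarray(draws)            # posterior draws of (mu, sigma^2)
```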

Volatility and Z-Type Jumps of Euro Exchange Rates Using Outlying Weighted Quarticity Statistics in the 2010s

  • Yi, Chae-Deug
    • Journal of Korea Trade
    • /
    • v.23 no.2
    • /
    • pp.110-126
    • /
    • 2019
  • Purpose - This paper examines the realized continuous volatility and discrete jumps of US Dollar/Euro returns, using five-minute returns spanning February 2010 through February 2018, with periodicity filters. Design/Methodology - This paper adopts nonparametric estimation. The realized volatility and realized outlying weighted variations show non-Gaussian, fat-tailed, leptokurtic distributions. Significant volatility jumps in returns occurred from 2010 through 2018, and exceptionally large, irregular jumps occurred around 2010-2011, after the EU financial crisis, and in 2015-2016. Outliers occurred somewhat frequently around 2015 and 2016. Originality/value - When periodicity filters of volatility such as MAD, Short Half Scale, and WSD are included, the five-minute returns of US Dollar/Euro exchange rates have daily jump probabilities 20-30% smaller than when the filters are not included.
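
The paper's outlying weighted quarticity statistics are not reproduced here; as a stand-in, the sketch below shows the standard realized variance versus bipower variation decomposition from which jump tests of this kind start, computed from one day of five-minute log returns.

```python
# Standard realized-measure decomposition (Barndorff-Nielsen/Shephard style);
# a stand-in for the paper's weighted quarticity statistics.
import numpy as np

def daily_jump_component(r):
    """r: one day's five-minute log returns."""
    rv = np.sum(r ** 2)                               # realized variance
    bv = (np.pi / 2.0) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))
    return rv, bv, max(rv - bv, 0.0)                  # jump part of variance
```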

Measuring Efficiency of Global Electricity Companies Using Data Envelopment Analysis Model (DEA모형을 이용한 전력회사의 효율성 분석에 관한 연구)

  • Kim, Tae Ung;Jo, Sung Han
    • Environmental and Resource Economics Review
    • /
    • v.9 no.2
    • /
    • pp.349-371
    • /
    • 2000
  • Data Envelopment Analysis (DEA) is a linear programming based technique for measuring the relative performance of organizational units when the presence of multiple inputs and outputs makes comparison difficult. A common measure of relative efficiency is the weighted sum of outputs divided by the weighted sum of inputs; the DEA model allows each unit to adopt the set of weights that shows it in the most favorable light in comparison with the other units. In this paper, we present the mathematical background and characteristics of the DEA model and give a short case study applying it to evaluate the relative efficiencies of 51 global electricity companies. Technical efficiency and scale efficiency are also investigated. Generating capacity and the number of employees are used as input data, and revenue, net profit, and electricity sales as output data. We find that only 9 of the 51 electricity companies are 100% relatively efficient. The technical and scale efficiencies of KEPCO are 98.7% and 78.89%, respectively, which means that KEPCO's inefficiency is caused by scale inefficiency. The analysis shows that the number of employees would have to be reduced by at least 15% to reach 100% efficiency. The results suggest that KEPCO needs structural reform to improve its efficiency.
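
The weighted-sum ratio above linearizes into the standard CCR multiplier LP: normalize the evaluated unit's weighted inputs to 1 and maximize its weighted outputs, subject to no unit scoring above 1. The sketch below uses toy data; the paper's inputs were generating capacity and employees, and its outputs revenue, net profit, and electricity sales.

```python
# CCR multiplier-form DEA via linear programming (toy data, illustrative).
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """X: (n_units, m_inputs), Y: (n_units, s_outputs).
    Returns the CCR efficiency of unit j0 in (0, 1]."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[j0], np.zeros(m)])             # maximize u'y0
    A_eq = np.concatenate([np.zeros(s), X[j0]])[None, :]  # v'x0 = 1
    A_ub = np.hstack([Y, -X])                             # u'yj - v'xj <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun

# Usage with hypothetical data for three units:
X = np.array([[2.0, 3.0], [4.0, 2.0], [3.0, 5.0]])        # inputs
Y = np.array([[5.0, 4.0], [6.0, 3.0], [4.0, 6.0]])        # outputs
print([round(ccr_efficiency(X, Y, j), 3) for j in range(3)])
```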

Families of Distributions Arising from Distributions of Ordered Data

  • Ahmadi, Mosayeb;Razmkhah, M.;Mohtashami Borzadaran, G.R.
    • Communications for Statistical Applications and Methods
    • /
    • v.22 no.2
    • /
    • pp.105-120
    • /
    • 2015
  • A large family of distributions arising from distributions of ordered data is proposed, containing other models studied in the literature. This extension subsumes many kinds of weighted random variables, such as order statistics, records, and k-records. Such a distribution can be used to model data that are not identically distributed. Properties of the theoretical model, such as moments, mean deviation, entropy criteria, symmetry, and unimodality, are derived. We also study the problem of parameter estimation and derive maximum likelihood estimators for a weighted gamma distribution. Finally, the proposed model is shown to fit a real data set better than the previously introduced distributions.
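
As a narrow illustration of the estimation step, the sketch below numerically maximizes the likelihood of a weighted gamma model with weight w(x) = x^c, for which the weighted density is again a gamma with shape alpha + c; the family studied in the paper is more general, so treat this purely as a sketch under that assumption.

```python
# Numeric MLE for a weighted gamma model with weight w(x) = x^c,
# whose weighted density is gamma with shape alpha + c and scale beta.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def fit_weighted_gamma(x, c=1.0):
    def nll(theta):
        log_a, log_b = theta                 # log-parameters keep a, b > 0
        a, b = np.exp(log_a), np.exp(log_b)
        k = a + c                            # effective shape after weighting
        return -np.sum((k - 1) * np.log(x) - x / b
                       - gammaln(k) - k * np.log(b))
    res = minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
    return np.exp(res.x)                     # (alpha, beta) estimates
```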