• Title/Summary/Keyword: Information Criterion


Noninformative priors for the scale parameter in the generalized Pareto distribution

  • Kang, Sang Gil
    • Journal of the Korean Data and Information Science Society / v.24 no.6 / pp.1521-1529 / 2013
  • In this paper, we develop noninformative priors for the generalized Pareto distribution when the scale parameter is of interest. We derive the first order and the second order matching priors, and show that the second order matching prior does not exist. It turns out that the reference prior and Jeffreys' prior do not satisfy the first order matching criterion, and that Jeffreys' prior, the reference prior and the matching prior all differ. A simulation study is performed and a real example is given.

Noninformative priors for the common mean in log-normal distributions

  • Kang, Sang-Gil
    • Journal of the Korean Data and Information Science Society / v.22 no.6 / pp.1241-1250 / 2011
  • In this paper, we develop noninformative priors for the log-normal distributions when the parameter of interest is the common mean. We derive Jeffreys' prior, the reference priors and the first order matching priors. It turns out that the reference prior and Jeffreys' prior do not satisfy the first order matching criterion, and that Jeffreys' prior, the reference prior and the first order matching prior all differ. A simulation study is performed and a real example is given.

A Study on the Effectiveness of Information Retrieval (정보검색효율에 관한 연구)

  • Yoon Koo-ho
    • Journal of the Korean Society for Library and Information Science / v.8 / pp.73-101 / 1981
  • Retrieval effectiveness is the principal criterion for measuring the performance of an information retrieval system. The effectiveness of a retrieval system depends primarily on the extent to which it can retrieve wanted documents without retrieving unwanted ones. So, ultimately, effectiveness is a function of the relevant and nonrelevant documents retrieved. Consequently, 'relevance' of information to the user's request has become one of the most fundamental concepts encountered in the theory of information retrieval. Although there is at present no consensus as to how this notion should be defined, relevance has been widely used as a meaningful quantity and an adequate criterion for measures of the evaluation of retrieval effectiveness. Recall and precision, among the various parameters based on the 'two-by-two' (contingency) table, were the major considerations in this paper, because it is assumed that recall and precision are sufficient for the measurement of effectiveness. Accordingly, the different concepts of 'relevance' and 'pertinence' of documents to user requests, and their proper usages, were investigated, even though the two terms have unfortunately been used rather loosely in the literature. In addition, a number of variables affecting the recall and precision values were discussed. Some conclusions derived from this study are as follows: Any notion of retrieval effectiveness is based on 'relevance', which itself is extremely difficult to define. Recall and precision are valuable concepts in the study of any information retrieval system. They are, however, not the only criteria by which a system may be judged. The recall-precision curve represents the average performance of any given system, and this may vary quite considerably in particular situations. Therefore, it is possible to some extent to vary the indexing policy, the indexing language, or the search methodology to improve the performance of the system in terms of recall and precision. The 'inverse relationship' between average recall and precision could be accepted as the 'fundamental law of retrieval', and it should certainly be used as an aid to evaluation. Finally, there is a limit to the performance (in terms of effectiveness) achievable by an information retrieval system. That is: "Perfect retrieval is impossible."
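The recall and precision measures the abstract above builds on can be computed directly from the cells of the 'two-by-two' contingency table. A minimal sketch in Python; the function and variable names are illustrative, not from the paper:

```python
def recall_precision(relevant_retrieved, relevant_missed, nonrelevant_retrieved):
    """Recall and precision from the cells of a 2x2 retrieval contingency table."""
    relevant = relevant_retrieved + relevant_missed         # all relevant documents
    retrieved = relevant_retrieved + nonrelevant_retrieved  # all retrieved documents
    recall = relevant_retrieved / relevant if relevant else 0.0
    precision = relevant_retrieved / retrieved if retrieved else 0.0
    return recall, precision

# 30 relevant retrieved, 20 relevant missed, 10 nonrelevant retrieved
print(recall_precision(30, 20, 10))  # (0.6, 0.75)
```

The 'inverse relationship' the abstract mentions shows up when the retrieved set is enlarged: recall can only rise, while precision typically falls.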


Design of a Sampling Inspection Plan to Guarantee the Outgoing Quality of Limited Failure Populations (한정고장집단의 출하품질 보증을 위한 샘플링검사방식 설계)

  • 권영일
    • Proceedings of the Korean Reliability Society Conference / 2002.06a / pp.275-279 / 2002
  • A Bayesian acceptance sampling plan for limited failure populations is developed. We consider a situation where defective products have short lifetimes and non-defective ones never fail during the technological life of the product. An acceptance criterion which guarantees the outgoing quality of accepted products is derived using prior information on the quality of lots submitted for inspection. Numerical examples are provided.


New Stop Criterion for Reduce of Decoding Delay of Turbo Codes (터보부호의 복호지연 감소를 위한 새로운 반복중단 알고리즘)

  • Shim B. S.;Lee W. B.;Jeong D. H.;Lim S. J.;Kim T. H.;Kim H. Y.
    • Proceedings of the IEEK Conference / 2004.06a / pp.39-42 / 2004
  • Turbo codes, proposed by Berrou, achieve better BER performance than conventional convolutional codes as the interleaver size and the number of iterations increase. However, because the decoding delay of turbo codes grows with the number of iterations, unnecessary iterative decoding is often performed. Therefore, in this paper, we propose an iterative decoding stop criterion that uses the variance of the absolute value of the LLR. By eliminating unnecessary iterations, this algorithm reduces the average number of decoding iterations with no loss in BER performance.
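The stop rule described in the abstract above can be sketched as a loop that tracks the variance of the absolute LLR values across iterations. This is a toy sketch: the threshold value and the stop direction (variance exceeding the threshold once the reliabilities converge) are assumptions, not figures from the paper.

```python
import numpy as np

def stop_iterations(llr_per_iter, threshold=10.0, max_iter=8):
    """Toy iterative-decoding loop with a |LLR|-variance stop rule.

    llr_per_iter: callable returning the LLR vector after iteration i.
    Returns the iteration at which decoding stops.
    """
    for i in range(1, max_iter + 1):
        llr = llr_per_iter(i)
        # Stop early once the spread of reliabilities passes the threshold,
        # instead of always running the full max_iter iterations.
        if np.var(np.abs(llr)) > threshold:
            return i
    return max_iter
```

With LLR magnitudes that grow linearly per iteration (a stand-in for convergence), the loop halts well before `max_iter`.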


The Modeling of the Optimal Data Format for JPEG2000 CODEC on the Fixed Compression Ratio (고정 압축률에서의 JPEG2000 코덱을 위한 최적의 데이터 형식 모델링)

  • Kang, Chang-Soo;Seo, Choon-Weon
    • Proceedings of the IEEK Conference / 2005.11a / pp.1257-1260 / 2005
  • This paper concerns optimization of the image data format, which strongly affects data compression performance, and is based on the wavelet transform and JPEG2000. It establishes a criterion for deciding the data format used in the wavelet transform, based on the data errors in the frequency transform and quantization. This criterion was used to extract the optimal data format experimentally. The results were a (1, 9) 10-bit fixed-point format for the filter coefficients and a (9, 7) 16-bit fixed-point format for the wavelet coefficients, and their optimality was confirmed.


Feature Vector Extraction using Time-Frequency Analysis and its Application to Power Quality Disturbance Classification (시간-주파수 해석 기법을 이용한 특징벡터 추출 및 전력 외란 신호 식별에의 응용)

  • 이주영;김기표;남상원
    • Proceedings of the IEEK Conference / 2001.09a / pp.619-622 / 2001
  • In this paper, an efficient approach to the classification of transient and harmonic disturbances in power systems is proposed. First, the Stop-and-Go CA CFAR detector is utilized to detect a disturbance in power signals mixed with other disturbances and noise. Then, (i) the Wigner distribution, SVD (singular value decomposition) and Fisher's criterion, and (ii) the DWT and Fisher's criterion, are applied to extract an efficient feature vector. For the classification procedure, a combined neural network classifier is proposed to classify each corresponding disturbance class. Finally, 10-class data simulated with the MATLAB Power System Blockset are used to demonstrate the performance of the proposed classification system.
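Fisher's criterion, used above for feature selection, scores a feature by its between-class separation relative to its within-class scatter. A minimal single-feature sketch in the generic textbook form, not the paper's exact formulation:

```python
import numpy as np

def fisher_score(class_a, class_b):
    """Fisher's criterion for one feature over two classes:
    (difference of class means)^2 / (sum of class variances).
    Higher scores mean the feature separates the classes better."""
    m1, m2 = np.mean(class_a), np.mean(class_b)
    v1, v2 = np.var(class_a), np.var(class_b)
    return (m1 - m2) ** 2 / (v1 + v2)

# A well-separated feature scores far higher than an overlapping one.
print(fisher_score([0.0, 2.0], [10.0, 12.0]))  # 50.0
```

Ranking candidate features by this score keeps only the most discriminative ones in the final feature vector.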


Decision Feedback Equalizer Algorithms based on Error Entropy Criterion (오차 엔트로피 기준에 근거한 결정 궤환 등화 알고리듬)

  • Kim, Nam-Yong
    • Journal of Internet Computing and Services / v.12 no.4 / pp.27-33 / 2011
  • To compensate for channel distortion from multipath fading and impulsive noise, a decision feedback equalizer (DFE) algorithm based on minimization of error entropy (MEE) is proposed. The MEE criterion has not previously been studied for DFE structures or for impulsive noise environments. By minimizing the error entropy with respect to the equalizer weights in a decision feedback structure, the proposed algorithm is shown to have superior residual intersymbol interference cancellation in simulated environments with severe multipath and impulsive noise.
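In information-theoretic learning, minimizing Rényi's quadratic entropy of the errors (the MEE idea above) is equivalent to maximizing a Parzen-estimated "information potential" of the error samples. A sketch of that estimator, assuming a Gaussian kernel whose width sigma is a free parameter (not a value from the paper):

```python
import numpy as np

def information_potential(errors, sigma=1.0):
    """Parzen estimate of the information potential V(e) of error samples.

    V(e) = (1/N^2) * sum_i sum_j G_{sqrt(2)*sigma}(e_i - e_j); minimizing
    Renyi's quadratic error entropy -log V(e) means maximizing V(e),
    which concentrates the error distribution around a single value.
    """
    e = np.asarray(errors, dtype=float)
    diffs = e[:, None] - e[None, :]  # all pairwise error differences
    # Gaussian kernel of width sqrt(2)*sigma (convolution of two kernels)
    kernel = np.exp(-diffs**2 / (4 * sigma**2)) / np.sqrt(4 * np.pi * sigma**2)
    return kernel.mean()
```

Tightly clustered errors give a larger potential (lower entropy) than spread-out errors, which is what an MEE weight update climbs toward.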

Complex-Channel Blind Equalization Using Cross-Correntropy (상호 코렌트로피를 이용한 복소 채널 블라인드 등화)

  • Kim, Nam-Yong
    • Journal of Internet Computing and Services / v.11 no.5 / pp.19-26 / 2010
  • The criterion of maximizing the cross-correntropy (MCC) of two different random variables has yielded performance superior to the mean squared error criterion. In this paper we present a complex-valued blind equalizer algorithm for QAM and complex channel environments based on the cross-correntropy criterion, which uses as its two variables the equalizer output PDF and a Parzen PDF estimate of a self-generated symbol set. Simulation results show significantly enhanced symbol-point concentration with no phase rotation in complex-channel communication.
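A sample estimate of the cross-correntropy between equalizer outputs and a self-generated symbol set can be sketched as below. This is a generic Parzen-style estimator under an assumed Gaussian kernel with free width sigma; it illustrates the quantity being maximized, not the paper's exact update rule.

```python
import numpy as np

def cross_correntropy(outputs, symbols, sigma=1.0):
    """Sample cross-correntropy between complex equalizer outputs and a
    self-generated symbol set, using a Gaussian kernel on the modulus of
    the complex difference. Larger values mean the output distribution
    sits closer to the symbol points."""
    y = np.asarray(outputs, dtype=complex)
    s = np.asarray(symbols, dtype=complex)
    d = np.abs(y[:, None] - s[None, :])  # pairwise output-to-symbol distances
    kernel = np.exp(-d**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)
    return kernel.mean()

# 4-QAM symbol set; outputs landing on the symbols score higher than
# outputs pushed away from every constellation point.
qam4 = [1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]
print(cross_correntropy(qam4, qam4) > cross_correntropy([3 + 3j] * 4, qam4))
```

Maximizing this estimate with respect to the equalizer weights drives the output samples toward the constellation points, which matches the symbol-point concentration reported in the simulations.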