• Title/Summary/Keyword: level matching


Contour Shape Matching based Motion Vector Estimation for Subfield Gray-scale Display Devices (서브필드계조방식 디스플레이 장치를 위한 컨투어 쉐이프 매칭 기반의 모션벡터 추정)

  • Choi, Im-Su;Kim, Jae-Hee
    • Proceedings of the IEEK Conference
    • /
    • 2007.07a
    • /
    • pp.327-328
    • /
    • 2007
  • A contour-shape-matching-based pixel motion estimation method is proposed. Pixel motion information is very useful for compensating the motion artifacts that appear at specific gray-level contours in moving images on subfield gray-scale display devices. In this motion estimation method, the gray-level boundary contours are extracted from the input image. Using contour shape matching, the most similar contour in the next frame is found, and that contour is divided into segment units. The pixel motion vector is then estimated from the displacement of each segment in the contour by segment matching. With this method, more precise motion vectors can be estimated, and the method is more robust to image motion involving rotation or illumination variations.
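The per-segment displacement step described above could be sketched as follows. This is a minimal illustration assuming the two contours are already matched by shape and sampled as aligned point lists; the function name, segment length, and data are invented, not taken from the paper.

```python
import numpy as np

def segment_motion_vectors(contour_a, contour_b, seg_len=4):
    """Estimate per-segment motion vectors between two matched contours.

    contour_a, contour_b: (N, 2) point arrays for the same gray-level
    contour in consecutive frames. Each segment's motion vector is the
    mean displacement of its points.
    """
    n = min(len(contour_a), len(contour_b))
    a = np.asarray(contour_a[:n], float)
    b = np.asarray(contour_b[:n], float)
    vectors = []
    for start in range(0, n, seg_len):
        disp = b[start:start + seg_len] - a[start:start + seg_len]
        vectors.append(disp.mean(axis=0))
    return np.array(vectors)

# A contour translated by (2, 1): every segment should report (2, 1).
square = np.array([[x, 0] for x in range(8)])
moved = square + np.array([2, 1])
print(segment_motion_vectors(square, moved))
```

A real implementation would also score each candidate segment match (e.g. by shape similarity) before accepting its displacement.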
Measuring Level of Difficulty of Fingerprint Database based on Sample Quality (영상 품질 기반의 지문 데이터베이스의 난이도 정량화)

  • Ryu, Ji-Eun;Jang, Ji-Hyeon;Kim, Hak-Il
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.18 no.5
    • /
    • pp.59-69
    • /
    • 2008
  • The purpose of this paper is to measure the level of difficulty of a fingerprint database based on sample quality. The paper proposes a distribution-of-sample-quality analyzer and a difference-of-sample-quality analyzer to measure the level of difficulty. Experimental results demonstrate a stronger correlation between matching performance and the level of difficulty based on the difference of sample quality than with other measures. In particular, the level of difficulty based on the OQ block of the MPQ co-occurrence matrix shows the highest correlation with matching performance; moreover, it can predict the matching performance of unknown databases.
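The kind of correlation check the abstract reports can be sketched numerically. All values below are made up for illustration only; the paper's actual difficulty scores and performance figures are not reproduced here.

```python
import numpy as np

# Hypothetical per-database values: a quality-difference-based difficulty
# score and the observed matcher error rate (%). Illustrative data only.
difficulty = np.array([0.12, 0.25, 0.31, 0.44, 0.58, 0.71])
error_rate = np.array([1.1,  2.0,  2.6,  3.9,  5.2,  6.8])

# Pearson correlation: a value near +1 supports using the difficulty
# score to predict matching performance on unseen databases.
r = np.corrcoef(difficulty, error_rate)[0, 1]
print(round(r, 3))
```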

Robust $H_{\infty}$ Controller Design for Steam Generator Water Level Control using Mixed $H_{\infty}$ Optimization Method (혼합 $H_{\infty}$ 최적화 기법을 이용한 견실 $H_{\infty}$ 증기발생기 수위제어기 설계)

  • 서성환;조희수;박홍배
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.5 no.3
    • /
    • pp.363-369
    • /
    • 1999
  • In this paper, we design a robust $H_{\infty}$ controller for water level control of a steam generator using a mixed $H_{\infty}$ optimization with a model-matching method. First, we choose a desired model with good disturbance rejection performance. Second, we design a stabilizing controller that keeps the model-matching error small while also providing a sufficiently large stability margin against additive perturbations of the nominal plant. Simulation results show that the proposed robust $H_{\infty}$ controller, at a specific power operation, performs satisfactorily against variations in load power, steam flow rate, primary circuit coolant temperature, and feedwater temperature. The proposed robust $H_{\infty}$ controller also exhibits better robust stability than a conventional PI controller.
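The mixed criterion sketched in the abstract (small model-matching error plus a margin against additive perturbations) can be written, in assumed notation not taken from the paper, as choosing a stabilizing controller $K$ to solve

```latex
\min_{K \,\text{stabilizing}}
\left\|
\begin{bmatrix}
W_1 \left( T_d - T_{yr}(K) \right) \\
W_2 \, K (I + G K)^{-1}
\end{bmatrix}
\right\|_{\infty}
```

where $T_d$ is the desired disturbance-rejection model, $T_{yr}(K)$ the achieved closed-loop response, $G$ the nominal plant, and $W_1$, $W_2$ weighting functions. By the small-gain theorem, the second row keeps the loop stable for any additive perturbation $\Delta$ with $\|\Delta\|_\infty \, \|K(I+GK)^{-1}\|_\infty < 1$.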
Optimization Driven MapReduce Framework for Indexing and Retrieval of Big Data

  • Abdalla, Hemn Barzan;Ahmed, Awder Mohammed;Al Sibahee, Mustafa A.
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.14 no.5
    • /
    • pp.1886-1908
    • /
    • 2020
  • With technical advances, the amount of big data is increasing day by day, such that traditional software tools face a burden in handling it. Additionally, the presence of imbalanced data in big data is a massive concern for the research community. To assure the effective management of big data and to deal with imbalanced data, this paper proposes a new indexing algorithm for retrieving big data in the MapReduce framework. In the mappers, data clustering is done with the Sparse Fuzzy-c-means (Sparse FCM) algorithm. The reducer combines the clusters generated by the mappers and again performs data clustering with the Sparse FCM algorithm. Two-level query matching is performed for determining the requested data: the first level determines the cluster, and the second level accesses the requested data within that cluster. The ranking of data is performed using the proposed Monarch chaotic whale optimization algorithm (M-CWOA), which is designed by combining Monarch butterfly optimization (MBO) [22] and the chaotic whale optimization algorithm (CWOA) [21]. Here, the Parametric Enabled-Similarity Measure (PESM) is adapted for matching the similarities between two datasets. The proposed M-CWOA outperformed other methods with a maximal precision of 0.9237, recall of 0.9371, and F1-score of 0.9223.
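The two-level query matching step can be sketched as follows. Function names and data are illustrative, not from the paper; nearest-centroid search stands in for the cluster-level match and nearest-record search for the second level.

```python
import numpy as np

def two_level_match(query, centroids, clustered_data):
    """Two-level query matching (illustrative sketch).

    Level 1: pick the cluster whose centroid is nearest to the query.
    Level 2: pick the nearest record inside that cluster.
    clustered_data maps cluster index -> (M, d) array of records.
    """
    q = np.asarray(query, float)
    cluster = int(np.argmin(np.linalg.norm(centroids - q, axis=1)))
    records = clustered_data[cluster]
    best = int(np.argmin(np.linalg.norm(records - q, axis=1)))
    return cluster, records[best]

centroids = np.array([[0.0, 0.0], [10.0, 10.0]])
clustered = {0: np.array([[0.5, 0.2], [1.0, 1.0]]),
             1: np.array([[9.0, 9.5], [11.0, 10.0]])}
print(two_level_match([9.2, 9.4], centroids, clustered))
```

Restricting the second-level search to one cluster is what saves work: only that cluster's records are compared against the query.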

An Efficient DNA Sequence Compression using Small Sequence Pattern Matching

  • Murugan, A.;Punitha, K.
    • International Journal of Computer Science & Network Security
    • /
    • v.21 no.8
    • /
    • pp.281-287
    • /
    • 2021
  • Bioinformatics is formed from a blend of biology and informatics technologies, and it employs statistical methods and approaches to address issues in the domains of nutrition, medical research, and review of the living environment. The ceaseless growth of DNA sequencing technologies has resulted in the production of voluminous genomic data, especially DNA sequences, calling for increased storage and bandwidth. At present, bioinformatics confronts the major hurdle of managing, interpreting, and accurately preserving this hefty information. Compression is a beacon of hope for resolving these issues, and a methodology is recommended here for keeping storage efficient. In addition, a competent algorithm is introduced that aids in the exact matching of small patterns. The DNA sequence representation is then used to determine matches of 2 to 6 bases against the remaining input sequence. The process transforms the DNA sequence into ASCII symbols in the first level, compresses it with the LZ77 method in the second level, and then forms grid variables of size 3 to hold 100 characters each; in the third level, the compressed output is held in the grid variables. The proposed algorithm, S_Pattern DNA, gives an average compression ratio of 93%, better than existing compression algorithms, on datasets from the UCI repository.
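A rough sketch of the multi-stage idea follows, with 2-bit base packing standing in for the paper's symbol transform and zlib as a readily available LZ77-family coder; the size-3 grid stage of the paper is not reproduced.

```python
import zlib

def compress_dna(seq):
    """Two-stage sketch: pack each 4-base block into one byte
    (2 bits per base), then apply zlib (DEFLATE, an LZ77-family
    coder) as the second stage."""
    code = {'A': 0, 'C': 1, 'G': 2, 'T': 3}
    packed = bytearray()
    # Drop any trailing partial block for simplicity.
    for i in range(0, len(seq) - len(seq) % 4, 4):
        b = 0
        for base in seq[i:i + 4]:
            b = (b << 2) | code[base]
        packed.append(b)
    return zlib.compress(bytes(packed), 9)

seq = "ACGT" * 400
out = compress_dna(seq)
print(len(seq), len(out))  # packing alone gives 4:1 before zlib runs
```

On repetitive input like this, the LZ77 stage shrinks the packed bytes much further; real genomes compress less dramatically.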

Development of Matching Priors for P(X < Y) in Exponential Distributions

  • Lee, Gunhee
    • Journal of the Korean Statistical Society
    • /
    • v.27 no.4
    • /
    • pp.421-433
    • /
    • 1998
  • In this paper, matching priors for P(X < Y) are investigated when both distributions are exponential. Two recent approaches for finding noninformative priors are introduced. The first is Berger and Bernardo's forward and backward reference prior, which maximizes the expected Kullback-Leibler divergence between the posterior and prior densities. The second is the matching prior identified by matching the one-sided posterior credible interval with the frequentist's desired confidence level. The general forms of the second-order matching prior are presented so that the one-sided posterior credible intervals agree with the frequentist's desired confidence levels up to $O(n^{-1})$. The frequentist coverage probabilities of confidence sets based on several noninformative priors are compared for small sample sizes via Monte Carlo simulation.
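The kind of coverage simulation the abstract describes can be sketched numerically. This is a naive sketch using the simple reference prior $\pi(\lambda) \propto 1/\lambda$ for each rate (not the paper's second-order matching prior); under that prior the posterior of each rate is Gamma with shape $n$ and rate equal to the observed sum, and $P(X < Y) = a/(a+b)$ for $X \sim \mathrm{Exp}(a)$, $Y \sim \mathrm{Exp}(b)$.

```python
import random

def coverage(a, b, n=30, reps=400, draws=600, level=0.95, seed=7):
    """Monte Carlo frequentist coverage of an equal-tailed posterior
    credible interval for p = P(X < Y) = a / (a + b)."""
    rng = random.Random(seed)
    p_true = a / (a + b)
    hits = 0
    for _ in range(reps):
        sx = sum(rng.expovariate(a) for _ in range(n))
        sy = sum(rng.expovariate(b) for _ in range(n))
        post = []
        for _ in range(draws):
            lx = rng.gammavariate(n, 1 / sx)  # posterior rate of X
            ly = rng.gammavariate(n, 1 / sy)  # posterior rate of Y
            post.append(lx / (lx + ly))
        post.sort()
        lo = post[int((1 - level) / 2 * draws)]
        hi = post[int((1 + level) / 2 * draws) - 1]
        hits += lo <= p_true <= hi
    return hits / reps

print(coverage(1.0, 2.0))  # should land near the nominal 0.95
```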
Automatic generation of reliable DEM using DTED level 2 data from high resolution satellite images (고해상도 위성영상과 기존 수치표고모델을 이용하여 신뢰성이 향상된 수치표고모델의 자동 생성)

  • Lee, Tae-Yoon;Jung, Jae-Hoon;Kim, Tae-Jung
    • Spatial Information Research
    • /
    • v.16 no.2
    • /
    • pp.193-206
    • /
    • 2008
  • If stereo images are used for Digital Elevation Model (DEM) generation, a DEM is generally made by matching the left image against the right image. In stereo matching, tie-points are used as initial match candidate points, and their number and distribution influence the matching result. A DEM made from the matching result has errors such as holes and peaks, which are usually interpolated from neighboring pixel values. In this paper, we propose a DEM generation method that combines automatic tie-point extraction using an existing DEM and an image pyramid with interpolation of the new DEM from the existing DEM, for a more reliable result. For testing, we used IKONOS, QuickBird, and SPOT5 stereo images together with DTED level 2 data. The test results show that the proposed method automatically produces reliable DEMs. For validation, we compared heights of the DEM produced by the proposed method with heights of the existing DTED level 2 data; the RMSE was under 15 m.
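The idea of filling matching failures from an existing DEM rather than from neighboring pixels can be sketched as below; the function name and the tiny grids are illustrative, not from the paper.

```python
import numpy as np

def fill_dem_holes(matched_dem, reference_dem):
    """Fill failed-matching cells (NaN holes) in a stereo-derived DEM
    with heights from an existing reference DEM (e.g. DTED level 2),
    as an alternative to interpolating from neighboring pixels."""
    dem = np.array(matched_dem, float)
    holes = np.isnan(dem)
    dem[holes] = np.asarray(reference_dem, float)[holes]
    return dem

nan = np.nan
matched = [[120.0, nan], [nan, 131.0]]      # stereo result with holes
reference = [[119.0, 124.0], [127.0, 130.0]]  # existing DEM heights
print(fill_dem_holes(matched, reference))
```

Cells where matching succeeded keep their stereo-derived heights; only the holes inherit the reference values.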
A New Carrier Recovery Algorithm Using $\theta$-matching Method for QAM Demodulators ($\theta$-정합을 이용한 QAM 복조용 Carrier Recovery)

  • 박휘원;장일순;정차근;조경록
    • Proceedings of the IEEK Conference
    • /
    • 1998.10a
    • /
    • pp.179-182
    • /
    • 1998
  • Carrier recovery, the process of recovering the carrier at the receiver, removes the phase difference between the VCO and the received signal. However, the conventional carrier recovery structure cannot be applied to multi-level QAM demodulators because the decision interval and the complexity of control increase as the number of symbols increases. In this paper, we suggest a new carrier recovery algorithm using a $\theta$-matching algorithm for multi-level QAM demodulation to overcome this problem, analyze its performance, and implement it.
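A generic decision-directed phase detector illustrates the role such an algorithm plays in the carrier loop; the paper's $\theta$-matching rule itself is not reproduced here, and all names below are invented.

```python
import cmath

def phase_error(received, constellation):
    """Decision-directed phase detector (a generic stand-in).

    The nearest constellation point is taken as the symbol decision,
    and the angle between the received symbol and that decision is
    the phase error that drives the VCO correction."""
    decision = min(constellation, key=lambda c: abs(received - c))
    return cmath.phase(received / decision)

# 4-QAM points; a symbol rotated by 0.1 rad yields an error near 0.1.
qam4 = [complex(i, q) for i in (-1, 1) for q in (-1, 1)]
sym = (1 + 1j) * cmath.exp(1j * 0.1)
print(round(phase_error(sym, qam4), 3))
```

For dense constellations, a small rotation can push a symbol past a decision boundary, which is the ambiguity that motivates alternative detectors such as the one proposed in the paper.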
A Multi-Level Accumulation-Based Rectification Method and Its Circuit Implementation

  • Son, Hyeon-Sik;Moon, Byungin
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.11 no.6
    • /
    • pp.3208-3229
    • /
    • 2017
  • Rectification is an essential procedure for simplifying the disparity extraction of stereo matching algorithms by removing vertical mismatches between the left and right images. To support real-time stereo matching, studies have introduced several look-up-table (LUT)- and computational-logic (CL)-based rectification approaches. However, to support high-resolution images, the LUT-based approach requires considerable memory resources, and the CL-based approach requires numerous hardware resources for its circuit implementation. Thus, this paper proposes a multi-level accumulation-based rectification method as a simple CL-based method, along with its circuit implementation. The proposed method, which includes distortion correction, reduces addition operations by 29% and removes multiplication operations by replacing the complex matrix computations and high-degree polynomial calculations of conventional rectification with simple multi-level accumulations. The proposed rectification circuit can rectify $1,280{\times}720$ stereo images at a frame rate of 135 fps at a clock frequency of 125 MHz. Because the circuit is fully pipelined, it continuously generates a pair of rectified left and right pixels every cycle after a 13-cycle latency plus the initial image buffering time. Experimental results show that the proposed method requires significantly fewer hardware resources than the conventional method, while the differences between the results of the proposed and conventional full rectifications are negligible.
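Replacing polynomial evaluation with pure accumulation can be illustrated with forward differences, a classic technique in the same spirit; the paper's exact distortion model and circuit are not reproduced here.

```python
def forward_difference_eval(seed_values, steps):
    """Evaluate a degree-2 polynomial at successive integer points
    using only additions (forward differences). seed_values are
    p(0), p(1), p(2); the second difference is constant for degree 2."""
    p0, p1, p2 = seed_values
    d1 = p1 - p0             # first difference
    d2 = (p2 - p1) - d1      # second difference (constant)
    out = [p0]
    val = p0
    for _ in range(steps):
        val += d1            # level-1 accumulation
        d1 += d2             # level-2 accumulation of the accumulator
        out.append(val)
    return out

# p(x) = 3x^2 + 2x + 1 at x = 0..5, computed with additions only.
p = lambda x: 3 * x * x + 2 * x + 1
print(forward_difference_eval([p(0), p(1), p(2)], 5))
```

In hardware, each accumulation level is one adder and one register, which is why this style of evaluation removes multipliers entirely.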

The Impacts of Knowledge Level and Need for Closure on Overall Evaluations: Considering the Moderating Role of Situational Severity (지식수준과 종결욕구가 전반적 평가에 미치는 영향 : 상황적 심각성의 조절효과를 중심으로)

  • Kim, Cheongil
    • Knowledge Management Research
    • /
    • v.10 no.4
    • /
    • pp.115-131
    • /
    • 2009
  • This paper attempts to show that consumers' own information-processing mode can play an important role in inducing favorable product evaluations, which is a key goal of marketing. The elaboration likelihood model contends that consumers' motivation and knowledge, in addition to outside marketing information, affect the evaluation process. On the other hand, the resource matching hypothesis suggests that an excessively high level of information processing may lead to negative evaluations. In this study, need for closure worsened consumers' overall evaluations, and this relationship was more salient in the low-severity condition than in the high-severity condition. Also, under low severity, consumers with a high level of relevant knowledge made more favorable evaluations than consumers with low knowledge; in contrast, under high severity, relevant knowledge led to less favorable evaluations. This experiment supports the appropriateness of the elaboration likelihood model and the resource matching hypothesis. In particular, this study offers a rare example in which consumers' knowledge may not play a desirable role in their judgments.