• Title/Summary/Keyword: computer based estimation

Search Result 1,366, Processing Time 0.026 seconds

Management of Integrated Summary Data Based on the Wavelet Transform for Internet Query Processing (인터넷 질의 처리를 위한 웨이블릿 변환에 기반한 통합 요약정보의 관리)

  • Joe, Moon-Jeung;Whang, Kyu-Young;Kim, Sang-Wook;Shim, Kyu-Seok
    • Journal of KIISE:Databases
    • /
    • v.28 no.4
    • /
    • pp.702-714
    • /
    • 2001
  • As Internet technology evolves, there is a growing need for Internet queries involving multiple information sources. Efficient processing of such queries requires integrated summary data that compactly represents the data distribution of the entire database scattered over many information sources. This paper presents an efficient method of managing integrated summary data based on the wavelet transform and addresses Internet query processing using such data. The simplest method for creating the integrated summary data would be to summarize the integrated data distribution obtained by merging the data distributions of the multiple information sources. However, this method suffers from the high cost of transmitting, storing, and merging a large amount of distribution data. To overcome these drawbacks, we propose a new wavelet-transform-based method that creates the integrated summary data by merging multiple summary data, together with an effective method for optimizing Internet queries using it. Wavelet-transformed summary data are converted so as to satisfy the conditions for merging, and the merging process is very simple owing to the properties of the wavelet transform. We formally derive an upper bound on the error of the wavelet-transformed integrated summary data. In experiments, the wavelet-transformed integrated summary data proves to be 1.6~5.5 times more accurate than histogram-based integrated summary data when used for selectivity estimation. In processing Internet top-N queries involving 56 information sources, using the integrated summary data reduces the processing cost to 1/44 of the cost of not using it.
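The merge step described above relies on the linearity of the wavelet transform: transforming the sum of two data distributions yields the same coefficients as summing the two transforms. A minimal sketch using an unnormalized Haar transform (illustrative only; the paper's exact normalization and merge conditions are not reproduced here):

```python
def haar(v):
    # unnormalized Haar decomposition of a length-2^k vector:
    # repeatedly replace adjacent pairs by (average, half-difference)
    out = []
    v = list(v)
    while len(v) > 1:
        avgs = [(v[i] + v[i + 1]) / 2 for i in range(0, len(v), 2)]
        difs = [(v[i] - v[i + 1]) / 2 for i in range(0, len(v), 2)]
        out = difs + out
        v = avgs
    return v + out  # [overall average, coarse-to-fine detail coefficients]

h1 = [2, 2, 0, 2]  # histogram from source 1 (toy data)
h2 = [1, 3, 5, 1]  # histogram from source 2 (toy data)
merged_direct = haar([a + b for a, b in zip(h1, h2)])       # transform of the merged distribution
merged_summaries = [a + b for a, b in zip(haar(h1), haar(h2))]  # merge of the two summaries
# by linearity, the two results are identical
```

Merging summaries therefore never requires shipping the full distributions, which is the cost saving the abstract describes.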


Fast Algorithm for Disparity Estimation in ATSC-M/H based Hybrid 3DTV (ATSC-M/H 기반의 융합형 3DTV를 위한 양안시차 고속 추정 알고리즘)

  • Lee, Dong-Hee;Kim, Sung-Hoon;Lee, Jooyoung;Kang, Dongwook;Jung, Kyeong-Hoon
    • Journal of Broadcast Engineering
    • /
    • v.19 no.4
    • /
    • pp.521-532
    • /
    • 2014
  • ATSC-M/H based hybrid 3DTV, one of the service-compatible 3DTV systems, has a considerable quality gap between the left and right views. The CRA (Conditional Replenishment Algorithm) has been proposed to deal with this resolution mismatch and improve the visual quality. In CRA, the disparity vectors of the stereoscopic images are estimated; the disparity-compensated left view and the simply enlarged right view are then compared and conditionally selected to generate the enhanced right view. A fast algorithm is strongly required to implement CRA because the disparity vectors must be obtained at every layer and the complexity of CRA is quite high. In this paper, we adopt SDSP (Small Diamond Search Pattern) instead of full search, predict the initial position of the search pattern by examining the spatio-temporal correlation of the disparity vectors, and also suggest a SKIP mode to limit the number of processing units. Computer simulation showed that the proposed fast algorithm greatly reduces the processing time while minimizing the quality degradation of the reconstructed right view.
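The SDSP mentioned above evaluates a centre point and its four diamond neighbours, recentering on the best candidate until the centre itself wins. A minimal sketch over a hypothetical matching-cost surface (the paper's layer structure, initial-position prediction, and SKIP mode are not modeled here):

```python
def sdsp(cost, start=(0, 0), max_steps=100):
    # Small Diamond Search Pattern: evaluate the centre and its 4 diamond
    # neighbours; move to the best candidate, stop when the centre wins
    cx, cy = start
    for _ in range(max_steps):
        cand = [(cx, cy), (cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)]
        best = min(cand, key=cost)
        if best == (cx, cy):
            break
        cx, cy = best
    return cx, cy

# toy block-matching cost surface with its minimum at displacement (3, -2)
cost = lambda p: (p[0] - 3) ** 2 + (p[1] + 2) ** 2
found = sdsp(cost)
```

Against a full search over a ±16 window (1089 evaluations), the pattern reaches the minimum in a handful of steps, which is the source of the speed-up.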

The Study for Performance Analysis of Software Reliability Model using Fault Detection Rate based on Logarithmic and Exponential Type (로그 및 지수형 결함 발생률에 따른 소프트웨어 신뢰성 모형에 관한 신뢰도 성능분석 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.9 no.3
    • /
    • pp.306-311
    • /
    • 2016
  • Software reliability is an important issue in the software development process. Infinite-failure NHPP software reliability models presented in the literature exhibit either constant, monotonically increasing, or monotonically decreasing failure occurrence rates per fault. In this paper, a software reliability cost model considering logarithmic and exponential fault detection rates, based on observations from the software product testing process, is studied. A new fault probability is added using the Goel-Okumoto model, which is widely used in the field of reliability problems; when faults are corrected or modified in the software, a finite-failure non-homogeneous Poisson process model results. To analyze the software reliability model considering the time-dependent fault detection rate, the parameters were estimated by maximum likelihood estimation from inter-failure time data. The logarithmic and exponential fault detection models were confirmed to be efficient in terms of reliability (the coefficient of determination is 80% or more) and can be used as alternatives to the conventional models. From this paper, software developers should consider the life distribution, using prior knowledge of the software, to help identify failure modes.
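The Goel-Okumoto model referenced above has mean value function m(t) = a(1 - e^(-bt)) and intensity λ(t) = ab·e^(-bt), and its parameters are fitted by maximizing the NHPP log-likelihood over observed failure times. A sketch with hypothetical data and a crude grid search standing in for the MLE (the paper's logarithmic/exponential detection-rate variants are not reproduced here):

```python
import math

def go_mean(t, a, b):
    # Goel-Okumoto mean value function m(t) = a * (1 - exp(-b t))
    return a * (1.0 - math.exp(-b * t))

def go_loglik(times, a, b):
    # NHPP log-likelihood with intensity lambda(t) = a*b*exp(-b t),
    # for failure times observed up to the last failure
    return sum(math.log(a * b) - b * t for t in times) - go_mean(times[-1], a, b)

times = [5.0, 12.0, 25.0, 60.0]  # hypothetical cumulative failure times
# crude grid stand-in for maximum likelihood estimation
ll, a_hat, b_hat = max(((go_loglik(times, a, b), a, b)
                        for a in (4.0, 6.0, 8.0)
                        for b in (0.01, 0.03, 0.05)), key=lambda t: t[0])
```

In practice the MLE is found by a numerical optimizer rather than a grid, but the likelihood being maximized has exactly this form.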

A Study on the performance improvement by loop interference cancellation and adaptive equalizer in OFDMA based Wibro relay station (OFDMA 기반 Wibro 중계국에서 루프 간섭 제거 및 적응 등화기를 이용한 성능 개선에 관한 연구)

  • Lee, Chong-Hyun;Lim, Seung-Gag
    • Journal of the Institute of Electronics Engineers of Korea TC
    • /
    • v.43 no.11 s.353
    • /
    • pp.141-148
    • /
    • 2006
  • This paper deals with performance improvement by eliminating the loop interference signal and inserting an adaptive equalizer for phase compensation in an OFDMA-based Wibro relay station. The Wibro relay station is used to extend the communication service area and to improve the throughput of the base station. Loop interference is an important factor determining relay station performance when the transmitter and receiver are located very close to each other. To design the interference canceller, we generated a base-band OFDMA signal and transmitted it along with pilot tones aligned in two different combinations for the training mode. We then generated the received signal, faded by the loop interference, and added noise to it. In the receiver, the transmitted signal is recovered by eliminating the interference signal using the channel estimate and compensating the phase with the adaptive equalizer. The performance improvement was verified by computer simulation showing the channel estimation, the signal constellation, and the BER characteristics according to the variation of the SNR.
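The phase compensation step can be illustrated with a single-tap complex LMS equalizer trained on known pilots: the tap converges toward the inverse of the channel, undoing a phase rotation. This is a generic LMS sketch under an assumed pure-phase channel, not the paper's specific canceller/equalizer design:

```python
import cmath

def lms_one_tap(received, desired, mu=0.2):
    # single complex-tap LMS: w converges toward the inverse channel,
    # compensating a pure phase rotation on unit-modulus pilots
    w = 1 + 0j
    for x, d in zip(received, desired):
        e = d - w * x            # error against the known pilot
        w += mu * e * x.conjugate()
    return w

h = cmath.exp(1j * cmath.pi / 6)      # assumed channel: 30-degree phase rotation
pilots = [1 + 0j, -1 + 0j, 1j, -1j] * 50   # unit-modulus training pilots
rx = [h * p for p in pilots]
w = lms_one_tap(rx, pilots)
# w * h is close to 1: the equalizer has undone the phase rotation
```

After training, applying `w` to data symbols rotates the constellation back into place, which is what the constellation plots in the simulation would show.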

CNN-based Image Rotation Correction Algorithm to Improve Image Recognition Rate (이미지 인식률 개선을 위한 CNN 기반 이미지 회전 보정 알고리즘)

  • Lee, Donggu;Sun, Young-Ghyu;Kim, Soo-Hyun;Sim, Issac;Lee, Kye-San;Song, Myoung-Nam;Kim, Jin-Young
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.20 no.1
    • /
    • pp.225-229
    • /
    • 2020
  • Recently, convolutional neural networks (CNNs) have shown outstanding performance in image recognition, image processing, computer vision, and related fields. In this paper, we propose a CNN-based image rotation correction algorithm as a solution to the image rotation problem, one of the factors that reduce the recognition rate in CNN-based image recognition systems. We trained our deep learning model on the Leeds Sports Pose dataset to extract the rotation angle, which is randomly set within a specific range. The trained model is evaluated by the mean absolute error (MAE) over 100 test images, and an MAE of 4.5951 is obtained.
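The MAE metric used for the evaluation above is simply the average absolute difference between predicted and true rotation angles. A minimal sketch with hypothetical angle values (the trained network itself is not reproduced here):

```python
def mean_absolute_error(pred, true):
    # MAE over paired predictions and ground-truth values
    return sum(abs(p - t) for p, t in zip(pred, true)) / len(pred)

# hypothetical predicted vs. ground-truth rotation angles in degrees
pred_angles = [12.0, -30.5, 88.0]
true_angles = [10.0, -28.0, 90.0]
mae = mean_absolute_error(pred_angles, true_angles)   # (2 + 2.5 + 2) / 3
```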

Region-based Spectral Correlation Estimator for Color Image Coding (컬러 영상 부호화를 위한 영역 기반 스펙트럴 상관 추정기)

  • Kwak, Noyoon
    • Journal of Digital Contents Society
    • /
    • v.17 no.6
    • /
    • pp.593-601
    • /
    • 2016
  • This paper concerns a Region-based Spectral Correlation Estimation (RSCE) coding method that achieves a high compression ratio by estimating the color component images from the luminance image. The proposed method consists of three steps. First, a Y/C bit-plane summation image is defined using the normalized chrominance summation image and the luminance image, and is then segmented to extract the shape information of the regions. Second, the scale factor and the offset factor minimizing the squared approximation errors between the luminance image and the R and B images are calculated for each region. Finally, the scale factor and the offset factor for each region are encoded into the bit stream. Computer simulation results show that the proposed method provides a compression ratio more than two to three times higher than the JPEG/Baseline or JPEG2000/EBCOT algorithms, in terms of the bpp needed to encode the two color component images at the same PSNR.
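The per-region scale and offset factors in the second step are the closed-form least-squares solution of fitting a color channel as an affine function of luminance. A sketch on a hypothetical region (segmentation and bit-stream encoding are not modeled):

```python
def scale_offset(lum, chan):
    # least-squares fit chan ≈ a * lum + b over one region (closed form)
    n = len(lum)
    ml, mc = sum(lum) / n, sum(chan) / n
    cov = sum((y - ml) * (c - mc) for y, c in zip(lum, chan))
    var = sum((y - ml) ** 2 for y in lum)
    a = cov / var
    b = mc - a * ml
    return a, b

# hypothetical region where R is an exact affine function of luminance
lum = [10, 20, 30, 40]
red = [25, 45, 65, 85]           # 2 * lum + 5
a, b = scale_offset(lum, red)    # recovers a = 2.0, b = 5.0
```

Transmitting only `(a, b)` per region instead of the full color channel is what yields the high compression ratio.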

Reliability-Based Service Life Estimation of Concrete in Marine Environment (신뢰성이론에 기반한 해양환경 콘크리트의 내구수명 평가)

  • Kim, Ki-Hyun;Cha, Soo-Won
    • Journal of the Korea Concrete Institute
    • /
    • v.22 no.4
    • /
    • pp.595-603
    • /
    • 2010
  • The Monte-Carlo simulation technique is often used to predict, on the basis of probability theory, the service life of concrete structures subjected to chloride penetration in a marine environment. However, the Monte-Carlo method gives different results every time the simulation is run. On the other hand, the moment method, which is frequently used in reliability analysis, requires negligible computational cost compared with simulation techniques and gives a constant result for the same problem. Thus, in this study, the moment method was applied to the calculation of the corrosion-initiation probability. For this purpose, computer programs to calculate failure probabilities were developed using the first-order second-moment (FOSM) and second-order second-moment (SOSM) methods, respectively. From analysis examples with the developed programs, SOSM was found to give more accurate results than FOSM. A sensitivity analysis showed that the factor affecting the corrosion-initiation probability the most was the cover depth, and that the corrosion-initiation probability was influenced more by its coefficient of variation than by its mean value.
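For a linear limit state g = R - S with independent normal resistance R and load S, the FOSM method mentioned above reduces to a reliability index β and a deterministic failure probability Φ(-β), with no sampling noise. A sketch with hypothetical parameter values (the study's actual chloride-penetration limit state is more involved):

```python
import math

def std_normal_cdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fosm_failure_prob(mu_r, sig_r, mu_s, sig_s):
    # FOSM for g = R - S with independent normal R, S:
    # beta = (mu_R - mu_S) / sqrt(sig_R^2 + sig_S^2), Pf = Phi(-beta)
    beta = (mu_r - mu_s) / math.sqrt(sig_r ** 2 + sig_s ** 2)
    return std_normal_cdf(-beta)

# hypothetical resistance (critical chloride content) vs. load (chloride exposure)
pf = fosm_failure_prob(mu_r=1.2, sig_r=0.2, mu_s=0.6, sig_s=0.15)
```

Unlike a Monte-Carlo run, calling this twice with the same inputs always returns the same probability, which is the repeatability advantage the abstract emphasizes.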

Performance Improvement of Adaptive Hierarchical Hexagon Search by Extending the Search Patterns (탐색 패턴 확장에 의한 적응형 계층 육각 탐색의 성능 개선)

  • Kwak, No-Yoon
    • Journal of Digital Contents Society
    • /
    • v.9 no.2
    • /
    • pp.305-315
    • /
    • 2008
  • The previously proposed AHHS (Adaptive Hierarchical Hexagon Search) is a fast hierarchical block matching algorithm based on the AHS (Adaptive Hexagon Search). It keeps the merits of the AHS, which can estimate motion vectors quickly, while adaptively reducing the local minima that often occur in video sequences with high spatio-temporal motion activity. The objective of this paper is to propose a method that effectively extends the horizontal-biased and vertical-biased patterns of the AHHS to improve its predictive image quality. Based on computer simulation results for multiple video sequences with different motion characteristics, the performance of the proposed method was analyzed and assessed in terms of predictive image quality and computational time. The simulation results indicated that the proposed method is suitable for both (quasi-)stationary and large-motion searches. While the proposed method increases the computational load of extending the hexagon search patterns, it improves the predictive image quality enough to offset that increase.
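The hexagon search underlying AHS/AHHS follows the same recentering idea as diamond search but with a wider two-phase pattern: a large hexagon until the centre wins, then a small cross refinement. This sketch shows the classic hexagon-based search on a toy cost surface; the AHHS's adaptive biased-pattern extensions are not modeled here:

```python
def hexagon_search(cost, start=(0, 0), max_steps=100):
    # classic hexagon-based search: move a large hexagon until the centre
    # is best, then refine once with a small cross pattern
    big = [(2, 0), (-2, 0), (1, 2), (1, -2), (-1, 2), (-1, -2)]
    small = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    cx, cy = start
    for _ in range(max_steps):
        cand = [(cx, cy)] + [(cx + dx, cy + dy) for dx, dy in big]
        best = min(cand, key=cost)
        if best == (cx, cy):
            break
        cx, cy = best
    cand = [(cx, cy)] + [(cx + dx, cy + dy) for dx, dy in small]
    return min(cand, key=cost)

# toy matching-cost surface with its minimum at displacement (4, 3)
cost = lambda p: (p[0] - 4) ** 2 + (p[1] - 3) ** 2
found = hexagon_search(cost)
```

The large hexagon covers more ground per step than a diamond, which is why extending its patterns trades a few extra cost evaluations for better motion tracking.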


Enhanced VLC-TDoA Algorithm for Indoor Positioning Without LED-ID (LED-ID 없이 실내 위치 추정이 가능한 개선된 VLC-TDoA 알고리즘)

  • Do, Trong-Hop;Hwang, Junho;Yoo, Myungsik
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.38B no.8
    • /
    • pp.672-678
    • /
    • 2013
  • In recent years, along with the rapid development of LED technology, many applications using LEDs have been researched, and indoor positioning is one of them. In particular, previous indoor positioning systems based on visible light communication employ triangulation methods such as AoA, ToA, and TDoA, but most of them need to transmit the unique ID of each LED panel. In this paper, we propose a non-LED-ID-based indoor positioning system in which the visible light radiated from the LEDs is used to locate the position of the receiver. Compared with current indoor positioning systems using LED light, our system has the advantages of simple implementation, low cost, and high accuracy. Computer simulation shows that our system can achieve a high estimation accuracy of 3.6 cm on average in a $5{\times}5{\times}3m^3$ room.
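TDoA positioning, as used above, locates a receiver from range differences (arrival-time differences times the speed of light) relative to a reference anchor. A 2-D sketch with four hypothetical LED anchors and a coarse-to-fine grid search standing in for the paper's actual estimator:

```python
import math

def locate_tdoa(anchors, rdiff, start=(2.5, 2.5), step=0.5, iters=3):
    # coarse-to-fine grid search minimizing the TDoA range-difference residual
    def res(p):
        d = [math.dist(p, a) for a in anchors]
        return sum((d[i] - d[0] - rdiff[i - 1]) ** 2 for i in range(1, len(anchors)))
    best = start
    for _ in range(iters):
        grid = [(best[0] + i * step, best[1] + j * step)
                for i in range(-10, 11) for j in range(-10, 11)]
        best = min(grid, key=res)
        step /= 5
    return best

# four LED anchors at the corners of a 5 m x 5 m room (2-D sketch)
anchors = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0), (5.0, 5.0)]
true = (1.2, 3.4)
# noiseless range differences relative to anchor 0
rdiff = [math.dist(true, a) - math.dist(true, anchors[0]) for a in anchors[1:]]
est = locate_tdoa(anchors, rdiff)
```

With noiseless measurements the residual is zero only at the true position, so the search converges to centimetre-level accuracy of the same order as the 3.6 cm reported above.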

Double Talk Detection Based on the Fuzzy Rules in Adaptive Echo Canceller (적응 반향제거기에서 퍼지규칙에 기초한 동시통화 검출)

  • 류근택;김대성;배현덕
    • The Journal of the Acoustical Society of Korea
    • /
    • v.19 no.7
    • /
    • pp.34-41
    • /
    • 2000
  • This paper proposes a new double-talk detection algorithm based on fuzzy rules for the adaptive echo canceller of a telecommunication system. In this method, two inputs are used for the fuzzy inference that detects the double-talk condition. One is the cross-correlation coefficient between the error signal and the primary signal, which is the sum of the real echo signal and the near-end signal; the other is the cross-correlation coefficient between the estimation error signal and the primary signal. The fuzzy controller fuzzifies the two inputs with trapezoidal membership functions and performs max-min composition using if-then rules. The composed result is defuzzified by the center-of-gravity method, and from the defuzzified values, double-talk, echo path variation, and echo path variation during double-talk are detected. Computer simulation confirms that this fuzzy double-talk detector can estimate the double-talk and echo-path-variation conditions, and can even track echo path variation more accurately than the conventional algorithm during the double-talk period.
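The intuition behind the correlation inputs above: in single talk a well-converged canceller leaves an error signal uncorrelated with the primary signal, while in double talk the near-end speech appears in both, driving the correlation up. A sketch with synthetic signals and a crisp threshold standing in for the fuzzy inference (the trapezoidal membership functions and if-then rules are not modeled here):

```python
import math
import random

def corrcoef(x, y):
    # Pearson cross-correlation coefficient
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

random.seed(0)
n = 1000
echo = [random.gauss(0, 1) for _ in range(n)]             # far-end echo
near = [random.gauss(0, 1) for _ in range(n)]             # near-end speech
residual = [0.05 * random.gauss(0, 1) for _ in range(n)]  # cancellation residual

# single talk: primary = echo, error = small uncorrelated residual
rho_single = corrcoef(residual, echo)
# double talk: near-end speech appears in both primary and error
primary_double = [e + v for e, v in zip(echo, near)]
error_double = [v + r for v, r in zip(near, residual)]
rho_double = corrcoef(error_double, primary_double)

detect = lambda rho, thr=0.5: abs(rho) > thr   # hypothetical crisp threshold
```

The fuzzy controller in the paper replaces the crisp threshold with graded membership, which is what lets it also separate echo path variation from double talk.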
