• Title/Abstract/Keyword: two-step search

Search results: 154 items (processing time: 0.024 seconds)

A Study on the Encoding Method for a High-Performance Moving Picture Encoder

  • 김용욱;허도근
    • 한국정보통신학회논문지 / Vol. 8, No. 2 / pp.352-358 / 2004
  • This paper studies how to improve the performance of a video encoder in the H.263 environment by using a new motion vector search algorithm that exploits the distribution characteristics of motion vectors, together with an integer DCT (Discrete Cosine Transform) that requires only integer operations. The integer DCT performs the DCT with integer additions alone, based on the Walsh-Hadamard Transform and integer lifting, so it reduces the amount of computation compared with the conventional DCT, which involves floating-point multiplications, while achieving the same PSNR. The new motion vector search algorithm reduces the computation required for motion estimation compared with the conventional three-step search (3SS) and four-step search (4SS) algorithms while showing nearly the same PSNR. Simulations also show that the integer DCT and the conventional DCT are compatible with each other in the H.263 encoder environment. Therefore, the encoding method proposed in this paper enables efficient real-time processing of video data in H.263 encoding and can be applied to other video encoders to improve their coding performance.
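
For reference, the 3SS baseline that this abstract compares against is the classic three-step block-matching search. Below is a minimal, illustrative Python/NumPy sketch of 3SS using a sum-of-absolute-differences (SAD) criterion; the frames, block size, and search range are hypothetical, and this is not the paper's proposed search algorithm.

```python
import numpy as np

def sad(cur_block, ref, y, x):
    """Sum of absolute differences between cur_block and the same-sized
    block of ref whose top-left corner is (y, x)."""
    h, w = cur_block.shape
    if y < 0 or x < 0 or y + h > ref.shape[0] or x + w > ref.shape[1]:
        return np.inf                      # candidate falls outside the frame
    return np.abs(cur_block.astype(int) - ref[y:y+h, x:x+w].astype(int)).sum()

def three_step_search(cur, ref, by, bx, block=16, search=7):
    """Classic 3SS: start with a step of about search/2 and halve it each
    stage, keeping the best of the centre and its 8 neighbours."""
    cur_block = cur[by:by+block, bx:bx+block]
    best_y, best_x = by, bx
    step = max(1, (search + 1) // 2)       # e.g. 4 -> 2 -> 1 for search=7
    while step >= 1:
        candidates = [(best_y + dy * step, best_x + dx * step)
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
        best_y, best_x = min(candidates, key=lambda p: sad(cur_block, ref, *p))
        step //= 2
    return best_y - by, best_x - bx        # motion vector (dy, dx)

# Hypothetical usage on random 8-bit frames
ref = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
cur = np.roll(ref, (2, -3), axis=(0, 1))   # shifted reference as the current frame
print(three_step_search(cur, ref, 16, 16)) # estimated (dy, dx) for the block at (16, 16)
```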

A Form-finding Technique for Three-dimensional Spatial Structures

  • Lee, Sang Jin
    • Architectural Research / Vol. 15, No. 4 / pp.207-214 / 2013
  • A form-finding technique is proposed for three-dimensional spatial structures. A two-step discrete finite element (FE) mesh generator based on computer-aided geometric design (CAGD) is introduced and used to control the shape of three-dimensional spatial structures. A mathematical programming technique is adopted to search for new forms (or shapes) of spatial structures. For this purpose, the strain energy is introduced as the objective function to be minimized, and the initial volume (or the initial weight) is imposed as the constraint. Numerical examples are carried out to test the capability of the proposed form-finding technique and are provided as benchmark tests.
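
The form-finding step described above, minimizing strain energy subject to a constraint on the initial volume, is a standard constrained optimization. The sketch below only illustrates that formulation with SciPy, using a made-up quadratic strain-energy model U(x) = 0.5·xᵀKx and a linear volume function in place of the paper's CAGD/FE machinery.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-ins for the FE quantities used in the paper:
# x - shape/design variables, K - stiffness-like matrix, v - volume coefficients
rng = np.random.default_rng(0)
A = rng.normal(size=(6, 6))
K = A @ A.T + np.eye(6)            # symmetric positive definite
v = np.abs(rng.normal(size=6)) + 0.1
x0 = np.ones(6)                    # initial form
V0 = v @ x0                        # initial volume to be preserved

def strain_energy(x):
    return 0.5 * x @ K @ x

volume_constraint = {"type": "eq", "fun": lambda x: v @ x - V0}

res = minimize(strain_energy, x0, method="SLSQP",
               constraints=[volume_constraint])
print("new form:", res.x)
print("strain energy:", res.fun, " volume:", v @ res.x)
```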

A Hybrid Heuristic for Reliability Optimization in Complex Systems

  • 김재환
    • 해양환경안전학회지 / Vol. 5, No. 2 / pp.87-97 / 1999
  • This study is concerned with developing a hybrid heuristic algorithm for solving the redundancy optimization problem, which is very important in system safety. This study develops a HH (Hybrid Heuristic) method combined with two strategies to alleviate the risk of being trapped at a local optimum. One is to construct the population of initial solutions randomly. The other is an additional search with the SA (Simulated Annealing) method in the final step. Computational results indicate that HH performs consistently better than the KY method proposed in Kim[8]. Therefore, the proposed HH is believed to be an attractive alternative to other heuristic methods.
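
The two strategies named in this abstract, a randomly constructed population of initial solutions followed by a final simulated-annealing search, can be illustrated on a toy redundancy-allocation problem. The component reliabilities, costs, and budget below are hypothetical, and the sketch is not the paper's HH algorithm.

```python
import math, random

r = [0.80, 0.85, 0.90]          # component reliabilities (hypothetical)
c = [2.0, 3.0, 1.5]             # component costs (hypothetical)
budget = 25.0

def reliability(n):             # series system of parallel-redundant stages
    return math.prod(1 - (1 - ri) ** ni for ri, ni in zip(r, n))

def cost(n):
    return sum(ci * ni for ci, ni in zip(c, n))

def feasible(n):
    return all(ni >= 1 for ni in n) and cost(n) <= budget

# Strategy 1: random population of feasible initial solutions
random.seed(1)
population = []
while len(population) < 20:
    cand = [random.randint(1, 4) for _ in r]
    if feasible(cand):
        population.append(cand)
best = max(population, key=reliability)

# Strategy 2: simulated-annealing refinement in the final step
T, current = 1.0, best[:]
while T > 1e-3:
    neigh = current[:]
    i = random.randrange(len(neigh))
    neigh[i] += random.choice((-1, 1))    # add or remove one redundant unit
    if feasible(neigh):
        delta = reliability(neigh) - reliability(current)
        if delta > 0 or random.random() < math.exp(delta / T):
            current = neigh
            if reliability(current) > reliability(best):
                best = current[:]
    T *= 0.95                             # cooling schedule

print(best, reliability(best), cost(best))
```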

A Study on a Block Matching Algorithm with Variable Block Size

  • 김진태;주창희;최종수
    • 대한전자공학회논문지 / Vol. 26, No. 9 / pp.1420-1427 / 1989
  • A new block matching algorithm that improves on existing block matching algorithms in terms of image quality is proposed in this paper. A subblock of the image that includes a vertical edge of an object is subdivided into two new subblocks, and a motion vector is found for each. Computer simulation on real images shows that the proposed algorithm achieves image quality 1.1 dB higher than that of the three-step search algorithm.
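
A minimal sketch of the idea described above: detect a strong vertical edge inside a block, split the block into two sub-blocks at that edge, and estimate a motion vector for each sub-block (here with an exhaustive SAD search). The block size, search range, and edge threshold are hypothetical.

```python
import numpy as np

def sad(a, b):
    return np.abs(a.astype(int) - b.astype(int)).sum()

def full_search(block, ref, y, x, rng=4):
    """Exhaustive SAD search of `block` around (y, x) in the reference frame."""
    h, w = block.shape
    best, best_mv = np.inf, (0, 0)
    for dy in range(-rng, rng + 1):
        for dx in range(-rng, rng + 1):
            yy, xx = y + dy, x + dx
            if 0 <= yy <= ref.shape[0] - h and 0 <= xx <= ref.shape[1] - w:
                s = sad(block, ref[yy:yy+h, xx:xx+w])
                if s < best:
                    best, best_mv = s, (dy, dx)
    return best_mv

def match_variable_block(cur, ref, y, x, size=16, edge_thresh=400):
    block = cur[y:y+size, x:x+size]
    col_grad = np.abs(np.diff(block.astype(int), axis=1)).sum(axis=0)
    if col_grad.max() < edge_thresh:                  # no strong vertical edge
        return [full_search(block, ref, y, x)]
    split = int(np.argmax(col_grad)) + 1              # column where the edge lies
    left, right = block[:, :split], block[:, split:]  # two sub-blocks
    return [full_search(left, ref, y, x),
            full_search(right, ref, y, x + split)]
```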

Low-delay Node-disjoint Multi-path Routing using Complementary Trees for Industrial Wireless Sensor Networks

  • Liu, Luming;Ling, Zhihao;Zuo, Yun
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 5, No. 11 / pp.2052-2067 / 2011
  • Complementary trees are two spanning trees rooted at the sink node such that any source node's two paths to the sink node on the two trees are node-disjoint. The complementary trees routing strategy is a special node-disjoint multi-path routing approach. Several complementary trees routing algorithms have been proposed, in which path discovery methods based on depth-first search (DFS) or Dijkstra's algorithm are used to find a path for augmentation in each round of the path augmentation step. In this paper, a novel path discovery method based on multi-tree-growing (MTG) is presented, to the best of our knowledge for the first time. Based on this path discovery method, a complementary trees routing algorithm is developed with the objectives of low average path length on both spanning trees and low complexity. Measures are employed in our complementary trees routing algorithm to add a path whose nodes are near the sink node in each round of the path augmentation step. The simulation results demonstrate that our complementary trees routing algorithm achieves low average path length on both spanning trees with low running time, making it suitable for wireless sensor networks in industrial scenarios.
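
The defining property stated above, that every source node's two paths to the sink on the two trees are node-disjoint, can be checked with a small helper. The sketch below assumes the two spanning trees are given as parent maps rooted at the sink; it only verifies the complementary-trees property and does not implement the paper's MTG construction.

```python
def path_to_sink(parent, node, sink):
    """Follow parent pointers from `node` up to the sink."""
    path = [node]
    while path[-1] != sink:
        path.append(parent[path[-1]])
    return path

def are_complementary(parent1, parent2, sink):
    """True if, for every node, its two tree paths to the sink share no
    node other than the endpoints."""
    for node in parent1:
        if node == sink:
            continue
        p1 = set(path_to_sink(parent1, node, sink)) - {node, sink}
        p2 = set(path_to_sink(parent2, node, sink)) - {node, sink}
        if p1 & p2:
            return False
    return True

# Hypothetical 5-node example: sink S, sources A-D, trees given as parent maps
t1 = {"A": "S", "B": "A", "C": "S", "D": "C"}
t2 = {"A": "B", "B": "S", "C": "D", "D": "S"}
print(are_complementary(t1, t2, "S"))   # True for this pair of trees
```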

Improving Cover Song Search Accuracy by Extracting Salient Chromagram Components

  • 서진수
    • 한국멀티미디어학회논문지 / Vol. 22, No. 6 / pp.639-645 / 2019
  • This paper proposes a salient chromagram component extraction method based on the temporal discrete cosine transform of a chromagram block to improve cover song retrieval accuracy. The proposed salient chromagram emphasizes the tonal content of music, which is well preserved between an original song and its cover version, while reducing the effect of timbre differences. We apply the proposed salient chromagram extraction method as a preprocessing step for Fourier-transform-based cover song matching. Experiments on two cover song datasets confirm that the proposed salient chromagram improves cover song search accuracy.
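
A minimal sketch of the preprocessing idea described above: apply a DCT along the time axis of each chromagram block, keep only the lowest temporal coefficients, and invert, which retains the slowly varying tonal content. The block length and number of retained coefficients are hypothetical choices, not the paper's settings.

```python
import numpy as np
from scipy.fft import dct, idct

def salient_chromagram(chroma, block_len=20, keep=5):
    """chroma: (12, T) array. DCT each (12, block_len) block along time,
    zero out the higher temporal coefficients, then inverse-DCT."""
    out = np.zeros_like(chroma, dtype=float)
    T = chroma.shape[1]
    for start in range(0, T, block_len):
        block = chroma[:, start:start + block_len]
        coeffs = dct(block, axis=1, norm="ortho")
        coeffs[:, keep:] = 0.0                  # drop fast temporal variation
        out[:, start:start + block.shape[1]] = idct(coeffs, axis=1, norm="ortho")
    return out

# Hypothetical usage on a random chromagram
chroma = np.random.rand(12, 200)
salient = salient_chromagram(chroma)
print(salient.shape)                            # (12, 200)
```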

Application of Correlation-Aided DSA (CDSA) Technique to Fast Cell Search in IMT-2000 W-CDMA Systems

  • Kim, Byoung-Hoon;Jeong, Byeong-Kook;Lee, Byeong-Gi
    • Journal of Communications and Networks / Vol. 2, No. 1 / pp.58-68 / 2000
  • In this paper we introduce the correlation-aided distributed sample acquisition (CDSA) scheme for fast cell search in the IMT-2000 W-CDMA cellular system. The proposed scheme incorporates a state symbol correlation process into the comparison-correction based synchronization process of the original DSA scheme to enable fast acquisition even under very poor channel conditions. For its realization, each mobile station (MS) has to store in its memory a set of state sample sequences, which are determined by the long-period scrambling sequences used in the system and the sampling interval of the state samples. CDSA-based cell search is carried out in two stages. First, the MS acquires the slot timing by using the primary synch code (PSC) and then identifies the igniter code, which conveys the state samples of the current cell. Second, the MS identifies the scrambling code and frame timing by taking the comparison-correction based synchronization approach and, if the identification is not completed satisfactorily within a preset time, it initiates the state symbol correlation process, which correlates the received symbol sequence with the pre-stored state sample sequences for a successful identification. As the state symbol SNR is relatively high, the state symbol correlation process enables reliable synchronization even in very low chip-SNR environments. Simulation results show that the proposed CDSA scheme outperforms the 3GPP 3-step approach, requiring about 7 dB less signal power to achieve the same acquisition time performance in low-SNR environments. Furthermore, it turns out to be very robust in the typical synchronization environment where a large frequency offset exists.
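
The state symbol correlation process of the second stage can be illustrated as correlating the received symbol sequence against each pre-stored state sample sequence and picking the strongest match. The binary sequences and noise level below are hypothetical; this sketches only the matching step, not the full CDSA scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pre-stored state sample sequences (one per candidate code)
num_codes, seq_len = 8, 64
stored = rng.choice([-1.0, 1.0], size=(num_codes, seq_len))

# Received symbols: one of the stored sequences plus noise (high symbol SNR)
true_code = 5
received = stored[true_code] + 0.3 * rng.normal(size=seq_len)

# Correlate against every stored sequence and pick the strongest match
correlations = stored @ received
identified = int(np.argmax(correlations))
print(identified == true_code, correlations[identified])
```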

Facial Feature Extraction Using a Genetic Algorithm from an Original Image

  • 이형우;이상진;박석일;민홍기;홍승홍
    • 대한전자공학회:학술대회논문집 / 대한전자공학회 2000년도 하계종합학술대회 논문집(4) / pp.214-217 / 2000
  • Many studies on human recognition and coding schemes have been performed recently. In this context, we propose an automatic facial feature extraction algorithm. There are two main steps: evaluating the face region from an original background image, such as an office scene, and extracting the facial features from the evaluated face region. In the first step, a genetic algorithm is adopted to search for the face region in backgrounds such as offices and households, and in the second step, a template matching method is used to extract the facial features. The proposed algorithm allows facial features to be extracted faster and more accurately.
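
A minimal sketch of the first step described above: a genetic algorithm whose chromosomes encode a candidate face region (x, y, w, h) and whose fitness scores how face-like the region is. The placeholder fitness used here (region size and closeness to the image centre) is purely hypothetical and stands in for the paper's actual measure.

```python
import random
random.seed(0)

IMG_W, IMG_H = 320, 240          # hypothetical image size

def random_region():
    w, h = random.randint(40, 120), random.randint(40, 120)
    return (random.randint(0, IMG_W - w), random.randint(0, IMG_H - h), w, h)

def crossover(a, b):
    return tuple(random.choice(pair) for pair in zip(a, b))

def mutate(r):
    x, y, w, h = r
    x = min(IMG_W - w, max(0, x + random.randint(-10, 10)))
    y = min(IMG_H - h, max(0, y + random.randint(-10, 10)))
    return (x, y, w, h)

def ga_face_search(score, pop_size=30, generations=50):
    """Evolve (x, y, w, h) regions; `score` rates how face-like a region is."""
    pop = [random_region() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=score, reverse=True)
        parents = pop[:pop_size // 2]                       # selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=score)

# Placeholder score: prefer large regions near the image centre.
# A real fitness would measure face-likeness of the pixels in the region.
def centre_score(region):
    x, y, w, h = region
    return w * h - 5 * (abs(x + w / 2 - IMG_W / 2) + abs(y + h / 2 - IMG_H / 2))

print(ga_face_search(centre_score))
```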

A Study on the Flying Characteristics of Zero-Load Sliders

  • 윤상준;강태식;최동훈
    • Tribology and Lubricants / Vol. 11, No. 2 / pp.15-23 / 1995
  • A zero-load slider is composed of two outside rails, which produce a lift force pushing the slider up from the disk surface, and a wide reverse-step region, which produces a suction force attracting the slider toward the disk surface. In this paper, the flying characteristics of zero-load sliders are obtained by using an optimization technique. In the pressure calculation module, the FIFD scheme is used to solve the modified Reynolds equation. The BFGS method and a line search algorithm are employed to predict the static flying attitude. To investigate the effect of the geometric parameters of zero-load sliders on the flying characteristics, the recess depth, front step width, rail width, and taper height are varied and the corresponding flying attitudes are obtained. Simulation results demonstrate that the recess depth and rail width have significant influences on the flying characteristics.
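
The attitude search described above (BFGS with a line search driving the slider to static equilibrium) can be illustrated with a toy model: minimize the squared net force and moment as functions of flying height and pitch. The force model below is a made-up placeholder, not a solution of the modified Reynolds equation.

```python
import numpy as np
from scipy.optimize import minimize

def net_force_and_moment(attitude):
    """Made-up smooth air-bearing model (NOT the modified Reynolds equation):
    rail lift minus reverse-step suction, plus a pitching moment, as
    functions of flying height h and pitch p."""
    h, p = attitude
    force = 1.0 / h - 1.2 / (h + 0.2) - 0.5 * p
    moment = 0.3 * p - 0.05 / h
    return force, moment

def equilibrium_residual(attitude):
    f, m = net_force_and_moment(attitude)
    return f * f + m * m          # zero at the static flying attitude

# BFGS (with its built-in line search) drives the residual toward zero
res = minimize(equilibrium_residual, x0=[0.5, 0.1], method="BFGS")
print("flying height, pitch:", res.x, "residual:", res.fun)
```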

Development of a Robust Design Process Using a Robustness Index

  • 황광현;박경진
    • 대한기계학회논문집A / Vol. 27, No. 8 / pp.1426-1435 / 2003
  • The design goal is to find the design that has the highest probability of success and the smallest variation. A robustness index is proposed to satisfy these conditions. The two-step optimization process of the target problem requires a scaling factor. The search for a scaling factor is replaced by constructing a decoupled design between the mean and the standard deviation. The decoupled design matrix is formed from the sensitivities or the sums of squares. After establishing the design matrix, the robust design process becomes a new three-step one. The first step is to "reduce variability," the second is to "make the candidate designs that satisfy the constraints and move the mean onto the target," and the final step is to "select the best robust design using the proposed robustness index." The robust design process is verified with three examples, and the results using the robustness index are compared with those of other indices.
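
The final step described above, selecting the best design with the robustness index, can be illustrated with a generic index (the paper's own definition is not reproduced here): score each candidate by its mean constraint margin divided by the margin's standard deviation under hypothetical input noise, and keep the candidate with the largest score.

```python
import numpy as np

rng = np.random.default_rng(0)

def constraint_margin(x):
    """Hypothetical constraint g(x) <= limit; margin = limit - g(x)."""
    limit = 10.0
    g = x[0] ** 2 + 2.0 * x[1]
    return limit - g

def robustness_index(x, noise_std=0.1, n=2000):
    """Mean margin divided by its standard deviation under input noise:
    larger means the design is both feasible and insensitive to variation."""
    samples = x + rng.normal(scale=noise_std, size=(n, len(x)))
    margins = np.array([constraint_margin(s) for s in samples])
    return margins.mean() / margins.std()

# Candidate designs that already satisfy the constraint on average
candidates = [np.array([1.0, 2.0]), np.array([2.0, 1.0]), np.array([0.5, 3.0])]
best = max(candidates, key=robustness_index)
print("most robust candidate:", best)
```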