• Title/Summary/Keyword: Deduplication Technique (중복제거기법)


A Study on the New Motion Estimation Algorithm of Binary Operation for Real Time Video Communication (실시간 비디오 통신에 적합한 새로운 이진 연산 움직임 추정 알고리즘에 관한 연구)

  • Lee, Wan-Bum;Shim, Byoung-Sup;Kim, Hwan-Yong
    • Journal of the Korean Institute of Intelligent Systems / v.14 no.4 / pp.418-423 / 2004
  • Block-matching motion estimation is widely used in international video compression standards such as the MPEG and H.26x series. The full search algorithm (FA), one of these block matching algorithms, is usually impractical because of the large number of computations required over a large search region. Fast search algorithms and conventional binary block matching algorithms reduce computational complexity and processing time, but they perform worse than the full search algorithm. This paper presents a new Boolean matching algorithm, called BCBM (Bit Converted Boolean Matching). The proposed algorithm achieves performance close to that of the FA using Boolean-only block matching, which can be implemented very efficiently in hardware for real-time video communication. Simulation results show that the PSNR of the proposed algorithm is about 0.08 dB lower than the FA but about 0.96∼2.02 dB higher than fast search algorithms and the conventional Boolean matching algorithm (a minimal sketch of the full-search baseline follows below).
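As context for the abstract above, a minimal sketch of the conventional full-search (FA) baseline that BCBM is compared against may help; this is not the paper's BCBM method, and the function name, block size, and search range below are my own assumptions.

```python
import numpy as np

def full_search_sad(cur, ref, bx, by, block_size=16, search_range=8):
    """Exhaustive (full-search) block matching: returns the motion vector
    minimizing the sum of absolute differences (SAD) for the block whose
    top-left corner is (by, bx) in the current frame."""
    h, w = cur.shape
    block = cur[by:by + block_size, bx:bx + block_size].astype(np.int32)
    best_mv, best_sad = (0, 0), np.inf
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + block_size > h or x + block_size > w:
                continue  # candidate block falls outside the reference frame
            cand = ref[y:y + block_size, x:x + block_size].astype(np.int32)
            sad = np.abs(block - cand).sum()
            if sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    return best_mv, best_sad

# Example: random 8-bit frames stand in for two consecutive video frames.
cur = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
ref = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
print(full_search_sad(cur, ref, bx=16, by=16))
```

Binary and Boolean matching schemes typically replace this 8-bit SAD with bitwise operations (e.g. XOR plus popcount on one-bit-converted blocks), which is what makes a hardware implementation inexpensive.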

Evaluation of Ground-Water Sampling Techniques for Analysis of Chlorofluorocarbons (지하수의 CFCs(Chlorofluorocarbons) 조사를 위한 시료 채취 방법의 평가)

  • 고동찬;이대하
    • Journal of Soil and Groundwater Environment / v.8 no.2 / pp.1-8 / 2003
  • Two ground-water sampling techniques for CFC (chlorofluorocarbon) analysis, the cold-welded copper tube method and the flame-sealed borosilicate glass ampule method, were compared and evaluated. CFC concentrations obtained with the copper tube method showed poor reproducibility among triplicates, whereas those obtained with the glass ampule method showed good agreement, with relative standard deviations of triplicates of less than 5%. The poor reproducibility of the copper tube method appears to be attributable to incomplete sealing at the connection between the wellhead faucets and the sampling apparatus. The copper tube method also yielded higher CFC concentrations than the glass ampule method, more pronounced for CFC-11 than for CFC-12; the plastic tubing and rubber gaskets of the faucets used with the copper tube method possibly contaminated the samples with CFC-11 and CFC-12. The potential for CFC contamination in the glass ampule method was eliminated by using only stainless steel and nylon and by connecting the sampling equipment directly to the main discharge pipe of the wellhead. The validity of the glass ampule method was also verified by detecting very low levels of CFCs in a ground-water sample old enough to contain negligible CFCs (a brief note on the relative standard deviation follows below).
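For reference, the reproducibility criterion mentioned above is the relative standard deviation (RSD) of triplicate measurements, i.e. the sample standard deviation divided by the mean, expressed as a percentage. A tiny sketch with hypothetical triplicate values (not data from the paper):

```python
import statistics

def relative_std_dev(values):
    """Relative standard deviation (coefficient of variation) in percent:
    sample standard deviation divided by the mean."""
    return statistics.stdev(values) / statistics.mean(values) * 100

# Hypothetical CFC-12 triplicate concentrations (pmol/kg), for illustration only.
triplicate = [1.02, 1.05, 1.00]
print(f"RSD = {relative_std_dev(triplicate):.1f}%")  # about 2.5%, under the 5% criterion
```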

Context cognition technology through integrated cyber security context analysis (통합 사이버 보안 상황분석을 통한 관제 상황인지 기술)

  • Nam, Seung-Soo;Seo, Chang-Ho;Lee, Joo-Young;Kim, Jong-Hyun;Kim, Ik-Kyun
    • Smart Media Journal / v.4 no.4 / pp.80-85 / 2015
  • As the number of applications using the Internet grows rapidly, the incidence of cyber attacks on the Internet has also been increasing. Existing detection equipment worldwide is largely limited to L3 DDoS attacks and provides only incomplete detection of intelligent application-layer attacks. Domestic products for next-generation high-performance wired and wireless networks do not yet meet the diverse requirements of threat-response security solutions and fall short of foreign products in both performance and functionality, and research on malicious-code detection and signature generation has so far centered on detecting and analyzing malware running on the Windows OS. In this paper, we survey and analyze the current status of the latest attack techniques and the analysis technologies for recognizing the monitoring and control situation through integrated cyber-attack analysis.

Context cognition technology through integrated cyber security context analysis (통합 사이버 보안 상황분석을 통한 관제 상황인지 기술)

  • Nam, Seung-Soo;Seo, Chang-Ho;Lee, Joo-Young;Kim, Jong-Hyun;Kim, Ik-Kyun
    • Journal of Digital Convergence / v.13 no.1 / pp.313-319 / 2015
  • As the number of applications using the Internet grows rapidly, the incidence of cyber attacks on the Internet has also been increasing. Existing detection equipment worldwide is largely limited to L3 DDoS attacks and provides only incomplete detection of intelligent application-layer attacks. Domestic products for next-generation high-performance wired and wireless networks do not yet meet the diverse requirements of threat-response security solutions and fall short of foreign products in both performance and functionality, and research on malicious-code detection and signature generation has so far centered on detecting and analyzing malware running on the Windows OS. In this paper, we survey and analyze the current status of the latest attack techniques and the analysis technologies for recognizing the monitoring and control situation through integrated cyber-attack analysis.

The Use of Linearly Transformed LANDSAT Data in Landuse Classification (선형 변환된 LANDSAT 데이타를 이용한 토지이용분류(낙동강 하구역을 중심으로))

  • 안철호;박병욱;김종인
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.7 no.2 / pp.85-92 / 1989
  • The aim of this study is to find combinations of effective transformed data for classifying land use and particular objects by applying remote sensing techniques, transforming LANDSAT MSS and TM data into several linearly transformed data sets. Since one of the problems in processing LANDSAT data is its sheer volume, linear transformation can be a way to analyze such vast data more efficiently and economically. The method transforms multispectral data through linear calculation and statistical transformation, thereby (1) offering simplicity over complex data, (2) allowing selective processing of redundant data and removal of unnecessary data, and (3) emphasizing the objects of the study. In this study, the data were transformed and analyzed by means of band ratioing and principal component analysis. As a classification result, the Infrared/Red ratio, which enhances the characteristics of green vegetation, was useful for distinguishing among classes, and for principal component analysis, bands 1, 2 and 7 were efficient in classifying green vegetation (a minimal band-ratio sketch follows below).
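A minimal sketch of the band ratioing used above: one spectral band is divided by another pixel by pixel, and the Infrared/Red ratio enhances green vegetation because healthy vegetation reflects strongly in the near infrared while absorbing red light. This is an illustrative sketch, not the authors' processing chain; the pixel values, the epsilon guard, and the threshold are assumptions.

```python
import numpy as np

def band_ratio(nir, red, eps=1e-6):
    """Pixel-wise Infrared/Red ratio; eps avoids division by zero."""
    return nir.astype(np.float64) / (red.astype(np.float64) + eps)

# Hypothetical 3x3 digital numbers for a near-infrared band and a red band.
nir = np.array([[80, 90, 30], [85, 88, 28], [20, 25, 95]], dtype=np.uint8)
red = np.array([[20, 22, 40], [21, 23, 38], [35, 37, 18]], dtype=np.uint8)
ratio = band_ratio(nir, red)
print(ratio.round(2))   # vegetated pixels show high IR/Red values
print(ratio > 2.0)      # a simple threshold separating green vegetation
```

Principal component analysis on the stacked bands serves the same data-reduction purpose by projecting the correlated bands onto a few uncorrelated components.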


Data hub system based on SQL/XMDR message using Wrapper for distributed data interoperability (분산 데이터 상호운용을 위한 SQL/XMDR 메시지 기반의 Wrapper를 이용한 데이터 허브 시스템)

  • Moon, Seok-Jae;Jung, Gye-Dong;Choi, Young-Keun
    • Journal of the Korea Institute of Information and Communication Engineering / v.11 no.11 / pp.2047-2058 / 2007
  • In a geographically and spatially distributed business environment, enterprises find it difficult to eliminate redundancy, filter data sources, integrate data according to standard rules and metadata, and provide a single integrated view of the data. In particular, exchanging various kinds of data from heterogeneous systems and applications regardless of type and format, and keeping the integrated information continuously and exactly synchronized, is of paramount concern. This paper therefore proposes a data hub system based on SQL/XMDR messages to overcome the semantic interoperability problems that arise when data is exchanged or joined between legacy systems. The system uses the message mapping technique of a query transformation system to keep cooperating data consistent as it is modified in real time. It can consistently maintain data modified in real time when exchanging or joining data among cooperating legacy systems, and it improves the clarity and availability of data by providing a single interface for data retrieval (a minimal query-mapping sketch follows below).
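The wrapper-style query transformation described above can be illustrated with a small sketch in which queries written against a canonical hub schema are rewritten into each legacy system's local column names. This is not the paper's SQL/XMDR implementation; the mapping table, system names, and column names are hypothetical.

```python
# Hypothetical wrapper-style query rewriting: a canonical (hub) column name is
# mapped to each legacy system's local column name before the query is sent on.
CANONICAL_TO_LOCAL = {
    "erp": {"customer_id": "CUST_NO", "customer_name": "CUST_NM"},
    "crm": {"customer_id": "client_id", "customer_name": "client_name"},
}

def rewrite_query(canonical_sql: str, target_system: str) -> str:
    """Rewrite a query written against the hub's canonical schema into the
    column names used by one legacy system. Naive string substitution is
    enough for this sketch; a real wrapper would parse the SQL."""
    sql = canonical_sql
    for canonical, local in CANONICAL_TO_LOCAL[target_system].items():
        sql = sql.replace(canonical, local)
    return sql

hub_query = "SELECT customer_id, customer_name FROM customers"
print(rewrite_query(hub_query, "erp"))  # SELECT CUST_NO, CUST_NM FROM customers
print(rewrite_query(hub_query, "crm"))  # SELECT client_id, client_name FROM customers
```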

Efficient Computing Algorithm for Inter Prediction SAD of HEVC Encoder (HEVC 부호기의 Inter Prediction SAD 연산을 위한 효율적인 알고리즘)

  • Jeon, Sung-Hun;Ryoo, Kwangki
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2016.10a / pp.397-400 / 2016
  • In this paper, we propose an efficient computing algorithm for the inter prediction SAD of a high-performance HEVC encoder. Motion estimation (ME) in HEVC inter prediction searches the interpolated reference picture for the block most highly correlated with the current prediction unit (PU) in order to remove temporal redundancy. ME uses either a full search (FS) or a fast search algorithm. The full search technique guarantees optimal results but has the disadvantage of high computation and long operation time, because motion is predicted for every candidate block in a given search area. This paper therefore proposes a new algorithm that reduces the computation and operation time of inter prediction by reusing the SAD operations of the full search. The proposed algorithm was applied to the HEVC standard software HM16.12: operation time improved by 61% compared with the conventional full search algorithm, BD-Bitrate decreased by 11.81%, and BD-PSNR increased by about 0.5% (a minimal SAD-reuse sketch follows below).
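One common way to reuse SAD computations inside a full search, and a plausible reading of the reuse idea summarized above (the abstract does not detail the exact scheme), is to compute SADs for small sub-blocks once and sum them to obtain the SADs of larger HEVC PU sizes at the same search position. A hedged NumPy sketch with assumed block sizes:

```python
import numpy as np

def subblock_sads(cur, cand, sub=8):
    """SAD of every non-overlapping sub x sub block between the current block
    and one candidate block of the same size."""
    diff = np.abs(cur.astype(np.int32) - cand.astype(np.int32))
    h, w = diff.shape
    return diff.reshape(h // sub, sub, w // sub, sub).sum(axis=(1, 3))

# Hypothetical 32x32 current block and one 32x32 candidate from the reference.
cur = np.random.randint(0, 256, (32, 32), dtype=np.uint8)
cand = np.random.randint(0, 256, (32, 32), dtype=np.uint8)

sads8 = subblock_sads(cur, cand, sub=8)  # 4x4 grid of 8x8 SADs, computed once
sad16_tl = sads8[:2, :2].sum()           # 16x16 SAD of the top-left quadrant, reused
sad32 = sads8.sum()                      # 32x32 SAD, reused from the same grid
print(sads8, sad16_tl, sad32, sep="\n")
```

Summing the cached 8x8 SADs avoids recomputing absolute differences for every PU size at the same candidate position.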


Compression Method for MPEG CDVA Global Feature Descriptors (MPEG CDVA 전역 특징 서술자 압축 방법)

  • Kim, Joonsoo;Jo, Won;Lim, Guentaek;Yun, Joungil;Kwak, Sangwoon;Jung, Soon-heung;Cheong, Won-Sik;Choo, Hyon-Gon;Seo, Jeongil;Choi, Yukyung
    • Journal of Broadcast Engineering / v.27 no.3 / pp.295-307 / 2022
  • In this paper, we propose a novel compression method for scalable Fisher vectors (SCFV), which are used as the global visual feature description of individual video frames in the MPEG CDVA standard. The CDVA standard has adopted a temporal descriptor redundancy removal technique that takes advantage of the correlation between the global feature descriptors of adjacent keyframes. However, due to the variable-length property of SCFV, this temporal redundancy removal scheme often results in inferior compression efficiency, sometimes even worse than when the SCFVs are not compressed at all. To enhance compression efficiency, we propose an asymmetric SCFV difference computation method and an SCFV reconstruction method. Experiments on the FIVR dataset show that the proposed method significantly improves compression efficiency compared to the original CDVA Experimental Model implementation (a simplified delta-coding sketch of temporal redundancy removal follows below).
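In heavily simplified form, the temporal descriptor redundancy removal that the paper improves on can be pictured as delta-coding a keyframe's descriptor against the previous keyframe's descriptor, with a fallback to the raw descriptor when the delta is not actually cheaper. The sketch below uses fixed-length binary vectors and XOR, which deliberately ignores the variable-length structure of SCFV that motivates the paper; all names and the bit-count cost model are my own assumptions.

```python
import numpy as np

def encode_sequence(descriptors):
    """Delta-code each binary descriptor against the previous one (XOR),
    keeping the raw descriptor whenever the delta would not be cheaper.
    'Cost' here is simply the number of set bits, standing in for coded size."""
    coded, prev = [], None
    for d in descriptors:
        if prev is None:
            coded.append(("raw", d))
        else:
            delta = np.bitwise_xor(d, prev)
            if delta.sum() < d.sum():     # delta is cheaper to code
                coded.append(("delta", delta))
            else:                         # fall back to the raw descriptor
                coded.append(("raw", d))
        prev = d
    return coded

def decode_sequence(coded):
    out, prev = [], None
    for kind, payload in coded:
        d = payload if kind == "raw" else np.bitwise_xor(payload, prev)
        out.append(d)
        prev = d
    return out

# Hypothetical 64-bit binarized descriptors for three adjacent keyframes.
rng = np.random.default_rng(0)
frames = [rng.integers(0, 2, 64, dtype=np.uint8) for _ in range(3)]
decoded = decode_sequence(encode_sequence(frames))
assert all(np.array_equal(a, b) for a, b in zip(frames, decoded))
```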

Towards Group-based Adaptive Streaming for MPEG Immersive Video (MPEG Immersive Video를 위한 그룹 기반 적응적 스트리밍)

  • Jong-Beom Jeong;Soonbin Lee;Jaeyeol Choi;Gwangsoon Lee;Sangwoon Kwak;Won-Sik Cheong;Bongho Lee;Eun-Seok Ryu
    • Journal of Broadcast Engineering / v.28 no.2 / pp.194-212 / 2023
  • The MPEG immersive video (MIV) coding standard achieves high compression efficiency by removing inter-view redundancy and merging the residuals of immersive video, which consists of multiple texture (color) and geometry (depth) pairs. Grouping views that represent similar spaces enables quality improvement and selective streaming, but this has not been actively discussed recently. This paper introduces an implementation of group-based encoding in the recent version of the MIV reference software, provides experimental results on the optimal views and videos per group, and proposes a method for deciding the optimal number of videos for global immersive video representation based on the proportion of residual videos.

Comparison of Association Rule Learning and Subgroup Discovery for Mining Traffic Accident Data (교통사고 데이터의 마이닝을 위한 연관규칙 학습기법과 서브그룹 발견기법의 비교)

  • Kim, Jeongmin;Ryu, Kwang Ryel
    • Journal of Intelligence and Information Systems / v.21 no.4 / pp.1-16 / 2015
  • Traffic accidents have been one of the major causes of death worldwide for the last several decades. According to World Health Organization statistics, approximately 1.24 million deaths occurred on the world's roads in 2010. In order to reduce future traffic accidents, multipronged approaches have been adopted, including traffic regulations, injury-reducing technologies, driver training programs and so on. Records on traffic accidents are generated and maintained for this purpose. To make these records meaningful and effective, it is necessary to analyze the relationships between traffic accidents and related factors, including vehicle design, road design, weather, driver behavior and so on. Insights derived from such analysis can be used in accident prevention approaches. Traffic accident data mining is an activity that seeks useful, not yet well-known knowledge about such relationships that the user may be interested in. Many studies on mining accident data have been reported over the past two decades. Most of them focused on predicting accident risk using accident-related factors. Supervised learning methods such as decision trees, logistic regression, k-nearest neighbors and neural networks are used for this prediction. However, the prediction models derived from these algorithms are too complex for humans to understand, because the main purpose of these algorithms is prediction, not explanation of the data. Some studies use unsupervised clustering algorithms to divide the data into several groups, but the derived groups themselves are still not easy for humans to understand, so additional analytic work is necessary. Rule-based learning methods are adequate when we want to derive a comprehensible form of knowledge about the target domain. They derive a set of if-then rules that represent relationships between the target feature and other features. Rules are fairly easy for humans to understand, and can therefore provide insight and comprehensible results. Association rule learning and subgroup discovery are representative rule-based learning methods for descriptive tasks. These two approaches have been used in a wide range of areas, from transaction analysis and accident data analysis to the detection of statistically significant patient risk groups and the discovery of key persons in social communities. We use both the association rule learning method and the subgroup discovery method to discover useful patterns from a traffic accident dataset consisting of many features, including driver profile, accident location, accident type, vehicle information, regulation violations and so on. The association rule learning method, which is an unsupervised learning method, searches for frequent item sets in the data and translates them into rules. In contrast, the subgroup discovery method is a kind of supervised learning that discovers rules about a user-specified concept satisfying a certain degree of generality and unusualness. Depending on what aspect of the data we focus on, we may combine multiple relevant features of interest into a synthetic target feature and give it to the rule learning algorithms. After a set of rules is derived, postprocessing steps are taken to make the rule set more compact and easier to understand by removing uninteresting or redundant rules. We conducted a set of experiments mining our traffic accident data in both unsupervised and supervised mode to compare these rule-based learning algorithms. The experiments reveal that association rule learning, in its pure unsupervised mode, can discover some hidden relationships among the features. Under a supervised learning setting with a combinatorial target feature, however, the subgroup discovery method finds good rules much more easily than the association rule learning method, which requires a lot of effort to tune its parameters (a minimal frequent-itemset sketch follows below).
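To make the contrast concrete, the unsupervised side (frequent item sets turned into if-then rules with support and confidence) can be sketched on a toy, hypothetical encoding of accident records; this is an illustration only, not the study's dataset or tooling.

```python
from itertools import combinations

# Hypothetical accident records, each a set of categorical items.
records = [
    {"rain", "night", "speeding", "severe"},
    {"rain", "night", "severe"},
    {"clear", "day", "speeding", "minor"},
    {"rain", "day", "speeding", "severe"},
    {"clear", "night", "minor"},
]

def support(itemset):
    """Fraction of records containing every item in the itemset."""
    return sum(itemset <= r for r in records) / len(records)

def frequent_itemsets(min_support=0.4, max_size=3):
    """Brute-force enumeration of frequent itemsets up to max_size items."""
    items = sorted(set().union(*records))
    for size in range(1, max_size + 1):
        for combo in combinations(items, size):
            s = support(set(combo))
            if s >= min_support:
                yield set(combo), s

def rules(min_support=0.4, min_confidence=0.8):
    """Split each frequent itemset into antecedent -> consequent rules."""
    for itemset, s in frequent_itemsets(min_support):
        for k in range(1, len(itemset)):
            for antecedent in map(set, combinations(itemset, k)):
                confidence = s / support(antecedent)
                if confidence >= min_confidence:
                    yield antecedent, itemset - antecedent, s, confidence

for ante, cons, s, c in rules():
    print(f"{sorted(ante)} -> {sorted(cons)}  support={s:.2f} confidence={c:.2f}")
```

Subgroup discovery would instead fix the consequent to a user-chosen target concept (for example, severe accidents) and search for antecedents that are both sufficiently general and unusually strongly associated with it.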