• Title/Summary/Keyword: Filtering efficiency

Search results: 409

De-blocking Filter for Improvement of Coding Efficiency and Computational Complexity Reduction on High Definition Video Coding (고화질 비디오의 부호화 효율성 증대와 연산 복잡도 감소를 위한 디블록킹 필터)

  • Jung, Kwang-Su;Nam, Jung-Hak;Jo, Hyun-Ho;Sim, Dong-Gyu;Oh, Seoung-Jun;Jeong, Sey-Yoon;Choi, Jin-Soo
    • Journal of Broadcast Engineering / v.15 no.4 / pp.513-526 / 2010
  • In this paper, we propose a de-blocking filter that improves coding efficiency and reduces computational complexity in high-definition video coding. Research on H.264/AVC-based methods for coding high-definition video is actively under way, as the use of high-definition video continues to grow. The H.264/AVC de-blocking filter was designed for low-bitrate video coding; by minimizing blocking artifacts, it improves not only subjective quality but also coding efficiency. However, its strong filtering is not well suited to high-definition video, which exhibits relatively weak blocking artifacts, and the conventional filter also imposes high computational complexity on the decoder side. The proposed method reduces computational complexity by up to 8.8% compared with the conventional method, and improves coding efficiency by up to 7.3% over the H.264/AVC de-blocking filter.

The Research about the Improvement of Design Process for Improving Quality of Product - With Emphasis on Decision Making Efficiency based on AHP Technique - (제품의 품질확보를 위한 디자인 프로세스 개선에 관한 연구 - AHP기법을 통한 디자인 의사결정 효율화를 중심으로 -)

  • Lee, Jong-Suk;Shin, Soo-Gil
    • Archives of design research / v.18 no.3 s.61 / pp.15-24 / 2005
  • An ill-suited product design process causes small and medium enterprises to waste a great deal of time, money, and production effort, since such firms rarely have enough experienced design staff. In particular, no objective, scientific procedure is offered for the 'concept establishment' of the embodiment phase or the 'sketch and rendering' of the development phase, which are the most important design steps. This research therefore applies the AHP method, using its basic concept of relative comparison to reduce the risk arising from large amounts of judgment data and to supplement the decision-making phase. People generally reach conclusions more reliably through relative judgment than through absolute judgment, so pairwise comparison of design sketches should be preferred to absolute scoring of each sketch in isolation. Judging efficiency thus rests on sketch-to-sketch comparisons, which keep the evaluation of multiple alternatives consistent. This reduces risk when the final design is chosen and increases the efficiency of design decision making. Rather than claiming a perfect ranking of every alternative or a definitive design result, the research provides a consistent criterion for filtering and developing design alternatives. Its significance lies in a method that overcomes differences of individual character and sensibility across the many phases of the process. Finally, it proposes a new process which, when applied, improves quality, and demonstrates its propriety through comparison with the results of existing methods.
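As a rough illustration of the pairwise-comparison machinery the abstract's AHP approach relies on, the sketch below derives priority weights and a consistency ratio from a hypothetical 3x3 judgment matrix over three design sketches (all judgment values are invented for the example):

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three design sketches:
# A[i][j] = how strongly sketch i is preferred over sketch j (Saaty scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])
n = A.shape[0]

# Priority vector via the geometric-mean method: row geometric means,
# normalized to sum to 1.
gm = A.prod(axis=1) ** (1.0 / n)
weights = gm / gm.sum()

# Consistency ratio: estimate lambda_max from A @ w, compute
# CI = (lambda_max - n) / (n - 1), and divide by the random index RI.
lambda_max = ((A @ weights) / weights).mean()
ci = (lambda_max - n) / (n - 1)
cr = ci / 0.58  # RI for n = 3

print(weights.round(3), round(cr, 3))
```

Judgments with a consistency ratio below about 0.1 are conventionally accepted; this is the "consistency criterion on filtering" role AHP plays when comparing many alternative sketches.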


Reducing error rates in general nuclear medicine imaging to increase patient satisfaction (핵의학 일반영상 검사업무 오류개선 활동에 따른 환자 만족도)

  • Kim, Ho-Sung;Im, In-Chul;Park, Cheol-Woo;Lim, Jong-Duek;Kim, Sun-Geun;Lee, Jae-Seung
    • Journal of the Korean Society of Radiology / v.5 no.5 / pp.295-302 / 2011
  • In the field of nuclear medicine, from the moment a patient registers until the doctor's diagnosis, the person in charge of an examination may find errors in the work, re-examine, re-analyze the results, or save images to PACS. Through this process, reading results are delayed by the checks and additional tests that occur in the hospital, which lowers patient satisfaction and affects reliability. The purpose of this work is therefore to add visual inspection of results so as to minimize errors, improve efficiency, and increase patient satisfaction. General nuclear medicine imaging examinations performed at Asan Medical Center, Seoul, from March 2008 to December 2008 were analyzed for errors. The first stage, from January 2009 to December 2009, established procedures and know-how; the second stage, from January 2010 to June 2010, conducted pre- and post-filtering assessment; and the third stage, from July 2010 to October 2010, consisted of cross-checks, attaching stickers, and comparing error cases. Of 92 errors, 32 cases occurred in the first through third stages and 46 cases after the fourth stage, with overall errors reduced by 74.3% from 94.6%. In general nuclear medicine, where many kinds of examinations are performed according to patient needs, the analysis, image composition, and mismatched images in PACS all leave room for mistakes. To decrease error rates, images should be continuously cross-checked and diagnoses confirmed.

Development of the Knowledge-based Systems for Anti-money Laundering in the Korea Financial Intelligence Unit (자금세탁방지를 위한 지식기반시스템의 구축 : 금융정보분석원 사례)

  • Shin, Kyung-Shik;Kim, Hyun-Jung;Kim, Hyo-Sin
    • Journal of Intelligence and Information Systems / v.14 no.2 / pp.179-192 / 2008
  • This case study describes the construction of a knowledge-based system using a rule-based approach for detecting illegal transactions related to money laundering in the Korea Financial Intelligence Unit (KoFIU). To better manage the explosive increase in low-risk suspicious transaction reports from financial institutions, adoption of a knowledge-based system in the KoFIU is essential. Moreover, since different types of information from various organizations converge in the KoFIU, a knowledge-based system for practical use and for managing data related to money laundering is clearly required. The success of such a financial information system depends largely on how well the knowledge base fits its context. We therefore designed and constructed the knowledge base for anti-money laundering with domain experts from each financial industry working together with a knowledge engineer. The outcome of the implementation, measured by the empirical ratio of Suspicious Transaction Reports (STRs) forwarded to law enforcement, shows that the knowledge-based system filters STRs efficiently in the primary analysis step and has thus contributed greatly to the efficiency and effectiveness of the analysis process. Establishing the foundation of the knowledge base within the overall framework of the system, with knowledge creation and management in mind, has proven genuinely valuable.
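The abstract does not disclose the KoFIU's actual rules, but the rule-based primary screening it describes can be sketched as a list of named predicates applied to each incoming report. Everything below (rule names, thresholds, fields) is invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    amount: float          # in millions of KRW (illustrative unit)
    cross_border: bool
    cash: bool
    prior_reports: int     # earlier STRs filed on the same customer

# Illustrative screening rules standing in for the expert-built knowledge
# base described in the abstract; every threshold here is made up.
RULES = [
    ("large_cash",        lambda t: t.cash and t.amount >= 20),
    ("structuring",       lambda t: t.cash and 9 <= t.amount < 10),
    ("repeat_subject",    lambda t: t.prior_reports >= 3),
    ("cross_border_cash", lambda t: t.cross_border and t.cash and t.amount >= 5),
]

def screen(t):
    """Return the names of the rules an incoming report fires."""
    return [name for name, pred in RULES if pred(t)]

def is_low_risk(t):
    # Primary-analysis filter: a report that fires no rule is set aside,
    # so analysts concentrate on the remainder.
    return not screen(t)
```

The point of the design, as in the case study, is that the rules come from domain experts and can be extended per financial industry without changing the engine.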


Deblocking Filter for Low-complexity Video Decoder (저 복잡도 비디오 복호화기를 위한 디블록킹 필터)

  • Jo, Hyun-Ho;Nam, Jung-Hak;Jung, Kwang-Su;Sim, Dong-Gyu;Cho, Dae-Sung;Choi, Woong-Il
    • Journal of the Institute of Electronics Engineers of Korea SP / v.47 no.3 / pp.32-43 / 2010
  • This paper presents a deblocking filter for a low-complexity video decoder. The baseline profile of H.264/AVC, used in mobile devices such as mobile phones, achieves twice the compression performance of MPEG-4 Visual, but it suffers from serious complexity because it uses a 1/4-pel interpolation filter, an adaptive entropy model, and a deblocking filter. This paper presents a low-complexity deblocking filter that reduces decoder complexity while preserving the coding efficiency of H.264/AVC. The proposed filter executes 49% fewer branch instructions than the conventional approach by calculating the boundary strength (BS) from the coded block pattern (CBP). In addition, the range of the strong filter applied at intra-macroblock boundaries is limited to two pixels. Experimental results show that, compared with the H.264/AVC baseline profile, the proposed low-complexity deblocking filter changes the BD-bitrate by only -0.02%, while reducing the complexity of the deblocking filter by 42% and that of the whole decoder by 8.96%.
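The core idea of the abstract (deriving BS from the CBP bits the decoder already holds, instead of testing coefficients block by block) can be sketched as below. This is a simplified stand-in, not the paper's actual derivation: real H.264 BS computation also inspects prediction modes, reference frames, and motion vectors, all omitted here.

```python
# Hedged sketch: boundary-strength decision driven by the coded block
# pattern (CBP). In H.264, bits 0-3 of the 6-bit CBP flag whether each
# 8x8 luma block of a macroblock carries non-zero coefficients.

def luma_cbp_bit(cbp, blk8x8):
    """1 if the given 8x8 luma block has coded coefficients, else 0."""
    return (cbp >> blk8x8) & 1

def boundary_strength(intra_edge, cbp_p, blk_p, cbp_q, blk_q):
    if intra_edge:
        return 4        # strong filtering at intra macroblock edges
    if luma_cbp_bit(cbp_p, blk_p) or luma_cbp_bit(cbp_q, blk_q):
        return 2        # coefficients present on either side of the edge
    return 0            # no filtering (motion-vector check omitted here)
```

Because the CBP is a single bit test per side, the per-edge decision needs far fewer branches than walking each 4x4 block's coefficient flags, which is the flavor of the 49% branch-instruction saving the paper reports.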

Estimation of the Spillovers during the Global Financial Crisis (글로벌 금융위기 동안 전이효과에 대한 추정)

  • Lee, Kyung-Hee;Kim, Kyung-Soo
    • Management & Information Systems Review / v.39 no.2 / pp.17-37 / 2020
  • The purpose of this study is to investigate global spillover effects through linear and nonlinear causal relationships between the US, European, and BRIC financial markets in the period following the introduction of the euro, including the financial crisis and the subsequent EU debt crisis of 2007~2010. Although the global spillover effects of the financial crisis are well documented, the nature of the volatility effects and the transmission mechanisms among the US, European, and BRIC stock markets have not been systematically examined. A stepwise filtering methodology is introduced to investigate dynamic linear and nonlinear causality, combining a vector autoregressive model with a multivariate GARCH model. The sample covers the post-euro period, including the financial crisis and the eurozone financial and sovereign crises. The empirical results have many implications for the efficiency of the BRIC stock markets: they bear not only on the predictability of these markets but can also serve future research quantifying the process of financial integration. The interdependence among the United States, Europe, and the BRIC countries carries significant implications for financial market regulation, hedging, and trading strategies. The findings show that the BRIC markets have become internationally integrated since the sub-prime and financial crisis erupted in the United States, with spillover effects growing more specific and remarkable; there is no consistent evidence supporting a decoupling phenomenon. Some nonlinear causality persists even after filtering over the investigation period; although tail-distribution dependence and higher moments may be significant factors in the remaining interdependence, it can largely be explained by simple volatility spillover effects in the nonlinear causality.
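The first step of the stepwise filtering the abstract describes is to remove linear dependence with a VAR before testing what remains. A minimal sketch on synthetic data (series, lag order, and coefficients all invented for the example):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two hypothetical return series where x leads y by one period.
T = 500
x = rng.normal(size=T)
y = np.empty(T)
y[0] = rng.normal()
for t in range(1, T):
    y[t] = 0.6 * x[t - 1] + 0.5 * rng.normal()

# Step 1 of the filtering: fit a VAR(1) by least squares and keep the
# residuals; linear (Granger-type) causality is stripped out before any
# nonlinear test is applied to what is left.
Z = np.column_stack([np.ones(T - 1), x[:-1], y[:-1]])   # lagged regressors
Y = np.column_stack([x[1:], y[1:]])
B, *_ = np.linalg.lstsq(Z, Y, rcond=None)
resid = Y - Z @ B

# Lagged x explains y before filtering, but not after: OLS residuals are
# orthogonal to the included regressors by construction.
corr_before = np.corrcoef(x[:-1], y[1:])[0, 1]
corr_after = np.corrcoef(Z[:, 1], resid[:, 1])[0, 1]
```

In the paper's pipeline a multivariate GARCH filter is then applied to these residuals, so that any causality still detected afterward is attributable to volatility spillovers or higher moments rather than to linear dynamics.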

Model Test on the High Performance of the Midwater Pair Trawl Net (쌍끌이중층망의 전개성능 향상을 위한 모형실험)

  • 권병국
    • Journal of the Korean Society of Fisheries and Ocean Technology / v.31 no.4 / pp.340-349 / 1995
  • Several problems arise when fishing with the midwater pair trawl net used in Denmark: the net height decreases steeply as towing speed increases, the fishing gear is bulky relative to the size of the trawler, and floats become caught in the meshes. To prevent the steep loss of net height at higher towing speeds and the fouling of floats in the mesh, it is sometimes preferable to use a kite instead of floats. This paper describes the hydrodynamic drag and opening efficiency of the midwater pair trawl net and the midwater kite pair trawl net, obtained from model tests in a circulating water channel. The results can be summarized as follows. 1. The hydrodynamic drag of the midwater kite pair trawl net is about 0.7 times that of the midwater pair trawl net. 2. The net height, mouth area, and filtering volume of the kite net are smaller than those of the conventional net at towing speeds below 2.5 knots, almost the same at 2.7 knots, and larger above 3.0 knots; the net width of the kite net is the same as that of the conventional net. 3. The mouths of both nets take the shape of an oval that flattens steeply as towing speed increases. The filtering volume of the kite net exceeds that of the conventional net by 3% at 3.0 knots, 11% at 4.0 knots, and 16% at 5.0 knots.


User-Class based Service Acceptance Policy using Cluster Analysis (군집분석 (Cluster Analysis)을 활용한 사용자 등급 기반의 서비스 수락 정책)

  • Park Hea-Sook;Baik Doo-Kwon
    • The KIPS Transactions:PartD / v.12D no.3 s.99 / pp.461-470 / 2005
  • This paper suggests a new policy for increasing a company's profits by segmenting the clients who use its contents service and allocating media-server resources differentially by cluster, using the cluster analysis method of CRM that is mainly applied in marketing. Here CRM refers to the strategy of consolidating a company's profits by managing clients efficiently, providing them with a more effective, personalized service, and managing resources more effectively. To realize the new service policy, this paper analyzes profit contribution vis-à-vis the clients' service patterns (total number of visits to the homepage, service type, service usage period, total payment, average service period, and service charge per homepage visit) through cluster analysis of client data using the K-Means method. Clients were grouped into four clusters according to their contribution to profits, and a Client Request Filtering Algorithm (CRFA) was proposed to allocate media-server resources per cluster: CRFA approves a request only within the resource limit of the cluster to which the client belongs. To evaluate the efficiency of CRFA in a client/server environment, the acceptance rate per class was measured and network traffic was compared before and after applying CRFA. The experiments showed that applying CRFA decreased network expenses, raised the acceptance rate of clients within their clusters, and significantly increased the company's profits.
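The two moving parts of the abstract (K-Means segmentation of clients, then per-cluster admission) can be sketched together. The features, cluster limits, and data below are all invented; the paper's six usage features are collapsed to two for readability:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical client features [visits, total payment] for 40 clients,
# drawn around four contribution levels.
centers = np.array([[5, 10], [20, 50], [50, 200], [90, 500]], dtype=float)
X = np.vstack([c + rng.normal(0, 2, size=(10, 2)) for c in centers])

# Plain K-Means (k = 4): alternate nearest-centroid assignment and
# centroid update. Seeding with one point from each group keeps every
# cluster non-empty in this toy setup.
k = 4
cent = X[[0, 10, 20, 30]].copy()
for _ in range(20):
    labels = np.argmin(((X[:, None] - cent) ** 2).sum(-1), axis=1)
    cent = np.array([X[labels == j].mean(axis=0) for j in range(k)])

# CRFA-style admission: approve a request only while the client's cluster
# still has media-server capacity (the slot limits here are made up).
limits = {j: 10 + 5 * j for j in range(k)}
in_use = {j: 0 for j in range(k)}

def admit(client_idx):
    c = int(labels[client_idx])
    if in_use[c] < limits[c]:
        in_use[c] += 1
        return True
    return False
```

The design choice mirrors the paper: segmentation happens offline on usage history, so the per-request admission check is a constant-time lookup against the client's cluster budget.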

Simulation Study on E-commerce Recommender System by Use of LSI Method (LSI 기법을 이용한 전자상거래 추천자 시스템의 시뮬레이션 분석)

  • Kwon, Chi-Myung
    • Journal of the Korea Society for Simulation / v.15 no.3 / pp.23-30 / 2006
  • A recommender system for an e-commerce site receives information from customers about the products they are interested in and recommends products likely to fit their needs. In this paper, we investigate several methods of analyzing large-scale product purchase data for the purpose of producing useful recommendations: the traditional data-mining techniques of cluster analysis and collaborative filtering (CF), and CF with product-dimensionality reduction via latent semantic indexing (LSI). If the reduced product space obtained from LSI exhibits the same latent purchasing trends as the original customer-product purchase data, then the smaller computational effort needed to find a target customer's nearest neighbors should improve the efficiency of recommendation. In simulation experiments on synthetic customer-product purchase data, the CF-based method with reduced product dimensionality outperforms the traditional CF methods with respect to recall, precision, and the F1 measure. In general, recommendation quality increases with the size of the neighborhood, but our results show that beyond a certain point the gain diminishes. We also find that as the number of recommended products increases, precision worsens, while the gain in recall becomes relatively small after a certain point. These observations may be useful when applying recommender systems.
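A minimal sketch of the LSI step the abstract builds on: project a customer-product purchase matrix onto its top-k singular directions, then find nearest neighbors in the reduced space. The toy matrix and k are invented for the example:

```python
import numpy as np

# Toy customer-product purchase matrix (1 = bought); rows are customers.
R = np.array([
    [1, 1, 0, 0, 0],
    [1, 1, 1, 0, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 1, 1, 1],
    [1, 0, 0, 0, 1],
], dtype=float)

# LSI: truncated SVD keeps the top-k latent dimensions of the product
# space, so each customer becomes a k-vector instead of a row over all
# products.
U, s, Vt = np.linalg.svd(R, full_matrices=False)
k = 2
R_k = U[:, :k] * s[:k]          # customers in the k-dim latent space

def nearest_neighbor(i):
    """Most similar customer to i by cosine similarity in latent space."""
    v = R_k[i]
    sims = R_k @ v / (np.linalg.norm(R_k, axis=1) * np.linalg.norm(v) + 1e-12)
    sims[i] = -np.inf
    return int(np.argmax(sims))
```

With k latent dimensions, each neighbor search costs O(k) per candidate instead of O(#products), which is the efficiency gain the simulation study measures against plain CF.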


A Middleware System for Efficient Acquisition and Management of Heterogeneous Geosensor Networks Data (이질적인 지오센서 네트워크 데이터의 효율적인 수집 및 관리를 위한 미들웨어 시스템)

  • Kim, Min-Soo;Lee, Chung-Ho
    • Spatial Information Research / v.20 no.1 / pp.91-103 / 2012
  • Recently, there has been much interest in middleware that can smoothly acquire and analyze Geosensor information, which comprises sensor readings, location, and the surrounding spatial information. In connection with such middleware, researchers have proposed various algorithms for energy-efficient information filtering in Geosensor networks, as well as Geosensor Web technologies that can efficiently mash up sensor readings with spatial information on the web. These filtering algorithms and Geosensor Web technologies contribute to energy efficiency and open APIs, but they do not support the easy and rapid development of u-GIS applications that need to combine various Geosensor networks. We therefore propose a new Geosensor network middleware that can dramatically reduce the time and cost required to develop u-GIS applications integrating heterogeneous Geosensor networks. The proposed middleware can acquire heterogeneous Geosensor information using the standard SWE and an extended SQL, can execute various attribute and spatial operators optimally, and can integrate diverse Geosensor networks easily. Finally, we demonstrate the middleware's distinguishing features with a prototype that monitors environmental information in real time using spatial information together with sensor readings of temperature, humidity, illumination, imagery, and location.