• Title/Summary/Keyword: probabilistic-based algorithm


A Probabilistic Combination Method of Minimum Statistics and Soft Decision for Robust Noise Power Estimation in Speech Enhancement (강인한 음성향상을 위한 Minimum Statistics와 Soft Decision의 확률적 결합의 새로운 잡음전력 추정기법)

  • Park, Yun-Sik;Chang, Joon-Hyuk
    • The Journal of the Acoustical Society of Korea
    • /
    • v.26 no.4
    • /
    • pp.153-158
    • /
    • 2007
  • This paper presents a new approach to noise estimation to improve speech enhancement in non-stationary noisy environments. The proposed method combines the two separate noise power estimates provided by minimum statistics (MS) for speech presence and soft decision (SD) for speech absence, in accordance with the speech absence probability (SAP) in each frequency bin. The performance of the proposed algorithm is evaluated by subjective tests under various noise environments and yields better results than the conventional MS- and SD-based schemes.
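The SAP-weighted combination described above can be sketched as follows. The linear per-bin weighting and the function name are illustrative assumptions; the abstract does not give the exact combination rule.

```python
import numpy as np

def combine_noise_estimates(ms_est, sd_est, sap):
    """Combine minimum-statistics (MS) and soft-decision (SD) noise power
    estimates per frequency bin, weighted by the speech absence
    probability (SAP).  When speech is likely absent (SAP -> 1) the SD
    estimate dominates; when speech is likely present (SAP -> 0) the MS
    estimate is trusted instead.  A linear blend is assumed here."""
    ms_est = np.asarray(ms_est, dtype=float)
    sd_est = np.asarray(sd_est, dtype=float)
    sap = np.asarray(sap, dtype=float)
    return sap * sd_est + (1.0 - sap) * ms_est
```

Each frequency bin gets its own SAP value, so bins dominated by speech and bins dominated by noise are handled independently.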

Low Cost and Acceptable Delay Unicast Routing Algorithm Based on Interval Estimation (구간 추정 기반의 지연시간을 고려한 저비용 유니캐스트 라우팅 방식)

  • Kim, Moon-Seong;Bang, Young-Cheol;Choo, Hyun-Seung
    • The KIPS Transactions:PartC
    • /
    • v.11C no.2
    • /
    • pp.263-268
    • /
    • 2004
  • The end-to-end characteristic is an important factor for QoS support. As network users and the bandwidth required by applications increase, efficient network usage has been intensively investigated for better utilization of network resources. Distributed adaptive routing is the typical routing algorithm used in the current Internet. The DCLC (Delay-Constrained Least Cost) path problem has been shown to be NP-hard. In the DCLC problem, the cost of the least-delay (LD) path is relatively higher than that of the least-cost (LC) path, while the delay of the LC path is relatively higher than that of the LD path. In this paper, we investigate the performance of a heuristic algorithm for the DCLC problem with a new factor that is a probabilistic combination of cost and delay, taking both into account at the same time. Salama proposed a polynomial-time algorithm called DCUR, which always computes a path whose cost is within 10% of the optimal CBF. Our evaluation shows that the proposed heuristic is more than 38% better than DCUR in terms of cost when the number of nodes exceeds 200.
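The idea of routing on a single factor that mixes cost and delay can be sketched with a Dijkstra search over a combined edge weight. The linear blend `alpha*cost + (1-alpha)*delay` is an illustrative stand-in; the paper's probabilistic combination is not specified in the abstract.

```python
import heapq

def shortest_path(graph, src, dst, alpha=0.5):
    """Dijkstra search using one combined edge weight per link:
    alpha*cost + (1-alpha)*delay.  graph maps each node to a list of
    (neighbor, cost, delay) tuples."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, cost, delay in graph.get(u, []):
            nd = d + alpha * cost + (1.0 - alpha) * delay
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    # walk predecessors back from the destination
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    path.append(src)
    return path[::-1], dist[dst]
```

Sweeping `alpha` from 0 to 1 trades pure least-delay routing against pure least-cost routing, which is the tension the DCLC problem formalizes.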

Formation Estimation of Shaly Sandstone Reservoir using Joint Inversion from Well Logging Data (복합역산을 이용한 물리검층자료로부터의 셰일성 사암 저류층의 지층 평가)

  • Choi, Yeonjin;Chung, Woo-Keen;Ha, Jiho;Shin, Sung-ryul
    • Geophysics and Geophysical Exploration
    • /
    • v.22 no.1
    • /
    • pp.1-11
    • /
    • 2019
  • Well logging technologies measure the physical properties of reservoirs through boreholes. They have been used to understand reservoir characteristics, such as porosity and fluid saturation, via equations based on rock physics models. Well-log analysis is performed by selecting a reliable rock physics model adequate for the reservoir conditions or characteristics, comparing the results with Archie's equation or the Simandoux method, and determining the most feasible reservoir properties. In this study, we developed a joint inversion algorithm that estimates the physical properties of shaly sandstone reservoirs, extending a pre-existing algorithm for sandstone reservoirs. For this purpose, we proposed a rock physics model incorporating shale volume, constructed the Jacobian matrix, and performed a sensitivity analysis to understand the relationship between well-logging data and rock properties. The joint inversion algorithm was implemented as a least-squares method with a probabilistic approach. The developed algorithm was applied to well-logging data obtained from the Colony gas sandstone reservoir, and the results were compared with the Simandoux method and the joint inversion algorithm for sandstone reservoirs.
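The least-squares machinery behind such an inversion can be sketched generically. This is a damped (Levenberg-Marquardt style) iteration, not the paper's algorithm: the real method uses rock-physics forward models relating porosity, saturation, and shale volume to log responses, which are not given in the abstract.

```python
import numpy as np

def damped_least_squares(forward, jacobian, d_obs, m0, lam=0.1, iters=20):
    """Iteratively fit model parameters m to observed data d_obs:
        m <- m + (J^T J + lam*I)^-1 J^T (d_obs - g(m))
    where g is the forward model and J its Jacobian (sensitivity
    matrix).  The damping term lam*I stabilizes the normal equations."""
    m = np.asarray(m0, dtype=float)
    for _ in range(iters):
        r = d_obs - forward(m)            # data residual
        J = jacobian(m)                   # sensitivity of data to model
        step = np.linalg.solve(J.T @ J + lam * np.eye(m.size), J.T @ r)
        m = m + step
    return m
```

Inspecting `J` directly is what the sensitivity analysis in the paper amounts to: large entries mark log measurements that constrain a given rock property strongly.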

Development of Web-based Off-site Consequence Analysis Program and its Application for ILRT Extension (격납건물종합누설률시험 주기연장을 위한 웹기반 소외결말분석 프로그램 개발 및 적용)

  • Na, Jang-Hwan;Hwang, Seok-Won;Oh, Ji-Yong
    • Journal of the Korean Society of Safety
    • /
    • v.27 no.5
    • /
    • pp.219-223
    • /
    • 2012
  • For off-site consequence analysis at a nuclear power plant, the MELCOR Accident Consequence Code System (MACCS) II code is widely used as a software tool. In this study, a web-based off-site consequence analysis program (OSCAP) using the MACCS II code was developed for Integrated Leak Rate Test (ILRT) interval extension and Level 3 probabilistic safety assessment (PSA), and verification and validation (V&V) of the program was performed. The main input data for the MACCS II code are meteorological, population distribution, and source term information. However, generating these input data for an off-site consequence analysis with the MACCS II code takes considerable time and effort. For example, the meteorological data are collected from each nuclear power site in real time, but the formats of the raw data differ from site to site. To reduce the effort and time needed for risk assessments, the web-based OSCAP has an automatic processing module that converts the raw data collected from each site into the input data format of the MACCS II code. The program also automatically converts the latest population data from Statistics Korea, the national statistical office, into the population distribution input format of the MACCS II code. For the source term data, the program includes the release fraction of each source term category resulting from modular accident analysis program (MAAP) code analysis and the core inventory data from ORIGEN. These analysis results for each plant in Korea are stored in a database module of the web-based OSCAP, so the user can select the default source term data of each plant without handling source term input data directly.

Abnormal Behavior Recognition Based on Spatio-temporal Context

  • Yang, Yuanfeng;Li, Lin;Liu, Zhaobin;Liu, Gang
    • Journal of Information Processing Systems
    • /
    • v.16 no.3
    • /
    • pp.612-628
    • /
    • 2020
  • This paper presents a new approach for detecting abnormal behaviors in complex surveillance scenes, where anomalies are subtle and difficult to distinguish due to the intricate correlations among multiple objects' behaviors. Specifically, a cascaded probabilistic topic model is put forward that learns the spatial context of local behavior and the temporal context of global behavior in two stages. In the first stage of topic modeling, unlike existing approaches that use either optical flows or complete trajectories, spatio-temporal correlations between trajectory fragments in video clips are modeled by a latent Dirichlet allocation (LDA) topic model based on Markov random fields, yielding the spatial context of local behavior in each video clip. The local behavior topic categories are then obtained with a spectral clustering algorithm. Based on a dictionary constructed through local behavior topic clustering, the second-stage LDA topic model learns the correlations of global behaviors and the temporal context. An abnormal behavior recognition method was then developed on top of the learned spatio-temporal context of behaviors. It adopts a top-down strategy with two stages: anomaly recognition of a video clip, followed by recognition of the anomalous behavior within that clip. Evaluation verified the validity of the learned spatio-temporal context for local behavior topics and the effectiveness of abnormal behavior recognition, showing that the proposed approach improves recognition significantly in complex surveillance scenes.

Non-Simultaneous Sampling Deactivation during the Parameter Approximation of a Topic Model

  • Jeong, Young-Seob;Jin, Sou-Young;Choi, Ho-Jin
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.7 no.1
    • /
    • pp.81-98
    • /
    • 2013
  • Since Probabilistic Latent Semantic Analysis (PLSA) and Latent Dirichlet Allocation (LDA) were introduced, many revised or extended topic models have appeared. Due to the intractable likelihood of these models, training any topic model requires an approximation algorithm such as variational approximation, Laplace approximation, or Markov chain Monte Carlo (MCMC). Although these approximation algorithms perform well, training a topic model is still computationally expensive given the large amount of data involved. In this paper, we propose a new method, called non-simultaneous sampling deactivation, for efficient approximation of the parameters of a topic model. While traditional approximation algorithms sample every random variable for a single predefined burn-in period, our method is based on the observation that the random variable nodes of a topic model converge at different rates. During the iterative approximation process, the proposed method therefore terminates, or deactivates, each random variable node as soon as it has converged. Compared to traditional approximation schemes, in which every node is deactivated concurrently, the proposed method improves inference efficiency in both time and memory. We do not propose a new approximation algorithm, but a new process applicable to existing approximation algorithms. Through experiments, we show the time and memory efficiency of the method and discuss the tradeoff between the efficiency of the approximation process and parameter consistency.
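The per-node deactivation idea can be illustrated with a toy iterative updater. The convergence test used here (absolute change below a tolerance for several consecutive iterations) is an assumed criterion, since the abstract does not specify one, and `resample` stands in for one step of any approximation algorithm.

```python
def sample_with_deactivation(nodes, resample, tol=1e-3, patience=3, max_iter=1000):
    """Keep iteratively updating each node until its value stops
    changing, then freeze (deactivate) it while the rest continue.
    nodes: dict of initial values; resample(name, state) returns a
    new value for one node given the current joint state."""
    state = dict(nodes)
    stable = {k: 0 for k in state}
    active = set(state)
    for it in range(max_iter):
        for k in list(active):
            new = resample(k, state)
            if abs(new - state[k]) < tol:
                stable[k] += 1          # another quiet iteration
            else:
                stable[k] = 0           # still moving; reset counter
            state[k] = new
            if stable[k] >= patience:
                active.discard(k)       # deactivate converged node early
        if not active:
            break
    return state, it + 1
```

Fast-converging nodes drop out of the inner loop early, which is exactly where the time and memory savings over deactivating all nodes simultaneously come from.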

A Study of Energy Efficient Clustering in Wireless Sensor Networks (무선 센서네트워크의 에너지 효율적 집단화에 관한 연구)

  • Lee Sang Hak;Chung Tae Choong
    • The KIPS Transactions:PartC
    • /
    • v.11C no.7 s.96
    • /
    • pp.923-930
    • /
    • 2004
  • Wireless sensor networks are a core technology of ubiquitous computing: they enable the network to be aware of various kinds of context by integrating the existing wired/wireless infrastructure with various sensor devices and connecting the collected environmental data with applications. However, an energy-efficient approach is needed at the network layer to maintain the dynamic ad hoc network and to maximize network lifetime with energy-constrained nodes. Cluster-based data aggregation and routing are energy-efficient solutions, judging from the architecture of sensor networks and the characteristics of their data. In this paper, we propose a new distributed clustering algorithm that uses the distance from the sink. The algorithm balances energy dissipation among nodes while minimizing overhead. We verify via simulation that our clustering is more energy-efficient, and thus prolongs the network lifetime, compared with existing probabilistic clustering for sensor networks.
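A distance-aware variant of randomized cluster-head election can be sketched as below. The inverse-distance scaling of the election probability is purely an illustrative assumption; the paper's actual weighting of distance to the sink is not given in the abstract.

```python
import random

def elect_cluster_heads(nodes, p=0.05, seed=None):
    """LEACH-style randomized cluster-head election with the election
    threshold scaled by each node's distance to the sink, so that
    energy dissipation is balanced across the field.
    nodes: list of (name, distance_to_sink) pairs."""
    rng = random.Random(seed)
    d_max = max(d for _, d in nodes)
    heads = []
    for name, dist_to_sink in nodes:
        # Farther nodes pay more energy to reach the sink, so lower
        # their chance of volunteering as head (assumed heuristic).
        threshold = p * (1.0 - 0.5 * dist_to_sink / d_max)
        if rng.random() < threshold:
            heads.append(name)
    return heads
```

Because each node decides from local information only (its own distance and a random draw), the election stays fully distributed, matching the paper's setting.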

Fire-Smoke Detection Based on Video using Dynamic Bayesian Networks (동적 베이지안 네트워크를 이용한 동영상 기반의 화재연기감지)

  • Lee, In-Gyu;Ko, Byung-Chul;Nam, Jae-Yeol
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.34 no.4C
    • /
    • pp.388-396
    • /
    • 2009
  • This paper proposes a new fire-smoke detection method using features extracted from camera images and pattern recognition techniques. First, moving regions are detected by analyzing the frame difference between two consecutive images, and candidate smoke regions are generated by applying a smoke color model. A smoke region generally has characteristics such as similar color, simple texture, and upward motion. From these characteristics, we extract brightness, wavelet high frequency, and motion vector as features, and generate probability density functions of the three features from training data. The probabilistic models of the smoke region are then applied to the observation nodes of our proposed Dynamic Bayesian Network (DBN) to account for temporal continuity. The proposed algorithm was successfully applied to various fire-smoke tasks, covering not only forest smoke but also real-world smoke, and showed better detection performance than the previous method.
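The role of temporal continuity in the DBN can be illustrated with a two-state forward (filtering) recursion, a deliberate simplification of the paper's network: each frame contributes a likelihood ratio from the feature models, and a persistence probability links consecutive frames.

```python
def smoke_posterior(likelihood_ratios, prior=0.5, p_stay=0.9):
    """Recursive Bayes filter over frames for a binary smoke/no-smoke
    state.  likelihood_ratios[t] = P(features_t | smoke) /
    P(features_t | no smoke); p_stay is the probability the state
    persists between frames (both values are illustrative)."""
    belief = prior
    for lr in likelihood_ratios:
        # prediction: state persists with p_stay, flips otherwise
        pred = p_stay * belief + (1.0 - p_stay) * (1.0 - belief)
        # update: reweight the prediction by this frame's evidence
        num = lr * pred
        belief = num / (num + (1.0 - pred))
    return belief
```

Consistent evidence across frames drives the belief toward 1, while a single flickering frame barely moves it, which is why modeling time continuity suppresses false alarms.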

The Classification Using Probabilistic Neural Network and Redundancy Reduction on Very Large Scaled Chemical Gas Sensor Array (대규모 가스 센서 어레이에서 중복도의 제거와 확률신경회로망을 이용한 분류)

  • Kim, Jeong-Do;Lim, Seung-Ju;Park, Sung-Dae;Byun, Hyung-Gi;Persaud, K.C.;Kim, Jung-Ju
    • Journal of Sensor Science and Technology
    • /
    • v.22 no.2
    • /
    • pp.162-173
    • /
    • 2013
  • The purpose of this paper is to classify VOC gases by emulating characteristics found in biological olfaction. For this purpose, we propose a new signal processing method based on a polymeric chemical sensor array consisting of 4096 sensors, created by the NEUROCHEM project. To remove the unstable sensors arising in the manufacturing process of such a very large-scale chemical sensor array, we used the discrete wavelet transform and cosine similarity. To remove the remaining redundancy, we propose a method that selects candidate representative sensors, each representing a group of sensors with similar features, using the Fuzzy c-means algorithm. In addition, we propose an improved algorithm for choosing the final representative sensors among these candidates to further enhance classification ability. Even after redundancy removal, however, classification on a very large-scale sensor array still takes a great deal of learning time because many sensors are used. Through experimental classification trials, we confirmed that the proposed method has outstanding classification ability in the transient state as well as the steady state.
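Redundancy removal by cosine similarity can be sketched with a greedy filter over sensor response vectors. This is a simplified stand-in for the paper's Fuzzy c-means representative-sensor selection: it only shows how near-collinear responses get collapsed to one representative.

```python
import numpy as np

def remove_redundant(responses, threshold=0.98):
    """Greedily keep a sensor only if its (normalized) response vector
    is not nearly collinear with an already-kept sensor, i.e. its
    cosine similarity to every kept sensor is below the threshold.
    Returns the indices of the kept (representative) sensors."""
    kept = []
    for i, r in enumerate(responses):
        v = np.asarray(r, dtype=float)
        v = v / np.linalg.norm(v)
        if all(np.dot(v, k) < threshold for _, k in kept):
            kept.append((i, v))
    return [i for i, _ in kept]
```

On a 4096-sensor array, collapsing groups of similar sensors this way is what shrinks the feature space before the probabilistic neural network is trained.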

Robust Head Tracking using a Hybrid of Omega Shape Tracker and Face Detector for Robot Photographer (로봇 사진사를 위한 오메가 형상 추적기와 얼굴 검출기 융합을 이용한 강인한 머리 추적)

  • Kim, Ji-Sung;Joung, Ji-Hoon;Ho, An-Kwang;Ryu, Yeon-Geol;Lee, Won-Hyung;Jin, Chung-Myung
    • The Journal of Korea Robotics Society
    • /
    • v.5 no.2
    • /
    • pp.152-159
    • /
    • 2010
  • Finding a person's head in a scene is very important for a robot photographer, because a well-composed picture depends on the position of the head. In this paper, we therefore propose a robust head tracking algorithm that hybridizes an omega shape tracker and a local binary pattern (LBP) AdaBoost face detector, so that the robot photographer can take a fine picture automatically. Face detection algorithms perform well on frontal faces, but not on rotated faces, and they also struggle when the face is occluded by a hat or hands. To solve this problem, an omega shape tracker based on the active shape model (ASM) is presented. The omega shape tracker is robust to occlusion and illumination change. However, when the environment is dynamic, such as when people move fast or the background is complex, its performance is unsatisfactory. Therefore, this paper proposes a method that combines the face detection algorithm and the omega shape tracker probabilistically, using the histogram of oriented gradients (HOG) descriptor, in order to find the human head robustly. A robot photographer was also implemented that abides by the 'rule of thirds' and takes photos when people smile.
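The probabilistic fusion of the two head hypotheses can be sketched as a confidence-weighted blend. The scores here stand in for the HOG-descriptor histogram probabilities the abstract mentions; the blending rule itself is an illustrative assumption.

```python
def fuse_hypotheses(det, trk):
    """Blend the face detector's and omega-shape tracker's head
    position hypotheses by normalizing their confidence scores into
    weights.  Each hypothesis is ((x, y), score); whichever source is
    more confident pulls the fused estimate toward its position."""
    (dx, dy), ds = det
    (tx, ty), ts = trk
    w = ds / (ds + ts)                  # detector's share of the weight
    return (w * dx + (1 - w) * tx, w * dy + (1 - w) * ty)
```

When the detector loses a rotated or occluded face its score collapses, and the fused estimate falls back smoothly to the tracker, which is the behavior the hybrid is designed for.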