• Title/Summary/Keyword: Probabilistic environment

A Study of New Data Association Method for Active Sonar Tracking and Track Initiation (능동형 소나의 표적추적 및 트랙초기화를 위한 새로운 자료결합 기법 연구)

  • Lim, Young-Taek;Lee, Yong-Oak;Song, Taek-Lyul
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.13 no.5
    • /
    • pp.739-747
    • /
    • 2010
  • In this paper, we propose a new data association method, called the Highest Probability Data Association (HPDA), that uses a signal-amplitude-information ordering method and is applied to active sonar tracking and track initiation in a cluttered environment. The performance of HPDA is tested in a series of Monte Carlo simulation runs and is compared with that of the existing Probabilistic Data Association with Amplitude Information (PDA-AI) for active sonar tracking in clutter. The proposed HPDA algorithm is also applied to automatic track initiation in clutter, and its performance is compared with that of the existing IPDA-AI algorithm.
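
The core HPDA idea in the abstract is to order gated measurements by amplitude-weighted association probability and commit to the single best candidate. A minimal sketch of that selection step, assuming a Gaussian kinematic likelihood and Rayleigh amplitude statistics at a fixed SNR (the function name and parameters are illustrative, not the paper's exact formulation):

```python
import numpy as np

def hpda_select(z_pred, S, measurements, amplitudes, snr=10.0, gate=9.21):
    """Pick the gated measurement with the highest amplitude-weighted
    association score (illustrative HPDA-style selection)."""
    S_inv = np.linalg.inv(S)
    norm = 1.0 / np.sqrt((2 * np.pi) ** len(z_pred) * np.linalg.det(S))
    best, best_score = None, 0.0
    for z, a in zip(measurements, amplitudes):
        v = z - z_pred                         # innovation
        d2 = float(v @ S_inv @ v)              # squared Mahalanobis distance
        if d2 > gate:                          # outside the validation gate
            continue
        g = norm * np.exp(-0.5 * d2)           # Gaussian kinematic likelihood
        # Rayleigh amplitude likelihood ratio: target (SNR snr) vs. clutter.
        rho = np.exp(a**2 * snr / (2.0 * (1.0 + snr))) / (1.0 + snr)
        if g * rho > best_score:
            best, best_score = z, g * rho
    return best, best_score
```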

Reliability Assessment in Haenam-Cheju HVDC System Using Well-being Method (Well-being기법을 이용한 해남-제주간 HVDC System신뢰도평가)

  • Son, Hyun-Il;Lee, Hyo-Sang;Shin, Dong-Joon;Kim, Jin-O
    • Proceedings of the KIEE Conference
    • /
    • 2004.11b
    • /
    • pp.227-229
    • /
    • 2004
  • In the new competitive market environment, it is very important to determine how much power can be transferred through the network; this quantity is known as the Available Transfer Capability (ATC). This paper presents a technique to evaluate the reliability and the ATC of the Haenam-Cheju HVDC transmission system using the well-being method, which is based on probabilistic analysis. System well-being is categorized in terms of Healthy and Marginal states in addition to the conventional Risk index. The Haenam-Cheju HVDC system is studied for the optimal ATC based on well-being categories.
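
For readers unfamiliar with the well-being framework, the idea is to split a Monte Carlo reliability assessment into three state classes rather than a single risk index. A minimal sketch with made-up unit data, not the Haenam-Cheju system parameters:

```python
import random

def wellbeing_probabilities(n_samples=100_000, for_rate=0.05, n_units=4,
                            unit_mw=75.0, peak_load=220.0, reserve_req=75.0):
    """Monte Carlo well-being sketch: classify sampled system states as
    Healthy (reserve criterion met), Marginal (load met, criterion not),
    or At Risk (load shed). All parameter values are illustrative."""
    healthy = marginal = risk = 0
    for _ in range(n_samples):
        up = sum(random.random() > for_rate for _ in range(n_units))
        capacity = up * unit_mw
        if capacity < peak_load:
            risk += 1
        elif capacity - peak_load >= reserve_req:
            healthy += 1
        else:
            marginal += 1
    return {k: v / n_samples for k, v in
            zip(("healthy", "marginal", "risk"), (healthy, marginal, risk))}
```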


A Study on Assessment of Forced Outage Rates for Reliability Evaluation of Power System (전력계통의 확률론적 신뢰도평가를 위한 사고확률 추정에 관한 연구)

  • Lee Sang Sik;Tran Trung Tinh;Choi Jae Seok;Jeon D.;Kim T.;Cha S.;Choo J.
    • Proceedings of the KIEE Conference
    • /
    • summer
    • /
    • pp.195-198
    • /
    • 2004
  • Recently, the importance and necessity of studies on the reliability evaluation of power grids have been highlighted by blackout accidents that have occurred around the world. Quantitative evaluation of transmission system reliability is very important in a competitive electricity environment, and accurate probabilistic reliability evaluation depends on the assessment of the forced outage rates (FOR) of system elements such as generators and transmission lines. This paper describes the basic theory relating outage rates to reliability evaluation for assessing the FOR of power system elements. In the case study, FORs assessed and supplied by the Canadian Electricity Association are introduced, together with FORs assessed using actual historical data from 1997 to 2002 for transmission lines of the KEPCO system.
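
The forced outage rate itself has a standard definition: the fraction of time a unit spends on forced outage relative to its total in-service plus forced-outage time. A one-line worked example (the hours are hypothetical, not KEPCO records):

```python
def forced_outage_rate(forced_outage_hours, in_service_hours):
    """Classical FOR: share of time a unit is unavailable due to forced outages."""
    return forced_outage_hours / (forced_outage_hours + in_service_hours)

# Hypothetical line: 52 forced-outage hours over six years of operating records.
print(forced_outage_rate(52.0, 6 * 8760 - 52.0))  # ~0.00099
```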


Probabilistic Object Recognition in a Sequence of 3D Images (연속된 3차원 영상에서의 통계적 물체인식)

  • Jang Dae-Sik;Rhee Yang-Won;Sheng Guo-Rui
    • KSCI Review
    • /
    • v.14 no.1
    • /
    • pp.241-248
    • /
    • 2006
  • The recognition of relatively big and rarely moved objects, such as refrigerators and air conditioners, is necessary because these objects can serve as crucial, globally stable features for Simultaneous Localization and Map building (SLAM) in indoor environments. In this paper, we propose a novel method to recognize such big objects using a sequence of 3D scenes. Particles representing the object to be recognized are scattered over the environment, and the probability of each particle is then calculated by a matching test against the 3D lines of the environment. Based on the probability and the degree of convergence of the particles, we can recognize the object in the environment, and the pose of the object is also estimated. The experimental results show the feasibility of incremental object recognition based on particle filtering and its application to SLAM.
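
The recognition loop the abstract describes is essentially a particle filter over candidate object poses. A minimal reweight-and-resample sketch, assuming a user-supplied `match_prob(pose, scene_lines)` matching test; the resampling threshold and convergence measure are illustrative choices:

```python
import numpy as np

def update_particles(particles, weights, scene_lines, match_prob,
                     resample_thresh=0.5):
    """One recognition step (illustrative): reweight object-pose particles
    by how well the object model matches observed 3D lines, then resample
    when the effective sample size degenerates."""
    weights = weights * np.array([match_prob(p, scene_lines) for p in particles])
    weights = weights / weights.sum()
    n_eff = 1.0 / np.sum(weights**2)           # effective sample size
    if n_eff < resample_thresh * len(particles):
        idx = np.random.choice(len(particles), len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    # Convergence of particles around one pose signals a recognized object.
    spread = particles.std(axis=0).mean()
    return particles, weights, spread
```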


Lightweight IP Traceback Mechanism on IPv6 Network Environment (IPv6 네트워크 환경에서의 경량화된 IP 역추적 기법)

  • Heo, Joon;Kang, Myung-Soo;Hong, Choong-Seon
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.17 no.2
    • /
    • pp.93-102
    • /
    • 2007
  • A serious problem in fighting DDoS attacks is that attackers use incorrect or spoofed IP addresses in the attack packets. Due to the stateless nature of the Internet, determining the source of these spoofed IP packets is difficult. Most previous studies on preventing and responding to DDoS attacks with traceback mechanisms have been carried out in the IPv4 environment; although a few studies in the IPv6 environment have been introduced, they lack a detailed mechanism to cope with DDoS attacks, and mechanisms for tracing the origin of attacks in IPv6 networks differ in many ways from those of IPv4 networks. In this paper, we propose a lightweight IP traceback mechanism for the IPv6 network environment: when marking for traceback is needed, a router can generate a Hop-by-Hop option and transmit the marked packet. We measured the performance of this mechanism and showed that it achieves efficient marking for traceback.
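
The mechanism hinges on a router inserting its mark as an IPv6 Hop-by-Hop option. The sketch below builds such an extension header with Python's standard `struct` module; the option type (0x3E, from the experimental range) and the router-ID/hop-distance layout are illustrative assumptions, not the encoding specified in the paper:

```python
import struct

def build_hbh_traceback(next_header, router_id, distance):
    """Sketch of an IPv6 Hop-by-Hop extension header carrying a traceback
    mark: a 4-byte router ID plus a 1-byte hop distance."""
    # Option TLV: type, data length, router ID, hop distance (7 bytes).
    mark = struct.pack("!BBIB", 0x3E, 5, router_id, distance)
    # PadN option (7 bytes) pads the header to the required 8-octet multiple.
    padn = struct.pack("!BB", 0x01, 5) + b"\x00" * 5
    # Next Header, then Hdr Ext Len in 8-octet units beyond the first 8.
    return struct.pack("!BB", next_header, 1) + mark + padn  # 16 bytes total
```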

Research on PSNF-m algorithm applying track management technique (트랙관리 기법을 적용한 PSNF-m 표적추적 필터의 성능 분석 연구)

  • Yoo, In-Je
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.18 no.6
    • /
    • pp.681-691
    • /
    • 2017
  • In a clutter environment, the target tracking filter must be updated by detecting the target signal among the many measurements obtained from the radar system, so that the track does not diverge and tracking performance is maintained. The method of associating with the target track the measurement most relevant to it is referred to as data association; PSNF and PSNF-m are data association methods of the SN series. In this paper, we present an IPSNF-m (Integrated Probabilistic Strongest Neighbor Filter-m) algorithm that adds to PSNF-m a track management method based on the track existence probability. This algorithm considers not only the presence of the target but also the case where the target is present but not detected, and calculating the probability of each case enables efficient track management. To verify the performance of the proposed IPSNF-m, we derive the track existence probability of the IPSNF algorithm, which applies the same track management technique to PSNF and is known to perform similarly to PSNF-m. Through simulations in the same environment, we compare the proposed algorithm with the existing PSNF-m and IPSNF algorithms in terms of RMSE, confirmed true tracks, and track existence probability, and show that it performs better in track retention and estimation.
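
The track management the abstract refers to revolves around recursively updating a track existence probability. Below is a simplified IPDA-style recursion, assuming a two-state Markov model for existence and folding all gated-measurement evidence into a single likelihood ratio; the parameter values and the simplified clutter handling are assumptions, not the paper's derivation:

```python
def update_existence(p_exist, gated_likelihood_ratio, p_d=0.9, p_g=0.99,
                     p_stay=0.98, p_birth=0.0):
    """One IPDA-style track-existence update (illustrative sketch).
    `gated_likelihood_ratio` summarizes gated measurements vs. clutter;
    pass 0.0 when nothing falls inside the validation gate."""
    # Markov prediction: the track persists or (optionally) is born.
    p_pred = p_stay * p_exist + p_birth * (1.0 - p_exist)
    # Measurement update: delta reduces to P_D * P_G with no support.
    delta = p_d * p_g * (1.0 - gated_likelihood_ratio)
    return (1.0 - delta) * p_pred / (1.0 - delta * p_pred)
```

With no gated measurement the existence probability decays, which is exactly the behavior used to drop false tracks; strong amplitude or kinematic support (ratio > 1) raises it toward confirmation.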

Assessment of Future Flood According to Climate Change, Rainfall Distribution and CN (기후변화와 강우분포 및 CN에 따른 미래 홍수량 평가)

  • Kwak, Jihye;Kim, Jihye;Jun, Sang Min;Hwang, Soonho;Lee, Sunghack;Lee, Jae Nam;Kang, Moon Seong
    • Journal of The Korean Society of Agricultural Engineers
    • /
    • v.62 no.6
    • /
    • pp.85-95
    • /
    • 2020
  • According to the standard guidelines for design flood estimation (MLTM, 2012; MOE, 2019), the design flood is calculated based on past precipitation. However, due to climate change, the frequency of extreme rainfall events is increasing, so future flood volumes need to be analyzed using climate change scenarios. Meanwhile, the standard guideline was recently revised by the MOE (Ministry of Environment), which proposed a modified Huff distribution and new CN (Curve Number) values for forest and paddy. The objective of this study was to analyze the change in flood volume when the modified Huff distribution and the newly proposed CN are applied to probabilistic precipitation based on SSP and RCP scenarios. The probabilistic rainfall under climate change was calculated from the RCP 4.5/8.5 and SSP 245/585 scenarios, and HEC-HMS (Hydrologic Engineering Center - Hydrologic Modeling System) was used to simulate the flood volume. When the RCP 4.5/8.5 scenarios were replaced with the SSP 245/585 scenarios, the average flood volume increased by 627 m³/s (15%) and 523 m³/s (13%), respectively. With the modified Huff distribution, the flood volume increased by 139 m³/s (3.76%) for the 200-yr frequency and 171 m³/s (4.05%) for the 500-yr frequency. The newly proposed CN increased the future flood volume by 9.5 m³/s (0.30%) for the 200-yr frequency and 8.5 m³/s (0.25%) for the 500-yr frequency. The selection of the climate change scenario was the biggest factor affecting the flood volume, and the impact of the change in the Huff distribution was about 13-16 times larger than that of the CN.
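
The sensitivity to CN follows directly from the standard SCS Curve Number runoff formula, where a higher CN lowers the potential retention S and deepens direct runoff. A worked example with a hypothetical 200 mm design storm (not a value from the study):

```python
def scs_runoff_mm(p_mm, cn):
    """SCS Curve Number direct-runoff depth (standard SI-unit form)."""
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = 0.2 * s                      # initial abstraction
    return 0.0 if p_mm <= ia else (p_mm - ia) ** 2 / (p_mm - ia + s)

# Hypothetical 200 mm storm: raising CN from 70 to 75 increases runoff depth.
print(scs_runoff_mm(200.0, 70))  # ~110.6 mm
print(scs_runoff_mm(200.0, 75))  # ~125.2 mm
```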

Development of Reliability-Based Design Program based on the MATLAB GUI Environment (MATLAB GUI 환경기반 신뢰성 설계기법의 개발)

  • Jeong, Shin-Taek;Ko, Dong-Hui;Park, Tae-Hun;Kim, Jeong-Dae;Cho, Hong-Yeon
    • Journal of Korean Society of Coastal and Ocean Engineers
    • /
    • v.22 no.6
    • /
    • pp.415-422
    • /
    • 2010
  • Reliability-based design programs in a GUI environment are still inadequate, making it difficult for engineers familiar with deterministic design to cope with international design criteria based on probabilistic design. In this study, a design program based on a GUI environment is developed so that design factors can be input more efficiently and design work carried out more easily. The GUI environment is the GUIDE (Graphic User Interface Development Environment) tool supported by the latest MATLAB version 7.1. To test the model's reliability, the probabilities of failure (POF) of a breakwater armor block (AB) and a gravity quay-wall (QW) in the sliding mode are computed using the model at Level II and Level III. The POFs are 55.4~55.7% for the breakwater AB and 0.0006~0.0007% for the gravity QW, whereas a non-GUI program yields POFs of 55.6% for the breakwater AB and 0.0018% for the gravity QW. In comparison, the POF difference is negligible for the breakwater AB because exact input design parameters are available, whereas the larger POF difference for the gravity QW, though within the same order of magnitude, can be explained by differences in the input design factors caused by the poor input data information.
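
As a point of reference for the Level III computation mentioned above, a probability of failure is simply P[g(X) < 0] for a limit-state function g, estimated here by crude Monte Carlo. The sliding limit state and the normal distributions below are illustrative assumptions, not the breakwater or quay-wall models of the paper:

```python
import numpy as np

def pof_level3(g, sample, n=200_000, seed=0):
    """Level III reliability: crude Monte Carlo estimate of P[g(X) < 0]
    for limit-state function g and a sampler of the random inputs."""
    rng = np.random.default_rng(seed)
    x = sample(rng, n)
    return np.mean(g(x) < 0.0)

# Illustrative sliding limit state: resistance R ~ N(10, 2), load S ~ N(8, 1.5).
g = lambda x: x[0] - x[1]
sample = lambda rng, n: (rng.normal(10, 2, n), rng.normal(8, 1.5, n))
print(pof_level3(g, sample))  # ~0.21 for these assumed distributions
```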

Bayesian Sensor Fusion of Monocular Vision and Laser Structured Light Sensor for Robust Localization of a Mobile Robot (이동 로봇의 강인 위치 추정을 위한 단안 비젼 센서와 레이저 구조광 센서의 베이시안 센서융합)

  • Kim, Min-Young;Ahn, Sang-Tae;Cho, Hyung-Suck
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.16 no.4
    • /
    • pp.381-390
    • /
    • 2010
  • This paper describes a procedure for map-based localization of mobile robots using a sensor fusion technique in structured environments. Combining various sensors with different characteristics and limited sensing capabilities is advantageous because the sensors complement and cooperate with each other to obtain better information on the environment. In this paper, for robust self-localization of a mobile robot equipped with a monocular camera and a laser structured light sensor, environment information acquired from the two sensors is combined and fused by a Bayesian sensor fusion technique based on a probabilistic reliability function of each sensor, predefined through experiments. For self-localization using monocular vision, the robot utilizes image features consisting of vertical edge lines extracted from the camera images, which serve as natural landmark points in the self-localization process. With the laser structured light sensor, it utilizes geometrical features composed of corners and planes as natural landmark shapes, extracted from range data at a constant height above the navigation floor. Although each feature group alone is sometimes sufficient to localize the robot, all features from the two sensors are used and fused simultaneously, in terms of information, for reliable localization under various environmental conditions. To verify the advantage of multi-sensor fusion, a series of experiments is performed, and the experimental results are discussed in detail.
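
One common way to realize the reliability-weighted Bayesian fusion described above is log-linear pooling over a discretized pose belief, where each sensor's likelihood is tempered by its reliability weight. A minimal sketch (the weights 0.7 and 0.9 are placeholders, not the experimentally calibrated reliability functions of the paper):

```python
import numpy as np

def fuse_pose_belief(prior, lik_vision, lik_laser, rel_v=0.7, rel_l=0.9):
    """Fuse two sensor likelihoods over a discretized pose grid, each
    tempered by a reliability exponent, then renormalize (Bayes' rule)."""
    post = prior * lik_vision**rel_v * lik_laser**rel_l
    return post / post.sum()

# Toy 1-D pose grid: vision is vague, laser is sharp; fusion sharpens belief.
grid = np.linspace(0.0, 1.0, 101)
prior = np.full_like(grid, 1.0 / grid.size)
lik_v = np.exp(-0.5 * ((grid - 0.48) / 0.15) ** 2)
lik_l = np.exp(-0.5 * ((grid - 0.52) / 0.04) ** 2)
belief = fuse_pose_belief(prior, lik_v, lik_l)
print(grid[belief.argmax()])  # close to 0.52, dominated by the sharper sensor
```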

ROLE OF COMPUTER SIMULATION MODELING IN PESTICIDE ENVIRONMENTAL RISK ASSESSMENT

  • Wauchope, R.Don;Linders, Jan B.H.J.
    • Proceedings of the Korea Society of Environmental Toxicology Conference
    • /
    • 2003.10a
    • /
    • pp.91-93
    • /
    • 2003
  • It has been estimated that the equivalent of approximately $US 50 billion has been spent on research into the behavior and fate of pesticides in the environment since Rachel Carson published “Silent Spring” in 1962. Much of the resulting knowledge has been summarized explicitly in computer algorithms, in a variety of empirical, deterministic, and probabilistic simulation models. These models describe and predict the transport, degradation, and resulting concentrations of pesticides in various compartments of the environment during and after application. In many cases the known errors of model predictions are large; for this reason, the models are typically designed to be “conservative”, i.e., to err on the side of over-predicting concentrations and thus on the side of safety. These predictions are then compared with toxicity data from tests of the pesticide on a series of standard representative biota, including terrestrial and aquatic indicator species and higher animals (e.g., wildlife and humans). The models' predictions are good enough in some cases to screen out compounds that are very unlikely to do harm and to flag compounds that must be investigated further. If further investigation is indicated, a more detailed (and therefore more complicated) model may be employed to give a better estimate, or field experiments may be required. A model may also be used to explore “what if” questions, leading to possible alternative pesticide usage patterns that give lower potential environmental concentrations and allowable exposures. We are currently at a maturing stage in this research, where the knowledge base of pesticide behavior in the environment is growing more slowly than in the past; however, innovative use is being made of the explosion in available computer technology to take ever more advantage of the knowledge we have. In this presentation, current developments in the state of the art as practiced in North America and Europe are reviewed, looking specifically at the efforts of the ‘FOCUS’ consortium in the European Union and the ‘EMWG’ consortium in North America. These groups have been innovative in developing processes and mechanisms for discussion among academic, agricultural, industry, and regulatory scientists, leading to consensus adoption of research advances into risk management methodology.
