• Title/Summary/Keyword: Probabilistic environment

ROLE OF COMPUTER SIMULATION MODELING IN PESTICIDE ENVIRONMENTAL RISK ASSESSMENT

  • Wauchope, R.Don;Linders, Jan B.H.J.
    • Proceedings of the Korea Society of Environmental Toxicology Conference
    • /
    • 2003.10a
    • /
    • pp.91-93
    • /
    • 2003
  • It has been estimated that the equivalent of approximately $US 50 billion has been spent on research on the behavior and fate of pesticides in the environment since Rachel Carson published “Silent Spring” in 1962. Much of the resulting knowledge has been summarized explicitly in computer algorithms in a variety of empirical, deterministic, and probabilistic simulation models. These models describe and predict the transport, degradation, and resultant concentrations of pesticides in various compartments of the environment during and after application. In many cases the known errors of model predictions are large. For this reason they are typically designed to be “conservative”, i.e., to err on the side of over-prediction of concentrations in order to err on the side of safety. These predictions are then compared with toxicity data from tests of the pesticide on a series of standard representative biota, including terrestrial and aquatic indicator species and higher animals (e.g., wildlife and humans). The models' predictions are good enough in some cases to screen out those compounds which are very unlikely to do harm, and to indicate those compounds which must be investigated further. If further investigation is indicated, a more detailed (and therefore more complicated) model may be employed to give a better estimate, or field experiments may be required. A model may also be used to explore “what if” questions leading to possible alternative pesticide usage patterns that give lower potential environmental concentrations and allowable exposures. We are currently at a maturing stage in this research, where the knowledge base of pesticide behavior in the environment is growing more slowly than in the past. However, innovative use is being made of the explosion in available computer technology, applying models to take ever greater advantage of the knowledge we have.
In this presentation, current developments in the state of the art as practiced in North America and Europe will be presented. Specifically, we will look at the efforts of the ‘Focus’ consortium in the European Union and the ‘EMWG’ consortium in North America. These groups have been innovative in developing a process and mechanisms for discussion among academic, agricultural, industry, and regulatory scientists, for consensus adoption of research advances into risk management methodology.

  • PDF

Time Domain Response of Random Electromagnetic Signals for Electromagnetic Topology Analysis Technique

  • Han, Jung-hoon
    • Journal of the Korea Society of Computer and Information
    • /
    • v.27 no.2
    • /
    • pp.135-144
    • /
    • 2022
  • The electromagnetic topology (EMT) technique analyzes each component of an electromagnetic propagation environment separately and combines the components in the form of a network, in order to model a complex propagation environment effectively. In typical commercial communication channel models, since the propagation environment is complex and difficult to predict, a probabilistic propagation channel model based on an averaged solution is used, although its accuracy is low. However, EMT-based modeling is considered for propagation and coupling analysis of threat electromagnetic waves such as electromagnetic pulses, for radio wave models used in electronic warfare, and for local communication channel models used in 5G and 6G communications, all of which require relatively high-accuracy electromagnetic propagation characteristics. This paper describes an effective implementation method, algorithm, and program implementation of the EMT method analyzed in the frequency domain. A method of deriving the time-domain response to an arbitrary applied signal source from the frequency-domain EMT analysis result is also discussed.
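The frequency-to-time-domain step the abstract describes can be sketched with an inverse FFT: multiply the source spectrum by the frequency-domain transfer function, then transform back. The single-pole channel model and all numeric values below are assumptions for illustration, not the paper's EMT network:

```python
import numpy as np

def time_domain_response(source_td, transfer_fd):
    """Time-domain response of a channel given in the frequency domain.

    source_td   : sampled source signal (real-valued)
    transfer_fd : transfer function H(f) sampled on the matching rfft grid
    """
    spectrum = np.fft.rfft(source_td)       # source spectrum
    response_fd = spectrum * transfer_fd    # frequency-domain product
    return np.fft.irfft(response_fd, n=len(source_td))

# Illustrative example: a first-order low-pass channel (assumed, not from the
# paper) applied to a Gaussian pulse sampled at 1 ns.
n, dt = 1024, 1e-9
t = np.arange(n) * dt
pulse = np.exp(-((t - 100e-9) / 10e-9) ** 2)
f = np.fft.rfftfreq(n, dt)
H = 1.0 / (1.0 + 1j * f / 50e6)             # assumed 50 MHz cutoff
y = time_domain_response(pulse, H)
```

The same product-and-invert step applies unchanged when `transfer_fd` comes from a full EMT network solution instead of this toy channel.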

A Study on the Architectural Environment as a Combination of Performance and Event (퍼포먼스.이벤트의 결합체로서 건축환경연구)

  • 김주미
    • Archives of design research
    • /
    • v.14
    • /
    • pp.121-138
    • /
    • 1996
  • The purpose of this study is to develop a new architectural language and design strategies that would anticipate and incorporate new historical situations and new paradigms for understanding the world. It consists of four sections as follows: First, it presents a new interpretation of space, the human body, and movement as found in modern art, and tries to combine that new artistic insight with environmental design to provide a theoretical basis for performance-event architecture. Second, it conceives of the architectural environment as a combination of space, movement, and probabilistic situations rather than a mere conglomeration of material. It also perceives the environment as a stage for performance and the act of designing as a performance. Third, in this context, man is conceived of as an organic system that responds to, interacts with, and adapts himself to his environment through self-regulation. By the same token, architecture should be a dynamic system that undergoes constant transformation in its attempt to accommodate human actions and behaviors, as man copes with a contemporary philosophy characterized by the principle of uncertainty, a fast-changing society, and new developments in technology. Fourth, the relativistic and organic viewpoint that constitutes the background for all this is radically different from the causalistic and mechanistic view that characterized the forms and functions of modernist design. The present study places great emphasis on a dematerialistic conception of environment and puts forth a disprogramming method that would accommodate interchangeability in the passage of time and the intertextuality of form and function. In the end, performance-event architecture is a strategy based on the systems worldview that would enable the recovery of man's autonomy and the reconception of his environment as an object of art.

  • PDF

Development of PBL-Based Computer Application Instruction Model (문제중심학습 기반 컴퓨터활용 수업 모형 개발)

  • Lee, Kyung Mi
    • The Journal of Korean Association of Computer Education
    • /
    • v.16 no.2
    • /
    • pp.29-37
    • /
    • 2013
  • In an environment that emphasizes creativity, there is demand for a computer application instruction model developed for group discussion, broadening the range of thinking, and creativity. The purpose of this study is to develop a PBL-based computer application instruction model concerning creativity, after investigating the problems that arise in group-based computer education. The theoretical background of the class model is PBL (problem-based learning), and the efficiency and creativity of the class are considered while using a web mind map as the tool for discussion and data sharing. As a result of applying the suggested model to students enrolled in a presentation-production course, it produced statistically meaningful outcomes in which complaints about communication, data sharing, role sharing, differences in contribution, excessive run time, etc. were reduced.

  • PDF

Localization Method for Multiple Robots Based on Bayesian Inference in Cognitive Radio Networks (인지 무선 네트워크에서의 베이지안 추론 기반 다중로봇 위치 추정 기법 연구)

  • Kim, Donggu;Park, Joongoo
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.22 no.2
    • /
    • pp.104-109
    • /
    • 2016
  • In this paper, a localization method for multiple robots based on Bayesian inference is proposed for the case where multiple robots adopting multi-RAT (Radio Access Technology) communications exist in cognitive radio networks. The multiple robots are separately defined as primary and secondary users, as in a conventional mobile communications system. In addition, a heterogeneous spectrum environment is considered. To improve localization performance for multiple robots, a realistic multiple-primary-user distribution is described using a probabilistic graphical model, and we then introduce a Gibbs sampler strategy based on Bayesian inference. In addition, a secondary-user selection minimizing the value of GDOP (Geometric Dilution of Precision) is also proposed, in order to overcome the limitations of localization accuracy with Gibbs sampling. The simulation results show that the proposed GDOP-based localization method enhances the accuracy of localization for multiple robots. Furthermore, the simulation results also verify that localization performance is significantly improved with an increasing number of observation samples when the GDOP is considered.
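The GDOP-minimizing selection mentioned in the abstract can be sketched as follows. The 2-D geometry matrix with a clock-bias column is a standard GDOP formulation assumed here for illustration; the anchor layout, function names, and brute-force subset search are not taken from the paper:

```python
import numpy as np
from itertools import combinations

def gdop(anchors, target):
    """GDOP for a 2-D position + clock-bias solution.

    anchors : (n, 2) array of anchor (e.g. secondary-user) positions
    target  : (2,) position at which GDOP is evaluated
    """
    diffs = anchors - target
    ranges = np.linalg.norm(diffs, axis=1)
    units = diffs / ranges[:, None]                    # unit line-of-sight vectors
    G = np.hstack([units, np.ones((len(anchors), 1))]) # clock-bias column
    Q = np.linalg.inv(G.T @ G)                         # covariance shape matrix
    return np.sqrt(np.trace(Q))

def select_anchors(anchors, target, k):
    """Exhaustively pick the k-anchor subset minimizing GDOP (fine for small n)."""
    best = min(combinations(range(len(anchors)), k),
               key=lambda idx: gdop(anchors[list(idx)], target))
    return list(best)
```

With a well-spread geometry (e.g. anchors at the corners of a square around the target) the GDOP stays close to its minimum; near-collinear subsets are penalized automatically because `G.T @ G` approaches singularity.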

A new approach to quantify safety benefits of disaster robots

  • Kim, Inn Seock;Choi, Young;Jeong, Kyung Min
    • Nuclear Engineering and Technology
    • /
    • v.49 no.7
    • /
    • pp.1414-1422
    • /
    • 2017
  • Remote response technology has advanced to the extent that a robot system, if properly designed and deployed, may greatly help respond to beyond-design-basis accidents at nuclear power plants. Particularly in the aftermath of the Fukushima accident, there is increasing interest in developing disaster robots that can be deployed in lieu of a human operator to the field to perform mitigating actions in the harsh environment caused by extreme natural hazards. The nuclear robotics team of the Korea Atomic Energy Research Institute (KAERI) is also endeavoring to construct disaster robots and, first of all, is interested in finding out to what extent safety benefits can be achieved by such a disaster robotic system. This paper discusses a new approach based on the probabilistic risk assessment (PRA) technique, which can be used to quantify safety benefits associated with disaster robots, along with a case study for seismic-induced station blackout condition. The results indicate that to avoid core damage in this special case a robot system with reliability > 0.65 is needed because otherwise core damage is inevitable. Therefore, considerable efforts are needed to improve the reliability of disaster robots, because without assurance of high reliability, remote response techniques will not be practically used.
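The reliability-threshold idea in the abstract can be illustrated with a deliberately simplified one-path model: if the robot is the only available mitigation path after a seismic-induced station blackout, the conditional core-damage probability falls linearly with robot reliability. The formula and all numbers below are illustrative assumptions, not the paper's PRA model:

```python
def core_damage_freq(sbo_freq, robot_reliability):
    """Core-damage frequency (per year) when a disaster robot is the only
    mitigation path: core damage is avoided only if the robot succeeds.

    sbo_freq          : assumed station-blackout frequency (illustrative)
    robot_reliability : probability the robot completes the mitigating action
    """
    return sbo_freq * (1.0 - robot_reliability)

baseline   = core_damage_freq(1e-4, 0.0)   # no robot available
with_robot = core_damage_freq(1e-4, 0.65)  # reliability at the paper's threshold
```

Even in this toy model the benefit is proportional to reliability, which is why the abstract stresses that low-reliability robots yield little practical risk reduction.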

Probabilistic Safety Assessment for High Level Nuclear Waste Repository System

  • Kim, Taw-Woon;Woo, Kab-Koo;Lee, Kun-Jai
    • Journal of Radiation Protection and Research
    • /
    • v.16 no.1
    • /
    • pp.53-72
    • /
    • 1991
  • An integrated model is developed in this paper for the performance assessment of a high-level radioactive waste repository. This integrated model consists of two simple mathematical models. One is a multiple-barrier failure model of the repository system, based on constant failure rates, which provides source terms to the biosphere. The other is a biosphere model with multiple pathways by which radionuclides reach humans. For the parametric uncertainty and sensitivity analysis of the risk assessment of a high-level radioactive waste repository, Latin hypercube sampling and rank correlation techniques are applied to this model. The former is cost-effective for large computer programs because it gives a smaller error in estimating the output distribution, even with a smaller number of runs, than the crude Monte Carlo technique. The latter is good for generating a dependence structure among samples of input parameters. It is also used to find the most sensitive, or important, parameter groups among the given input parameters. This methodology of mathematical modelling with statistical analysis will provide useful insights for decision-making on radioactive waste repository selection and for future research on uncertain and sensitive input parameters.
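The Latin hypercube sampling the abstract contrasts with crude Monte Carlo can be sketched in a few lines: each parameter's (0, 1) range is split into equal-probability strata, one draw is taken per stratum, and the strata are shuffled independently per dimension. This is a generic textbook sketch, not the paper's code:

```python
import numpy as np

def latin_hypercube(n_samples, n_params, rng=None):
    """Latin hypercube sample on the unit hypercube.

    Each column contains exactly one point in each of the n_samples
    equal-probability strata, shuffled independently of the other columns.
    """
    rng = np.random.default_rng(rng)
    # One uniform draw inside each stratum [i/n, (i+1)/n), per dimension.
    u = (np.arange(n_samples)[:, None]
         + rng.random((n_samples, n_params))) / n_samples
    for j in range(n_params):
        rng.shuffle(u[:, j])   # decouple the stratum ordering across dimensions
    return u

samples = latin_hypercube(100, 3, rng=42)
```

The resulting unit-cube samples are then mapped through each input parameter's inverse CDF; the rank-correlation (Iman-Conover style) step the abstract mentions would reorder columns to impose a target dependence structure, which this sketch omits.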

  • PDF

Sensitivity Analysis of Probabilistic Reliability Evaluation of KEPCO System Using TRELSS (TRELSS를 이용한 한전계통의 확률론적 신뢰도 평가의 감도해석)

  • Tran, T.T.;Kwon, J.J.;Choi, J.S.;Jeon, D.H.;Park, Y.S.;Han, G.N.
    • Proceedings of the KIEE Conference
    • /
    • 2005.11b
    • /
    • pp.234-236
    • /
    • 2005
  • The importance and necessity of conducting studies on grid reliability evaluation have grown in recent years due to the number of blackout events occurring throughout the world. Quantitative evaluation of transmission system reliability is very important in a competitive electricity environment, because the successful operation of electric power under a deregulated electricity market depends on transmission system reliability management. This also applies in Korea. This paper introduces the results of many case studies for the KEPCO system using Transmission Reliability Evaluation for Large-Scale Systems (TRELSS) Version 6.2, a program developed by EPRI. Sensitivity analysis has been included in the case studies. This paper suggests that some important input parameters of TRELSS can be determined optimally from this sensitivity analysis, for operating a system at a high reliability level.

  • PDF

Naval ship's susceptibility assessment by the probabilistic density function

  • Kim, Kwang Sik;Hwang, Se Yun;Lee, Jang Hyun
    • Journal of Computational Design and Engineering
    • /
    • v.1 no.4
    • /
    • pp.266-271
    • /
    • 2014
  • The survivability of a naval ship is its capability to avoid or withstand a hostile environment. It is assessed in three categories: susceptibility, vulnerability, and recoverability. The magnitude of the susceptibility of a warship encountering a threat depends on the attributes of its detection equipment and weapon system. In this paper, as part of a naval ship's survivability analysis, an assessment process model for the ship's susceptibility analysis technique is developed. Naval ship survivability, with emphasis on susceptibility, is assessed by the probability of detection and the probability of hit. The assessment procedure for susceptibility is described considering the radar cross section (RCS), with emphasis on a simplified calculation model based on the probability density function for the probability of hit. Assuming the probability of hit for both single-hit and multiple-hit cases, the susceptibility is assessed for a given RCS, and the hit probability for a rectangular target is applied for a given threat.
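The simplified probability-density-function calculation for a rectangular target can be sketched as below, assuming independent normal miss distances in the two axes (an assumption of this sketch, not necessarily the paper's model); the hit probability then factors into two 1-D normal-CDF differences:

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def p_hit_rectangle(width, height, sigma_x, sigma_y, bias_x=0.0, bias_y=0.0):
    """Single-shot hit probability on a width x height rectangle centered at
    the aim point, with independent normal miss distances (sigma_x, sigma_y)
    and optional aiming bias."""
    px = phi((width / 2 - bias_x) / sigma_x) - phi((-width / 2 - bias_x) / sigma_x)
    py = phi((height / 2 - bias_y) / sigma_y) - phi((-height / 2 - bias_y) / sigma_y)
    return px * py

def p_hit_multiple(p_single, n_shots):
    """Probability of at least one hit in n independent shots."""
    return 1.0 - (1.0 - p_single) ** n_shots
```

For example, a target spanning two standard deviations of miss distance in each axis (`width = 2 * sigma_x`, zero bias) gives a single-shot hit probability near 0.47, which the multiple-hit expression then compounds across shots.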

The Numerical Simulation of Multi-directional Waves and Statistical Investigation (다방향파의 수치시뮬레이션 및 통계적 검토)

  • 송명재;조효제;이승건
    • Journal of Ocean Engineering and Technology
    • /
    • v.7 no.2
    • /
    • pp.114-120
    • /
    • 1993
  • Responses of marine vehicles and ocean structures in a seaway can be predicted by applying a probabilistic approach. For a linear system, the responses in a random seaway can be evaluated through spectral analysis in the frequency domain. But to treat a nonlinear system in irregular waves, it is necessary to obtain a time history of the waves. In a previous study we introduced uni-directional waves (long-crested waves) as the wave environment and carried out calculations and experiments in those waves. But the real sea in which marine vehicles and structures operate has multi-directional waves (short-crested waves). It is important to generate a simulated random sea and analyze dynamic problems in it. In general, the entire sample function or the probability density function is needed to infer statistical values of a random process; however, if the process is ergodic, statistical values can be obtained by analyzing one sample function. In this paper, we develop a simulation technique for multi-directional waves and prove, by statistical analysis, that the time history generated by this method retains ergodic characteristics.
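A multi-directional (short-crested) sea of the kind the abstract simulates is commonly built as a double summation of frequency and direction components with random phases. The Bretschneider-type spectrum, cos² spreading function, and all parameter values below are assumptions for illustration, not taken from the paper:

```python
import numpy as np

def simulate_multidirectional(t, x=0.0, y=0.0, n_freq=50, n_dir=15,
                              Hs=2.0, Tp=8.0, seed=0):
    """Surface elevation at (x, y) from a double-summation wave model:
    eta = sum_ij a_ij * cos(k_i (x cos th_j + y sin th_j) - w_i t + eps_ij)
    with a_ij = sqrt(2 S(w_i) D(th_j) dw dth) and random phases eps_ij."""
    rng = np.random.default_rng(seed)
    g = 9.81
    omega = np.linspace(0.3, 3.0, n_freq)           # rad/s frequency grid
    domega = omega[1] - omega[0]
    theta = np.linspace(-np.pi / 2, np.pi / 2, n_dir)
    dtheta = theta[1] - theta[0]
    # Bretschneider-type frequency spectrum (assumed form)
    wp = 2 * np.pi / Tp
    S = (5 / 16) * Hs**2 * wp**4 / omega**5 * np.exp(-1.25 * (wp / omega)**4)
    # cos^2 directional spreading, normalized so it integrates to one
    D = np.cos(theta)**2
    D /= np.sum(D) * dtheta
    k = omega**2 / g                                 # deep-water dispersion
    eps = rng.uniform(0, 2 * np.pi, (n_freq, n_dir))
    amp = np.sqrt(2 * S[:, None] * D[None, :] * domega * dtheta)
    eta = np.zeros_like(t)
    for i in range(n_freq):
        for j in range(n_dir):
            phase = (k[i] * (x * np.cos(theta[j]) + y * np.sin(theta[j]))
                     - omega[i] * t + eps[i, j])
            eta += amp[i, j] * np.cos(phase)
    return eta

# One-hour realization at a fixed point; its variance should approach the
# spectral moment m0 = (Hs/4)^2 if the record is long enough (ergodicity).
t = np.arange(0.0, 3600.0, 0.5)
eta = simulate_multidirectional(t)
```

Checking that the time-average variance of one long realization matches m0 is exactly the kind of ergodicity check the abstract describes.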

  • PDF