• Title/Summary/Keyword: Plausible estimate


Measurements of Velocity Profiles Inside a Partially Filled Pipeline Using PIV (PIV를 이용한 비만관내 유속 분포 측정)

  • Choi, Jung-Geun; Sung, Jae-Yong; Lee, Moung-Ho
    • Proceedings of the SAREK Conference / 2006.06a / pp.773-778 / 2006
  • Velocity profiles inside a partially filled pipeline have been investigated experimentally. To measure the velocity fields, particle image velocimetry (PIV), a quantitative visualization technique, was applied. The velocity profile inside a full circular pipe is well known, but when the pipe is partially filled the problem is entirely different, in the sense that the velocity distribution is significantly affected by factors such as the pipe slope and the filled water level. To calculate the exact flow rate in an open channel or a partially filled pipeline, three-dimensional velocity distributions over a given cross-sectional area were measured, and the resulting flow rates were compared with those from the well-known empirical Manning equation. The results show that the velocity profiles at the center plane differ considerably as the slope and water level change. Thus, the three-dimensional velocity profile provides the most plausible estimate of the exact flow rate.
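The Manning equation that the abstract above compares against can be sketched as follows. This is a minimal illustration of the standard formula, not code from the paper; the pipe geometry values (radius, roughness, slope) are made up for the example.

```python
# Manning equation (SI units): Q = (1/n) * A * R^(2/3) * S^(1/2)
# n: Manning roughness coefficient, A: flow area (m^2),
# R: hydraulic radius A/P (m), S: channel slope (dimensionless).
import math

def manning_flow_rate(n, area, wetted_perimeter, slope):
    """Flow rate (m^3/s) from the Manning formula."""
    hydraulic_radius = area / wetted_perimeter
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * math.sqrt(slope)

# Illustrative case: a half-full circular pipe of radius 0.1 m.
r = 0.1
area = math.pi * r ** 2 / 2        # half of the circle's area
perimeter = math.pi * r            # wetted semicircular perimeter
q = manning_flow_rate(n=0.013, area=area, wetted_perimeter=perimeter, slope=0.001)
```

Because Manning's formula assumes a single representative velocity over the cross-section, a measured three-dimensional velocity field can disagree with it, which is the comparison the paper makes.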

Uncertainties in Risk Assessment

  • Hattis, Dale; Froines, John
    • Korean Society for Preventive Medicine: Conference Proceedings / 1994.02a / pp.440-449 / 1994
  • Current risk assessment practices largely reflect the need for a consistent set of relatively rapid, first-cut procedures to assess 'plausible upper limits' of various risks. These practices have important roles to play in 1) screening candidate hazards for initial attention and 2) directing attention to cases where moderate-cost measures to control exposures are likely to be warranted, in the absence of further extensive (and expensive) data gathering and analysis. A problem with the current practices, however, is that they have led assessors to do a generally poor job of analyzing and expressing uncertainties, fostering 'One-Number Disease' (in which everything from one's social policy position on risk acceptance to one's technical judgment on the likelihood of different cancer dose-response relationships is rolled into a single quantity). At least for analyses that involve relatively important decisions for society (both relatively large potential health risks and relatively large potential economic costs or other disruptions), we can and should go at least one step further: assess and convey a central tendency estimate of exposure and risk as well as our more conventional 'conservative' upper-confidence-limit values. To accomplish this, more sophisticated efforts are needed to appropriately represent the likely effects of various sources of uncertainty along the causal chain from the release of toxicants to the production of adverse effects. When the effects of individual sources of uncertainty are assessed (and any important interactions included), Monte Carlo simulation procedures can be used to produce an overall analysis of uncertainties and to highlight areas where uncertainties might be appreciably reduced by further study. Beyond the information yielded by such analyses for decision-making in a few important cases, the value of doing several exemplary risk assessments in this way is that a set of benchmarks can be defined that will help calibrate the assumptions used in the larger number of risk assessments that must be done by 'default' procedures.
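The Monte Carlo approach the authors describe can be sketched in a few lines: propagate uncertain inputs along the causal chain (release, exposure, dose-response) and report both a central tendency and an upper-confidence value rather than a single number. The distributions and parameters below are purely illustrative assumptions, not values from the paper.

```python
# Toy Monte Carlo propagation of uncertainty: each draw multiplies an
# uncertain source term, an uncertain exposure factor, and an uncertain
# dose-response slope, yielding a distribution of risk rather than one number.
import random

random.seed(0)

def simulate_risk(n_draws=100_000):
    risks = []
    for _ in range(n_draws):
        emission = random.lognormvariate(0.0, 0.5)               # source term
        exposure = emission * random.lognormvariate(-2.0, 0.7)   # transport/uptake
        potency = random.lognormvariate(-5.0, 1.0)               # dose-response slope
        risks.append(exposure * potency)
    risks.sort()
    median = risks[n_draws // 2]          # central tendency estimate
    upper_95 = risks[int(0.95 * n_draws)] # conventional 'conservative' value
    return median, upper_95

median, upper_95 = simulate_risk()
```

Reporting both quantities, as the abstract argues, avoids rolling policy conservatism and technical judgment into a single figure.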

Discovery of an elliptical jellyfish galaxy with MUSE

  • Sheen, Yun-Kyeong; Smith, Rory; Jaffe, Yara; Kim, Minjin; Duc, Pierre-Alain; Ree, Chang Hee; Nantais, Julie; Candlish, Graeme; Yi, Sukyoung; Demarco, Ricardo; Treister, Ezequiel
    • The Bulletin of The Korean Astronomical Society / v.42 no.2 / pp.46.2-46.2 / 2017
  • We will present the discovery of an elliptical jellyfish galaxy in Abell 2670 (Sheen et al. 2017, ApJL, 840, L7). Our MUSE IFU spectra revealed a rotating gas disk in the center of the galaxy and long ionised gas tails emanating from the disk. Its one-sided tails and the tadpole-like morphology of star-forming blobs around the galaxy suggest that the galaxy is experiencing strong ram-pressure stripping in the cluster environment. Stellar kinematics derived from stellar absorption lines in the MUSE spectra demonstrated that the galaxy is an elliptical galaxy without any hint of a stellar disk. The primary question, then, is the origin of the rich gas component in this elliptical galaxy. A plausible scenario is a wet merger with a gas-rich companion. To investigate the star formation history of the system (the galaxy and the star-forming blobs), we derived star formation rates and metallicities from the MUSE spectra. Photometric UV-Optical-IR SED fitting was also performed using GALEX, SDSS, 2MASS, and WISE data to estimate the dust and gas masses in the system. For a better understanding of the star formation history and environmental effects on this galaxy, FIR/sub-mm follow-up observations are proposed.

Mapping the Mass of the Double Radio Relic Merging Galaxy Cluster PLCK G287+32.9: A Subaru and HST Weak-lensing Analysis

  • Finner, Kyle; Jee, Myungkook James; Dawson, William; Golovich, Nathan; Gruen, Daniel; Lemaux, Brian; Wittman, David
    • The Bulletin of The Korean Astronomical Society / v.42 no.2 / pp.41.2-41.2 / 2017
  • Discovered as the second-highest S/N detection of the Planck SZ survey, PLCK G287.0+32.9 is a massive galaxy cluster that belongs to a rare collection of merging clusters exhibiting two radio relics and a radio halo. A feature that makes this cluster even more unusual is the separation of the radio relics, with one ~400 kpc to the north-west of the X-ray peak and the other ~2.8 Mpc to the south-east. This asymmetric configuration requires a complex merging scenario. A key to gaining insight into the events that formed these merging features is to understand the dark matter mass distribution. Using a weak-lensing technique on deep Subaru and Hubble Space Telescope observations, we map the dark matter mass distribution of PLCK G287.0+32.9. Our investigation detects five significant mass structures. The mass is dominated by a primary structure centered near the X-ray peak of the intracluster medium. Four lesser mass structures are detected: two located within ~1 arcmin of the primary mass structure, a third to the north-west, and a fourth near the south-east radio relic. Along with these detections, we estimate the mass of each structure and relate their distributions to the intracluster medium and galaxy distributions. In addition, we discuss the relation of the mass structures to the formation of the relics and plausible merging scenarios.

Total and Partial Prevalence of Cancer Across Kerman Province, Iran, in 2014, Using an Adapted Generalized Network Scale-Up Method

  • Vardanjani, Hossein Molavi; Baneshi, Mohammad Reza; Haghdoost, AliAkbar
    • Asian Pacific Journal of Cancer Prevention / v.16 no.13 / pp.5493-5498 / 2015
  • Due to the lack of nationwide population-based cancer registration, the total cancer prevalence in Iran is unknown. Our previous work, in which we used a basic network scale-up (NSU) method, failed to provide plausible estimates of total cancer prevalence in Kerman. The aim of the present study was to estimate the total and partial prevalence of cancer in southeastern Iran using an adapted version of the generalized network scale-up method. A survey was conducted in 2014 using multi-stage cluster sampling. A total of 1995 face-to-face, gender-matched interviews were performed based on an adapted version of the NSU questionnaire. Interviewees were asked about their family cancer history. Total and partial prevalence were estimated using a generalized NSU estimator, and the Monte Carlo method was adopted to estimate the upper and lower bounds of the uncertainty range of the point estimates. One-yr, 2-3 yr, and 4-5 yr prevalences (per 100,000 people) were estimated at 78 (95% CI: 66-90), 128 (95% CI: 118-147), and 59 (95% CI: 49-70) for women, and 48 (95% CI: 38-58), 78 (95% CI: 66-91), and 42 (95% CI: 32-52) for men. The 5-yr prevalence of all cancers was estimated at 0.18 percent for men and 0.27 percent for women. This study showed that the generalized familial network scale-up method is capable of estimating cancer prevalence with acceptable precision.
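The basic NSU idea referenced above can be sketched briefly: estimate prevalence as the total number of cases respondents report in their personal networks divided by the total size of those networks, with resampling to approximate uncertainty bounds. This is a simplified illustration of the basic estimator, not the paper's adapted generalized version, and the survey data below are invented.

```python
# Basic network scale-up (NSU) prevalence estimate with Monte Carlo
# (bootstrap-style) uncertainty bounds. All data are made up.
import random

random.seed(1)

def nsu_prevalence(cases_known, network_sizes):
    """Point estimate: total cases known / total personal-network size."""
    return sum(cases_known) / sum(network_sizes)

def monte_carlo_bounds(cases_known, network_sizes, n_rep=2000, alpha=0.05):
    """Resample respondents to approximate an uncertainty range."""
    n = len(cases_known)
    estimates = []
    for _ in range(n_rep):
        idx = [random.randrange(n) for _ in range(n)]
        estimates.append(nsu_prevalence([cases_known[i] for i in idx],
                                        [network_sizes[i] for i in idx]))
    estimates.sort()
    return estimates[int(alpha / 2 * n_rep)], estimates[int((1 - alpha / 2) * n_rep)]

# Hypothetical survey: cancer cases each respondent knows, and network sizes.
cases = [0, 1, 0, 0, 2, 0, 1, 0, 0, 0]
networks = [250, 300, 180, 220, 400, 150, 310, 270, 190, 230]
point = nsu_prevalence(cases, networks)
lo, hi = monte_carlo_bounds(cases, networks)
```

The adapted estimator in the paper additionally corrects for reporting and network biases, which this sketch omits.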

A novel evidence theory model and combination rule for reliability estimation of structures

  • Tao, Y.R.; Wang, Q.; Cao, L.; Duan, S.Y.; Huang, Z.H.H.; Cheng, G.Q.
    • Structural Engineering and Mechanics / v.62 no.4 / pp.507-517 / 2017
  • Due to the discontinuous nature of uncertainty quantification in conventional evidence theory (ET), the computational cost of reliability analysis based on an ET model is very high. This paper puts forward a novel ET model based on fuzzy distributions, together with a corresponding combination rule to synthesize the judgments of experts. The intersection and union of membership functions are defined as the belief and plausibility membership functions, respectively, and Murphy's average combination rule is adopted to combine the basic probability assignments for focal elements. The combined membership functions are then transformed into an equivalent probability density function by a normalizing factor. Finally, a reliability analysis procedure for structures with a mixture of epistemic and aleatory uncertainties is presented, in which the equivalent normalization method is adopted to solve for the upper and lower bounds of reliability. The effectiveness of the procedure is demonstrated by a numerical example and an engineering example. The results show that the reliability interval calculated by the suggested method is almost identical to that obtained by the conventional method, while its computational cost is much lower. The suggested ET model provides a new way to flexibly represent epistemic uncertainty and an efficient method to estimate the reliability of structures with a mixture of epistemic and aleatory uncertainties.
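Murphy's average combination rule mentioned in the abstract can be illustrated in a few lines: average the experts' basic probability assignments (BPAs), then combine the average with itself n-1 times using Dempster's rule. The focal elements and masses below are invented for the example and are unrelated to the paper's fuzzy-distribution model.

```python
# Murphy's average combination rule for two expert BPAs, represented as
# {frozenset of hypotheses: mass} dictionaries.

def dempster_combine(m1, m2):
    """Dempster's rule: multiply masses, keep non-empty intersections,
    and renormalize by 1 - conflict."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    k = 1.0 - conflict
    return {s: v / k for s, v in combined.items()}

def murphy_combine(bpas):
    """Average the BPAs, then Dempster-combine the average n-1 times."""
    n = len(bpas)
    avg = {}
    for m in bpas:
        for s, v in m.items():
            avg[s] = avg.get(s, 0.0) + v / n
    result = avg
    for _ in range(n - 1):
        result = dempster_combine(result, avg)
    return result

A, B = frozenset({"A"}), frozenset({"B"})
AB = A | B  # the "either" focal element
expert1 = {A: 0.6, B: 0.1, AB: 0.3}
expert2 = {A: 0.4, B: 0.4, AB: 0.2}
fused = murphy_combine([expert1, expert2])
```

Averaging before combining is what makes Murphy's rule robust to conflicting expert judgments, compared with applying Dempster's rule directly.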

A Study on Estimating Route Travel Time Using Collected Data of Bus Information System (버스정보시스템(BIS) 수집자료를 이용한 경로통행시간 추정)

  • Lee, Young Woo
    • KSCE Journal of Civil and Environmental Engineering Research / v.33 no.3 / pp.1115-1122 / 2013
  • Demand for traffic information has recently been increasing, and travel time is one of the most important items of traffic information. To estimate exact travel times effectively, highly reliable traffic data collection is required. BIS (Bus Information System) data are useful for estimating route travel time because the BIS collects bus travel time data on the main roads of a city in real time. Traditionally, the use of BIS data has been limited to bus operations and has not been applied to a wider variety of traffic purposes. This study therefore estimates route travel times on urban road networks from real-time BIS data and constructs regression models. The models use, as the explanatory variable, the bus travel time excluding service time at bus stops. The coefficient of determination of the constructed regression model exceeds 0.950. A t-test comparing the collected data with the model estimates indicates that the model is statistically significant at the 95% confidence level. Overall, estimating exact travel times in real time appears plausible when BIS data are used.
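The kind of regression described above (route travel time regressed on bus travel time with stop dwell time removed) can be sketched with ordinary least squares. The observations and the resulting coefficients below are hypothetical, not values from the paper.

```python
# Simple OLS fit of y = b0 + b1 * x, returning the coefficient of
# determination (r^2) alongside the coefficients.

def fit_simple_regression(x, y):
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    b1 = sxy / sxx
    b0 = mean_y - b1 * mean_x
    ss_res = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    return b0, b1, 1.0 - ss_res / ss_tot

# Hypothetical observations (minutes): bus travel time excluding dwell
# at stops, versus the route travel time to be estimated.
bus_time = [10.2, 12.5, 8.9, 15.1, 11.4, 13.8]
route_time = [9.0, 11.2, 7.8, 13.6, 10.1, 12.4]
b0, b1, r2 = fit_simple_regression(bus_time, route_time)
```

An r² near the paper's reported 0.950+ would indicate that dwell-adjusted bus time explains most of the variation in route travel time.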

On Generating a Dynamic Price Formation System with Rationality -Application to U.S. Fisheries- (합리성을 가진 동태적 가격형성모형의 연구 -U.S. 수산자원에의 응용-)

  • Park, Hoanjae
    • Environmental and Resource Economics Review / v.14 no.3 / pp.699-728 / 2005
  • This article is essentially an extension of the price formation systems of Barten (1993), Brown et al. (1995), and Holt and Bishop (2002). A new dynamic price formation system is attempted that incorporates full rationality on the consumers' side. The underlying idea is that consumers are rational and farsighted, and thus consider past and future consumption in addition to current consumption when accepting the prices traders call. In an empirical application, U.S. commercial fish demand data are particularly interesting for this analysis, since many species, including many of the most valuable, are overfished. In particular, the grouper-snapper complex is under the management jurisdiction of the National Marine Fisheries Council. The empirical section shows how to adapt the model to estimate the marginal values of commercial fisheries to consumers. Since regulations are conceived of as inducing movements along the marginal value curves, the model is of growing importance to regional and national policy makers confronted with competing claims on diminishing fish stocks by commercial fisheries interests. The model performs well and shows plausible signs and magnitudes of price flexibilities and interactions among species. It further contributes to the general methodology of applied economics.