• Title/Summary/Keyword: Bayesian Model

Search Results: 1,317

Target-free vision-based approach for vibration measurement and damage identification of truss bridges

  • Dong Tan;Zhenghao Ding;Jun Li;Hong Hao
    • Smart Structures and Systems
    • /
    • v.31 no.4
    • /
    • pp.421-436
    • /
    • 2023
  • This paper presents a vibration displacement measurement and damage identification method for a space truss structure based on its vibration videos. The Features from Accelerated Segment Test (FAST) algorithm is combined with an adaptive threshold strategy to detect high-quality feature points within the Region of Interest (ROI) around each node of the truss structure. These points are then tracked by the Kanade-Lucas-Tomasi (KLT) algorithm along the video frame sequence to obtain vibration displacement time histories. For cases in which the image plane is not parallel to the truss structural plane, the scale factors cannot be applied directly, so these videos are processed with a homography transformation. After scale factor adaptation, the tracking results are expressed in physical units and compared with ground truth data. The main operational frequencies and the corresponding mode shapes are identified from the obtained vibration displacement responses using Stochastic Subspace Identification (SSI) and compared with ground truth data. Structural damage is quantified by elemental stiffness reductions. A Bayesian inference-based objective function is constructed from the natural frequencies to identify damage by model updating, and Success-History based Adaptive Differential Evolution with Linear Population Size Reduction (L-SHADE) is applied to minimise this objective function by tuning the damage parameter of each element. The locations and severities of damage in each case are then identified, and the accuracy and effectiveness of the method are verified by comparing the identified results with the ground truth data.
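The measurement pipeline described in this abstract (FAST corner detection inside an ROI, followed by KLT tracking across frames) can be sketched roughly as follows. This is a minimal illustration assuming OpenCV; the video path, ROI box, and scale factor are placeholders, and the adaptive thresholding, homography correction, SSI, and L-SHADE steps from the paper are not reproduced here.

```python
import cv2
import numpy as np

# Hypothetical inputs: a vibration video and an ROI around one truss node.
VIDEO_PATH = "truss_vibration.mp4"      # placeholder path
X, Y, W, H = 400, 300, 80, 80           # placeholder ROI (pixels)
MM_PER_PIXEL = 0.5                      # placeholder scale factor

cap = cv2.VideoCapture(VIDEO_PATH)
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# FAST corner detection restricted to the ROI around the node.
fast = cv2.FastFeatureDetector_create(threshold=25, nonmaxSuppression=True)
roi = prev_gray[Y:Y + H, X:X + W]
keypoints = fast.detect(roi, None)
p0 = np.float32([[kp.pt[0] + X, kp.pt[1] + Y] for kp in keypoints]).reshape(-1, 1, 2)
ref = p0.copy()                          # initial positions, displacement reference

displacements = []                       # mean vertical displacement per frame, in mm
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # KLT (pyramidal Lucas-Kanade) tracking of the FAST points.
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None,
                                             winSize=(21, 21), maxLevel=3)
    good_new = p1[status == 1]
    good_ref = ref[status == 1]
    dy = np.mean(good_new[:, 1] - good_ref[:, 1])   # pixel displacement vs. frame 1
    displacements.append(dy * MM_PER_PIXEL)         # convert to physical units
    prev_gray = gray
    p0 = good_new.reshape(-1, 1, 2)
    ref = good_ref.reshape(-1, 1, 2)

cap.release()
print(f"{len(displacements)} frames tracked")
```

The resulting displacement time history would then feed the modal identification and model-updating stages described in the abstract.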

Spatio-temporal Distribution of Suicide Risk in Iran: A Bayesian Hierarchical Analysis of Repeated Cross-sectional Data

  • Nazari, Seyed Saeed Hashemi;Mansori, Kamyar;Kangavari, Hajar Nazari;Shojaei, Ahmad;Arsang-Jang, Shahram
    • Journal of Preventive Medicine and Public Health
    • /
    • v.55 no.2
    • /
    • pp.164-172
    • /
    • 2022
  • Objectives: We aimed to estimate the space-time distribution of the risk of suicide mortality in Iran from 2006 to 2016. Methods: In this repeated cross-sectional study, the age-standardized risk of suicide mortality from 2006 to 2016 was determined. To estimate the cumulative and temporal risk, the Besag, York, and Mollié and Bernardinelli models were used. Results: The relative risk of suicide mortality was greater than 1 in 43.0% of Iran's provinces (posterior probability >0.8; range, 0.46 to 3.93). The spatio-temporal model indicated a high risk of suicide in 36.7% of Iran's provinces. In addition, significant upward temporal trends in suicide risk were observed in the provinces of Tehran, Fars, Kermanshah, and Gilan. A significantly decreasing pattern of risk was observed for men (β, -0.013; 95% credible interval [CrI], -0.010 to -0.007), and a stable pattern of risk was observed for women (β, -0.001; 95% CrI, -0.010 to 0.007). A decreasing pattern of suicide risk was observed for those aged 15-29 years (β, -0.006; 95% CrI, -0.010 to -0.0001) and 30-49 years (β, -0.001; 95% CrI, -0.018 to -0.002). The risk was stable for those aged >50 years. Conclusions: The highest risk of suicide mortality was observed in Iran's northwestern provinces and among Kurdish women. Although a low risk of suicide mortality was observed in the provinces of Tehran, Fars, and Gilan, the risk in these provinces is increasing rapidly compared to other regions.
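The Bernardinelli-style space-time structure referred to above (an area-level baseline relative risk combined with area-specific linear time trends on the log scale) can be illustrated with a small Bayesian sketch. The example below assumes PyMC and simulated counts; it keeps only an unstructured area effect and omits the spatially structured (BYM/CAR) component and the real provincial data.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_areas, n_years = 31, 11                             # provinces x years (placeholder sizes)
expected = rng.uniform(50, 200, (n_areas, n_years))   # expected deaths E_it (simulated)
observed = rng.poisson(expected * 1.2)                # observed deaths O_it (simulated)
t = np.arange(n_years) - n_years / 2                  # centred time index

with pm.Model() as model:
    alpha = pm.Normal("alpha", 0.0, 1.0)                       # overall log relative risk
    u = pm.Normal("u", 0.0, 1.0, shape=n_areas)                # unstructured area effect
    beta = pm.Normal("beta", 0.0, 0.1, shape=n_areas)          # area-specific time trend
    log_rr = alpha + u[:, None] + beta[:, None] * t[None, :]   # log RR_it
    pm.Poisson("y", mu=expected * pm.math.exp(log_rr), observed=observed)
    idata = pm.sample(1000, tune=1000, chains=2, target_accept=0.9, random_seed=1)

# Posterior probability that the baseline relative risk in each area exceeds 1.
rr = np.exp(idata.posterior["alpha"] + idata.posterior["u"])
print((rr > 1).mean(dim=("chain", "draw")).values)
```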

HI gas kinematics of paired galaxies in the cluster environment from ASKAP pilot observations

  • Kim, Shin-Jeong;Oh, Se-Heon;Kim, Minsu;Park, Hye-Jin;Kim, Shinna
    • The Bulletin of The Korean Astronomical Society
    • /
    • v.46 no.2
    • /
    • pp.70.1-70.1
    • /
    • 2021
  • We examine the HI gas kinematics and distributions of galaxy pairs in group or cluster environments from high-resolution Australian Square Kilometre Array Pathfinder (ASKAP) WALLABY pilot observations. We use 32 well-resolved close-pair galaxies from Hydra, Norma, and NGC 4636 (two clusters and a group), identified from their spectroscopic information and additional visual inspection. We perform profile decomposition of the HI velocity profiles of the galaxies using a new tool, BAYGAUD, which allows us to separate a line-of-sight velocity profile into an optimal number of Gaussian components based on Bayesian MCMC techniques. We then construct super profiles by stacking individual HI velocity profiles after aligning their central velocities. We fit a double-Gaussian model to the super profiles and classify the components as kinematically cold and warm HI gas with respect to their velocity dispersions (narrower or wider σ, respectively). The kinematically cold HI gas reservoir (M_cold/M_HI) of the paired galaxies is found to be relatively higher than that of unpaired control samples in the clusters and the group, showing a positive correlation with HI mass in general. Additionally, we quantify the gravitational instability of the HI gas disks of the sample galaxies using their Toomre Q parameters and HI morphological disturbances. While no significant difference is found in the Q parameter values between the paired and unpaired galaxies, the paired galaxies tend to have larger HI asymmetry values, derived from their moment-0 maps, than the non-paired control sample galaxies.
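The double-Gaussian decomposition of a stacked super profile into a narrow (kinematically cold) and a broad (kinematically warm) component can be sketched as below. This is an illustrative least-squares fit on a synthetic profile using scipy, not the BAYGAUD Bayesian MCMC decomposition used in the work itself.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss(v, amp, mu, sigma):
    return amp * np.exp(-0.5 * ((v - mu) / sigma) ** 2)

def double_gauss(v, a1, mu, s1, a2, s2):
    # Narrow (cold) + broad (warm) component sharing a common central velocity.
    return gauss(v, a1, mu, s1) + gauss(v, a2, mu, s2)

# Synthetic stacked super profile (placeholder for real HI data).
v = np.linspace(-150, 150, 301)                     # km/s
rng = np.random.default_rng(42)
profile = double_gauss(v, 1.0, 0.0, 12.0, 0.4, 45.0) + rng.normal(0, 0.02, v.size)

p0 = [1.0, 0.0, 10.0, 0.3, 40.0]                    # initial guesses
popt, _ = curve_fit(double_gauss, v, profile, p0=p0)
a1, mu, s1, a2, s2 = popt

# Flux fraction carried by the kinematically cold (narrow) component.
flux_cold = abs(a1 * s1) * np.sqrt(2 * np.pi)
flux_warm = abs(a2 * s2) * np.sqrt(2 * np.pi)
print(f"sigma_cold = {min(abs(s1), abs(s2)):.1f} km/s, "
      f"cold fraction = {flux_cold / (flux_cold + flux_warm):.2f}")
```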

Sensitivity assessment of environmental drought based on Bayesian Network model in the Nakdong River basin (베이지안 네트워크 모형 기반의 환경적 가뭄의 민감도 평가: 낙동강 유역을 대상으로)

  • Yoo, Jiyoung;Kim, Tae-Woong
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2021.06a
    • /
    • pp.79-79
    • /
    • 2021
  • It is important to assess the environmental impacts of meteorological drought, caused by precipitation deficits, on aquatic ecosystems (rivers), lake environments (reservoirs), and watershed environments (mid-sized basins). Even when a precipitation deficit of the same magnitude occurs, some basins experience very large environmental impacts on water quality and aquatic ecology, while others are able to maintain a certain degree of resilience. In other words, the environmental impact of drought is likely to differ among basins, and continuous environmental drought monitoring is important for basins that are vulnerable to environmental drought. Assessing drought impacts from an environmental perspective requires monitoring that links various water quality variables, along with an efficient decision-making tool for the various stakeholders involved in drought management. This study therefore proposes a sensitivity assessment method for environmental drought by applying a Bayesian network model that can provide information for various scenarios. Focusing on the Nakdong River basin, where water quality problems are most severe, a Bayesian network model was used to capture the complex interdependencies among aquatic-ecology and environmental variables (BOD, T-P, TOC) driven by meteorological drought. A model was also constructed to link and interpret the environmental impacts of meteorological drought between upstream and downstream areas. As a result, it was possible to distinguish mid-sized basins with high environmental sensitivity to meteorological drought (e.g., the Imha Dam basin) from those with low sensitivity (e.g., the Byeongseong Stream basin). It was also confirmed that a severe meteorological drought occurring upstream may sustain environmental impacts in downstream areas. The proposed method is therefore expected to be useful for prioritizing areas vulnerable to environmental drought and for monitoring environmental drought between upstream and downstream areas.
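A toy version of the kind of Bayesian network used here, linking a meteorological drought state to downstream water-quality indicators, might look like the sketch below. It assumes the pgmpy library (its BayesianNetwork/TabularCPD interface, whose class names may differ across versions); the network structure, node names, and conditional probabilities are invented placeholders rather than values from the study.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Hypothetical structure: drought state influences BOD and T-P, which influence
# an overall "environmental drought impact" node (all nodes binary: 0=normal, 1=bad).
model = BayesianNetwork([("Drought", "BOD"), ("Drought", "TP"),
                         ("BOD", "Impact"), ("TP", "Impact")])

cpd_drought = TabularCPD("Drought", 2, [[0.8], [0.2]])
cpd_bod = TabularCPD("BOD", 2, [[0.9, 0.4], [0.1, 0.6]],
                     evidence=["Drought"], evidence_card=[2])
cpd_tp = TabularCPD("TP", 2, [[0.85, 0.5], [0.15, 0.5]],
                    evidence=["Drought"], evidence_card=[2])
cpd_impact = TabularCPD("Impact", 2,
                        [[0.95, 0.6, 0.7, 0.2],    # P(Impact=0 | BOD, TP)
                         [0.05, 0.4, 0.3, 0.8]],   # P(Impact=1 | BOD, TP)
                        evidence=["BOD", "TP"], evidence_card=[2, 2])
model.add_cpds(cpd_drought, cpd_bod, cpd_tp, cpd_impact)
assert model.check_model()

# Sensitivity-style query: how does the impact probability change under drought?
infer = VariableElimination(model)
print(infer.query(["Impact"], evidence={"Drought": 1}))
print(infer.query(["Impact"], evidence={"Drought": 0}))
```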

Movie Choice under Joint Decision: Reassessment of Online WOM Effect

  • Kim, Youngju;Kim, Jaehwan
    • Asia Marketing Journal
    • /
    • v.15 no.1
    • /
    • pp.155-168
    • /
    • 2013
  • This study describes consumers' movie choices made in conjunction with other group members and attempts to reassess the effect of online word of mouth (WOM) in a joint decision context. The tendency of many people to go to movies in groups has been mentioned in previous literature, but there is no modeling research that studies movie choice from the group decision perspective. We found that ignoring the group movie-going perspective can result in misunderstanding, especially underestimation of genre preference and of the impact of the WOM variables. Most studies measuring online WOM effects have been done at the aggregate level, and the role of online WOM variables (volume vs. valence) is mixed in the literature. We postulate that group-level analysis might offer insight to resolve this mixed understanding of WOM effects. We implemented the study via a random effects model with group-level heterogeneity. Romance, drama, and action were selected as genre variables; valence and volume were selected as online WOM variables. A choice-based conjoint survey was used for data collection, and the models were estimated via Bayesian MCMC methods. The empirical results show that (i) both genre and online WOM are important variables when consumers choose movies, especially as a group, and (ii) the WOM valence effect is amplified more than the volume effect when individuals are engaged in a group decision. This research contributes to the literature in several ways. First, we investigate movie choice from a group movie-going perspective, which is more realistic and consistent with market behavior. Second, the study sheds new light on the WOM effect: at the group level, both valence and volume significantly affect movie choices, which adds to the understanding of the role of online WOM in consumers' movie choice.
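The kind of random-effects model with group-level heterogeneity described here can be sketched, in simplified binary form, roughly as follows. The example assumes PyMC and simulated data; the real study uses a choice-based conjoint design with genre and WOM covariates, which are stood in for by a small invented design matrix.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(7)
n_groups, n_obs = 40, 400
group = rng.integers(0, n_groups, n_obs)
# Invented covariates standing in for a genre dummy and WOM valence/volume.
X = np.column_stack([rng.integers(0, 2, n_obs),        # e.g. "action" dummy
                     rng.normal(0, 1, n_obs),          # WOM valence (standardized)
                     rng.normal(0, 1, n_obs)])         # WOM volume (standardized)
true_beta = np.array([0.5, 1.0, 0.3])
true_u = rng.normal(0, 0.8, n_groups)
y = rng.binomial(1, 1 / (1 + np.exp(-(X @ true_beta + true_u[group]))))

with pm.Model() as choice_model:
    beta = pm.Normal("beta", 0.0, 2.0, shape=3)          # fixed effects
    sigma_u = pm.HalfNormal("sigma_u", 1.0)              # group heterogeneity scale
    u = pm.Normal("u", 0.0, sigma_u, shape=n_groups)     # group random intercepts
    eta = pm.math.dot(X, beta) + u[group]
    pm.Bernoulli("choice", logit_p=eta, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=1)

print(idata.posterior["beta"].mean(dim=("chain", "draw")).values)
```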

Application of deep learning with bivariate models for genomic prediction of sow lifetime productivity-related traits

  • Joon-Ki Hong;Yong-Min Kim;Eun-Seok Cho;Jae-Bong Lee;Young-Sin Kim;Hee-Bok Park
    • Animal Bioscience
    • /
    • v.37 no.4
    • /
    • pp.622-630
    • /
    • 2024
  • Objective: Pig breeders cannot obtain phenotypic information at the time of selection for sow lifetime productivity (SLP); they would therefore benefit from genetic information on candidate sows. Genomic data interpreted using deep learning (DL) techniques could contribute to the genetic improvement of SLP and maximize farm profitability, because DL models capture nonlinear genetic effects such as dominance and epistasis more efficiently than conventional genomic prediction methods based on linear models. This study aimed to investigate the usefulness of DL for the genomic prediction of two SLP-related traits: lifetime number of litters (LNL) and lifetime pig production (LPP). Methods: Two bivariate DL models, a convolutional neural network (CNN) and a local convolutional neural network (LCNN), were compared with conventional bivariate linear models (i.e., genomic best linear unbiased prediction, Bayesian ridge regression, Bayes A, and Bayes B). Phenotype and pedigree data were collected from 40,011 sows with husbandry records. Among these, 3,652 pigs were genotyped using the PorcineSNP60K BeadChip. Results: The best predictive correlation for LNL was obtained with the CNN (0.28), followed by the LCNN (0.26) and the conventional linear models (approximately 0.21). For LPP, the best predictive correlation was also obtained with the CNN (0.29), followed by the LCNN (0.27) and the conventional linear models (approximately 0.25). A similar trend was observed in the mean squared error of prediction for the SLP traits. Conclusion: This study provides an example of a CNN that can outperform linear model-based genomic prediction approaches when nonlinear interaction components are important, as LNL and LPP exhibited strong epistatic interaction components. Additionally, our results suggest that applying bivariate DL models could further improve prediction accuracy by exploiting the genetic correlation between LNL and LPP.
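A bivariate CNN for genomic prediction of this kind (SNP genotypes in, two correlated traits out) can be sketched roughly as below. The sketch assumes PyTorch and random placeholder genotypes; the layer sizes and the 60K-SNP input length are illustrative, not the architecture used in the paper.

```python
import torch
import torch.nn as nn

N_SNPS = 60000          # e.g. PorcineSNP60K-sized input (placeholder)
N_TRAITS = 2            # bivariate output: LNL and LPP

class BivariateCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=25, stride=5), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=9, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(32),
        )
        self.head = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * 32, 64), nn.ReLU(),
            nn.Linear(64, N_TRAITS),     # joint prediction of both traits
        )

    def forward(self, x):                # x: (batch, 1, N_SNPS), genotypes coded 0/1/2
        return self.head(self.features(x))

# Toy training loop on random genotypes and phenotypes.
model = BivariateCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
geno = torch.randint(0, 3, (64, 1, N_SNPS)).float()
pheno = torch.randn(64, N_TRAITS)
for epoch in range(3):
    opt.zero_grad()
    loss = loss_fn(model(geno), pheno)
    loss.backward()
    opt.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

The shared convolutional trunk with a two-unit output head is one simple way to let the model exploit the genetic correlation between the two traits.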

A Study on the Overall Economic Risks of a Hypothetical Severe Accident in Nuclear Power Plant Using the Delphi Method (델파이 기법을 이용한 원전사고의 종합적인 경제적 리스크 평가)

  • Jang, Han-Ki;Kim, Joo-Yeon;Lee, Jai-Ki
    • Journal of Radiation Protection and Research
    • /
    • v.33 no.4
    • /
    • pp.127-134
    • /
    • 2008
  • The potential economic impact of a hypothetical severe accident at a nuclear power plant (Uljin units 3/4) was estimated by applying the Delphi method, which is based on expert judgements and opinions, in the process of quantifying uncertain factors. For the purpose of this study, it is assumed that the radioactive plume is directed inland. Since the economic risk can be divided into direct costs and indirect effects, and more uncertainties are involved in the latter, the direct costs were estimated first and the indirect effects were then estimated by applying a weighting factor to the direct cost. The Delphi method, however, is subject to a risk of distortion or discrimination among the variables because of human behavior patterns, so a mathematical approach based on Bayesian inference was employed in the data processing to improve the Delphi results, and a model for this data processing was developed. One-dimensional Monte Carlo analysis was applied to obtain a distribution of values of the weighting factor. The mean and median values of the weighting factor for the indirect effects were 2.59 and 2.08, respectively, which are higher than the value of 1.25 suggested by OECD/NEA. Factors such as the small territory and a public attitude sensitive to radiation could affect the judgement of the panel. The parameters of the model for estimating the direct costs were then classified as U- and V-types, and two-dimensional Monte Carlo analysis was applied to quantify the overall economic risk. The resulting median of the overall economic risk was about 3.9% of the gross domestic product (GDP) of Korea in 2006. When the cost of electricity loss, the largest direct cost, was not taken into account, the overall economic risk was reduced to 2.2% of GDP. This assessment can be used as a reference for justifying radiological emergency planning and preparedness.
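The cost-aggregation logic described above (indirect effects expressed as a weighting factor applied to the direct cost, with Monte Carlo sampling to propagate uncertainty) can be sketched as follows. This is a generic numpy illustration with invented distributions and a placeholder GDP figure, not the study's elicited Delphi data or its U/V-type two-dimensional analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Invented uncertainty models (placeholders for the Delphi-derived distributions).
direct_cost = rng.lognormal(mean=np.log(30.0), sigma=0.4, size=N)   # e.g. billion USD
weight = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=N)         # indirect/direct ratio

indirect_cost = weight * direct_cost
total_cost = direct_cost + indirect_cost

print(f"weighting factor: mean {weight.mean():.2f}, median {np.median(weight):.2f}")
print(f"overall economic risk: median {np.median(total_cost):.1f}, "
      f"95th percentile {np.percentile(total_cost, 95):.1f}")

# Expressing the result relative to a reference GDP value (placeholder figure).
GDP = 1000.0
print(f"median total cost as share of GDP: {np.median(total_cost) / GDP:.1%}")
```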

Change Detection of land-surface Environment in Gongju Areas Using Spatial Relationships between Land-surface Change and Geo-spatial Information (지표변화와 지리공간정보의 연관성 분석을 통한 공주지역 지표환경 변화 분석)

  • Jang Dong-Ho
    • Journal of the Korean Geographical Society
    • /
    • v.40 no.3 s.108
    • /
    • pp.296-309
    • /
    • 2005
  • In this study, we investigated future land-surface change and the relationships between land-surface change and geo-spatial information, using a Bayesian prediction model based on a likelihood ratio function, to analyse land-surface change in the Gongju area. We classified land-surface satellite images and then extracted the areas of change using post-classification comparison. Land-surface information related to land-surface change was constructed in a GIS environment, and a land-surface change prediction map was produced using the likelihood ratio function. The thematic maps that most strongly influence land-surface change in rural or urban areas are elevation, water system, population density, roads, population movement, number of establishments, and land price; those that most strongly influence land-surface change in forest areas are elevation, slope, population density, population movement, and land price. The change analysis shows that the growth of the old and new downtown is concentrated near the Gum River, and the downtown area is expected to spread around local roads and interchange areas. In agricultural areas, the small tributaries of the Gum River and areas along local roads connected to adjacent areas showed a high probability of change. Most forest areas are located in the southeast, which suggests why the extensive chestnut-tree cultivation complexes are located there and why the potential for forest damage is very high. Validation using a prediction rate curve showed that, within the top 10% of locations ranked by the probability of land-surface change, the prediction capability was 80% for urban areas, 55% for agricultural areas, and 40% for forest areas. The integration model is unsatisfactory for predicting forest areas in the study area; as future work, it is necessary to apply new thematic maps or prediction models. In conclusion, we expect this approach to serve as a useful basis for land-surface change studies in the coming years.
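The likelihood-ratio integration described above (combining evidence from several thematic maps into a posterior-style change-favourability score per cell) can be sketched in a simplified, generic form as below. This is a small numpy illustration with invented layers, ratios, and prior, not the study's GIS data or its exact integration scheme.

```python
import numpy as np

rng = np.random.default_rng(1)
ROWS, COLS = 100, 100

# Invented binary thematic layers (1 = favourable class present in the cell),
# standing in for reclassified maps such as elevation, roads, or land price.
layers = {name: rng.integers(0, 2, (ROWS, COLS)) for name in
          ["elevation", "roads", "population_density", "land_price"]}

# Invented likelihood ratios LR = P(class | change) / P(class | no change),
# which in practice would be estimated from the observed change areas.
likelihood_ratios = {"elevation": 1.8, "roads": 2.5,
                     "population_density": 1.4, "land_price": 2.0}

# Combine evidence additively on the log-odds scale (conditional independence assumed).
prior_odds = 0.05 / 0.95                      # placeholder prior odds of change
log_odds = np.full((ROWS, COLS), np.log(prior_odds))
for name, layer in layers.items():
    lr = likelihood_ratios[name]
    log_odds += np.where(layer == 1, np.log(lr), np.log(1.0 / lr))

posterior = 1.0 / (1.0 + np.exp(-log_odds))   # posterior probability of change
top10 = posterior >= np.quantile(posterior, 0.9)
print(f"cells in top 10% of change favourability: {top10.sum()}")
```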

Robo-Advisor Algorithm with Intelligent View Model (지능형 전망모형을 결합한 로보어드바이저 알고리즘)

  • Kim, Sunwoong
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.2
    • /
    • pp.39-55
    • /
    • 2019
  • Recently, banks and large financial institutions have introduced many Robo-Advisor products. A Robo-Advisor is an automated system that produces an optimal asset allocation portfolio for investors using financial engineering algorithms, without human intervention. Since its first introduction on Wall Street in 2008, the market has grown to 60 billion dollars and is expected to expand to 2,000 billion dollars by 2020. Because Robo-Advisor algorithms suggest an asset allocation to investors, mathematical or statistical asset allocation strategies are applied. The mean-variance optimization model developed by Markowitz is the typical asset allocation model. It is a simple but quite intuitive portfolio strategy: assets are allocated so as to minimize portfolio risk while maximizing expected portfolio return using optimization techniques. Despite its theoretical background, both academics and practitioners find that the standard mean-variance optimization portfolio is very sensitive to the expected returns calculated from past price data, and corner solutions, in which allocations are concentrated in only a few assets, are often found. The Black-Litterman optimization model overcomes these problems by starting from a neutral Capital Asset Pricing Model equilibrium point. Implied equilibrium returns for each asset are derived from the equilibrium market portfolio through reverse optimization. The Black-Litterman model uses a Bayesian approach to combine subjective views on the price forecasts of one or more assets with the implied equilibrium returns, resulting in new estimates of risk and expected returns. These new estimates can produce an optimal portfolio via the well-known Markowitz mean-variance optimization algorithm. If the investor does not have any views on the asset classes, the Black-Litterman optimization model produces the same portfolio as the market portfolio. What if the subjective views are incorrect? Surveys of the performance of stocks recommended by securities analysts show very poor results, so incorrect views combined with implied equilibrium returns may produce very poor portfolio output for users of the Black-Litterman model. This paper suggests an objective investor views model based on Support Vector Machines (SVM), which have shown good performance in stock price forecasting. An SVM is a discriminative classifier defined by a separating hyperplane; linear, radial basis, and polynomial kernel functions are used to learn the hyperplanes. The input variables for the SVM are returns, standard deviations, Stochastics %K, and the price parity degree for each asset class. The SVM outputs expected stock price movements and their probabilities, which are used as input variables in the intelligent views model. The stock price movements are categorized into three phases: down, neutral, and up. The expected stock returns make up the P matrix, and their probability results are used in the Q matrix. The implied equilibrium returns vector is combined with the intelligent views matrix, resulting in the Black-Litterman optimal portfolio. For comparison, the Markowitz mean-variance optimization model and a risk parity model are used, and the value-weighted and equal-weighted market portfolios are used as benchmark indexes. We collect eight KOSPI 200 sector indexes from January 2008 to December 2018, comprising 132 monthly index values. The training period is from 2008 to 2015 and the testing period is from 2016 to 2018.
Our suggested intelligent views model, combined with the implied equilibrium returns, produced the optimal Black-Litterman portfolio. In the out-of-sample period, this portfolio showed better performance than the well-known Markowitz mean-variance optimization portfolio, the risk parity portfolio, and the market portfolio. The total return of the three-year Black-Litterman portfolio is 6.4%, the highest value, and the maximum drawdown is -20.8%, the lowest value. The Sharpe ratio, which measures the return-to-risk ratio, is also the highest at 0.17. Overall, our suggested views model shows the possibility of replacing subjective analysts' views with an objective view model for practitioners applying Robo-Advisor asset allocation algorithms in real trading.
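The Black-Litterman combination step described above (blending implied equilibrium returns with a set of views into posterior expected returns) can be written compactly. The sketch below uses numpy with small invented inputs for the covariance matrix, market weights, and a single view; the SVM view-generation stage is not included.

```python
import numpy as np

# Invented inputs for three asset classes.
Sigma = np.array([[0.040, 0.010, 0.005],
                  [0.010, 0.030, 0.008],
                  [0.005, 0.008, 0.020]])      # covariance of returns
w_mkt = np.array([0.5, 0.3, 0.2])              # market-cap weights
delta, tau = 2.5, 0.05                         # risk aversion, uncertainty scaling

# Reverse optimization: implied equilibrium excess returns pi = delta * Sigma * w.
pi = delta * Sigma @ w_mkt

# One illustrative view: asset 1 will outperform asset 2 by 2%.
P = np.array([[1.0, -1.0, 0.0]])
Q = np.array([0.02])
Omega = np.diag(np.diag(P @ (tau * Sigma) @ P.T))   # view uncertainty

# Posterior (Black-Litterman) expected returns.
inv_tau_Sigma = np.linalg.inv(tau * Sigma)
middle = np.linalg.inv(inv_tau_Sigma + P.T @ np.linalg.inv(Omega) @ P)
mu_bl = middle @ (inv_tau_Sigma @ pi + P.T @ np.linalg.inv(Omega) @ Q)

# Unconstrained mean-variance weights based on the posterior returns.
w_bl = np.linalg.inv(delta * Sigma) @ mu_bl
print("implied returns:", np.round(pi, 4))
print("BL returns:     ", np.round(mu_bl, 4))
print("BL weights:     ", np.round(w_bl / w_bl.sum(), 3))
```

With no views supplied, the posterior returns collapse to the implied equilibrium returns, reproducing the market portfolio as noted in the abstract.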

Comparison of Dynamic Origin Destination Demand Estimation Models in Highway Network (고속도로 네트워크에서 동적기종점수요 추정기법 비교연구)

  • 이승재;조범철;김종형
    • Journal of Korean Society of Transportation
    • /
    • v.18 no.5
    • /
    • pp.83-97
    • /
    • 2000
  • Traffic management schemes based on traffic signal control and information provision can be effective when link-level data and trip-level data are used simultaneously in the analysis procedures. However, because trip-level data such as origin, destination, and departure time cannot be obtained directly from existing surveillance systems, they must be estimated from link-level data, which can be obtained easily. The objective of this study is therefore to develop models that estimate O-D demand in real time using only the link flows in a highway network. The methodological approaches in this study are the Kalman filter, the least-squares method, and the normalized least-squares method. The Kalman filter is developed on the basis of the Bayesian update, and the normalized least-squares method is developed on the basis of the least-squares method and a natural constraint equation. These three models were tested using two kinds of simulated data: one with two abrupt changes in traffic flow rates, and the other a 24-hour data set with three peak periods in a day. Among these models, the Kalman filter produced more accurate and adaptive results than the others. Therefore, this model could be used in traffic demand management and control, travel time forecasting, dynamic assignment, and so forth.
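A bare-bones version of the Kalman-filter update used for this kind of real-time estimation (a state of O-D flows updated from observed link counts through an assignment matrix) might look like the following. This numpy sketch uses an invented assignment matrix, demand values, and noise levels, not the network or data from the study.

```python
import numpy as np

rng = np.random.default_rng(3)
n_od, n_links = 4, 6

# Invented linear observation model: link counts = A @ od_flows + noise,
# where A maps each O-D flow onto the links it traverses.
A = np.array([[1, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [1, 0, 0, 1],
              [0, 1, 0, 1]], dtype=float)
Q = np.eye(n_od) * 25.0          # process noise (O-D demand drift)
R = np.eye(n_links) * 16.0       # measurement noise on link counts

x = np.full(n_od, 100.0)          # prior O-D demand estimate
P = np.eye(n_od) * 100.0          # prior covariance
true_od = np.array([120.0, 80.0, 150.0, 60.0])

for t in range(20):
    # Prediction step (random-walk demand dynamics).
    P = P + Q
    # Simulated link-count observation for this interval.
    z = A @ true_od + rng.normal(0, 4.0, n_links)
    # Measurement update (Bayesian update of the O-D state from link flows).
    S = A @ P @ A.T + R
    K = P @ A.T @ np.linalg.inv(S)
    x = x + K @ (z - A @ x)
    P = (np.eye(n_od) - K @ A) @ P

print("estimated O-D demand:", np.round(x, 1))
print("true O-D demand:     ", true_od)
```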
