• Title/Summary/Keyword: Models, statistical


The Statistical Model for Predicting Flood Frequency (홍수 빈도 예측을 위한 통계학적 모형)

  • 노재식;이길춘
    • Water for Future
    • /
    • v.25 no.2
    • /
    • pp.89-97
    • /
    • 1992
  • This study verifies the applicability of statistical models for predicting flood frequency at stage gaging stations in the Han River basin, selected on the basis of whether the flow is in a natural condition. The verification showed that these statistical flood frequency models are reasonable to apply in practice. The sampling variances of two flood frequency models were also compared to assess the statistical efficiency of the estimate of the T-year flood Q(T). For return periods greater than about T = 10 years, the annual exceedance series estimate of Q(T) has smaller sampling variance than the annual maximum series estimate. Over the same range of return periods, the partial duration series estimate of Q(T) has smaller sampling variance than the annual maximum series estimate only if the POT model contains at least 2N items (N: record length), in which case it estimates Q(T) more efficiently than the ANNMAX model.
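The annual-maximum fitting step behind a Q(T) estimate can be sketched with a minimal Gumbel (extreme value type I) fit. The method-of-moments estimator and the synthetic discharge record below are illustrative assumptions, not the paper's actual Han River data or fitting procedure.

```python
import math, random, statistics

def gumbel_qt(annual_maxima, T):
    # Method-of-moments Gumbel (EV1) fit of an annual maximum series,
    # returning the T-year flood quantile Q(T)
    mean = statistics.mean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = std * math.sqrt(6.0) / math.pi     # scale parameter
    mu = mean - 0.5772 * beta                 # location (Euler-Mascheroni const.)
    p = 1.0 - 1.0 / T                         # non-exceedance probability
    return mu - beta * math.log(-math.log(p))

random.seed(0)
maxima = [random.gauss(500.0, 120.0) for _ in range(30)]  # synthetic record, m^3/s
q100 = gumbel_qt(maxima, 100)   # 100-year flood estimate
```

The quantile grows with the return period, which is the property the sampling-variance comparison in the abstract is about.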


Non-destructive assessment of the three-point-bending strength of mortar beams using radial basis function neural networks

  • Alexandridis, Alex;Stavrakas, Ilias;Stergiopoulos, Charalampos;Hloupis, George;Ninos, Konstantinos;Triantis, Dimos
    • Computers and Concrete
    • /
    • v.16 no.6
    • /
    • pp.919-932
    • /
    • 2015
  • This paper presents a new method for assessing the three-point-bending (3PB) strength of mortar beams in a non-destructive manner, based on neural network (NN) models. The models use the radial basis function (RBF) architecture, and the fuzzy means algorithm is employed for training in order to boost prediction accuracy. Data for training the models were collected in a series of experiments in which cement mortar beams were subjected to various bending mechanical loads and the resulting pressure stimulated currents (PSCs) were recorded. The input variables to the NN models were then calculated by describing the PSC relaxation process through a generalization of Boltzmann-Gibbs statistical physics, known as non-extensive statistical physics (NESP). The NN predictions were evaluated using k-fold cross-validation and new data kept independent from training; the results show that the proposed method can successfully form the basis of a non-destructive tool for assessing bending strength. A comparison with a different NN architecture confirms the superiority of the proposed approach.
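As an illustration of the RBF architecture the abstract builds on, here is a minimal Gaussian-RBF interpolation sketch. The one-center-per-sample layout, kernel width, and toy data are assumptions for demonstration; the paper's fuzzy means training algorithm is not reproduced.

```python
import numpy as np

def rbf_fit(X, y, sigma=1.0):
    # Exact Gaussian-RBF interpolation with one center per training point;
    # a tiny ridge term keeps the kernel matrix well conditioned
    G = np.exp(-((X[:, None] - X[None, :]) ** 2) / (2.0 * sigma ** 2))
    return np.linalg.solve(G + 1e-8 * np.eye(len(X)), y)

def rbf_predict(centers, w, x, sigma=1.0):
    # Weighted sum of Gaussian bumps centered at the training inputs
    g = np.exp(-((x - centers) ** 2) / (2.0 * sigma ** 2))
    return float(g @ w)

X = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.0, 1.0, 4.0, 9.0])   # y = x^2 sampled at the centers
w = rbf_fit(X, y)
pred = rbf_predict(X, w, 2.0)        # should reproduce the training value
```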

Volatility Analysis for Multivariate Time Series via Dimension Reduction (차원축소를 통한 다변량 시계열의 변동성 분석 및 응용)

  • Song, Eu-Gine;Choi, Moon-Sun;Hwang, S.Y.
    • Communications for Statistical Applications and Methods
    • /
    • v.15 no.6
    • /
    • pp.825-835
    • /
    • 2008
  • Multivariate GARCH (MGARCH) has been useful in financial studies and econometrics for modeling volatilities and correlations between components of multivariate time series. An obvious drawback is that the number of parameters increases rapidly with the number of variables involved. This paper tries to resolve the problem using dimension reduction techniques. We briefly review both factor models for dimension reduction and MGARCH models including EWMA (exponentially weighted moving-average), DVEC (diagonal VEC), BEKK and CCC (constant conditional correlation) models. We create meaningful portfolios obtained after reducing dimension through statistical factor models and fundamental factor models, and in turn these portfolios are applied to MGARCH. In addition, we compare portfolios by assessing MSE, MAD (mean absolute deviation) and VaR (Value at Risk). Various financial time series are analyzed for illustration.
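Of the MGARCH variants listed, EWMA is the simplest to sketch: a single variance series is smoothed by an exponential recursion. The decay value lam = 0.94 below is the common RiskMetrics choice, assumed here for illustration.

```python
def ewma_variance(returns, lam=0.94):
    # EWMA recursion: var_t = lam * var_{t-1} + (1 - lam) * r_{t-1}^2,
    # seeded here with the squared first return
    var = returns[0] ** 2
    for r in returns[1:]:
        var = lam * var + (1.0 - lam) * r ** 2
    return var

vol = ewma_variance([0.01, -0.02, 0.015]) ** 0.5   # conditional volatility
```

In the multivariate case the same recursion is applied to each covariance entry, which is what keeps the EWMA parameter count so low compared with BEKK or DVEC.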

Development of Statistical Model for Line Width Estimation in Laser Micro Material Processing Using Optical Sensor (레이저 미세 가공 공정에서 광센서를 이용한 선폭 예측을 위한 통계적 모델의 개발)

  • Park Young Whan;Rhee Sehun
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.22 no.7 s.172
    • /
    • pp.27-37
    • /
    • 2005
  • Direct writing on the silicon wafer surface is used to reduce chip size, following the miniaturization trend in electronic circuits. To improve productivity and efficiency, real-time quality estimation is very important in each semiconductor process. In laser marking, marking quality is determined by readability, which depends on the surface contrast, the line width, and the melting depth. Many researchers have sought theoretical and numerical estimation models for groove geometry; however, these models are of limited use in real systems. In this study, a system for estimating the line width during laser marking was proposed based on process monitoring. The light intensity emitted by the plasma produced when the laser irradiates the silicon wafer was measured using an optical sensor. Because laser marking is too fast to measure with an external sensor, a coaxial monitoring system was built. The correlation between the acquired signals and the line width under varying laser power was analyzed. Statistical regression models for estimating the line width were then developed, and their estimation performance was shown to be excellent.
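The regression step can be illustrated with a minimal ordinary-least-squares fit of line width against a sensor feature. The single-predictor form and the sample values below are illustrative assumptions, not the paper's actual model or measurements.

```python
def fit_line(signal, width):
    # Ordinary least squares for a single predictor: width = a + b * signal
    n = len(signal)
    mx = sum(signal) / n
    my = sum(width) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(signal, width))
         / sum((x - mx) ** 2 for x in signal))
    a = my - b * mx
    return a, b

# Hypothetical optical-sensor intensities and measured line widths (um)
a, b = fit_line([0.8, 1.1, 1.5, 1.9], [21.0, 24.5, 29.0, 33.8])
```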

Linear Mixed Models in Genetic Epidemiological Studies and Applications (선형혼합모형의 역할 및 활용사례: 유전역학 분석을 중심으로)

  • Lim, Jeongmin;Won, Sungho
    • The Korean Journal of Applied Statistics
    • /
    • v.28 no.2
    • /
    • pp.295-308
    • /
    • 2015
  • We have seen substantial improvements in, and cost reductions for, genotyping that enable genetic epidemiological studies with large-scale genetic data. Genome-wide association studies have identified more than ten thousand causal variants. Many statistical methods based on linear mixed models have been developed for goals such as estimating heritability and identifying disease susceptibility loci. Empirical results also repeatedly stress the importance of linear mixed models. We therefore review the statistical methods related to linear mixed models and illustrate the meaning of their estimates.
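A core quantity behind heritability estimation with mixed models is the split of total variance into between-group and within-group components. The moment-based sketch below for a balanced one-way random-effects layout is a simplified assumption; real linear mixed models use REML with a genetic relationship matrix, which is not reproduced here, and the data are hypothetical.

```python
import statistics

def variance_components(groups):
    # Moment-based decomposition for a balanced one-way random-effects layout:
    # returns (between-group variance, within-group variance)
    n = len(groups[0])                                               # obs per group
    group_means = [statistics.mean(g) for g in groups]
    msw = statistics.mean([statistics.variance(g) for g in groups])  # within
    msb = n * statistics.variance(group_means)                       # between
    return max((msb - msw) / n, 0.0), msw

vb, vw = variance_components([[4.1, 3.9], [6.2, 5.8], [5.0, 5.2]])
h2 = vb / (vb + vw)   # heritability-style variance ratio
```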

Comparison of Statistical Models for Analysis of Fatigue Life of Cable (케이블 피로 수명 해석 통계 모델 비교)

  • Suh, Jeong-In;Yoo, Sung-Won
    • Journal of the Korea Institute for Structural Maintenance and Inspection
    • /
    • v.7 no.4
    • /
    • pp.129-137
    • /
    • 2003
  • Cables in cable-supported structures are long, so it is reasonable to apply models different from those used for general steel elements. This paper compares statistical models against existing cable fatigue data, after deriving the cdf (cumulative distribution function) by modifying the log-normal distribution and the existing extremal distributions to include the length effect. The paper presents an appropriate model for analyzing and assessing the fatigue behavior of cables used in actual structures.
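One standard way a length effect enters a fatigue-strength cdf is weakest-link scaling, sketched below with a Weibull parent distribution. Both the weakest-link form and the Weibull choice are illustrative assumptions; the paper itself modifies log-normal and extremal distributions.

```python
import math

def weibull_cdf(s, scale, shape):
    # Failure probability of a unit-length reference segment at stress range s
    return 1.0 - math.exp(-((s / scale) ** shape))

def length_effect_cdf(s, scale, shape, L, L0=1.0):
    # Weakest-link scaling: a cable of length L fails when any of its L/L0
    # reference segments fails, so F_L(s) = 1 - (1 - F(s))**(L/L0)
    return 1.0 - (1.0 - weibull_cdf(s, scale, shape)) ** (L / L0)

p_short = length_effect_cdf(100.0, 200.0, 3.0, L=1.0)
p_long = length_effect_cdf(100.0, 200.0, 3.0, L=10.0)  # longer cable, weaker
```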

Maximum likelihood estimation of stochastic volatility models with leverage effect and fat-tailed distribution using hidden Markov model approximation (두꺼운 꼬리 분포와 레버리지효과를 포함하는 확률변동성모형에 대한 최우추정: HMM근사를 이용한 최우추정)

  • Kim, TaeHyung;Park, JeongMin
    • The Korean Journal of Applied Statistics
    • /
    • v.35 no.4
    • /
    • pp.501-515
    • /
    • 2022
  • Despite stylized statistical features of financial returns such as fat-tailed distributions and the leverage effect, no stochastic volatility models that explicitly capture these features have been presented in the existing frequentist literature. We propose an approximate parameterization of stochastic volatility models that explicitly captures the fat-tailed distribution and leverage effect of financial returns, together with maximum likelihood estimation of the model using the hidden Markov model approximation of Langrock et al. (2012) in a frequentist framework. Through extensive simulation experiments and an empirical analysis, we present statistical evidence validating the efficacy and accuracy of the proposed parameterization.
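The HMM approximation hinges on discretizing the continuous AR(1) log-volatility process into a finite Markov chain. A minimal sketch of the resulting transition matrix is below; the coarse grid, phi, and sigma_eta values are arbitrary illustrative assumptions, and real applications use many more states.

```python
import math

def norm_cdf(x):
    # Standard normal cdf via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def hmm_transition(edges, midpoints, phi, sigma_eta):
    # Discretize the AR(1) log-volatility h_t = phi * h_{t-1} + eta_t,
    # eta_t ~ N(0, sigma_eta^2), into a finite Markov chain:
    # entry (i, j) is P(h_t lands in cell j | h_{t-1} = midpoints[i])
    P = []
    for h in midpoints:
        mean = phi * h
        row = [norm_cdf((edges[j + 1] - mean) / sigma_eta) -
               norm_cdf((edges[j] - mean) / sigma_eta)
               for j in range(len(midpoints))]
        P.append(row)
    return P

edges = [-50.0, -1.0, 0.0, 1.0, 50.0]   # wide outer edges capture the tails
midpoints = [-1.5, -0.5, 0.5, 1.5]
P = hmm_transition(edges, midpoints, phi=0.95, sigma_eta=0.3)
```

Once the chain is built, the likelihood of the returns can be evaluated with the standard HMM forward algorithm, which is what makes frequentist maximum likelihood tractable.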

Prediction of Future Sea Surface Temperature around the Korean Peninsular based on Statistical Downscaling (통계적 축소법을 이용한 한반도 인근해역의 미래 표층수온 추정)

  • Ham, Hee-Jung;Kim, Sang-Su;Yoon, Woo-Seok
    • Journal of Industrial Technology
    • /
    • v.31 no.B
    • /
    • pp.107-112
    • /
    • 2011
  • Recently, climate change due to global warming has become an important issue worldwide, and the damage it causes adversely affects human life. Changes in sea surface temperature (SST) are associated with natural disasters such as typhoons and El Niño. We therefore predicted daily future SST using a statistical downscaling method and the CGCM 3.1 A1B scenario. Nine points around the Korean Peninsula were selected for prediction, and a regression model was built using multiple linear regression. CGCM 3.1 was simulated with the regression model, and by comparing probability density functions, box plots, and summary statistics to evaluate the suitability of the regression models, it was validated that the models were built properly.
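The downscaling regression can be sketched as a multiple linear regression from large-scale predictors to local SST. The predictor layout and toy numbers below are illustrative assumptions, not CGCM 3.1 output.

```python
import numpy as np

def downscale_fit(predictors, sst):
    # Multiple linear regression (least squares) mapping large-scale
    # model predictors to local sea surface temperature
    X = np.column_stack([np.ones(len(predictors)), predictors])
    coef, *_ = np.linalg.lstsq(X, sst, rcond=None)
    return coef

def downscale_predict(coef, predictors):
    X = np.column_stack([np.ones(len(predictors)), predictors])
    return X @ coef

# Hypothetical coarse-grid predictor values and observed local SST (deg C)
predictors = [[1.0, 2.0], [2.0, 3.0], [3.0, 5.0]]
sst = np.array([4.0, 6.5, 9.5])
coef = downscale_fit(predictors, sst)
```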


Estimation methods and interpretation of competing risk regression models (경쟁 위험 회귀 모형의 이해와 추정 방법)

  • Kim, Mijeong
    • The Korean Journal of Applied Statistics
    • /
    • v.29 no.7
    • /
    • pp.1231-1246
    • /
    • 2016
  • The cause-specific hazard model (Prentice et al., 1978) and the subdistribution hazard model (Fine and Gray, 1999) are most commonly used for right-censored survival data with competing risks. Other models for survival data with competing risks have since been introduced; however, they have not been widely used because they cannot provide reliable statistical estimation methods or are overly difficult to compute. We introduce simple and reliable competing risk regression models that have recently been proposed and compare their methodologies. We show how to use SAS and R for data with competing risks. In addition, we analyze survival data with two competing risks using five different models.
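The quantity all of these competing-risk models target is the cumulative incidence function (CIF). A minimal empirical sketch for fully observed data is below; real analyses, as in the paper, must handle right censoring (e.g. via the Aalen-Johansen estimator), which is not reproduced here, and the failure data are hypothetical.

```python
def cumulative_incidence(times, causes, cause, t):
    # Empirical cumulative incidence function for one competing cause in
    # fully observed (uncensored) data: the proportion of subjects that
    # failed from `cause` by time t
    n = len(times)
    return sum(1 for ti, ci in zip(times, causes) if ci == cause and ti <= t) / n

# Hypothetical failure times with two competing causes (coded 1 and 2)
cif1 = cumulative_incidence([1.0, 2.0, 3.0, 4.0], [1, 2, 1, 2], cause=1, t=3.0)
```

Because each subject can fail from only one cause, the cause-specific CIFs sum to the overall failure probability, which is what distinguishes them from naive one-minus-Kaplan-Meier curves.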

Design-oriented strength and strain models for GFRP-wrapped concrete

  • Messaoud, Houssem;Kassoul, Amar;Bougara, Abdelkader
    • Computers and Concrete
    • /
    • v.26 no.3
    • /
    • pp.293-307
    • /
    • 2020
  • The aim of this paper is to develop design-oriented models for predicting the ultimate strength and ultimate axial strain of concrete confined with glass fiber-reinforced polymer (GFRP) wraps. Twenty of the most used and recent design-oriented models for predicting the strength and strain of GFRP-confined concrete in circular sections are selected and evaluated based on a database of 163 test results for concrete cylinders confined with GFRP wraps and subjected to uniaxial compression. The evaluation uses three statistical indices, namely the coefficient of determination (R2), the root mean square error (RMSE), and the average absolute error (AAE). Based on this study, new strength and strain models for GFRP-wrapped concrete are developed using regression analysis. The results show that the proposed models exhibit better performance and provide more accurate predictions than the existing models.
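The three evaluation indices can be sketched directly. Note that AAE definitions vary across papers; the relative form below (mean of |error| / observed value) is an assumption, and the sample strengths are hypothetical.

```python
import math

def model_indices(predicted, observed):
    # Evaluation indices used to rank confinement models: R^2, RMSE, and
    # average absolute error (here the mean of |error| / observed value)
    n = len(observed)
    mean_obs = sum(observed) / n
    ss_res = sum((o - p) ** 2 for p, o in zip(predicted, observed))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    r2 = 1.0 - ss_res / ss_tot
    rmse = math.sqrt(ss_res / n)
    aae = sum(abs(o - p) / o for p, o in zip(predicted, observed)) / n
    return r2, rmse, aae

# Hypothetical predicted vs experimental confined strengths (MPa)
r2, rmse, aae = model_indices([52.0, 61.0, 70.5], [50.0, 63.0, 71.0])
```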