• Title/Summary/Keyword: kernel estimation


Probabilistic Power Flow Studies Incorporating Correlations of PV Generation for Distribution Networks

  • Ren, Zhouyang; Yan, Wei; Zhao, Xia; Zhao, Xueqian; Yu, Juan
    • Journal of Electrical Engineering and Technology / v.9 no.2 / pp.461-470 / 2014
  • This paper presents a probabilistic power flow (PPF) analysis method for distribution networks that incorporates the randomness and correlation of photovoltaic (PV) generation. Based on multivariate kernel density estimation theory, a probabilistic model of PV generation is proposed without any assumption of a theoretical parametric distribution, so it can accurately capture not only the randomness but also the correlation of PV resources at adjacent locations. The PPF method combines the proposed PV model with a Monte Carlo technique to evaluate the influence of the randomness and correlation of PV generation on the performance of distribution networks. Historical power output data from three neighboring PV generators in Oregon, USA, and 34-bus/69-bus radial distribution networks are used to demonstrate the correctness, effectiveness, and applicability of the proposed PV model and PPF method.
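
As a rough illustration of the modelling idea in this abstract, the sketch below fits a multivariate Gaussian kernel density estimate to the joint historical output of several PV sites and resamples correlated scenarios for a Monte Carlo power flow. The data array `historical_pv` and the scenario count are illustrative placeholders, and the power-flow solver is only stubbed out; this is not the authors' implementation.

```python
# Minimal sketch: model the joint distribution of neighboring PV outputs with a
# multivariate kernel density estimate and draw correlated Monte Carlo samples.
# "historical_pv" is an illustrative placeholder (rows = hours, columns = sites);
# the power-flow evaluation itself is only stubbed out.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Placeholder for historical output of three neighboring PV generators (kW).
historical_pv = rng.lognormal(mean=3.0, sigma=0.4, size=(1000, 3))

# gaussian_kde expects variables in rows, observations in columns.
kde = gaussian_kde(historical_pv.T)

# Draw correlated PV scenarios for a Monte Carlo probabilistic power flow.
n_scenarios = 5000
pv_scenarios = kde.resample(n_scenarios, seed=1).T   # shape (n_scenarios, 3)

for pv_injection in pv_scenarios:
    # run_power_flow(pv_injection) would solve the distribution network here.
    pass

# The sample correlation of the scenarios should mirror the historical data.
print(np.corrcoef(pv_scenarios, rowvar=False).round(2))
```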

Uncertainty analysis of containment dose rate for core damage assessment in nuclear power plants

  • Wu, Guohua; Tong, Jiejuan; Gao, Yan; Zhang, Liguo; Zhao, Yunfei
    • Nuclear Engineering and Technology / v.50 no.5 / pp.673-682 / 2018
  • One of the most widely used methods to estimate core damage during a nuclear power plant accident is containment radiation measurement. The evolution of severe accidents is extremely complex, leading to uncertainty in the containment dose rate (CDR); it is therefore difficult to determine core damage accurately. This study proposes an uncertainty analysis of the CDR for core damage assessment. First, based on source term estimation, the Monte Carlo (MC) and point-kernel integration methods were used to estimate the probability density function of the CDR under different extents of core damage in accident scenarios with late containment failure. Second, the results were verified by comparing the two methods. The point-kernel integration results were more dispersed than the MC results, so the MC method was used for both quantitative and qualitative analyses. The quantitative analysis indicated a linear relationship, rather than the expected proportional relationship, between the CDR and the core damage fraction. The CDR distribution obeyed a log-normal distribution in accidents with a small containment break, but not in accidents with a large containment break. A possible application of this analysis is a real-time core damage estimation program based on the CDR.
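
A minimal, hypothetical check in the spirit of the distributional finding above: fit a log-normal distribution to a sample of containment dose rates and test the goodness of fit. The samples below are synthetic stand-ins, not outputs of the Monte Carlo or point-kernel calculations described in the abstract.

```python
# Illustrative sketch only: test whether a sample of containment dose rates is
# consistent with a log-normal distribution, as reported for accidents with a
# small containment break. The samples are synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
cdr_samples = rng.lognormal(mean=2.0, sigma=0.5, size=2000)   # synthetic dose rates

# Fit a log-normal (fix loc=0 so the fit is a plain two-parameter log-normal).
shape, loc, scale = stats.lognorm.fit(cdr_samples, floc=0)

# Kolmogorov-Smirnov test of the fitted distribution against the sample.
ks_stat, p_value = stats.kstest(cdr_samples, 'lognorm', args=(shape, loc, scale))
print(f"KS statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")
```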

An efficient reliability analysis strategy for low failure probability problems

  • Cao, Runan; Sun, Zhili; Wang, Jian; Guo, Fanyi
    • Structural Engineering and Mechanics / v.78 no.2 / pp.209-218 / 2021
  • Reliability analysis in engineering faces two major challenges. First, to ensure the accuracy of simulation results, mechanical products are usually defined implicitly by complex numerical models whose evaluation is time-consuming. Second, mechanical products are, fortunately, designed with a large safety margin, which leads to a low failure probability. This paper proposes an efficient, high-precision adaptive active learning algorithm based on the Kriging surrogate model for problems with low failure probability and time-consuming numerical models. To handle problems with multiple failure regions, adaptive kernel density estimation is introduced and improved. Meanwhile, a new criterion for selecting points based on the current Kriging model is proposed to improve computational efficiency. The criterion for choosing the best sampling points considers not only the probability that the Kriging model misjudges the sign of the response at a point but also the distribution information at that point. To prevent the selected training points from lying too close together, the correlation between training points is limited, avoiding information redundancy and improving the computational efficiency of the algorithm. Finally, the efficiency and accuracy of the proposed method are verified against other algorithms through two academic examples and one engineering application.
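
The sketch below shows the generic Kriging active-learning loop (in the style of AK-MCS with the classical U learning function) that this line of work builds on; the authors' improved selection criterion, which also uses kernel-density and spacing information, is not reproduced. The toy limit-state function and all tuning values are assumptions for illustration.

```python
# Compact sketch of a Kriging-based active-learning reliability loop in the
# spirit of AK-MCS, using the classical U learning function. The paper's own
# selection criterion is not reproduced; this only shows the surrogate idea.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def limit_state(x):                       # toy performance function, g < 0 = failure
    return 7.0 - x[:, 0]**2 - x[:, 1]

rng = np.random.default_rng(0)
mc_pool = rng.normal(size=(20000, 2))     # Monte Carlo population
train_idx = rng.choice(len(mc_pool), 12, replace=False)
X, y = mc_pool[train_idx], limit_state(mc_pool[train_idx])

for _ in range(30):                       # enrichment iterations
    gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True).fit(X, y)
    mu, sigma = gp.predict(mc_pool, return_std=True)
    U = np.abs(mu) / np.maximum(sigma, 1e-12)   # sign-misjudgment risk measure
    if U.min() >= 2.0:                    # usual AK-MCS stopping rule
        break
    best = int(np.argmin(U))              # most "dangerous" candidate point
    X = np.vstack([X, mc_pool[best]])
    y = np.append(y, limit_state(mc_pool[best:best + 1]))

pf = np.mean(gp.predict(mc_pool) < 0.0)   # failure probability from the surrogate
print(f"estimated Pf ~ {pf:.4f}")
```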

Estimation of Hazard Function and its Associated Factors in Gastric Cancer Patients using Wavelet and Kernel Smoothing Methods

  • Ahmadi, Azadeh; Roudbari, Masoud; Gohari, Mahmood Reza; Hosseini, Bistoon
    • Asian Pacific Journal of Cancer Prevention / v.13 no.11 / pp.5643-5646 / 2012
  • Background and Objectives: The increase in gastric cancer mortality rates in Iran and worldwide in recent years reveals the necessity of studies on this disease. Here, the hazard function for gastric cancer patients was estimated using wavelet and kernel methods, and some related factors were assessed. Materials and Methods: Ninety-five gastric cancer patients treated in Fayazbakhsh Hospital between 1996 and 2003 were studied. The effects of patient age, gender, stage of disease, and treatment method on patient lifetime were assessed. For data analysis, survival analysis with the wavelet method and the log-rank test were carried out in R. Results: Nearly 25.3% of the patients were female. Fourteen percent received surgical treatment and the rest were treated without surgery. Three fourths died and the rest were censored. Almost 9.5% of the patients were in early stages of the disease, 53.7% in the locally advanced stage, and 36.8% in the metastatic stage. Hazard function estimation with the wavelet method showed a significant difference across stages of disease (P<0.001) and did not reveal any significant difference for age, gender, or treatment method. Conclusion: Only the stage of disease affected the hazard, and most patients were diagnosed in late stages of disease, which is probably one of the main reasons for the high hazard rate and low survival. Public education about the symptoms of the disease through the media, together with regular tests and screening for early diagnosis, therefore seems necessary.
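
A small sketch of the kernel-smoothing side of hazard estimation, assuming synthetic survival times: the Nelson-Aalen increments are smoothed with an Epanechnikov kernel. The wavelet estimator emphasized in the paper is not reproduced here, and all data values are placeholders.

```python
# Minimal sketch of kernel smoothing for a hazard function: smooth the
# Nelson-Aalen increments d_i / Y(t_i) with an Epanechnikov kernel.
# The survival times below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
time = rng.exponential(scale=24.0, size=95)         # months, synthetic
event = rng.uniform(size=95) < 0.75                 # ~75% deaths, rest censored

order = np.argsort(time)
time, event = time[order], event[order]
at_risk = np.arange(len(time), 0, -1)               # Y(t_i) for the sorted times
increments = event / at_risk                        # Nelson-Aalen jumps

def epanechnikov(u):
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)

def hazard(t_grid, bandwidth=6.0):
    u = (t_grid[:, None] - time[None, :]) / bandwidth
    return (epanechnikov(u) * increments[None, :]).sum(axis=1) / bandwidth

grid = np.linspace(1.0, 60.0, 60)
print(np.round(hazard(grid)[:5], 4))
```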

A Study of the Valid Model(Kernel Regression) of Main Feed-Water for Turbine Cycle (주급수 유량의 유효 모델(커널 회귀)에 대한 연구)

  • Yang, Hac-Jin; Kim, Seong-Kun
    • Journal of the Korea Academia-Industrial cooperation Society / v.20 no.12 / pp.663-670 / 2019
  • Corrective thermal performance analysis is required for the turbine cycles of power plants to determine the performance status of the cycle and improve the economics of plant operation. We developed a sectional classification method for the main feed-water flow to make precise corrections for the performance analysis based on the Performance Test Code (PTC) of the American Society of Mechanical Engineers (ASME). The method estimates turbine cycle performance within each classified section, where the classification is based on feature identification of the correlation status of the main feed-water flow measurements. We also developed predictive algorithms for the corrected main feed-water flow using a Kernel Regression (KR) model for each classified feature area, and compared them with estimation by an Artificial Neural Network (ANN). The feature classification and predictive model provide a more practical and reliable approach to the corrective thermal performance analysis of a turbine cycle.
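
For readers unfamiliar with kernel regression, the following is a minimal Nadaraya-Watson sketch of the kind of predictor that might be fit within one classified feature section. The load and feed-water arrays are synthetic placeholders; the ASME PTC correction procedure and the paper's section classification are not shown.

```python
# Illustrative Nadaraya-Watson (kernel regression) predictor for a single
# classified feature section. The data below are synthetic placeholders.
import numpy as np

def nw_predict(x_query, x_train, y_train, bandwidth):
    """Nadaraya-Watson estimate with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_query[:, None] - x_train[None, :]) / bandwidth) ** 2)
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(0)
load = rng.uniform(60.0, 100.0, size=300)                  # plant load (%), synthetic
feedwater = 4.5 * load + rng.normal(0.0, 8.0, size=300)    # measured flow, synthetic

grid = np.linspace(60.0, 100.0, 5)
print(np.round(nw_predict(grid, load, feedwater, bandwidth=3.0), 1))
```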

Variance function estimation with LS-SVM for replicated data

  • Shim, Joo-Yong; Park, Hye-Jung; Seok, Kyung-Ha
    • Journal of the Korean Data and Information Science Society / v.20 no.5 / pp.925-931 / 2009
  • In this paper we propose a variance function estimation method for replicated data based on averages of squared residuals obtained from the mean function estimated by the least squares support vector machine (LS-SVM). The Newton-Raphson method is used to obtain the parameter vector associated with the variance function estimate. Furthermore, cross-validation functions are introduced to select the hyper-parameters that affect the performance of the proposed estimation method. Experimental results are presented to illustrate the performance of the proposed procedure.
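
A rough two-stage sketch of the idea described here, with kernel ridge regression standing in for the LS-SVM mean estimate (an assumption, not the paper's exact formulation): average the squared residuals within each replicated design point, then smooth those averages to obtain a variance function. The Newton-Raphson step and the cross-validation functions of the paper are not reproduced.

```python
# Two-stage variance function sketch for replicated data. Kernel ridge
# regression is used as a stand-in for the LS-SVM; all data are synthetic.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
x_design = np.linspace(0.0, 1.0, 20)
x = np.repeat(x_design, 10)                       # 10 replicates per design point
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1 + 0.4 * x)   # heteroscedastic noise

# Stage 1: mean function estimate and squared residuals.
mean_fit = KernelRidge(kernel="rbf", alpha=0.1, gamma=10.0).fit(x[:, None], y)
resid2 = (y - mean_fit.predict(x[:, None])) ** 2

# Stage 2: average squared residuals per design point, then smooth the averages.
avg_resid2 = resid2.reshape(20, 10).mean(axis=1)
var_fit = KernelRidge(kernel="rbf", alpha=0.1, gamma=10.0).fit(
    x_design[:, None], avg_resid2)

print(np.round(var_fit.predict(x_design[:, None])[:5], 3))
```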


Nonparametric Estimation of Univariate Binary Regression Function

  • Jung, Shin Ae; Kang, Kee-Hoon
    • International Journal of Advanced Culture Technology / v.10 no.1 / pp.236-241 / 2022
  • We consider methods for estimating a binary regression function using nonparametric kernel estimation when there is only one covariate. For this, the Nadaraya-Watson estimation method is used with single and double bandwidths. For choosing a proper amount of smoothing, the cross-validation and plug-in methods are compared. For the real data case study, the German credit and heart disease data sets are used. We examine whether nonparametric estimation of the binary regression function succeeds with smoothing parameters chosen by these two approaches, and compare their performance.
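
A brief sketch of bandwidth choice by leave-one-out cross-validation for a Nadaraya-Watson estimate of P(Y=1 | X=x) with one covariate, on synthetic data. The plug-in selector and the double-bandwidth variant compared in the paper are not implemented here.

```python
# Bandwidth selection by leave-one-out cross-validation for a Nadaraya-Watson
# estimate of a binary regression function with a single covariate.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, size=400)
y = (rng.uniform(size=400) < 1.0 / (1.0 + np.exp(-2.0 * x))).astype(float)

def nw_loo_cv(bandwidth):
    """Leave-one-out squared prediction error of the NW estimator."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    np.fill_diagonal(w, 0.0)                     # drop the point being predicted
    p_hat = (w @ y) / w.sum(axis=1)
    return np.mean((y - p_hat) ** 2)

bandwidths = np.linspace(0.05, 1.0, 20)
scores = [nw_loo_cv(h) for h in bandwidths]
print("CV-selected bandwidth:", round(bandwidths[int(np.argmin(scores))], 3))
```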

Stepwise Estimation for Multiple Non-Crossing Quantile Regression using Kernel Constraints (커널 제약식을 이용한 다중 비교차 분위수 함수의 순차적 추정법)

  • Bang, Sungwan; Jhun, Myoungshic; Cho, HyungJun
    • The Korean Journal of Applied Statistics / v.26 no.6 / pp.915-922 / 2013
  • Quantile regression can estimate multiple conditional quantile functions of the response and, as a result, provides comprehensive information about the relationship between the response and the predictors. However, when several conditional quantile functions are estimated separately, two or more estimated quantile functions may cross or overlap, violating the basic properties of quantiles. In this paper, we propose a new stepwise method to estimate multiple non-crossing quantile functions using constraints on the kernel coefficients. A simulation study is presented to demonstrate the satisfactory performance of the proposed method.
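
The toy example below only illustrates the crossing problem this paper addresses and a much cruder remedy, post-hoc monotone rearrangement (sorting the fitted quantile values at each x); it is not the authors' stepwise kernel-constrained estimator. All data and model choices are assumptions.

```python
# Illustration of quantile crossing from independently fitted quantile curves,
# followed by simple monotone rearrangement. This is NOT the paper's method,
# which enforces non-crossing through constraints on kernel coefficients.
import numpy as np
from sklearn.linear_model import QuantileRegressor
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=200)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2 + 0.3 * x)

features = PolynomialFeatures(degree=5, include_bias=False)
X = features.fit_transform(x[:, None])

taus = [0.1, 0.5, 0.9]
grid = np.linspace(0.0, 1.0, 50)
G = features.transform(grid[:, None])
preds = np.column_stack([
    QuantileRegressor(quantile=t, alpha=0.0, solver="highs").fit(X, y).predict(G)
    for t in taus
])

crossings = np.sum(np.diff(preds, axis=1) < 0)     # how often the curves cross
preds_sorted = np.sort(preds, axis=1)              # rearranged, non-crossing
print("crossing violations before/after:",
      crossings, np.sum(np.diff(preds_sorted, axis=1) < 0))
```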

Bandwidth selection for discontinuity point estimation in density (확률밀도함수의 불연속점 추정을 위한 띠폭 선택)

  • Huh, Jib
    • Journal of the Korean Data and Information Science Society / v.23 no.1 / pp.79-87 / 2012
  • In the case that a probability density function has a discontinuity point, Huh (2002) estimated the location and jump size of the discontinuity based on the difference between right and left kernel density estimators using one-sided kernel functions. In this paper, we consider a cross-validation criterion, constructed from the right and left maximum likelihood cross-validations, for selecting the bandwidth used to estimate the location and jump size of the discontinuity point. This method is motivated by the one-sided cross-validation of Hart and Yi (1998). The finite sample performance is illustrated with a simulated example.
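
A sketch, under assumed data and a fixed bandwidth, of the left/right one-sided kernel density difference underlying the discontinuity-point estimator discussed here. The maximum likelihood cross-validation bandwidth selector proposed in the paper is not implemented.

```python
# Locate a density discontinuity from the difference between right- and
# left-sided kernel density estimates. Bandwidth fixed for illustration.
import numpy as np

rng = np.random.default_rng(0)
# Density with a jump at x = 0: 0.25 mass on (-1, 0) and 0.75 mass on (0, 1).
data = np.where(rng.uniform(size=2000) < 0.25,
                rng.uniform(-1.0, 0.0, size=2000),
                rng.uniform(0.0, 1.0, size=2000))

def one_sided_kde(grid, side, bandwidth=0.1):
    """One-sided Epanechnikov KDE using only data on one side of each grid point."""
    u = (grid[:, None] - data[None, :]) / bandwidth
    mask = (u >= -1.0) & (u <= 0.0) if side == "right" else (u >= 0.0) & (u <= 1.0)
    kernel = 1.5 * (1.0 - u**2) * mask          # half-kernel rescaled to integrate to 1
    return kernel.sum(axis=1) / (len(data) * bandwidth)

grid = np.linspace(-0.8, 0.8, 161)
diff = one_sided_kde(grid, "right") - one_sided_kde(grid, "left")
idx = np.argmax(np.abs(diff))
print(f"estimated jump at x = {grid[idx]:.2f}, size ~ {diff[idx]:.2f}")
```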

A Kernel-function-based Approach to Sequential Estimation with $\beta$-protection of Quantiles

  • 김성래; 김성균
    • Proceedings of the Korean Society of Computational and Applied Mathematics Conference / 2003.09a / pp.14-14 / 2003
  • Given a sequence {$X_n$} of independent and identically distributed random variables with distribution function $F$, a sequential procedure with $\beta$-protection is proposed for the p-th quantile $\xi_p = F^{-1}(p)$, $0 < p < 1$. Some asymptotic properties of the proposed procedure and of the associated stopping time are proved: asymptotic consistency, asymptotic efficiency, and asymptotic normality. From one of the results, the effect of smoothing based on kernel functions is discussed. The results are also extended to the contaminated case.
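
A small sketch of the smoothing ingredient only: a kernel-smoothed empirical distribution function inverted by bisection to estimate the p-th quantile, on synthetic data. The sequential stopping rule and the $\beta$-protection analysis of the paper are not reproduced.

```python
# Kernel-smoothed estimate of the p-th quantile: smooth the empirical CDF with
# a Gaussian kernel, then invert it by bisection. Data are synthetic.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
sample = rng.standard_normal(500)      # i.i.d. observations from F (here N(0,1))
p, bandwidth = 0.90, 0.2

def smoothed_cdf(x):
    """Kernel-smoothed empirical distribution function with a Gaussian kernel."""
    return norm.cdf((x - sample) / bandwidth).mean()

# Invert the smoothed CDF by bisection to obtain the quantile estimate.
lo, hi = sample.min() - 1.0, sample.max() + 1.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if smoothed_cdf(mid) < p else (lo, mid)

print(f"smoothed estimate of the {p:.2f}-quantile: {0.5 * (lo + hi):.3f} "
      f"(true value {norm.ppf(p):.3f})")
```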
