• Title/Summary/Keyword: Variance estimation


Resampling-based Test of Hypothesis in L1-Regression

  • Kim, Bu-Yong
    • Communications for Statistical Applications and Methods / v.11 no.3 / pp.643-655 / 2004
  • The L$_1$-estimator in the linear regression model is widely recognized to have superior robustness in the presence of vertical outliers. While L$_1$-estimation procedures and algorithms have been developed quite well, less progress has been made on hypothesis testing in multiple L$_1$-regression. This article suggests computer-intensive resampling approaches, the jackknife and bootstrap methods, to estimate the variance of the L$_1$-estimator and the scale parameter that are required to compute the test statistics. Monte Carlo simulation studies are performed to measure the power of the tests in small samples. The simulation results indicate that the bootstrap estimation method is the most powerful when employed in the likelihood ratio test.
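The bootstrap idea in the abstract can be sketched in its simplest L1 setting: the sample median is the L1-estimator of the location model, and resampling the data estimates its sampling variance. A minimal sketch with hypothetical data (the paper treats multiple L1-regression, not the location case):

```python
import numpy as np

def bootstrap_variance(x, estimator, n_boot=2000, seed=0):
    """Bootstrap estimate of the sampling variance of `estimator`."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Resample with replacement and re-apply the estimator each time.
    stats = np.array([estimator(rng.choice(x, size=n, replace=True))
                      for _ in range(n_boot)])
    return stats.var(ddof=1)

# The sample median is the L1-estimator in the location model.
rng = np.random.default_rng(1)
x = rng.standard_normal(100)
v = bootstrap_variance(x, np.median)
# For standard normal data, the asymptotic variance of the median is pi/(2n).
print(v, np.pi / (2 * len(x)))
```

The same resampling loop applies unchanged to a regression estimator; only `estimator` and the resampled objects (rows of the design matrix) change.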

Accuracy Measures of Empirical Bayes Estimator for Mean Rates

  • Jeong, Kwang-Mo
    • Communications for Statistical Applications and Methods / v.17 no.6 / pp.845-852 / 2010
  • Count outcomes commonly occur in disease mapping of mortality or disease rates. A Poisson distribution is usually assumed as a model of disease rates, in conjunction with a gamma prior. A small area typically refers to a small geographical area or demographic group for which very little information is available from sample surveys. In this situation model-based estimation is very popular, in which auxiliary variables from various administrative sources are used. The empirical Bayes estimator under the Poisson-gamma model is considered together with its accuracy measures. An accuracy measure based on bootstrap samples adjusts for the underestimation incurred when the posterior variance is used as an estimator of the true mean squared error. We explain the suggested method through a practical dataset of hitters in baseball games. We also perform a Monte Carlo study to compare the accuracy measures of mean squared error.
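The Poisson-gamma empirical Bayes estimator described above can be sketched as follows. Fitting the hyperparameters by moment matching is an assumption made here for illustration; the paper's estimation details and its bootstrap accuracy adjustment are not reproduced.

```python
import numpy as np

def eb_poisson_gamma(y, e):
    """Empirical Bayes rates under y_i ~ Poisson(theta_i * e_i),
    theta_i ~ Gamma(a, b); hyperparameters by moment matching (a sketch)."""
    raw = y / e                                # crude rates
    m, v = raw.mean(), raw.var(ddof=1)
    # Var(raw) = Var(theta) + E[theta/e]; subtract the Poisson noise part.
    between = max(v - (raw / e).mean(), 1e-12)
    b = m / between
    a = m * b
    # Posterior mean shrinks each crude rate toward the prior mean a/b.
    return (y + a) / (e + b), a, b

rng = np.random.default_rng(0)
theta = rng.gamma(4.0, 1 / 2.0, size=50)   # true rates, mean 2.0
e = rng.uniform(5, 20, size=50)            # exposures
y = rng.poisson(theta * e)
est, a, b = eb_poisson_gamma(y, e)
print(est[:5])   # shrunken rate estimates
```

The shrinkage is strongest for small exposures, where the crude rate `y/e` is noisiest.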

Robust Kalman Filter Design via Selecting Performance Indices (성능지표 선정을 통한 강인한 칼만필터 설계)

  • Jung Jongchul;Huh Kunsoo
    • Transactions of the Korean Society of Mechanical Engineers A / v.29 no.1 s.232 / pp.59-66 / 2005
  • In this paper, a robust stationary Kalman filter is designed by minimizing selected performance indices so that it is less sensitive to uncertainties. The uncertainties include not only stochastic factors, such as process noise and measurement noise, but also deterministic factors, such as unknown initial estimation error, modeling error, and sensing bias. To reduce the effect of these uncertainties, three performance indices to be minimized are selected based on a quantitative error analysis of both the deterministic and the stochastic uncertainties. The selected indices are the size of the observer gain, the condition number of the observer matrix, and the estimation error variance. The observer gain is obtained by solving the multi-objective optimization problem that minimizes these indices. The robustness of the proposed filter is demonstrated through comparison with the standard Kalman filter.
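As context for the estimation error variance index, here is a minimal sketch of the standard stationary Kalman filter the paper compares against, on a hypothetical scalar system: the Riccati iteration yields the stationary gain and steady-state error variance.

```python
import numpy as np

# Scalar system: x_{k+1} = a x_k + w_k,  y_k = x_k + v_k
a, q, r = 0.9, 0.1, 0.5   # dynamics, process-noise var, measurement-noise var

# Riccati iteration for the steady-state (stationary) error variance P.
P = 1.0
for _ in range(200):
    P_pred = a * P * a + q
    K = P_pred / (P_pred + r)          # stationary Kalman gain
    P = (1 - K) * P_pred               # posterior error variance

# Run the stationary filter on simulated data.
rng = np.random.default_rng(0)
x_true, x_hat = 0.0, 0.0
err2 = []
for _ in range(5000):
    x_true = a * x_true + rng.normal(0, np.sqrt(q))
    y = x_true + rng.normal(0, np.sqrt(r))
    x_pred = a * x_hat
    x_hat = x_pred + K * (y - x_pred)
    err2.append((x_true - x_hat) ** 2)
print(P, np.mean(err2))   # empirical error variance should be close to P
```

The paper's contribution is to trade some of this variance optimality for robustness by also penalizing the gain size and the observer-matrix condition number.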

Restricted maximum likelihood estimation of a censored random effects panel regression model

  • Lee, Minah;Lee, Seung-Chun
    • Communications for Statistical Applications and Methods / v.26 no.4 / pp.371-383 / 2019
  • Panel data sets have been developed in various areas, and many recent studies have analyzed panel, or longitudinal, data sets. Maximum likelihood (ML) may be the most common statistical method for analyzing panel data models; however, inference based on the ML estimate will have an inflated Type I error because the ML method tends to give a downwardly biased estimate of the variance components when the sample size is small. The underestimation can be severe when the data are incomplete. This paper proposes the restricted maximum likelihood (REML) method for a random effects panel data model with a censored dependent variable. Note that the likelihood function of the model is complex in that it includes a multidimensional integral. Many authors have proposed integral approximation methods for computing the likelihood function; however, it is well known that integral approximation methods are inadequate for high-dimensional integrals in practice. This paper introduces the use of the moments of a truncated multivariate normal random vector for the calculation of the multidimensional integral. In addition, a proper asymptotic standard error of the REML estimate is given.
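The downward bias of ML variance estimates in small samples, which motivates REML, can be illustrated with a toy i.i.d.-normal analogue (the paper's censored panel model is far more involved): ML divides the sum of squares by n, while the unbiased, REML-like estimator divides by n - 1.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps, sigma2 = 5, 20000, 1.0
ml, reml = [], []
for _ in range(reps):
    x = rng.normal(0, np.sqrt(sigma2), n)
    ml.append(x.var(ddof=0))    # ML: biased, E = (n-1)/n * sigma^2 = 0.8
    reml.append(x.var(ddof=1))  # REML analogue: unbiased, E = sigma^2 = 1.0
print(np.mean(ml), np.mean(reml))
```

With n = 5 the ML estimator underestimates the variance by about 20 percent on average, which is the kind of bias that inflates Type I error in subsequent tests.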

Item sum techniques for quantitative sensitive estimation on successive occasions

  • Priyanka, Kumari;Trisandhya, Pidugu
    • Communications for Statistical Applications and Methods / v.26 no.2 / pp.175-189 / 2019
  • The problem of estimating a quantitative sensitive variable using the item sum technique (IST) on successive occasions is discussed. IST difference, IST regression, and IST general-class estimators are proposed to estimate a quantitative sensitive variable at the current occasion in two-occasion successive sampling. The proposed estimators are elaborated under the Trappmann et al. (Journal of Survey Statistics and Methodology, 2, 58-77, 2014) and Perri et al. (Biometrical Journal, 60, 155-173, 2018) allocation designs for allocating the long-list and short-list samples of the IST. The properties of all proposed estimators are derived, including the optimum replacement policy. The proposed estimators are compared with one another under the above allocation designs, as well as with a direct method. Numerical applications, using empirical data as well as a simple simulation, show how the proposed IST on successive occasions may perform in practical situations.
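The core of the item sum technique can be sketched on a single occasion with hypothetical data: the long-list sample reports the sum of the sensitive item and innocuous items, the short-list sample reports the innocuous items only, and the difference of sample means estimates the sensitive mean without any respondent revealing their individual value.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4000
nonsensitive = rng.poisson(3.0, N).astype(float)   # innocuous item total
sensitive = rng.gamma(2.0, 2.0, N)                 # sensitive variable, mean 4

# Split respondents: long-list reports sensitive + innocuous,
# short-list reports innocuous only.
idx = rng.permutation(N)
long_list, short_list = idx[: N // 2], idx[N // 2 :]
long_reports = sensitive[long_list] + nonsensitive[long_list]
short_reports = nonsensitive[short_list]

# IST difference estimator of the sensitive mean.
est = long_reports.mean() - short_reports.mean()
print(est, sensitive.mean())
```

The paper's two-occasion estimators refine this by reusing information from the previous occasion through difference and regression adjustments.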

Enhancing Single Thermal Image Depth Estimation via Multi-Channel Remapping for Thermal Images (열화상 이미지 다중 채널 재매핑을 통한 단일 열화상 이미지 깊이 추정 향상)

  • Kim, Jeongyun;Jeon, Myung-Hwan;Kim, Ayoung
    • The Journal of Korea Robotics Society / v.17 no.3 / pp.314-321 / 2022
  • Depth information used in SLAM and visual odometry is essential in robotics. It is often obtained from sensors or learned by networks. While learning-based methods have gained popularity, they are mostly limited to RGB images, which break down in visually degraded environments. Thermal cameras are in the spotlight as a way to solve these problems. Unlike RGB images, thermal images perceive the environment reliably regardless of illumination variance, but they lack contrast and texture. This low contrast prevents an algorithm from effectively learning the underlying scene details. To tackle these challenges, we propose multi-channel remapping to enhance contrast. Our method allows a learning-based depth prediction model to predict depth accurately even in low-light conditions. We validate the feasibility of the approach and show that our multi-channel remapping method outperforms existing methods both visually and quantitatively on our dataset.
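The multi-channel remapping idea can be sketched as stacking several contrast transforms of one raw thermal frame into a multi-channel network input. The three maps below (min-max normalization, rank-based histogram equalization, percentile clipping) are hypothetical stand-ins, not the paper's channels.

```python
import numpy as np

def remap_multichannel(thermal):
    """Remap a raw 16-bit thermal image into three contrast-enhanced channels
    (a hypothetical sketch of the multi-channel idea)."""
    t = thermal.astype(np.float64)
    # Channel 1: global min-max normalization.
    c1 = (t - t.min()) / (t.max() - t.min() + 1e-12)
    # Channel 2: histogram equalization via pixel ranks (spreads the narrow
    # intensity range typical of thermal images).
    flat = t.ravel()
    ranks = flat.argsort().argsort()
    c2 = (ranks / (flat.size - 1)).reshape(t.shape)
    # Channel 3: percentile clipping to suppress hot/cold outlier pixels.
    lo, hi = np.percentile(t, [1, 99])
    c3 = np.clip((t - lo) / (hi - lo + 1e-12), 0.0, 1.0)
    return np.stack([c1, c2, c3], axis=-1)

img = np.random.default_rng(0).integers(20000, 21000, (8, 10)).astype(np.uint16)
out = remap_multichannel(img)
print(out.shape)   # (8, 10, 3), values in [0, 1]
```

Each channel emphasizes different intensity structure, which gives a depth network more usable gradient signal than a single low-contrast map.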

STATISTICALLY PREPROCESSED DATA BASED PARAMETRIC COST MODEL FOR BUILDING PROJECTS

  • Sae-Hyun Ji;Moonseo Park;Hyun-Soo Lee
    • International conference on construction engineering and project management / 2009.05a / pp.417-424 / 2009
  • For a construction project to progress smoothly, effective cost estimation is vital, particularly in the conceptual and schematic design stages. In these early phases, although initial estimates are highly sensitive to changes in project scope, owners require accurate forecasts that reflect the information they supply. Thus, cost estimators need effective estimation strategies. In practice, parametric cost estimation is the most commonly used method in these initial phases; it utilizes historical cost data (Karshenas 1984, Kirkham 2007). Hence, compiling historical data on the parameters that govern cost variance is a prime requirement. However, data preprocessing to remove internal errors or abnormal values is needed before compilation. To address this issue, this research proposes a statistical methodology for data preprocessing and verifies that preprocessing has a positive impact on the accuracy and stability of estimates. Moreover, Statistically Preprocessed data Based Parametric (SPBP) cost models are developed based on multiple regression equations, and their effectiveness is verified against conventional cost models.
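The two-step idea, statistically preprocessing historical cost data and then fitting a multiple-regression (parametric) cost model, can be sketched as follows. The variables, the residual z-score rule, and the threshold are hypothetical illustrations, not the paper's methodology.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
area = rng.uniform(500, 5000, n)          # floor area (hypothetical parameter)
floors = rng.integers(1, 20, n)           # number of floors (hypothetical)
cost = 0.8 * area + 50.0 * floors + rng.normal(0, 100, n)
cost[:5] += 5000                          # inject abnormal historical records

X = np.column_stack([np.ones(n), area, floors])

def fit(Xm, ym):
    beta, *_ = np.linalg.lstsq(Xm, ym, rcond=None)
    return beta

# Preprocess: drop records whose residual z-score exceeds 3,
# then refit the parametric (multiple regression) cost model.
resid = cost - X @ fit(X, cost)
keep = np.abs((resid - resid.mean()) / resid.std()) < 3
beta_clean = fit(X[keep], cost[keep])
print(beta_clean)   # slopes should recover roughly 0.8 and 50
```

Without the preprocessing step, the injected abnormal records pull the fitted coefficients away from the true cost drivers, which is the instability the paper targets.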


A NOVEL WEIBULL MARSHALL-OLKIN POWER LOMAX DISTRIBUTION: PROPERTIES AND APPLICATIONS TO MEDICINE AND ENGINEERING

  • ELHAM MORADI;ZAHRA SHOKOOH GHAZANI
    • Journal of applied mathematics & informatics / v.41 no.6 / pp.1275-1301 / 2023
  • This paper introduces the Weibull Marshall-Olkin Power Lomax (WMOPL) distribution. Statistical aspects of the proposed model are presented, such as the quantile function, moments, mean residual life, mean deviations, variance, skewness, kurtosis, and reliability measures, including the residual life function and stress-strength reliability. The parameters of the new model are estimated using six different methods, and a simulation study is conducted to compare the six estimation methods. Finally, two real data sets show that the Weibull Marshall-Olkin Power Lomax distribution is flexible and suitable for modeling data.

Study of Direct Parameter Estimation for Neyman-Scott Rectangular Pulse Model (Neyman-Scott 구형 펄스모형의 직접적인 매개변수 추정연구)

  • Jeong, Chang-Sam
    • Journal of Korea Water Resources Association / v.42 no.11 / pp.1017-1028 / 2009
  • The NSRPM (Neyman-Scott Rectangular Pulse Model) is one of the common models for generating future precipitation time series in stochastic hydrology. The NSRPM has five parameters for generating a precipitation time series. In general, moment-based parameter estimation has problems related to the growing number of objective functions and gives different results depending on the random-variable generator used. In this study, a direct parameter estimation method is proposed to overcome these disadvantages of moment-based estimation. To apply direct estimation, the variance of the stochastically generated data was examined according to the number of precipitation events in the NSRPM. Both methods were applied to data from the Cheongju gauge station. Precipitation time series were generated using four different random-variable generators and compared with the observed series to check their accuracy. As a result, the direct method showed more stable and better results.

Time delay estimation algorithm using Elastic Net (Elastic Net를 이용한 시간 지연 추정 알고리즘)

  • Jun-Seok Lim;Keunwa Lee
    • The Journal of the Acoustical Society of Korea / v.42 no.4 / pp.364-369 / 2023
  • Time-delay estimation between two receivers is a technique that has been applied in a variety of fields, from underwater acoustics to room acoustics and robotics. There are two types of time-delay estimation techniques: one estimates the amount of delay from the correlation between receivers, and the other parametrically models the delay between receivers and estimates the parameters by system identification. The latter has the characteristic that only a small fraction of the system's parameters are directly related to the delay. This characteristic can be exploited to improve estimation accuracy by methods such as Lasso regularization. However, Lasso regularization can also discard necessary information. In this paper, we propose a method using the Elastic Net, which adds Ridge regularization to Lasso regularization to compensate for this loss. Comparing the proposed method with the conventional Generalized Cross-Correlation (GCC) method and the Lasso-regularized method, we show that its estimation variance is very small for both white Gaussian and colored signal sources.
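The parametric approach described above can be sketched by modeling the delay as a sparse FIR channel between the two receivers and reading off the dominant coefficient. The coordinate-descent elastic net below is a minimal generic solver written for this sketch, not the paper's implementation, and the signal model is hypothetical.

```python
import numpy as np

def elastic_net(X, y, lam=0.1, alpha=0.5, n_iter=200):
    """Coordinate-descent elastic net: minimizes
    (1/2n)||y - Xw||^2 + lam*(alpha*||w||_1 + (1-alpha)/2*||w||_2^2)."""
    n, p = X.shape
    w = np.zeros(p)
    r = y.copy()                          # current residual y - Xw
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            rho = X[:, j] @ r / n + col_sq[j] * w[j]
            wj = np.sign(rho) * max(abs(rho) - lam * alpha, 0.0)  # soft-threshold
            wj /= col_sq[j] + lam * (1 - alpha)
            r += X[:, j] * (w[j] - wj)    # keep residual consistent with w
            w[j] = wj
    return w

# Receiver 2 hears receiver 1's signal delayed by d samples, plus noise.
rng = np.random.default_rng(0)
n, p, d = 400, 20, 7
x = rng.standard_normal(n + p)
y = x[p - d : p - d + n] + 0.1 * rng.standard_normal(n)
X = np.column_stack([x[p - k : p - k + n] for k in range(p)])  # lag-k columns
w = elastic_net(X, y, lam=0.05, alpha=0.9)
print(int(np.argmax(np.abs(w))))   # index of dominant tap = estimated delay
```

The L1 term suppresses the many near-zero taps while the small L2 term keeps correlated taps from being zeroed out entirely, which is the trade-off the paper exploits.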