• Title/Summary/Keyword: Parameter Estimation Method (변수추정법)

589 search results

Estimation of Forest Biomass for Muju County using Biomass Conversion Table and Remote Sensing Data (산림 바이오매스 변환표와 위성영상을 이용한 무주군의 산림 바이오매스추정)

  • Chung, Sang Young;Yim, Jong Su;Cho, Hyun Kook;Jeong, Jin Hyun;Kim, Sung Ho;Shin, Man Yong
    • Journal of Korean Society of Forest Science
    • /
    • v.98 no.4
    • /
    • pp.409-416
    • /
    • 2009
  • Forest biomass estimation is essential for greenhouse gas inventories and terrestrial carbon accounting, and remote sensing allows forest biomass to be estimated over large areas. This study was conducted to estimate forest biomass and to produce a forest biomass map for Muju county, using a forest biomass conversion table developed from field plot data of the 5th National Forest Inventory together with Landsat TM-5 imagery. Correlation analysis was carried out to select suitable independent variables for the regression models; height class, crown closure density, and age class were highly correlated with forest biomass. Six regression models combining these three stand variables were compared and verified by validation statistics such as the root mean square error (RMSE) and mean bias. A regression model with crown closure density and height class (Model V) was better than the others for estimating forest biomass. A biomass conversion table based on Model V was produced and used to estimate forest biomass in the study site. The total forest biomass of Muju county was estimated at about 8.8 million tons, or 128.3 ton/ha, using the conversion table.
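
As an illustration of the kind of regression described above, the following minimal sketch fits plot biomass to crown closure density and height class (the structure of Model V) and reports RMSE and mean bias; the plot data, variable names, and coefficients are hypothetical, not the paper's.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical plot data: height class, crown closure density (%), biomass (ton/ha)
height_class = np.array([2, 3, 4, 5, 6, 3, 4, 5, 6, 7])
crown_density = np.array([40, 55, 60, 70, 80, 50, 65, 75, 85, 90])
biomass = np.array([45, 70, 95, 120, 150, 60, 100, 130, 160, 185])

# Model V analogue: biomass regressed on crown closure density and height class
X = sm.add_constant(np.column_stack([crown_density, height_class]))
model = sm.OLS(biomass, X).fit()

# Validation statistics mentioned in the abstract: RMSE and mean bias
pred = model.predict(X)
rmse = np.sqrt(np.mean((biomass - pred) ** 2))
bias = np.mean(pred - biomass)
print(model.params, rmse, bias)
```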

An Analysis of Balassa-Samuelson Effect by Panel Cointegration Test (패널공적분검정을 통한 발라사-사무엘슨 효과 분석)

  • Choi, Yong-Jae
    • International Area Studies Review
    • /
    • v.22 no.3
    • /
    • pp.67-84
    • /
    • 2018
  • The purpose of this paper is to investigate the Balassa-Samuelson effect, whereby the real exchange rate can deviate from its long-run equilibrium. To analyze this effect, the long-run relationship between the real exchange rate and productivity was estimated using dynamic panel ordinary least squares (DOLS) and a panel error correction model (ECM), after conducting unit root and cointegration tests. The results show that all variables except the real exchange rate have a unit root. A cointegration test was then conducted to determine whether stable long-run relationships exist; the variables were found to be cointegrated and statistically significant. The DOLS and ECM methods were used to estimate the coefficients of the cointegrated variables. The major findings are that the estimates are statistically significant and show the same signs as economic theory predicts.
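
As a rough illustration of the DOLS step, the sketch below estimates a long-run coefficient for a single synthetic country series by regressing the level relationship augmented with leads and lags of the differenced regressor; the panel dimension, the ECM step, and the actual data are omitted, and all series and names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic I(1) productivity differential and a cointegrated real exchange rate
rng = np.random.default_rng(0)
T = 200
prod = np.cumsum(rng.normal(size=T))
rer = 0.5 * prod + rng.normal(scale=0.3, size=T)

df = pd.DataFrame({"rer": rer, "prod": prod})
df["dprod"] = df["prod"].diff()

# DOLS: level regression augmented with leads and lags of d(prod)
p = 2  # number of leads/lags (a modelling choice)
for k in range(-p, p + 1):
    df[f"dprod_{k}"] = df["dprod"].shift(-k)
df = df.dropna()

X = sm.add_constant(df[["prod"] + [f"dprod_{k}" for k in range(-p, p + 1)]])
dols = sm.OLS(df["rer"], X).fit()
print(dols.params["prod"])  # long-run coefficient estimate (close to 0.5 here)
```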

Effective Estimation of Porosity and Fluid Saturation using Joint Inversion Result of Seismic and Electromagnetic Data (탄성파탐사와 전자탐사 자료의 복합역산 결과를 이용한 효과적인 공극률 및 유체포화율의 추정)

  • Jeong, Soocheol;Seol, Soon Jee;Byun, Joongmoo
    • Geophysics and Geophysical Exploration
    • /
    • v.18 no.2
    • /
    • pp.54-63
    • /
    • 2015
  • Petrophysical parameters such as porosity and fluid saturation, which provide useful information for reservoir characterization, can be estimated from seismic velocity and resistivity through a rock physics model (RPM). Therefore, accurate P-wave velocity and resistivity information must be obtained for successful estimation of the petrophysical parameters. Compared with individual inversion of electromagnetic (EM) or seismic data, joint inversion using both EM and seismic data together reduces uncertainty and exploits the advantages of each data type, so more reliable petrophysical properties can be estimated. In this paper, for successful estimation of petrophysical parameters, we propose an effective method that applies a grid search to find the porosity and fluid saturation. The relations of porosity and fluid saturation with P-wave velocity and resistivity were expressed using an RPM, and the improved resistivity distribution used in this study was obtained by joint inversion of seismic and EM data. When the proposed method was applied to synthetic data simulating a subsea reservoir exploration, reliable petrophysical parameters were obtained. The results indicate that the proposed method can be applied to detecting a reservoir and calculating accurate oil and gas reserves.
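
The grid-search idea can be sketched as below, assuming Archie's law for resistivity and a Wyllie-type time average for P-wave velocity as stand-in rock physics relations; the paper's actual RPM, constants, and inversion results are not reproduced here, and all values are illustrative.

```python
import numpy as np

# Assumed rock-physics relations (illustrative constants, not the paper's)
def resistivity(phi, sw, rw=0.2, a=1.0, m=2.0, n=2.0):
    return a * rw / (phi ** m * sw ** n)               # Archie's law

def p_velocity(phi, sw, v_matrix=5500.0, v_water=1500.0, v_gas=600.0):
    v_fluid = 1.0 / (sw / v_water + (1 - sw) / v_gas)  # simple fluid mix
    return 1.0 / ((1 - phi) / v_matrix + phi / v_fluid)  # Wyllie time average

# "Observed" values at one grid cell (e.g. from the joint inversion result)
vp_obs, res_obs = 2600.0, 40.0

# Grid search over porosity and water saturation minimizing a normalized misfit
phis = np.linspace(0.05, 0.40, 71)
sws = np.linspace(0.05, 1.00, 96)
best = (None, None, np.inf)
for phi in phis:
    for sw in sws:
        misfit = ((p_velocity(phi, sw) - vp_obs) / vp_obs) ** 2 \
               + ((np.log(resistivity(phi, sw)) - np.log(res_obs)) / np.log(res_obs)) ** 2
        if misfit < best[2]:
            best = (phi, sw, misfit)
print("porosity, water saturation:", best[0], best[1])
```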

Detection of Structural Damage from Measured Acceleration (측정 가속도를 사용한 구조 손상 진단)

  • 곽임종
    • Proceedings of the Earthquake Engineering Society of Korea Conference
    • /
    • 1997.04a
    • /
    • pp.144-151
    • /
    • 1997
  • A method is presented for detecting and assessing structural damage using acceleration time histories measured from a structure. A parametric system identification method is used as the main tool of the damage detection algorithm, and a constrained nonlinear optimization technique is applied to estimate the optimal parameters of the parameterized structure. An adaptive parameter grouping scheme is adopted to localize damaged members, and a time-window technique is applied to the measured acceleration time histories to assess the severity of damage statistically. The algorithm is developed to account for the sparseness of acceleration measurements and for measurement errors, and numerical simulation studies are carried out in which the structure is excited by harmonic loads to diagnose structural damage.
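
A toy version of the underlying idea, estimating element stiffnesses of a small shear model from noisy acceleration histories by bounded nonlinear least squares, is sketched below; the 2-DOF model, loads, and stiffness values are hypothetical and far simpler than the paper's formulation.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Hypothetical 2-DOF shear building: masses fixed, stiffnesses to be identified
m = np.array([1.0, 1.0])

def simulate_accel(k, t):
    k1, k2 = k
    K = np.array([[k1 + k2, -k2], [-k2, k2]])
    def rhs(ti, y):
        x, v = y[:2], y[2:]
        f = np.array([np.sin(2.0 * ti), 0.0])          # harmonic excitation
        return np.concatenate([v, (f - K @ x) / m])
    sol = solve_ivp(rhs, (t[0], t[-1]), np.zeros(4), t_eval=t, rtol=1e-8)
    x = sol.y[:2]
    f = np.vstack([np.sin(2.0 * t), np.zeros_like(t)])
    return (f - K @ x) / m[:, None]                    # accelerations at both DOF

t = np.linspace(0, 10, 500)
a_meas = simulate_accel([100.0, 60.0], t)              # "damaged" truth (k2 reduced)
a_meas += 0.01 * np.random.default_rng(1).normal(size=a_meas.shape)  # measurement noise

# Constrained (bounded) nonlinear least squares on the acceleration residual
res = least_squares(lambda k: (simulate_accel(k, t) - a_meas).ravel(),
                    x0=[100.0, 100.0], bounds=(1.0, 200.0))
print(res.x)  # an estimated drop in k2 relative to its nominal value indicates damage there
```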


Estimation of Areal Reduction Factor using a two Parameter Mixed Gamma Distribution (2변수 혼합감마분포를 이용한 면적감소계수의 산정)

  • Yoo, Chulsang;Kim, Kyoungjun
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2004.05b
    • /
    • pp.584-588
    • /
    • 2004
  • This study proposes a method for estimating the areal reduction factor (ARF) using a mixed probability density function. Conventional ARF estimation requires concurrent rainfall data, which are not easy to obtain in sufficient quantity. The proposed method instead uses the more readily available daily rainfall data and adopts a mixed distribution, rather than a continuous one, to account for the intermittency of rainfall. A mixed gamma distribution was applied to estimate the ARF for the Geum River basin; the estimation was carried out more easily, and the results compared well with those obtained by the conventional method.
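
A minimal sketch of fitting the mixed (zero-inflated) gamma model to daily rainfall, with the dry-day probability handled as a point mass at zero, is shown below; the rainfall record is synthetic, and the areal reduction factor computation itself is not reproduced.

```python
import numpy as np
from scipy import stats

# Hypothetical daily rainfall record (mm); zeros represent dry days
rng = np.random.default_rng(0)
wet = rng.random(365) < 0.3
rain = np.where(wet, rng.gamma(shape=0.8, scale=12.0, size=365), 0.0)

# Mixed gamma: P(wet) handled separately, gamma fitted to wet-day amounts only
p_wet = np.mean(rain > 0)
shape, loc, scale = stats.gamma.fit(rain[rain > 0], floc=0.0)

# Probability of daily rainfall exceeding x mm under the mixed distribution
x = 50.0
p_exceed = p_wet * stats.gamma.sf(x, shape, loc=loc, scale=scale)
print(p_wet, shape, scale, p_exceed)
```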


Unknown-Parameter Identification for Accurate Control of 2-Link Manipulator using Dual Extended Kalman Filter (2링크 매니퓰레이터 제어를 위한 듀얼 확장 칼만 필터 기반의 미지 변수 추정 기법)

  • Seung, Ji Hoon;Park, Jung Kil;Yoo, Sung Goo
    • Journal of the Korea Convergence Society
    • /
    • v.9 no.6
    • /
    • pp.53-60
    • /
    • 2018
  • In this paper, we describe unknown-parameter identification using a dual extended Kalman filter for precise control of a 2-link manipulator. The 2-link manipulator is highly nonlinear, with parameters that change from task to task, and parameters such as the mass and inertia of the system are important for handling the manipulator robustly. The proposed method solves the control problem by estimating the state and unknown parameters of the system. To verify the performance of the proposed method, we simulated the implementation in Matlab and compared the results with those of the RLS algorithm. The results show that the proposed method outperforms RLS and confirm its parameter estimation performance.
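
The dual-filter idea can be illustrated on a scalar linear system, where one Kalman filter tracks the state and a second tracks an unknown parameter; this is a simplified stand-in, not the paper's manipulator dynamics, and all values are hypothetical.

```python
import numpy as np

# Scalar system x[k+1] = a*x[k] + u[k]; 'a' is the unknown parameter
rng = np.random.default_rng(0)
a_true, N = 0.85, 300
x = np.zeros(N); u = np.sin(0.1 * np.arange(N))
for k in range(N - 1):
    x[k + 1] = a_true * x[k] + u[k] + rng.normal(scale=0.01)
y = x + rng.normal(scale=0.05, size=N)       # noisy measurements

# Dual Kalman filters: one for the state, one for the parameter
x_hat, Px = 0.0, 1.0
a_hat, Pa = 0.5, 1.0
Q, R, Qa = 1e-4, 0.05 ** 2, 1e-5
for k in range(N - 1):
    x_prev = x_hat
    # --- parameter filter: predict y[k+1] from the previous state estimate ---
    Pa = Pa + Qa
    H = x_prev                               # d(prediction)/d(a)
    Ka = Pa * H / (H * Pa * H + R)
    a_hat = a_hat + Ka * (y[k + 1] - (a_hat * x_prev + u[k]))
    Pa = (1 - Ka * H) * Pa
    # --- state filter: uses the updated parameter estimate ---
    x_pred = a_hat * x_prev + u[k]
    Px = a_hat * Px * a_hat + Q
    Kx = Px / (Px + R)
    x_hat = x_pred + Kx * (y[k + 1] - x_pred)
    Px = (1 - Kx) * Px
print(a_hat)  # should move toward a_true
```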

Gradient Estimation for Progressive Photon Mapping (점진적 광자 매핑을 위한 기울기 계산 기법)

  • Donghee Jeon;Jeongmin Gu;Bochang Moon
    • Journal of the Korea Computer Graphics Society
    • /
    • v.30 no.3
    • /
    • pp.141-147
    • /
    • 2024
  • Progressive photon mapping is a widely adopted rendering technique that performs kernel density estimation on photons progressively generated from the lights. Its hyperparameter, which controls the reduction rate of the density-estimation kernel, strongly affects the quality of the rendered image because of the bias-variance tradeoff of the pixel estimates in photon-mapped results. The errors of rendered pixel estimates in progressive photon mapping can be minimized by estimating the optimal parameters with gradient-based optimization techniques. To this end, we derived the gradients of the pixel estimates with respect to the parameters during progressive photon mapping and compared them with finite differences to verify the estimated gradients. In future work, the gradients derived in this paper can be applied to an online learning algorithm that simultaneously performs progressive photon mapping and parameter optimization.
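
The finite-difference check mentioned above can be sketched as follows, using a toy pixel-estimate function built from a radius-reduction recurrence of the form r_{i+1}^2 = r_i^2 (i + alpha)/(i + 1); the functional and values are illustrative only, not the paper's estimator.

```python
import numpy as np

# Stand-in for a pixel estimate as a function of the radius-reduction parameter alpha
def pixel_estimate(alpha, n_photons=1000):
    i = np.arange(1, n_photons + 1)
    radius2 = np.cumprod((i + alpha) / (i + 1.0))   # progressive radius shrinkage
    return np.mean(radius2)                         # toy functional of the radii

def analytic_grad(alpha, n_photons=1000):
    i = np.arange(1, n_photons + 1)
    radius2 = np.cumprod((i + alpha) / (i + 1.0))
    # derivative of a running product: multiply each term by the cumulative sum of 1/(i + alpha)
    dlog = np.cumsum(1.0 / (i + alpha))
    return np.mean(radius2 * dlog)

alpha, h = 0.7, 1e-5
fd = (pixel_estimate(alpha + h) - pixel_estimate(alpha - h)) / (2 * h)
print(analytic_grad(alpha), fd)                     # the two values should agree closely
```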

Derivation of Relationship between Cross-site Correlation among Data and among Estimators of L-moments for the Generalized Extreme Value Distribution (Generalized Extreme Value 분포 자료의 교차상관과 L-모멘트 추정값의 교차상관의 관계 유도)

  • Jeong, Dae-Il
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.29 no.3B
    • /
    • pp.259-267
    • /
    • 2009
  • The Generalized Extreme Value (GEV) distribution is recommended for flood frequency and extreme rainfall analysis in many countries, and the L-moment method is the most common estimation procedure for the GEV distribution. In this study, the relationships between the cross-site correlations of extreme events and the cross-correlations of estimators of the L-moment ratios (the L-moment coefficient of variation, L-CV, and the L-moment coefficient of skewness, L-CS) were derived by Monte Carlo simulation for data generated from the GEV distribution, and those relationships were fitted to a simple power function. In the Monte Carlo simulation, a GEV+ distribution was employed, in which unrealistic negative values are excluded. The simple power models provide an accurate description of the relationships between the cross-correlation of the data and the cross-correlation of the L-moment ratios. Estimated parameters and accuracies of the power functions are reported for different combinations of GEV distribution parameters. Moreover, this study describes a regional regression approach using the generalized least squares (GLS) regression method, which requires the cross-site correlations among L-moment estimators. The relationships derived in this study allow regional GLS regression analyses of both L-CV and L-CS estimators that correctly incorporate the cross-correlation among GEV L-moment estimators.
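
Fitting the simple power function to Monte Carlo output can be sketched as below; the (data correlation, L-CV estimator correlation) pairs and the one-parameter form rho^b are placeholders, not the paper's reported results or exact parameterization.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical Monte Carlo output: cross-correlation of data vs. of L-CV estimators
rho_data = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])
rho_lcv  = np.array([0.02, 0.06, 0.12, 0.20, 0.30, 0.42, 0.56, 0.72, 0.89])

# Simple power model: rho_lcv ~ rho_data ** b (the paper's exact form may differ)
def power_model(rho, b):
    return rho ** b

(b_hat,), _ = curve_fit(power_model, rho_data, rho_lcv, p0=[1.5])
print(b_hat)
```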

Empirical Bayesian Misclassification Analysis on Categorical Data (범주형 자료에서 경험적 베이지안 오분류 분석)

  • 임한승;홍종선;서문섭
    • The Korean Journal of Applied Statistics
    • /
    • v.14 no.1
    • /
    • pp.39-57
    • /
    • 2001
  • Categorical data sometimes contain misclassification errors. If such data are analyzed as they are, the estimated cell probabilities can be biased and the standard Pearson X² tests may have inflated type I error rates. On the other hand, if well-classified data are treated as misclassified, much cost and time may be spent on unnecessary adjustment for misclassification. It is therefore a necessary and important step to ask whether categorical data are misclassified before analyzing them. In this paper, we consider a two-dimensional contingency table in which one of the two variables is misclassified and the marginal sums of the well-classified variable are fixed, and we explore how to partition the marginal sums into the cells via the concepts of Bound and Collapse of Sebastiani and Ramoni (1997). A double sampling scheme (Tenenbein, 1970) is used to obtain information on the misclassification. We propose test statistics to address the misclassification problem and examine the behavior of the statistics by simulation studies.
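
The double-sampling correction can be illustrated as follows: a subsample classified by both the fallible and the true device yields estimates of P(true class | observed class), which re-weight the full-sample counts; all counts here are hypothetical.

```python
import numpy as np

# Full sample: counts by the fallible classifier only (two categories)
n_fallible = np.array([600, 400])

# Double-sampled subset: rows = true class, columns = fallible class
sub = np.array([[180, 20],
                [30, 70]])

# Estimated P(true = i | fallible = j) from the subsample
p_true_given_fallible = sub / sub.sum(axis=0, keepdims=True)

# Corrected estimates of the true category proportions in the full sample
n_true_hat = p_true_given_fallible @ n_fallible
print(n_true_hat / n_true_hat.sum())
```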


Structural Damage Assessment Using Transient Dynamic Response (동적과도응답을 사용한 구조물의 손상진단)

  • 신수봉;오성호;곽임종;고현무
    • Journal of the Computational Structural Engineering Institute of Korea
    • /
    • v.13 no.4
    • /
    • pp.395-404
    • /
    • 2000
  • A damage detection and assessment algorithm is developed by measuring accelerations at limited locations of a structure under forced vibrations. The algorithm applies a time-domain system identification (SI) method that identifies a structure by solving a linearly constrained nonlinear optimization problem for optimal structural parameters. An equation error of the dynamic equilibrium of motion is minimized to estimate the optimal parameters. An adaptive parameter grouping scheme is applied to localize damaged members with sparse measured accelerations. Damage is assessed in a statistical manner by applying a time-windowing technique to the measured acceleration time histories. Displacements and velocities at the measured degrees of freedom (DOF) are computed by integrating the measured accelerations. The displacements at the unmeasured DOF are estimated as additional unknowns along with the unknown structural parameters, and the corresponding velocities and accelerations are computed by numerical differentiation. A numerical simulation study with a truss structure is carried out to examine the efficiency of the algorithm. A data perturbation scheme is applied to determine the thresholds for damage indices and to compute the damage possibility of each member.
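
The step of recovering velocities and displacements from a measured acceleration history, and differentiating back numerically, can be sketched as below with a synthetic signal; the record and sampling are hypothetical.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

# Hypothetical measured acceleration history at one DOF
t = np.linspace(0.0, 5.0, 1001)
accel = 0.3 * np.sin(2 * np.pi * 1.5 * t)

# Velocity and displacement by cumulative trapezoidal integration
vel = cumulative_trapezoid(accel, t, initial=0.0)
disp = cumulative_trapezoid(vel, t, initial=0.0)

# Numerical differentiation (central differences) recovers velocity and acceleration
vel_back = np.gradient(disp, t)
acc_back = np.gradient(vel_back, t)
print(np.max(np.abs(vel_back - vel)), np.max(np.abs(acc_back - accel)))
```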
