• Title/Summary/Keyword: Initial data

Search Result 4,974

Comparison of Latin Hypercube Sampling and Simple Random Sampling Applied to Neural Network Modeling of HfO2 Thin Film Fabrication

  • Lee, Jung-Hwan;Ko, Young-Don;Yun, Il-Gu;Han, Kyong-Hee
    • Transactions on Electrical and Electronic Materials / v.7 no.4 / pp.210-214 / 2006
  • In this paper, two sampling methods, Latin hypercube sampling (LHS) and simple random sampling, were compared to improve the modeling speed of a neural network model. The sampling methods were used to generate the initial weight and bias sets. Electrical characteristic data for $HfO_2$ thin films were used as the modeling data. Ten initial parameter sets (initial weights and biases) were generated with LHS and with simple random sampling, respectively. Modeling was performed with each generated initial parameter set and the number of epochs to convergence was measured, while the other network parameters were held fixed. The 20 repeated minimum epoch counts for LHS and simple random sampling were analyzed with a nonparametric method because of their non-normality.
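
A rough illustration of the two sampling schemes named above is sketched below; this is not the authors' code, and the parameter count, the weight range, and the number of sets are assumptions made for illustration:

```python
# Minimal sketch: generating initial weight/bias sets by Latin hypercube
# sampling (LHS) versus simple random sampling. Dimensionality, the weight
# range [-0.5, 0.5], and the number of sets are assumed values.
import numpy as np

def latin_hypercube(n_sets, n_dims, rng):
    """One draw per stratum in each dimension, strata shuffled independently."""
    u = (rng.random((n_sets, n_dims)) + np.arange(n_sets)[:, None]) / n_sets
    for d in range(n_dims):
        u[:, d] = u[rng.permutation(n_sets), d]
    return u

def simple_random(n_sets, n_dims, rng):
    return rng.random((n_sets, n_dims))

rng = np.random.default_rng(0)
n_params = 25                                          # assumed number of weights + biases
lhs_sets = latin_hypercube(10, n_params, rng) - 0.5    # scaled to [-0.5, 0.5]
srs_sets = simple_random(10, n_params, rng) - 0.5

# Each row is one initial parameter set; the network would be trained once per
# row, the epochs to convergence recorded, and the two groups of epoch counts
# compared with a nonparametric test, as described in the abstract.
print(lhs_sets.shape, srs_sets.shape)
```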

PROBABILISTIC MEASUREMENT OF RISK ASSOCIATED WITH INITIAL COST ESTIMATES

  • Seokyon Hwang
    • International conference on construction engineering and project management / 2013.01a / pp.488-493 / 2013
  • Accurate initial cost estimates are essential to effective management of construction projects, where many decisions are made in the course of project management by referencing the estimates. In practice, the initial estimates are frequently derived from historical actual cost data, and standard distribution-based techniques are widely applied in the construction industry to account for risk associated with the estimates. This approach assumes the same probability distribution of estimate errors for any selected estimate; this assumption, however, is not always satisfied. In order to account for the probabilistic nature of estimate errors, an alternative method for measuring the risk associated with a selected initial estimate is developed by applying the Bayesian probability approach. An application example demonstrates how the method is implemented. A hypothesis test is conducted to reveal the robustness of the Bayesian probability model. The method is envisioned to effectively complement cost estimating methods that are currently in use by providing the following benefits: (1) it effectively accounts for the probabilistic nature of errors in estimates; (2) it is easy to implement by using historical estimates and actual costs that are readily available in most construction companies; and (3) it minimizes subjective judgment by using quantitative data only.
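
As a hedged illustration of a Bayesian treatment of estimate error (not the model developed in the paper; the log-error model, the conjugate prior, and all numbers are assumptions), the following sketch updates the distribution of the mean log error ratio from historical estimate/actual pairs:

```python
# Minimal sketch of a Bayesian update on cost-estimate error, assuming a
# normal model for the log error ratio log(actual / estimate) with known
# variance. This is an illustrative stand-in, not the paper's method.
import numpy as np

# Hypothetical historical projects: estimated and actual costs (in $M).
estimates = np.array([1.00, 2.30, 0.85, 1.60, 3.10])
actuals   = np.array([1.12, 2.45, 0.80, 1.75, 3.40])

errors = np.log(actuals / estimates)      # log error ratios
sigma2 = 0.05 ** 2                        # assumed known sampling variance

# Prior on the mean log error: centered at 0 (unbiased estimating), fairly vague.
mu0, tau0_sq = 0.0, 0.10 ** 2

# Conjugate normal update for the mean log error.
n = errors.size
tau_n_sq = 1.0 / (1.0 / tau0_sq + n / sigma2)
mu_n = tau_n_sq * (mu0 / tau0_sq + errors.sum() / sigma2)

# Risk statement for a new initial estimate of, say, $2.0M.
new_estimate = 2.0
expected_actual = new_estimate * np.exp(mu_n)
print(f"posterior mean log error = {mu_n:.4f}")
print(f"expected actual cost for a $2.0M estimate = {expected_actual:.2f} $M")
```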


Initial Weights in the PLS Algorithm for ACSI Based on SEM

  • Song, Mi-Jung;Lee, Ji-Yeon
    • Journal of the Korean Data and Information Science Society / v.17 no.1 / pp.173-185 / 2006
  • In this paper, we propose two methods for setting the initial weights in the PLS algorithm, which is employed to measure customer satisfaction in structural equation modeling (SEM). Using data from a student survey conducted with the ACSI questionnaire, we evaluate the education service in terms of the students' satisfaction level and compare the proposed methods with the previous method.
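
The abstract does not spell out the two proposed initializations, so the sketch below only shows two generic, commonly used choices for the outer weights of one block of manifest variables (equal weights and first-principal-component weights); it should not be read as the authors' methods:

```python
# Two generic ways to initialize the outer weights of a manifest-variable
# block in PLS path modeling. Illustrative only; the block data are random.
import numpy as np

def init_weights_ones(X):
    """Equal initial weights (a common default), normalized to unit length."""
    w = np.ones(X.shape[1])
    return w / np.linalg.norm(w)

def init_weights_first_pc(X):
    """Initial weights taken from the first principal component of the block."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return vt[0]            # unit-norm loading vector of the first PC

# Hypothetical block of 4 ACSI-style survey items answered by 30 students.
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 4))
print(init_weights_ones(X))
print(init_weights_first_pc(X))
```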


BLOW-UP PHENOMENA OF ARBITRARY POSITIVE INITIAL ENERGY SOLUTIONS FOR A VISCOELASTIC WAVE EQUATION WITH NONLINEAR DAMPING AND SOURCE TERMS

  • Yi, Su-Cheol
    • Journal of the Chungcheong Mathematical Society / v.35 no.2 / pp.137-147 / 2022
  • In this paper, we considered the Dirichlet initial boundary value problem of a nonlinear viscoelastic wave equation with nonlinear damping and source terms, and investigated finite time blow-up phenomena of the solutions to the equation with arbitrary positive initial data, under suitable conditions.
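
The abstract does not reproduce the equation; a representative problem of this class (memory kernel g, damping exponent m, source exponent p), written here only to fix notation and possibly differing from the exact setting of the paper, is

```latex
% Representative viscoelastic wave equation with nonlinear damping and source
% (the precise exponents and kernel assumptions used in the paper may differ):
\begin{aligned}
  u_{tt} - \Delta u + \int_0^{t} g(t-\tau)\,\Delta u(\tau)\,d\tau
    + |u_t|^{m-2}u_t &= |u|^{p-2}u, && x \in \Omega,\ t > 0,\\
  u(x,t) &= 0, && x \in \partial\Omega,\ t \ge 0,\\
  u(x,0) = u_0(x), \quad u_t(x,0) &= u_1(x), && x \in \Omega .
\end{aligned}
```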

Compatibility for the Typhoon Damages Predicted by Korea Risk Assessment Model Input Data (한국형 재해평가모형(RAM)의 초기입력자료 적합성 평가)

  • Park, Jong-Kil;Lee, Bo-Ram;Jung, Woo-Sik
    • Journal of Environmental Science International / v.24 no.7 / pp.865-874 / 2015
  • This study investigated the correlation between the distribution charts and the input data of the predicted 3-second gust and damage cost, using the forecast field and the analysis field of the Regional Data Assimilation Prediction System (RDAPS) as initial input data for the Korea risk assessment model (RAM) developed in a preceding study. The case of typhoon Rusa, which caused great damage to the Korean Peninsula, was analyzed to assess the suitability of the initial input data. The distribution charts obtained from the forecast field and from the analysis field, predicted from the point at which the typhoon's influence began, were similar for both the 3-second gust and the damage cost over the course of time. The correlation coefficient for the 3-second gust exceeded 0.8, which means that the forecast field and the analysis field give similar results. The study shows that utilizing the forecast field as initial input data for the Korea RAM can serve the purpose of pre-disaster prevention.
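
A minimal sketch of the correlation check mentioned above follows; the gust values are hypothetical placeholders standing in for 3-second gusts derived from the forecast field and from the analysis field:

```python
# Correlation between 3-second gusts driven by the forecast field and by the
# analysis field. The numbers are made up for illustration.
import numpy as np

gust_forecast = np.array([18.2, 22.5, 30.1, 27.4, 35.6, 40.2, 33.8])  # m/s
gust_analysis = np.array([17.9, 23.0, 29.5, 28.1, 36.4, 39.5, 34.2])  # m/s

r = np.corrcoef(gust_forecast, gust_analysis)[0, 1]
print(f"Pearson correlation = {r:.3f}")  # values above about 0.8 indicate the
                                         # two fields produce similar gusts
```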

Study of evaluation wind resource detailed area with complex terrain using combined MM5/CALMET system (고해상도 바람지도 구축 시스템에 관한 연구)

  • Lee, Hwa-Woon;Kim, Dong-Hyeuk;Kim, Min-Jung;Lee, Soon-Hwan;Park, Soon-Young;Kim, Hyun-Goo
    • The Korean Society for New and Renewable Energy: Conference Proceedings / 2008.10a / pp.274-277 / 2008
  • In this study, the prognostic MM5 mesoscale model was combined with the CALMET diagnostic model to evaluate high-resolution wind resources over local and coastal areas with complex terrain. First, MM5 was run on a nested fine domain at 1 km resolution, and FDDA with QuikSCAT SeaWinds data was employed to improve the initial meteorological fields. The wind field and other meteorological variables from all vertical levels of MM5 were used as the initial guess field for CALMET, and objective analysis against 5 surface stations and 1 radiosonde was performed over all cells of the domain. Initial and boundary conditions for the prognostic MM5 simulation were given by the 3-hourly RDAPS data of the KMA. High-resolution terrain elevation and land cover data (30 arc seconds) from USGS were used for the MM5 simulation, whereas SRTM 90 m terrain and EGIS 30 m land-use data were adopted for the CALMET diagnostic simulation. The simulation was performed for the whole of 2007. The hourly vertical wind fields from CALMET and the latest MM5 results were compared with wind profiler data (KEOP-2007 campaign) at the HAENAM site.
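
To make the diagnostic step more concrete, here is a hedged sketch of a single-pass Cressman-style objective analysis that blends a first-guess wind field (standing in for interpolated MM5 output) with a few surface observations; the grid, radius of influence, and all values are hypothetical, and this is far simpler than CALMET itself:

```python
# Single-pass Cressman-type objective analysis: first-guess field corrected
# by weighted observation increments. Everything here is illustrative.
import numpy as np

ny, nx = 50, 50
yy, xx = np.mgrid[0:ny, 0:nx]

# First-guess wind speed field (stand-in for the MM5 initial guess), in m/s.
guess = 6.0 + 0.02 * xx + 0.01 * yy

# Five hypothetical surface observations: (row, col, observed wind speed).
obs = [(10, 12, 7.5), (25, 30, 5.2), (40, 8, 6.8), (15, 44, 8.1), (45, 40, 5.9)]

R = 15.0                                  # radius of influence (grid units)
num = np.zeros_like(guess)
den = np.zeros_like(guess)
for oy, ox, val in obs:
    d2 = (yy - oy) ** 2 + (xx - ox) ** 2
    w = np.where(d2 < R ** 2, (R ** 2 - d2) / (R ** 2 + d2), 0.0)  # Cressman weight
    num += w * (val - guess[oy, ox])      # weighted observation increment
    den += w

increment = np.divide(num, den, out=np.zeros_like(num), where=den > 0)
analysis = guess + increment
print(analysis.shape, round(analysis.min(), 2), round(analysis.max(), 2))
```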


Pavement Performance Model Development Using Bayesian Algorithm (베이지안 기법을 활용한 공용성 모델개발 연구)

  • Mun, Sungho
    • International Journal of Highway Engineering / v.18 no.1 / pp.91-97 / 2016
  • PURPOSES : The objective of this paper is to develop a pavement performance model based on the Bayesian algorithm, and compare the measured and predicted performance data. METHODS : In this paper, several pavement types such as SMA (stone mastic asphalt), PSMA (polymer-modified stone mastic asphalt), PMA (polymer-modified asphalt), SBS (styrene-butadiene-styrene) modified asphalt, and DGA (dense-graded asphalt) are modeled in terms of the performance evaluation of pavement structures, using the Bayesian algorithm. RESULTS : From case studies related to the performance model development, the statistical parameters of the mean value and standard deviation can be obtained through the Bayesian algorithm, using the initial performance data of two different pavement cases. Furthermore, an accurate performance model can be developed, based on the comparison between the measured and predicted performance data. CONCLUSIONS : Based on the results of the case studies, it is concluded that the determined coefficients of the nonlinear performance models can be used to accurately predict the long-term performance behaviors of DGA and modified asphalt concrete pavements. In addition, the developed models were evaluated through comparison studies between the initial measurement and prediction data, as well as between the final measurement and prediction data. In the model development, the initial measured data were used.
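
A hedged sketch of the kind of Bayesian step described in the results (obtaining the mean and standard deviation of a performance indicator from initial measurements) is given below; the grid-based posterior, the flat prior, and the rut-depth numbers are illustrative assumptions, not the paper's data or model:

```python
# Grid-based Bayesian estimate of the mean and standard deviation of a
# performance indicator from early ("initial") measurements. Illustrative only.
import numpy as np

# Hypothetical initial performance data (e.g., rut depth in mm).
data = np.array([3.1, 3.8, 4.0, 3.5, 4.3, 3.9])

# Grid over candidate (mu, sigma) values with a flat prior.
mu_grid = np.linspace(2.0, 6.0, 201)
sigma_grid = np.linspace(0.1, 2.0, 191)
MU, SIGMA = np.meshgrid(mu_grid, sigma_grid, indexing="ij")

# Log-likelihood of the data under a normal model at every grid point.
loglik = -data.size * np.log(SIGMA) \
         - ((data[:, None, None] - MU) ** 2).sum(axis=0) / (2 * SIGMA ** 2)
post = np.exp(loglik - loglik.max())
post /= post.sum()

# Posterior means of the statistical parameters.
mu_hat = (post * MU).sum()
sigma_hat = (post * SIGMA).sum()
print(f"posterior mean of mu = {mu_hat:.2f} mm, of sigma = {sigma_hat:.2f} mm")
```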

Estimation of an initial image for fast fractal decoding (고속 프랙탈 영상 복원을 위한 초기 영상 추정)

  • 문용호;박태희;백광렬;김재호
    • The Journal of Korean Institute of Communications and Information Sciences / v.22 no.2 / pp.325-333 / 1997
  • In the fractal decoding procedure, the reconstructed image is obtained by iteratively applying the contractive transform to an arbitrary initial image. This method is not suitable for fast decoding, however, because the convergence speed depends on the choice of the initial image, so an initial image that enables fast decoding should be selected. In this paper, we propose an initial image estimation that can be applied to various decoding methods. An initial image similar to the original image is estimated using only the compressed data, so the proposed method does not affect the compression ratio. In simulations with the Barbara image, the PSNR of the proposed initial image is 6 dB higher than that of the once-iterated output image of conventional decoding, and the numbers of additions and multiplications are reduced by about 96%. Moreover, applying the proposed initial image to other decoding algorithms is expected to yield faster convergence.
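
The effect described above (decoding speed governed by how close the initial image is to the attractor) can be illustrated with a toy contractive map; the 1-D "image", the transform, and the tolerance below are assumptions for illustration and are not PIFS decoding of a real image:

```python
# Toy sketch: a contractive transform T is iterated until the signal stops
# changing; starting closer to the fixed point needs fewer iterations.
import numpy as np

rng = np.random.default_rng(2)
n = 256
perm = rng.permutation(n)            # stand-in for domain-to-range mapping
scale = 0.6                          # contractivity factor, |s| < 1
offset = rng.normal(size=n)          # stand-in for stored offsets

def T(x):
    return scale * x[perm] + offset

def decode(x0, tol=1e-4):
    x, iterations = x0, 0
    while True:
        x_new = T(x)
        iterations += 1
        if np.max(np.abs(x_new - x)) < tol:
            return x_new, iterations
        x = x_new

fixed, _ = decode(np.zeros(n))                       # reference fixed point
_, it_arbitrary = decode(np.full(n, 128.0))          # arbitrary flat start
_, it_estimated = decode(fixed + rng.normal(scale=0.5, size=n))  # close start
print(f"iterations from an arbitrary start: {it_arbitrary}")
print(f"iterations from an estimated start: {it_estimated}")
```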


A new fractal image decoding algorithm with fast convergence speed (고속 수렴 속도를 갖는 새로운 프랙탈 영상 복호화 알고리듬)

  • 유권열;문광석
    • Journal of the Korean Institute of Telematics and Electronics S / v.34S no.8 / pp.74-83 / 1997
  • In this paper, we propose a new fractal image decoding algorithm with fast convergence speed that uses data dependence and an improved initial image estimation. The conventional method for fractal image decoding requires high computational complexity in the decoding process because the iterated contractive transformations are applied to all range blocks. In the proposed method, the range of the reconstructed image is divided into a referenced range and a data-dependence region, and the computational complexity is reduced by applying the iterated contractive transformations to the referenced range only. The data-dependence region can be decoded by a single transformation once the referenced range has converged. In addition, a more exact initial image is estimated in all cases by using the bound() function, and an initial image closer to the fixed point is estimated by using range-block division estimation. Consequently, the convergence speed of the reconstructed image is improved, with a 40% reduction in computational complexity.


DIFFERENTIABILITY OF NEUTRAL STOCHASTIC DIFFERENTIAL EQUATIONS DRIVEN BY G-BROWNIAN MOTION WITH RESPECT TO THE INITIAL DATA

  • Zakaria Boumezbeur;Hacene Boutabia
    • Honam Mathematical Journal / v.45 no.3 / pp.433-456 / 2023
  • This paper deals with the differentiability of solutions of neutral stochastic differential equations with respect to the initial data in the G-framework. Since the initial data belong to the space BC([-r, 0]; ℝ^n) of bounded continuous ℝ^n-valued functions defined on [-r, 0] (r > 0), the derivative belongs to the Banach space 𝓛_BC(ℝ^n) of bounded linear operators from BC([-r, 0]; ℝ^n) to ℝ^n. We give the neutral stochastic differential equation satisfied by the derivative. In addition, we exhibit two examples confirming the accuracy of the obtained results.
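
The abstract does not reproduce the equation under study; a generic neutral stochastic functional differential equation driven by a G-Brownian motion B (with quadratic variation ⟨B⟩), written only to fix notation and possibly differing from the paper's exact setting, is

```latex
\begin{aligned}
  d\bigl[X(t) - D(X_t)\bigr] &= f(t, X_t)\,dt + g(t, X_t)\,d\langle B\rangle(t)
      + \sigma(t, X_t)\,dB(t), \qquad t \ge 0,\\
  X_0 &= \xi \in BC([-r, 0]; \mathbb{R}^n), \qquad
      X_t(\theta) := X(t+\theta), \quad \theta \in [-r, 0].
\end{aligned}
```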