• Title/Summary/Keyword: 비모수 모형

Search results: 395

The Comparative Study of Software Optimal Release Time Based on Log-Logistic Distribution (Log-Logistic 분포 모형에 근거한 소프트웨어 최적방출시기에 관한 비교연구)

  • Kim, Hee-Cheul
    • Journal of the Korea Society of Computer and Information
    • /
    • v.13 no.7
    • /
    • pp.1-9
    • /
    • 2008
  • This paper studies the decision problem of setting optimal release policies after testing a software system in the development phase and transferring it to the user. Because correcting or modifying the software may introduce new faults, infinite-failure non-homogeneous Poisson process models are presented, and optimal release policies are proposed for a life distribution based on the log-logistic distribution, which can capture the increasing/decreasing nature of the failure occurrence rate per fault. The paper discusses optimal software release policies that minimize the total average software cost of development and maintenance under the constraint of satisfying a software reliability requirement. In a numerical example, after a trend test is applied and the parameters are estimated by maximum likelihood from inter-failure time data, the optimal software release time is estimated.

  • PDF
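The cost-based release policy summarized in the abstract above can be sketched numerically: the release time T is chosen to minimize testing cost plus field-maintenance cost, given a mean value function m(t). All parameter values and the exact shape of m(t) below are illustrative assumptions, not figures from the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative cost coefficients and life-cycle length (assumed, not from the paper):
# c1 = cost per fault removed during testing, c2 = cost per fault found in the field,
# c3 = testing cost per unit time.
c1, c2, c3 = 10.0, 50.0, 0.5
T_lc = 500.0  # assumed software life-cycle length

def m(t, a=100.0, lam=0.05, k=2.0):
    """Assumed mean value function shaped like a scaled log-logistic CDF."""
    x = (lam * t) ** k
    return a * x / (1.0 + x)

def total_cost(T):
    """Testing cost up to release T plus field cost for faults surfacing after T."""
    return c1 * m(T) + c2 * (m(T_lc) - m(T)) + c3 * T

res = minimize_scalar(total_cost, bounds=(0.0, T_lc), method='bounded')
print(round(res.x, 1))  # estimated optimal release time under these assumptions
```

Because field fixes are assumed costlier than testing-phase fixes (c2 > c1) while testing time itself has a cost, the minimum lies strictly inside the life cycle.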

Application of Rainwater Harvesting System Reliability Model Based on Non-parametric Stochastic Daily Rainfall Generator to Haundae District of Busan (비모수적 추계학적 일 강우 발생기 기반의 빗물이용시설 신뢰도 평가모형의 부산광역시 해운대 신시가지 적용)

  • Choi, ChiHyun;Park, MooJong;Baek, ChunWoo;Kim, SangDan
    • Journal of Korean Society on Water Environment
    • /
    • v.27 no.5
    • /
    • pp.634-645
    • /
    • 2011
  • A newly developed rainwater harvesting (RWH) system reliability model is evaluated for the roof areas of buildings in Haeundae District, Busan. The RWH system is used to supply water for toilet flushing, back-garden irrigation, and air cooling. The model is portable because it is based on a non-parametric precipitation generation algorithm using a Markov chain. Precipitation occurrence is simulated using transition probabilities derived for each day of the year from the historical probability of wet- and dry-day state changes. Precipitation amounts are selected from a matrix of historical values within a moving 30-day window centered on the target day. The reliability of the RWH system is then determined over ranges of catchment area and tank volume using the synthetic precipitation data. The synthetic rainfall data reproduced the characteristics of precipitation in Busan well, and the reliabilities of the RWH system computed for each demand were high. Furthermore, for the study area using the RWH system, the reduction efficiencies for rooftop runoff input to the sewer system and for potable water demand were evaluated at 23% and 53%, respectively.
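The two-step generator described above (Markov-chain occurrence, then resampling of historical amounts from a 30-day window) can be sketched minimally. The transition probabilities and the window pool below are assumed stand-ins, not the Busan statistics used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-ins for the historical statistics the model derives:
# day-of-year transition probabilities and a 30-value pool of wet-day amounts
# from a moving 30-day window around each day (all values assumed here).
p_wet_given_wet = np.full(365, 0.55)
p_wet_given_dry = np.full(365, 0.20)
window_pool = rng.gamma(shape=0.8, scale=12.0, size=(365, 30))  # mm

def simulate_year():
    """One synthetic year: two-state Markov chain occurrence + resampled amounts."""
    rain = np.zeros(365)
    wet = False
    for d in range(365):
        p = p_wet_given_wet[d] if wet else p_wet_given_dry[d]
        wet = rng.random() < p
        if wet:
            rain[d] = rng.choice(window_pool[d])  # draw a historical amount
    return rain

series = simulate_year()
print(round(series.sum(), 1))  # total annual rainfall of the synthetic year
```

Because amounts are resampled rather than drawn from a fitted distribution, the generator stays non-parametric, which is what makes the model portable across sites.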

Application of a large-scale climate ensemble simulation data to evaluate the scale of extreme rainfall: The case of 2018 Hiroshima extreme-scale rainfall event (극한 호우의 규모 평가를 위한 대규모 기후 앙상블 자료의 적용: 2018년 히로시마 극한 호우의 사례)

  • Kim, Youngkyu;Son, Minwoo
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2022.05a
    • /
    • pp.290-290
    • /
    • 2022
  • This study applies extreme rainfall estimates derived from a large-scale climate ensemble simulation to assess the scale of a recently observed extreme rainfall event. The 2018 Hiroshima rainfall event corresponds to a return period of about 1,000 years at a duration of 24 hours, so its scale is difficult to evaluate using only observations collected over a short period. To address this, the d4PDF dataset, based on large-scale climate ensemble simulations, was used. This dataset provides 3,000 annual maximum rainfall values, from which 24-hour probable rainfalls for return periods from 10 to 1,000 years were estimated non-parametrically, without any statistical model or assumptions. The d4PDF probable rainfalls were compared with those derived from observed rainfall; at the 50-year return period, close to the length of the observation record, the difference between the two was 3.53%, but as the gap between the observation period (33 years) and the return period (100 years or more) widened, the error grew to more than 10%. This implies that probable rainfalls estimated from observations carry uncertainty at long return periods. Against the d4PDF probable rainfalls, the 2018 Hiroshima event showed a return period of nearly 300 years. Probable rainfalls were also estimated using d4PDF data under future climate conditions; for return periods from 10 to 1,000 years, they all increased by more than 20% relative to the present climate. Under future climate conditions, the 2018 Hiroshima event showed a return period of nearly 100 years, meaning that its occurrence probability increases from 0.33% (present climate) to 1% (future climate). Consequently, d4PDF, based on large-scale climate ensemble simulations, can be usefully applied to the quantitative assessment of extreme-scale rainfall events under both present and future climate conditions.

  • PDF
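The non-parametric estimation described above reduces to taking empirical quantiles of the 3,000 annual maxima: the T-year rainfall is the value with non-exceedance probability 1 − 1/T, and an event's return period is the reciprocal of its empirical exceedance rate. The synthetic data below are a stand-in for the d4PDF values, which are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-in for the 3,000 annual-maximum 24-hour rainfalls provided by d4PDF
# (Gumbel-distributed synthetic values; the real data are not reproduced here).
annual_maxima = rng.gumbel(loc=150.0, scale=40.0, size=3000)

def nonparametric_quantile(maxima, T):
    """Probable rainfall for return period T as the empirical quantile at
    non-exceedance probability 1 - 1/T, with no distributional assumption."""
    return float(np.quantile(maxima, 1.0 - 1.0 / T))

def empirical_return_period(maxima, event):
    """Return period of an observed event from the empirical exceedance rate."""
    exceed = float(np.mean(np.asarray(maxima) > event))
    return np.inf if exceed == 0.0 else 1.0 / exceed

for T in (10, 100, 1000):
    print(T, round(nonparametric_quantile(annual_maxima, T), 1))
```

With 3,000 values the 1,000-year quantile is still supported by roughly three exceedances, which is why the ensemble permits return periods far beyond the 33-year observation record.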

A Study on Risk Parity Asset Allocation Model with XGBoost (XGBoost를 활용한 리스크패리티 자산배분 모형에 관한 연구)

  • Kim, Younghoon;Choi, HeungSik;Kim, SunWoong
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.1
    • /
    • pp.135-149
    • /
    • 2020
  • Artificial intelligence is changing the world, and the financial market is no exception. Robo-advisors are actively being developed, making up for the weaknesses of traditional asset allocation methods and taking over the parts those methods handle poorly. They make automated investment decisions with artificial intelligence algorithms and are used with various asset allocation models such as the mean-variance model, the Black-Litterman model, and the risk parity model. The risk parity model is a typical risk-based asset allocation model focused on the volatility of assets. It avoids investment risk structurally, so it is stable for managing large funds and has been widely used in finance. XGBoost is a parallel tree-boosting method: an optimized gradient boosting model designed to be highly efficient and flexible. It not only handles billions of examples in limited-memory environments but also learns very fast compared to traditional boosting methods, and it is frequently used across many fields of data analysis. In this study, we therefore propose a new asset allocation model that combines the risk parity model with the XGBoost machine learning model. The model uses XGBoost to predict the risk of assets and applies the predicted risk to the covariance estimation process. Because an optimized asset allocation model estimates investment proportions from historical data, there are estimation errors between the estimation period and the actual investment period, and these errors adversely affect portfolio performance. This study aims to improve the stability and performance of the model by predicting the volatility of the next investment period and thereby reducing the estimation errors of the optimized asset allocation model. As a result, it narrows the gap between theory and practice and proposes a more advanced asset allocation model.
For the empirical test of the suggested model, we used Korean stock market price data covering a total of 17 years, from 2003 to 2019. The data sets comprise the energy, finance, IT, industrial, material, telecommunication, utility, consumer, health care, and staples sectors. We accumulated predictions using a moving-window method with 1,000 in-sample and 20 out-of-sample observations, producing a total of 154 rebalancing back-testing results. We analyzed portfolio performance in terms of cumulative rate of return, obtaining a large sample thanks to the long test period. Compared with the traditional risk parity model, the experiment recorded improvements in both cumulative yield and reduction of estimation errors. The total cumulative return is 45.748%, about 5% higher than that of the risk parity model, and the estimation errors are reduced in 9 out of 10 industry sectors. The reduction of estimation errors increases the stability of the model and makes it easier to apply in practical investment. Many financial and asset allocation models are limited in practical investment by the fundamental question of whether the past characteristics of assets will persist in a changing financial market. This study not only takes advantage of traditional asset allocation models but also supplements their limitations and increases stability by predicting asset risks with a state-of-the-art algorithm. While various studies have examined parametric estimation methods for reducing estimation errors in portfolio optimization, we suggest a new way to reduce them using machine learning.
This study is therefore meaningful in that it proposes an advanced artificial-intelligence asset allocation model for fast-developing financial markets.
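The allocation step that the predicted risks feed into can be sketched as follows: risk parity chooses long-only weights so that every asset contributes the same share of portfolio variance. The covariance matrix below is an assumed example; in the paper its variances would come from XGBoost volatility forecasts rather than fixed numbers.

```python
import numpy as np
from scipy.optimize import minimize

def risk_parity_weights(cov):
    """Long-only weights equalizing each asset's fractional risk contribution."""
    n = cov.shape[0]

    def objective(w):
        port_var = w @ cov @ w
        frac_rc = w * (cov @ w) / port_var   # fractional risk contributions, sum to 1
        return np.sum((frac_rc - 1.0 / n) ** 2)

    res = minimize(objective, np.full(n, 1.0 / n),
                   bounds=[(0.0, 1.0)] * n,
                   constraints=({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},))
    return res.x

# Assumed 2-asset covariance (annualized variances 0.04 and 0.01); the paper
# would substitute XGBoost-predicted volatilities into this matrix.
cov = np.array([[0.04, 0.006],
                [0.006, 0.01]])
w = risk_parity_weights(cov)
print(w)  # the higher-volatility asset receives the smaller weight
```

For two assets, equal risk contributions imply weights inversely proportional to volatility (here 1/3 and 2/3), regardless of the correlation term.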

Analyzing the Efficiency of Defense Basic Research Projects using DEA (자료포락분석(DEA)을 활용한 국방 기초연구개발 사업의 효율성 분석)

  • Lim, Yong-Hwan
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.21 no.7
    • /
    • pp.517-524
    • /
    • 2020
  • In line with the recent wave of the 4th Industrial Revolution, the defense R&D environment is transforming around high-tech military technology. In particular, developed countries are strengthening controls on technology exports and transfers to protect advanced defense science and technology. For this reason, budget demand for the capability to independently develop high-tech weapons and core technologies suited to the future battlefield environment is increasing, and improving the efficiency of R&D investment has been highlighted as a way to distribute a limited budget effectively. This study examined the efficiency of defense basic R&D projects using DEA, a non-parametric approach. The R&D budget, number of researchers, and R&D period were selected as input variables, and the numbers of papers and patents were used as output variables. The efficiency of the basic R&D projects was analyzed through the CCR and BCC models and super-efficiency (SE). Finally, based on the efficiency measurements, the causes of inefficiency in the R&D projects were identified and ways to improve efficiency were suggested. This study is expected to serve as useful information that can be applied to project performance management through efficiency analysis of basic defense R&D projects and reflected in the project planning stage through feedback.
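The CCR model mentioned above is a linear program: for each decision-making unit it finds the smallest factor θ by which the unit's inputs could be scaled while a convex combination of peers still matches its outputs. A minimal input-oriented sketch, with hypothetical project data rather than the paper's:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency score of decision-making unit o.
    X: (n_dmu, n_inputs) input matrix, Y: (n_dmu, n_outputs) output matrix."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.zeros(1 + n)
    c[0] = 1.0
    # Inputs:  sum_j lambda_j * x_ij <= theta * x_io
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # Outputs: sum_j lambda_j * y_rj >= y_ro  (rewritten as <=)
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[o]]),
                  bounds=[(0.0, None)] * (1 + n))
    return res.fun

# Hypothetical data: 3 projects, inputs = (budget, researchers), output = papers.
X = np.array([[2.0, 1.0], [4.0, 2.0], [3.0, 3.0]])
Y = np.array([[1.0], [1.0], [1.0]])
print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])  # [1.0, 0.5, 0.667]
```

A score of 1.0 marks a project on the efficient frontier; a score of 0.5 means the same output is attainable with half the inputs.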

Generalized kernel estimating equation for panel estimation of small area unemployment rates (소지역 실업률의 패널추정을 위한 일반화커널추정방정식)

  • Shim, Jooyong;Kim, Youngwon;Hwang, Changha
    • Journal of the Korean Data and Information Science Society
    • /
    • v.24 no.6
    • /
    • pp.1199-1210
    • /
    • 2013
  • The high unemployment rate is one of the major problems in most countries nowadays, so demand for small-area labor statistics has increased rapidly over the past few years. However, since sample surveys for producing official statistics are mainly designed for large areas, it is difficult to produce reliable statistics at the small-area level due to small sample sizes. Most existing studies on small-area estimation concern the estimation of parameters from cross-sectional data. However, since many official statistics are collected repeatedly at regular intervals, for instance monthly, quarterly, or yearly, an alternative model is needed that can handle this type of panel data. In this paper, we derive a generalized kernel estimating equation that can model time dependency among response variables and handle repeated-measurement or panel data. We compare the proposed estimating equation with the generalized linear model and the generalized estimating equation through simulation, and apply it to estimating the unemployment rates of 25 areas in Gyeongsangnam-do and Ulsan for 2005.

Identification of Uncertainty on the Reduction of Dead Storage in Soyang Dam Using Bayesian Stochastic Reliability Analysis (Bayesian 추계학적 신뢰도 기법을 이용한 소양강댐 퇴사용량 감소의 불확실성 분석)

  • Lee, Cheol-Eung;Kim, Sang Ug
    • Journal of Korea Water Resources Association
    • /
    • v.46 no.3
    • /
    • pp.315-326
    • /
    • 2013
  • Despite the importance of maintaining reservoir storage, relatively few studies have addressed stochastic reliability analysis, including uncertainty, of the decrease in reservoir storage caused by sedimentation. Therefore, in this paper a stochastic gamma process is developed within a reliability framework and applied to estimate the reduction of the Soyang Dam reservoir storage. In particular, when estimating the parameters of the stochastic gamma process, a Bayesian MCMC scheme with an informative prior distribution is used to incorporate a wide variety of information related to sedimentation. The results show that the selected informative prior is reasonable, because the uncertainty of the posterior distribution is reduced considerably compared with that of the prior. The expected lifetime of the dead storage in the Soyang Dam reservoir, including uncertainty, is estimated to range from 119.3 to 183.5 years at the 5% significance level. Finally, it is suggested that the assessment strategy improved in this study can provide valuable information to the decision makers in charge of reservoir maintenance.
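The gamma process underlying the analysis above models cumulative sedimentation as a sum of independent gamma-distributed yearly increments; the dead-storage lifetime is the first-passage time over a capacity threshold. A Monte Carlo sketch with assumed parameters (not the fitted Soyang Dam values, and without the Bayesian estimation step):

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed gamma-process parameters: yearly sedimentation increments follow
# Gamma(shape=0.5, scale=2.0), i.e. on average 1 unit of dead storage is
# consumed per year; THRESHOLD is the assumed dead-storage capacity.
ALPHA, SCALE, THRESHOLD = 0.5, 2.0, 100.0

def simulate_lifetime(t_max=1000):
    """First-passage time (years) of cumulative sedimentation over the threshold."""
    x = 0.0
    for t in range(1, t_max + 1):
        x += rng.gamma(ALPHA, SCALE)   # one year's random sediment deposit
        if x >= THRESHOLD:
            return t
    return t_max

lifetimes = [simulate_lifetime() for _ in range(2000)]
print(round(float(np.mean(lifetimes)), 1))  # Monte Carlo expected lifetime
```

In the paper the increment parameters carry posterior uncertainty from the Bayesian MCMC fit, which is what widens the point estimate into the 119.3 to 183.5 year interval.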

Comparative Study on the Estimation Methods of Traffic Crashes: Empirical Bayes Estimate vs. Observed Crash (교통사고 추정방법 비교 연구: 경험적 베이즈 추정치 vs. 관측교통사고건수)

  • Shin, Kangwon
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.30 no.5D
    • /
    • pp.453-459
    • /
    • 2010
  • In traffic safety research, it is of the utmost importance to obtain reliable estimates of the expected crashes at a site (or on a segment). Observed crashes have mainly been used as the estimate of expected crashes in Korea, while empirical Bayes (EB) estimates based on the Poisson-gamma mixture model have been used in the USA and several European countries. Although numerous studies have used the EB method for estimating expected crashes and/or the effectiveness of safety countermeasures, no past study has examined the difference in estimation errors between the two estimates. This study therefore compares the estimation errors of the two estimates using a Monte Carlo simulation. By analyzing the crash data set of 3,000,000 simulated sites, this study reveals that the estimation errors of the EB estimates are always smaller than those of the observed crashes. Hence, it is imperative to incorporate the EB method into Korea's traffic safety research guidelines. However, the results show that the difference in estimation errors between the two estimates decreases as the uncertainty of the prior distribution increases. Consequently, it is recommended that the EB method be used with reliable hyper-parameter estimates, after a comprehensive examination of the estimated negative binomial model.
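Under the Poisson-gamma mixture discussed above, the EB estimate is a weighted blend of the model prediction and the observed count, with the weight set by the negative binomial dispersion. A minimal sketch; the numeric inputs are illustrative, not from the paper's simulation.

```python
def eb_expected_crashes(mu, observed, k):
    """Empirical Bayes estimate of expected crashes under a Poisson-gamma
    (negative binomial) model: blends the safety-performance-function
    prediction mu with the observed count using weight w = k / (k + mu),
    where k is the NB inverse-dispersion parameter."""
    w = 1.0 / (1.0 + mu / k)
    return w * mu + (1.0 - w) * observed

# Illustrative numbers (not from the paper): the model predicts 2 crashes,
# 5 were observed, and the fitted NB inverse dispersion is 1.5.
print(eb_expected_crashes(2.0, 5.0, 1.5))  # ≈ 3.714, pulled toward the prediction
```

As the dispersion parameter k grows (a tighter prior), the weight on the model prediction approaches 1, which mirrors the paper's finding that the EB advantage shrinks as prior uncertainty increases.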

Analysis of the Efficiency of Gyeonggi-do Senior Welfare Centers by DEA Model (DEA를 이용한 경기도 노인복지관 효율성 분석)

  • Kim, Keum Hwan;Pak, Ae Kyung;Ryu, Seo Hyun;Lee, Nam Sik
    • Asia-Pacific Journal of Business Venturing and Entrepreneurship
    • /
    • v.8 no.3
    • /
    • pp.165-177
    • /
    • 2013
  • The purpose of this study was to examine the efficiency of senior welfare centers, the causes of the differences in efficiency among them, and the influential factors behind those differences and the size of their influence. Methods for assessing the efficiency of senior welfare centers in light of their circumstances were reviewed, and post-hoc analyses were made using data envelopment analysis (DEA) and the modified DEA/AP (super-efficiency) model, which are useful tools for evaluating relative efficiency. Twenty senior welfare centers located in Gyeonggi-do were selected, and their yearly operating data for 2009 were utilized. The evaluation data released by the Gyeonggi Welfare Foundation were analyzed by DEA, a nonparametric method, and it was possible to obtain significant results on the regional operating efficiency of social welfare centers in 14 metropolitan cities and provinces, the causes and degree of their inefficiency, and which reference areas to consult. Since county-level data were used, it is not quite possible to produce accurate results on the relative efficiency of individual senior welfare centers, but this study is significant in that it suggested how to evaluate the overall operating efficiency of senior welfare centers in the counties involved, including the degree of operating inefficiency, the improvements to be made, and the possible reference groups, and provided information on the usefulness of the DEA model.

  • PDF

A Development of Traffic Queue Length Measuring Algorithm Using ILD(Inductive Loop Detector) Based on COSMOS (실시간 신호제어시스템의 대기길이 추정 알고리즘 개발)

  • Seong, Ki-Ju;Lee, Choul-Ki;Jeong, Jun-Ha;Lee, Young-In;Park, Dae-Hyun
    • The Journal of The Korea Institute of Intelligent Transport Systems
    • /
    • v.3 no.1 s.4
    • /
    • pp.85-96
    • /
    • 2004
  • The study begins from a basic premise: that the occupancy time of a vehicle detector is directly proportional to vehicle delay. That is, vehicle delay is inferred from occupancy time. The results of the study were far superior for estimating queue length. A notable advantage is that the operator does not need to optimize s1, s2, and Thdoc; Thdoc (the critical congestion degree) was changed from 0.7 to 0.2-0.3. However, when vehicles that have experienced delay do not occupy the vehicle detector, the method still has some problems. In conclusion, it is necessary to lengthen the queue detector or install paired queue detectors. A follow-up study on traffic signal control under congestion is also required.

  • PDF