• Title/Summary/Keyword: 결정성 검증 (determinacy verification)

A Study on Risk Parity Asset Allocation Model with XGBoost (XGBoost를 활용한 리스크패리티 자산배분 모형에 관한 연구)

  • Kim, Younghoon;Choi, HeungSik;Kim, SunWoong
    • Journal of Intelligence and Information Systems / v.26 no.1 / pp.135-149 / 2020
  • Artificial intelligence is changing the world, and the financial market is no exception. Robo-advisors are being actively developed, making up for the weaknesses of traditional asset allocation methods and replacing the parts those methods handle poorly. They make automated investment decisions with artificial intelligence algorithms and are used with various asset allocation models such as the mean-variance model, the Black-Litterman model, and the risk parity model. The risk parity model is a typical risk-based asset allocation model focused on the volatility of assets; it avoids investment risk structurally, so it is stable for managing large funds and has been widely used in finance. XGBoost is a parallel tree-boosting method: an optimized gradient boosting model designed to be highly efficient and flexible. It can handle billions of examples in limited-memory environments, learns much faster than traditional boosting methods, and is frequently used across many fields of data analysis. In this study, we therefore propose a new asset allocation model that combines the risk parity model with the XGBoost machine learning model. The model uses XGBoost to predict the risk of assets and applies the predicted risk to the covariance estimation process. Because an optimized asset allocation model estimates investment proportions from historical data, estimation errors arise between the estimation period and the actual investment period, and these errors degrade the optimized portfolio's performance. This study aims to improve the stability and performance of the model by predicting the volatility of the next investment period, thereby reducing the estimation errors of the optimized asset allocation model. As a result, it narrows the gap between theory and practice and proposes a more advanced asset allocation model.
For the empirical test of the suggested model, we used Korean stock market price data covering a total of 17 years, from 2003 to 2019. The data sets cover the energy, finance, IT, industrial, material, telecommunication, utility, consumer, health care, and staples sectors. We accumulated predictions with a moving-window method (1,000 in-sample and 20 out-of-sample observations), producing a total of 154 rebalancing back-testing results. We analyzed portfolio performance in terms of cumulative rate of return, with the long test period providing a large sample. Compared with the traditional risk parity model, the experiment recorded improvements in both cumulative yield and estimation error: the total cumulative return is 45.748%, about 5% higher than that of the risk parity model, and the estimation errors are reduced in 9 out of 10 industry sectors. Reducing estimation errors increases the stability of the model and makes it easier to apply in practical investment. Many financial and asset allocation models are of limited use in practical investment because of the fundamental question of whether the past characteristics of assets will persist in a changing financial market. This study, however, not only takes advantage of traditional asset allocation models but also supplements their limitations and increases stability by predicting asset risks with a state-of-the-art algorithm. While various studies examine parametric estimation methods for reducing estimation errors in portfolio optimization, we suggest a new way to reduce them using machine learning.
This study is therefore meaningful in that it proposes an advanced artificial intelligence asset allocation model for fast-developing financial markets.
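The covariance step described above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the paper's actual pipeline: the function names are hypothetical, and in the paper's setting a trained XGBoost regressor would supply the predicted volatilities, which here are passed in directly.

```python
import numpy as np

def covariance_with_predicted_vols(returns, predicted_vols):
    """Keep the historical correlation structure but replace the historical
    standard deviations with predicted ones (e.g. from an XGBoost model)."""
    corr = np.corrcoef(returns, rowvar=False)  # assets in columns
    d = np.diag(predicted_vols)
    return d @ corr @ d

def inverse_volatility_weights(predicted_vols):
    """Naive risk-parity weights: allocate inversely to predicted volatility
    so each asset contributes a comparable amount of risk."""
    inv = 1.0 / np.asarray(predicted_vols, dtype=float)
    return inv / inv.sum()
```

In a moving-window backtest, these two steps would be repeated for each 20-day out-of-sample window before rebalancing.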

Study on Effective Preservation of Bovine Pericardium Using Decellularization and α-galactosidase for Eliminating Xenoreactive Antigen (이종 항원 제거를 위한 무세포화와 알파-갈락토시다아제를 이용한 효과적인 우심낭 보존 방법에 관한 연구)

  • Kim, Min-Seok;Park, Cham-Jin;Kim, Soo-Hwan;Lim, Hong-Gook;Kim, Yong-Jin
    • Journal of Chest Surgery / v.43 no.6 / pp.576-587 / 2010
  • Background: An effective decellularization and fixation process is critical if xenogeneic valves are to be used clinically. In the present study, we decellularized bovine pericardium using sodium dodecyl sulfate (SDS) and N-lauroyl sarcosinate, treated it with α-galactosidase, and then fixed it in various ways to find the most effective tissue preservation and fixation procedure. Material and Method: Bovine pericardium was decellularized with SDS and N-lauroyl sarcosinate and treated with α-galactosidase. The groups were then fixed differently by varying the glutaraldehyde (GA) or EDC (1-ethyl-3-(3-dimethylaminopropyl)carbodiimide)/N-hydroxysuccinimide (NHS) treatment conditions. Thereafter, physical examination, tensile strength testing, thermal stability testing, cytotoxicity testing, the pronase test, the pronase-ninhydrin test, the purpald test, permeability testing, compliance testing, H&E staining, DNA quantification, and α-galactose staining were carried out on each group. Result: GA-fixed groups showed better physical properties and thermal stability than EDC/NHS-fixed groups; EDC/NHS-GA dual-fixed groups showed better physical properties and thermal stability than EDC/NHS-fixed groups and better thermal stability than GA-fixed groups. In the pronase and pronase-ninhydrin tests, GA-fixed and EDC/NHS-GA dual-fixed groups showed stronger crosslinks than EDC/NHS groups. Permeability and compliance tended to increase in EDC/NHS-GA dual-fixed groups compared to GA-fixed groups, but the dual-fixed groups also had higher tensile strength and lower cytotoxicity than GA-fixed groups. Conclusion: We verified that EDC/NHS-GA dual fixation can create effective crosslinks and lower the toxicity of GA fixation. We will next verify whether EDC/NHS-GA dual fixation can reduce calcification and tissue failure in an in vivo experiment.

Social Network-based Hybrid Collaborative Filtering using Genetic Algorithms (유전자 알고리즘을 활용한 소셜네트워크 기반 하이브리드 협업필터링)

  • Noh, Heeryong;Choi, Seulbi;Ahn, Hyunchul
    • Journal of Intelligence and Information Systems / v.23 no.2 / pp.19-38 / 2017
  • Collaborative filtering (CF) algorithms have been popularly used for implementing recommender systems, and many prior studies have sought to improve the accuracy of CF. Among them, some recent studies adopt a 'hybrid recommendation approach', which enhances conventional CF by using additional information. In this research, we propose a new hybrid recommender system that fuses CF with the results of social network analysis on trust and distrust relationship networks among users, in order to enhance prediction accuracy. The proposed algorithm is based on memory-based CF, but when calculating the similarity between users, it considers not only the correlation of the users' numeric rating patterns but also the users' in-degree centrality values derived from the trust and distrust relationship networks. Specifically, it amplifies the similarity between a target user and a neighbor when the neighbor has higher in-degree centrality in the trust network, and attenuates the similarity when the neighbor has higher in-degree centrality in the distrust network. The algorithm considers four types of user relationships in total - direct trust, indirect trust, direct distrust, and indirect distrust - and uses four adjusting coefficients that control the level of amplification or attenuation for the in-degree centrality values derived from each network. To determine the optimal adjusting coefficients, genetic algorithms (GA) are adopted; accordingly, we named the proposed algorithm SNACF-GA (Social Network Analysis-based CF using GA). To validate the performance of SNACF-GA, we used a real-world data set called the 'Extended Epinions dataset', provided by trustlet.org.
This data set contains user responses (rating scores and reviews) after purchasing specific items (e.g., cars, movies, music, books), as well as trust/distrust relationship information indicating whom each user trusts or distrusts. The experimental system was developed mainly in Microsoft Visual Basic for Applications (VBA); we also used UCINET 6 to calculate the in-degree centrality of the trust/distrust networks, and Palisade Software's Evolver, a commercial package that implements genetic algorithms. To examine the effectiveness of the proposed system more precisely, we adopted two comparison models. The first is conventional CF, which uses only users' explicit numeric ratings when calculating similarities between users and does not consider trust/distrust relationships at all. The second is SNACF (Social Network Analysis-based CF), which differs from SNACF-GA in that it considers only direct trust/distrust relationships and does not use GA optimization. The performance of the proposed algorithm and the comparison models was evaluated using average MAE (mean absolute error). The experiment showed that the optimal adjusting coefficients for direct trust, indirect trust, direct distrust, and indirect distrust were 0, 1.4287, 1.5, and 0.4615, respectively, which implies that distrust relationships between users are more important than trust relationships in recommender systems. In terms of recommendation accuracy, SNACF-GA (avg. MAE = 0.111943), which reflects both direct and indirect trust/distrust relationship information, was found to outperform conventional CF (avg. MAE = 0.112638) and also showed better accuracy than SNACF (avg. MAE = 0.112209). To confirm whether these differences are statistically significant, we applied paired-samples t-tests.
The paired-samples t-tests showed that the difference between SNACF-GA and conventional CF was statistically significant at the 1% level, and the difference between SNACF-GA and SNACF at the 5% level. Our study found that trust/distrust relationships can be important information for improving the performance of recommendation algorithms. In particular, distrust relationship information had a greater impact on the performance improvement of CF, which implies that we need to pay more attention to distrust (negative) relationships than to trust (positive) ones when tracking and managing social relationships between users.
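The centrality-based adjustment described above can be sketched in Python. This is a hedged reconstruction assuming a simple multiplicative form; the paper's exact formula may differ, and `alpha` and `beta` stand in for the four GA-tuned adjusting coefficients.

```python
import numpy as np

def in_degree_centrality(adj):
    """In-degree centrality of a directed network: for node j, the share of
    the other n-1 nodes with an incoming edge adj[i][j] == 1."""
    adj = np.asarray(adj, dtype=float)
    return adj.sum(axis=0) / (adj.shape[0] - 1)

def adjusted_similarity(base_sim, trust_c, distrust_c, alpha=1.0, beta=1.0):
    """Amplify a rating-based similarity by the neighbor's trust in-degree
    centrality and attenuate it by the distrust in-degree centrality."""
    return base_sim * (1.0 + alpha * trust_c) / (1.0 + beta * distrust_c)
```

In the paper's setting, one such pair of coefficients would exist for each of the four networks (direct/indirect trust and distrust), with the GA searching over their values to minimize MAE.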

Calculation of Surface Broadband Emissivity by Multiple Linear Regression Model (다중선형회귀모형에 의한 지표면 광대역 방출율 산출)

  • Jo, Eun-Su;Lee, Kyu-Tae;Jung, Hyun-Seok;Kim, Bu-Yo;Zo, Il-Sung
    • Journal of the Korean earth science society / v.38 no.4 / pp.269-282 / 2017
  • In this study, the surface broadband emissivity (3.0-14.0 μm) was calculated using a multiple linear regression model with narrow-band (channels 29, 30, and 31) emissivity data from the Moderate Resolution Imaging Spectroradiometer (MODIS) on the Earth Observing System Terra satellite. Spectral emissivity data for 307 surface types (123 soil types, 32 vegetation types, 19 types of water bodies, 43 manmade materials, and 90 rock types) from the MODIS University of California Santa Barbara emissivity library and the Advanced Spaceborne Thermal Emission and Reflection Radiometer spectral library were used to derive and verify the multiple linear regression model. The derived coefficient of determination (R²) of the model was high, at 0.95 (p < 0.001), and the root mean square error between the model-calculated and theoretical broadband emissivities was 0.0070. The surface broadband emissivity from our model was comparable with that of Wang et al. (2005): the root mean square error between the surface broadband emissivities calculated by the two models during January was 0.0054 over the Asia, Africa, and Oceania regions, and the minimum and maximum differences between the two model results were 0.0027 and 0.0067, respectively. Similar statistics were obtained for August. The surface broadband emissivities from our multiple linear regression model can thus be considered acceptable. However, regression models tailored to different land covers need to be applied for more accurate calculation of surface broadband emissivities.
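The regression step can be illustrated with a short least-squares sketch: broadband emissivity is fitted as a linear combination of three narrow-band emissivities plus an intercept. The data below are synthetic and the function names hypothetical; the study's actual coefficients come from the 307 library spectra.

```python
import numpy as np

def fit_broadband_model(narrow, broad):
    """Ordinary least-squares fit of broadband emissivity as an intercept
    plus a linear combination of narrow-band (channel) emissivities."""
    X = np.column_stack([np.ones(len(narrow)), np.asarray(narrow)])
    coef, *_ = np.linalg.lstsq(X, broad, rcond=None)
    return coef  # [intercept, b29, b30, b31]

def predict_broadband(coef, narrow):
    """Apply the fitted regression to new narrow-band emissivities."""
    X = np.column_stack([np.ones(len(narrow)), np.asarray(narrow)])
    return X @ coef
```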

A Study on Shaker's Free Design from Fashion (유행(流行)으로부터 자유로운 세이커(Shaker) 디자인에 대한 고찰)

  • Choi, Sung-Woon;Huh, Jin
    • Archives of design research / v.20 no.3 s.71 / pp.279-288 / 2007
  • Today, design is not free from fashion, which emerges and vanishes temporarily and aims at equalization. As a result, products quickly become obsolete because of fashion, which means that the lifespan of products is determined by an unexamined social convention regardless of their function. Unless we find countermeasures against fashion, usable products will gradually disappear, causing serious environmental problems. In this circumstance, it is important to study the characteristics of Shaker design. The Shaker community differs markedly from other societies: temporary fashion and misleading information cannot interfere with its consciousness, and religion, daily life, and design principles have developed together on the same level. In particular, decoration and differences of materials are not allowed in Shaker design, reflecting the belief that all people are equal in the sight of God; decoration as a mark of social or economic superiority therefore cannot be used. Through this consciousness, the Shakers remain free from fashion and decoration, and they believe they can reach perfection through practicality and simplicity. The reason Shaker design is not disturbed by fashion is that their belief is embodied in their design. Consequently, if religious or conscious content is established first, design can be free from fashion and products can be used for a long time.

Estimation of Soil Surface Temperature by Heat Flux in Soil (Heat flux를 이용한 토양 표면 온도 예측)

  • Hur, Seung-Oh;Kim, Won-Tae;Jung, Kang-Ho;Ha, Sang-Keon
    • Korean Journal of Soil Science and Fertilizer / v.37 no.3 / pp.131-135 / 2004
  • This study analyzed the temperature characteristics of the soil surface using soil heat flux, one of the important parameters that determine soil temperature. Soil surface temperature was estimated using the soil temperature measured at 10 cm depth and the soil heat flux measured by a flux plate at 5 cm depth. There was a time lag of two hours between soil temperature and soil heat flux, and temperature changes over time showed a positive correlation with soil heat flux. Soil surface temperature was estimated by an equation obtained with a variable separation method. To check accuracy, the arithmetic mean of the temperatures measured at the soil surface and at 10 cm depth was compared with the soil temperature measured at 5 cm depth, and the regression model was validated with an F-test. The derived regression model was judged useful because the significance probability of the F-value was smaller than 0.001 and the coefficient of determination was 0.968. It can be concluded that the surface soil temperatures estimated by the variable separation method were almost equal to the measured surface soil temperatures.

Backward Path Tracking Control of a Trailer Type Robot Using a RCGA-Based Model (RCGA 기반의 모델을 이용한 트레일러형 로봇의 후방경로 추종제어)

  • Wi, Yong-Uk;Kim, Heon-Hui;Ha, Yun-Su;Jin, Gang-Gyu
    • Journal of Institute of Control, Robotics and Systems / v.7 no.9 / pp.717-722 / 2001
  • This paper presents a methodology for the backward path tracking control of a trailer-type robot consisting of two parts: a tractor and a trailer. It is difficult to control the motion of a trailer vehicle because its dynamics are non-holonomic. This paper therefore proposes modeling and parameter estimation of the system using a real-coded genetic algorithm (RCGA); a backward path tracking control algorithm is then derived from the linearized model. Experimental results verify the effectiveness of the proposed method.
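A real-coded GA of the kind used for the parameter estimation can be sketched generically. The operators below (tournament selection, blend crossover, Gaussian mutation) are common RCGA choices and are assumptions here, not the authors' exact configuration; in the paper's setting, `loss` would be the error between the model's response and the measured vehicle response.

```python
import random

def rcga_minimize(loss, bounds, pop_size=30, generations=60, seed=0):
    """Minimal real-coded GA: tournament selection, blend (arithmetic)
    crossover, Gaussian mutation, and elitist tracking of the best point."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    best = min(pop, key=loss)
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            # Two binary tournaments pick the parents.
            a = min(rng.sample(pop, 2), key=loss)
            b = min(rng.sample(pop, 2), key=loss)
            w = rng.random()
            child = [w * x + (1 - w) * y for x, y in zip(a, b)]
            # Gaussian mutation, clipped back into the bounds.
            child = [min(max(x + rng.gauss(0, 0.05 * (hi - lo)), lo), hi)
                     for x, (lo, hi) in zip(child, bounds)]
            new_pop.append(child)
        pop = new_pop
        cand = min(pop, key=loss)
        if loss(cand) < loss(best):
            best = cand
    return best
```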

Managing Duplicate Memberships of Websites : An Approach of Social Network Analysis (웹사이트 중복회원 관리 : 소셜 네트워크 분석 접근)

  • Kang, Eun-Young;Kwahk, Kee-Young
    • Journal of Intelligence and Information Systems / v.17 no.1 / pp.153-169 / 2011
  • Today, the Internet is considered absolutely essential for establishing a corporate marketing strategy. Companies promote their products and services through various on-line marketing activities, such as providing gifts and points to customers in exchange for participating in events, based on customer membership data. Since companies can use these membership data to enhance their marketing through various analyses, appropriate website membership management can play an important role in increasing the effectiveness of on-line marketing campaigns. Despite the growing interest in proper membership management, however, it has been difficult to identify inappropriate members who weaken on-line marketing effectiveness. In the on-line environment, customers tend not to reveal themselves as clearly as in off-line markets, and customers with malicious intent can create duplicate IDs by illegally using others' names or faking login information when joining. Since duplicate members are likely to intercept gifts and points that should go to the customers who deserve them, marketing efforts can become ineffective. Considering that the number of website members and the related marketing costs are increasing significantly, companies need efficient ways to screen out these duplicate members. With this motivation, this study proposes an approach for managing duplicate memberships based on social network analysis and verifies its effectiveness using membership data gathered from real websites. A social network is a social structure made up of actors, called nodes, tied by one or more specific types of interdependency; it represents the relationships between nodes and shows their direction and strength.
Various analytical techniques have been proposed for such social relationships, including centrality analysis, structural hole analysis, and structural equivalence analysis. Component analysis, one of these techniques, deals with the sub-networks that form meaningful groups within the overall connection structure. We propose a method for managing duplicate memberships using component analysis, as follows. The first step is to identify the membership attributes that will be used for analyzing relationship patterns among memberships, such as ID, telephone number, address, posting time, and IP address. The second step is to compose social matrices based on the identified attributes and aggregate their values into a combined social matrix, which represents how strongly pairs of nodes are connected; when a pair of nodes is strongly connected, those nodes are likely to be duplicate memberships. The combined social matrix is transformed into a binary matrix of '0' and '1' cell values using a relationship criterion that determines whether a membership is duplicate. The third step is to conduct a component analysis of the combined matrix to identify component nodes and isolated nodes. Fourth, the number of real memberships is identified and the reliability of the website's membership is calculated from the component analysis results. The proposed procedure was applied to three real websites operated by a pharmaceutical company, and the empirical results showed that the proposed method was superior to the traditional database approach of simple address comparison. In conclusion, this study is expected to shed some light on how social network analysis can support reliable on-line marketing by efficiently and effectively identifying duplicate website memberships.
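The component step can be sketched as a plain graph traversal over the binary combined matrix. This is an illustrative reconstruction with a hypothetical function name: each resulting component is treated as one real member, so the number of components estimates the number of real memberships.

```python
def connected_components(binary_matrix):
    """Group nodes of the binary combined social matrix into components:
    any two nodes joined by a chain of '1' cells (in either direction)
    are treated as the same, possibly duplicate, member."""
    n = len(binary_matrix)
    seen, components = set(), []
    for start in range(n):
        if start in seen:
            continue
        stack, comp = [start], []
        seen.add(start)
        while stack:
            node = stack.pop()
            comp.append(node)
            for other in range(n):
                if other not in seen and (
                        binary_matrix[node][other] or binary_matrix[other][node]):
                    seen.add(other)
                    stack.append(other)
        components.append(sorted(comp))
    return components
```

`len(connected_components(m))` then gives the estimated number of real members, and comparing it with the raw member count gives a simple reliability figure.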

Usefulness assessment of secondary shield for the lens exposure dose reduction during radiation treatment of peripheral orbit (안와 주변 방사선 치료 시 수정체 피폭선량 감소를 위한 2차 차폐의 유용성 평가)

  • Kwak, Yong Kuk;Hong, Sun Gi;Ha, Min Yong;Park, Jang Pil;Yoo, Sook Hyun;Cho, Woong
    • The Journal of Korean Society for Radiation Therapy / v.27 no.1 / pp.87-95 / 2015
  • Purpose: This study assesses the usefulness of secondary shielding for reducing the lens exposure dose during radiation treatment around the orbit. Materials and Methods: Using a human phantom, we created an IMRT treatment plan similar to a real one in the computed treatment planning system after CT simulation. For the secondary shield, we used a Pb plate (thickness 3 mm, diameter 25 mm) and a 3 mm tungsten eye-shield block, and we compared the lens dose on the TPS with that measured by OSLD in the simulation. We also irradiated 200 MU (6 MV, source-to-phantom distance 100 cm, field size 5 × 5 cm) onto a 5 cm acrylic phantom using the same secondary shielding materials, the 3 mm Pb plate and the tungsten eye-shield block, and carried out the same experiment using an 8 cm Pb block to limit the effect of leakage and transmitted radiation outside the irradiation field. We attached OSLDs 1 cm away from the field at the side of the phantom and applied a 3 mm bolus equivalent to the thickness of the eyelid. Results: With the human phantom, the lens dose in the IMRT treatment plan was 315.9 cGy and the measured value was 216.7 cGy; after secondary shielding with the 3 mm Pb plate and the tungsten eye-shield block, the lens doses were 234.3 and 224.1 cGy, respectively. In the acrylic phantom experiment, the values were 5.24, 5.42, and 5.39 cGy for no block, the 3 mm Pb plate, and the tungsten eye-shield block, respectively. Applying the O.S.B outside the field, the values were 1.79, 2.00, and 2.02 cGy for no block, the 3 mm Pb plate, and the tungsten eye-shield block, respectively. Conclusion: When secondary shielding material is used to protect a critical organ during photon irradiation, a high-atomic-number material (such as metal) placed near the critical organ can increase the dose, depending on the treatment region and beam direction, because head leakage and collimator and MLC transmitted radiation exist even outside the field.
The attempt at secondary shielding to decrease the exposure dose was meaningful, but an untested attempt can have the reverse effect, so a preliminary inspection through QA is necessary.

A Study on the Diffusion Factors of e-Finance (e-Finance의 확산요인에 관한 연구)

  • Kim, Min-Ho;Song, Chae-Hun;Song, Sun-Yok;Cha, Sun-Kwon
    • International Commerce and Information Review / v.4 no.2 / pp.253-277 / 2002
  • Nowadays, advances in information and communication technology are driving a dramatic change in the transaction paradigm: an expansion from a physical basis to an electronic one. Financial services support most financial exchanges between business parties, so the expansion of the electronic transaction paradigm affects every financial institution that provides them; financial institutions have accordingly adopted e-Finance systems and now provide Internet financial services to survive the competition. The purpose of this study is to contribute, from the user's perspective, to the qualitative enhancement of customer service, rapid diffusion, and accurate strategy establishment for the e-Finance industry. Through a literature review and factor and reliability analyses, this study selects six diffusion factors: the perceived efficiency and the perceived reliability and safety of e-Finance itself; confidence, technical factors, and customer service quality in perceptions of the e-Finance system; and inclination to innovation as a personal characteristic. According to the hypothesis tests using logistic regression analysis, the technical factors and customer service quality of the e-Finance system, together with the personal inclination to innovation, had statistically significant positive effects on the diffusion decision at the 0.05 and 0.01 significance levels. However, the perceived efficiency and the perceived reliability and safety of e-Finance did not affect the diffusion decision, and confidence in the e-Finance system had no statistical significance.
This study can be used as basic material for future empirical studies of diffusion factors on the user side and can be applied to company and government policy making or to determining the customer service quality of financial institutions. However, it has some limitations: it does not address satisfaction factors and their effects, it deals only with domestic customers, and it does not use multiple regression analysis.
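The kind of hypothesis test used in the study can be illustrated with a bare-bones logistic regression. The gradient-descent fit below and the toy single-factor data are illustrative assumptions, not the study's statistical package or survey data.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=2000):
    """Plain gradient-descent logistic regression: y holds 0/1 adoption
    decisions, columns of X hold candidate diffusion factors; an
    intercept column is added internally."""
    X = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    y = np.asarray(y, dtype=float)
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))     # predicted adoption probability
        w -= lr * X.T @ (p - y) / len(y)     # average log-loss gradient
    return w  # [intercept, factor coefficients...]
```

A positive, significant coefficient on a factor would correspond to the study's finding of a positive effect on the diffusion decision.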
