• Title/Summary/Keyword: IT경영시스템 (IT Management System)


A Study on the Effect of Network Centralities on Recommendation Performance (네트워크 중심성 척도가 추천 성능에 미치는 영향에 대한 연구)

  • Lee, Dongwon
    • Journal of Intelligence and Information Systems
    • /
    • v.27 no.1
    • /
    • pp.23-46
    • /
    • 2021
  • Collaborative filtering, which is often used in personalized recommendation, is recognized as a very useful technique for finding similar customers and recommending products to them based on their purchase history. However, the traditional collaborative filtering technique has difficulty calculating similarity for new customers or products, because similarities are computed from direct connections and common features among customers. For this reason, hybrid techniques have been designed that use content-based filtering together with collaborative filtering. In parallel, efforts have been made to solve these problems by applying the structural characteristics of social networks. This approach calculates similarities indirectly through similar customers placed between the two customers of interest: a customer network is built from purchase data, and the similarity between two customers is computed from the features of the network paths that indirectly connect them. Such similarity can be used as a measure to predict whether the target customer will accept a recommendation. The centrality metrics of networks can be utilized to calculate these similarities. Different centrality metrics matter in that they may have different effects on recommendation performance, and the effect of these centrality metrics may also vary depending on the recommender algorithm. In addition, recommendation techniques using network analysis can be expected to improve recommendation performance not only for new customers or products but also for the entire set of customers and products. By treating a customer's purchase of an item as a link created between the customer and the item on the network, the prediction of whether the customer will accept a recommendation is recast as a prediction of whether a new link will be created between them. Because classification models fit the purpose of solving this binary link-prediction problem, decision tree, k-nearest neighbors (KNN), logistic regression, artificial neural network, and support vector machine (SVM) models were selected for the research. The data for performance evaluation consisted of order data collected from an online shopping mall over four years and two months. The first three years and eight months of the data were used to construct the social network, and the records from the following four months were used to train and evaluate the recommender models. Experiments applying the centrality metrics to each model show that the recommendation acceptance rates obtained with different centrality metrics differ across algorithms at a meaningful level. This work analyzes four commonly used centrality metrics: degree centrality, betweenness centrality, closeness centrality, and eigenvector centrality. Eigenvector centrality records the lowest performance in all models except the support vector machine. Closeness centrality and betweenness centrality show similar performance across all models. Degree centrality ranks in the middle across models, while betweenness centrality always ranks higher than degree centrality. Finally, closeness centrality is characterized by distinct differences in performance across models: it ranks first, with numerically high performance, in the logistic regression, artificial neural network, and decision tree models, but records very low rankings, with low performance, in the support vector machine and k-nearest neighbors models. As the experimental results reveal, network centrality metrics over a subnetwork that connects two nodes can effectively predict, within a classification model, whether the two nodes will be connected in a social network. Furthermore, each metric performs differently depending on the type of classification model. This result implies that choosing appropriate metrics for each algorithm can lead to higher recommendation performance. In general, betweenness centrality can guarantee a high level of performance in any model, and for certain models the introduction of closeness centrality could be considered to obtain higher performance.
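
As an illustration of the approach described above (not the authors' code), the following sketch builds a toy customer-item purchase network, computes the four centrality metrics with networkx, and feeds them as pair features to the five classifiers used in the study. The toy purchase data and the feature construction are assumptions for illustration; the paper computes centralities over the subnetwork connecting each candidate pair, whereas this sketch uses whole-network centralities of the two endpoints.

```python
# Minimal sketch: centrality metrics as features for link prediction.
# Toy data and feature construction are illustrative assumptions.
import networkx as nx
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Bipartite purchase network: an edge means "customer bought item".
purchases = [("c1", "i1"), ("c1", "i2"), ("c2", "i2"), ("c2", "i3"),
             ("c3", "i1"), ("c3", "i3"), ("c4", "i2"), ("c4", "i4")]
G = nx.Graph(purchases)

# The four centrality metrics examined in the paper.
deg = nx.degree_centrality(G)
btw = nx.betweenness_centrality(G)
clo = nx.closeness_centrality(G)
eig = nx.eigenvector_centrality(G, max_iter=1000)

def pair_features(u, v):
    """Combine the centralities of both endpoints into one feature vector."""
    return [deg[u] + deg[v], btw[u] + btw[v], clo[u] + clo[v], eig[u] + eig[v]]

# Positive examples: existing links; negatives: customer-item non-links.
pos = list(G.edges())
customers = [n for n in G if n.startswith("c")]
items = [n for n in G if n.startswith("i")]
neg = [(c, i) for c in customers for i in items if not G.has_edge(c, i)]

X = np.array([pair_features(u, v) for u, v in pos + neg])
y = np.array([1] * len(pos) + [0] * len(neg))
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)

models = {
    "logit": LogisticRegression(),
    "knn": KNeighborsClassifier(n_neighbors=3),
    "tree": DecisionTreeClassifier(random_state=0),
    "svm": SVC(),
    "ann": MLPClassifier(max_iter=2000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, model.predict(X_te)))
```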

The Effects on CRM Performance and Relationship Quality of Successful Elements in the Establishment of Customer Relationship Management: Focused on Marketing Approach (CRM구축과정에서 마케팅요인이 관계품질과 CRM성과에 미치는 영향)

  • Jang, Hyeong-Yu
    • Journal of Global Scholars of Marketing Science
    • /
    • v.18 no.4
    • /
    • pp.119-155
    • /
    • 2008
  • Customer relationship management (CRM) has been a sustainable competitive edge for many companies. CRM analyzes customer data to design and execute targeted marketing, examining customer behavior in order to make decisions about products and services, including the supporting management information systems. It is critical for companies to acquire and maintain profitable customers, and how to manage customer relationships effectively has become an important issue for both academics and practitioners in recent years. However, the existing academic literature and the practical applications of CRM strategies have focused on the technical process and organizational structure of CRM implementation. This limited focus has led to numerous reports of failed implementations of various types of CRM projects, and many of these failures are also related to the absence of a marketing approach. Identifying success factors and outcomes grounded in the marketing concept before introducing a CRM project is therefore a pre-implementation requirement. Many researchers have attempted to find the factors that contribute to the success of CRM, but this research has limitations from a marketing standpoint because it does not explain how marketing-based factors contribute to CRM success. Understanding how to manage relationships with crucial customers effectively from a marketing perspective has become an important topic for both academics and practitioners, yet the existing papers do not provide clear antecedent and outcome factors focused on the marketing approach. This paper attempts to validate whether such marketing factors affect relationship quality and CRM performance from a marketing-oriented perspective. More specifically, marketing-oriented factors, comprising market orientation, customer orientation, customer information orientation, and core customer orientation, can influence relationship quality (satisfaction and trust) and CRM outcomes (customer retention and customer share). Another major goal of this research is to identify the effect of relationship quality on the CRM outcomes of customer retention and customer share, in order to show the strength of the relationship between the two sets of factors. Based on a meta-analysis of previous studies, a research model was constructed, and an empirical study was undertaken to test the hypotheses with data from various companies. Multiple regression analysis and t-tests were employed to test the hypotheses. The reliability and validity of the measurements were tested using Cronbach's alpha coefficient and principal factor analysis, respectively, and seven hypotheses were tested through correlation tests and multiple regression analysis. The first key outcome is a theoretically and empirically sound set of CRM factors (market orientation, customer orientation, customer information orientation, and core customer orientation) from the perspective of marketing. The strength of the beta coefficients among the marketing-based antecedent factors was not the same. In particular, the effects of the marketing-based CRM antecedents on customer trust were significantly confirmed, excluding core customer orientation; notably, core customer orientation had no direct effect on customer trust. This means that customer trust, which forms only through long-term interaction, is not directly linked to core customer orientation; the enduring management of these interactions is probably more important for the successful implementation of CRM. The second key result is that implementing and operating a successful CRM process in terms of the marketing approach has a strong positive association with both relationship quality (customer trust and customer satisfaction) and CRM performance (customer retention and customer share). The final key finding, that relationship quality has a strong positive effect on customer retention and customer share, confirms that improvements in customer satisfaction and trust improve accessibility to customers, provide more consistent service, and ensure value for money in the front office, which results in growth of customer retention and customer share. In particular, customer satisfaction and trust, the main components of relationship quality, are found to be positively related to customer retention and customer share. Interactive management of these main variables plays a key role in connecting the successful antecedents of CRM with the final outcomes of customer retention and share. Based on these results, this paper suggests managerial implications for constructing and executing CRM from the marketing perspective. In general, CRM success can be achieved by recognizing its antecedents and outcomes based on the marketing concept. Implementing marketing-concept-oriented CRM involves learning about customers' purchasing habits, opinions, and preferences; profiling individuals and groups to market more effectively and increase sales; and changing the way a company operates to improve customer service and marketing. Benefiting from CRM is not just a question of investing in the right software, but of adapting CRM users to the marketing concept, including market orientation, customer orientation, and customer information orientation. No one denies that CRM is a process or methodology used to develop stronger relationships and that it is composed of many technological components, but thinking about CRM in primarily technological terms is a big mistake. We can infer from this paper that the more useful way to think about and implement CRM is as a process that brings together many pieces of the marketing concept concerning customers, marketing effectiveness, and market trends. Finally, the real-world setting in which this research was conducted may enable academics and practitioners to understand the antecedents and outcomes of CRM from the marketing perspective more clearly.
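
For readers unfamiliar with the analysis steps named above, the following is a minimal sketch, using simulated Likert responses and illustrative variable names rather than the study's data, of computing Cronbach's alpha for a scale and regressing a relationship-quality measure (trust) on the four marketing-oriented antecedents.

```python
# Sketch of the reliability and regression analyses described in the abstract.
# All data are simulated; variable names are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert items measuring one construct."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Fake 5-point Likert responses for a three-item customer-trust scale.
trust_items = pd.DataFrame(rng.integers(1, 6, size=(200, 3)),
                           columns=["trust1", "trust2", "trust3"])
print("Cronbach's alpha:", round(cronbach_alpha(trust_items), 3))

# Multiple regression of trust on the four antecedents from the paper.
X = pd.DataFrame(rng.normal(size=(200, 4)),
                 columns=["market_orient", "customer_orient",
                          "cust_info_orient", "core_cust_orient"])
y = trust_items.mean(axis=1) + 0.4 * X["customer_orient"] + rng.normal(0, 0.5, 200)
model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.summary().tables[1])  # beta coefficients and p-values
```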


An Overview of Readjustment Measures Against the Banking Industry's Non-Performing Loans (은행부실채권(銀行不實債權) 정리방안(整理方案)에 대한 고찰(考察))

  • Kim, Joon-kyung
    • KDI Journal of Economic Policy
    • /
    • v.13 no.1
    • /
    • pp.35-63
    • /
    • 1991
  • Currently, Korea's banking industry holds a sizable amount of non-performing loans which stem from the government-led bailout of many troubled firms in the 1980s. Although this burden was somewhat relieved with the aid of banks' recapitalization in the booming securities market between 1986-88, the insolvent credits still resulted in low profitability in the banking sector and have been detrimental to the progress of financial liberalization and internationalization. This paper surveys the corporate bailout experiences of major advanced countries and Korea in the past and derives a rationale for readjustment measures against non-performing loans, in which rescue plans depend on the nature of the financial system. Considering the features of Korea's financial system and the banking sector's recent performance, it discusses possible means of liquidation in keeping with the rationale. The conflict of interests among parties involved in non-performing loans is widely known as one of the major constraints in writing off the loans. Specifically, in the case of Korea, the government's excessive intervention in allocating credits has preempted the legitimate role of the banking sector, which now only passively manages its past loans, and has implicitly confused private with public risk. This paper argues that to minimize the incidence of insolvent loan readjustment, the government's role should be reduced and that the correspondent banks should be more active in the liquidation process, through the market mechanism, reflecting their access to detailed information on the troubled firms. One solution is that banks, after classifying the insolvent loans by the lateness or possibility of repayment, would swap the relatively sound loans for preferred stock and gradually write off the bad ones by expanding the banks' retained earnings and revaluing the banks' assets. Specifically, the debt-equity swap can benefit both creditors and debtors in the sense that it raises the liquidity and profitability of bank assets and strengthens the debtor's financial structure by easing the debt service burden. Such a creditor-led or market-led solution improves the financial strength and autonomy of the banking sector, thereby fostering more efficient resource allocation and risk sharing.
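
A purely hypothetical numerical illustration of the debt-for-preferred-stock swap described above (all figures are invented, not drawn from the paper) may help show how the swap eases the debtor's debt service burden while leaving the bank with an equity claim.

```python
# Hypothetical illustration of a debt-for-preferred-stock swap; all numbers
# are invented and serve only to show the direction of the effect.
debt = 100.0          # troubled firm's bank debt (billion won, hypothetical)
loan_rate = 0.12      # interest rate on the loan
swap_ratio = 0.40     # share of debt converted into preferred stock
pref_dividend = 0.06  # dividend rate on the new preferred shares

interest_before = debt * loan_rate
remaining_debt = debt * (1 - swap_ratio)
preferred_stock = debt * swap_ratio
payments_after = remaining_debt * loan_rate + preferred_stock * pref_dividend

print(f"debt service before swap : {interest_before:.1f}")
print(f"payments after swap      : {payments_after:.1f}")
print(f"relief for the debtor    : {interest_before - payments_after:.1f}")
# The bank, in turn, exchanges a doubtful loan for an equity claim whose value
# can recover if the firm's financial structure improves.
```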


A Case Study of Artist-centered Art Fair for Popularizing Art Market (미술 대중화를 위한 작가중심형 아트페어 사례 연구)

  • Kim, Sun-Young;Yi, Eni-Shin
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.19 no.2
    • /
    • pp.279-292
    • /
    • 2018
  • Unlike the global art market, which recovered rapidly from the impact of the 2008 Global Financial Crisis, the Korean art market has not yet fully recovered. The gallery-oriented distribution system, the weak functioning of the primary art market, and a market structure centered on a small number of collectors make it difficult for young and mid-career artists to enter the market and, as a result, deepen the economic polarization of artists. In addition, the high price of artworks limits participation by the general public. This study began with the idea that public interest in the art market, as well as public participation in it, is urgently needed. To this end, we noted that public awareness of art transactions can be a starting point for strengthening the fragile art market, and we focused on the 'Artist-centered Art Fair' rather than existing art fairs. To examine the contribution of such an art fair to the popularization of the art market, we analyzed the case of the 'Visual Artist Market (VAM)' project of the Korea Arts Management Service. The results show that the 'Artist-centered Art Fair' focuses on providing market-entry opportunities to young and mid-career artists rather than on the interests of distributors, and promotes the popularization of the art market by introducing low-priced works to the general public. The 'Artist-centered Art Fair' also appears to play a primary role in the public sector in fostering solid groups of artists and establishing healthy distribution networks in the Korean art market. In the long run, however, it is necessary to promote the sustainable development of the 'Artist-centered Art Fair' through indirect support, such as the provision of a publicity platform or consumer finance support, rather than direct support.

A Study on the Acceptance of Convergence System of Broadcasting and Telecommunication, and Their Relative Efficiency Focusing on IPTV (방송과 통신 융합시스템의 수용 및 상대적 효능에 관한 연구: IPTV를 중심으로)

  • Um, Myoung-Yong;Lee, Sang-Ho;Kim, Jai-Beam
    • Asia pacific journal of information systems
    • /
    • v.19 no.3
    • /
    • pp.25-49
    • /
    • 2009
  • Advances in technology have resulted in the emergence of new information systems. The convergence of IT and the manufacturing sector has blurred the boundaries among industries, and such convergence has become established as a paradigm for building new business areas. The convergence of broadcasting and telecommunication, notably IPTV (Internet Protocol Television), is among the most salient recent examples as a major case of disruptive technology innovation. Despite much fanfare, this convergence has not fulfilled expectations; it has not produced positive economic effects, and the growth of IPTV has been negatively affected. Stakeholders in and around IPTV, including telecommunication companies, broadcasting corporations, and government bodies, each wish to bring IPTV under their own control, and IPTV has drifted amid the conflicts among these stakeholders, particularly between the telecommunication and broadcasting sectors broadly defined. Our empirical research deals with how audiences accept IPTV and how firms should utilize their resources in providing IPTV services. This paper addresses three research questions. First, can the Technology Acceptance Model (TAM) sufficiently explain the acceptance of IPTV as an information system? Second, does the playful aspect of IPTV, tested empirically, increase audience acceptance? Third, how can firms efficiently and effectively allocate their limited resources to increase the number of IPTV viewers? To answer these questions, we collected data from 197 current subscribers of high-speed internet service and/or cable/satellite television. The empirical results show that the paths 'perceived usefulness (PU) → intention to use' and 'perceived ease of use (PEU) → intention to use' are significant, and that perceived ease of use is significantly related to perceived usefulness. Perceived ease of handling IPTV without much effort can positively influence its perceived value; in this regard, engineers and designers of IPTV should pay more attention to a user-friendly interface. In addition, perceived playfulness (PP) of IPTV is positively related to intention to use. Flow, fun, and entertainment have recently gained greater attention in information systems research, owing to the changing nature of information systems, which now combine functional and leisure attributes. These results have practical implications for the design of IPTV, which should reflect not just leisure but also functional elements. This paper also investigates the relationship between perceived ease of use and perceived playfulness: PEU is positively related to PP. Audiences who are not intimidated can be attracted more easily to a user-friendly IPTV and thereby perceive its fun and entertainment with ease. A practical implication of this finding is that, to attract more interest and involvement from the audience, IPTV needs to be designed with an equally or even more user-friendly interface. Of the factors related to intention to use, perceived usefulness and perceived ease of use have greater impacts than perceived playfulness; between PU and PEU, the impacts on intention to use are not statistically significantly different.
Managerial implications of this finding are that firms in preparation for the launch of IPTV service should prioritize the functions and interface of IPTV. This empirical paper also provides further insight into the ways in which firms can strategically allocate their limited resources so as to appeal to viewers, both current and potential, of IPTV.
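
The TAM-style paths reported above can be sketched with ordinary least-squares regressions on simulated survey data; the column names, data-generating coefficients, and data below are illustrative assumptions, not the study's dataset.

```python
# Sketch of the TAM-style paths the study tests: PEU -> PU, PEU -> PP,
# and PU / PEU / PP -> intention to use IPTV. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 197  # sample size matching the study's subscriber survey

peu = rng.normal(size=n)                      # perceived ease of use
pp = 0.5 * peu + rng.normal(size=n)           # perceived playfulness
pu = 0.6 * peu + rng.normal(size=n)           # perceived usefulness
itu = 0.4 * pu + 0.4 * peu + 0.2 * pp + rng.normal(size=n)  # intention to use

df = pd.DataFrame({"PEU": peu, "PP": pp, "PU": pu, "ITU": itu})

# Path: perceived ease of use -> perceived usefulness
print(smf.ols("PU ~ PEU", data=df).fit().params)
# Path: perceived ease of use -> perceived playfulness
print(smf.ols("PP ~ PEU", data=df).fit().params)
# Paths into intention to use
print(smf.ols("ITU ~ PU + PEU + PP", data=df).fit().summary().tables[1])
```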

Quality Dimensions Affecting the Effectiveness of a Semantic-Web Search Engine (검색 효과성에 영향을 미치는 시맨틱웹 검색시스템 품질요인에 관한 연구)

  • Han, Dong-Il;Hong, Il-Yoo
    • Asia pacific journal of information systems
    • /
    • v.19 no.1
    • /
    • pp.1-31
    • /
    • 2009
  • This paper empirically examines factors that potentially influence the success of a Web-based semantic search engine. A research model is proposed that shows the impact of quality-related factors upon the effectiveness of a semantic search engine, based on DeLone and McLean's (2003) information systems success model. An empirical study was conducted to test hypotheses formulated around the research model, and statistical methods were applied to analyze the gathered data and draw conclusions. Implications for academics and practitioners are offered based on the findings. The proposed model includes three quality dimensions of a Web-based semantic search engine, namely information quality, system quality, and service quality, each with measures designed to collectively assess it. The model examines the relationships between the measures of these quality dimensions and the measures of two dependent constructs: individuals' net benefit and user satisfaction. Individuals' net benefit was measured by the extent to which the user's information needs were adequately met, whereas user satisfaction was measured by a combination of the perceived satisfaction with search results and the perceived satisfaction with the overall system. A total of 23 hypotheses were formulated around the model, and a questionnaire survey was conducted using a functional semantic search website created by KT and Hakia in order to collect data to validate the model. Copies of the questionnaire were handed out in person to 160 research associates and employees working in the area of designing and developing semantic search engines; of those who received the form, 148 returned valid responses. The survey asked respondents to use the given website and then answer questions about the system. The results of the empirical study indicate that, of the three quality dimensions, information quality has the strongest association with the effectiveness of a Web-based semantic search engine. This finding is consistent with the observation in the literature that aspects of information quality should serve as a basis for evaluating the search outcomes of a semantic search engine. The measures under the information quality dimension found to have a positive effect on informational gratification and user satisfaction were recall and currency. Under the system quality dimension, response time and interactivity were positively related to informational gratification. Only one measure under the service quality dimension, reliability, was found to have a positive relationship with user satisfaction. These results are based on the seven hypotheses that were accepted. One may wonder why 15 of the 23 hypotheses were rejected and question the theoretical soundness of the model. However, the correlations between the independent and dependent variables came out fairly high. This suggests that the structural equation model yielded results inconsistent with those of the coefficient analysis, because the structural equation model examines the relationships among the independent variables as well as the relationships between the independent and dependent variables. The findings offer some useful implications for owners of a semantic search engine as far as the design and maintenance of the website are concerned. First, the system should be designed to respond to the user's query as quickly as possible. It should also support the search process by recommending, revising, and choosing search queries, so as to maximize users' interaction with the system. Second, the system should present search results with maximum recall and currency to effectively meet users' expectations. Third, it should be capable of providing online services in a reliable and trustworthy manner. Finally, effectively increasing user satisfaction requires improving the quality factors associated with a semantic search engine, which would in turn help increase informational gratification for users. The proposed model can serve as a useful framework for measuring the success of a Web-based semantic search engine. Applying this search engine success framework has the potential to outline which areas of a semantic search engine need improvement in order to better meet users' information needs. Further research will be needed to make this idea a reality.
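
As a simplified sketch of the research model, the accepted relationships can be expressed as plain regressions on simulated data. The study itself estimates a structural equation model and uses its own survey instrument; the variable names, data, and coefficients below are illustrative assumptions only.

```python
# Simplified sketch of the quality-dimension model: information, system, and
# service quality measures regressed against net benefit and satisfaction.
# Data are simulated; the study uses a structural equation model instead.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 148  # valid responses reported in the study

df = pd.DataFrame({
    # information quality measures
    "recall": rng.normal(size=n), "currency": rng.normal(size=n),
    # system quality measures
    "response_time": rng.normal(size=n), "interactivity": rng.normal(size=n),
    # service quality measure
    "reliability": rng.normal(size=n),
})
# Dependent constructs, generated to mimic the accepted hypotheses.
df["net_benefit"] = (0.5 * df["recall"] + 0.4 * df["currency"]
                     + 0.3 * df["response_time"] + 0.3 * df["interactivity"]
                     + rng.normal(0, 0.5, n))
df["satisfaction"] = (0.4 * df["recall"] + 0.3 * df["currency"]
                      + 0.3 * df["reliability"] + rng.normal(0, 0.5, n))

print(smf.ols("net_benefit ~ recall + currency + response_time + interactivity",
              data=df).fit().summary().tables[1])
print(smf.ols("satisfaction ~ recall + currency + reliability",
              data=df).fit().summary().tables[1])
```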

Digital Hologram Compression Technique By Hybrid Video Coding (하이브리드 비디오 코딩에 의한 디지털 홀로그램 압축기술)

  • Seo, Young-Ho;Choi, Hyun-Jun;Kang, Hoon-Jong;Lee, Seung-Hyun;Kim, Dong-Wook
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.42 no.5 s.305
    • /
    • pp.29-40
    • /
    • 2005
  • As the base of digital holography has expanded, discussion of compression technology has grown; an international standard defining compression techniques for 3D images and video has been progressing in the form of 3DAV, a part of MPEG. As the 3DAV work shows, such coding techniques are very likely to take a hybrid form, merging and refining various existing techniques. We therefore present the relationship between various image/video coding techniques and digital holograms. In this paper, we propose an efficient coding method for digital holograms using standard compression tools for video and images. First, we convert fringe patterns into video data using the principle of CGH (Computer Generated Hologram) and then encode them. The proposed compression algorithm consists of several stages: pre-processing for the transform, local segmentation using global information of the object image, a frequency transform for coding, scanning to convert the fringe data into a video stream, classification of coefficients, and hybrid video coding; the proposed hybrid compression algorithm combines all of these methods. The tool for still image coding is JPEG2000, and the tools for video coding include international compression standards such as MPEG-2, MPEG-4, and H.264, as well as various lossless compression algorithms. Experiments show that the proposed algorithm achieves better reconstruction quality than previous research at compression rates four to eight times higher. We therefore expect the proposed technique for digital hologram coding to serve as a good foundation for further research.
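
A rough sketch of the kind of pipeline described, under stated assumptions: random data stands in for a CGH fringe pattern, and a plain block DCT with uniform quantization stands in for the full hybrid coder (no call to JPEG2000, MPEG, or H.264 is made here).

```python
# Rough sketch of a fringe-pattern coding pipeline: local segmentation into
# blocks, a frequency transform, and coefficient quantization. This is not the
# authors' coder; the real system hands the stream to standard codecs.
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
fringe = rng.random((1024, 1024)).astype(np.float32)  # stand-in fringe pattern

BLOCK = 64  # local segmentation: split the hologram into 64x64 tiles
blocks = [fringe[r:r + BLOCK, c:c + BLOCK]
          for r in range(0, fringe.shape[0], BLOCK)
          for c in range(0, fringe.shape[1], BLOCK)]

Q = 0.02  # coarse uniform quantization step (illustrative)
encoded = [np.round(dctn(b, norm="ortho") / Q).astype(np.int32) for b in blocks]

# "Decoding": dequantize and inverse-transform each block.
decoded = [idctn(e * Q, norm="ortho") for e in encoded]
nonzero = sum(int(np.count_nonzero(e)) for e in encoded)
total = sum(e.size for e in encoded)
print(f"non-zero coefficients kept: {nonzero}/{total}")
print("max reconstruction error:",
      float(max(np.abs(b - d).max() for b, d in zip(blocks, decoded))))
```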

Introduction of region-based site functions into the traditional market environmental support funding policy development (재래시장 환경개선 지원정책 개발에서의 지역 장소적 기능 도입)

  • Jeong, Dae-Yong;Lee, Se-Ho
    • Proceedings of the Korean DIstribution Association Conference
    • /
    • 2005.05a
    • /
    • pp.383-405
    • /
    • 2005
  • The traditional market is foremost a regionally positioned place: it directly represents regional and cultural traits and plays an important role in the circulation of goods through reciprocal, informational, and cultural exchanges, serving to form local communities. The traditional market in Korea is a representative retail format, typically run by family-owned businesses of fewer than five members using premodern marketing techniques in areas such as product management, purchasing methods, and marketing patterns. Since the 1990s, the appearance of new distribution formats and large discount stores, the rising living standards of customers, changing purchasing patterns, and the spread of the Internet have transformed the external distribution environment and caused traditional markets to lose their competitiveness and their place in the economy. The traditional market should be revived on a regional, site basis through the formation of a community of regional neighbors and through knowledge sharing that leads to the creation of wealth. To create wealth in a place, the following components are necessary: 1) facilities suitable for the spatial needs of the present; 2) trust built through exchanges within the changing market environment, which simultaneously satisfies customers' desires; 3) international benchmarking of cases such as the regionally centered TCM (England), BID (USA), and TMO (Japan), so that the unit of market policy shifts from individual stores (a point policy) to streets (a line policy); and 4) a shift in thinking toward an area-based policy approach centered on a macro-regional perspective. The traditional market support budget was operational between 2001 and 2004 as a countermeasure to the problems of old traditional markets, with the government intervening in regional economies to promote national economic strength. This national treasury funding project centered on environmental improvement, research corps, and business modernization through the expenditure of 385.3 billion won. However, the effectiveness of this project has yet to be proven through investigation. Furthermore, in promoting this funding project, the lack of professionalism among market merchants led to persistent limitations in comprehensive strategies, reduced capability for mid- and long-term planning, and weaker voluntary agreement among merchants. The traditional market should go beyond being a mere physical place with ordinary products; creative site strategies employing a communicative approach must accompany these efforts to make the market a new regional and spatial living place. Thus, in light of recent paradigm changes and the introduction of region-based site functions into the traditional market, this research re-examines the traditional market in its cultural and economic meanings and argues for a change of direction in newly developed projects. It is extremely important to identify social policy demands through comparative analysis of domestic and international cases, and to develop innovative and expert management leadership for NPO or NGO civil entrepreneurs through advanced case research on current promotion methods. It is especially important to discover the seeds of a cultural contents industry centered on regional resources, to commercialize regionally renowned products, and to construct complex cultural living places for regional networks. To accelerate these solutions, comprehensive and systematic research conducted within a mentor academy system is required, as such research will reveal the distinctive traits of the traditional market in an aging society.


The Prediction of Export Credit Guarantee Accident using Machine Learning (기계학습을 이용한 수출신용보증 사고예측)

  • Cho, Jaeyoung;Joo, Jihwan;Han, Ingoo
    • Journal of Intelligence and Information Systems
    • /
    • v.27 no.1
    • /
    • pp.83-102
    • /
    • 2021
  • The government recently announced various policies for developing the big-data and artificial intelligence fields, providing the public with a great opportunity through the disclosure of high-quality data held by public institutions. KSURE (Korea Trade Insurance Corporation) is a major public institution for financial policy in Korea, and the company is strongly committed to backing export companies with various systems. Nevertheless, there are still few realized business models based on big-data analyses. In this situation, this paper aims to develop a new business model that can be applied to the ex-ante prediction of the likelihood of credit guarantee insurance accidents. We utilize internal data from KSURE, which supports export companies in Korea, and apply machine learning models. We then compare the performance of the predictive models, including logistic regression, Random Forest, XGBoost, LightGBM, and a DNN (deep neural network). For decades, many researchers have tried to find better models to help predict bankruptcy, since ex-ante prediction is crucial for corporate managers, investors, creditors, and other stakeholders. The development of prediction models for financial distress or bankruptcy originated with Smith (1930), Fitzpatrick (1932), and Merwin (1942). One of the most famous models is Altman's Z-score model (Altman, 1968), based on multiple discriminant analysis, which is widely used in both research and practice to this day; it utilizes five key financial ratios to predict the probability of bankruptcy within the next two years. Ohlson (1980) introduced a logit model to complement some limitations of previous models, and Elmer and Borowski (1988) developed and examined a rule-based, automated system for the financial analysis of savings and loans. Since the 1980s, researchers in Korea have also examined the prediction of financial distress or bankruptcy. Kim (1987) analyzed financial ratios and developed a prediction model, and Han et al. (1995, 1996, 1997, 2003, 2005, 2006) constructed prediction models using various techniques including artificial neural networks. Yang (1996) applied multiple discriminant analysis and a logit model, and Kim and Kim (2001) utilized artificial neural network techniques for the ex-ante prediction of insolvent enterprises. Since then, many scholars have tried to predict financial distress or bankruptcy more precisely using diverse models such as Random Forest or SVM. One major distinction of our research from previous work is that we focus on the predicted probability of default for each sample case, not only on the overall classification accuracy of each model. Most predictive models in this paper achieve a classification accuracy of about 70% on the entire sample; specifically, the LightGBM model shows the highest accuracy of 71.1% and the logit model the lowest at 69%. However, these results are open to multiple interpretations. In the business context, more emphasis must be placed on minimizing type 2 errors, which cause more harmful operating losses for the guaranty company. Thus, we also compare classification accuracy by splitting the predicted probability of default into ten equal intervals.
When we examine the classification accuracy for each interval, Logit model has the highest accuracy of 100% for 0~10% of the predicted probability of the default, however, Logit model has a relatively lower accuracy of 61.5% for 90~100% of the predicted probability of the default. On the other hand, Random Forest, XGBoost, LightGBM, and DNN indicate more desirable results since they indicate a higher level of accuracy for both 0~10% and 90~100% of the predicted probability of the default but have a lower level of accuracy around 50% of the predicted probability of the default. When it comes to the distribution of samples for each predicted probability of the default, both LightGBM and XGBoost models have a relatively large number of samples for both 0~10% and 90~100% of the predicted probability of the default. Although Random Forest model has an advantage with regard to the perspective of classification accuracy with small number of cases, LightGBM or XGBoost could become a more desirable model since they classify large number of cases into the two extreme intervals of the predicted probability of the default, even allowing for their relatively low classification accuracy. Considering the importance of type 2 error and total prediction accuracy, XGBoost and DNN show superior performance. Next, Random Forest and LightGBM show good results, but logistic regression shows the worst performance. However, each predictive model has a comparative advantage in terms of various evaluation standards. For instance, Random Forest model shows almost 100% accuracy for samples which are expected to have a high level of the probability of default. Collectively, we can construct more comprehensive ensemble models which contain multiple classification machine learning models and conduct majority voting for maximizing its overall performance.
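
The decile-style evaluation described above can be sketched as follows. Because the KSURE data is internal, the sketch uses synthetic data; the 0.5 cutoff and the restriction to two of the five model families are assumptions made for brevity (XGBoost, LightGBM, and a DNN would slot into the same loop).

```python
# Sketch of the evaluation described above: fit classifiers and inspect
# accuracy within each decile of the predicted default probability.
# Synthetic data stands in for the internal KSURE dataset.
import numpy as np
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=5000, n_features=20, weights=[0.7, 0.3],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {"logit": LogisticRegression(max_iter=1000),
          "random_forest": RandomForestClassifier(random_state=0)}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    prob = model.predict_proba(X_te)[:, 1]   # predicted default probability
    pred = (prob >= 0.5).astype(int)
    print(name, "overall accuracy:", round(accuracy_score(y_te, pred), 3))
    # Accuracy and sample count within each 10%-wide probability band.
    band = np.minimum((prob * 10).astype(int), 9)
    df = pd.DataFrame({"band": band, "correct": (pred == y_te).astype(int)})
    print(df.groupby("band")["correct"].agg(["mean", "size"]))
```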

A study on the Success Factors and Strategy of Information Technology Investment Based on Intelligent Economic Simulation Modeling (지능형 시뮬레이션 모형을 기반으로 한 정보기술 투자 성과 요인 및 전략 도출에 관한 연구)

  • Park, Do-Hyung
    • Journal of Intelligence and Information Systems
    • /
    • v.19 no.1
    • /
    • pp.35-55
    • /
    • 2013
  • Information technology is a critical resource for any company hoping to support and realize its strategic goals, contributing to growth and sustainable development. The selection of information technology and its strategic use are imperative for enhancing every aspect of company management, which has led a wide range of companies to invest continuously in information technology. Despite the keen interest of researchers, managers, and policy makers in how information technology contributes to organizational performance, there is uncertainty and debate about the results of information technology investment. In other words, researchers and managers cannot easily identify the independent factors that affect the investment performance of information technology, mainly because many factors, ranging from a company's internal components and strategies to external customers, are interconnected with that performance. Using an agent-based simulation technique, this research extracts the factors expected to affect the performance of information technology investment, simplifies the analysis of their relationships with economic modeling, and examines how performance depends on changes in these factors. In terms of economic modeling, I extend the model in which product quality moderates the relationship between information technology investment and economic performance (Thatcher and Pingry, 2004) by considering the cost of information technology investment and the demand creation resulting from product quality enhancement. For quality enhancement and its consequences for demand creation, I apply the concept of information quality and decision-maker quality (Raghunathan, 1999). This concept implies that investment in information technology improves the quality of information, which in turn improves decision quality and performance, thus enhancing the level of product or service quality. Additionally, I consider the effect of word of mouth among consumers, which creates new demand for a product or service through an information diffusion effect. This demand creation is analyzed with an agent-based simulation model of the kind widely used for network analyses. The results show that investment in information technology enhances the quality of a company's product or service, which indirectly affects the company's economic performance, particularly with regard to consumer surplus, company profit, and company productivity. Specifically, when a company makes its initial investment in information technology, the resulting increase in the quality of its product or service immediately has a positive effect on consumer surplus, but the investment cost has a negative effect on company productivity and profit. As time goes by, the enhanced quality of the company's product or service creates new consumer demand through the information diffusion effect, and this new demand in turn positively affects the company's profit and productivity. In terms of investment strategy, the results also reveal that the selection of information technology needs to be based on an analysis of the service and the network effect among customers, and demonstrate that information technology implementation should fit the company's business strategy.
Specifically, if a company seeks the short-term enhancement of company performance, it needs to have a one-shot strategy (making a large investment at one time). On the other hand, if a company seeks a long-term sustainable profit structure, it needs to have a split strategy (making several small investments at different times). The findings from this study make several contributions to the literature. In terms of methodology, the study integrates both economic modeling and simulation technique in order to overcome the limitations of each methodology. It also indicates the mediating effect of product quality on the relationship between information technology and the performance of a company. Finally, it analyzes the effect of information technology investment strategies and information diffusion among consumers on the investment performance of information technology.
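
A toy version of the agent-based word-of-mouth mechanism described above might look like the following; the network size, adoption probability, and the way quality scales persuasion are all invented for illustration and are not the study's model parameters.

```python
# Toy agent-based sketch of word-of-mouth demand diffusion: higher product
# quality (raised by IT investment) increases the chance that an adopter
# persuades a neighboring consumer to adopt. Parameters are illustrative.
import random
import networkx as nx

random.seed(0)
G = nx.watts_strogatz_graph(n=500, k=6, p=0.1)    # consumer network
quality = 0.6                                     # product/service quality after IT investment
adopters = set(random.sample(list(G.nodes), 10))  # initial customers

for period in range(20):
    new = set()
    for a in adopters:
        for nb in G.neighbors(a):
            if nb not in adopters and random.random() < 0.1 * quality:
                new.add(nb)  # neighbor adopts through word of mouth
    adopters |= new
    print(f"period {period:2d}: {len(adopters)} adopters")
```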