
Corporate Default Prediction Model Using Deep Learning Time Series Algorithm, RNN and LSTM (딥러닝 시계열 알고리즘 적용한 기업부도예측모형 유용성 검증)

  • Cha, Sungjae;Kang, Jungseok
    • Journal of Intelligence and Information Systems, v.24 no.4, pp.1-32, 2018
  • Corporate defaults have ripple effects not only on the stakeholders of the failed firm, including managers, employees, creditors, and investors, but also on the local and national economy. Before the Asian financial crisis, the Korean government analyzed only SMEs and tried to improve the forecasting power of a single default prediction model rather than developing a variety of corporate default models; as a result, even large 'chaebol' corporations went bankrupt. Even afterwards, analyses of corporate defaults concentrated on specific variables, and when the government carried out restructuring immediately after the global financial crisis, it again focused on a few main variables such as the debt ratio. A multifaceted study of corporate default prediction models is essential to protect diverse interests and to avoid a sudden total collapse such as the 'Lehman Brothers case' of the global financial crisis. The key variables driving corporate defaults vary over time: Deakin (1972), comparing his results with the analyses of Beaver (1967, 1968) and Altman (1968), showed that the major factors affecting corporate failure change, and Grice (2001) likewise found that the importance of predictive variables shifts across Zmijewski's (1984) and Ohlson's (1980) models. Past studies, however, use static models and mostly do not consider changes that occur over time. To construct consistent prediction models, it is therefore necessary to compensate for this time-dependent bias with a time-series algorithm that reflects dynamic change. Centered on the global financial crisis, which had a significant impact on Korea, this study uses ten years of annual corporate data from 2000 to 2009. The data are divided into training, validation, and test sets of 7, 2, and 1 years, respectively. To construct a bankruptcy model that remains consistent over time, a deep learning time-series model is first trained on pre-crisis data (2000-2006). Parameter tuning of the existing models and of the deep learning time-series algorithm is then conducted with validation data that include the financial crisis period (2007-2008), yielding a model that shows a pattern similar to the training results and strong predictive power. Each bankruptcy prediction model is then rebuilt by merging the training and validation data (2000-2008) and applying the optimal parameters found during validation. Finally, the models trained over these nine years are evaluated and compared on the test data (2009), demonstrating the usefulness of a corporate default prediction model based on a deep learning time-series algorithm. In addition, by adding Lasso regression to the existing variable-selection methods (multiple discriminant analysis and the logit model), the study shows that a deep learning time-series model based on these three bundles of variables is useful for robust corporate default prediction. The definition of bankruptcy is the same as in Lee (2015), and the independent variables are financial information such as the financial ratios used in previous studies. Multivariate discriminant analysis, the logit model, and the Lasso regression model are used to select the optimal variable groups. The performance of the multivariate discriminant analysis model proposed by Altman (1968), the logit model proposed by Ohlson (1980), non-time-series machine learning algorithms, and deep learning time-series algorithms is compared. Corporate data pose limitations of nonlinear variables, multicollinearity among variables, and lack of data. The logit model handles nonlinearity, the Lasso regression model addresses the multicollinearity problem, and the deep learning time-series algorithm, combined with a variable data generation method, compensates for the lack of data. Big data technology, a leading technology of the future, is moving from simple human analysis toward automated AI analysis and, eventually, intertwined AI applications. Although research on corporate default prediction models using time-series algorithms is still at an early stage, the deep learning algorithm is much faster than regression analysis at building corporate default prediction models and is also more effective in predictive power. In the course of the Fourth Industrial Revolution, the Korean government and governments overseas are working hard to integrate such systems into the everyday life of their nations and societies, yet deep learning time-series research for the financial industry remains insufficient. This is an initial study of deep learning time-series analysis of corporate defaults, and it is hoped that it will serve as comparative reference material for non-specialists beginning studies that combine financial data with deep learning time-series algorithms. (An illustrative sketch of the year-based split, Lasso selection, and LSTM classifier follows this entry.)
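
As a rough illustration of the setup described above (a year-based train/validation/test split, Lasso-based selection of financial ratios, and an LSTM classifier over annual sequences), the sketch below shows one minimal way such a pipeline could look. The column names, network size, and selection threshold are assumptions for illustration, not the authors' actual configuration.

```python
# Minimal sketch (assumed names and sizes, not the authors' code):
# year-based split, Lasso selection of financial ratios, LSTM classifier.
import numpy as np
import pandas as pd
from sklearn.linear_model import LassoCV
from tensorflow.keras import layers, models

def year_split(df: pd.DataFrame):
    """df is assumed to hold one row per firm-year with a 'year' column."""
    train = df[df["year"] <= 2006]
    valid = df[df["year"].between(2007, 2008)]
    test = df[df["year"] == 2009]
    return train, valid, test

def lasso_select(train: pd.DataFrame, ratio_cols, keep=10):
    """Keep the ratios with the largest non-zero Lasso coefficients."""
    lasso = LassoCV(cv=5).fit(train[ratio_cols], train["default"])
    order = np.argsort(-np.abs(lasso.coef_))
    return [ratio_cols[i] for i in order[:keep] if lasso.coef_[i] != 0]

def build_lstm(seq_len, n_features):
    """Binary default classifier over a firm's sequence of annual ratios."""
    model = models.Sequential([
        layers.Input(shape=(seq_len, n_features)),
        layers.LSTM(32),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```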

A Study on Global Initiatives on Greenhouse Gas Reduction in the International Aviation (항공분야 기후변화 대응 현황 - 최근 ICAO 고위급회의 논의를 중심으로 -)

  • Maeng, Sung-Gyu;Hwang, Ho-Won
    • The Korean Journal of Air & Space Law and Policy, v.24 no.2, pp.47-67, 2009
  • In recent years, greenhouse gas (GHG) reduction has become a high-priority issue in international aviation. GHG emissions from the aviation sector account for only about 2 percent of total global GHG emissions; however, like emissions from other sectors, they have been identified as a contributing factor to global warming, and there is an ongoing effort in the aviation community to establish an international framework for emissions reduction. In international aviation, the effects of a State's aviation activities go beyond the airports and airspace of that State, which makes compiling GHG emissions data very difficult. There are also other legal and technical issues, namely the principle of “Common but Differentiated Responsibility (CBDR)” under the United Nations Framework Convention on Climate Change (UNFCCC) and the “Fair Opportunity” principle of the Chicago Convention. For all these reasons, establishing an internationally agreed mechanism for reducing emissions is expected to be difficult in spite of continuing collaboration among States. The UN adopted the UNFCCC in 1992 and the Kyoto Protocol in 1997 to impose common but differentiated responsibilities for emissions reduction. In international aviation, ICAO has been taking the lead on measures for the sector. In this role, ICAO held the High-level Meeting on International Aviation and Climate Change on 7-9 October 2009 at its Headquarters in Montreal and endorsed recommendations on reducing GHG from international aviation, which will also be reported to the 15th Meeting of the Conference of the Parties (COP15). Key items include basic principles of global aviation emissions reduction; aspirational goals and implementation options; strategies and measures to achieve the goals; means to measure and monitor implementation; and financial and human resources. It is very likely that the Republic of Korea will be included among the Parties subject to mandatory limitation or reduction of GHG emissions after 2013. Therefore, Korea needs to thoroughly analyze the ICAO measures, develop comprehensive measures for reducing aviation emissions, and take proactive action to prepare for future discussions on critical issues after COP15.


A Study about Development of Environment Printing Technology and $CO_2$ (환경 인쇄 기술의 발전과 인쇄물의 $CO_2$ 발생량에 관한 연구)

  • Lee, Mun-Hag
    • Journal of the Korean Graphic Arts Communication Society, v.30 no.3, pp.89-114, 2012
  • In the 21st century, concern about environmental problems is greater than at any time in the past, and the environment has become a worldwide issue. Environmental problems now threaten the survival of humankind and of the earth itself, and preparing for climate change is no longer a choice but a necessity, both in personal life and from an economic viewpoint. In corporate management, the era of green growth has arrived, in which a company that excludes environmental management cannot sustain its business; environmental management now sits at the center of the shift in the management paradigm. Like every other industry, the printing and publishing industry generates greenhouse gases: they are emitted in manufacturing the various materials used as raw materials for books and in operating print shops, and the felling of trees to obtain paper is known to have a direct influence on global warming. Based on these facts, this study identifies and organizes the environmental parameters of the publishing industry. It is intended as basic reference material for the case in which a carbon emissions trading scheme (cap and trade) is enforced across all industries, and as support for sustainable management of the publishing industry, for reducing environmental risk among a company's many risk-management elements, and for planning and enforcing publication-related policy. The study examines the elements of the printing and publishing industry that discharge environmental pollutants or greenhouse gases, and the need for environment-friendly practice in printing paper and printing ink, the other printing materials regarded as greenhouse gas sources, the operation of print shops and their machines, and recycling processes. These considerations are meant to make employees in the industry aware of the significance of conservation and environmental protection, and to assist subsequent studies that quantify the greenhouse gas emissions of each environmental parameter, ultimately making it possible to calculate the relative CO2 output of printed matter. Meanwhile, a good deal of research on protecting and improving the environment is already in progress in the printing and publishing field. There is a voluntary aspect to protecting the environment, but compliance is also driven by external factors such as environment-related laws and regulations. Because the resulting environment-related compliance costs act as a pressure on company management, research aimed at keeping those costs low, covering procurement and development related to environmental load, the introduction of environmental management systems, quality control standards, and the standardization of materials, has been actively pursued. For the green growth era, this paper sets out the subjects and alternatives of the print publishing industry as follows. First, the industry is surrounded by a new digital environment and the issues it generates; second, green growth has become the largest topic in global industry and its weight keeps increasing. As an alternative, the printing and publishing industry must first secure a bridgehead for environment-friendly green growth: such support is becoming an obligation in every industry, not a choice. Presstek, which is known for its practice of sustainable print publishing, stressed the importance of green printing in its 2008 white paper and argued that a company achieves green growth through environment-friendly activity; the core management principles for a sustainable printing and publishing industry presented in that white paper are compacted into four words: remove, reduce, recover, and recycle. Second, digital printing (print-on-demand, POD) systems should be actively utilized. Digital printing currently accounts for only around 10% of the worldwide print market, but the analog-to-digital conversion of the market will accelerate in the future. Jeff Hayes, CEO of Infoland, observed that traditional offset printing has grown old while digital printing, with new applications such as customized publications and print-on-demand, is leading the market. In conclusion, print publishers must read the market flow in a situation where digitalization has become unavoidable, keep pace with the digital age, and raise their awareness of both development and environmental problems; in particular, an active green strategy should be employed. (An illustrative sketch of an activity-times-emission-factor CO2 calculation follows this entry.)
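
To make the intended calculation concrete, the sketch below shows the conventional activity-data-times-emission-factor form such a CO2 estimate would take. The factor values and activity names are placeholders for illustration, not coefficients from this study.

```python
# Illustrative only: each activity amount (kg of paper, kg of ink, kWh of
# press electricity, ...) is multiplied by a per-unit emission factor and
# summed. The factors below are placeholders, not measured coefficients.
EMISSION_FACTORS = {      # kg CO2e per unit of activity (hypothetical)
    "paper_kg": 1.3,
    "ink_kg": 2.5,
    "press_kwh": 0.45,
    "transport_tkm": 0.1,
}

def co2_output(activity: dict) -> float:
    """Sum amount * factor over every activity item that has a factor."""
    return sum(amount * EMISSION_FACTORS[item]
               for item, amount in activity.items()
               if item in EMISSION_FACTORS)

# Example print run described by its activity data (also hypothetical).
print(co2_output({"paper_kg": 500, "ink_kg": 12, "press_kwh": 80}))
```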

Pesticide Residues Survey and Safety Evaluation for Perilla Leaf & Lettuce on the Garak-dong Agricultural & Marine Products Market (가락동 농수산물도매시장 반입 들깻잎과 상추의 잔류농약 실태 및 안전성 평가)

  • Park, Won-Hee;Hwang, In-Sook;Kim, Eun-Jeong;Cho, Tae-Hee;Hong, Chae-Kyu;Lee, Jae-In;Choi, Su-Jeong;Kim, Jin-A;Lee, Yun-Jeong;Kim, Mi-Sun;Kim, Gi-Hae;Kim, Moo-Sang
    • The Korean Journal of Pesticide Science, v.19 no.3, pp.151-160, 2015
  • This study was conducted to monitor residual pesticides in perilla leaf and lettuce and to assess their risk to human health. A total of 4,063 perilla leaf samples and 2,248 lettuce samples, sold by auction at the Garak-dong Agricultural & Marine Products Market, were collected from 2010 to 2012. Multi-residue analysis of 285 pesticides was performed by GC-ECD, GC-NPD, HPLC-DAD and HPLC-FLD. In perilla leaf, 61 pesticides were detected, with a detection rate of 20.0%; in lettuce, 28 pesticides were detected, with a detection rate of 4.8%. To assess the risk from perilla leaf and lettuce consumption, the estimated daily intake (EDI) of each residual pesticide was determined and compared with its acceptable daily intake (ADI) to obtain hazard index (HI) values. The %HI values ranged from 0.000% to 0.049% for perilla leaf and from 0.000% to 0.095% for lettuce. These results show that the risk posed by pesticide residues from perilla leaf and lettuce intake is very low and that consumption of these vegetables is safe. (A short sketch of the conventional EDI/ADI hazard-index arithmetic follows this entry.)
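
For reference, the hazard-index comparison described above conventionally takes the form %HI = EDI/ADI x 100, with EDI derived from the residue level, food intake, and body weight. The sketch below walks through that arithmetic with made-up numbers, not values from this survey.

```python
# Conventional hazard-index arithmetic (illustrative numbers only).
def estimated_daily_intake(residue_mg_per_kg, food_intake_kg_per_day,
                           body_weight_kg=60.0):
    """EDI in mg per kg body weight per day."""
    return residue_mg_per_kg * food_intake_kg_per_day / body_weight_kg

def hazard_index_percent(edi, adi_mg_per_kg_bw_day):
    """%HI = EDI / ADI * 100."""
    return edi / adi_mg_per_kg_bw_day * 100.0

# Hypothetical example: 0.05 mg/kg residue, 0.01 kg/day lettuce intake,
# ADI of 0.02 mg/kg bw/day.
edi = estimated_daily_intake(0.05, 0.01)
print(f"%HI = {hazard_index_percent(edi, 0.02):.4f}")
```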

A Comparative Study on the Effective Deep Learning for Fingerprint Recognition with Scar and Wrinkle (상처와 주름이 있는 지문 판별에 효율적인 심층 학습 비교연구)

  • Kim, JunSeob;Rim, BeanBonyka;Sung, Nak-Jun;Hong, Min
    • Journal of Internet Computing and Services, v.21 no.4, pp.17-23, 2020
  • Biometric information, which measures characteristics of the human body, has attracted great attention as a highly reliable security technology because there is no fear of theft or loss. Among such biometric information, fingerprints are mainly used in fields such as identity verification and identification. When a fingerprint image presents a problem that makes authentication difficult, such as a wound, a wrinkle, or moisture, a fingerprint expert can identify the problem directly during a preprocessing step and apply an image-processing algorithm appropriate to it. By implementing artificial intelligence software that distinguishes fingerprint images containing cuts and wrinkles, it becomes easy to check whether such damage is present and to select an appropriate algorithm for improving the fingerprint image. In this study, we built a database of 17,080 fingerprints by acquiring all fingerprints of 1,010 students from the Royal University of Cambodia, 600 images from the Sokoto open data set, and the fingerprints of 98 Korean students. Criteria were established for deciding whether an image in the database contains injuries or wrinkles, and the data were validated by experts. The training and test data sets consisted of the Cambodian and Sokoto data, split at a ratio of 8:2, and the data of the 98 Korean students were used as a validation set. Using this data set, five CNN-based architectures were implemented: a classic CNN, AlexNet, VGG-16, ResNet-50, and YOLOv3, and a study was conducted to find the model that performed best at this discrimination task. Among the five architectures, ResNet-50 showed the best performance, at 81.51%. (A minimal sketch of this kind of transfer-learning setup follows this entry.)
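
A minimal sketch of the kind of transfer-learning setup the abstract describes, an ImageNet-pretrained ResNet-50 with a binary damaged-versus-normal head, is shown below. The input size, head layers, and training settings are assumptions, not the authors' configuration.

```python
# Sketch only: ResNet-50 backbone with a binary head for classifying
# fingerprint images as damaged (scar/wrinkle) vs. normal.
import tensorflow as tf

def build_model(input_shape=(224, 224, 3)):
    base = tf.keras.applications.ResNet50(
        weights="imagenet", include_top=False, input_shape=input_shape)
    base.trainable = False                      # train only the new head
    x = tf.keras.layers.GlobalAveragePooling2D()(base.output)
    out = tf.keras.layers.Dense(1, activation="sigmoid")(x)
    model = tf.keras.Model(base.input, out)
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Usage (datasets assumed): Cambodian + Sokoto images split 8:2 into
# train/test, Korean images held out for validation, as in the study.
# model = build_model()
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```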

Floristic study and conservation management strategies of algific talus slopes on the Korean peninsula (한반도 풍혈지의 관속식물상과 보전관리 방안)

  • Kim, Jin-Seok;Chung, Jae-Min;Kim, Jung-Hyun;Lee, Woong;Lee, Byoung-Yoon;Pak, Jae-Hong
    • Korean Journal of Plant Taxonomy, v.46 no.2, pp.213-246, 2016
  • Algific talus slopes tend to occur on steep north-facing slopes with bedrock that retains ice and emits cold air throughout the growing season. They provide a suitable microclimate for disjunct or relict populations of northern plant species in low-altitude habitats in temperate zones. The purpose of this study is to suggest a strategy for the comprehensive conservation of the vegetation of algific talus slopes through studies of the floristics, plant species composition, and present and future threat factors of 15 major algific talus slopes in Korea. The vascular plants surveyed on the 15 major algific talus slopes totaled 587 taxa in 109 families and 323 genera: 531 species, 7 subspecies, 47 varieties, 1 form, and 1 hybrid. Of these, 26 taxa are endemic plants and 8 taxa are threatened species according to the IUCN evaluation criteria. Fourth (IV) and fifth (V) degree indicator species, as specified by floristic subregions, numbered 31 taxa. Notably, several subalpine plant species, in this case Cystopteris fragilis, Gymnocarpium dryopteris, Huperzia selago, Rosa koreana, Vaccinium vitis-idaea and Woodsia hancockii, were found on algific talus slopes at 100-600 m above sea level. The numerous and diverse biological resources native to algific talus slopes in Korea have been continually disturbed or damaged by human activities in the absence of any form of protection. An all-taxa biodiversity inventory should be compiled to provide more information about all biological species living on algific talus slopes. In addition, conservation strategies to ensure biodiversity and effective management of algific talus slopes are discussed in detail.

Feasibility Study on the Fault Tree Analysis Approach for the Management of the Faults in Running PCR Analysis (PCR 과정의 오류 관리를 위한 Fault Tree Analysis 적용에 관한 시범적 연구)

  • Lim, Ji-Su;Park, Ae-Ri;Lee, Seung-Ju;Hong, Kwang-Won
    • Applied Biological Chemistry, v.50 no.4, pp.245-252, 2007
  • FTA (fault tree analysis), an analytical method for managing system failures, was applied to the management of faults in running PCR analysis. PCR is executed through several processes, of which the PCR machine operation step was selected for analysis by FTA; the simplest process was chosen as a first trial to test the feasibility of the FTA approach. First, fault events (the top event, intermediate events, and basic events) were identified by surveying expert knowledge of PCR. Those events were then correlated deductively to build a fault tree in a hierarchical structure. The fault tree was evaluated qualitatively and quantitatively, yielding minimal cut sets, structural importance, common-cause vulnerability, a simulation of the probability of occurrence of the top event, cut set importance, item importance, and sensitivity. The top event was 'errors in the step of PCR machine operation in running PCR analysis'. The major intermediate events were 'failures in instrument' and 'errors in actions in experiment'. The basic events comprised four events based on human error, one on instrument failure, and one on energy-source failure. These events were combined with Boolean logic gates (AND, OR) to construct the fault tree. In the qualitative evaluation of the tree, the basic events 'errors in preparing the reaction mixture', 'errors in setting temperature and time of PCR machine', 'failure of electrical power during running PCR machine', and 'errors in selecting adequate PCR machine' proved the most critical to the occurrence of the top event. In the quantitative evaluation, the list of critical events was not the same as that from the qualitative evaluation, because the probability of PCR machine failure, although not on the list above, increases with operating time, while the probabilities of electrical power failure and of a defective PCR machine were set to zero owing to the general rarity of such events. It was concluded that this feasibility study is a worthwhile means of introducing the novel technique, FTA, to the management of faults in running PCR analysis. (A small illustrative AND/OR gate calculation follows this entry.)
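
To illustrate the quantitative step, combining basic-event probabilities through OR/AND gates up to the top event, the sketch below uses made-up probabilities and a tree shape that only loosely mirrors the branches named in the abstract; it is not the authors' tree or data.

```python
# Illustrative fault-tree evaluation assuming independent basic events.
# OR gate:  P = 1 - prod(1 - p_i);   AND gate: P = prod(p_i).
from functools import reduce

def or_gate(probs):
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

def and_gate(probs):
    return reduce(lambda acc, p: acc * p, probs, 1.0)

# Hypothetical per-run probabilities, not the study's values.
human_errors = [0.02,    # errors in preparing the reaction mixture
                0.01,    # errors in setting temperature and time
                0.005,   # errors in selecting an adequate PCR machine
                0.005]   # other operating errors
instrument_failure = 0.002
power_failure = 0.0005

# Intermediate events feed an OR gate at the top event.
p_action_errors = or_gate(human_errors)
p_instrument = or_gate([instrument_failure, power_failure])
p_top = or_gate([p_action_errors, p_instrument])
print(f"P(top event) ~ {p_top:.4f}")
```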

The application of fuzzy classification methods to spatial analysis (공간분석을 위한 퍼지분류의 이론적 배경과 적용에 관한 연구 - 경상남도 邑級以上 도시의 기능분류를 중심으로 -)

  • ;Jung, In-Chul
    • Journal of the Korean Geographical Society, v.30 no.3, pp.296-310, 1995
  • Classification of spatial units into meaningful sets is an important procedure in spatial analysis and is crucial for characterizing and identifying spatial structures. Traditional classification methods such as cluster analysis, however, require an exact database and impose clear-cut boundaries between classes. Scrutiny of realistic classification problems reveals that the available information may be vague and that boundaries may be ambiguous; the weakness of conventional methods is that they fail to capture fuzzy data and the transitions between classes. Fuzzy subset theory is useful for solving these problems. This paper aims to establish the theoretical foundations of fuzzy spatial analysis and to identify the characteristics of fuzzy classification methods, through a literature review and a case study classifying the cities and eups of Kyung-Nam Province. The main findings are summarized as follows: 1. Following Dubois and Prade, fuzzy information involves imprecise and/or uncertain evaluation. In geography, fuzzy information about spatial organization, the perception of geographical space, and human behavior is frequent, yet researchers have tended to limit their work to numerical data processing without considering spatial fringes. Fuzzy spatial analysis makes it possible to include the interfaces between groups in a classification. 2. The fuzzy numerical taxonomic method was established by Deloche, Tranquis, Ponsard and Leung. Depending on the data and the method employed, the derived groups may be mutually exclusive or may overlap to a certain degree. A classification pattern can be derived for each degree of similarity/distance α, and by taking the values of α in ascending or descending order, a hierarchical classification is obtained. 3. The cities and eups of Kyung-Nam were classified by fuzzy discrete classification, fuzzy conjoint classification, and cluster analysis according to the ratios of persons employed in each industry. As a result, they were divided into several groups with homogeneous characteristics. Fuzzy discrete classification and cluster analysis give clear-cut boundaries, whereas fuzzy conjoint classification delimits the edges and cores of the urban classes. 4. The results of the different methods vary, but each contributes to revealing the transparency of the spatial structure. Across all three classifications, Chung-mu city, which has special characteristics, and the group of industrial cities composed of Changwon, Ulsan, Masan, Chinhai, Kimhai, Yangsan, Ungsang, Changsungpo and Shinhyun are evident in common. Although fuzzy classification methods still require appraisal, this framework appears more realistic and flexible in preserving information pertinent to urban classification. (A minimal sketch of α-cut grouping over a fuzzy similarity relation follows this entry.)
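
A minimal sketch of the α-cut idea in point 2 above: spatial units represented by industrial-employment ratios, a fuzzy similarity relation between them, and crisp groupings obtained by thresholding that relation at descending values of α. The city names and ratio vectors below are fabricated for illustration.

```python
# Sketch of alpha-cut classification over a fuzzy similarity relation.
import numpy as np

# Hypothetical employment-ratio vectors (shares by industry), not real data.
cities = {"A": [0.60, 0.30, 0.10], "B": [0.55, 0.35, 0.10],
          "C": [0.20, 0.30, 0.50], "D": [0.25, 0.25, 0.50]}
names = list(cities)
X = np.array([cities[n] for n in names])

# Fuzzy similarity in [0, 1] from normalized Euclidean distance.
d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
sim = 1.0 - d / d.max()

def alpha_cut_groups(sim, alpha):
    """Group units connected by similarity >= alpha (single linkage)."""
    n = len(sim)
    groups, seen = [], set()
    for i in range(n):
        if i in seen:
            continue
        stack, comp = [i], set()
        while stack:
            j = stack.pop()
            if j in comp:
                continue
            comp.add(j)
            stack.extend(k for k in range(n) if sim[j, k] >= alpha)
        seen |= comp
        groups.append(sorted(names[k] for k in comp))
    return groups

# Descending alpha yields a coarser and coarser (hierarchical) grouping.
for alpha in (0.9, 0.7, 0.5):
    print(alpha, alpha_cut_groups(sim, alpha))
```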


The Adjunctive Role of Resectional Surgery for the Treatment of Multidrug-Resistant Pulmonary Tuberculosis (다제내성 폐결핵의 치료에서 폐절제술의 보조적인 역할)

  • Koh, Won-Jung;Lee, Jae-Ho;Yoo, Chul-Gyu;Kim, Young-Whan;Chung, Hee-Soon;Sung, Sook-Whan;Im, Jung-Gi;Kim, Joo-Hyun;Shim, Young-Soo;Han, Sung-Koo
    • Tuberculosis and Respiratory Diseases, v.44 no.5, pp.975-991, 1997
  • Background: Many patients with isoniazid- and rifampin-resistant pulmonary tuberculosis harbor organisms that are also resistant to other first-line drugs. Despite aggressive retreatment chemotherapy, the results are often unsuccessful, with a failure rate approaching 40%. Recently, there has been a revival of resectional surgery for the treatment of multidrug-resistant pulmonary tuberculosis. Methods: A retrospective analysis of case records and radiographic findings was performed. Between January 1991 and December 1995, 14 human immunodeficiency virus (HIV)-seronegative patients with multidrug-resistant pulmonary tuberculosis were selected for resection to supplement chemotherapy. All patients had organisms resistant to many of the first-line drugs, including both isoniazid and rifampin. Results: Despite aggressive therapy for a median duration of 9.5 months, 12 of the 14 patients (86%) were still sputum smear and/or culture positive at the time of surgery. The disease was generally extensive. Although the main lesions, including thick-walled cavities, were localized in one lung, lesser amounts of contralateral disease were demonstrated in 10 of 14 patients (71%). The types of surgery performed were pneumonectomy (including extrapleural pneumonectomy) in six patients, lobectomy or lobectomy plus in six patients, and segmentectomy in two patients. The resected lung appeared to have poor function; the preoperative perfusion lung scan showed only 4.8% of total perfusion going to the resected portion of the lung. There were no operative deaths. Two patients had major postoperative complications: empyema with bronchopleural fistula and prolonged air leak, respectively. Of the 14 patients, 13 (93%) remained sputum-culture-negative for M. tuberculosis for a median duration of 23 months, and one remained continuously sputum smear and culture positive. Conclusion: On the basis of comparison with historical controls, adjunctive resectional surgery appears to play a significant beneficial role in the management of patients with multidrug-resistant pulmonary tuberculosis if the disease is localized and there is adequate reserve in pulmonary function.


A study for Developing Performance Assessment Model of Technology Entrepreneurship Education Based on BSC - A Case Study to Graduate School of Entrepreneurial Management - (BSC(Balanced Scorecard) 기반의 기술창업교육 성과평가모형 개발 연구 - 창업대학원 성과평가지표 분석과 개선방안도출을 중심으로 -)

  • Yang, Young Seok
    • Asia-Pacific Journal of Business Venturing and Entrepreneurship, v.8 no.2, pp.129-139, 2013
  • This paper proposes ways to improve the performance assessment method for the Graduate Schools of Entrepreneurial Management (GSEM) by evaluating the current method, introduced by the SMBA since 2005 to induce fair competition among the five GSEMs across the country and to raise the quality of entrepreneurship education, from the perspective of the Balanced Scorecard (BSC). Ultimately, it addresses the defects in SMBA policy toward the GSEMs, in particular in the process of performance assessment and management. The paper carries out two studies. First, through a review of previous research on BSC applications to non-profit organizations, it sets out the direction for introducing BSC in assessing GSEM performance in order to enhance its effectiveness. Second, it evaluates the rationality of the performance assessment tools applied to the GSEMs by the SMBA on the basis of BSC applications to non-profit organizations, especially educational institutions. The results show the following. First, the current evaluation system for the GSEMs is merely assessment in itself and contributes little to subsequent performance management. Second, the annual evaluation only checks whether policy goals are met. Third, the current evaluation emphasizes financial inputs and hardware infrastructure while neglecting human resources and the utilization of government policies and institutions. Fourth, the policy goals are unilaterally focused on entrepreneurs. Fifth, the current evaluation system contains no indexes related to the learning-and-growth perspective needed for sustainable and independent growth. However, the lack of empirical testing means that further study will be required.
