• Title/Summary/Keyword: Production-process Management

Search Results: 1,551

The changes of economic thought (The trial of supply-side economics) (경제사상의 변화 (공급측면 경제학의 시험))

  • 서홍석
    • Journal of Applied Tourism Food and Beverage Management and Research
    • /
    • v.8
    • /
    • pp.89-121
    • /
    • 1997
  • Many of the measures and policies advocated by supply-siders, such as lower taxation, less government intervention, more freedom from restrictive legislation and regulation, and the need for increased productivity, can be found in the writings of the classical economists. Nor is supply-side economics a complete divorcement from Keynesian analysis. In both camps the objectives are the same: high-level employment, stable prices, and healthy economic growth. The means suggested for attaining these objectives, however, differ, and consequently the recommended economic policies and measures differ as well. Keynesians rely primarily on the manipulation of effective demand to increase output and employment and to combat inflation. They assume ample resources to be available in order that supply will respond to demand. The supply-siders emphasize the need to increase savings, investment, productivity and output as a means of increasing income, and assume that the increase in income will lead to an increase in effective demand. Keynesians suggest that savings, particularly those not invested, dampen economic activity. Supply-siders hold that savings, or at least an increase in after-tax income, stimulate work effort and provide funds for investment. Perhaps Keynesians are guilty of assuming that most savings are not going to be invested, whereas supply-siders may erroneously assume that almost all savings will flow into investment and/or stimulate work effort. In reality, a middle ground is possible. The supply-siders stress the need to increase supply, but Keynes did not preclude the possibility of increasing economic activity by working through the supply side. It must be remembered that Keynes' aggregate supply is really a price, so in his aggregate demand-aggregate supply framework a decrease in the supply price will increase output and employment: lowering the price or cost of supply would thereby result in higher profit and/or higher output.
This coincides with the viewpoint of supply-siders, who want to lower the cost of production via various means for the purpose of increasing supply. Then, too, some of the means recommended by supply-siders to increase productivity and output, such as tax cuts, tax credits and accelerated depreciation, would be favored by Keynesians also as a means of increasing investment, curbing costs, and increasing effective demand. In fact, these very measures were used in the early 1960s in the United States during the years when nagging unemployment was plaguing the economy. Keynesians disagree with the supply-siders' proposals to reduce transfer payments and slow down the process of income redistribution, except in full-employment inflationary periods. Keynesians likewise disagree with tax measures that favor business as opposed to individuals and with the notion of shifting the base of personal taxation away from income and toward spending. A frequent criticism levied at supply-side economics is that it lacks adequate models and thus far has not been quantified to any great extent. But it should be remembered that Keynesian economics originally was also lacking in models and based on a number of unproved assumptions, such as the stability of the consumption function with its declining marginal propensity to consume. Just as the economic catastrophe of the Great Depression of the 1930s paved the way for the application of Keynesian or demand-side policies, perhaps the frustrating and restless conditions of the 1970s and 1980s are an open invitation for the application of supply-side policies. If so, the 1980s and 1990s may prove to be the testing era for the supply-side theories. By the end of the 1990s we should have better supply-side models and know much more about the effectiveness of supply-side policies.
By that time, also, supply-side thinking may be more crystallized and we will learn whether it is something temporary that will fade away, be widely accepted as the new economics replacing Keynesian demand analysis, or something to be continued but melded or fused with demand management.


An Empirical Study on the Success Factors of Implementing Product Life Cycle Management Systems (제품수명주기관리 시스템 도입의 성공요인에 관한 실증연구)

  • Kim, Jeong-Beom
    • Journal of KIISE:Software and Applications
    • /
    • v.37 no.12
    • /
    • pp.909-918
    • /
    • 2010
  • An analysis of Korea's national competitiveness leads to the conclusion that global high-tech enterprises have played the leading role in bringing Korea in line with advanced countries, even though the country lacks natural resources. The characteristics of these companies are as follows. Firstly, these enterprises continue to accumulate core technologies and know-how with highly competent human resources and well-organized management. Secondly, they are well structured and equipped with information technology infrastructures such as ERP, SCM, CRM, and PLM. Among these, PLM is considered the principal core information technology infrastructure in the manufacturing industry. The urgent task of the manufacturing industry today is to develop new products that meet the various needs of consumers and to launch those products in time to market, which requires manufacturers to be equipped with a product development infrastructure and a system that can upgrade product fulfillment and mass production in a short period. The introduction of a PLM system is a core strategic solution for a manufacturer for collaboration, global development, reengineering of the manufacturing system, innovation and efficiency of the manufacturing process, and product quality improvement. The purpose of this study is to analyze the success factors of introducing a PLM system and its practical effectiveness. The results of the empirical study are as follows: (1) technical success factors positively impact system quality and user satisfaction; (2) organizational success factors positively impact system quality, but do not impact user satisfaction; (3) environmental success factors positively impact system quality and user satisfaction; (4) system quality positively impacts user satisfaction; (5) user satisfaction positively impacts the effectiveness of implementing PLM systems, but system quality does not.

A Study on the Online Newspaper Archive : Focusing on Domestic and International Case Studies (온라인 신문 아카이브 연구 국내외 구축 사례를 중심으로)

  • Song, Zoo Hyung
    • The Korean Journal of Archival Studies
    • /
    • no.48
    • /
    • pp.93-139
    • /
    • 2016
  • Aside from serving as a body that monitors and criticizes the government through reviews and comments on public issues, newspapers can also form and spread public opinion. Newspapers contain pictorial records, and local newspapers in particular are an important means of documenting locality. Furthermore, newspaper advertising and newspaper editing practices can be viewed as representations of their times. Because of this archival value, newspapers are considered a top collection priority when a documentation strategy is established. A newspaper archive that handles preservation and management carries huge significance in many ways: journalists use it to write articles, scholars can use it for academic purposes, and NIE programs are a practical application of such an archive. In the digital age, the newspaper archive holds an important position because it sits at the core of MAM, which integrates and manages media assets. There are thus prospects that an online archive will perform a new role in the production of newspapers and the management of publishing companies. The Korea Integrated News Database System (KINDS), an integrated article database, began its service in 1991, and Naver operates an online newspaper archive called "News Library." Initially, KINDS received an enthusiastic response, but its utilization ratio has continued to decrease because of the omission of some major newspapers, such as Chosun Ilbo and JoongAng Ilbo, and its numerous user-interface problems. Despite this, the system still presents several advantages: free access is easy because it is publicly funded, and accessibility to local papers is simple. The national library consistently carries out the digitization of historic newspapers.
In addition, individual newspaper companies have also started such services, but these are not yet substantial enough to be called archives. In the United States (US), "Chronicling America," led by the Library of Congress with funding from the National Endowment for the Humanities, is digitizing historic newspapers. The universities of each state and historical associations provide funds to their public libraries for the digitization of local papers. In the United Kingdom, the British Library is constructing an online newspaper archive called "The British Newspaper Archive," but unlike the one in the US, this service charges a usage fee. The Joint Information Systems Committee has also invested in "The British Newspaper Archive," and its construction is still ongoing. ProQuest Archiver and Gale NewsVault are representative commercial platforms, notable for their efficiency and for establishing standards for digitized newspapers. It is now time to change the way we understand newspaper archives, and substantial investment is required to improve online newspaper archives at home and abroad.

Development of Stand Yield Table Based on Current Growth Characteristics of Chamaecyparis obtusa Stands (현실임분 생장특성에 의한 편백 임분수확표 개발)

  • Jung, Su Young;Lee, Kwang Soo;Lee, Ho Sang;Ji Bae, Eun;Park, Jun Hyung;Ko, Chi-Ung
    • Journal of Korean Society of Forest Science
    • /
    • v.109 no.4
    • /
    • pp.477-483
    • /
    • 2020
  • We constructed a stand yield table for Chamaecyparis obtusa based on data from actual forests. The previous stand yield table had a number of shortcomings because it was not based on current actual forest information. In the present study we used data from more than 200 sampling plots in Chamaecyparis obtusa stands. The analysis included the estimation, recovery, and prediction of the distribution of values for diameter at breast height (DBH), a valuable process in the preparation of stand yield tables. The DBH distribution model uses a Weibull function, and the site index (base age: 30 years), the standard for assessing forest productivity, was derived using the Chapman-Richards formula. Several estimation formulas for the preparation of the stand yield table were compared on a fitness index, and the optimal formula was chosen. The analysis shows that the site index is in the range of 10 to 18 in the Chamaecyparis obtusa stands. The estimated stand volume of each sample plot was found to have an accuracy of 62%. According to the residual analysis, the residuals were evenly distributed around zero, which indicates that the results are useful in the field. Comparing the table constructed in this study to the existing stand yield table, we found that our table yielded comparatively higher values for growth. This is probably because the existing table was built from a small amount of research data that did not properly reflect actual stand conditions. We hope that this stand yield table for Chamaecyparis obtusa, a representative species of the southern regions, will be widely used for forest management. As these forests stabilize and growth progresses, we plan to construct an additional yield table applicable to the production of developed stands.
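The Chapman-Richards site-index step described in the abstract can be sketched as follows. All height-age observations, starting parameters, and the fitted values below are hypothetical illustrations, not data from the study; only the functional form and the base age of 30 years come from the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

# Chapman-Richards growth function: H(t) = a * (1 - exp(-b*t))**c
def chapman_richards(t, a, b, c):
    return a * (1.0 - np.exp(-b * t)) ** c

# Synthetic height-age observations (illustrative, noise-free)
age = np.linspace(5, 60, 12)
true_params = (22.0, 0.07, 1.4)          # asymptote (m), rate, shape
height = chapman_richards(age, *true_params)

# Fit the curve to the observations
params, _ = curve_fit(chapman_richards, age, height, p0=(20.0, 0.05, 1.0))

# Site index = predicted dominant height at the base age (30 years)
site_index = chapman_richards(30.0, *params)
print(round(site_index, 2))
```

In practice the fit would be run per plot on measured dominant heights, and the resulting site indices would be binned to produce the 10-18 range reported above.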

Considerations of Countermeasure Tasks in the Fields of Forest and Forestry in Korea through Case Study on "The Nagoya Protocol (Access to Genetic Resources and Benefit Sharing)" ("유전자원의 접근과 이익공유(ABS)" 사례연구를 통한 국내 산림·임업분야 대응과제 고찰)

  • Lee, Gwan Gyu;Kim, Jun Soon;Jung, Haw young
    • Journal of Korean Society of Forest Science
    • /
    • v.100 no.3
    • /
    • pp.522-534
    • /
    • 2011
  • The aim of this study is to identify the tasks Korea faces in establishing rights over its native biological resources, through a case study on access to genetic resources and benefit-sharing (ABS). For this purpose, the study selected Hoodia, the subject of the most recent ABS agreement among the representative ABS precedents involving plant species. The study analyzed the background of the ABS process for Hoodia, and compared the ABS procedures of the Bonn Guidelines, adopted by the 6th Conference of the Parties to the Convention on Biological Diversity in 2002, with the Hoodia case. Drawing on the major ABS issues common to this analysis and on the Nagoya Protocol, adopted by the 10th Conference of the Parties to the Convention on Biological Diversity, the study sheds light on the impending tasks Korea faces at present and the relationships among the relevant roles. The research results are as follows: 1. Species habitats should be classified on a biological basis and the corresponding communities established, with the development of infrastructure such as a community's independent production, management, and monitoring of biological species. 2. An ABS National Focal Point should be designated for sharing general ABS-related information and boosting implementation of the relevant convention. 3. An ABS convention system should be established through legislative, administrative, and political procedures, and Competent National Authorities should be designated to provide the formats for Prior Informed Consent (PIC) and Mutually Agreed Terms (MAT) and to assess and confirm their contents. 4. An integrated management system should be established for ABS-related research and development of forest biological resources and related research projects. 5.
Information should be developed through the distribution of responsibility and roles among the ministries and offices concerned according to bio-resource type, and efforts should be made to open a working group of academic and industrial institutions to develop a mutually interchangeable system. 6. Efficient access for industry and the public should be promoted by setting up ABS support centers for biological resources in the responsible ministries and offices. 7. A national supervisory organization should be selected to secure the rights of local communities and to monitor implementation of the ABS convention, together with a countermeasure system for preventing the outflow of forest bioresources. In conclusion, these results should make it possible to examine countermeasures for establishing rights over native forest biological resources.

DEVELOPMENT OF STATEWIDE TRUCK TRAFFIC FORECASTING METHOD BY USING LIMITED O-D SURVEY DATA (한정된 O-D조사자료를 이용한 주 전체의 트럭교통예측방법 개발)

  • 박만배
    • Proceedings of the KOR-KST Conference
    • /
    • 1995.02a
    • /
    • pp.101-113
    • /
    • 1995
  • The objective of this research is to test the feasibility of developing a statewide truck traffic forecasting methodology for Wisconsin by using Origin-Destination surveys, traffic counts, classification counts, and other data that are routinely collected by the Wisconsin Department of Transportation (WisDOT). Development of a feasible model will permit estimation of future truck traffic for every major link in the network. This will provide the basis for improved estimation of future pavement deterioration. Pavement damage rises exponentially as axle weight increases, and trucks are responsible for most of the traffic-induced damage to pavement. Consequently, forecasts of truck traffic are critical to pavement management systems. The Pavement Management Decision Supporting System (PMDSS) prepared by WisDOT in May 1990 combines pavement inventory and performance data with a knowledge base consisting of rules for evaluation, problem identification and rehabilitation recommendation. Without a reasonable truck traffic forecasting methodology, PMDSS is not able to project pavement performance trends in order to make assessments and recommendations in future years. However, none of WisDOT's existing forecasting methodologies has been designed specifically for predicting truck movements on a statewide highway network. For this research, the Origin-Destination survey data available from WisDOT, including two stateline areas, one county, and five cities, are analyzed and the zone-to-zone truck trip tables are developed. The resulting Origin-Destination Trip Length Frequency (OD TLF) distributions by trip type are applied to the Gravity Model (GM) for comparison with comparable TLFs from the GM. The gravity model is calibrated to obtain friction factor curves for the three trip types: Internal-Internal (I-I), Internal-External (I-E), and External-External (E-E). Both "macro-scale" calibration and "micro-scale" calibration are performed.
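The gravity-model step above can be sketched with a minimal production-constrained formulation. The zone sizes, distances, and the negative-exponential friction form below are illustrative assumptions, not the calibrated friction factor curves from the study:

```python
import numpy as np

# Production-constrained gravity model:
#   T_ij = P_i * A_j * F(d_ij) / sum_k [ A_k * F(d_ik) ]
# where F is the friction factor curve being calibrated.

P = np.array([100.0, 200.0])          # zonal trip productions
A = np.array([150.0, 150.0])          # zonal trip attractions
d = np.array([[5.0, 20.0],
              [20.0, 5.0]])           # zone-to-zone travel distances

def friction(dist, beta=0.1):
    return np.exp(-beta * dist)       # a common friction-factor form

F = friction(d)
T = P[:, None] * (A * F) / (A * F).sum(axis=1, keepdims=True)

print(T.sum(axis=1))                  # row sums reproduce the productions
```

Calibration then amounts to adjusting the friction curve (here, `beta`) until the trip length frequency distribution of `T` matches the observed OD TLF.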
The comparison of the statewide GM TLF with the OD TLF for the macro-scale calibration does not provide suitable results because the available OD survey data do not represent an unbiased sample of statewide truck trips. For the "micro-scale" calibration, "partial" GM trip tables that correspond to the OD survey trip tables are extracted from the full statewide GM trip table. These "partial" GM trip tables are then merged and a partial GM TLF is created. The GM friction factor curves are adjusted until the partial GM TLF matches the OD TLF. Three friction factor curves, one for each trip type, resulting from the micro-scale calibration produce a reasonable GM truck trip model. A key methodological issue for GM calibration involves the use of multiple friction factor curves versus a single friction factor curve for each trip type in order to estimate truck trips with reasonable accuracy. A single friction factor curve for each of the three trip types was found to reproduce the OD TLFs from the calibration data base. Given the very limited trip generation data available for this research, additional refinement of the gravity model using multiple friction factor curves for each trip type was not warranted. In traditional urban transportation planning studies, the zonal trip productions and attractions and region-wide OD TLFs are available. However, for this research, the information available for the development of the GM model is limited to Ground Counts (GC) and a limited set of OD TLFs. The GM is calibrated using the limited OD data, but the OD data are not adequate to obtain good estimates of truck trip productions and attractions. Consequently, zonal productions and attractions are estimated using zonal population as a first approximation. Then, Selected Link based (SELINK) analyses are used to adjust the productions and attractions and possibly recalibrate the GM.
The SELINK adjustment process involves identifying the origins and destinations of all truck trips that are assigned to a specified "selected link" as the result of a standard traffic assignment. A link adjustment factor is computed as the ratio of the actual volume for the link (ground count) to the total assigned volume. This link adjustment factor is then applied to all of the origin and destination zones of the trips using that "selected link". Selected link based analyses are conducted by using both 16 selected links and 32 selected links. The result of SELINK analysis by using 32 selected links provides the least %RMSE in the screenline volume analysis. In addition, the stability of the GM truck estimating model is preserved by using 32 selected links with three SELINK adjustments; that is, the GM remains calibrated despite substantial changes in the input productions and attractions. The coverage of zones provided by 32 selected links is satisfactory. Increasing the number of repetitions beyond four is not reasonable because the stability of the GM model in reproducing the OD TLF reaches its limits. The total volume of truck traffic captured by 32 selected links is 107% of total trip productions. But more importantly, SELINK adjustment factors for all of the zones can be computed. Evaluation of the travel demand model resulting from the SELINK adjustments is conducted by using screenline volume analysis, functional class and route specific volume analysis, area specific volume analysis, production and attraction analysis, and Vehicle Miles of Travel (VMT) analysis. Screenline volume analysis using four screenlines with 28 check points is used for evaluation of the adequacy of the overall model. The total trucks crossing the screenlines are compared to the ground count totals. LV/GC ratios of 0.958 by using 32 selected links and 1.001 by using 16 selected links are obtained.
The %RMSE for the four screenlines is inversely proportional to the average ground count totals by screenline. The magnitude of %RMSE for the four screenlines resulting from the fourth and last GM run by using 32 and 16 selected links is 22% and 31% respectively. These results are similar to the overall %RMSE achieved for the 32 and 16 selected links themselves of 19% and 33% respectively. This implies that the SELINK analysis results are reasonable for all sections of the state. Functional class and route specific volume analysis is possible by using the available 154 classification count check points. The truck traffic crossing the Interstate highways (ISH) with 37 check points, the US highways (USH) with 50 check points, and the State highways (STH) with 67 check points is compared to the actual ground count totals. The magnitude of the overall link volume to ground count ratio by route does not provide any specific pattern of over- or underestimation. However, the %RMSE for the ISH shows the least value while that for the STH shows the largest value. This pattern is consistent with the screenline analysis and the overall relationship between %RMSE and ground count volume groups. Area specific volume analysis provides another broad statewide measure of the performance of the overall model. The truck traffic in the North area with 26 check points, the West area with 36 check points, the East area with 29 check points, and the South area with 64 check points is compared to the actual ground count totals. The four areas show similar results. No specific patterns in the LV/GC ratio by area are found. In addition, the %RMSE is computed for each of the four areas. The %RMSEs for the North, West, East, and South areas are 92%, 49%, 27%, and 35% respectively, whereas the average ground counts are 481, 1383, 1532, and 3154 respectively. As for the screenline and volume range analyses, the %RMSE is inversely related to average link volume.
The SELINK adjustments of productions and attractions resulted in a very substantial reduction in the total in-state zonal productions and attractions. The initial in-state zonal trip generation model can now be revised with a new trip production trip rate (total adjusted productions/total population) and a new trip attraction trip rate. Revised zonal production and attraction adjustment factors can then be developed that only reflect the impact of the SELINK adjustments that cause increases or decreases from the revised zonal estimates of productions and attractions. Analysis of the revised production adjustment factors is conducted by plotting the factors on the state map. The east area of the state, including the counties of Brown, Outagamie, Shawano, Winnebago, Fond du Lac, and Marathon, shows comparatively large values of the revised adjustment factors. Overall, both small and large values of the revised adjustment factors are scattered around Wisconsin. This suggests that more independent variables beyond just population are needed for the development of the heavy truck trip generation model. More independent variables, including zonal employment data (office employees and manufacturing employees) by industry type, zonal private trucks owned, and zonal income data, which are not currently available, should be considered. A plot of the frequency distribution of the in-state zones as a function of the revised production and attraction adjustment factors shows the overall adjustment resulting from the SELINK analysis process. Overall, the revised SELINK adjustments show that the productions for many zones are reduced by a factor of 0.5 to 0.8 while the productions for a relatively few zones are increased by factors from 1.1 to 4, with most of the factors in the 3.0 range. No obvious explanation for the frequency distribution could be found. The revised SELINK adjustments overall appear to be reasonable.
The heavy truck VMT analysis is conducted by comparing the 1990 heavy truck VMT forecasted by the GM truck forecasting model, 2.975 billion, with the WisDOT computed data. This gives an estimate that is 18.3% less than the WisDOT computation of 3.642 billion VMT. The WisDOT estimates are based on sampling the link volumes for USH, STH, and CTH. This implies potential error in sampling the average link volume. The WisDOT estimate of heavy truck VMT cannot be tabulated by the three trip types, I-I, I-E (E-I), and E-E. In contrast, the GM forecasting model shows that the proportion of E-E VMT out of total VMT is 21.24%. In addition, tabulation of heavy truck VMT by route functional class shows that the proportion of truck traffic traversing the freeways and expressways is 76.5%. Only 14.1% of total freeway truck traffic is I-I trips, while 80% of total collector truck traffic is I-I trips. This implies that freeways are traversed mainly by I-E and E-E truck traffic while collectors are used mainly by I-I truck traffic. Other tabulations, such as average heavy truck speed by trip type, average travel distance by trip type, and the VMT distribution by trip type, route functional class and travel speed, are useful information for highway planners to understand the characteristics of statewide heavy truck trip patterns. Heavy truck volumes for the target year 2010 are forecasted by using the GM truck forecasting model. Four scenarios are used. For better forecasting, ground count-based segment adjustment factors are developed and applied. ISH 90 & 94 and USH 41 are used as example routes. The forecasting results using the ground count-based segment adjustment factors are satisfactory for long range planning purposes, but additional ground counts would be useful for USH 41. Sensitivity analysis provides estimates of the impacts of the alternative growth rates, including information about changes in the trip types using key routes.
The network-based GM can easily model scenarios with different rates of growth in rural versus urban areas, small versus large cities, and in-state zones versus external stations.
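The SELINK adjustment step described above can be sketched as follows: the factor for a selected link is its ground count divided by its assigned volume, and it is applied to the productions and attractions of the zones whose trips use that link. The zone numbers and volumes here are illustrative, not data from the study:

```python
# Link adjustment factor = actual (ground count) volume / assigned volume
def selink_factor(ground_count, assigned_volume):
    return ground_count / assigned_volume

# Assigned trips using the selected link, keyed by (origin_zone, dest_zone)
trips_on_link = {(1, 4): 120.0, (2, 4): 80.0}
productions = {1: 500.0, 2: 300.0}
attractions = {4: 800.0}

f = selink_factor(ground_count=1500.0, assigned_volume=1200.0)

# Apply the factor once to each origin and destination zone involved
for o in {o for o, _ in trips_on_link}:
    productions[o] *= f
for d in {_d for _, _d in trips_on_link}:
    attractions[d] *= f

print(f, productions[1], attractions[4])
```

After scaling, the gravity model would be rerun with the adjusted productions and attractions, and the process repeated (the study found four repetitions to be the practical limit).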


New Estimates of CH4 Emission Scaling Factors by Amount of Rice Straw Applied from Korea Paddy Fields (볏짚 시용에 따른 벼 재배 논에서의 메탄 배출계수 개발에 관한 연구)

  • Ju, Okjung;Won, Tae-Jin;Cho, Kwang-Rae;Choi, Byoung-Rourl;Seo, Jae-Sun;Park, In-Tae;Kim, Gun-Yeob
    • Korean Journal of Environmental Agriculture
    • /
    • v.32 no.3
    • /
    • pp.179-184
    • /
    • 2013
  • BACKGROUND: Accurate estimates of total direct $CH_4$ emissions from croplands on a country scale are important for global budgets of anthropogenic sources of $CH_4$ emissions and for the development of effective mitigation strategies. Methane production results from the anaerobic decomposition of organic compounds, with $CO_2$ acting as the inorganic electron acceptor. This process can be affected by the addition of rice straw, by water management, and by the rice variety itself. METHODS AND RESULTS: Rice (Oryza sativa L. Japonica type, var. Samkwangbyeo) was cultivated in four plots: (1) Nitrogen-Phosphorus-Potassium (NPK) ($N-P_2O_5-K_2O$: 90-45-57 kg/ha); (2) NPK plus 3 Mg/ha rice straw (RS3); (3) NPK plus 5 Mg/ha rice straw (RS5); and (4) NPK plus 7 Mg/ha rice straw (RS7), for 3 years (2010-2012) in Hwaseong-si, Gyeonggi-do, with the rice straw incorporated in fall (Nov.). Gas samples were collected using closed static chambers installed in each treated plot of $152.9m^2$. With the application of 3, 5, and 7 Mg/ha of rice straw, methane emission increased by 46, 101, and 190%, respectively, compared to that of the NPK plot. CONCLUSION(S): We obtained a quantitative relationship between $CH_4$ emission and the amount of rice straw applied to rice fields, which could be described by a polynomial regression of order 2. The emission scaling factors estimated from the relationship were in the range given by the IPCC GPG (2000).
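The order-2 polynomial relationship reported above can be sketched as follows. The relative-emission values are back-computed from the reported increases (46, 101, and 190% over the NPK plot) and are used here only to illustrate the regression step, not to reproduce the paper's fitted coefficients:

```python
import numpy as np

straw = np.array([0.0, 3.0, 5.0, 7.0])        # Mg/ha rice straw applied
rel_ch4 = np.array([1.00, 1.46, 2.01, 2.90])  # emission relative to NPK plot

# Quadratic (order-2) scaling-factor curve, as in the abstract
coeffs = np.polyfit(straw, rel_ch4, 2)

print(np.polyval(coeffs, 5.0))                # predicted factor at 5 Mg/ha
```

The fitted curve then serves as the emission scaling factor as a function of straw application rate.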

Improvement in Chicken Meat Pricing System in Korea (한국의 닭고기 가격 결정 시스템 개선)

  • Kim, J.J.;Park, B.K.
    • Korean Journal of Poultry Science
    • /
    • v.35 no.4
    • /
    • pp.327-333
    • /
    • 2009
  • In Korea, chicken meat price is not determined in auction markets; it is artificially calculated using the live chicken price of the previous day, transporting cost, the conversion rate of live chicken into carcass, and slaughtering cost. This calculated price is published through the mass media and used as the base for chicken meat transactions. However, since 85% of the Korean broiler industry is organized under the integrated system, the live chicken price has nothing to do with the ex-factory price of chicken meat produced by the integrators. Under this pricing system, when we estimate the margins through the marketing process, the margin of the integrator fluctuates with the live chicken price of the previous day, which has nothing to do with the integrators: when the live chicken price is low, the margin of the integrators is low, but the selling agencies' margin is relatively high. In contrast, when the live chicken price is high, the margin of the integrators is high, but the selling agencies' margin is relatively low, because the consumer price cannot be increased in parallel with the increase of the live chicken price. Accordingly, an ex-factory price of chicken meat, determined from the integrator's live chicken production cost and slaughtering cost plus a reasonable integrator margin, should be set and published, so that it can be used for chicken meat transactions. In Japan, the Zen-Noh Chicken Foods Corporation announces the ideal price of chicken every morning, and all transactions of chicken meat are based on this price. In Korea, it would be desirable to benchmark the Japanese case; in other words, the NH could announce the ideal price of chicken meat every morning, so that it would become the base price for chicken meat transactions.
Even though the market share of the NH is less than 5%, its published price should carry public authority, since it is a subsidiary of the National Agricultural Cooperative Federation of Korea.
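The current calculation criticized above can be sketched as simple arithmetic: the previous day's live price is scaled up by the live-to-carcass conversion rate, then transport and slaughtering costs are added. All numbers and the exact formula here are illustrative assumptions, not figures from the paper:

```python
# Hypothetical sketch of the Korean chicken meat pricing arithmetic:
# 1 kg of carcass requires more than 1 kg of live weight, so the live
# price is divided by the carcass yield before adding per-kg costs.
def chicken_meat_price(live_price_per_kg, carcass_yield,
                       transport_cost_per_kg, slaughter_cost_per_kg):
    return (live_price_per_kg / carcass_yield
            + transport_cost_per_kg + slaughter_cost_per_kg)

# Illustrative values in KRW per kg, with a 70% carcass yield
price = chicken_meat_price(1500.0, 0.70, 50.0, 300.0)
print(round(price, 1))
```

The paper's proposed alternative would replace the live-price term with the integrator's production cost plus a reasonable margin.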

Larch Pellets Fabricated with Coffee Waste and the Commercializing Potential of the Pellets (커피박과 낙엽송 목분을 이용한 펠릿 제조 및 이에 대한 상용화 검토)

  • Yang, In;Han, Gyu Seong;Oh, Seung Won
    • Journal of the Korean Wood Science and Technology
    • /
    • v.46 no.1
    • /
    • pp.48-59
    • /
    • 2018
  • This study was conducted to suggest effective management and recycling processes for coffee waste, which can easily be obtained from coffee shops and coffee-related industries. Prior to the fabrication of pellets, the potential of coffee waste as a raw material for pellets was investigated through the examination of its chemical composition and fuel characteristics. The major component of coffee waste was holocellulose, followed by fat/oil and protein. Coffee waste contained a small quantity of ash (0.7%), including calcium, sodium, potassium and magnesium. Interestingly, coffee waste was easily dried, probably due to its porous structure. Pellets fabricated with coffee waste and larch sawdust showed good fuel characteristics, such as moisture content, ash content, density and durability. The pellets greatly exceeded the minimum requirements of the $1^{st}$-grade wood pellet standard designated by the National Institute of Forest Science (NIFOS). In particular, the high calorific value of coffee waste showed its potential as a raw material for pellets. However, owing to its high nitrogen and sulfur contents, coffee waste is likely to be usable as a raw material for wood pellets only in combined heat and power plants equipped with reduction systems for $NO_x$ and $SO_x$ gases. On the other hand, 91 wt% larch sawdust and 9 wt% coffee waste are required to fabricate $1^{st}$-grade wood pellets as designated by NIFOS. Pellets fabricated under these conditions are estimated to have a nitrogen content of 0.298% and a sulfur content of 0.03%. Lastly, if the amounts of coffee waste and sawdust in the production of wood pellets are adequately adjusted according to their purchasing prices, the manufacturing cost of pellets can effectively be reduced. In addition, the commercialization of coffee waste/larch pellets is expected to provide an effective recycling process for the waste and to relieve the environmental burden by reducing waste.
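The 91:9 blend estimate above is a weighted average of the component properties. The component nitrogen contents below are illustrative assumptions (coffee grounds are nitrogen-rich, wood is not), chosen only to show the mixing arithmetic; they are not values reported in the paper:

```python
# Weighted-average property of a pellet blend from its component fractions
def blend_content(fractions, contents):
    return sum(f * c for f, c in zip(fractions, contents))

# wt% nitrogen: larch sawdust then coffee waste (hypothetical values)
n_blend = blend_content([0.91, 0.09], [0.10, 2.30])
print(round(n_blend, 3))
```

The same arithmetic applies to sulfur, ash, or calorific value, which is how the blend ratio can be tuned to just meet a grade limit at the lowest raw-material cost.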

The Prediction of DEA based Efficiency Rating for Venture Business Using Multi-class SVM (다분류 SVM을 이용한 DEA기반 벤처기업 효율성등급 예측모형)

  • Park, Ji-Young;Hong, Tae-Ho
    • Asia pacific journal of information systems
    • /
    • v.19 no.2
    • /
    • pp.139-155
    • /
    • 2009
  • For the last few decades, many studies have tried to explore and unveil venture companies' success factors and unique features in order to identify the sources of such companies' competitive advantages over their rivals. Venture companies have tended to deliver high returns for investors, generally by making the best use of information technology, and for this reason many of them are keen on attracting avid investors' attention. Investors generally make their investment decisions by carefully examining the evaluation criteria of the alternatives. To them, credit rating information provided by international rating agencies such as Standard & Poor's, Moody's, and Fitch is a crucial source on such pivotal concerns as a company's stability, growth, and risk status. But this type of information is generated only for companies issuing corporate bonds, not for venture companies. Therefore, this study proposes a method for evaluating venture businesses, presenting recent empirical results using financial data of Korean venture companies listed on KOSDAQ in the Korea Exchange. In addition, this paper used a multi-class SVM to predict the DEA-based efficiency ratings for venture businesses derived from the proposed method. Our approach sheds light on ways to locate efficient companies generating a high level of profits. Above all, in determining effective ways to evaluate a venture firm's efficiency, it is important to understand the major contributing factors of such efficiency. Therefore, this paper is built on the following two ideas for classifying which companies are the more efficient venture companies: i) making a DEA-based multi-class rating for the sample companies, and ii) developing a multi-class SVM-based efficiency prediction model for classifying all companies.
First, Data Envelopment Analysis (DEA) is a non-parametric multiple input-output efficiency technique that measures the relative efficiency of decision making units (DMUs) using a linear programming based model. It is non-parametric because it requires no assumption about the shape or parameters of the underlying production function. DEA has already been widely applied to evaluating the relative efficiency of DMUs. Recently, a number of DEA-based studies have evaluated the efficiency of various types of companies, such as internet companies and venture companies, and DEA has also been applied to corporate credit ratings. In this study we utilized DEA to sort venture companies into efficiency-based ratings. The Support Vector Machine (SVM), on the other hand, is a popular technique for solving data classification problems. In this paper, we employed SVM to classify the efficiency ratings of IT venture companies according to the results of DEA. The SVM method was first developed by Vapnik (1995). As one of many machine learning techniques, SVM is grounded in statistical learning theory, and it has thus far shown good performance, especially in its generalization capacity on classification tasks, resulting in numerous applications in many areas of business. SVM is basically an algorithm that finds the maximum margin hyperplane, i.e., the hyperplane giving the maximum separation between classes; the support vectors are the data points closest to this hyperplane. If the classes are not linearly separable, a kernel function can be used: in the case of nonlinear class boundaries, the inputs are mapped from the original input space into a high-dimensional dot-product feature space. Many studies have applied SVM to bankruptcy prediction, financial time series forecasting, and credit rating estimation. In this study we employed SVM to develop a data mining-based efficiency prediction model.
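The linear program behind DEA can be sketched concretely. Below is a minimal illustration of the standard input-oriented CCR envelopment model, one LP per DMU, solved with SciPy; the function name and the toy data are ours, not the paper's:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y):
    """Input-oriented CCR efficiency score for each DMU.

    X: (n_dmus, n_inputs) input matrix, Y: (n_dmus, n_outputs) output matrix.
    For DMU o we minimize theta subject to
        sum_j lambda_j * x_ij <= theta * x_io   (inputs)
        sum_j lambda_j * y_rj >= y_ro           (outputs)
        lambda_j >= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # Decision variables: z = [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]                       # minimize theta
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])      # inputs <= theta*x_o
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])       # outputs >= y_o
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(0, None)] * (n + 1),
                      method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# Toy example: four DMUs, two inputs, one common output.
X = np.array([[2.0, 3.0], [4.0, 2.0], [4.0, 6.0], [6.0, 2.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
print(dea_ccr_efficiency(X, Y).round(3))
```

In the toy data, the third DMU uses exactly twice the inputs of the first for the same output, so its score is 0.5; DMUs on the frontier score 1.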
We used the Gaussian radial basis function as the kernel function of the SVM. For the multi-class SVM, we adopted the one-against-one binary classification approach and two all-together methods, proposed by Weston and Watkins (1999) and Crammer and Singer (2000), respectively. In this research, we used corporate information on 154 companies listed on the KOSDAQ market in the Korea Exchange. We obtained the companies' financial information for 2005 from KIS (Korea Information Service, Inc.). Using this data, we made a multi-class rating with DEA efficiency and built a data mining-based multi-class prediction model. Among the three multi-classification methods, the hit ratio of the Weston and Watkins method was the best on the test data set. In multi-classification problems such as efficiency ratings of venture businesses, it is very useful for investors to know the class to within a one-class error when the exact class is difficult to determine in the actual market. So we also present accuracy results within one-class errors, where the Weston and Watkins method showed 85.7% accuracy on our test samples. We conclude that the DEA-based multi-class approach for venture businesses generates more information than a binary classification problem, notwithstanding the companies' efficiency levels. We believe this model can help investors in decision making, as it provides a reliable tool for evaluating venture companies in the financial domain. For future research, we perceive the need to enhance such areas as the variable selection process, the parameter selection of the kernel function, generalization, and the sample size for multi-class problems.
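A pipeline of this kind can be sketched with scikit-learn, whose `SVC` uses the RBF kernel by default and decomposes multi-class problems one-against-one internally. The synthetic three-class ratings below are placeholders standing in for DEA-derived labels, not the paper's KOSDAQ data:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for DEA-derived efficiency ratings: three rating
# classes, each a cluster of four "financial ratio" features.
n_per_class, n_features = 50, 4
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(n_per_class, n_features))
               for c in range(3)])
y = np.repeat([0, 1, 2], n_per_class)  # ordinal efficiency rating labels

# RBF-kernel SVM; SVC handles multi-class via one-against-one voting.
model = make_pipeline(StandardScaler(),
                      SVC(kernel="rbf", C=1.0, gamma="scale",
                          decision_function_shape="ovo"))
model.fit(X, y)

# Exact hit ratio vs. the "within one class" accuracy the study reports:
pred = model.predict(X)
exact = (pred == y).mean()
within_one = (np.abs(pred - y) <= 1).mean()
print(f"exact: {exact:.3f}, within one class: {within_one:.3f}")
```

Because the ratings are ordinal, the within-one-class metric is never lower than the exact hit ratio, which mirrors why the authors report both figures.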