• Title/Summary/Keyword: road system (도로 시스템)


Effects of Non-ionic Surfactant Tween 80 on the in vitro Gas Production, Dry Matter Digestibility, Enzyme Activity and Microbial Growth Rate by Rumen Mixed Microorganisms (비이온성 계면활성제 Tween 80의 첨가가 반추위 혼합 미생물에 의한 in vitro 가스발생량, 건물소화율, 효소활력 및 미생물 성장율에 미치는 영향)

  • Lee, Shin-Ja;Kim, Wan-Young;Moon, Yea-Hwang;Kim, Hyeon-Shup;Kim, Kyoung-Hoon;Ha, Jong-Kyu;Lee, Sung-Sil
    • Journal of Life Science / v.17 no.12 / pp.1660-1668 / 2007
  • The non-ionic surfactant (NIS) Tween 80 was evaluated for its ability to influence in vitro cumulative gas production, dry matter digestibility, cellulolytic enzyme activities, anaerobic microbial growth rates, and adhesion to substrates by mixed rumen microorganisms grown on rice straw, alfalfa hay, cellulose filter paper and tall fescue hay. The addition of NIS Tween 80 at a level of 0.05% significantly (P<0.05) increased in vitro DM digestibility, cumulative gas production, microbial growth rate and cellulolytic enzyme activity for all substrates used in this study. In vitro cumulative gas production from the NIS-treated substrates (rice straw, alfalfa hay, filter paper and tall fescue hay) was significantly (P<0.05) improved by 274.8, 235.2, 231.1 and 719.5%, respectively, compared with the control when substrates were incubated for 48 hr in vitro. The addition of 0.05% NIS Tween 80 to cultures growing on alfalfa hay resulted in a significant increase in CMCase (38.1%), xylanase (121.4%) and amylase (38.2%) activities after 36 h of incubation, while Avicelase activity was unchanged. These results indicate that the addition of 0.05% Tween 80 can greatly stimulate the release of certain cellulolytic enzymes without decreasing cell growth rate, in contrast to trends reported for aerobic microorganisms. Our SEM observations showed that NIS Tween 80 did not influence microbial adhesion to the substrates used in the study. The present data clearly show that the improved gas production, DM digestibility and cellulolytic enzyme activity with Tween 80 are not due to increased bacterial adhesion to the substrates.

The Impact of Service Level Management(SLM) Process Maturity on Information Systems Success in Total Outsourcing: An Analytical Case Study (토털 아웃소싱 환경 하에서 IT서비스 수준관리(Service Level Management) 프로세스 성숙도가 정보시스템 성공에 미치는 영향에 관한 분석적 사례연구)

  • Cho, Geun Su;An, Joon Mo;Min, Hyoung Jin
    • Asia pacific journal of information systems / v.23 no.2 / pp.21-39 / 2013
  • As the utilization of information technology and the turbulence of technological change increase in organizations, the adoption of IT outsourcing also grows as a way to manage IT resources more effectively and efficiently. In this approach to IT management, the service level management (SLM) process becomes critical to deriving success from outsourcing as seen by end users in the organization. Although much research on service level management and service level agreements has been done over the last decades, the performance of the SLM process has rarely been evaluated in terms of the final objectives of the management effort, that is, success from the end users' point of view. This study explores the relationship between SLM maturity and IT outsourcing success from the users' point of view through an analytical case study of four client organizations served by a single IT outsourcing vendor, a member company of a major Korean conglomerate. To set up a model for the analysis, previous research on SLM process maturity and information systems success was reviewed. In particular, information systems success from the users' point of view was reviewed based on DeLone and McLean's model, which is widely accepted as a comprehensively tested model of information systems success. The model proposed in this study argues that SLM process maturity influences information systems success, evaluated in terms of the information quality, systems quality, service quality, and net effects proposed by DeLone and McLean. SLM process maturity is measured for the planning process, the implementation process, and the operation and evaluation process. Instruments for measuring the factors in the proposed constructs of information systems success and SLM process maturity were drawn from previous research and evaluated for reliability and validity using appropriate statistical methods and pilot tests before the case study. Four cases from four different client companies of one vendor were used for the analysis. All of the cases had been contracted under SLAs (Service Level Agreements) and had implemented ITIL (IT Infrastructure Library), Six Sigma and BSC (Balanced Scorecard) methods over the last several years, which means that all the client organizations had made concerted efforts to acquire quality services from IT outsourcing from the organizational and users' points of view. To compare the differences among the four organizations in IT outsourcing success, t-tests and non-parametric analyses were applied to the data collected from the organizations using the survey instruments. The process maturities of the planning and implementation phases of SLM were found not to influence any dimension of information systems success from the users' point of view. Only the SLM maturity of the operation and evaluation phase was found to influence systems quality as perceived by users. This result runs counter to common arguments in IT outsourcing practice, which usually emphasize the importance of the planning and implementation processes up front in IT outsourcing projects.
According to an after-the-fact observation by an expert in one of the participating organizations, the clients' needs and motivations for the outsourcing contracts were already quite familiar to the vendor, a long-term partner under the same conglomerate, so maturity in the planning and implementation phases does not appear to be a differentiating factor for the success of IT outsourcing. This study provides a foundation for future research on IT outsourcing management and success, in particular on service level management. It can also guide practicing managers of IT outsourcing to focus on the SLM process in the operation and evaluation stage, especially for long-term outsourcing contracts in a unique context such as Korean IT outsourcing projects. This study has limitations in generalization because the sample size is small and the context is confined to a unique environment. For future exploration, survey-based research could be designed and implemented.
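To make the comparison method concrete, the following is a minimal sketch, assuming hypothetical survey scores, of the kind of t-test and non-parametric comparison of user-perceived quality between two client organizations that the study describes; it is not the authors' actual analysis.

```python
# Minimal sketch of comparing user-perceived systems-quality scores between
# two client organizations with a t-test and a non-parametric alternative.
# The scores below are hypothetical placeholders, not data from the study.
from scipy import stats

org_a_scores = [5.2, 4.8, 5.5, 4.9, 5.1, 5.4, 4.7]   # survey scores, org A (hypothetical)
org_b_scores = [4.1, 4.4, 3.9, 4.6, 4.2, 4.0, 4.5]   # survey scores, org B (hypothetical)

# Parametric comparison (Welch's t-test; assumes roughly normal scores)
t_stat, t_p = stats.ttest_ind(org_a_scores, org_b_scores, equal_var=False)

# Non-parametric comparison (no normality assumption)
u_stat, u_p = stats.mannwhitneyu(org_a_scores, org_b_scores, alternative="two-sided")

print(f"Welch t-test:   t={t_stat:.2f}, p={t_p:.3f}")
print(f"Mann-Whitney U: U={u_stat:.1f}, p={u_p:.3f}")
```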


Effect of Packaging Systems with High CO2 Treatment on the Quality Changes of Fig (Ficus carica L) during Storage (저장 중 무화과(Ficus carica L) 선도유지를 위한 고농도 이산화탄소 처리된 포장 시스템 적용 연구)

  • Kim, Jung-Soo;Chung, Dae-Sung;Lee, Youn Suk
    • Food Science and Preservation / v.19 no.6 / pp.799-806 / 2012
  • This experiment was conducted to establish the optimum conditions for high-CO₂ gas treatment in combination with a suitable gas-permeable packaging film to maintain the quality of fig fruit (Ficus carica L). Among fig fruits given different high-CO₂ treatments, quality change during storage was most effectively controlled in the 70% CO₂-treated fruit. Harvested fig fruit was packaged using microperforated oriented polypropylene (MP) film to maintain the optimum gas concentrations in the package headspace for the modified-atmosphere system. The MP film had an oxygen transmission rate of about 10,295 cm³/m²/day/atm at 25°C. The weight loss, firmness, soluble-solids content (SSC), acidity (pH), skin color (Hunter L, a, b), and decay ratio of the fig fruits were monitored during storage at 5 and 25°C. The results showed that the OPP film, OPP film + 70% CO₂, and MP film + 70% CO₂ treatments were highly effective in reducing the loss rate, loss of firmness, and decay occurrence rate of the packaged fig fruits during storage. In the case of the OPP film and OPP film + 70% CO₂ packages, however, adverse effects such as package bursting or physiological injury of the fig may occur due to gas pressure or prolonged exposure to CO₂. Therefore, the results indicate that MP film containing 70% CO₂ can be used as an effective treatment to extend the freshness of fig fruit during storage at a suitably low temperature.
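As a rough illustration of how the reported oxygen transmission rate translates into gas exchange, the sketch below estimates daily O₂ ingress through the MP film; the package area and the O₂ partial-pressure difference are assumed values, not figures from the study.

```python
# Rough estimate of daily O2 ingress through the MP film from the reported
# oxygen transmission rate (OTR). Film area and O2 partial-pressure difference
# are illustrative assumptions only.
OTR = 10_295           # cm^3 O2 / m^2 / day / atm at 25 C (reported for MP film)
area_m2 = 0.06         # package film surface area in m^2 (assumption)
delta_p_o2_atm = 0.15  # O2 partial-pressure difference across the film in atm (assumption)

o2_ingress_cm3_per_day = OTR * area_m2 * delta_p_o2_atm
print(f"Estimated O2 ingress: {o2_ingress_cm3_per_day:.0f} cm^3/day")
```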

Estimation of Soybean Growth Using Polarimetric Discrimination Ratio by Radar Scatterometer (레이더 산란계 편파 차이율을 이용한 콩 생육 추정)

  • Kim, Yi-Hyun;Hong, Suk-Young
    • Korean Journal of Soil Science and Fertilizer / v.44 no.5 / pp.878-886 / 2011
  • The soybean is one of the oldest cultivated crops in the world. Microwave remote sensing is an important tool because it can penetrate clouds regardless of weather and can acquire data day or night. In particular, a ground-based polarimetric scatterometer has the advantage of monitoring crop conditions continuously with full polarization at different frequencies. In this study, soybean growth parameters and soil moisture were estimated using the polarimetric discrimination ratio (PDR) from a radar scatterometer. A ground-based polarimetric scatterometer operating at multiple frequencies was used to continuously monitor soybean growth and soil moisture change; it was set up to acquire data automatically every 10 minutes. The temporal trend of the PDR for all bands agreed with soybean growth data such as fresh weight, leaf area index (LAI), vegetation water content and plant height; that is, it increased until about DOY 271 and decreased afterward. Soil moisture was only weakly related to PDR in all bands over the whole growth period; in contrast, PDR was relatively well correlated with soil moisture while LAI was below 2. We also analyzed the relationship between the PDR of each band and the growth data. L-band PDR was the most highly correlated with fresh weight (r=0.96), LAI (r=0.91), vegetation water content (r=0.94) and soil moisture (r=0.86). The relationships between C- and X-band PDR and the growth data were also well correlated (r ≥ 0.83), with the exception of soil moisture. Based on the analysis of the relationship between PDR at L-, C- and X-band and the soybean growth parameters, we predicted the growth parameters and soil moisture using L-band PDR. Overall, good agreement was observed between retrieved and observed growth data. The results of this study show that PDR is effective for estimating soybean growth parameters and soil moisture.
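The band-wise correlation and retrieval analysis described above can be illustrated with a minimal sketch; the PDR and LAI values below are hypothetical placeholders, not data from the study.

```python
# Sketch of the correlation analysis: Pearson r between an L-band PDR time
# series and a growth variable such as LAI, followed by a simple linear fit
# for retrieval. All values are hypothetical placeholders.
import numpy as np

pdr_l_band = np.array([0.05, 0.12, 0.21, 0.35, 0.48, 0.55, 0.51, 0.40])  # hypothetical
lai        = np.array([0.3,  0.8,  1.6,  2.7,  3.9,  4.6,  4.2,  3.1 ])  # hypothetical

r = np.corrcoef(pdr_l_band, lai)[0, 1]
print(f"Pearson r between L-band PDR and LAI: {r:.2f}")

# A linear fit of LAI on PDR, analogous to the growth-parameter retrieval
# reported in the paper.
slope, intercept = np.polyfit(pdr_l_band, lai, 1)
lai_retrieved = slope * pdr_l_band + intercept
```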

Analysis on the Effect of the Crown Heating System and Warm Nutrient Supply on Energy Usage in Greenhouse, Strawberry Growth and Production (관부 난방시스템과 온수 양액 공급이 온실 에너지 사용량, 딸기 생육 및 생산성에 미치는 영향 분석)

  • Lee, Taeseok;Kim, Jingu;Park, Seokho;Lee, Jaehan;Moon, Jongpil
    • Journal of Bio-Environment Control / v.30 no.4 / pp.271-277 / 2021
  • In this study, experiments on local crown heating and warm nutrient solution supply were conducted to save energy and improve the growth of 'Seolhyang' strawberry. The inside air and crown temperatures were measured in a control greenhouse (space heating at 8℃) and a test greenhouse (space heating at 5℃ plus crown heating). In the control greenhouse, the average temperature and humidity in December were 7.1℃ and 87.2%, respectively; in the test greenhouse they were 5.7℃ and 88.7%. The crown and in-bed temperatures were 7.9℃ and 10.8℃ in the control and 9.3℃ and 12.7℃ in the test greenhouse. During the test period, a total of 16,847×10³ kcal of energy was consumed in the control greenhouse for space heating, while a total of 9,475.7×10³ kcal was consumed in the test greenhouse for space heating, crown heating and warm water supply, so energy consumption in the test greenhouse was 43.8% lower than in the control. The total strawberry yields during the test period were 412.7 g/plant in the test greenhouse and 393.3 g/plant in the control greenhouse.
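The reported 43.8% energy saving follows directly from the quoted totals, as the short arithmetic check below shows.

```python
# Arithmetic check of the reported energy saving: relative reduction of the
# test greenhouse versus the control, from the totals quoted above.
control_kcal = 16_847.0e3   # control greenhouse, total heating energy (kcal)
test_kcal    = 9_475.7e3    # test greenhouse: space + crown heating + warm water (kcal)

saving = (control_kcal - test_kcal) / control_kcal * 100
print(f"Energy saving of the test greenhouse: {saving:.1f}%")  # about 43.8%
```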

Current Status of Phenomics and its Application for Crop Improvement: Imaging Systems for High-throughput Screening (작물육종 효율 극대화를 위한 피노믹스(phenomics) 연구동향: 화상기술을 이용한 식물 표현형 분석을 중심으로)

  • Lee, Seong-Kon;Kwon, Tack-Ryoun;Suh, Eun-Jung;Bae, Shin-Chul
    • Korean Journal of Breeding Science / v.43 no.4 / pp.233-240 / 2011
  • Food security has become a major global issue due to climate change and a growing world population expected to reach 9 billion by 2050. While biodiversity is receiving more attention, breeders are confronting a shortage of the varied genetic materials needed to develop new varieties that can tackle the food shortage challenge. Although biotechnology is still debated over its potential risks to humans and the environment, it is considered one of the alternative tools for addressing the food supply issue because of its potential to create many variations in genetic resources. A newer discipline, phenomics, is being developed to improve the efficiency of crop improvement. Phenomics is concerned with the measurement of phenomes, the physical, morphological, physiological and/or biochemical traits of organisms, as they change in response to genetic mutation and environmental influences. It can provide a better understanding of phenotypes at the whole-plant level. Over the last decades, high-throughput screening (HTS) systems have been developed to measure phenomes rapidly and quantitatively. Imaging technology, such as thermal and chlorophyll fluorescence imaging systems, is an area of HTS that has been used in agriculture. In this article, we review the current status of high-throughput screening systems in phenomics and their application to crop improvement.

Analysis of Fish Ecology and Water Quality for Health Assessments of Geum-River Watershed (금강본류의 건강성 평가를 위한 어류생태 및 수질 특성분석)

  • Park, Yun-Jeong;Lee, Sang-Jae;An, Kwang Guk
    • Korean Journal of Environment and Ecology / v.33 no.2 / pp.187-201 / 2019
  • This study examined the physicochemical water quality and evaluated the ecological health at 14 sites on the Geum River (upstream, mid-stream, and downstream) using the fish community distribution and guilds and the eight multi-metric variables of the Fish Assessment Index (FAI) from June 2008 to May 2009. The analysis of the water quality variables showed little variation in the upstream and mid-stream reaches but a sharp change, caused by the accumulation of organic matter, from the point where the treated water of the Gap and Miho streams flows in. The physicochemical analysis showed that BOD, COD, TN, TP, conductivity, and Chl-a tended to increase while DO decreased, causing eutrophication and algal development downstream of the confluence with the Miho and Gap streams. The analysis of the fish community showed that the species richness and species diversity indices increased in the mid-stream reach but decreased downstream, indicating a stable ecosystem upstream and a relatively unstable ecosystem downstream. The dominant species were Zacco platypus, which accounted for 20.9% of all fish, and Zacco koreanus, which accounted for 13.1%. The analysis of the tolerance and feeding guilds showed that sensitive, insectivorous, and lotic species were dominant at the mid-stream sites. In contrast, contaminants from the sewage treatment plant on the Miho stream had a profound effect downstream, where tolerant, omnivorous, and lentic species dominated. Therefore, it is necessary to improve water quality by reducing the urban pollutant load and to pay attention to the conservation and restoration of the aquatic ecosystem.
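For readers unfamiliar with the community indices mentioned above, the sketch below computes species richness and the standard Shannon diversity index from hypothetical counts; the paper's exact index formulation may differ.

```python
# Illustration of two community indices: species richness and the standard
# Shannon diversity index H' = -sum(p_i * ln p_i). The counts are hypothetical.
import math

counts = {"Zacco platypus": 209, "Zacco koreanus": 131, "other spp.": 660}  # hypothetical
total = sum(counts.values())

richness = len(counts)
shannon_h = -sum((n / total) * math.log(n / total) for n in counts.values())

print(f"Species richness: {richness}, Shannon H': {shannon_h:.2f}")
```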

A Study on the Establishment Case of Technical Standard for Electronic Record Information Package (전자문서 정보패키지 구축 사례 연구 - '공인전자문서보관소 전자문서 정보패키지 기술규격 개발 연구'를 중심으로-)

  • Kim, Sung-Kyum
    • The Korean Journal of Archival Studies / no.16 / pp.97-146 / 2007
  • The days when people drew up and managed all kinds of documents on paper in the course of their work are now gone; electronic documents have replaced paper. Unlike paper documents, electronic ones maximize job efficiency through their convenience of production and storage. But they also have disadvantages: it is difficult to distinguish originals from copies as one can with paper documents; it is not easy to verify whether a document has been altered or damaged; they are prone to alteration and damage from external influences in the electronic environment; and they require enormous effort and cost to respond immediately to changes in the software and hardware environment. Despite these weaknesses, electronic documents account for an ever-larger share of the current work environment thanks to their convenience and production-cost efficiency. Both the government and the private sector have sought to maximize their advantages while minimizing their risks. One of the resulting measures is the Authorized Retention Center described in this study. Its smooth operation has two prerequisites: the legal validity of electronic documents must be guaranteed on the administrative side, and their reliability and authenticity must be secured on the technological side. Responding to those needs, the Ministry of Commerce, Industry and Energy and the Korea Institute for Electronic Commerce, the two main bodies driving the Authorized Retention Center project, revised the Electronic Commerce Act and supplemented its provisions to guarantee the legal validity of electronic documents in 2005, and in 2006 conducted research on the long-term preservation of electronic documents and on securing their reliability, as demanded by the center's users. To help fulfill the goals of the Authorized Retention Center, this study investigated a technical standard for the center's electronic record information packages and applied the ISO 14721 information package model, the standard for long-term preservation of digital data. It also suggested a process for producing and managing information packages so that SIP, AIP and DIP metadata features would cover the production, preservation, and user-access stages of electronic documents and could be implemented according to the center's policies. Based on this work, the study introduced flow charts linking the production and processing steps, application methods and packages of the technical standard for electronic record information packages at the center, and suggested issues that should continue to be researched in the field of records management.
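To illustrate the ISO 14721 (OAIS) package flow the study builds on, here is a minimal sketch of an information package record moving from SIP to AIP; the field names and helper function are illustrative assumptions, not the center's actual specification.

```python
# A minimal, illustrative sketch of an OAIS-style information package record
# (ISO 14721): a SIP submitted by a producer is enriched into an AIP for
# preservation (and would later be projected into a DIP for users). Field
# names are assumptions for illustration only.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class InformationPackage:
    package_type: str                      # "SIP", "AIP", or "DIP"
    content_files: List[str]               # payload documents
    descriptive_metadata: Dict[str, str]   # e.g. title, producer, date
    preservation_metadata: Dict[str, str] = field(default_factory=dict)  # fixity, provenance

def sip_to_aip(sip: InformationPackage, checksum: str) -> InformationPackage:
    """Turn a submission package into an archival package by adding fixity info."""
    return InformationPackage(
        package_type="AIP",
        content_files=sip.content_files,
        descriptive_metadata=sip.descriptive_metadata,
        preservation_metadata={"fixity_sha256": checksum, "ingest_source": "SIP"},
    )

sip = InformationPackage("SIP", ["contract_2007.pdf"], {"title": "Electronic contract"})
aip = sip_to_aip(sip, checksum="<computed at ingest>")
```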

A study on the change effect of emission regulation mode on vehicle emission gas (배기가스 규제 모드 변화가 차량 배기가스에 미치는 영향 연구)

  • Lee, Min-Ho;Kim, Ki-Ho;Lee, Joung-Min
    • Journal of the Korean Applied Science and Technology / v.35 no.4 / pp.1108-1119 / 2018
  • As interest in air pollution gradually rises at home and abroad, automotive and fuel researchers have studied exhaust and greenhouse gas emission reduction from vehicles through many approaches, including new engine designs, innovative after-treatment systems, clean (eco-friendly alternative) fuels and fuel quality improvement. This research addresses two main issues: vehicle exhaust emissions (regulated and non-regulated emissions and particulate matter, PM) and greenhouse gases. Vehicle exhaust emissions and greenhouse gases cause problems such as ambient pollution and adverse health effects. To reduce these emissions, many countries are introducing new exhaust gas test cycles. The Worldwide harmonized Light-duty vehicle Test Procedure (WLTP) for emission certification has been developed in the WP.29 forum of UNECE since 2007, and this test procedure was applied to domestic light-duty diesel vehicles at the same time as in Europe. The air pollutant emissions of light-duty vehicles are regulated by mass per distance, so the driving cycle can affect the results. Vehicle exhaust emissions vary substantially with climate conditions and driving habits. Extreme outside temperatures tend to increase emissions because more fuel must be used to heat or cool the cabin, and high driving speeds increase emissions because of the energy required to overcome increased drag. Compared with gradual acceleration, rapid acceleration increases emissions, as do additional devices (air conditioner and heater) and road inclines. In this study, three light-duty vehicles were tested with WLTP, NEDC, and FTP-75, the cycles used to regulate light-duty vehicle emissions, to see how much emissions are affected by the different driving cycles. The emission results did not show statistically meaningful differences among the cycles. The maximum emissions were found in the low-speed phase of WLTP, mainly because of cold engine conditions, and the amount emitted under cold engine conditions differed considerably among the test vehicles. This means that different technical solutions are required in this respect to cope with the WLTP driving cycle.
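Since light-duty emissions are regulated on a mass-per-distance basis, the sketch below shows how phase-wise results would be converted to g/km and combined into a cycle value; the phase masses and distances are hypothetical placeholders, not measurements from the study.

```python
# Sketch of the mass-per-distance metric: distance-specific emissions (g/km)
# for each phase of a driving cycle, then a distance-weighted cycle total.
# Phase masses and distances below are hypothetical placeholders.
phases = {                      # phase: (mass emitted [g], distance driven [km])
    "low":        (350.0, 3.1),
    "medium":     (520.0, 4.8),
    "high":       (680.0, 7.2),
    "extra_high": (890.0, 8.3),
}

total_mass = sum(m for m, _ in phases.values())
total_dist = sum(d for _, d in phases.values())

for name, (mass_g, dist_km) in phases.items():
    print(f"{name:10s}: {mass_g / dist_km:6.1f} g/km")

print(f"cycle     : {total_mass / total_dist:6.1f} g/km")
```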

Anomaly Detection for User Action with Generative Adversarial Networks (적대적 생성 모델을 활용한 사용자 행위 이상 탐지 방법)

  • Choi, Nam woong;Kim, Wooju
    • Journal of Intelligence and Information Systems / v.25 no.3 / pp.43-62 / 2019
  • At one time, the anomaly detection field was dominated by methods that judged whether an observation was abnormal based on statistics derived from specific data. This was feasible because data used to be low-dimensional, so classical statistical methods worked effectively. As data characteristics have grown more complex in the era of big data, however, it has become more difficult to accurately analyze and predict the data generated across industry in the conventional way. Supervised learning algorithms based on SVMs and decision trees were therefore adopted. However, a supervised model can predict test data accurately only when the abnormal and normal classes are reasonably balanced, whereas most data generated in industry has an imbalanced class distribution, so the predictions of supervised models are not always valid. To overcome these drawbacks, many studies now use unsupervised models that are not influenced by the class distribution, such as autoencoders or generative adversarial networks. In this paper, we propose a method to detect anomalies using generative adversarial networks. AnoGAN, introduced by Schlegl et al. (2017), is a model that performs anomaly detection on medical images; it is built from convolutional neural networks and has been used in the detection field. By contrast, anomaly detection for sequence data using generative adversarial networks has received far less attention than image data. Li et al. (2018) proposed a model based on LSTM, a type of recurrent neural network, to classify anomalies in numerical sequence data, but it was not applied to categorical sequence data, nor did it adopt the feature matching method of Salimans et al. (2016). This suggests that much remains to be tried in anomaly classification of sequence data with generative adversarial networks. To learn the sequence data, the generative adversarial network is built from LSTMs: the generator uses a 2-layer stacked LSTM with 32-dimensional and 64-dimensional hidden unit layers, and the discriminator uses an LSTM with a 64-dimensional hidden unit layer. Existing work on anomaly detection for sequence data derives the anomaly score from the entropy of the probabilities assigned to the actual data; in this paper, as mentioned earlier, the anomaly score is derived using the feature matching technique. In addition, the latent variable optimization process was designed with an LSTM to improve model performance. The modified generative adversarial model was more precise than the autoencoder in all experiments and approximately 7% higher in accuracy. In terms of robustness, the generative adversarial network also performed better than the autoencoder: because it learns the data distribution from real categorical sequence data, it is not swayed by a single normal example, whereas the autoencoder is. In the robustness test, the accuracy of the autoencoder was 92% and that of the generative adversarial network was 96%; in terms of sensitivity, the autoencoder reached 40% and the generative adversarial network 51%.
In this paper, experiments were also conducted to show how much performance changes with differences in the latent variable optimization structure; sensitivity improved by about 1%. These results offer a new perspective on latent variable optimization, which had previously received relatively little attention.
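To make the feature-matching anomaly score concrete, here is a condensed PyTorch sketch of an LSTM discriminator and the scoring step; the dimensions, the last-step feature choice, and the omitted generator are assumptions for illustration, not the paper's exact architecture.

```python
# Condensed sketch of a feature-matching anomaly score with an LSTM
# discriminator: the discriminator exposes an intermediate feature vector,
# and a sequence's anomaly score is the distance between its features and
# those of a generated (normal-like) sequence. Details are assumptions.
import torch
import torch.nn as nn

class LSTMDiscriminator(nn.Module):
    def __init__(self, input_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.classify = nn.Linear(hidden_dim, 1)   # real/fake logit

    def forward(self, x):
        out, _ = self.lstm(x)            # (batch, seq_len, hidden_dim)
        feat = out[:, -1, :]             # last-step features used for matching
        return self.classify(feat), feat

def feature_matching_score(disc, real_seq, generated_seq):
    """Anomaly score: L2 distance between discriminator features of the
    observed sequence and of a generated (normal-like) sequence."""
    _, f_real = disc(real_seq)
    _, f_gen = disc(generated_seq)
    return torch.norm(f_real - f_gen, dim=1)

# Hypothetical usage with one-hot encoded categorical user-action sequences.
disc = LSTMDiscriminator(input_dim=10)
observed = torch.randn(4, 20, 10)        # batch of 4 sequences, length 20
generated = torch.randn(4, 20, 10)       # stand-in for the (omitted) LSTM generator output
scores = feature_matching_score(disc, observed, generated)
```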