• Title/Summary/Keyword: Flash Over

Search results: 189 items, processing time 0.021 seconds

차량 운행에 따른 엔진오일의 성능특성 평가 연구 (The study on performance of characteristics in engine oil by vehicle driving)

  • 이정민;임영관;정충섭;한관욱;나병기
    • 에너지공학
    • /
    • v.22 no.2
    • /
    • pp.237-244
    • /
    • 2013
  • Engine oil is the most basic lubricant used to lubricate internal combustion engines of all kinds. Automakers and engine oil manufacturers now recommend oil change intervals of 15,000-20,000 km, yet most maintenance providers and drivers, regardless of the advances in oil performance, still habitually recommend or expect an oil change every 5,000 km. Frequent oil changes are emerging as a source of environmental pollution from waste engine oil and of higher vehicle maintenance costs. In this study, fresh oil and used engine oils recovered from vehicles driven 5,000 km and 10,000 km under actual operating conditions were analyzed for a range of properties: flash point, pour point, kinematic viscosity, low-temperature apparent viscosity, total acid number, four-ball wear test, and metal content. The used oils showed slightly higher total acid number, wear scar, and iron and copper content than the fresh oil, but the properties and metal content of the used oils were nearly identical regardless of mileage.

자동변속기유(ATF) 교환주기 모니터링 연구 (The Monitoring Study of Exchange Cycle of Automatic Transmission Fluid)

  • 임영관;정충섭;이정민;한관욱;나병기
    • 공업화학
    • /
    • v.24 no.3
    • /
    • pp.274-278
    • /
    • 2013
  • Automatic transmission fluid (ATF) is the power-transmission fluid used in a vehicle's automatic transmission. Automakers now recommend either no inspection and no replacement, or replacement after 80,000-100,000 km of driving. According to a survey by the Korea Petroleum Quality & Distribution Authority, however, many drivers replace their ATF at less than 50,000 km, which raises vehicle maintenance costs and can lead to environmental pollution from waste oil. In this study, fresh fluid and used fluids recovered after actual driving (50,000 km and 100,000 km) were analyzed for representative properties including flash point, kinematic viscosity, low-temperature viscosity characteristics, total acid number, and lubricity. All properties except foaming characteristics met the specifications for fresh fluid, and the difference in properties between the fluids recovered after 50,000 km and 100,000 km was small. It therefore appears that replacing ATF at the interval recommended by automakers (80,000 km or more) poses no particular problem, which should contribute to cost savings and to preventing environmental pollution.

Optimization and characterization of biodiesel produced from vegetable oil

  • Mustapha, Amina T.;Abdulkareem, Saka A.;Jimoh, Abdulfatai;Agbajelola, David O.;Okafor, Joseph O.
    • Advances in Energy Research
    • /
    • v.1 no.2
    • /
    • pp.147-163
    • /
    • 2013
  • The world faces energy crises and environmental deterioration due to over-dependence on a single energy source, fossil fuel. Although fuel is an essential ingredient for the industrial development and growth of any country, fossil fuel, the major energy source for this purpose, raises persistent concerns, hence the need for alternative and renewable energy sources. The search for alternative energy sources has led to the acceptance of biofuel as a reliable alternative energy source. This work presents the optimization of the transesterification of vegetable oil to biodiesel using NaOH as catalyst. A $2^4$ factorial design was employed to investigate the influence of oil-to-methanol ratio, temperature, NaOH concentration, and transesterification time on the biodiesel yield from vegetable oil. The low and high levels of the key factors were 4:1 and 6:1 mole ratio, 30 and $60^{\circ}C$, 0.5 and 1.0 wt% catalyst concentration, and 30 and 60 min reaction time. The results revealed that an oil-to-methanol molar ratio of 6:1, a transesterification temperature of $60^{\circ}C$, a catalyst concentration of 1.0 wt%, and a reaction time of 30 min are the best operating conditions, giving an optimum yield of 95.8%. Characterization of the produced biodiesel gave a specific gravity, cloud point, flash point, sulphur content, viscosity, diesel index, cetane number, acid value, free glycerine, total glycerine, and total recovery of 0.8899, 4, 13, 0.0087%, 4.83, 25, 54.6, 0.228 mgKOH/g, 0.018, 0.23%, and 96%, respectively. The results also indicate that the measured qualities of the biodiesel conform to the set standard. A model equation was developed from the results using a statistical tool. Analysis of variance (ANOVA) shows that the groundnut-oil-to-methanol mole ratio and the transesterification time have the most pronounced effects on the biodiesel yield, with contributions of 55.06% and 9.22%, respectively. It can be inferred from the various results obtained that vegetable oil locally produced from groundnut can be utilized as a feedstock for biodiesel production.
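The abstract above describes a $2^4$ factorial screening of four factors and reports each factor's percentage contribution from an ANOVA. As a minimal, hypothetical sketch of that kind of analysis (the yields below are placeholders, not the paper's measurements), the 16-run design can be generated and each main factor's share of the total sum of squares computed as follows:

```python
import itertools
import numpy as np

# Four factors at two coded levels (-1 = low, +1 = high), as in a 2^4 full factorial:
# molar ratio 4:1/6:1, temperature 30/60 C, catalyst 0.5/1.0 wt%, time 30/60 min.
factors = ["ratio", "temp", "catalyst", "time"]
design = np.array(list(itertools.product([-1, 1], repeat=4)))  # 16 runs x 4 factors

# Placeholder yields for the 16 runs (illustrative only).
rng = np.random.default_rng(0)
yields = 80 + 5 * design[:, 0] + 2 * design[:, 3] + rng.normal(0, 1, 16)

# For a two-level factorial: effect = mean(high) - mean(low), SS = n * effect^2 / 4.
n = len(yields)
ss = {}
for j, name in enumerate(factors):
    effect = yields[design[:, j] == 1].mean() - yields[design[:, j] == -1].mean()
    ss[name] = n * effect**2 / 4

total_ss = ((yields - yields.mean()) ** 2).sum()
for name in factors:
    print(f"{name}: contribution {100 * ss[name] / total_ss:.1f}% of total SS")
```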

차량 운행에 따른 자동변속기유(ATF) 금속분 분석평가 연구 (A study on the evaluation of metal component in automatic transmission fluid by vehicle driving)

  • 이정민;임영관;도진우;정충섭;한관욱;나병기
    • 에너지공학
    • /
    • v.23 no.2
    • /
    • pp.28-34
    • /
    • 2014
  • Automatic transmission fluid (ATF) is the fluid used to maintain the performance of a vehicle's automatic transmission. Automakers now generally warrant ATF either for replacement after 80,000-100,000 km of driving or as a fill-for-life fluid, yet surveys show that many drivers in Korea replace it at less than 50,000 km. Frequent ATF changes act as a cause of environmental pollution and higher vehicle maintenance costs. In this study, unused fresh fluid and ATF recovered after 50,000 km and 100,000 km of driving were analyzed for physical properties such as flash point, fire point, pour point, kinematic viscosity, low-temperature apparent viscosity, total acid number, and metal content. The used fluids showed higher total acid number, pour point, and metal content than the fresh fluid, but the differences in physical properties and metal content between the two used fluids (50,000 km and 100,000 km) were small.

산지유역의 지형위치 및 지형분석을 통한 재해 위험도 예측 (Disaster risk predicted by the Topographic Position and Landforms Analysis of Mountainous Watersheds)

  • 오채연;전계원
    • 한국방재안전학회논문집
    • /
    • v.11 no.2
    • /
    • pp.1-8
    • /
    • 2018
  • Abnormal weather driven by recent climate change is occurring worldwide, and Korea is no exception. Rainfall that breaks past records keeps occurring, and localized torrential downpours in particular deliver large amounts of rain to small areas in a short time, so mountain disasters are also on the rise. Gangwon-do is mostly mountainous, with steep slopes and shallow soils, and suffers heavy damage from landslides. This study therefore applied a landform classification technique and a landslide hazard prediction technique to mountainous watersheds to predict disaster risk. For the landform classification, the Topographic Position Index (TPI) was computed to classify hazardous terrain, and the SINMAP method, one of the debris-flow prediction techniques, was used to predict areas where mountain disasters could occur. As a result, more than about 63% of the watershed was classified as gentle or steep slopes by the landform classification, and about 58% of the watershed was identified as hazardous in the SINMAP analysis. With the various development activities under way, measures to mitigate mountain disasters are urgently needed, and stability measures for the hazardous sections must be established.
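The Topographic Position Index (TPI) used above is commonly defined as a cell's elevation minus the mean elevation of its neighborhood: strongly positive values indicate ridges, strongly negative values valleys, and near-zero values flats or mid-slopes. A minimal sketch of that computation on a toy DEM (the window size and class thresholds are illustrative assumptions, not the authors' parameters):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def tpi(dem: np.ndarray, window: int = 3) -> np.ndarray:
    """Topographic Position Index: cell elevation minus mean elevation of its neighborhood."""
    neighborhood_mean = uniform_filter(dem, size=window, mode="nearest")
    return dem - neighborhood_mean

# Toy DEM in metres; a real analysis would read a raster (e.g. with rasterio).
dem = np.array([
    [120.0, 125.0, 131.0, 140.0],
    [118.0, 122.0, 128.0, 137.0],
    [115.0, 119.0, 124.0, 133.0],
    [113.0, 116.0, 121.0, 129.0],
])

index = tpi(dem, window=3)

# Simple three-class interpretation (thresholds are illustrative only).
labels = np.full(dem.shape, "slope", dtype=object)
labels[index > 1.0] = "ridge"
labels[index < -1.0] = "valley"
print(index.round(2))
print(labels)
```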

WMS 모형을 활용한 산지 소하천 유역의 유출량 산정 (Flood Runoff Computation for Mountainous Small Basins using WMS Model)

  • 장형준;이정영;이효상
    • 한국방재안전학회논문집
    • /
    • v.14 no.4
    • /
    • pp.9-15
    • /
    • 2021
  • Under the influence of rapidly increasing abnormal weather, flash floods in mountainous areas are occurring more frequently, and the resulting loss of life and property is growing. Various disaster mitigation measures are being established, but in valleys and streams where nature conservation comes first, such as national parks, structural measures like raising levees or regrading the stream bed are difficult to apply. In this study, an inundation risk assessment was therefore carried out for the valleys and streams of Gyeryongsan National Park using the WMS (Watershed Modeling System) rainfall-runoff model. The analysis showed that, for rainfall with a return period of 50 years or more, inundation was simulated in three sub-basins of the park (Jiseokgol, Sutonggol, and Donghaksa), with high risk to the trails and facilities used by visitors. The results quantify the risk along trails within the national park and are intended to inform directions for its safe management in the future.
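The abstract does not state which runoff formulation was applied inside WMS, so the following is only an illustrative sketch of the kind of event-runoff computation such rainfall-runoff models perform, using the standard SCS Curve Number equations with hypothetical inputs:

```python
def scs_runoff_depth(rainfall_mm: float, curve_number: float) -> float:
    """Direct runoff depth (mm) from the SCS Curve Number method (SI units)."""
    s = 25400.0 / curve_number - 254.0   # potential maximum retention (mm)
    ia = 0.2 * s                          # initial abstraction (mm)
    if rainfall_mm <= ia:
        return 0.0
    return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

# Hypothetical 50-year design storm depth and curve number for a forested mountain basin.
print(scs_runoff_depth(rainfall_mm=250.0, curve_number=70.0))
```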

과학탐구 실험대회의 문제점 분석 (Critical Analyses of '2nd Science Inquiry Experiment Contest')

  • 백성혜
    • 한국과학교육학회지
    • /
    • v.15 no.2
    • /
    • pp.173-184
    • /
    • 1995
  • The purpose of this study was to analyse the problems of the 'Science Inquiry Experiment Contest (SIEC)', one of eight programs of 'The 2nd Student Science Inquiry Olympic Meet (SSIOM)'. The results and conclusions were as follows: 1. The role of practical work within the science experiment needs to be reconsidered, because practical skills form one of the mainstays of current science, yet the assessment of students' laboratory skills in the contest was given little weight. It is worth recalling what it means to be 'good at science'. There are two aspects, knowing and doing; both are important and, in certain respects, quite distinct. Doing science is more of a craft activity, relying more on craft skill and tacit knowledge than on the conscious application of explicit knowledge, and many science educators further divide doing science into 'process' and 'skill'. 2. The assessment items of the reports and of the checklists overlapped. It was therefore suggested that the checklist items be limited to student actions that cannot be found in the reports. It is important to distinguish activities that produce a permanent assessable product from those that do not. Skills connected with recording and reporting are likely to produce permanent evidence that can be evaluated after the experiment; manipulative skills involving processes are more ephemeral and need to be assessed as they occur. Dividing students' experimental skills in this way will contribute to an accurate assessment of their scientific inquiry ability. 3. There were wide differences among the scores that the three evaluators gave the same participant, which indicates that there was no concrete discussion among the evaluators before the contest. Although the checklist items were set by the preparers of the contest experiments, concrete discussion before the contest was necessary because students' experimental acts are very diverse. There is a variety of scientific skills, so the performance of individual students must be assessed across a range of skills, but most of the difficulties in assessing skills arise from the interaction between measurement and use. To overcome these difficulties, not only must the mark awarded for each skill be recorded, which all examination groups obviously need, but a description of the work the student did when the skill was assessed must also be given, and fuller details must be available for the purposes of moderation. There must be provision for samples of any end-product or other tangible evidence of candidates' work to be submitted for inspection. This is important for fairness: such work can be made available to moderators if necessary, it can help in arriving at common standards among several evaluators, and it can help ensure that each evaluator applies consistent standards over the assessment period. This need arises because there are problems in assessing different students on the same skill in different activities. 4. Most of the students' reports were assessed intuitively by the evaluators, even though the assessment items had been established concretely by the preparers of the experiment. This means that the evaluators failed to grasp the essence of the established assessment items for the experiment report, and that the students' scores lacked objectivity. Lastly, some suggestions follow from these results and conclusions. Experimental acts that are difficult to observe because they occur in a flash, and that can be easily imitated, should be excluded from the assessment items: evaluators are likely to miss the moment to observe them, and students who are assessed later have more opportunity to practise the skill being assessed. It is necessary to be aware of these problems and to reduce or remove their influence. The skills-and-processes analysis makes a very useful checklist for assessing a scientific inquiry experiment, but in itself it is of little value. It must be seen alongside the other vital attributes needed in the making of a good scientist: the affective aspects of commitment and confidence, the personal insights that come through both formal and informal learning, and the tacit knowledge that comes through experience, both structured and acquired in play. These four aspects must continually interact, in a flexible and individualistic way, throughout students' scientific education. An increasing ability to be good at science, and to be good at doing investigational practical work, is gained by continually, successively, but often unpredictably developing more experience, more insights, and more skills, and by building more confidence and commitment.


정적 영상에서 Noise Reduction Software의 이해와 적용 (The Understanding and Application of Noise Reduction Software in Static Images)

  • 이형진;송호준;승종민;최진욱;김진의;김현주
    • 핵의학기술
    • /
    • v.14 no.1
    • /
    • pp.54-60
    • /
    • 2010
  • The new software introduced at our hospital has been used only for SPECT and whole-body bone imaging; to apply it more effectively to other examinations, we investigated its usefulness through phantom experiments and image comparisons. A Body IEC phantom, a Jaszczak ECT phantom, and a cylinder phantom with capillary sources were used; counts and statistics before and after processing were compared, and changes in contrast ratio and background (BKG) were analyzed quantitatively. In the FWHM comparison using a capillary source, PIXON showed almost no difference between the pre- and post-processed images, whereas ASTONISH produced a clearly improved post-processed image. In contrast, the standard deviation and corresponding variance decreased slightly with PIXON but increased sharply with ASTONISH. In the BKG variability comparison using the IEC phantom, PIXON decreased overall while ASTONISH tended to increase somewhat, and the contrast ratio of each sphere improved with both methods. In terms of image scale, the window width increased about 4-5 fold after PIXON processing, whereas ASTONISH showed little difference. After the phantom analysis, ASTONISH appeared applicable to examinations that require ROI drawing for quantitative analysis and to examinations that emphasize contrast, while PIXON appeared useful for nuclear medicine studies with insufficient acquired counts or low SNR. The quantitative metrics commonly used to analyze images generally improved after applying the software, but because the differences in the resulting images stem more from the algorithmic characteristics of each software package than from differences between gamma cameras, it would be difficult to maintain consistency across all nuclear medicine examinations. In addition, excellent image quality cannot be expected when the software is used mainly as a means of drastically shortening scan time, as in whole-body bone imaging. When introducing new software, protocols suited to each hospital's characteristics and substantial study before clinical application are considered necessary.
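The FWHM compared above is the width of a line-source profile at half of its peak value. A minimal sketch of estimating it from a 1D count profile (the profile below is a synthetic Gaussian, not measured capillary data, and the pixel size is a hypothetical value):

```python
import numpy as np

def fwhm(profile: np.ndarray, pixel_size_mm: float = 1.0) -> float:
    """Full width at half maximum of a 1D peak, by linear interpolation at half height."""
    half = profile.max() / 2.0
    above = np.where(profile >= half)[0]
    left, right = above[0], above[-1]

    def cross(i_lo, i_hi):
        # Interpolate the position where the profile crosses the half-height level.
        y0, y1 = profile[i_lo], profile[i_hi]
        return i_lo + (half - y0) / (y1 - y0)

    x_left = cross(left - 1, left) if left > 0 else float(left)
    x_right = cross(right, right + 1) if right + 1 < len(profile) else float(right)
    return (x_right - x_left) * pixel_size_mm

# Synthetic Gaussian line-spread profile (sigma = 2 px), so the expected FWHM is ~4.7 px.
x = np.arange(41)
profile = np.exp(-0.5 * ((x - 20) / 2.0) ** 2)
print(fwhm(profile, pixel_size_mm=2.0))  # result in mm for a hypothetical 2 mm pixel size
```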


Information Privacy Concern in Context-Aware Personalized Services: Results of a Delphi Study

  • Lee, Yon-Nim;Kwon, Oh-Byung
    • Asia pacific journal of information systems
    • /
    • v.20 no.2
    • /
    • pp.63-86
    • /
    • 2010
  • Personalized services directly and indirectly acquire personal data, in part to provide customers with higher-value services that are specifically context-relevant (such as place and time). Information technologies continue to mature and develop, providing greatly improved performance. Sensor networks and intelligent software can now obtain context data, and that is the cornerstone for providing personalized, context-specific services. Yet the danger of an overflow of personal information is increasing, because the data retrieved by the sensors usually contains private information, and various technical characteristics of context-aware applications have troubling implications for information privacy. In parallel with the increasing use of context for service personalization, information privacy concerns, such as the unrestricted availability of context information, have also increased. These privacy concerns are consistently regarded as a critical issue facing the success of context-aware personalized services. Information privacy is growing as an area of research, with many new definitions and terminologies, because a better understanding of information privacy concepts is needed; in particular, the factors of information privacy should be revised according to the characteristics of new technologies. However, previously proposed information privacy factors for context-aware applications have at least two shortcomings. First, there has been little overview of the technological characteristics of context-aware computing: existing studies have focused on only a small subset of them, so there has not been a mutually exclusive set of factors that uniquely and completely describes information privacy for context-aware applications. Second, most studies have used user surveys to identify information privacy factors, despite the limits of users' knowledge of and experience with context-aware computing technology. Because context-aware services have not yet been deployed widely on a commercial scale, very few people have prior experience with context-aware personalized services, and it is difficult to build users' knowledge of the technology even with scenarios, pictures, flash animation, and so on. A survey that assumes the participants have sufficient experience or understanding of the technologies it presents may therefore not be valid, and some surveys rest on simplifying and hence unrealistic assumptions (e.g., considering only location information as context data). A better understanding of information privacy concern in context-aware personalized services is therefore needed. Hence, the purpose of this paper is to identify a generic set of factors for elemental information privacy concern in context-aware personalized services and to develop a rank-ordered list of those factors. We consider the overall technology characteristics to establish a mutually exclusive set of factors. A Delphi survey, a rigorous data collection method, was deployed to obtain reliable expert opinion and to produce a rank-ordered list; it lends itself well to obtaining a set of universal information privacy concern factors and their priority.
An international panel of researchers and practitioners with expertise in privacy and context-aware systems took part in the research. The Delphi rounds faithfully followed the procedure proposed by Okoli and Pawlowski, involving three general rounds: (1) brainstorming for important factors; (2) narrowing the original list down to the most important ones; and (3) ranking the list of important factors. For the first round only, experts were treated as individuals, not as panels. Adapting Okoli and Pawlowski, we outlined the process of administering the study and performed three rounds. In the first and second rounds of the Delphi questionnaire, we gathered an exclusive set of factors for information privacy concern in context-aware personalized services. In the first round, respondents were asked to provide at least five main factors for the most appropriate understanding of information privacy concern; some of the main factors found in the literature were presented to the participants. The second round of the questionnaire discussed the main factors provided in the first round, fleshed out with relevant sub-factors drawn from the literature survey, and respondents were asked to evaluate each sub-factor's suitability against the corresponding main factor so as to determine the final sub-factors from the candidates. The final factors were those selected by over 50% of the experts. In the third round, a list of factors with corresponding questions was provided, and respondents were asked to assess the importance of each main factor and its corresponding sub-factors; finally, we calculated the mean rank of each item to produce the final result. In analyzing the data, we focused on group consensus rather than individual insistence, so a concordance analysis, which measures the consistency of the experts' responses over successive Delphi rounds, was adopted during the survey process. As a result, the experts reported that context data collection and a highly identifiable level of identical data are the most important main factor and sub-factor, respectively. Additional important sub-factors included the diverse types of context data collected, tracking and recording functionality, and embedded and disappearing sensor devices. The average score of each factor is very useful for future context-aware personalized service development from the viewpoint of information privacy. The final factors differ from those proposed in other studies in the following ways. First, the concern factors differ from those of existing studies, which are based on privacy issues that may occur during the lifecycle of acquired user information; our study helped to clarify these sometimes vague issues by determining which privacy concern issues are viable given the specific technical characteristics of context-aware personalized services. Since a context-aware service differs in its technical characteristics from other services, we selected the characteristics with the higher potential to increase users' privacy concerns. Second, this study considered privacy issues of service delivery and display, which were almost overlooked in existing studies, by introducing IPOS as the factor division. Lastly, for each factor it correlated the level of importance with the professionals' opinions on the extent to which users have privacy concerns.
A traditional user questionnaire was not selected because, for context-aware personalized services, users almost entirely lack understanding of and experience with the new technology. To understand users' privacy concerns, the professionals in the Delphi process selected context data collection, tracking and recording, and the sensor network as the most important technological characteristics of context-aware personalized services. For the creation of context-aware personalized services, this study demonstrates the importance of determining an optimal methodology, and of deciding which technologies, in what sequence, are needed to acquire which types of users' context information. Most studies focus on which services and systems should be provided and developed by utilizing context information, in step with the development of context-aware technology; the results of this study show that, in terms of users' privacy, greater attention must be paid to the activities that acquire context information. Building on the sub-factor evaluation, additional studies will be necessary on approaches that reduce users' privacy concerns about technological characteristics such as a highly identifiable level of identical data, diverse types of context data collected, tracking and recording functionality, and embedded and disappearing sensor devices. The factor ranked next in importance after input is context-aware service delivery, which is related to output; the results show that the delivery and display of services to users, following the anywhere-anytime-any-device concept, are regarded as even more important than in previous computing environments. Considering these concern factors when developing context-aware personalized services will help to increase the service success rate and, hopefully, user acceptance. Our future work will be to adopt these factors for qualifying context-aware service development projects, such as u-city development projects, in terms of service quality and hence user acceptance.
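The concordance analysis mentioned above measures how consistently the experts rank the factors across Delphi rounds; Kendall's coefficient of concordance W is the statistic typically used for this (the abstract does not name the exact statistic, so that choice is an assumption). A minimal sketch with hypothetical rankings rather than the study's data:

```python
import numpy as np

def kendalls_w(ranks: np.ndarray) -> float:
    """Kendall's coefficient of concordance W for an (m raters x n items) rank matrix, no ties."""
    m, n = ranks.shape
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m**2 * (n**3 - n))

# Hypothetical rankings of 5 privacy-concern factors by 4 experts (1 = most important).
ranks = np.array([
    [1, 2, 3, 4, 5],
    [1, 3, 2, 4, 5],
    [2, 1, 3, 5, 4],
    [1, 2, 4, 3, 5],
])
print(round(kendalls_w(ranks), 3))  # values near 1 indicate strong consensus among experts
```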