• Title/Summary/Keyword: Computer operation


The effects of the direct nursing care hours with establishment of the nurse substations (Nurse Substation 운영이 직접간호시간 증가에 미치는 효과)

  • Lee, Chug-Hee;Sung, Young-Hee;Kwon, In-Gak;Lee, Soon-Kyu;Jung, Yoen-Yi;Hoe, Sung-Hee;Ryoo, Sung-Suk;Kim, Jung-Suk
    • Journal of Korean Academy of Nursing Administration
    • /
    • v.3 no.2
    • /
    • pp.61-80
    • /
    • 1997
  • The purpose of this study is to measure direct and indirect nursing care hours after the establishment of nurse substations and to compare the experimental nursing units with the existing nursing units. For this study, two experimental nursing units, (1) a medical nursing unit and (2) a surgical nursing unit, each with a nurse substation, were selected, and two control nursing units, (1) a medical nursing unit and (2) a surgical nursing unit, without a nurse substation were selected. After a three-month experimental operation from June 1 to August 31, 1996, research data were collected for three days, from September 2 to 4, 1996. We investigated the effects of the nurse substations (an improved nursing environment) on direct and indirect nursing care hours without adding staff nurses. The effect of establishing the nurse substations was measured as the difference in direct and indirect nursing care hours between the experimental and control nursing units. An investigator timed each nursing activity performed by a staff nurse and recorded it every minute. Percentage, average, standard deviation, t-test and ANOVA were used for data analysis. The results are as follows: 1. There was no significant difference between the experimental and control nursing units in staff working hours during their shifts. 2. There were significant differences between the experimental and control nursing units in direct nursing care hours (t=0.0288, p=0.0001) and indirect nursing care hours (t=0.3886, p=0.0103) per patient. 3. There was a significant difference between the experimental and control nursing units in direct nursing care hours provided by nurses (t=0.0012, p=0.0111) and aides (t=0.3011, p=0.0027). There was a significant difference between the experimental and control nursing units in indirect nursing care hours provided by head nurses (t=0.0051, p=0.0253), nurses (t=0.0071, p=0.0024) and aides (t=0.3227, p=0.0351). There was a significant difference between the experimental and control nursing units in indirect nursing care hours provided by nurses (t=0.0005, p=0.0015) and aides (t=0.2400, p=0.0013) per patient. There was a significant difference between the experimental and control nursing units in indirect nursing care hours provided by head nurses (t=0.0005, p=0.0379) and nurses (t=0.0035, p=0.0198) per patient. 4. There were significant differences between the experimental and control nursing units in direct nursing care hours (t=0.1134, p=0.0010) and indirect nursing care hours (t=0.7106, p=0.0008) per staff member during the day shift. There were significant differences between the experimental and control nursing units in direct nursing care hours during the day (t=0.0723, p=0.0003) and evening shifts (t=0.0004, p=0.0285) per patient, and in indirect nursing care hours during the day shift (t=0.5565, p=0.0036) per patient. 5. There were differences between the experimental and control nursing units in direct nursing activities, including measurement and observation, medication, communication, treatment, hygiene, and nutrition, and in indirect nursing activities, including confirmation, communication, recording, computer work, and management of goods, but these were not statistically significant. 6. There was a difference between the experimental and control nursing units in unmet-need nursing care hours per patient, but it was not statistically significant.
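For illustration only (the study's data are not reproduced here), a between-unit comparison of nursing care hours like the one above could be run as an independent-samples t-test, for example with SciPy; the values below are made-up placeholders, not the study's measurements.

    from scipy import stats

    # Hypothetical per-patient direct nursing care hours (placeholder values,
    # not the study's data): experimental unit vs. control unit.
    experimental = [2.1, 2.4, 2.2, 2.6, 2.3]
    control = [1.8, 1.7, 2.0, 1.9, 1.8]

    t_stat, p_value = stats.ttest_ind(experimental, control)
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")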


Evaluation of the accuracy of two different surgical guides in dental implantology: stereolithography fabricated vs. positioning device fabricated surgical guides (제작방법에 따른 임플란트 수술 가이드의 정확성비교: stereolithography와 positioning device로 제작한 수술 가이드)

  • Kwon, Chang-Ryeol;Choi, Byung-Ho;Jeong, Seung-Mi;Joo, Sang-Dong
    • The Journal of Korean Academy of Prosthodontics
    • /
    • v.50 no.4
    • /
    • pp.271-278
    • /
    • 2012
  • Purpose: Recently, implant surgical guides have been used for accurate and atraumatic operations. In this study, the accuracy of two different types of surgical guides, positioning-device fabricated and stereolithography fabricated, was evaluated in four different types of tooth-loss models. Materials and methods: Surgical guides were fabricated with stereolithography and with a positioning device, respectively. Implants were placed on 40 models using the two different types of surgical guides. The fit of the surgical guides was evaluated by measuring the gap between the surgical guide and the model. The accuracy of the surgical guides was evaluated on fused pre- and post-surgical CT images. Results: The gap between the surgical guide and the model was 1.4 ± 0.3 mm and 0.4 ± 0.3 mm for the stereolithography and positioning-device surgical guides, respectively. The stereolithography guide showed a mesiodistal angular deviation of 3.9 ± 1.6°, a buccolingual angular deviation of 2.7 ± 1.5° and a vertical deviation of 1.9 ± 0.9 mm, whereas the positioning-device guide showed a mesiodistal angular deviation of 0.7 ± 0.3°, a buccolingual angular deviation of 0.3 ± 0.2° and a vertical deviation of 0.4 ± 0.2 mm. The differences between the two groups were statistically significant (P<.05). Conclusion: Laboratory-fabricated surgical guides made using a positioning device allow more accurate implant placement than stereolithography surgical guides in the dental clinic.

Long-term Climate Change Research Facility for Trees: CO2-Enriched Open Top Chamber System (수목의 장기 기후변화 연구시설: CO2 폭로용 상부 개방형 온실)

  • Lee, Jae-Cheon;Kim, Du-Hyun;Kim, Gil-Nam;Kim, Pan-Gi;Han, Sim-Hee
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.14 no.1
    • /
    • pp.19-27
    • /
    • 2012
  • The open-top chamber (OTC) system is designed for long-term studies of the impact of climate change on the major tree species and their communities in Korea. At the Korea Forest Research Institute (KFRI), the modified OTC system has been operating since September 2009. The OTC facility consists of six decagonal chambers (10 meters in diameter by 7 meters high) with controlled gas concentration. In each chamber, a series of vertical vent pipes is installed to disperse carbon dioxide or normal air toward the center of the chamber. The OTC is equipped with a remotely controlled computer system in order to maintain a stable, elevated concentration of carbon dioxide in the chamber throughout the experimental period. The experiment consisted of 4 treatments: two elevated CO2 levels (1.4× and 1.8× ambient CO2) and two controls (inside and outside the OTC). The average operational rate was lowest (94.2%) in June 2010 but increased to 98% in July 2010 and was 100% from January to December 2011. In 2010~2011, CO2 concentrations inside the OTCs reached the programmed target values and were maintained stably in 2011. In 2011, CO2 concentrations of 106%, 100% and 94% of the target values were recorded in the control OTC, the 1.4× CO2-enriched OTC and the 1.8× CO2-enriched OTC, respectively. In all OTC chambers, the difference between outside and inside temperatures was highest (1.2~2.0°C) from 10 am to 2 pm. No temperature difference was detected among the six OTC chambers. The relative humidity inside and outside the chambers was the same, with minor variations (0~1%). The system required the largest amount of CO2 for operation in June, consuming 11.33 and 17.04 tons in June 2010 and 2011, respectively.

An Experimental Study on Establishing Criteria of Gripping Work in Construction Site (건설 현장 악력 작업안전 기준 설정에 관한 실험적 연구)

  • 손기상;이인홍;최만진;안병준
    • Journal of the Korean Society of Safety
    • /
    • v.10 no.3
    • /
    • pp.81-95
    • /
    • 1995
  • Now, safety assurance on construction sites should be accomplished by the site's own organization rather than by control of codes or the government. It is believed that safety assurance can be considerably improved by lectures or education based on existing theories and literature, but fundamental safety assurance cannot be achieved without developing safety devices & equipment or taking fundamental measures based on analysis of workers' behavior. Various worker behaviors are observed on construction sites, but only hammer-using work, such as that of form, re-bar and stone workers, which is directly related to grip strength, is investigated and measured in this study. These tasks are similar to the power grip, the 7th of the seven categorized hand-grip types (Ammermin 1956; Jones; Kobrick 1958). Measurements of grip strength are commonly taken in anthropometric surveys. They are easy to administer, but it is rather dubious whether they yield data that are of interest to the engineer. Very few controls of tools are grasped and squeezed, and studies have shown very little overall correlation between grip strength and other measures of bodily strength (Laubach, Kromer, and Thordsen 1972); however, hammer-using work, which is common on construction sites, is mainly influenced by grip strength. According to the work-measurement investigation, 77% of form workers use hammers, which relates their work to grip strength. In this study, it is particularly noted that wearing safety gloves on construction sites is required for worker safety, but the roughly 20% difference between grip strength with and without safety gloves is commonly neglected on site (Fig. 1). Nevertheless, safe operation that takes this 20% difference into account is not considered on construction sites. Factors of age, kind of work, working time, and whether safety gloves were worn were investigated & collected at the sites for this study. Testing was performed not at each working hour but at 14:00, when almost all workers report feeling the most tired according to the questionnaires and the research report, and the results were compared for the main kinds of work: form & re-bar work. Tests were performed on both the left and right hands of the workers simultaneously on construction sites using a hand dynamometer (Model 78010, Lafayette Instrument Co., Indiana, U.S.A.), reading grip strength on the gauge while the workers pulled, and then interviewing them directly about their age, work, experience, etc. The tests were performed from 15 March to 26 May 1995, taking site conditions into consideration. Although various factors such as ambient temperature on the testing date, working conditions, individual workers' habits and workers' condition on the previous day are relevant to the study, they are treated as constants here. The samples were: formwork 53, re-bar 62, electrician 5, plumber 4, welding 1 from D Construction Co., Ltd.; formwork 12, re-bar 5, electrician 2 from S Construction Co., Ltd.; formwork 78, re-bar 18, plumber 31, electrician 13, labor 48, plumber 31, plasterer 15, concrete placer 6, waterproof worker 3, masonry 5 from B Construction Co., Ltd.
As previously mentioned, the main focus of this study is form & re-bar work, because grip strength is directly applied to these two kinds of work, even though a total of 405 samples were taken. It is thought that the frequency of accident occurrence is mainly associated with two work postures, "looking up & looking down", but this factor is not clarified in this study because it would require much more work. Tests were done at sites as horizontally extended as possible within one hour in order to prevent or reduce errors & discrepancies arising from the time lag of the test. Additionally, the statistical package SPSS PC+ was used for the study.


Smartphone Security Using Fingerprint Password (다중 지문 시퀀스를 이용한 스마트폰 보안)

  • Bae, Kyoung-Yul
    • Journal of Intelligence and Information Systems
    • /
    • v.19 no.3
    • /
    • pp.45-55
    • /
    • 2013
  • As smartphones and mobile devices become more popular, more people use mobile devices in many areas such as education, news, and finance. Apple's release of the iPhone in January 2007 touched off a rapid increase in smartphone users, created a new market, and broadened the areas in which smartphones are used. Smartphones use WiFi or 3G mobile radio communication networks and can access the internet anytime and anywhere. Using smartphone applications, people can check the arrival time of public transportation in real time, and applications are used in mobile banking and stock trading. The computer's functions are being replaced by the smartphone, so it holds important user information such as financial data and personal pictures and videos. Present smartphone security systems are not only too simple, but their unlocking methods are also spreading covertly. The iPhone is secured by a combination of numbers and characters, but the US IT magazine Engadget revealed that it can be easily unlocked using a combination of part of the number pad and the buttons. The Android operating system uses a pattern lock based on a 9-point dot grid, so the user can set various patterns, but according to Professor Jonathan Smith of the University of Pennsylvania, the Android security system is easily unlocked by tracing the fingerprint smudges remaining on the smartphone screen. So both the Android and iPhone operating systems are vulnerable to security threats. Compared with the problems of passwords and patterns, fingerprint recognition has advantages in security and in the event of loss. The reasons why smartphones and devices using fingerprint recognition are not yet popular include problems such as unreasonable price and human-rights concerns. In addition, fingerprint recognition sensors are not yet offered at a reasonable price, but through the continuous development of smartphones and devices they will become more miniaturized and their price will fall. Once fingerprint recognition is actively used in smartphones, and if its area of use broadens to financial transactions, the use of biometrics in smart devices will be debated briskly. So in this thesis we propose a fingerprint numbering system that combines fingerprints and a password to fortify existing fingerprint recognition. A password consisting of 4 digits has the problems described above, so we replace the existing 4-digit password and pattern system, combining fingerprint recognition with a password to reinforce security. In the original fingerprint recognition system there are only 10 cases, but by numbering the fingerprints we can form a password as a new method. Using the proposed method, the user enters fingerprints according to the numbers assigned to the fingers. An attacker will therefore have difficulty collecting all the fingerprints needed for forgery and inferring the user's password. After fingerprint numbering, the system can either recognize several fingerprints entered at the same time or have the fingerprints entered in a regular sequence. In this thesis we adopt entering fingerprints in a regular sequence, and the system allows duplication when entering fingerprints. In the case of allowing duplication, the number of possible combinations is $\sum_{i=1}^{10}{}_{10}P_i$, and the total number of cases is 9,864,100. By this method the user retains security; on the other hand, an attacker faces a number of difficulties in guessing the password and would also need to obtain the user's fingerprints, so this system enhances the user's security. This system is a method that accepts not only one fingerprint but multiple fingers in a regular sequence.
In this thesis we introduce a method, in the smartphone environment, of authorizing the user by entering multiple numbered fingerprints. Present smartphone authorization using patterns, passwords and fingerprints is exposed to high risk, so if the proposed system overcomes the delay when the user presents fingers to the recognition device, and is combined with other biometric methods, it will provide more concrete security. The problems to be solved after this research are reducing the fingerprint numbering time, and hardware development should precede this. If public certification using fingerprints becomes popular in the future, fingerprint recognition in the smartphone will become an important security issue, so this thesis will be useful for fortifying fingerprint recognition research.
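As a quick check of the 9,864,100 figure quoted in the abstract, the short Python sketch below (illustrative only, not the authors' code; it needs Python 3.8+ for math.perm) evaluates the permutation sum for sequence lengths 1 through 10.

    from math import perm

    # Ordered sequences of length 1..10 drawn from 10 distinct (numbered)
    # fingers without repetition: the permutation count 10Pi, summed over i.
    total = sum(perm(10, i) for i in range(1, 11))
    print(total)  # 9864100, matching the figure quoted in the abstract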

A Variable Latency Newton-Raphson's Floating Point Number Reciprocal Computation (가변 시간 뉴톤-랍손 부동소수점 역수 계산기)

  • Kim Sung-Gi;Cho Gyeong-Yeon
    • The KIPS Transactions:PartA
    • /
    • v.12A no.2 s.92
    • /
    • pp.95-102
    • /
    • 2005
  • The Newton-Raphson iterative algorithm for finding a floating-point reciprocal, which is widely used for floating-point division, calculates the reciprocal by performing a fixed number of multiplications. In this paper, a variable-latency Newton-Raphson reciprocal algorithm is proposed that performs multiplications a variable number of times until the error becomes smaller than a given value. To find the reciprocal of a floating-point number F, the algorithm repeats the operation $X_{i+1} = X_i \times (2 - e_r - F \times X_i)$, $i \in \{0, 1, 2, \ldots, n-1\}$, with the initial value $X_0 = \frac{1}{F} \pm e_0$. The bits to the right of the p fractional bits in the intermediate multiplication results are truncated, and this truncation error is less than $e_r = 2^{-p}$. The value of p is 27 for single-precision and 57 for double-precision floating point. Let $X_i = \frac{1}{F} + e_i$; then $X_{i+1} = \frac{1}{F} - e_{i+1}$, where $e_{i+1}$ is less than the smallest number representable as a floating-point number, so $X_{i+1}$ approximates $\frac{1}{F}$. Since the number of multiplications performed by the proposed algorithm depends on the input values, the average number of multiplications per operation is derived from many reciprocal tables ($X_0 = \frac{1}{F} \pm e_0$) of varying sizes. The superiority of this algorithm is proved by comparing this average number with the fixed number of multiplications of the conventional algorithm. Since the proposed algorithm performs multiplications only until the error becomes smaller than a given value, it can be used to improve the performance of a reciprocal unit, and it can also be used to construct optimized approximate reciprocal tables. The results of this paper can be applied to many areas that use floating-point numbers, such as digital signal processing, computer graphics, multimedia, and scientific computing.
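To illustrate the general idea (not the paper's hardware implementation), here is a minimal Python sketch of a Newton-Raphson reciprocal iteration that stops as soon as the update falls below a chosen tolerance. The tolerance, the seed value, and the use of ordinary floats are assumptions for the example; the paper's $e_r$ truncation-compensation term and p-bit fixed-point arithmetic are omitted.

    def nr_reciprocal(F, x0, tol=2**-27, max_iter=10):
        """Approximate 1/F by Newton-Raphson, iterating only until the update
        is below tol, so the number of multiplications varies with the input."""
        x = x0
        for _ in range(max_iter):
            x_new = x * (2.0 - F * x)      # standard Newton-Raphson reciprocal step
            if abs(x_new - x) < tol:       # stop early once the update is tiny
                return x_new
            x = x_new
        return x

    # Example: seed from a crude initial guess (playing the role of a table
    # lookup), then refine.
    F = 1.37
    approx = nr_reciprocal(F, x0=1.0 / 1.5)
    print(approx, 1.0 / F)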

Modern Paper Quality Control

  • Olavi Komppa
    • Proceedings of the Korea Technical Association of the Pulp and Paper Industry Conference
    • /
    • 2000.06a
    • /
    • pp.16-23
    • /
    • 2000
  • The increasing functional needs of top-quality printing papers and packaging paperboards, and especially the rapid developments in electronic printing processes and various computer printers during the past few years, set new targets and requirements for modern paper quality. Most of these paper grades today have relatively high filler content, are moderately or heavily calendered, and have many coating layers for the best appearance and performance. In practice, this means that many of the traditional quality assurance methods, mostly designed to measure papers made of pure, native pulp only, cannot reliably (or at all) be used to analyze or rank the quality of modern papers. Hence, the introduction of new measurement techniques is necessary to assure and further develop paper quality today and in the future. Paper formation, i.e. small-scale (millimeter-scale) variation of basis weight, is the most important quality parameter in papermaking due to its influence on practically all other quality properties of paper. The ideal paper would be completely uniform, so that the basis weight of each small point (area) measured would be the same. In practice, of course, this is not possible because relatively large local variations always exist in paper. However, these small-scale basis weight variations are the major reason for many other quality problems, including calender blackening, uneven coating results, uneven printing results, etc. The traditionally used visual inspection or optical measurement of the paper does not give a reliable understanding of the material variations in the paper, because in the modern papermaking process the optical behavior of paper is strongly affected by the use of e.g. fillers, dyes or coating colors. Furthermore, the opacity (optical density) of the paper changes at different process stages such as wet pressing and calendering. The greatest advantage of using the beta transmission method to measure paper formation is that it can be very reliably calibrated to measure the true basis weight variation of all kinds of paper and board, independently of sample basis weight or paper grade. This gives us the possibility to measure, compare and judge papers made of different raw materials or different colors, or even to measure heavily calendered, coated or printed papers. Scientific research on paper physics has shown that the orientation of the top-layer (paper surface) fibers of the sheet plays the key role in paper curling and cockling, causing the typical practical problems (paper jams) with modern fax and copy machines, electronic printing, etc. On the other hand, the fiber orientation at the surface and middle layers of the sheet controls the bending stiffness of paperboard. Therefore, a reliable measurement of paper surface fiber orientation gives us a magnificent tool to investigate and predict paper curling and cockling tendency, and provides the necessary information to fine-tune the manufacturing process for optimum quality. Many papers, especially heavily calendered and coated grades, resist liquid and gas penetration very strongly, being beyond the measurement range of traditional instruments or resulting in inconveniently long measuring times per sample. The increased surface hardness and the use of filler minerals and mechanical pulp make a reliable, non-leaking sample contact with the measurement head a challenge of its own.
Paper surface coating creates, as expected, a layer which has completely different permeability characteristics compared to the other layers of the sheet. The latest developments in sensor technologies have made it possible to reliably measure gas flow under well-controlled conditions, allowing us to investigate the gas penetration of open structures, such as cigarette paper, tissue or sack paper, and, in the low-permeability range, to analyze even fully greaseproof papers, silicone papers, heavily coated papers and boards, or even to detect defects in barrier coatings. Even nitrogen or helium may be used as the gas, giving completely new possibilities to rank products or to find correlations with critical process or converting parameters. All modern paper machines include many on-line measuring instruments which provide the necessary information for automatic process control systems. Hence, the reliability of the information obtained from the different sensors is vital for good optimization and process stability. If any of these on-line sensors does not operate perfectly as planned (having even a small measurement error or malfunction), the process control will set the machine to operate away from the optimum, resulting in loss of profit or eventual problems in quality or runnability. To assure optimum operation of the paper machines, a novel quality assurance policy for the on-line measurements has been developed, including control procedures utilizing traceable, accredited standards for the best reliability and performance.

The Integer Number Divider Using Improved Reciprocal Algorithm (개선된 역수 알고리즘을 사용한 정수 나눗셈기)

  • Song, Hong-Bok;Park, Chang-Soo;Cho, Gyeong-Yeon
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.12 no.7
    • /
    • pp.1218-1226
    • /
    • 2008
  • With the development of semiconductor integration technology and the increasing use of multimedia functions in computers, more functions have been implemented as hardware. Nowadays, most microprocessors beyond 32 bits implement an integer multiplier in hardware. However, as for a divider, only specific microprocessors implement the traditional SRT algorithm in hardware, due to the complexity of implementation and slow speed. This paper suggests an algorithm that uses a multiplier ('w bit × w bit = 2w bit') to compute the integer division $\frac{N}{D}$. That is, the reciprocal of the divisor D is calculated first and then multiplied by the dividend N to perform the integer division. In this paper, when the divisor is $D = 0.d \times 2^L$ with $0.5 < 0.d < 1.0$, an approximate value of $\frac{1}{D}$, namely $1.g \times 2^{-L}$, which satisfies $0.d \times 1.g = 1 + e$ with $e < 2^{-w}$, is defined as the over-reciprocal, and an algorithm for computing this over-reciprocal is suggested. The algorithm multiplies the over-reciprocal $1.g \times 2^{-L}$ by the dividend N to compute the integer division $\frac{N}{D}$. The suggested algorithm does not require an additional correction step, because it calculates a sufficiently accurate reciprocal. In addition, the algorithm uses only a multiplier, so no additional division hardware is required to implement it in a microprocessor. It is also faster than the conventional SRT algorithm and operates on word units, so it is more suitable for compilers than existing division algorithms. In conclusion, the results of this study can be widely used for implementing SoCs (System on Chip) and other designs where microprocessor and hardware size are constrained.
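A minimal Python sketch of the general divide-by-reciprocal-multiplication idea is shown below. It is not the paper's algorithm: the scaled over-reciprocal is computed here directly as ceil(2**k / D) with integer arithmetic rather than with the paper's multiplier-only over-reciprocal construction, and a defensive correction step is included even though, with this choice of k, the estimate should already equal N // D.

    import random

    def divide_by_reciprocal(N, D, w=32):
        """Compute N // D by multiplying with a precomputed scaled reciprocal.

        R over-approximates 2**k / D, so (N * R) >> k never undershoots N // D;
        the correction below is defensive and should not trigger for this k.
        """
        assert D > 0 and 0 <= N < (1 << w)
        k = w + D.bit_length()          # scaling chosen so the estimate is tight
        R = ((1 << k) + D - 1) // D     # over-reciprocal: ceil(2**k / D)
        q = (N * R) >> k                # quotient estimate by one multiplication
        if q * D > N:                   # defensive check against overshoot
            q -= 1
        return q

    # Quick self-check against Python's integer division; in practice R and k
    # would be precomputed once per divisor.
    for _ in range(1000):
        N = random.randrange(1 << 32)
        D = random.randrange(1, 1 << 16)
        assert divide_by_reciprocal(N, D) == N // D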

A Study on the Establishment of Buddhist Temple Records Management System (사찰기록 관리 체계화 방안 연구)

  • Park, Sung-Su
    • The Korean Journal of Archival Studies
    • /
    • no.26
    • /
    • pp.33-62
    • /
    • 2010
  • Buddhism was introduced to the Korean Peninsula 1,600 years ago, and there are now over 10 million believers in Korea. The systematic management of temple records has spiritual and cultural value in a rapidly changing modern society. This study proposes a better management system of Buddhist temple records for the Jogye Order of Korean Buddhism. The system not only supports transparency of religious affairs but also presents a way toward more effective management. In this study, I examined the national legislation for the preservation of Buddhist temples and the local rules of religious affairs of the Jogye Order, and through this I analyzed the problems of Buddhist records management. To improve these problems in the long term, I propose the establishment of temple archives maintained by the parish head offices. This study presents a retention schedule for this systematic establishment and charts for standard Buddhist records management that cover the total process systematically, from the production of records to their disposal. I also present a general plan to prevent the indiscriminate destruction of Buddhist temple documents and to impose a duty of preservation. I intend for this plan to be subject to discussion and to be tailored to the particular needs of temple records. In creating these charts for standard Buddhist temple records management, I analyzed the operating examples of foreign religious institutions and examined their retention periods, and I also examined the retention periods and classification system of the Jogye Order. I then present ways for this management system to operate through computer programs. There is a need to establish a large-scale management system to arrange Buddhist records, to enforce the duty of preserving records through the proposed management system, and to manage even local parish temple records through the proposed management system and the operation of the proposed archive system. This study presents research that forms the basis for preserving traditional records and passing them on to future generations, and it also uncovers the historical, cultural and social value that these records contain. A systematically established temple records management system will pave the way for these tangible and intangible records handed down from history to be passed on to future generations as cultural heritage.

Comparison of the wall clock time for extracting remote sensing data in Hierarchical Data Format using Geospatial Data Abstraction Library by operating system and compiler (운영 체제와 컴파일러에 따른 Geospatial Data Abstraction Library의 Hierarchical Data Format 형식 원격 탐사 자료 추출 속도 비교)

  • Yoo, Byoung Hyun;Kim, Kwang Soo;Lee, Jihye
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.21 no.1
    • /
    • pp.65-73
    • /
    • 2019
  • MODIS (Moderate Resolution Imaging Spectroradiometer) data in Hierarchical Data Format (HDF) have been processed using the Geospatial Data Abstraction Library (GDAL). Because of the relatively large data size, it would be preferable to build and install the data analysis tool for greater computing performance, which would differ by operating system and by the form of distribution, e.g., source code or binary package. The objective of this study was to examine the performance of GDAL for processing HDF files, which would guide the construction of a computer system for remote sensing data analysis. Differences in execution time were compared between the environments under which GDAL was installed. The wall clock time was measured after extracting data for each variable in the MODIS data file using a tool built by linking against GDAL under combinations of operating system (Ubuntu and openSUSE), compiler (GNU and Intel), and distribution form. The MOD07 product, which contains atmosphere data, was processed for eight 2-D variables and two 3-D variables. GDAL compiled with the Intel compiler under Ubuntu had the shortest computation time. For openSUSE, GDAL compiled with the GNU and Intel compilers had greater performance for the 2-D and 3-D variables, respectively. It was found that the wall clock time was considerably longer for GDAL compiled with the "--with-hdf4=no" configuration option or installed via the RPM package manager under openSUSE. These results indicate that the choice of environment under which GDAL is installed, e.g., operating system or compiler, has a considerable impact on the performance of a system for processing remote sensing data. Application of parallel computing approaches would improve the performance of data processing for HDF files, which merits further evaluation of these computational methods.
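For readers who want to reproduce this kind of timing on their own files, a minimal Python sketch using GDAL's Python bindings is shown below; the file name is a placeholder, and the study's own tool was built by linking against the GDAL library rather than using this script.

    import time
    from osgeo import gdal

    # Placeholder path: substitute an actual MOD07 HDF4 granule.
    path = "MOD07_L2.sample.hdf"

    root = gdal.Open(path)
    subdatasets = root.GetSubDatasets()    # list of (name, description) pairs

    start = time.perf_counter()
    for name, description in subdatasets:
        sub = gdal.Open(name)              # open one HDF subdataset (variable)
        data = sub.ReadAsArray()           # extract the full array for that variable
    elapsed = time.perf_counter() - start

    print(f"Extracted {len(subdatasets)} variables in {elapsed:.3f} s")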