• Title/Summary/Keyword: 한국소프트웨어 (Korean software)

Search Results: 17,190 (processing time: 0.048 seconds)

Records Management and Archives in Korea : Its Development and Prospects (한국 기록관리행정의 변천과 전망)

  • Nam, Hyo-Chai
    • Journal of Korean Society of Archives and Records Management
    • /
    • v.1 no.1
    • /
    • pp.19-35
    • /
    • 2001
  • After almost a century of discontinuity in the archival tradition of the Chosun dynasty, Korea entered a new age of records and archival management by legislating and enforcing a basic law (the Records and Archives Management of Public Agencies Act of 1999). The Annals of the Chosun dynasty recorded the major historical facts of five hundred years of national affairs. The Annals are a major accomplishment in human history and rare in the world; they were possible because they were composed of collected, selected and compiled records of primary sources written and compiled by generations of historians. As important public records need to be preserved in their original forms in modern archives, Korea had to develop and establish a modern archival system to appraise and select important national records for archival preservation. However, the colonization of Korea deprived us of the opportunity to do this, and our fine archival tradition was not carried on. A centralized archival system began to develop with the establishment of the Government Archives and Records Service (GARS) under the Ministry of Government Administration in 1969. GARS built a modern repository in Pusan in 1984, succeeding to the tradition of the History Archives of the Chosun dynasty. In 1998, GARS moved its headquarters to the Taejon Government Complex and acquired state-of-the-art audio-visual archival preservation facilities. From 1996, GARS introduced an automated archival management system to replace manual registration and management, complementing preservation microfilming. Digitization of the holdings was the key project for providing digital images of archives to users. To do this, GARS purchased new computer/server systems and developed application software. In parallel, GARS drastically renovated its manpower composition toward a high level of professionalization by recruiting more archivists with historical and library science backgrounds, as well as conservators and computer system operators.
The new archival law has been in effect since January 1, 2000, and made the following changes in records and archival administration in Korea. First, the law regulates the records and archives of all public agencies, including the Legislature, the Judiciary, the Administration, the constitutional institutions, the Army, Navy, and Air Force, and the National Intelligence Service; a nationwide unified records and archives management system thus became available. Second, public archives and records centers are to be established according to the level of the agency: a central archives at the national level, special archives for the National Assembly and the Judiciary, local government archives for metropolitan cities and provinces, and records centers or special records centers for administrative agencies. A records manager is responsible for the records management of each administrative division. Third, records in public agencies are registered in a computer system as they are produced, so they are traceable and can easily be searched and retrieved over the internet or a computer network. Fourth, qualified records managers and archivists professionally trained in records management and archival science must be assigned, to guarantee the professional management of records and archives. Fifth, the illegal treatment of public records and archives constitutes a punishable crime. In the future, public records and archival management will develop along with the Korean government's 'Electronic Government Project.' The following changes are in prospect. First, public agencies will digitize paper records, audio-visual records, and publications as well as electronic documents, thus promoting administrative efficiency and productivity. Second, the National Assembly has already established its Special Archives; the Judiciary and the National Intelligence Service will follow.
More archives will be established at the city and provincial levels. Third, the more our society develops into a knowledge-based information society, the more records management will become one of the important functions of national government. As more universities, academic associations, and civil societies participate in promoting archival awareness and in establishing archival science, and as more people come to realize the importance of records and archives management, up to the level of a national public campaign, records and archival management in Korea will develop in ways significantly distinguishable from present practice.

Corporate Governance and Managerial Performance in Public Enterprises: Focusing on CEOs and Internal Auditors (공기업의 지배구조와 경영성과: CEO와 내부감사인을 중심으로)

  • Yu, Seung-Won
    • KDI Journal of Economic Policy
    • /
    • v.31 no.1
    • /
    • pp.71-103
    • /
    • 2009
  • Considering that the expenditure of public institutions, centering on public enterprises, amounted to about 28% of Korea's GDP in 2007, public institutions have a significant influence on the Korean economy. Nevertheless, under the new government there are still calls for constant reform of public enterprises, whose irresponsible management impedes national competitiveness. In particular, political controversy over the appointment of executives such as CEOs of public enterprises has caused public distrust. As one of various reform measures for public enterprises, this study analyzes the effect of the internal governance structure of public enterprises on their managerial performance, since improving that governance structure is a matter of great importance regardless of whether public enterprises are privatized. Only a few prior studies focus on the governance structure and managerial performance of public enterprises, compared with those on private enterprises. Most prior research studied the relationship between 'parachute' appointment of CEOs and managerial performance, and concluded that parachuting has a negative effect on managerial performance. In contrast to those results, however, recent studies suggest that there is no relationship between the employment type of CEOs and managerial performance in public enterprises. This study is distinguished from prior research as follows. First, prior research focused on the relationship between the employment type of public enterprises' CEOs and managerial performance; in addition to this, this study analyzes the relationship between internal auditors and managerial performance.
Second, unlike prior research studying the relationship between the employment type of public corporations' CEOs and managerial performance with an emphasis on parachute appointments, this study examines the impact of the employment type as well as the expertise of CEOs and internal auditors on managerial performance. Third, prior researchers mainly used non-financial indicators from various samples; this study eliminates researcher subjectivity by analyzing public enterprises designated by the government and their financial statements, which were externally audited and inspected. Regression analysis is applied to the relationship between the independence and expertise of public enterprises' CEOs and internal auditors and managerial performance in the same year. Financial information from 2003 to 2007 for 24 government-designated public enterprises, together with personnel information from their boards of directors, is used as the sample. Independence of CEOs is identified by dividing CEOs into persons from the same public enterprise and persons from other organizations, and independence of internal auditors is determined by classifying them into two groups: those from academia, the business world, and civic groups, and those from the political community, government ministries, and the military. Expertise of CEOs and internal auditors is divided into business expertise and financial expertise. As control variables, this study uses foundation year, asset size, government subsidies as a proportion of corporate earnings, and year dummies. The analysis shows a significantly positive relationship between the independence and financial expertise of internal auditors and managerial performance. In addition, although the business and financial expertise of CEOs is not statistically significant, it is positively related to managerial performance.
However, contrary to the common view, the independence of CEOs is not statistically significant, and it is negatively related to managerial performance. Contrary to general concerns, the impact of the independence of public enterprises' CEOs on managerial performance seems to have slightly decreased; instead, the expertise of public enterprises' CEOs and internal auditors plays a more important role in managerial performance than their independence. Meanwhile, this study has the following limitations. First, in contrast to private enterprises, public enterprises simultaneously pursue publicness and entrepreneurship; this study focuses on entrepreneurship, excluding considerations of publicness. Second, the public enterprises in this study are limited to those of the central government, so care should be taken in applying the results to public enterprises of local governments. Finally, this study excludes factors related to the transparency and democracy issues raised in the appointment process of public-enterprise executives, as considering them might introduce researcher subjectivity.
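The regression design described above (governance indicators plus controls such as asset size, subsidy ratio, and year dummies) can be sketched as an ordinary-least-squares fit. The variable names and synthetic data below are illustrative assumptions, not the study's actual dataset:

```python
import numpy as np

# Hypothetical sketch of the study's regression design: managerial
# performance regressed on governance indicators plus controls.
# All variable names and the synthetic data are illustrative only.
rng = np.random.default_rng(0)
n = 120  # e.g., 24 enterprises x 5 years (2003-2007)

auditor_indep = rng.integers(0, 2, n)    # 1 = from academia/business/civic groups
auditor_fin_exp = rng.integers(0, 2, n)  # 1 = financial expertise
ceo_indep = rng.integers(0, 2, n)        # 1 = from outside the enterprise
log_assets = rng.normal(10, 1, n)        # control: asset size
subsidy_ratio = rng.uniform(0, 0.5, n)   # control: subsidies / earnings
performance = (0.4 * auditor_indep + 0.3 * auditor_fin_exp
               + 0.1 * log_assets + rng.normal(0, 0.5, n))

# Design matrix with an intercept; OLS via least squares.
X = np.column_stack([np.ones(n), auditor_indep, auditor_fin_exp,
                     ceo_indep, log_assets, subsidy_ratio])
beta, *_ = np.linalg.lstsq(X, performance, rcond=None)
print(dict(zip(["const", "aud_indep", "aud_fin_exp",
                "ceo_indep", "log_assets", "subsidy"], beta.round(2))))
```

A full replication would add year dummies and significance tests (e.g., with a statistics package), but the estimator is the same least-squares fit.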


Assessment for the Utility of Treatment Plan QA System according to Dosimetric Leaf Gap in Multileaf Collimator (다엽콜리메이터의 선량학적엽간격에 따른 치료계획 정도관리시스템의 효용성 평가)

  • Lee, Soon Sung;Choi, Sang Hyoun;Min, Chul Kee;Kim, Woo Chul;Ji, Young Hoon;Park, Seungwoo;Jung, Haijo;Kim, Mi-Sook;Yoo, Hyung Jun;Kim, Kum Bae
    • Progress in Medical Physics
    • /
    • v.26 no.3
    • /
    • pp.168-177
    • /
    • 2015
  • When patients are treated with IMRT, which is complex and delicate, quality assurance of the treatment plan is recommended so that the plan can be evaluated accurately. For this purpose, treatment-plan quality assurance software can be used to verify the delivered dose before and after treatment. The purpose of this study is to evaluate the accuracy of treatment-plan quality assurance software for each IMRT plan according to the MLC DLG (dosimetric leaf gap). A Novalis Tx with a built-in HD120 MLC was used to acquire the MLC dynalog files imported into MobiusFx. IMRT plans were established with the Eclipse RTP system, and target and organ structures (multi-target, mock prostate, mock head/neck, and C-shape cases) were contoured in the I'mRT phantom. To examine the difference in dose distribution according to DLG, the MLC dynalog files were imported into the MobiusFx software and the DLG value was varied (0.5, 0.7, 1.0, 1.3, 1.6 mm) in MobiusFx. The dose distribution was evaluated using the 3D gamma index with a gamma criterion of 3% and a distance-to-agreement of 3 mm, and the point dose was acquired using a CC13 ionization chamber at the isocenter of the I'mRT phantom. For the point dose, the mock head/neck and multi-target cases differed by about 4% and 3% at DLGs of 0.5 and 0.7 mm, respectively, while the other DLGs differed by less than 3%. The gamma-index passing rates of the mock head/neck case were below 81% for the PTV and cord, and those of the multi-target case were below 30% for the center and superior targets at DLGs of 0.5 and 0.7 mm; however, the inferior target of the multi-target case and the parotid of the mock head/neck case had 100.0% passing rates at all DLGs. The point dose of the mock prostate case differed by less than 3.0% at all DLGs, but the passing rate of the PTV was below 95% at DLGs of 0.5 and 0.7 mm, and above 98% at the other DLGs. The rectum and bladder had 100.0% passing rates at all DLGs.
As the point-dose differences in the C-shape case were 3~9% except at the 1.3 mm DLG, the passing rates of the PTV at 1.0 and 1.3 mm were 96.7% and 93.0%, respectively; the passing rates at the other DLGs were below 86%, while the core had a 100.0% passing rate at all DLGs. In this study, we verified that the accuracy of a treatment-planning QA system can be affected by the DLG value. For precise quality assurance of treatment techniques that use MLC motion, such as IMRT and VMAT, an appropriate DLG value should be used in the linear accelerator and the RTP system.
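The 3%/3 mm gamma evaluation used above can be illustrated with a minimal one-dimensional sketch. This is a simplified global-gamma computation for intuition only; it is not the 3D algorithm used by MobiusFx, and the dose profiles are invented:

```python
import math

def gamma_index_1d(ref, evl, dx_mm, dose_tol=0.03, dta_mm=3.0):
    """Simple global 1D gamma (3%/3 mm) sketch.

    dose_tol is relative to the reference maximum; dx_mm is the
    sample spacing. Illustrative only, not a clinical implementation.
    """
    d_max = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = math.inf
        for j, de in enumerate(evl):
            dd = (de - dr) / (dose_tol * d_max)      # dose-difference term
            dist = (j - i) * dx_mm / dta_mm          # distance-to-agreement term
            best = min(best, math.hypot(dd, dist))
        gammas.append(best)
    return gammas

# Invented profiles: the evaluated dose is 2% hotter at the peak.
ref = [0.0, 0.5, 1.0, 0.5, 0.0]
evl = [0.0, 0.5, 1.02, 0.5, 0.0]
g = gamma_index_1d(ref, evl, dx_mm=1.0)
passing = 100.0 * sum(v <= 1.0 for v in g) / len(g)
print(f"passing rate: {passing:.1f}%")  # a point passes when gamma <= 1
```

A 2% peak deviation stays within the 3% criterion, so every point passes here; a larger DLG-induced dose error would push gamma above 1 and lower the passing rate, as observed in the study.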

Quantitative Assessment Technology of Small Animal Myocardial Infarction PET Image Using Gaussian Mixture Model (다중가우시안혼합모델을 이용한 소동물 심근경색 PET 영상의 정량적 평가 기술)

  • Woo, Sang-Keun;Lee, Yong-Jin;Lee, Won-Ho;Kim, Min-Hwan;Park, Ji-Ae;Kim, Jin-Su;Kim, Jong-Guk;Kang, Joo-Hyun;Ji, Young-Hoon;Choi, Chang-Woon;Lim, Sang-Moo;Kim, Kyeong-Min
    • Progress in Medical Physics
    • /
    • v.22 no.1
    • /
    • pp.42-51
    • /
    • 2011
  • Nuclear medicine images (SPECT, PET) are widely used tools for the assessment of myocardial viability and perfusion; however, it is difficult to define the myocardial infarct region accurately. The purpose of this study was to investigate a methodological approach for automatic measurement of rat myocardial infarct size using a polar map with an adaptive threshold. A rat myocardial infarction model was induced by ligation of the left circumflex artery. PET images were obtained after intravenous injection of 37 MBq $^{18}F$-FDG. After 60 min of uptake, each animal was scanned for 20 min with ECG gating. PET data were reconstructed using 2D ordered-subset expectation maximization (OSEM). To automatically delineate the myocardial contour and generate the polar map, we used the QGS software (Cedars-Sinai Medical Center). The reference infarct size was defined as the percentage of the total left myocardium occupied by the infarction area in TTC staining. We used three threshold methods: a predefined threshold, Otsu's method, and a multi-Gaussian mixture model (MGMM). The predefined-threshold method is commonly used in other studies; we applied threshold values from 10% to 90% in steps of 10%. The Otsu algorithm calculates the threshold that maximizes the between-class variance. The MGMM method estimates the distribution of image intensity using multiple Gaussian mixture models (MGMM2, ${\cdots}$, MGMM5) and calculates an adaptive threshold. The infarct size in the polar map was calculated as the percentage of the area below the threshold relative to the total polar-map area. The infarct sizes measured with the different threshold methods were evaluated by comparison with the reference infarct size. The mean differences between the polar-map defect sizes at predefined thresholds of 20%, 30%, and 40% and the reference infarct size were $7.04{\pm}3.44%$, $3.87{\pm}2.09%$ and $2.15{\pm}2.07%$, respectively; for Otsu's method the difference was $3.56{\pm}4.16%$, and for the MGMM methods $2.29{\pm}1.94%$.
The predefined threshold (30%) showed the smallest mean difference from the reference infarct size; however, MGMM was more accurate than the predefined threshold for reference infarct sizes under 10% (MGMM: 0.006%, predefined threshold: 0.59%). In this study, we evaluated myocardial infarct size in the polar map using a multi-Gaussian mixture model. The MGMM method provides an adaptive threshold for each subject and will be useful for automatic measurement of infarct size.
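Of the three threshold methods compared, Otsu's algorithm is the easiest to sketch: it scans candidate thresholds and keeps the one that maximizes the between-class variance. The toy intensity distribution below is invented, not the study's PET data:

```python
import random

def otsu_threshold(values, bins=32):
    """Otsu's method: choose the threshold maximizing between-class variance."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins
    hist = [0] * bins
    for v in values:
        hist[min(int((v - lo) / width), bins - 1)] += 1
    total = len(values)
    total_sum = sum((lo + (k + 0.5) * width) * hist[k] for k in range(bins))
    best_t, best_var, w0, sum0 = lo, -1.0, 0, 0.0
    for k in range(bins):
        w0 += hist[k]
        if w0 == 0 or w0 == total:
            continue
        sum0 += (lo + (k + 0.5) * width) * hist[k]
        m0 = sum0 / w0                        # mean of the low class
        m1 = (total_sum - sum0) / (total - w0)  # mean of the high class
        var = w0 * (total - w0) * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, lo + (k + 1) * width
    return best_t

# Bimodal toy "polar map" intensities: low-uptake infarct vs. normal myocardium.
random.seed(1)
vals = ([random.gauss(20, 3) for _ in range(200)]
        + [random.gauss(70, 5) for _ in range(800)])
t = otsu_threshold(vals)
print(round(t, 1))  # threshold falls between the two intensity modes
```

The MGMM approach in the paper generalizes this idea by fitting several Gaussian components and deriving the cut-off from the fitted mixture, which adapts better when the infarct class is very small.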

Application of LCA on Lettuce Cropping System by Bottom-up Methodology in Protected Cultivation (시설상추 농가를 대상으로 하는 bottom-up 방식 LCA 방법론의 농업적 적용)

  • Ryu, Jong-Hee;Kim, Kye-Hoon;Kim, Gun-Yeob;So, Kyu-Ho;Kang, Kee-Kyung
    • Korean Journal of Soil Science and Fertilizer
    • /
    • v.44 no.6
    • /
    • pp.1195-1206
    • /
    • 2011
  • This study applied LCA (life cycle assessment) methodology to lettuce (Lactuca sativa L.) production systems in Namyangju as a case study. Five lettuce-growing farms with three different farming systems (two organic farms, one farm without agricultural chemicals, and two conventional farms) were selected in Namyangju city, Gyeonggi province, Korea. The input data for LCA were collected by interviewing the farmers. The system boundary was set at one cropping season without heating and cooling systems, to reduce uncertainties in data collection and calculation. Sensitivity analysis was carried out to find the effect of the type and amount of fertilizer and of energy use on GHG (greenhouse gas) emissions. The GTG (gate-to-gate) inventory revealed that fertilizer and energy had the largest input quantities for producing 1 kg of lettuce, and pesticide the smallest. The electricity input was the largest on all farms except farm 1, which purchased seedlings from outside. The direct field emissions of $CO_2$, $CH_4$ and $N_2O$ were 6.79E-03 (farm 1), 8.10E-03 (farm 2), 1.82E-02 (farm 3), 7.51E-02 (farm 4) and 1.61E-02 (farm 5) kg $kg^{-1}$ lettuce, respectively. The LCI analysis focused on GHG showed that $CO_2$ emissions were 2.92E-01 (farm 1), 3.76E-01 (farm 2), 4.11E-01 (farm 3), 9.40E-01 (farm 4) and $5.37E-01kg\;CO_2\;kg^{-1}\;lettuce$ (farm 5), respectively. Carbon dioxide contributed the most to GHG emissions; it was mainly emitted in the process of energy production, which accounted for 67~91% of the $CO_2$ emissions across the production processes of the 5 farms.
Because a higher proportion of $CO_2$ emissions came from the production of compound fertilizer in the conventional crop system, the conventional system had a lower proportion of $CO_2$ emissions from energy production than the organic system did. With increasing inorganic fertilizer input, the lettuce cultivation process accounted for a higher proportion of $N_2O$ emissions; accordingly, farms 1 and 2 accounted for 87% of total $N_2O$ emissions, and farm 3 for 64%. The carbon footprints were 3.40E-01 (farm 1), 4.31E-01 (farm 2), 5.32E-01 (farm 3), 1.08E+00 (farm 4) and 6.14E-01 (farm 5) kg $CO_2$-eq. $kg^{-1}$ lettuce, respectively. Sensitivity analysis revealed that soybean meal was the most sensitive among the 4 types of fertilizer, and compound fertilizer the least sensitive of all fertilizer inputs. Electricity showed the largest sensitivity for $CO_2$ emissions, while the variation in $N_2O$ was almost zero.
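The gate-to-gate bookkeeping behind the carbon-footprint figures can be illustrated as a simple sum of input quantities times emission factors, plus direct field emissions. All numbers below are made-up placeholders, not the study's inventory data:

```python
# Illustrative carbon-footprint bookkeeping for one hypothetical farm,
# following the paper's gate-to-gate logic:
#   footprint = direct field emissions + sum(input amount x emission factor).
# Every value here is an assumed placeholder, not measured data.
inputs_per_kg_lettuce = {          # amount of each input per kg lettuce
    "compound_fertilizer": 0.02,   # kg
    "electricity_kwh": 0.30,       # kWh
    "pesticide": 0.001,            # kg
}
emission_factors = {               # kg CO2-eq per unit of input (assumed)
    "compound_fertilizer": 1.5,
    "electricity_kwh": 0.49,
    "pesticide": 10.0,
}
direct_field_emissions = 0.008     # kg CO2-eq per kg lettuce (assumed)

footprint = direct_field_emissions + sum(
    amount * emission_factors[name]
    for name, amount in inputs_per_kg_lettuce.items()
)
print(f"{footprint:.3f} kg CO2-eq per kg lettuce")
```

With real inventory data per farm, the same loop reproduces the per-farm carbon footprints reported above, and varying one factor at a time gives the sensitivity analysis.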

Animal Infectious Diseases Prevention through Big Data and Deep Learning (빅데이터와 딥러닝을 활용한 동물 감염병 확산 차단)

  • Kim, Sung Hyun;Choi, Joon Ki;Kim, Jae Seok;Jang, Ah Reum;Lee, Jae Ho;Cha, Kyung Jin;Lee, Sang Won
    • Journal of Intelligence and Information Systems
    • /
    • v.24 no.4
    • /
    • pp.137-154
    • /
    • 2018
  • Animal infectious diseases, such as avian influenza and foot-and-mouth disease, occur almost every year and cause huge economic and social damage to the country. To prevent this, the quarantine authorities have made various human and material efforts, but infectious diseases continue to occur. Avian influenza was first reported in 1878 and became a national issue due to its high lethality. Foot-and-mouth disease is considered the most critical animal infectious disease internationally; in nations free of the disease, it is recognized as an economic or political disease because it restricts international trade by complicating the import of processed and non-processed livestock, and because quarantine is costly. In a society where the whole nation is connected in a single zone of life, there is no way to fully prevent the spread of infectious disease. Hence, there is a need to be aware of the occurrence of disease and to take action before it spreads. For both human and animal infectious diseases, an epidemiological investigation of definitively diagnosed cases is carried out at the time of confirmation, and measures to prevent the spread of disease are taken according to the results. The foundation of an epidemiological investigation is figuring out where one has been and whom one has met. From a data perspective, this can be defined as predicting the cause of a disease outbreak, the outbreak location, and future infection by collecting and analyzing geographic data and relational data. Recently, attempts have been made to develop prediction models of infectious disease using Big Data and deep learning technology, but there has been little active research in the form of model-building studies and case reports.
KT and the Ministry of Science and ICT have been carrying out big data projects since 2014, as part of national R&D projects, to analyze and predict the routes of livestock-related vehicles. To prevent animal infectious diseases, the researchers first developed a prediction model based on regression analysis using vehicle movement data. After that, more accurate prediction models were constructed using machine learning algorithms such as logistic regression, Lasso, support vector machines and random forests. In particular, the 2017 prediction model added the risk of diffusion to facilities, and its performance was improved by tuning the model's hyper-parameters in various ways. The confusion matrix and ROC curve show that the 2017 model is superior to the earlier machine learning model. The difference between the 2016 and 2017 models is that the later model also uses visiting information on facilities such as feed factories and slaughterhouses, and information on poultry, which was limited to chicken and duck but has been expanded to goose and quail. In addition, in 2017 an explanation of the results was added to help the authorities make decisions and to establish a basis for persuading stakeholders. This study reports an animal infectious disease prevention system constructed on the basis of Big Data on hazardous vehicle movement, farms, and the environment. The significance of this study is that it describes the evolution of a prediction model using Big Data in the field; the model is expected to become more complete if the form of the viruses is taken into consideration. This will contribute to data utilization and analysis-model development in related fields. We also expect the system constructed in this study to provide more preventive and effective protection.
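The model comparison above relies on the confusion matrix and the ROC curve; both can be computed from scratch. The labels and risk scores below are invented toy data, and the AUC uses the rank (Mann-Whitney) formulation:

```python
def confusion_matrix(y_true, y_pred):
    """2x2 counts (TP, FP, FN, TN) for binary labels."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    return tp, fp, fn, tn

def roc_auc(y_true, scores):
    """Area under the ROC curve via the rank (Mann-Whitney U) formulation:
    the probability a random positive outscores a random negative."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented "outbreak risk" scores for eight farms (1 = outbreak occurred).
y = [1, 0, 1, 1, 0, 0, 1, 0]
scores = [0.9, 0.2, 0.8, 0.4, 0.3, 0.6, 0.7, 0.1]
pred = [int(s >= 0.5) for s in scores]       # threshold at 0.5
print(confusion_matrix(y, pred), round(roc_auc(y, scores), 3))
```

Comparing two models on the same held-out farms then reduces to comparing these numbers, which is how the 2017 model's superiority over the 2016 model is established in the study.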

A Study on Estimating Shear Strength of Continuum Rock Slope (연속체 암반비탈면의 강도정수 산정 연구)

  • Kim, Hyung-Min;Lee, Su-gon;Lee, Byok-Kyu;Woo, Jae-Gyung;Hur, Ik;Lee, Jun-Ki
    • Journal of the Korean Geotechnical Society
    • /
    • v.35 no.5
    • /
    • pp.5-19
    • /
    • 2019
  • Considering the natural phenomenon in which steep slopes ($65^{\circ}{\sim}85^{\circ}$) consisting of rock mass remain stable for decades, slopes steeper than 1:0.5 (the standard slope angle for blast rock) may be applied, at the design and initial construction stages, under geotechnical conditions similar to those above. In analysing the stability of a good-to-fair continuum rock slope that can be designed as a steep slope, a general method of estimating rock-mass strength properties is required from a design-practice perspective; practical and generalized engineering methods of determining the properties of a rock mass are important for such slopes. The Generalized Hoek-Brown (H-B) failure criterion and the GSI (Geological Strength Index), as revised and supplemented by Hoek et al. (2002), are assessed as rock-mass characterization systems that fully take the effects of discontinuities into account, and are widely utilized to calculate equivalent Mohr-Coulomb (M-C) shear strength (by balancing the areas) according to stress changes. The concept of calculating equivalent M-C shear strength according to the confining-stress range has been proposed, but on a slope the equivalent shear strength changes sensitively with the maximum confining stress (${{\sigma}^{\prime}}_{3max}$ or normal stress), making it difficult to use in practical design. In this study, a method of estimating the strength properties (an iso-angle division method) that can be applied universally within the maximum confining-stress range for a good-to-fair continuum rock-mass slope is proposed by applying the H-B failure criterion.
To assess the validity and applicability of the proposed method of estimating the shear strength (A), study slopes were selected by rock type (igneous, metamorphic, sedimentary) from steep slopes near existing working design sites, and the method was compared with the equivalent M-C shear strength (balancing the areas) proposed by Hoek. The equivalent M-C shear strengths of the balancing-the-areas method and the iso-angle division method were estimated using the RocLab program (geotechnical-properties calculation software based on the H-B failure criterion (2002)), with the laboratory triaxial rock compression tests from the existing working design sites and the face mapping of discontinuities on the rock slopes of the study area as basic data. The equivalent M-C shear strength calculated by the balancing-the-areas method tended to show very large or very small cohesion and internal friction angles (generally greater than $45^{\circ}$). The equivalent M-C shear strength of the iso-angle division method lies between the equivalent M-C shear properties of the balancing-the-areas method, with internal friction angles in the range of $30^{\circ}$ to $42^{\circ}$. We compared the shear strength (A) of the iso-angle division method at the study area with the shear strength (B) of existing working design sites with similar or identical RMR grades, and the applicability of the proposed iso-angle division method was indirectly evaluated through the results of stability analyses (limit equilibrium analysis and finite element analysis) using these strength properties. The difference between shear strengths A and B is about 10%. LEM results (in wet conditions) showed Fs (A) = 14.08~58.22 (average 32.9) and Fs (B) = 18.39~60.04 (average 32.2), which were similar for the same rock types.
The FEM results gave displacement (A) = 0.13~0.65 mm (average 0.27 mm) and displacement (B) = 0.14~1.07 mm (average 0.37 mm). Using the GSI and the Hoek-Brown failure criterion, significant results could be identified in the applicability evaluation. Therefore, the rock-mass strength properties estimated by the iso-angle division method can be applied as practical shear strengths.
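The Generalized Hoek-Brown (2002) parameters underlying both strength-estimation methods follow closed-form expressions in the GSI, the intact-rock constant $m_i$, and the disturbance factor D. The sketch below uses illustrative input values, not the study's site data:

```python
import math

def hoek_brown_params(gsi, mi, D=0.0):
    """Generalized Hoek-Brown (Hoek et al., 2002) rock-mass parameters.

    mb = mi * exp((GSI - 100) / (28 - 14 D))
    s  = exp((GSI - 100) / (9 - 3 D))
    a  = 1/2 + (1/6) * (exp(-GSI/15) - exp(-20/3))
    """
    mb = mi * math.exp((gsi - 100) / (28 - 14 * D))
    s = math.exp((gsi - 100) / (9 - 3 * D))
    a = 0.5 + (math.exp(-gsi / 15) - math.exp(-20 / 3)) / 6
    return mb, s, a

def sigma1(sigma3, sigma_ci, mb, s, a):
    """Major principal stress at failure for confinement sigma3 (same units
    as sigma_ci): sigma1 = sigma3 + sigma_ci * (mb*sigma3/sigma_ci + s)^a."""
    return sigma3 + sigma_ci * (mb * sigma3 / sigma_ci + s) ** a

# Illustrative inputs only (not the study's data): fair-quality rock mass.
mb, s, a = hoek_brown_params(gsi=60, mi=25, D=0.0)
print(round(mb, 2), round(s, 4), round(a, 3))
print(round(sigma1(1.0, 100.0, mb, s, a), 2))  # sigma1 in MPa at sigma3 = 1 MPa
```

Evaluating this failure envelope over a chosen ${{\sigma}^{\prime}}_{3max}$ range, and fitting a line to it, is what yields the equivalent M-C cohesion and friction angle discussed in the abstract; the two methods differ only in how that fit is performed.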

Evaluation on the Immunization Module of Non-chart System in Private Clinic for Development of Internet Information System of National Immunization Programme in Korea (국가 예방접종 인터넷정보시스템 개발을 위한 의원정보시스템의 예방접종 모듈 평가연구)

  • Lee, Moo-Sik;Lee, Kun-Sei;Lee, Seok-Gu;Shin, Eui-Chul;Kim, Keon-Yeop;Na, Bak-Ju;Hong, Jee-Young;Kim, Yun-Jeong;Park, Sook-Kyung;Kim, Bo-Kyung;Kwon, Yun-Hyung;Kim, Young-Taek
    • Journal of agricultural medicine and community health
    • /
    • v.29 no.1
    • /
    • pp.65-75
    • /
    • 2004
  • Objectives: Immunization has been one of the most effective measures for preventing infectious diseases, and keeping the immunization rate high and monitoring it continuously is an important element of national infectious-disease prevention policy. To this end, the Korean CDC introduced the National Immunization Registry Program (NIRP), which has been implemented since 2000 at the Public Health Centers (PHCs). The NIRP will be nearly complete once vaccination data are shared, connected and transferred between the public and private sectors. The aim of this study was to evaluate the immunization module of the non-chart system used in private clinics against the health information system of the public health centers (made by POSDATA Co., Ltd.) and the immunization registry program (made by BIT Computer Co., Ltd.). Methods: The analysis and survey were conducted by specialists in the medical, health, and health-information fields from November 2001 to January 2002, yielding an analysis of, and recommendations for, the immunization module of the non-chart system in private clinics. Results and Conclusions: To improve the immunization module, the system should be revised in various functions: receipt and registration, preliminary medical examination, reference and inquiry, registration of vaccines, printing of various sheets, transfer of vaccination data, issuing of vaccination certificates, reminder and recall, statistical calculation, and management of vaccine stock. An accurate assessment of the current immunization module is needed for each private non-chart system, and further studies will be necessary to keep the system accurate under the changing health policy related to the national immunization program. We hope that the results of this study may contribute to establishing the National Immunization Registry Program.


An Iterative, Interactive and Unified Seismic Velocity Analysis (반복적 대화식 통합 탄성파 속도분석)

  • Suh Sayng-Yong;Chung Bu-Heung;Jang Seong-Hyung
    • Geophysics and Geophysical Exploration
    • /
    • v.2 no.1
    • /
    • pp.26-32
    • /
    • 1999
  • Among the various seismic data processing steps, velocity analysis is the most time-consuming and man-hour-intensive. For production seismic data processing, a good velocity analysis tool, as well as a high-performance computer, is required; the tool must give fast and accurate velocity analysis. There are two different approaches to velocity analysis: batch and interactive. In batch processing, a velocity plot is made at every analysis point; generally, the plot consists of a semblance contour, a super gather, and a stack panel, and the interpreter chooses the velocity function by analyzing the plot. The technique is highly dependent on the interpreter's skill and requires human effort. As high-speed graphic workstations have become more popular, various interactive velocity analysis programs have been developed. Although these programs enable faster picking of velocity nodes with the mouse, their main improvement is simply the replacement of the paper plot by the graphic screen. The velocity spectrum is highly sensitive to the presence of noise, especially the coherent noise often found in the shallow region of marine seismic data. For accurate velocity analysis, this noise must be removed before the spectrum is computed, and the analysis must be carried out by carefully choosing the location of the analysis point and accurately computing the spectrum. The analyzed velocity function must be verified by mute and stack, and the sequence must usually be repeated. Therefore an iterative, interactive, and unified velocity analysis tool is highly desirable. Such an interactive velocity analysis program, xva (X-Window based Velocity Analysis), was developed. The program handles all processes required in velocity analysis, such as composing the super gather, computing the velocity spectrum, NMO correction, mute, and stack.
Most parameter changes lead to the final stack via a few mouse clicks, enabling iterative and interactive processing. A simple trace-indexing scheme is introduced, and a program to make the index of the Geobit seismic disk file was developed; the index is used to reference the original input, i.e., the CDP sort, directly. A transformation technique for the mute function between the T-X domain and the NMO-corrected (NMOC) domain is introduced and adopted in the program. The result of the transform is similar to the remove-NMO technique in suppressing shallow noise such as the direct wave and refracted wave, but it has two improvements: no interpolation error and very fast computation. With this technique, mute times can easily be designed in the NMOC domain and applied to the super gather in the T-X domain, producing a more accurate velocity spectrum interactively. The xva program consists of 28 files, 12,029 lines, 34,990 words and 304,073 characters. The program references the Geobit utility libraries and can be installed in a Geobit-preinstalled environment; it runs under X-Window/Motif, with a menu designed according to the Motif style guide. A brief usage of the program is discussed. The program allows the fast and accurate seismic velocity analysis necessary for computing AVO (Amplitude Versus Offset) based DHI (Direct Hydrocarbon Indicator) and for making high-quality seismic sections.
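The NMO correction that xva applies before mute and stack follows the standard hyperbolic moveout equation, $t(x) = \sqrt{t_0^2 + x^2/v^2}$, for zero-offset time $t_0$, offset $x$, and stacking velocity $v$. A minimal sketch with illustrative numbers (not xva's actual code):

```python
import math

def nmo_time(t0, offset, velocity):
    """Hyperbolic moveout: arrival time at a given offset for a flat
    reflector, t(x) = sqrt(t0^2 + (x/v)^2)."""
    return math.sqrt(t0 ** 2 + (offset / velocity) ** 2)

def nmo_correction(t0, offset, velocity):
    """Time shift removed by NMO correction at this offset; a correct
    velocity flattens the event so traces stack coherently."""
    return nmo_time(t0, offset, velocity) - t0

# Illustrative values: 1.0 s zero-offset time, 2000 m offset, 2500 m/s.
t = nmo_time(1.0, 2000.0, 2500.0)
dt = nmo_correction(1.0, 2000.0, 2500.0)
print(round(t, 3), round(dt, 3))
```

Velocity analysis amounts to scanning candidate velocities, applying this correction, and measuring coherence (semblance) across the gather; the mute-function transform described above moves mute picks between the uncorrected (T-X) and corrected (NMOC) time axes using the same relation.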

Medical Information Dynamic Access System in Smart Mobile Environments (스마트 모바일 환경에서 의료정보 동적접근 시스템)

  • Jeong, Chang Won;Kim, Woo Hong;Yoon, Kwon Ha;Joo, Su Chong
    • Journal of Internet Computing and Services
    • /
    • v.16 no.1
    • /
    • pp.47-55
    • /
    • 2015
  • Recently, hospital information system environments have tended to combine various smart technologies. Accordingly, various smart devices, such as smartphones and tablet PCs, are utilized in the medical information system. These environments also consist of various applications executing on heterogeneous sensors, devices, systems, and networks. In such hospital information system environments, applying security services through traditional access control methods causes problems. Most existing security systems use an access control list structure: access is permitted only as defined by an access control matrix of entries such as client name and service object method name. The major problem with this static approach is that it cannot quickly adapt to changed situations. Hence, new security mechanisms are needed that are more flexible and can easily be adapted to various environments with very different security requirements. In addition, research is needed to address changes in the medical treatment services provided to patients. In this paper, we suggest a dynamic approach to medical information systems in smart mobile environments. We focus on how to access medical information systems according to dynamic access control methods based on the hospital's existing information system environment. The physical environment consists of mobile X-ray imaging devices, dedicated mobile/general smart devices, PACS, an EMR server, and an authorization server. The synchronization and monitoring services for the mobile X-ray imaging equipment were developed on the .NET Framework under the Windows 7 OS. For the dedicated smart device application, we implemented dynamic access services through JSP and the Java SDK on the Android OS. Medical information exchanged among the hospital PACS, the mobile X-ray imaging devices, and the dedicated smart devices is based on the DICOM medical image standard. In addition, EMR information is based on HL7. 
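The static access-control-list structure the abstract critiques can be illustrated with a minimal sketch (the client and method names are hypothetical, not from the paper's system): each permission is a fixed (client, service-method) pair, so reacting to a new situation, such as a patient emergency, requires manually rewriting the matrix.

```python
# Minimal sketch of a static access control matrix, the structure used by
# the traditional systems described above. Names are hypothetical examples.
ACL = {
    ("dr_kim", "EMR.read"),
    ("dr_kim", "PACS.view"),
    ("nurse_lee", "EMR.read"),
}

def is_permitted(client: str, method: str) -> bool:
    """Static check: access is allowed only if this exact pair is listed."""
    return (client, method) in ACL
```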
To provide the dynamic access control service, we classify patient contexts according to bio-information such as oxygen saturation, heart rate, blood pressure, and body temperature. Event trace diagrams are presented for two cases: the general situation and the emergency situation. We then designed dynamic access to medical care information through an authentication method. The authentication information contains ID/PWD, roles, position and working hours, and emergency certification codes for emergency patients. In general situations, the dynamic access control method grants access to medical information according to the values of the authentication information. In an emergency, access to medical information is granted through an emergency code, without the authentication information. We also constructed an integrated medical information database schema consisting of medical, patient, medical staff, and medical image information according to medical information standards. Finally, we show the usefulness of the dynamic access application service on smart devices through execution results of the proposed system in patient contexts such as the general and emergency situations. In particular, the proposed system provides effective medical information services with smart devices in emergency situations through its dynamic access control methods. As a result, we expect the proposed system to be useful for u-hospital information systems and services.
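The two-branch policy described above can be sketched as follows. The vital-sign field names, threshold values, and credential keys are hypothetical placeholders, since the paper does not publish its exact rules; only the structure mirrors the described method: bio-information classifies the patient context, full authentication information is required in the general situation, and an emergency certification code alone grants access in an emergency.

```python
# Hypothetical normal ranges for bio-information; a real deployment would
# take these from clinical policy, not hard-code them here.
VITAL_LIMITS = {
    "spo2": (90, 100),         # oxygen saturation, %
    "heart_rate": (50, 120),   # beats per minute
    "body_temp": (35.0, 38.5), # degrees Celsius
}

def classify_context(vitals):
    """Classify a patient as 'general' or 'emergency' from bio-information."""
    for key, (lo, hi) in VITAL_LIMITS.items():
        value = vitals.get(key)
        if value is not None and not (lo <= value <= hi):
            return "emergency"
    return "general"

def authorize(context, credentials):
    """Dynamic access decision:
    - general situation: full authentication info (ID/PWD, role, duty status)
    - emergency situation: an emergency certification code alone suffices
    """
    if context == "emergency":
        return credentials.get("emergency_code") is not None
    return all(credentials.get(k) for k in ("user_id", "password", "role", "on_duty"))
```

The design point is that the decision depends on runtime patient context rather than on a fixed access control matrix, which is what lets the system respond to an emergency without pre-enumerated permissions.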