• Title/Summary/Keyword: 상용 소프트웨어 (commercial software)

Development of Intelligent OCR Technology to Utilize Document Image Data (문서 이미지 데이터 활용을 위한 지능형 OCR 기술 개발)

  • Kim, Sangjun;Yu, Donghui;Hwang, Soyoung;Kim, Minho
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2022.05a / pp.212-215 / 2022
  • In today's era of so-called digital transformation, the need to build and utilize big data has increased in various fields. A great deal of data is now produced and stored in a form friendly to digital devices and media, but for a long time in the past the production and storage of data was dominated by printed books. Optical Character Recognition (OCR) technology is therefore needed to turn the vast accumulation of printed books into usable big data. In this study, a system is proposed for digitizing the structure and content of the document objects inside a scanned book image. The proposed system consists of three main steps: 1) recognition of the region information of each document object (table, equation, figure, text body) in the scanned book image; 2) OCR processing of each recognized region by the corresponding text-body, table, or formula module; 3) aggregation of the processed document information, returned in JSON format. The model proposed in this study builds on open-source projects with additional training and improvement. The intelligent OCR proposed as a system in this study showed performance at the level of commercial OCR software in processing the four types of document objects (table, equation, figure, text body).
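As a rough illustration of the three-step flow described in this abstract, the following Python sketch wires a placeholder layout detector to per-type recognizer functions and serializes the result as JSON. `detect_regions` and the `recognizers` mapping are hypothetical stand-ins for the open-source models the authors adapted, not their actual code.

```python
import json
from typing import Callable, Dict, List

def detect_regions(page_image) -> List[Dict]:
    """Placeholder layout-analysis step: return a list like
    [{"type": "text" | "table" | "equation" | "figure", "bbox": [x0, y0, x1, y1]}]."""
    raise NotImplementedError

def run_pipeline(page_image, recognizers: Dict[str, Callable]) -> str:
    results = []
    for region in detect_regions(page_image):                   # step 1: detect document objects
        recognize = recognizers.get(region["type"])
        content = recognize(page_image, region["bbox"]) if recognize else None  # step 2: per-type OCR
        results.append({"type": region["type"], "bbox": region["bbox"], "content": content})
    return json.dumps({"objects": results}, ensure_ascii=False)  # step 3: aggregate as JSON
```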

Empirical and Numerical Analyses of a Small Planing Ship Resistance using Longitudinal Center of Gravity Variations (경험식과 수치해석을 이용한 종방향 무게중심 변화에 따른 소형선박의 저항성능 변화에 관한 연구)

  • Michael;Jun-Taek Lim;Nam-Kyun Im;Kwang-Cheol Seo
    • Journal of the Korean Society of Marine Environment & Safety / v.29 no.7 / pp.971-979 / 2023
  • Small ships (<499 GT) constitute 46% of existing ships and therefore account for a relatively large share of CO2 emissions. Operating in the optimal trim condition can reduce a ship's resistance, which results in lower greenhouse gas emissions. An affordable approach to trim optimization is to adjust the weight distribution to obtain an optimum longitudinal center of gravity (LCG). Therefore, in this study, the effect of LCG changes on the resistance of a small planing ship is studied using empirical and numerical analyses. The Savitsky method, implemented in Maxsurf Resistance, and the commercial computational fluid dynamics (CFD) software STAR-CCM+ are used for the empirical and numerical analyses, respectively. Finally, the total resistance across the design range is compared to obtain the optimum LCG. In summary, using numerical analysis, the optimum LCG is found at 46.2% of the length overall (LOA) at Froude number 0.56 and at 43.4% LOA at Froude number 0.63, which provides a significant resistance reduction of 41.12-45.16% compared with the reference point at 29.2% LOA.
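The optimum-LCG search described above amounts to evaluating total resistance over a range of LCG positions and keeping the minimum. A minimal Python sketch follows; `resistance_at` is a hypothetical hook for the Savitsky/Maxsurf or STAR-CCM+ result, and the Froude-number helper only restates the standard definition Fn = V / sqrt(gL).

```python
import numpy as np

def froude_number(speed_ms: float, length_m: float, g: float = 9.81) -> float:
    """Length Froude number Fn = V / sqrt(g * L)."""
    return speed_ms / np.sqrt(g * length_m)

def optimum_lcg(lcg_fractions, resistance_at):
    """lcg_fractions: candidate LCG positions as fractions of LOA (e.g. 0.29-0.50).
    resistance_at: callable returning total resistance for a given LCG fraction."""
    resistances = np.array([resistance_at(lcg) for lcg in lcg_fractions])
    best = int(np.argmin(resistances))
    return lcg_fractions[best], resistances[best]
```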

On-line Quality Assurance of Linear Accelerator with Electronic Portal Imaging System (전자포탈영상장치(EPID)를 이용한 선형가속기의 기하학적 QC/QA System)

  • Lee, Seok;Jang, Hye-Sook;Choi, Eun-Kyung;Kwon, Soo-Il;Lee, Byung-Yong
    • Progress in Medical Physics / v.9 no.3 / pp.127-136 / 1998
  • An on-line geometrical quality assurance system (OQuE) has been developed using an electronic portal imaging device (EPID). The EPID is networked to a Pentium PC so that acquired images can be transmitted to the analysis PC. Geometrical QA parameters, including light-radiation field congruence, the collimator rotation axis, and the gantry rotation axis, can be analyzed easily with the help of graphical user interface (GUI) software. Equipped with the EPID (Portal Vision, Varian, USA), geometrical quality assurance of a linear accelerator (CL/2100/CD, Varian, USA) networked to OQuE was performed to evaluate the system. Light-radiation field congruence tests by center-of-gravity analysis show differences of 0.2-0.3 mm for various field sizes. The collimator (or gantry) rotation axis could be obtained by superposing four shots taken at different angles. The radius of the collimator rotation axis was measured as 0.2 mm for the upper jaw and 0.1 mm for the lower jaw. Images acquired at various gantry angles were rotated according to the gantry angle and the actual image center obtained from the collimator axis test; the rotated images were superposed and analyzed in the same way as for the collimator rotation axis. The radius of the gantry rotation axis was calculated as 0.3 mm in the anterior/posterior direction (gantry 0° and 170°) and 0.7 mm in the right/left direction (gantry 90° and 260°). Image acquisition for data analysis is faster than the conventional method, and the results meet the development goal, being accurate to within a millimeter. The OQuE system has proven to be a good tool for the geometrical quality assurance of a linear accelerator using an EPID.
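The light-radiation field congruence test described above relies on a center-of-gravity analysis of the imaged field. The following Python sketch shows one plausible version of that step, assuming a 50%-of-maximum field threshold, a known light-field center in pixels, and an illustrative pixel spacing; it is not the OQuE implementation.

```python
import numpy as np

def field_centroid_offset(portal_image: np.ndarray, light_center_px, pixel_mm: float = 0.5):
    """Offset (row, col, in mm) of the radiation-field centre of gravity from the
    expected light-field centre, using a 50%-of-maximum field threshold."""
    in_field = portal_image >= 0.5 * portal_image.max()
    rows, cols = np.nonzero(in_field)
    centroid = np.array([rows.mean(), cols.mean()])
    return (centroid - np.asarray(light_center_px, dtype=float)) * pixel_mm
```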

Patients Setup Verification Tool for RT (PSVTS) : DRR, Simulation, Portal and Digital images (방사선치료 시 환자자세 검증을 위한 분석용 도구 개발)

  • Lee Suk;Seong Jinsil;Kwon Soo Il;Chu Sung Sil;Lee Chang Geol;Suh Chang Ok
    • Radiation Oncology Journal / v.21 no.1 / pp.100-106 / 2003
  • Purpose: To develop a patients' setup verification tool (PSVT) to verify the alignment of the machine and target isocenters and the reproducibility of patients' setup for three-dimensional conformal radiotherapy (3DCRT) and intensity-modulated radiotherapy (IMRT). The utility of the system is evaluated through phantom and patient case studies. Materials and methods: We developed and clinically tested a new method for patients' setup verification using digitally reconstructed radiography (DRR), simulation, portal, and digital images. The PSVT system was networked to a Pentium PC so that the acquired images could be transmitted to the PC for analysis. To verify the alignment of the machine and target isocenters, orthogonal pairs of simulation images were used as verification images. Errors in the isocenter alignment were measured by comparing the verification images with DRRs generated from CT images. Orthogonal films were taken of all patients once a week; these verification films were compared with the DRRs used for the treatment setup. By performing this procedure at every treatment, using a humanoid phantom and patient cases, localization errors can be analyzed and corrected by translation. The reproducibility of the patients' setup was verified using portal and digital images. Results: The PSVT system was developed to verify the alignment of the machine and target isocenters and the reproducibility of the patients' setup for 3DCRT and IMRT. The results show localization errors of 0.8 ± 0.2 mm (AP) and 1.0 ± 0.3 mm (lateral) for the brain cases, and 1.1 ± 0.5 mm (AP) and 1.0 ± 0.6 mm (lateral) for the pelvis cases. The reproducibility of the patients' setup was verified by visualization using real-time image acquisition, leading to the practical utilization of our software. Conclusions: A PSVT system was developed for verifying the alignment between the machine and target isocenters and the reproducibility of the patients' setup in 3DCRT and IMRT. With adjustment of the completed GUI-based algorithm and a good-quality DRR image, our software may be used for clinical applications.
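One common way to estimate the translational setup error between a DRR and a verification image is phase correlation; the Python sketch below illustrates that idea under the assumption of equally sized images that differ mainly by a shift. It is an illustrative method with assumed names and pixel spacing, not the PSVT algorithm.

```python
import numpy as np

def setup_error_mm(drr: np.ndarray, verification: np.ndarray, pixel_mm: float = 1.0):
    """Estimate the (row, col) translational setup error, in mm, by phase correlation."""
    cross_power = np.fft.fft2(drr) * np.conj(np.fft.fft2(verification))
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.fft.ifft2(cross_power).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    shape = np.array(corr.shape, dtype=float)
    shift = np.where(peak > shape / 2, peak - shape, peak)   # wrap large shifts to negative values
    return shift * pixel_mm
```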

Computational Fluid Dynamics Study of Channel Geometric Effect for Fischer-Tropsch Microchannel Reactor (전산유체역학을 이용한 Fischer-Tropsch 마이크로채널 반응기의 채널 구조 영향 분석)

  • Na, Jonggeol;Jung, Ikhwan;Kshetrimayum, Krishnadash S.;Park, Seongho;Park, Chansaem;Han, Chonghun
    • Korean Chemical Engineering Research / v.52 no.6 / pp.826-833 / 2014
  • Driven by both environmental and economic reasons, the development of small- to medium-scale GTL (gas-to-liquid) processes for offshore applications and for utilizing stranded or associated gas has recently been studied increasingly. Microchannel GTL reactors have been preferred over conventional GTL reactors for such applications because of their compactness and the short heat and mass transfer distances desired for high heat transfer performance and reactor conversion. In this work, a multi-microchannel reactor was simulated with the commercial CFD code ANSYS FLUENT to study the effect of microchannel geometry on the heat transfer phenomena. A heat generation curve was first calculated by modeling a Fischer-Tropsch reaction in a single-microchannel reactor model on a Matlab-ASPEN integration platform, and the calculated curve was then implemented in the CFD model. Four design variables based on the microchannel geometry, namely the coolant channel width, coolant channel height, coolant-channel-to-process-channel distance, and coolant-channel-to-coolant-channel distance, were selected for calculating three dependent variables: the heat flux, the maximum temperature of the coolant channel, and the maximum temperature of the process channel. The simulation results were visualized to understand the effects of the design variables on the dependent variables. The heat flux and the maximum temperatures of the coolant and process channels increased as the coolant channel width and height decreased. The coolant-channel-to-process-channel distance was found to have no effect on the heat transfer phenomena. Finally, the total heat flux increased and the maximum coolant channel temperature decreased as the coolant-channel-to-coolant-channel distance decreased. Using the qualitative trends revealed by the present study, appropriate process and coolant channel geometries, along with the distance between adjacent channels, can be recommended for a microchannel reactor that meets the desired heat transfer performance and hence reactor conversion of a Fischer-Tropsch microchannel reactor.
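The study design above maps four geometric design variables to three dependent variables, with one CFD run per case. The sketch below only organizes such a parametric sweep; `evaluate_case` is a hypothetical hook for the ANSYS FLUENT run, and the listed dimensions are placeholder values, not the paper's.

```python
from itertools import product

design_space = {
    "coolant_width_mm":      [0.5, 1.0, 1.5],
    "coolant_height_mm":     [0.5, 1.0, 1.5],
    "coolant_to_process_mm": [0.5, 1.0],
    "coolant_to_coolant_mm": [0.5, 1.0],
}

def run_study(evaluate_case):
    """evaluate_case(case) -> (heat_flux, max_T_coolant, max_T_process) for one CFD run."""
    results = []
    keys = list(design_space)
    for values in product(*design_space.values()):
        case = dict(zip(keys, values))
        heat_flux, t_cool, t_proc = evaluate_case(case)
        results.append({**case, "heat_flux": heat_flux,
                        "max_T_coolant": t_cool, "max_T_process": t_proc})
    return results
```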

Parallel Processing of Satellite Images using CUDA Library: Focused on NDVI Calculation (CUDA 라이브러리를 이용한 위성영상 병렬처리 : NDVI 연산을 중심으로)

  • LEE, Kang-Hun;JO, Myung-Hee;LEE, Won-Hee
    • Journal of the Korean Association of Geographic Information Studies / v.19 no.3 / pp.29-42 / 2016
  • Remote sensing allows information to be acquired across a large area without contacting the objects of interest, and it has therefore developed rapidly through application to many fields. With this development, satellite image resolution has also advanced rapidly, and satellites using remote sensing have been applied to research across many areas of the world. However, while remote sensing research is being carried out in various areas, research on data processing is still insufficient; as satellite resources develop further, data processing continues to lag behind. Accordingly, this paper discusses how to maximize the performance of satellite image processing by utilizing NVIDIA's CUDA (Compute Unified Device Architecture) library, a parallel processing technology. The discussion proceeds as follows. First, standard KOMPSAT (Korea Multi-Purpose Satellite) images of various sizes are subdivided into five types. NDVI (Normalized Difference Vegetation Index) is then computed for the subdivided images. Next, ArcMap and two implementations, one based on the CPU and one on the GPU, are used to compute NDVI. The histograms of each image are compared after each implementation to analyze the difference in processing speed between the CPU and GPU. The results indicate that both the CPU-version and GPU-version images are identical to the ArcMap images, and the histogram comparison confirms that the NDVI code was implemented correctly. In terms of processing speed, the GPU was five times faster than the CPU. Accordingly, this research shows that a parallel processing technique using the CUDA library can enhance the processing speed of satellite image data, and that the benefit would be even greater for more advanced remote sensing computations than for a simple per-pixel calculation such as NDVI.
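The per-pixel operation being parallelized is NDVI = (NIR - Red) / (NIR + Red). A minimal NumPy version is sketched below for reference; the paper's GPU implementation uses NVIDIA's CUDA library, and band loading, tiling, and kernel-launch details are outside this sketch.

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red); pixels with a zero sum become NaN."""
    red = red.astype(np.float32)
    nir = nir.astype(np.float32)
    denom = nir + red
    denom[denom == 0] = np.nan    # avoid division by zero
    return (nir - red) / denom
```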

Correlation Analysis of Cause factor through Ship Collision Accident, and Cause factor Analysis through Collision Time (선박 충돌사고의 원인요소 간 상관관계 및 충돌시간에 따른 원인요소 분석)

  • Youn, Donghyup;Shin, Ilsik
    • Journal of the Korean Society of Marine Environment & Safety / v.23 no.1 / pp.26-32 / 2017
  • The enlargement, increased speed, and diversification of ships have greatly increased the importance of marine transport. Among ship accident types, collisions are reported to occur second most frequently after engine damage, and the proportion of accidents attributable to human factors is also high. In addition, ship accidents usually arise from a combination of cause factors rather than a single one, so it is necessary to investigate the cause factors through written verdicts. This study identified the cause factors of ship collision accidents, on the basis of human factors, from the written verdicts provided by the Korean Maritime Safety Tribunal, and examined cause and effect through a correlation analysis of the accident occurrence factors. This study also analyzed the major cause factors of accidents in which the collision occurred at the zero minute, based on the time from the moment the other ship was detected to the moment of collision, in order to predict collision accidents. The commercial software Statistical Package for the Social Sciences (SPSS Ver. 21.0) was used for the correlation analysis. For the time analysis, the cause factors and times were analyzed using the time from the moment the other ship was detected to the moment of collision, as recorded in the written verdicts. The analysis showed that many collision accidents were caused by two or more cause factors; that cases with no time to spare for collision avoidance (zero minutes) accounted for 36.1%; and that negligent lookout for the other ship and sailing while drowsy or drinking contributed to accidents. Poor watchkeeping showed a very strong relationship with poor readiness for sailing.
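The correlation analysis described above can be reproduced in outline by coding each written verdict as a row of 0/1 cause-factor indicators and computing the pairwise correlation matrix. The pandas sketch below is illustrative only; the column names and dummy rows are assumptions, and the paper used SPSS Ver. 21.0 rather than pandas.

```python
import pandas as pd

# Each row is one written verdict; each column is a 0/1 cause-factor indicator.
cases = pd.DataFrame([
    {"negligent_lookout": 1, "drowsy_or_drinking": 0, "poor_watchkeeping": 1, "poor_sailing_readiness": 1},
    {"negligent_lookout": 1, "drowsy_or_drinking": 1, "poor_watchkeeping": 1, "poor_sailing_readiness": 0},
    {"negligent_lookout": 0, "drowsy_or_drinking": 0, "poor_watchkeeping": 0, "poor_sailing_readiness": 1},
    {"negligent_lookout": 0, "drowsy_or_drinking": 1, "poor_watchkeeping": 0, "poor_sailing_readiness": 0},
])
print(cases.corr())   # Pearson correlation matrix between cause factors
```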

On-Line Determination Steady State in Simulation Output (시뮬레이션 출력의 안정상태 온라인 결정에 관한 연구)

  • 이영해;정창식;경규형
    • Proceedings of the Korea Society for Simulation Conference / 1996.05a / pp.1-3 / 1996
  • In the analysis of systems using simulation, the automation of experiments is an area in which much research and development is currently under way. Taking simulations of computer and communication systems as an example, automated experiment control is required when simulations must be run for a large number of models. Unless the experimental procedure, including the number of replications, the run length, and the data collection method, is automated, the time and human resources needed for simulation experiments become very large, and the analysis of the output data also becomes difficult. To automate the experimental procedure while analyzing simulation output efficiently, the problem of removing the initial bias that always arises when a simulation is run must be solved first. Only when the data used for output analysis are collected in the steady state, free of initial bias, is a correct interpretation of the real system possible. In practice, the most important and difficult problem in simulation output analysis is to estimate the steady-state mean of the stochastic process formed by the output data and a confidence interval (c.i.) for that mean. The information contained in a confidence interval tells the decision maker how accurately the mean can be estimated. However, constructing a confidence interval cannot rely directly on classical statistical techniques, because the output data obtained from a single simulation run are generally nonstationary and autocorrelated; simulation output analysis techniques are used to address this problem. This paper proposes two new techniques for finding the truncation point of the output data needed to remove the initial bias: a method based on the Euclidean distance (ED) and a method using the backpropagation neural network (BNN) algorithm widely used for pattern classification. Unlike most existing techniques, these methods require no pilot run and can determine the truncation point on-line during a single simulation run. Existing work on the truncation point is as follows. Conway's rule decides that the current observation is the truncation point if it is neither the maximum nor the minimum of the subsequent data; because of the structure of the algorithm, on-line determination is impossible. Whereas Conway's rule cannot be applied on-line, the Modified Conway Rule (MCR) decides that the current observation is the truncation point if it is neither the maximum nor the minimum of the preceding data, so it can be applied on-line. The Crossings-of-the-Mean Rule (CMR) uses the cumulative mean and counts how many times the observations cross this mean from above or below; to use this rule a required number of crossings must be chosen, and in general a chosen crossing count is not applicable across systems. The Cumulative-Mean Rule (CMR2) plots the grand cumulative mean of the output data obtained from several pilot runs and determines the steady-state point visually; because it uses the cumulative mean of the means of data from several runs, on-line determination is impossible, and the analyst must decide arbitrarily by inspecting the graph. Welch's Method (WM) uses a Brownian bridge statistic, exploiting the property that, as n approaches infinity, it converges to the Brownian bridge distribution; the simulation output data are grouped into batches and one batch is used as the sample. This method has the drawbacks that the algorithm is complicated and a value must be estimated. The Law-Kelton Method (LKM) is based on regression theory: after the simulation ends, a regression line is fitted to the cumulative-mean data, and if the null hypothesis that the slope is zero is accepted, that point is taken as the truncation point. Since the data are used in the reverse of the order in which they were collected, after the simulation has ended, on-line determination is impossible. Welch's Procedure (WP) determines the truncation point visually from the moving averages of data collected in five or more simulation runs, and because it requires repeated deletion, on-line determination is impossible; in addition, a window size must be chosen. As discussed above, existing methods are weak from the viewpoint of on-line truncation-point determination during a single simulation run. Moreover, current commercial simulation software leaves it to the analyst to choose the truncation point arbitrarily, so the truncation point cannot be determined accurately and quantitatively for the system under study. If the user chooses the truncation point arbitrarily, the initial-bias problem is difficult to handle effectively, and the risk grows of deleting far more data than necessary or too little to remove the initial bias. Furthermore, most existing methods require pilot runs to find the truncation point; that is, simulation runs are needed solely to find the steady-state point, and because these runs are not used for the output analysis, the loss of time is large.
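Of the truncation rules surveyed above, the Modified Conway Rule is the simplest to run on-line: the current observation becomes the truncation point as soon as it is neither the maximum nor the minimum of the observations seen so far. A minimal Python sketch, with illustrative names, follows.

```python
from typing import Iterable, Optional

def mcr_truncation_point(observations: Iterable[float]) -> Optional[int]:
    """Return the index of the first observation that is strictly between the
    minimum and maximum of all earlier observations (on-line MCR truncation point)."""
    running_min, running_max = float("inf"), float("-inf")
    for i, y in enumerate(observations):
        if running_min < y < running_max:
            return i
        running_min = min(running_min, y)
        running_max = max(running_max, y)
    return None   # no truncation point found in this run
```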

An Iterative, Interactive and Unified Seismic Velocity Analysis (반복적 대화식 통합 탄성파 속도분석)

  • Suh Sayng-Yong;Chung Bu-Heung;Jang Seong-Hyung
    • Geophysics and Geophysical Exploration / v.2 no.1 / pp.26-32 / 1999
  • Among the various seismic data processing sequences, velocity analysis is the most time-consuming and man-hour-intensive processing step. For production seismic data processing, a good velocity analysis tool as well as a high-performance computer is required; the tool must give fast and accurate velocity analysis. There are two different approaches to velocity analysis, batch and interactive. In batch processing, a velocity plot is made at every analysis point; the plot generally consists of a semblance contour, a super gather, and a stack panel, and the interpreter chooses the velocity function by analyzing the plot. The technique is highly dependent on the interpreter's skill and requires considerable human effort. As high-speed graphics workstations have become more popular, various interactive velocity analysis programs have been developed. Although these programs enable faster picking of velocity nodes with the mouse, their main improvement is simply the replacement of the paper plot by the graphics screen. The velocity spectrum is highly sensitive to the presence of noise, especially the coherent noise often found in the shallow region of marine seismic data. For accurate velocity analysis this noise must be removed before the spectrum is computed, the location of the analysis point must be chosen carefully, and the spectrum must be computed accurately. The analyzed velocity function must then be verified by mute and stack, and the sequence must usually be repeated. Therefore an iterative, interactive, and unified velocity analysis tool is highly desirable. An interactive velocity analysis program, xva (X-Window based Velocity Analysis), was developed. The program handles all processes required in velocity analysis, such as composing the super gather, computing the velocity spectrum, NMO correction, mute, and stack. Most parameter changes lead to the final stack via a few mouse clicks, thereby enabling iterative and interactive processing. A simple trace indexing scheme is introduced, and a program to make the index of the Geobit seismic disk file was developed; the index is used to reference the original input, i.e., the CDP sort, directly. A transformation technique for the mute function between the T-X domain and the NMO-corrected (NMOC) domain is introduced and adopted in the program. The result of the transform is similar to the remove-NMO technique in suppressing shallow noise such as the direct wave and refracted wave, but it has two improvements: no interpolation error and very fast computation. With this technique, mute times can easily be designed in the NMOC domain and applied to the super gather in the T-X domain, producing a more accurate velocity spectrum interactively. The xva program consists of 28 files, 12,029 lines, 34,990 words, and 304,073 characters. The program references the Geobit utility libraries and can be installed in a Geobit-preinstalled environment. It runs in the X-Window/Motif environment, and the menu is designed according to the Motif style guide. A brief usage of the program has been discussed. The program allows fast and accurate seismic velocity analysis, which is necessary for computing the AVO (Amplitude Versus Offset)-based DHI (Direct Hydrocarbon Indicator) and for making high-quality seismic sections.
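The NMO correction and the T-X/NMOC mute transform discussed above both rest on the hyperbolic moveout relation t(x) = sqrt(t0^2 + x^2 / v^2). The Python sketch below applies it to a single trace with a constant velocity and linear interpolation; it is an illustrative simplification, not the xva code.

```python
import numpy as np

def nmo_correct(trace: np.ndarray, dt: float, offset: float, velocity: float) -> np.ndarray:
    """Map a single trace to zero-offset time using t(x) = sqrt(t0^2 + x^2 / v^2)."""
    t0 = np.arange(len(trace)) * dt                   # output (zero-offset) two-way times
    tx = np.sqrt(t0**2 + (offset / velocity) ** 2)    # input time each output sample comes from
    return np.interp(tx, t0, trace, left=0.0, right=0.0)
```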
