• Title/Summary/Keyword: toolkit

Search Results: 371 (processing time: 0.026 seconds)

Comparison Study on Low Energy Physics Model of GEANT4 (GEANT4 저 에너지 전자기 물리 모델에 대한 비교 연구)

  • Park, So-Hyun;Jung, Won-Gyun;Suh, Tae-Suk
    • Journal of Radiation Protection and Research
    • /
    • v.35 no.3
    • /
    • pp.124-134
    • /
    • 2010
  • The Geant4 simulation toolkit provides improved or renewed physics models with each version. The latest release, Geant4.9.3, which has been partially recoded by its developers, incorporates Livermore data and updated physics models into the low-energy electromagnetic physics package, and its modified code improves several physics quantities. In this study, stopping power and CSDA (Continuous Slowing Down Approximation) range data for electrons and other particles were obtained in various materials and compared with NIST (National Institute of Standards and Technology) data. Through this comparison between Geant4 simulation results and NIST data, the improvement of the low-energy electromagnetic physics model in Geant4.9.3 was evaluated against Geant4.9.2.
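
The comparison described in this abstract amounts to tabulating relative deviations of simulated stopping powers from NIST reference values. A minimal sketch in Python, with made-up numbers (not the paper's data):

```python
# Hypothetical illustration: comparing simulated stopping-power values
# against NIST reference data via relative deviation. All numbers below
# are invented for demonstration; they are not from the paper.

def relative_deviation(simulated, reference):
    """Percent deviation of a simulated value from a reference value."""
    return 100.0 * (simulated - reference) / reference

# (energy [MeV], Geant4 value, NIST value) -- illustrative only
samples = [
    (0.1, 4.10, 4.12),
    (1.0, 1.84, 1.85),
    (10.0, 1.97, 1.96),
]

for energy, g4, nist in samples:
    dev = relative_deviation(g4, nist)
    print(f"E = {energy:5.1f} MeV: deviation = {dev:+.2f}%")
```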

A Study for properties of IK system to 3D character animation education (3D 캐릭터 애니메이션 교육을 위한 IK SYSTEM 특성 연구(Bone, Character Studio, CAT을 중심으로))

  • Cho, Hyung-Ik
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2011.10a
    • /
    • pp.519-523
    • /
    • 2011
  • Today, one of the main reasons that 3D software has become a core tool in video content fields such as film, animation, commercials, motion graphics, and games is that it can reduce production budgets, achieve better effects than conventional methods such as miniatures, matte painting, and mobilizing extras, save time, and work without spatial limitations. In this paper, I analyze the characteristics of IK (inverse kinematics) systems for the efficient education of 3D character animation, focusing on the most widely used 3D applications, which are now nearly essential in games, animation, film, and other content. By analyzing the merits and demerits of Bone, Character Studio, and CAT (Character Animation Toolkit), the inverse kinematics systems most used in practice, I show which of these three tools is most helpful and beneficial to teach students, given the limited time available for instruction in universities.

  • PDF
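
The inverse-kinematics problem that Bone, Character Studio, and CAT all solve can be illustrated with a textbook two-bone analytic solver; this generic sketch is not code from any of those tools:

```python
import math

# Minimal 2D two-bone analytic IK sketch (law of cosines): given a
# target point and two segment lengths, find joint angles that place
# the end effector on the target. A generic textbook illustration of
# what character-animation IK systems compute internally.

def two_bone_ik(x, y, l1, l2):
    """Return (shoulder, elbow) angles in radians reaching target (x, y)."""
    d2 = x * x + y * y
    d = math.sqrt(d2)
    if d > l1 + l2 or d < abs(l1 - l2):
        raise ValueError("target out of reach")
    # Elbow angle from the law of cosines.
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: direction to target minus the chain's interior offset.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

s, e = two_bone_ik(1.5, 0.5, 1.0, 1.0)
print(f"shoulder = {s:.3f} rad, elbow = {e:.3f} rad")
```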

The Evaluation about the Information Fidelity in the External Image Information Input - Using DICOM Validation Tool - (외부영상정보 입력 시 DICOM정보 충실성에 대한 평가 - DICOM Validation Tool 이용 -)

  • Lee, Song-Woo;Lee, Ho-Yeon;Do, Ji-Hoon;Jang, Hye-Won
    • Korean Journal of Digital Imaging in Medicine
    • /
    • v.13 no.1
    • /
    • pp.33-38
    • /
    • 2011
  • Nowadays, most hospitals have switched to PACS, which is based on the DICOM 3.0 standard. The use of DICOM 3.0 improves medical image exchange and service for patients. However, compatibility problems arise when images are carried between hospitals on CD and DVD. For this reason, this study analyzed patient images on the storage media submitted to hospitals in Seoul, using the Validation Toolkit recommended by the KFDA. One hundred data sets were prepared for each of MRI, CT, plain X-ray, ultrasound, and PET-CT (500 in total), and the types of errors and the fidelity of the information were analyzed. The error rates were as follows: MRI 5%, plain X-ray 11%, CT 18%, US 25%, and PET-CT 30%. The high error rate for PET-CT stems from incomplete support for the standard, indicating insufficient fidelity to the image information. Even though PET-CT showed the highest error rate, current DICOM data are much improved compared to the past. Moreover, conformance to the IHE profiles and the DICOM rules should be pursued. In conclusion, these results can help radiographers analyze image information by providing clues for solving the primary problems, and PACS and equipment vendors can improve fidelity to the image-information standard by recognizing the actual problems that occur during image transfer.

  • PDF
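
The kind of fidelity check the Validation Toolkit performs can be illustrated with a toy validator that flags missing or empty mandatory attributes; the attribute list below is a small illustrative subset, not the full DICOM conformance rules:

```python
# Toy sketch of a DICOM-style fidelity check: verify that mandatory
# attributes are present and non-empty. The MANDATORY list is a small
# illustrative subset, not the actual DICOM/IHE requirements.

MANDATORY = ["PatientID", "StudyInstanceUID", "SeriesInstanceUID",
             "SOPInstanceUID", "Modality"]

def validate(header):
    """Return the attribute names that are missing or empty."""
    return [tag for tag in MANDATORY if not header.get(tag)]

good = {"PatientID": "P001", "StudyInstanceUID": "1.2.3",
        "SeriesInstanceUID": "1.2.3.4", "SOPInstanceUID": "1.2.3.4.5",
        "Modality": "CT"}
bad = {"PatientID": "P002", "Modality": ""}

print(validate(good))  # []
print(validate(bad))   # the missing/empty attributes
```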

Prediction of Stream Flow on Probability Distributed Model using Multi-objective Function (다목적함수를 이용한 PDM 모형의 유량 분석)

  • Ahn, Sang-Eok;Lee, Hyo-Sang;Jeon, Min-Woo
    • Journal of the Korean Society of Hazard Mitigation
    • /
    • v.9 no.5
    • /
    • pp.93-102
    • /
    • 2009
  • A prediction of streamflow based on a multi-objective function is presented to examine the performance of the Probability Distributed Model (PDM) in the Miho stream basin, Chungcheongbuk-do, Korea. PDM is a lumped conceptual rainfall-runoff model that has been widely used for flood prevention by the UK Environment Agency. The Monte Carlo Analysis Toolkit (MCAT) is a set of numerical analysis tools based on population sampling that allows evaluation of performance, identifiability, regional sensitivity, and so on. Five PDM model parameters were calibrated using MCAT. The results show that the model parameters cmax and k(q) have high identifiability, while the others exhibit equifinality. In addition, a multi-objective function was applied to PDM to find suitable model parameters. Its solution consists of the Pareto set, accounting for the trade-offs between the different objective functions while considering properties of the hydrograph. The results indicate that the model performance and simulated hydrograph are acceptable in terms of the Nash-Sutcliffe Efficiency* (=0.035), FSB (=0.161), and FDBH (=0.809) for both the calibration and validation periods.
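
One of the criteria reported above is the Nash-Sutcliffe Efficiency; a standard (unstarred) form can be sketched as follows, with illustrative flow values:

```python
# Sketch of the Nash-Sutcliffe Efficiency (NSE), a common objective
# function for judging rainfall-runoff model calibration. The flow
# series below are illustrative, not data from the paper.

def nse(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

obs = [1.0, 3.0, 5.0, 4.0, 2.0]
sim = [1.2, 2.8, 4.9, 4.3, 1.9]
print(f"NSE = {nse(obs, sim):.3f}")  # NSE = 0.981; 1.0 is a perfect fit
```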

Provisioning Scheme of Large Volume File for Efficient Job Execution in Grid Environment (그리드 환경에서 효율적인 작업 처리를 위한 대용량 파일 프로비저닝 방안)

  • Kim, Eun-Sung;Yeom, Beon-Y.
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.15 no.8
    • /
    • pp.525-533
    • /
    • 2009
  • Staging techniques are used to provide files for a job in the Grid. If a staged file is large, the start of the job is delayed and Grid job throughput may decrease. Removing the staging overhead therefore helps the Grid operate more efficiently. In this paper, we present two methods for efficient file provisioning that eliminate this overhead. First, we propose RA-RFT, which extends the RFT (Reliable File Transfer) service of the Globus Toolkit so that it can use replica information from RLS (Replica Location Service); RA-RFT reduces file transfer time by performing partial transfers from the replicas in parallel. Second, we suggest Remote Link, which uses remote I/O instead of file transfer; Remote Link saves storage on computational nodes and enables fast file provisioning via prefetching. Through various experiments, we show that both methods outperform existing staging techniques.
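
The RA-RFT idea of parallel partial transfers from replicas can be sketched conceptually; replicas below are simulated as in-memory byte strings, whereas the real system issues GridFTP partial transfers:

```python
from concurrent.futures import ThreadPoolExecutor

# Conceptual sketch of replica-aware provisioning: split a file into
# byte ranges and fetch each range from a different replica in
# parallel, then reassemble. Replicas are simulated as byte strings;
# this is an illustration of the idea, not the RA-RFT implementation.

def fetch_range(replica, start, end):
    """Simulated partial transfer: bytes [start, end) from one replica."""
    return replica[start:end]

def parallel_fetch(replicas, size):
    chunk = -(-size // len(replicas))  # ceiling division
    ranges = [(i * chunk, min((i + 1) * chunk, size))
              for i in range(len(replicas))]
    with ThreadPoolExecutor(max_workers=len(replicas)) as pool:
        parts = pool.map(lambda args: fetch_range(*args),
                         [(r, s, e) for r, (s, e) in zip(replicas, ranges)])
    return b"".join(parts)

data = b"0123456789" * 100        # the "large" file (1000 bytes)
replicas = [data, data, data]     # three identical replicas
assert parallel_fetch(replicas, len(data)) == data
```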

Bragg-curve simulation of carbon-ion beams for particle-therapy applications: A study with the GEANT4 toolkit

  • Hamad, Morad Kh.
    • Nuclear Engineering and Technology
    • /
    • v.53 no.8
    • /
    • pp.2767-2773
    • /
    • 2021
  • We used the GEANT4 Monte Carlo (MC) toolkit to simulate carbon-ion beams incident on water, tissue, and bone, taking nuclear fragmentation reactions into account. As the energy of the primary beam increases, the Bragg peak moves deeper into the phantom. Across the materials, the peak lies at a shallower depth along the beam direction and becomes sharper as the electron density NZ increases. The simulated depth-dose (Bragg) curves were then benchmarked against experimental data from GSI in Germany; the results agree reasonably with the GSI data, with an accuracy of between 0.02 and 0.08 cm, thus establishing a basis for adopting MC in heavy-ion treatment planning. A Kolmogorov-Smirnov (K-S) test further confirmed, from a statistical point of view, that the simulated data match the measured data very well. Two-dimensional isodose contours at the entrance were compared with those around the peak position and in the tail region beyond the peak, showing that bone receives more dose than either water or tissue because of secondary doses. In water, the maximum energy deposited per fragment is attributed mainly to secondary carbon ions, followed by secondary boron and beryllium. Protons are produced in the largest numbers and therefore make the largest contribution to the total dose in the tail region. Finally, the associated neutron and photon spectra were analyzed: the mean neutron energy was 16.29 MeV, and the mean secondary-gamma energy was 1.03 MeV. The neutron dose, however, was negligible compared with the total dose because of the neutrons' longer range.
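
The two-sample Kolmogorov-Smirnov statistic used here to compare simulated and measured depth-dose data is the maximum distance between the two empirical CDFs; a plain-Python sketch with illustrative samples:

```python
import bisect

# Sketch of the two-sample Kolmogorov-Smirnov statistic: the maximum
# absolute difference between the empirical CDFs of two samples.
# The sample values below are illustrative, not the paper's data.

def ks_statistic(sample1, sample2):
    s1, s2 = sorted(sample1), sorted(sample2)
    n1, n2 = len(s1), len(s2)
    xs = sorted(set(sample1) | set(sample2))
    # ECDF(x) = fraction of sample points <= x, via binary search.
    return max(abs(bisect.bisect_right(s1, x) / n1
                   - bisect.bisect_right(s2, x) / n2) for x in xs)

sim  = [1.0, 1.1, 1.3, 1.7, 2.0]
meas = [1.0, 1.2, 1.3, 1.6, 2.1]
print(f"D = {ks_statistic(sim, meas):.2f}")  # D = 0.20
```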

Optimal Deployment for Evacuation Safety Zone at Intermodal Transfer Station (복합환승센터 피난대피구역 적정 배치 방법론 개발)

  • You, So-Young;Jeong, Eunbi
    • The Journal of The Korea Institute of Intelligent Transport Systems
    • /
    • v.18 no.1
    • /
    • pp.27-42
    • /
    • 2019
  • It is not easy to evacuate in an emergency in deep underground space, because spatial perception and judgment are readily impaired there. Rather than relying on safe evacuation to the outside within the given time, an evacuation safety zone needs to be designed and installed. In this study, PATS (Pedestrian-movement-based Assessment Toolkit for Simulation) is applied to build a comprehensive analytic framework for finding the optimal (or appropriate) number and locations of evacuation safety zones. Using two emergency scenarios at an intermodal transfer center with six underground floors, problematic locations on the evacuation paths were identified and appropriate zone locations were proposed.

Efficient Null Pointer Dereference Vulnerability Detection by Data Dependency Analysis on Binary (효율적 데이터 의존성 분석을 이용한 바이너리 기반 Null Pointer Dereference 취약점 탐지 도구)

  • Wenhui Jin;Heekuck Oh
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.33 no.2
    • /
    • pp.253-266
    • /
    • 2023
  • The null pointer dereference vulnerability is a significant vulnerability that can enable severe attacks such as denial of service. Previous research has proposed detection methods, but large and complex programs challenge their efficiency. In this paper, we present a lightweight tool that detects the vulnerability in specific functions of large binary programs by symbolizing variables and emulating program execution. The tool detects vulnerabilities through data dependency analysis and heuristics along each execution path. While our tool had an 8% higher false positive rate than bap_toolkit, it detected all existing vulnerabilities in our dataset.
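
The core idea, following a possibly-NULL value through data dependencies to an unchecked dereference, can be shown on a toy made-up IR; real tools like the one in this paper operate on binaries with emulation and symbolic variables:

```python
# Toy sketch of null-pointer-dereference detection via data dependency
# tracking along one execution path of an invented mini-IR. This is an
# illustration of the general idea only, not the paper's tool.

def detect_null_deref(instructions):
    """instructions: list of (op, dst, src) tuples along a single path.
    Ops: 'call_may_return_null', 'assign', 'null_check', 'deref'."""
    maybe_null = set()
    findings = []
    for i, (op, dst, src) in enumerate(instructions):
        if op == "call_may_return_null":
            maybe_null.add(dst)
        elif op == "assign" and src in maybe_null:
            maybe_null.add(dst)          # propagate the dependency
        elif op == "null_check":
            maybe_null.discard(dst)      # value was checked before use
        elif op == "deref" and src in maybe_null:
            findings.append((i, src))    # unchecked dereference
    return findings

path = [
    ("call_may_return_null", "p", None),
    ("assign", "q", "p"),
    ("deref", None, "q"),   # flagged: q depends on unchecked p
]
print(detect_null_deref(path))  # [(2, 'q')]
```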

System Journey Map Based on Touch Point (터치포인트를 기반으로 한 시스템 여정 맵)

  • Yoo, Jae Yeon;Pan, Young Hwan
    • Design Convergence Study
    • /
    • v.14 no.2
    • /
    • pp.17-32
    • /
    • 2015
  • Customers and providers share the same objective with respect to a service, but their perspectives differ distinctly. Existing methodologies have mostly considered only the customer-centric perspective, or have treated the provider's perspective separately. Since the two perspectives have common points as well as differences, it is necessary to combine them into one. Therefore, this study proposes a System Journey Map from the internal staff's point of view, based on the customer's tasks in the service. The System Journey Map consists of four parts: a customer journey map, the performance of internal staff, internal staff satisfaction evaluation, and senior staff's assessment of internal staff performance. After a service is released, the map visualizes the points of contact between customers and internal staff, identifies specific behavioral patterns, and shows where problems occur; we therefore expect this toolkit to be a useful map for understanding and identifying problems in otherwise intangible services.

Data Model Study for National Research Data Commons Service (국가연구데이터커먼즈 서비스를 위한 데이터모델 연구)

  • Cho, Minhee;Lee, Mikyoung;Song, Sa-kwang;Yim, Hyung-Jun
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2022.10a
    • /
    • pp.436-438
    • /
    • 2022
  • The National Research Data Commons aims to build a system in which the analysis resources used for data analysis, such as computing infrastructure, software, toolkits, APIs, and services, are organized together with research data so that they can be used jointly, maximizing the use of research data. Sharing and utilization systems for publications and research data in the R&D process are well established. However, environments in which data and their tightly coupled software and computing infrastructure can be shared and utilized are scarce, and no management system for them exists. In this study, a data model is designed to systematically manage information on the digital research resources required in the data-oriented R&D process. It will be used to register and manage digital research resource information in the National Research Data Commons Service.

  • PDF
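
A registry of coupled digital research resources, as described in this abstract, can be sketched with a minimal data model; the field names and identifiers below are hypothetical, not the schema proposed in the paper:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a minimal data model for registering digital
# research resources (datasets, software, toolkits, APIs, computing
# infrastructure) together with their couplings. All names and
# identifiers are illustrative.

@dataclass
class ResearchResource:
    identifier: str
    title: str
    resource_type: str            # e.g. "dataset", "software", "api"
    related: list = field(default_factory=list)  # coupled resource ids

registry = {}

def register(resource):
    registry[resource.identifier] = resource

register(ResearchResource("doi:10.0000/demo-data", "Demo dataset", "dataset"))
register(ResearchResource("swh:demo-tool", "Analysis toolkit", "software",
                          related=["doi:10.0000/demo-data"]))
print(sorted(registry))
```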