• Title/Summary/Keyword: error distribution


Enhancement of Image Contrast in Linacgram through Image Processing (전산처리를 통한 Linacgram의 화질개선)

  • Suh, Hyun-Suk;Shin, Hyun-Kyo;Lee, Re-Na
    • Radiation Oncology Journal
    • /
    • v.18 no.4
    • /
    • pp.345-354
    • /
    • 2000
  • Purpose : Portal images used in conventional radiation therapy have low contrast. The purpose of this study was to enhance the image contrast of a linacgram by developing a low-cost image processing method. Materials and Methods : A chest linacgram was obtained by irradiating a humanoid phantom and was scanned with a Diagnostic-Pro scanner for image processing. Several scan methods were used: optical density scan, histogram equalized scan, linear histogram based scan, linear histogram independent scan, linear optical density scan, logarithmic scan, and power square root scan. The histogram distributions of the scanned images were plotted, and the gray-scale ranges were compared among the scan types. The scanned images were then transformed to the gray window by a palette fitting method, and the contrast of the reprocessed portal images was evaluated for image improvement. Portal images of patients were also taken at various anatomic sites, and the images were processed by the Gray Scale Expansion (GSE) method. The patient images were analyzed to examine the feasibility of using the GSE technique in the clinic. Results : The histogram distributions showed that minimum and maximum gray-scale ranges of 3192 and 21940 were obtained when the image was scanned using the logarithmic method and the square root method, respectively. Out of the 256 gray-scale steps, only 7% to 30% were used. After expanding the gray scale to the full range, the contrast of the portal images was improved. Experiments performed with patient images showed that GSE improved the identification of organs in portal images of the knee joint, head and neck, lung, and pelvis. Conclusion : The phantom study demonstrated that the GSE technique improved the image contrast of a linacgram. This indicates that the decrease in image quality resulting from the dual exposure could be improved by expanding the gray scale. As a result, the improved technique will make it possible to compare digitally reconstructed radiographs (DRR) and simulation images for evaluating patient positioning error.
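
A minimal sketch of the gray-scale expansion idea described above, stretching the narrow occupied gray-scale range of a portal image to the full display range; this is a generic contrast-stretching implementation with assumed array shapes, not the authors' palette-fitting code:

```python
import numpy as np

def gray_scale_expansion(image, out_min=0, out_max=255):
    """Linearly stretch the occupied gray-scale range to the full output range."""
    img = image.astype(np.float64)
    lo, hi = img.min(), img.max()
    if hi == lo:                      # flat image: nothing to stretch
        return np.full_like(image, out_min)
    stretched = (img - lo) / (hi - lo) * (out_max - out_min) + out_min
    return np.clip(stretched, out_min, out_max).astype(np.uint8)

# Example: a low-contrast portal image occupying only a small part of the 0-255 range
rng = np.random.default_rng(0)
portal = rng.integers(100, 140, size=(256, 256)).astype(np.uint8)
enhanced = gray_scale_expansion(portal)
print(portal.min(), portal.max(), "->", enhanced.min(), enhanced.max())
```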


Modeling and mapping fuel moisture content using equilibrium moisture content computed from weather data of the automatic mountain meteorology observation system (AMOS) (산악기상자료와 목재평형함수율에 기반한 산림연료습도 추정식 개발)

  • Lee, HoonTaek;WON, Myoung-Soo;YOON, Suk-Hee;JANG, Keun-Chang
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.22 no.3
    • /
    • pp.21-36
    • /
    • 2019
  • Dead fuel moisture content is a key variable in fire danger rating, as it affects fire ignition and behavior. This study evaluates simple regression models estimating the moisture content of a standardized 10-h fuel stick (10-h FMC) at three sites with different characteristics (urban, and outside/inside the forest). Equilibrium moisture content (EMC) was used as the independent variable, and in-situ measured 10-h FMC was used as the dependent variable and validation data. 10-h FMC spatial distribution maps were created for the dates with the most frequent fire occurrence during 2013-2018. The 10-h FMC values on those dates were also analyzed to investigate under which 10-h FMC conditions forest fire is likely to occur. As a result, the fitted equations explained a considerable part of the variance in 10-h FMC (62~78%). Against the validation data, the models performed well, with R² ranging from 0.53 to 0.68, root mean squared error (RMSE) from 2.52% to 3.43%, and bias from -0.41% to 1.10%. When the 10-h FMC model fitted for one site was applied to the other sites, R² remained the same, while RMSE and bias increased to as much as 5.13% and 3.68%, respectively. The major deficiency of the 10-h FMC model was that it poorly captured the difference between 10-h FMC and EMC in the drying process after rainfall. From the analysis of 10-h FMC on the dates fires occurred, more than 70% of the fires occurred under a 10-h FMC condition of less than 10.5%. Overall, the present study suggests a simple model estimating 10-h FMC with acceptable performance. Applying the 10-h FMC model to the automatic mountain meteorology observation system was successfully tested to produce a national-scale 10-h FMC spatial distribution map. These data will provide fundamental information for forest fire research and will support policy makers.
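
The paper fits simple regression models of 10-h FMC against EMC and reports R², RMSE, and bias; the sketch below shows how such a fit and its error statistics might be computed, using made-up EMC/FMC values rather than the AMOS data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score, mean_squared_error

# Hypothetical data: EMC (%) computed from weather data, and measured 10-h FMC (%)
emc = np.array([8.2, 10.5, 12.1, 15.3, 18.0, 21.4, 24.9]).reshape(-1, 1)
fmc_10h = np.array([9.0, 10.8, 12.5, 14.9, 17.2, 20.1, 23.5])

model = LinearRegression().fit(emc, fmc_10h)
pred = model.predict(emc)

r2 = r2_score(fmc_10h, pred)
rmse = mean_squared_error(fmc_10h, pred) ** 0.5
bias = np.mean(pred - fmc_10h)
print(f"slope={model.coef_[0]:.3f}, intercept={model.intercept_:.3f}")
print(f"R2={r2:.2f}, RMSE={rmse:.2f}%, bias={bias:.2f}%")
```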

A Case Study: Improvement of Wind Risk Prediction by Reclassifying the Detection Results (풍해 예측 결과 재분류를 통한 위험 감지확률의 개선 연구)

  • Kim, Soo-ock;Hwang, Kyu-Hong
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.23 no.3
    • /
    • pp.149-155
    • /
    • 2021
  • Early warning systems for weather risk management in the agricultural sector have been developed to predict potential wind damage to crops. These systems compare the daily maximum wind speed with the critical wind speed that causes fruit drop and provide the weather risk information to farmers. In an effort to increase the accuracy of wind risk predictions, an artificial neural network for binary classification was implemented. In the present study, the daily wind speed and other weather data measured in 2019 at weather stations at sites of interest in Jeollabuk-do and Jeollanam-do, as well as Gyeongsangbuk-do and part of Gyeongsangnam-do provinces, were used for training the neural network. These weather stations comprise 210 synoptic and automated weather stations operated by the Korea Meteorological Administration (KMA). The wind speed data collected at the same locations between January 1 and December 12, 2020 were used to validate the neural network model, and the data collected from December 13, 2020 to February 18, 2021 were used to evaluate the wind risk prediction performance before and after the use of the artificial neural network. The critical wind speed of damage risk was set to 11 m/s, the wind speed reported to cause fruit drop and damage. Furthermore, the maximum wind speeds were expressed using a Weibull probability density function for wind damage warning. The accuracy of wind damage risk prediction improved from 65.36% to 93.62% after reclassification using the artificial neural network; however, the error rate also increased, from 13.46% to 37.64%. The machine learning approach used in the present study is likely to benefit cases in which a missed prediction by the risk warning system is a relatively serious issue.
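
The abstract mentions expressing daily maximum wind speeds with a Weibull probability density function and an 11 m/s damage threshold; the sketch below illustrates one plausible way to fit such a distribution and compute the exceedance probability, using synthetic wind data (the actual model details are not given in the abstract):

```python
from scipy.stats import weibull_min

# Hypothetical daily maximum wind speeds (m/s) at one station
max_wind = weibull_min.rvs(c=2.0, scale=6.0, size=365, random_state=42)

# Fit a Weibull distribution (location fixed at 0) to the observed maxima
shape, loc, scale = weibull_min.fit(max_wind, floc=0)

# Probability that the daily maximum exceeds the 11 m/s damage threshold
critical = 11.0
p_exceed = weibull_min.sf(critical, shape, loc=loc, scale=scale)
print(f"shape={shape:.2f}, scale={scale:.2f}, "
      f"P(max wind >= {critical} m/s) = {p_exceed:.3f}")
```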

Physical Characterization of Domestic Aggregate (국내 골재의 물리적 특성 분석)

  • Junyoung Ko;Eungyu Park;Junghae Choi;Jong-Tae Kim
    • The Journal of Engineering Geology
    • /
    • v.33 no.1
    • /
    • pp.169-187
    • /
    • 2023
  • Aggregates from 84 cities and counties in Korea were tested for quality to allow analysis of the physical characteristics of aggregates from river, land, and forest environments. River and land aggregates were analyzed for 18 test items, and forest aggregates for 12 test items; they were classified by watershed and by geology, respectively. The observed physical characteristics of the river aggregates by basin were as follows: aggregates from the Geum River basin were characterized by the amounts passing the 2.5, 1.2, 0.6, 0.3, 0.15, and 0.08 mm sieves; the Nakdong River basin by clay lumps; the Seomjin River basin by the amounts passing the 10, 5, and 2.5 mm sieves; the Youngsang River basin by the amounts passing the 1.2, 0.6, 0.3, 0.15, and 0.08 mm sieves; and the Han River basin by the amounts passing the 10, 5, 2.5, 1.2, 0.6, 0.3, and 0.08 mm sieves and by stability. The standard errors of the mean amounts passing the 10, 0.6, and 0.08 mm sieves and the performance rate showed distribution patterns different from those of the other physical characteristics. Analysis of variance found that 16 of the 18 items, excluding the absorption rate and the performance rate, had statistically significant differences in their averages by region. Considering land aggregates by basin (excluding the Geum River basin), those from the Nakdong River basin were characterized by clay lumps, those from the Seomjin River basin by the amounts passing the 10 and 5 mm sieves, those from the Youngsang River basin by the amount passing the 0.08 mm sieve, and those from the Han River basin by the amounts passing the 10, 0.6, and 0.08 mm sieves. The standard error of the mean of these quantities showed a distribution pattern different from that of the other physical characteristics. Analysis of variance found a statistically significant difference in the averages of all 18 items by region. Analysis of forest aggregates by geology showed that the distribution of porosity differed from those of the other physical characteristics for metamorphic rocks (but not igneous rocks), and that the distributions of wear rate and porosity differed for sedimentary rocks. There were statistically significant differences in the average volume mass, water absorption rate, wear rate, and Sc/Rc items by geology.
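
The study uses analysis of variance to test whether the mean of each physical test item differs by region; the following sketch shows a generic one-way ANOVA of one hypothetical test item (absorption rate) across three basins, together with standard errors of the mean, not the actual aggregate data:

```python
import numpy as np
from scipy import stats

# Hypothetical absorption-rate measurements (%) of river aggregates from three basins
geum    = np.array([1.2, 1.5, 1.1, 1.4, 1.3])
nakdong = np.array([1.8, 1.6, 1.9, 1.7, 2.0])
han     = np.array([1.0, 1.2, 0.9, 1.1, 1.3])

# One-way ANOVA: does the mean differ significantly by basin?
f_stat, p_value = stats.f_oneway(geum, nakdong, han)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# Standard error of the mean for each basin
for name, x in [("Geum", geum), ("Nakdong", nakdong), ("Han", han)]:
    print(name, "SEM =", round(stats.sem(x), 3))
```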

Performance Evaluation of Radiochromic Films and Dosimetry CheckTM for Patient-specific QA in Helical Tomotherapy (나선형 토모테라피 방사선치료의 환자별 품질관리를 위한 라디오크로믹 필름 및 Dosimetry CheckTM의 성능평가)

  • Park, Su Yeon;Chae, Moon Ki;Lim, Jun Teak;Kwon, Dong Yeol;Kim, Hak Joon;Chung, Eun Ah;Kim, Jong Sik
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.32
    • /
    • pp.93-109
    • /
    • 2020
  • Purpose: The radiochromic film (Gafchromic EBT3, Ashland Advanced Materials, USA) and the 3-dimensional analysis system Dosimetry CheckTM (DC, MathResolutions, USA) were evaluated for patient-specific quality assurance (QA) of helical tomotherapy. Materials and Methods: Depending on the tumor position, three types of targets, namely an abdominal tumor (130.6 ㎤), a retroperitoneal tumor (849.0 ㎤), and a whole abdominal metastasis tumor (3131.0 ㎤), were applied to a humanoid phantom (Anderson Rando Phantom, USA). We established a total of 12 comparative treatment plans under four geometric beam conditions: field widths (FW) of 2.5 cm and 5.0 cm, and pitches of 0.287 and 0.43. Ionization chamber (1D) and EBT3 (2D) measurements with the inserted cheese phantom were compared with DC measurements, in which the 3D dose is reconstructed on the CT images from the beam fluence log information. For the clinical feasibility evaluation of the DC, dose reconstruction was performed using the same cheese phantom as in the EBT3 method. The recalculated dose distributions were compared quantitatively with the treatment plan on the same CT images, revealing the dose errors during actual irradiation. The thread effect, which can appear in helical tomotherapy, was analyzed using the ripple amplitude (%). We also performed gamma index analysis (DD: 3%/DTA: 3 mm, pass threshold: 95%) to check the pattern of the dose distribution. Results: The ripple amplitude measurement gave the highest average, 23.1%, for the retroperitoneal tumor. In the radiochromic film analysis, the absolute dose difference was on average 0.9±0.4%, and the gamma index analysis averaged 96.4±2.2% (passing rate: >95%), although the film method could be limited for large targets such as the whole abdominal metastasis tumor. In the DC analysis with the humanoid phantom for the FW of 5.0 cm, the average over the three regions was 91.8±6.4% for the 2D and 3D plans. The three planes (axial, coronal, and sagittal) and the dose profile could be analyzed, together with the planned dose distributions, for the entire retroperitoneal tumor and the whole abdominal metastasis target. The dose errors based on the dose-volume histogram in the DC evaluations increased with FW and pitch. Conclusion: The DC method can perform dose error analysis on the 3D patient image data using only the measured beam fluence log information, without additional dosimetry tools, for patient-specific quality assurance. Also, it may have no limitation regarding tumor location and size; therefore, DC could be useful for patient-specific QA during helical tomotherapy of large and irregular tumors.
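
The abstract relies on a 3%/3 mm gamma index analysis with a 95% pass threshold; a simplified 1D global gamma computation is sketched below for illustration, with hypothetical dose profiles (clinical gamma analysis is normally done in 2D/3D with dedicated software):

```python
import numpy as np

def gamma_pass_rate_1d(dose_ref, dose_eval, positions, dd=0.03, dta=3.0, threshold=0.95):
    """Global 1D gamma analysis: dose difference dd (fraction), distance-to-agreement dta (mm)."""
    d_max = dose_ref.max()
    gammas = []
    for x_r, d_r in zip(positions, dose_ref):
        # Gamma at each reference point: minimum combined dose/distance metric
        dist = (positions - x_r) / dta
        diff = (dose_eval - d_r) / (dd * d_max)
        gammas.append(np.sqrt(dist ** 2 + diff ** 2).min())
    gammas = np.array(gammas)
    pass_rate = np.mean(gammas <= 1.0)
    return pass_rate, pass_rate >= threshold

# Hypothetical dose profiles (cGy) sampled every 1 mm
x = np.arange(0, 100, 1.0)
planned = 200 * np.exp(-((x - 50) / 20) ** 2)
measured = planned * 1.01          # 1% uniform over-response
rate, passed = gamma_pass_rate_1d(planned, measured, x)
print(f"gamma pass rate = {rate * 100:.1f}%, pass: {passed}")
```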

The Prediction of Export Credit Guarantee Accident using Machine Learning (기계학습을 이용한 수출신용보증 사고예측)

  • Cho, Jaeyoung;Joo, Jihwan;Han, Ingoo
    • Journal of Intelligence and Information Systems
    • /
    • v.27 no.1
    • /
    • pp.83-102
    • /
    • 2021
  • The government recently announced various policies for developing the big-data and artificial intelligence fields, providing a great opportunity for the public with respect to disclosure of high-quality data within public institutions. KSURE (Korea Trade Insurance Corporation) is a major public institution for financial policy in Korea, and the company is strongly committed to backing export companies with various systems. Nevertheless, there are still few cases of business models realized from big-data analyses. In this situation, this paper aims to develop a new business model which can be applied to ex-ante prediction of the likelihood of an insurance accident of credit guarantee. We utilize internal data from KSURE, which supports export companies in Korea, and apply machine learning models. We then conduct a performance comparison among the predictive models, including Logistic Regression, Random Forest, XGBoost, LightGBM, and DNN (Deep Neural Network). For decades, many researchers have tried to find better models for predicting bankruptcy, since ex-ante prediction is crucial for corporate managers, investors, creditors, and other stakeholders. The development of prediction models for financial distress or bankruptcy originated with Smith (1930), Fitzpatrick (1932), and Merwin (1942). One of the most famous models is Altman's Z-score model (Altman, 1968), which was based on multiple discriminant analysis and is still widely used in both research and practice; it uses five key financial ratios to predict the probability of bankruptcy in the next two years. Ohlson (1980) introduced the logit model to complement some limitations of previous models. Furthermore, Elmer and Borowski (1988) developed and examined a rule-based, automated system which conducts financial analysis of savings and loans. Since the 1980s, researchers in Korea have also examined the prediction of financial distress or bankruptcy. Kim (1987) analyzed financial ratios and developed a prediction model, and Han et al. (1995, 1996, 1997, 2003, 2005, 2006) constructed prediction models using various techniques including artificial neural networks. Yang (1996) introduced multiple discriminant analysis and the logit model, and Kim and Kim (2001) utilized artificial neural network techniques for ex-ante prediction of insolvent enterprises. Since then, many scholars have tried to predict financial distress or bankruptcy more precisely with diverse models such as Random Forest or SVM. One major distinction of our research from previous research is that we focus on examining the predicted probability of default for each sample case, not only on the classification accuracy of each model for the entire sample. Most predictive models in this paper show a classification accuracy of about 70% on the entire sample: the LightGBM model shows the highest accuracy of 71.1% and the Logit model the lowest accuracy of 69%. However, these results are open to multiple interpretations. In the business context, more emphasis should be put on minimizing type 2 error, which causes more harmful operating losses for the guaranty company. Thus, we also compare the classification accuracy by splitting the predicted probability of default into ten equal intervals.
When we examine the classification accuracy for each interval, the Logit model has the highest accuracy of 100% for the 0~10% interval of the predicted probability of default, but a relatively lower accuracy of 61.5% for the 90~100% interval. On the other hand, Random Forest, XGBoost, LightGBM, and DNN show more desirable results, since they have a higher level of accuracy for both the 0~10% and 90~100% intervals but a lower level of accuracy around 50% of the predicted probability of default. Regarding the distribution of samples across the predicted probability of default, both the LightGBM and XGBoost models place a relatively large number of samples in the 0~10% and 90~100% intervals. Although the Random Forest model has an advantage in classification accuracy for a small number of cases, LightGBM or XGBoost could be the more desirable model, since they classify a large number of cases into the two extreme intervals of the predicted probability of default, even allowing for their relatively low classification accuracy. Considering the importance of type 2 error and total prediction accuracy, XGBoost and DNN show superior performance, followed by Random Forest and LightGBM, while logistic regression shows the worst performance. However, each predictive model has a comparative advantage with respect to various evaluation standards; for instance, the Random Forest model shows almost 100% accuracy for samples expected to have a high probability of default. Collectively, we can construct more comprehensive ensemble models which contain multiple machine learning classifiers and conduct majority voting to maximize overall performance.
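
The interval-wise comparison described above splits the predicted probability of default into ten equal bins and reports classification accuracy per bin; the sketch below reproduces that bookkeeping with a stand-in classifier and synthetic data, since the KSURE records are not public:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical stand-in data; the paper uses internal KSURE guarantee records
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.7, 0.3], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
prob = model.predict_proba(X_te)[:, 1]          # predicted probability of default
pred = (prob >= 0.5).astype(int)

# Classification accuracy within each 10%-wide probability interval
bins = np.linspace(0.0, 1.0, 11)
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (prob >= lo) & (prob < hi) if hi < 1.0 else (prob >= lo) & (prob <= hi)
    if mask.sum() == 0:
        continue
    acc = np.mean(pred[mask] == y_te[mask])
    print(f"{lo:.0%}-{hi:.0%}: n={mask.sum():4d}, accuracy={acc:.1%}")
```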

A New Exploratory Research on Franchisor's Provision of Exclusive Territories (가맹본부의 배타적 영업지역보호에 대한 탐색적 연구)

  • Lim, Young-Kyun;Lee, Su-Dong;Kim, Ju-Young
    • Journal of Distribution Research
    • /
    • v.17 no.1
    • /
    • pp.37-63
    • /
    • 2012
  • In franchise business, exclusive sales territory (sometimes EST in tables) protection is a very important issue from economic, social, and political points of view. It affects the growth and survival of both franchisor and franchisee and often raises issues of social and political conflict. When franchisees are not familiar with the related laws and regulations, the franchisor is likely to take advantage of this. Exclusive sales territory protection by the manufacturer and distributors (wholesalers or retailers) means a sales area restriction by which only certain distributors have the right to sell products or services. A distributor that has been granted an exclusive sales territory can protect its own territory but may be prohibited from entering other regions. Even though exclusive sales territory is quite a critical problem in franchise business, there is not much rigorous research on its reasons, results, evaluation, and future direction based on empirical data. This paper tries to address the problem not only in terms of logical and nomological validity but also through empirical validation. In pursuing an empirical analysis, we take into account the difficulties of real data collection and of statistical analysis techniques. We use a set of disclosure document data collected by the Korea Fair Trade Commission, instead of the conventional survey method, which is often criticized for measurement error. Existing theories about exclusive sales territory can be summarized into two groups, as shown in the table below. The first concerns the effectiveness of exclusive sales territory from both the franchisor and franchisee points of view. In fact, the outcome of exclusive sales territory can be positive for franchisors but negative for franchisees, and it can be positive in terms of sales but negative in terms of profit; therefore, variables and viewpoints should be set properly. The second concerns the motives or reasons why exclusive sales territory is protected. The reasons can be classified into four groups: industry characteristics, franchise system characteristics, capability to maintain exclusive sales territory, and strategic decision. Within the four groups of reasons, there are more specific variables and theories, as below. Based on these theories, we develop nine hypotheses, which are briefly shown in the last table below with the results. In order to validate the hypotheses, data are collected from the government (FTC) homepage, which is open source. The sample consists of 1,896 franchisors and contains about three years of operating data, from 2006 to 2008. Of these, 627 have an exclusive sales territory protection policy, and franchisors with such a policy are not evenly distributed over the 19 representative industries. Additional data are also collected from other government agency homepages, such as Statistics Korea, and we combine data from various secondary sources to create meaningful variables, as shown in the table below. All variables are dichotomized by mean or median split if they are not inherently dichotomous, since each hypothesis is composed of multiple variables and there is no solid statistical technique to incorporate all these conditions when testing the hypotheses. This paper uses a simple chi-square test because the hypotheses and theories are built upon quite specific conditions such as industry type, economic condition, company history, and various strategic purposes.
It is almost impossible to find enough samples satisfying all of those conditions, and they cannot be manipulated in experimental settings. More advanced statistical techniques work well on clean data without exogenous variables, but not on real, complex data. The chi-square test is applied by grouping the samples into four cells according to two criteria: whether they use exclusive sales territory protection or not, and whether they satisfy the conditions of each hypothesis. The test then asks whether the proportion of sample franchisors that satisfy the conditions and protect exclusive sales territories significantly exceeds the proportion that satisfy the conditions but do not protect them. In fact, the chi-square test is equivalent to Poisson regression, which allows more flexible application. As a result, only three hypotheses are accepted. When the attitude toward risk is high, so that the royalty fee is determined according to sales performance, EST protection produces poor results, as expected. When the franchisor protects EST in order to recruit franchisees easily, EST protection produces better results. Also, when EST protection is intended to improve the efficiency of the franchise system as a whole, it shows better performance: high efficiency is achieved because EST prohibits free riding by franchisees who would exploit others' marketing efforts, encourages proper investment, and distributes franchisees evenly across regions. The other hypotheses are not supported by the significance testing. Exclusive sales territory should be protected for proper motives and administered for mutual benefit. Legal restrictions driven by a government agency like the FTC could be misused and cause misunderstandings, so more careful monitoring of real practices and more rigorous studies by both academics and practitioners are needed.
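
The chi-square test on the 2x2 grouping described above can be run as follows; the cell counts are hypothetical (chosen only to be consistent with the reported totals of 627 EST-protecting franchisors out of 1,896), not the study's actual cross-tabulation:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: rows = hypothesis condition satisfied (yes/no),
# columns = protects exclusive sales territory (yes/no)
table = np.array([[210,  140],     # condition satisfied
                  [417, 1129]])    # condition not satisfied

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
print("expected counts:\n", expected.round(1))
```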


Evaluation of the Usefulness of MapPHAN for the Verification of Volumetric Modulated Arc Therapy Planning (용적세기조절회전치료 치료계획 확인에 사용되는 MapPHAN의 유용성 평가)

  • Woo, Heon;Park, Jang Pil;Min, Jae Soon;Lee, Jae Hee;Yoo, Suk Hyun
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.25 no.2
    • /
    • pp.115-121
    • /
    • 2013
  • Purpose: With the introduction of the latest linear accelerator and new measurement equipment at our institution, we analyzed the process of verifying the usefulness of the equipment and the problems that arose while preparing it for clinical application, so that the results may help institutions introducing this equipment in the future. Materials and Methods: All measurements were performed on a TrueBEAM STX (Varian, USA), and dose distributions for each energy and irradiation condition were calculated with a treatment planning system (Eclipse ver 10.0.39, Varian, USA). The basic performance of MapCHECK 2 and the causes of measurement errors were analyzed and compared against measurements. To verify the performance of MapCHECK 2, 6X, 6X-FFF, 10X, 10X-FFF, and 15X beams with a field size of 10×10 cm were measured at gantry angles of 0° and 180° for each energy. To confirm the influence of the CT values of the IGRT couch on the measurements, CT numbers of -800 (carbon) & -950 (air inside the couch) and -100 & -950 were assigned, and 6X-FFF and 15X beams with a field size of 10×10 were measured at gantry angles of 180°, 135°, and 275°; the HU values assigned to MapPHAN were confirmed and compared using the treatment planning computer. The gantry-angle dependence of MapPHAN, including the measurement error caused by its sharp edges, was measured in three ways: with the device set up vertically at gantry 90° and 270° using 6X-FFF and 15X; with the device set up horizontally at gantry 90°, 45°, 315°, and 270° with a field size of 10×10 for each energy; and, third, with an open arc without intensity modulation. Results: The basic performance of MapCHECK 2, the attenuation measured for the couch, the HU values assigned to MapPHAN, and the calculation accuracy for the angled edges of MapPHAN were all within the valid measurement error range and did not affect the measurements. For the gantry-angle dependence, first, with the device set up vertically, measurements at gantry 270° (relative 0°) and 90° (relative 180°) were -1.51%, 0.83% for 6X-FFF and -0.63%, -0.22% for 15X, showing no effect in the AP/PA direction. With the device set horizontally on the couch, measurements at gantry 90° and 270° differed by 4.37%, 2.84% for 6X-FFF and -9.63%, -13.32% for 15X. Measurements of MapPHAN in the lateral direction were therefore not within the valid range, as confirmed by gamma values larger than the 3% criterion. For the open arc with 6X-FFF and 15X, a field size of 10×10 cm, and 360° rotation, the dose distribution showed a pass rate of nearly 90%. Conclusion: Based on the above results, the gantry-angle dependence of MapPHAN means that lateral beams are suitable for evaluating the relative dose distribution with gamma values, but accurate measurement of the absolute dose cannot be expected. To verify treatment plans more accurately and to reduce the tolerance for VMAT, in which lateral, rotational irradiation is delivered, accurate absolute isodose should be measured using MapCHECK 2 in combination with the IMF (Isocentric Mounting Fixture), which will minimize the impact of the angular dependence.
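
The gantry-angle dependence results above are reported as percent dose differences against a reference; a minimal sketch of that comparison and a simple tolerance check is shown below, with hypothetical point doses (the 3% level mirrors the gamma dose-difference criterion and is not a value stated for this check in the abstract):

```python
def percent_dose_difference(measured, planned):
    """Relative dose difference (%) of a measured point dose against the plan."""
    return (measured - planned) / planned * 100.0

# Hypothetical point doses (cGy): (measured, planned) for several gantry setups
checks = {
    "gantry 270 (AP)":            (197.0, 200.0),
    "gantry 90 (PA)":             (201.7, 200.0),
    "gantry 90, lateral setup":   (208.7, 200.0),
}

tolerance = 3.0  # % action level, analogous to the 3% gamma dose criterion
for name, (meas, plan) in checks.items():
    diff = percent_dose_difference(meas, plan)
    status = "within" if abs(diff) <= tolerance else "exceeds"
    print(f"{name}: {diff:+.2f}%  ({status} tolerance)")
```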


The Availability of the step optimization in Monaco Planning system (모나코 치료계획 시스템에서 단계적 최적화 조건 실현의 유용성)

  • Kim, Dae Sup
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.26 no.2
    • /
    • pp.207-216
    • /
    • 2014
  • Purpose : We present a method to reduce the gap that arises when a completed treatment plan is re-optimized under the same conditions as the initial plan in the Monaco treatment planning system. Materials and Methods : In the Monaco treatment planning system, the optimization for inverse planning of volumetric modulated arc therapy or intensity modulated radiation therapy is carried out in two steps. In this study, the initial plan was completed by performing the optimization in both steps without changing the optimization conditions from Step 1 to Step 2, i.e., a typical sequential optimization. The pencil beam and Monte Carlo algorithms were applied in Step 2. We compared the initial plan with a plan re-optimized under the same optimization conditions, and then evaluated the planned dose by measurement. When re-optimizing the initial plan, the second plan applied stepwise adjustment of the optimization conditions. Results : When the same optimization was carried out again under the same conditions as the completed initial plan, the result was not the same. In the comparison within the treatment planning system, the dose-volume histograms showed a similar trend, but the re-optimized plan showed different values that did not satisfy the optimized dose conditions, dose homogeneity, and dose limits. The dosimetric comparison also showed differences of more than 20%, and when a different dose algorithm was used the measurements likewise did not agree. Conclusion : The optimization process reaches the ultimate goal of treatment planning through repeated trial and error. If re-optimization is performed trusting only the completed initial plan, a different treatment plan may be produced, and such a similar-looking plan could not satisfy the original optimization results. When performing re-optimization, the stepwise optimization conditions should be applied while verifying the dose distribution throughout the optimization process.
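
The plan comparison above leans on dose-volume histograms and dose limits; the sketch below shows one way a cumulative DVH and a D95-style metric could be computed from a dose grid and a structure mask, using synthetic data rather than Monaco output:

```python
import numpy as np

def cumulative_dvh(dose, mask, bin_width=0.1):
    """Cumulative dose-volume histogram for the voxels inside a structure mask."""
    d = dose[mask]
    edges = np.arange(0.0, d.max() + bin_width, bin_width)
    # Fraction of structure volume receiving at least each dose level
    volume_fraction = np.array([(d >= e).mean() * 100.0 for e in edges])
    return edges, volume_fraction

# Hypothetical 3D dose grid (Gy) and a spherical target mask
rng = np.random.default_rng(1)
dose = rng.normal(60.0, 2.0, size=(40, 40, 40))
z, y, x = np.indices(dose.shape)
target = (x - 20) ** 2 + (y - 20) ** 2 + (z - 20) ** 2 <= 10 ** 2

dose_bins, vol = cumulative_dvh(dose, target)
d95 = dose_bins[vol >= 95.0][-1]   # highest dose level still covering 95% of the target
print(f"D95 ~= {d95:.1f} Gy")
```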

Evaluation of Factors Used in AAPM TG-43 Formalism Using Segmented Sources Integration Method and Monte Carlo Simulation: Implementation of microSelectron HDR Ir-192 Source (미소선원 적분법과 몬테칼로 방법을 이용한 AAPM TG-43 선량계산 인자 평가: microSelectron HDR Ir-192 선원에 대한 적용)

  • Ahn, Woo-Sang;Jang, Won-Woo;Park, Sung-Ho;Jung, Sang-Hoon;Cho, Woon-Kap;Kim, Young-Seok;Ahn, Seung-Do
    • Progress in Medical Physics
    • /
    • v.22 no.4
    • /
    • pp.190-197
    • /
    • 2011
  • Currently, the dose distribution calculation used by commercial treatment planning systems (TPSs) for high-dose-rate (HDR) brachytherapy is derived from the point- and line-source approximation methods recommended by AAPM Task Group 43 (TG-43). However, Monte Carlo (MC) simulation is required to assess the accuracy of the dose calculation around a three-dimensional Ir-192 source. In this study, the geometry factor was calculated using a segmented-source integration method, dividing the microSelectron HDR Ir-192 source into smaller parts. The Monte Carlo code (MCNPX 2.5.0) was used to calculate the dose rate $\dot{D}(r,\theta)$ at a point ($r,\theta$) away from the HDR Ir-192 source in a spherical water phantom of 30 cm diameter. Finally, the anisotropy function and radial dose function were calculated from the obtained results. The obtained geometry factor was compared with that calculated from the line-source approximation, and the anisotropy function and radial dose function were compared with those derived from the MCPT results by Williamson. The geometry factors calculated from the segmented-source integration method and the line-source approximation agreed within 0.2% for r ≥ 0.5 cm and within 1.33% for r = 0.1 cm. The relative root mean square error (R-RMSE) between the anisotropy functions obtained in this study and by Williamson was 2.33% for r = 0.25 cm and within 1% for r > 0.5 cm. The R-RMSE of the radial dose function was 0.46% at radial distances from 0.1 to 14.0 cm. The geometry factors from the segmented-source integration method and the line-source approximation were thus in good agreement for r ≥ 0.1 cm. Nevertheless, the segmented-source integration method seems preferable, since it uses the three-dimensional Ir-192 source and therefore provides a more realistic geometry factor. The anisotropy function and radial dose function estimated from MCNPX in this study and from MCPT by Williamson are in good agreement, within the uncertainty of the Monte Carlo codes, except at the radial distance of r = 0.25 cm. The Monte Carlo code used in this study is expected to be applicable to other sources utilized for brachytherapy.
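
The TG-43 geometry factor comparison described above can be illustrated numerically: the sketch below evaluates the line-source approximation G_L(r, θ) = β/(L·r·sinθ) and a segmented-source estimate (averaging inverse-square distances over source segments) for an assumed active length of 0.36 cm; the actual source geometry and integration scheme of the paper may differ:

```python
import numpy as np

def geometry_factor_line(r, theta, L=0.36):
    """TG-43 line-source approximation G_L(r, theta); L = assumed active length in cm."""
    x, z = r * np.sin(theta), r * np.cos(theta)
    if np.isclose(np.sin(theta), 0.0):
        return 1.0 / (r ** 2 - (L / 2) ** 2)          # on-axis limit
    # beta: angle subtended by the source ends at the calculation point
    beta = abs(np.arctan2(L / 2 - z, x) - np.arctan2(-L / 2 - z, x))
    return beta / (L * r * np.sin(theta))

def geometry_factor_segmented(r, theta, L=0.36, n=1000):
    """Segmented-source integration: average inverse-square distance over n segments."""
    x, z = r * np.sin(theta), r * np.cos(theta)
    seg_z = (np.arange(n) + 0.5) / n * L - L / 2        # segment centres along the axis
    d2 = x ** 2 + (z - seg_z) ** 2
    return np.mean(1.0 / d2)

r, theta = 1.0, np.radians(90.0)
print(geometry_factor_line(r, theta), geometry_factor_segmented(r, theta))
```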