• Title/Summary/Keyword: Software Tool


3D Histology Using the Synchrotron Radiation Propagation Phase Contrast Cryo-microCT (방사광 전파위상대조 동결미세단층촬영법을 활용한 3차원 조직학)

  • Kim, Ju-Heon;Han, Sung-Mi;Song, Hyun-Ouk;Seo, Youn-Kyung;Moon, Young-Suk;Kim, Hong-Tae
    • Anatomy & Biological Anthropology
    • /
    • v.31 no.4
    • /
    • pp.133-142
    • /
    • 2018
  • 3D histology is an imaging approach for obtaining 3D structural information about cells or tissues. Synchrotron radiation propagation phase contrast micro-CT has been used as a 3D imaging method. However, simple phase contrast micro-CT does not give sufficient micro-structural information when the specimen contains soft elements, as is the case with many biomedical tissue samples. The purpose of this study was to develop a new technique to enhance the phase contrast effect for soft tissue imaging. Experiments were performed at the imaging beamlines of the Pohang Accelerator Laboratory (PAL). Biomedical tissue samples in the frozen state were mounted on a computer-controlled precision stage and rotated through 180° in 0.18° increments. The X-ray shadow of the specimen was converted into a visual image on the surface of a CdWO4 scintillator, magnified using a microscope objective lens (5x or 20x), and captured with a digital CCD camera. Three-dimensional volume images of the specimen were obtained by applying a filtered back-projection algorithm to the projection images using the software package OCTOPUS. Surface reconstruction, volume segmentation, and rendering were performed using Amira software. We found that synchrotron phase contrast imaging of frozen tissue samples yields higher soft-tissue contrast than imaging of non-frozen samples. In conclusion, synchrotron radiation propagation phase contrast cryo-microCT imaging offers a promising tool for non-destructive, high-resolution 3D histology.
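
The reconstruction step named in this abstract, filtered back-projection, rests on filtering each projection with a ramp (Ram-Lak) filter before smearing it back across the image. The abstract's actual reconstruction was done in OCTOPUS; the following is only a minimal NumPy sketch of that filtering step, not the authors' code:

```python
import numpy as np

def ramp_filter(projection):
    """Apply the Ram-Lak (ramp) filter to one parallel-beam projection.

    This is the filtering step of filtered back-projection: each projection
    is multiplied by |frequency| in the Fourier domain before being
    back-projected across the reconstruction grid.
    """
    n = len(projection)
    freqs = np.fft.fftfreq(n)      # cycles/sample
    ramp = np.abs(freqs)           # |f| response of the Ram-Lak filter
    return np.real(np.fft.ifft(np.fft.fft(projection) * ramp))

# A flat (constant) projection carries only a DC term, which the ramp
# filter suppresses entirely, so the filtered result is all zeros.
flat = np.ones(64)
filtered = ramp_filter(flat)
```

Because the ramp filter zeroes the DC component, every filtered projection has zero mean; this is what removes the low-frequency blur that plain (unfiltered) back-projection would produce.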

A Study on Differences of Contents and Tones of Arguments among Newspapers Using Text Mining Analysis (텍스트 마이닝을 활용한 신문사에 따른 내용 및 논조 차이점 분석)

  • Kam, Miah;Song, Min
    • Journal of Intelligence and Information Systems
    • /
    • v.18 no.3
    • /
    • pp.53-77
    • /
    • 2012
  • This study analyzes the differences in content and tone of argument among three major Korean newspapers: the Kyunghyang Shinmun, the Hankyoreh, and the Dong-A Ilbo. It is commonly accepted that newspapers in Korea explicitly deliver their own tone of argument when covering sensitive issues and topics. This can be problematic if readers consume the news without being aware of a paper's tone, because both content and tone of argument can easily influence readers. It is therefore desirable to have a tool that can inform readers of the tone of argument a newspaper takes. This study presents the results of clustering and classification techniques as part of a text mining analysis. We focus on six main sections (Culture, Politics, International, Editorial-opinion, Eco-business, and National issues) and attempt to identify differences and similarities among the newspapers. The basic unit of analysis is a paragraph of a news article. This study uses a keyword-network analysis tool and visualizes relationships among keywords to make the differences easier to see. Newspaper articles were gathered from KINDS, the Korean Integrated News Database System, which preserves articles of the Kyunghyang Shinmun, the Hankyoreh, and the Dong-A Ilbo and is open to the public. About 3,030 articles from 2008 to 2012 were used. The International, National issues, and Politics sections were gathered around specific issues: the International section with the keyword 'Nuclear weapon of North Korea,' the National issues section with the keyword '4-major-river,' and the Politics section with the keyword 'Tonghap-Jinbo Dang.' All articles from April 2012 to May 2012 in the Eco-business, Culture, and Editorial-opinion sections were also collected.
All of the collected data were edited into paragraphs. We removed stop-words using the Lucene Korean Module. We calculated keyword co-occurrence counts from the paired co-occurrence list of keywords in each paragraph and built a co-occurrence matrix from the list. Once the co-occurrence matrix was built, we used the cosine coefficient matrix as input for PFNet (Pathfinder Network). To analyze the three newspapers and find the significant keywords in each, we examined the 10 highest-frequency keywords and the keyword networks of the 20 highest-frequency keywords, closely examining their relationships and showing a detailed network map among keywords. We used NodeXL software to visualize the PFNets. After drawing all the networks, we compared the results with the classification results. Classification was first performed to identify how the tone of argument of each newspaper differs from the others. To analyze tones of argument, all paragraphs were divided into two types, positive and negative. To classify the tones of all collected paragraphs and articles, a supervised learning technique was used: the Naïve Bayes classifier provided in the MALLET package. After classification, precision, recall, and F-value were used to evaluate the results. Based on the results of this study, three sections (Culture, Eco-business, and Politics) showed differences in content and tone of argument among the three newspapers. In addition, for the National issues section, the tones of argument on the 4-major-rivers project differed from each other. The three newspapers appear to have their own specific tones of argument in those sections, and the keyword networks showed different shapes for the same period in the same section.
This means that the keywords appearing frequently in the articles differ, and that their contents are composed of different keywords. The positive-negative classification also showed that newspapers' tones of argument can be distinguished from one another. These results indicate that the approach in this study is promising as a new tool for identifying the differing tones of argument of newspapers.
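
The cosine coefficient matrix that this study feeds into PFNet is computed pairwise over keyword co-occurrence vectors. A minimal sketch in Python (the keyword rows and counts below are made up for illustration; the study built its matrix from Korean news paragraphs):

```python
import math

def cosine(u, v):
    """Cosine coefficient between two keyword co-occurrence vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Toy co-occurrence counts: each row is one keyword's co-occurrence
# profile against the same four context keywords.
cooc = {
    "river":   [4, 1, 0, 2],
    "project": [3, 1, 0, 2],
    "culture": [0, 0, 5, 1],
}
sim = {(a, b): cosine(cooc[a], cooc[b]) for a in cooc for b in cooc}
```

Keywords with similar co-occurrence profiles ("river" and "project" here) get a coefficient near 1, which is what lets PFNet prune the network down to the strongest keyword links.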

The Precision Test Based on States of Bone Mineral Density (골밀도 상태에 따른 검사자의 재현성 평가)

  • Yoo, Jae-Sook;Kim, Eun-Hye;Kim, Ho-Seong;Shin, Sang-Ki;Cho, Si-Man
    • The Korean Journal of Nuclear Medicine Technology
    • /
    • v.13 no.1
    • /
    • pp.67-72
    • /
    • 2009
  • Purpose: The ISCD (International Society for Clinical Densitometry) requires users to perform a precision test to improve their quality, although it gives no recommendation on patient selection for the test. We therefore investigated the effect of bone density status on the precision test by measuring the reproducibility of three bone density groups (normal, osteopenia, osteoporosis). Materials and Methods: Four users performed precision tests with 420 patients (age: 57.8±9.02) undergoing BMD measurement at Asan Medical Center (January 2008 ~ June 2008). In the first group (A), the four users each selected 30 patients regardless of bone density status and measured two sites (L-spine, femur) twice. In the second group (B), the four users each measured the bone density of 10 patients in the same manner as group A, but with the patients divided into three stages (normal, osteopenia, osteoporosis). In the third group (C), two users each measured 30 patients in the same manner as group A, taking bone density status into account. We used a GE Lunar Prodigy Advance (Encore V11.4) and analyzed the results by comparing the %CV to the LSC using the precision tool from the ISCD. Verification was done using SPSS. Results: In group A, the %CV values calculated by the four users (a, b, c, d) were 1.16, 1.01, 1.19, and 0.65 in the L-spine and 0.69, 0.58, 0.97, and 0.47 in the femur. In group B, the %CV values were 1.01, 1.19, 0.83, and 1.37 in the L-spine and 1.03, 0.54, 0.69, and 0.58 in the femur. Comparing the results of groups A and B, we found no considerable differences. In group C, user 1's %CV values for normal, osteopenia, and osteoporosis were 1.26, 0.94, and 0.94 in the L-spine and 0.94, 0.79, and 1.01 in the femur; user 2's %CV values were 0.97, 0.83, and 0.72 in the L-spine and 0.65, 0.65, and 1.05 in the femur. Analyzing these results, we found almost no difference in reproducibility, although differences among several of the two users' values affected total reproducibility.
Conclusions: The precision test is an important factor in bone density follow-up. When machine and user reproducibility improve, the narrower range of deviation is clinically useful. Users have to check the machine's reproducibility before the test and maintain the same approach when doing BMD tests on patients. In a precision test, differences in measured values usually arise from ROI changes caused by patient positioning. In osteoporosis patients, it is more difficult to place the initial ROI accurately than in normal and osteopenia patients, owing to poorer bone recognition, even though the ROI is drawn automatically by the software. The initial ROI is nevertheless very important, and users must draw consistent ROIs because the ROI Copy function is used at follow-up. In this study, we performed the precision test taking bone density status into account and found that the LSC value stayed within 3 %, with no considerable difference between groups. Thus, patients can be selected for the precision test regardless of bone density status.
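
The %CV-versus-LSC comparison used in this study follows the standard ISCD precision arithmetic: a root-mean-square precision error over duplicate scans, expressed as a percentage of the mean, with LSC = 2.77 × precision error at 95% confidence. A minimal sketch (the BMD pairs below are invented example values, not the study's data):

```python
import math

def precision_cv_and_lsc(pairs):
    """ISCD-style short-term precision from duplicate BMD scans.

    pairs: one (scan1, scan2) tuple of BMD values (g/cm^2) per patient.
    Returns (%CV, LSC), with LSC = 2.77 x precision error (95% confidence).
    """
    m = len(pairs)
    # Root-mean-square standard deviation over the duplicate pairs.
    rms_sd = math.sqrt(sum((a - b) ** 2 / 2 for a, b in pairs) / m)
    grand_mean = sum(a + b for a, b in pairs) / (2 * m)
    cv = 100 * rms_sd / grand_mean
    return cv, 2.77 * cv

# Three hypothetical patients, each scanned twice at the same site.
cv, lsc = precision_cv_and_lsc([(1.00, 1.02), (0.80, 0.80), (1.20, 1.17)])
```

A measured change at follow-up is only considered real if it exceeds the LSC, which is why the study checks whether each user's %CV keeps the LSC within an acceptable bound.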


Visualizing the Results of Opinion Mining from Social Media Contents: Case Study of a Noodle Company (소셜미디어 콘텐츠의 오피니언 마이닝결과 시각화: N라면 사례 분석 연구)

  • Kim, Yoosin;Kwon, Do Young;Jeong, Seung Ryul
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.4
    • /
    • pp.89-105
    • /
    • 2014
  • After the emergence of the Internet, social media with highly interactive Web 2.0 applications has provided a very user-friendly means for consumers and companies to communicate with each other. Users routinely publish content expressing their opinions and interests in social media such as blogs, forums, chat rooms, and discussion boards, and this content is released in real time on the Internet. For that reason, many researchers and marketers regard social media content as a source of information for business analytics, and many studies have reported results on mining business intelligence from it. In particular, opinion mining and sentiment analysis, techniques for extracting, classifying, understanding, and assessing the opinions implicit in text, are frequently applied to social media content analysis because they emphasize determining sentiment polarity and extracting authors' opinions. A number of frameworks, methods, techniques, and tools have been presented by these researchers. However, we found weaknesses in these methods, which are often technically complicated and not sufficiently user-friendly for supporting business decisions and planning. In this study, we attempted to formulate a more comprehensive and practical approach to opinion mining with visual deliverables. First, we describe the entire cycle of practical opinion mining on social media content, from the initial data-gathering stage to the final presentation. Our proposed approach consists of four phases: collecting, qualifying, analyzing, and visualizing. In the first phase, analysts choose the target social media; each target requires a different means of access, such as open APIs, search tools, DB-to-DB interfaces, or purchased content. The second phase is pre-processing to generate useful material for meaningful analysis.
Unless garbage data are removed, the results of social media analysis will not provide meaningful and useful business insights; to clean social media data, natural language processing techniques should be applied. The next step is the opinion mining phase, in which the cleansed social media content set is analyzed. The qualified data set includes not only user-generated content but also content identification information such as creation date, author name, user id, content id, hit counts, review or reply, favorites, etc. Depending on the purpose of the analysis, researchers or data analysts can select a suitable mining tool: topic extraction and buzz analysis are usually related to market trend analysis, while sentiment analysis is utilized for reputation analysis. There are also various applications, such as stock prediction, product recommendation, and sales forecasting. The last phase is visualization and presentation of the analysis results. The major purpose of this phase is to explain the results and help users comprehend their meaning; therefore, to the extent possible, deliverables from this phase should be simple, clear, and easy to understand rather than complex and flashy. To illustrate our approach, we conducted a case study on a leading Korean instant noodle company, NS Food, which holds 66.5% of the market share and has kept the No. 1 position in the Korean "Ramen" business for several decades. We collected a total of 11,869 pieces of content, including blogs, forum posts, and news articles. After collecting the social media content, we generated instant-noodle-business-specific language resources for data manipulation and analysis using natural language processing. In addition, we classified the content into more detailed categories such as marketing features, environment, and reputation.
In this phase, we used free software such as the TM, KoNLP, ggplot2, and plyr packages of the R project. As a result, we present several useful visualization outputs, such as domain-specific lexicons, volume and sentiment graphs, topic word clouds, heat maps, valence tree maps, and other visualized images, providing vivid, full-color examples built with open-source R packages. Business actors can detect at a glance which areas are weak, strong, positive, negative, quiet, or loud. A heat map can show the movement of sentiment or volume in a category-by-time matrix through the density of color across time periods. The valence tree map, one of the most comprehensive and holistic visualization models, should be very helpful for analysts and decision makers who need to quickly understand the "big picture" of a business situation, since a tree map can present buzz volume and sentiment in a hierarchical structure for a given period. This case study offers real-world business insights from market sensing, demonstrating to practically minded business users how such results can support timely decision making in response to ongoing changes in the market. We believe our approach provides a practical and reliable guide to opinion mining with visualized results that are immediately useful, not just in the food industry but in other industries as well.
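
The heat-map and tree-map inputs described in this abstract reduce to a category-by-period matrix of buzz volume and net sentiment. A minimal sketch of that aggregation (in Python rather than the R packages the study used, with invented post labels for illustration):

```python
from collections import defaultdict

# Toy labelled content: (category, month, sentiment) per post. In the
# study these labels came from opinion mining over 11,869 real posts.
posts = [
    ("taste", "2013-04", "pos"), ("taste", "2013-04", "neg"),
    ("taste", "2013-05", "pos"), ("price", "2013-04", "neg"),
    ("price", "2013-05", "neg"), ("price", "2013-05", "pos"),
]

volume = defaultdict(int)   # buzz volume per (category, month) cell
net = defaultdict(int)      # net sentiment per (category, month) cell
for cat, month, sent in posts:
    volume[(cat, month)] += 1
    net[(cat, month)] += 1 if sent == "pos" else -1
```

Each (category, month) cell then drives one tile of the heat map: color density from `volume`, hue from the sign of `net`.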

Open Digital Textbook for Smart Education (스마트교육을 위한 오픈 디지털교과서)

  • Koo, Young-Il;Park, Choong-Shik
    • Journal of Intelligence and Information Systems
    • /
    • v.19 no.2
    • /
    • pp.177-189
    • /
    • 2013
  • In smart education, digital textbooks play a very important role as the face-to-face medium for learners. Standardization of digital textbooks will promote their industrialization for content providers and distributors as well as for learners and instructors. In this study, we look for ways to standardize digital textbooks around the following three objectives. (1) Digital textbooks should serve as media for blended learning supporting both on- and off-line classes, should operate on a common EPUB viewer without a special dedicated viewer, and should utilize the existing e-learning frameworks for learning content and learning management. The reason to consider EPUB as the standard for digital textbooks is that no separate standard for the book format needs to be specified, and the industrial base of EPUB-standard content and distribution structures can be leveraged. (2) Digital textbooks should provide a low-cost open-market service based on currently available standard open software. (3) To provide appropriate learning feedback to students, digital textbooks should provide a foundation that accumulates and manages all learning activity information on a standard infrastructure for educational big data processing. In this study, a digital textbook in a smart education environment is referred to as an open digital textbook.
The components of the open digital textbook service framework are (1) digital textbook terminals such as smart pads, smart TVs, smart phones, and PCs; (2) a digital textbook platform to display and run digital content on those terminals; (3) a learning content repository in the cloud that maintains accredited learning content; (4) an app store through which learning content companies provide and distribute secondary learning content and learning tools; and (5) an LMS as the learning support/management tool that classroom teachers use to create instructional materials. In addition, locating all of the hardware and software that implement the smart education service within the cloud takes advantage of cloud computing for efficient management and reduced expense. The open digital textbook of smart education can be considered an e-book-style interface to the LMS for learners. In an open digital textbook, the representation of text, images, audio, video, equations, etc. is a basic function, but painting, writing, problem solving, etc. are beyond the capabilities of a simple e-book. Teacher-to-student, learner-to-learner, and team-to-team communication is also required through the open digital textbook. To represent student demographics, portfolio information, and class information, the standards used in e-learning are desirable. To pass learner tracking information about the learner's activities to the LMS (Learning Management System), the open digital textbook must have a recording function and a function for communicating with the LMS. DRM is a function for protecting copyrights. Currently, the DRM of an e-book is controlled by the corresponding book viewer; if the open digital textbook adopts the DRM schemes already used by the various e-book viewers, the implementation of redundant features can be avoided.
Security/privacy functions are required to protect information about study or instruction from third parties. UDL (Universal Design for Learning) is a learning support function for those with disabilities who have difficulty with learning courses. The open digital textbook, based on the e-book standard EPUB 3.0, must (1) record learning activity log information and (2) communicate with the server to support learning activities. While these recording and communication functions, which current standards do not specify, can be implemented in JavaScript and used in current EPUB 3.0 viewers, a strategy of proposing them for the next generation of the e-book standard, or as a special standard (EPUB 3.0 for education), is needed. Future research in this study will implement an open-source program following the proposed open digital textbook standard and present new educational services including big data analysis.
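
The recording-and-communication function this abstract calls for amounts to serializing each learning activity as a structured record and sending it to the LMS. The field names below are illustrative assumptions only (the paper specifies no schema, and an actual EPUB 3.0 textbook would do this in viewer-side JavaScript); this Python sketch just shows the shape of such a record:

```python
import json
import time

def make_activity_record(learner_id, textbook_id, verb, obj, score=None):
    """Build one learning-activity log entry for an open digital textbook.

    All field names here are hypothetical, not a published standard;
    the source only requires that activity be recorded and sent to the LMS.
    """
    record = {
        "learner": learner_id,
        "textbook": textbook_id,
        "verb": verb,                 # e.g. "answered", "highlighted"
        "object": obj,                # e.g. a problem or page identifier
        "timestamp": int(time.time()),
    }
    if score is not None:
        record["score"] = score
    return json.dumps(record)

payload = make_activity_record("s001", "math-7-1", "answered", "problem-3",
                               score=0.8)
```

In a deployment, `payload` would be POSTed to the LMS endpoint; accumulating such records is what enables the big-data learning feedback the paper proposes.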

Linkage Map and Quantitative Trait Loci (QTL) on Pig Chromosome 6 (돼지 염색체 6번의 연관지도 및 양적형질 유전자좌위 탐색)

  • Lee, H.Y.;Choi, B.H.;Kim, T.H.;Park, E.W.;Yoon, D.H.;Lee, H.K.;Jeon, G.J.;Cheong, I.C.;Hong, K.C.
    • Journal of Animal Science and Technology
    • /
    • v.45 no.6
    • /
    • pp.939-948
    • /
    • 2003
  • The objective of this study was to identify quantitative trait loci (QTL) for economically important traits such as growth, carcass, and meat quality on pig chromosome 6. A three-generation resource population was constructed from a cross between Korean native boars and Landrace sows. A total of 240 F2 animals were produced by intercrossing 10 boars and 31 sows of the F1 generation. Phenotypic data, including body weight at 3 weeks, backfat thickness, muscle pH, shear force, and crude protein level, were collected from the F2 animals. Animals of all generations (F0 grandparents, F1 parents, and F2 offspring) were genotyped for 29 microsatellite markers and a PCR-RFLP marker on chromosome 6. Linkage analysis was performed with CRI-MAP software version 2.4 (Green et al., 1990) using the FIXED option to obtain map distances. The total length of the SSC6 linkage map estimated in this study was 169.3 cM, with an average distance of 6.05 cM between adjacent markers. For QTL mapping, we used the F2 QTL Analysis Servlet of QTL Express, a web-based QTL mapping tool (http://qtl.cap.ed.ac.uk). Five QTLs were detected at the 5% chromosome-wide level on SSC6, for body weight at 3 weeks of age, shear force, meat pH at 24 hours after slaughter, backfat thickness, and crude protein level.
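
Map distances like the 169.3 cM total reported above are derived from recombination fractions between adjacent markers via a mapping function. A minimal sketch of the Haldane mapping function, one standard choice in linkage analysis (shown here for illustration; whether CRI-MAP used Haldane or another function is not stated in the abstract):

```python
import math

def haldane_cM(r):
    """Haldane map distance in centimorgans from a recombination fraction.

    d = -(1/2) ln(1 - 2r) Morgans, scaled by 100 to give cM. Valid for
    recombination fractions r in [0, 0.5).
    """
    if not 0 <= r < 0.5:
        raise ValueError("recombination fraction must be in [0, 0.5)")
    return -50.0 * math.log(1.0 - 2.0 * r)

# Small recombination fractions map almost linearly to centimorgans.
d = haldane_cM(0.10)
```

For example, a 10% recombination fraction corresponds to roughly 11.2 cM; summing such interval distances along the 29 markers is what yields a total map length of the kind reported for SSC6.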

Assessment of the Usefulness of the Machine Performance Check System, an Evaluation Tool for Daily Beam Output Determination (일간 빔 출력 확인을 위한 평가도구인 Machine Performance Check의 유용성 평가)

  • Lee, Sang Hyeon;Ahn, Woo Sang;Lee, Woo Seok;Choi, Jin Hyeok;Kim, Seon Yeon
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.29 no.2
    • /
    • pp.65-73
    • /
    • 2017
  • Purpose: Machine Performance Check (MPC) is self-checking software based on the Electronic Portal Imaging Device (EPID) that measures daily beam output without external equipment. The purpose of this study is to verify the usefulness of MPC by comparing and correlating its daily beam output with that of the QA Beamchecker PLUS. Materials and Methods: A linear accelerator (TrueBeam 2.5) was used to measure 10 energies, comprising photon beams (6, 10, and 15 MV and 6 and 10 MV-FFF) and electron beams (6, 9, 12, 16, and 20 MeV). A total of 80 cycles of data were obtained by measuring beam output before treatment over a five-month period. The Pearson correlation coefficient was used to evaluate the consistency of beam output between the MPC and the QA Beamchecker PLUS. In this study, a Pearson correlation coefficient of 0.8 or higher was taken as a very strong correlation; 0.6 to 0.79 as strong; 0.4 to 0.59 as moderate; 0.2 to 0.39 as weak; and below 0.2 as very weak. Results: Output variations observed between the MPC and the QA Beamchecker PLUS were within 2 % for both photons and electrons. The beam output variations of the MPC were 0.29±0.26 % and 0.30±0.26 % for photon and electron beams, respectively; those of the QA Beamchecker PLUS were 0.31±0.24 % and 0.33±0.24 %. The Pearson correlation coefficient between the MPC and the QA Beamchecker PLUS was very strong at 15 MV and strong at 6 MV, 10 MV, 6 MV-FFF, and 10 MV-FFF for photon beams. For electron beams, the correlation was strong at 16 MeV and 20 MeV, moderate at 9 MeV and 12 MeV, and very weak at 6 MeV.
Conclusion: The MPC showed a significantly strong correlation with the QA Beamchecker PLUS for photon beams and high-energy electron beams in the evaluation of daily beam output, whereas the correlation for the low-energy electron beam (6 MeV) was low. Nevertheless, both the MPC and the QA Beamchecker PLUS are considered suitable for checking daily beam output, as both stayed within 2 % output consistency during the observation period. The MPC, which can be performed faster than the conventional daily beam output measurement tool, is considered an effective method for users.
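
The Pearson coefficient and the five-step strength scale this study applies can be sketched directly (the MPC/Beamchecker output lists below are invented example values, not the study's measurements):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def strength(r):
    """Map |r| onto the five-step scale used in the study."""
    a = abs(r)
    if a >= 0.8:
        return "very strong"
    if a >= 0.6:
        return "strong"
    if a >= 0.4:
        return "moderate"
    if a >= 0.2:
        return "weak"
    return "very weak"

# Hypothetical daily output deviations (%) from the two systems.
mpc = [0.1, 0.3, 0.2, 0.5, 0.4]
beamchecker = [0.12, 0.31, 0.18, 0.52, 0.41]
r = pearson(mpc, beamchecker)
```

Two systems tracking the same daily drift produce near-linear paired deviations and hence a coefficient near 1, which the scale labels "very strong", exactly the reading reported for the 15 MV photon beam.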


Correlation between Semiquantitative Myocardial Perfusion Score and Absolute Myocardial Blood Flow in ¹³N-Ammonia PET (¹³N-암모니아 PET에서 반정량적 심근관류 점수와 절대적 심근혈류량의 상관관계)

  • Lee, Byeong-Il;Kim, Kye-Hun;Kim, Jung-Young;Kim, Su-Jin;Lee, Jae-Sung;Min, Jung-Joon;Song, Ho-Chun;Bom, Hee-Seung
    • Nuclear Medicine and Molecular Imaging
    • /
    • v.41 no.3
    • /
    • pp.194-200
    • /
    • 2007
  • Purpose: ¹³N-ammonia is a well-known radiopharmaceutical for the non-invasive measurement of myocardial blood flow (MBF) using PET-CT. In this study, we investigated the correlation between MBF obtained from dynamic imaging and the myocardial perfusion score (MPS) obtained from static imaging to assess the usefulness of cardiac PET. Methods: Twelve patients (11 males, 1 female, 57.9±8.6 years old) with suspected coronary artery disease underwent PET-CT scans. Dynamic scans (6 min: 5 s × 12, 10 s × 6, 20 s × 3, and 30 s × 6) were initiated simultaneously with a bolus injection of 11 MBq/kg of ¹³N-ammonia to acquire rest and stress images. Gated images were acquired continuously for 13 minutes. A nine-segment model (4 basal walls, 4 mid walls, and the apex) was used for the measurement of MBF. Time-activity curves of the input function and the myocardium were extracted by ROI methods in the 9 regions for quantification. The MPS was evaluated using quantitative analysis software. To compare the 20-segment model with the 9-segment model, the 6 basal segments were excluded and averaged segmental scores were used. Results: There was a weak correlation between MBF (rest, 0.18-2.38 ml/min/g; stress, 0.40-4.95 ml/min/g) and MPS (rest, 22-91 %; stress, 14-90 %); however, the correlation coefficient between corrected MBF and MPS in the rest state was higher than in the stress state (rest r=0.59; stress r=0.80). As wall thickening increased, the correlation between MBF and MPS also improved in each segment. Conclusions: The corrected and translated MPS from ¹³N-ammonia showed good correlation with the absolute MBF measured by dynamic imaging in this study. We therefore showed that the MPS is a good index reflecting MBF, and we anticipate that PET-CT can be a useful tool for evaluating myocardial function in nuclear cardiology.

Upper Body Surface Change Analysis using 3-D Body Scanner (3차원 인체 측정기를 이용한 체표변화 분석)

  • Lee Jeongran;Ashdown Susan P.
    • Journal of the Korean Society of Clothing and Textiles
    • /
    • v.29 no.12 s.148
    • /
    • pp.1595-1607
    • /
    • 2005
  • Three-dimensional (3-D) body scanners used to capture anthropometric measurements are now becoming a common research tool for apparel. This study had two goals: to test the accuracy and reliability of 3-D measurements of dynamic postures, and to analyze the change in upper body surface measurements between the standard anthropometric position and various dynamic positions. A comparison of body surface measurements between two measuring methods (3-D scan measurements using virtual tools on the computer screen, and traditional manual measurements) for a standard anthropometric posture and a shoulder-flexion posture yielded differences of -2 to 20 mm. Girth items showed some disagreement between the two methods. None of the measurements differed significantly, except for the neckbase girth, for any of the measuring methods or postures. Scan measurements of the upper body showed significant linear surface change in the dynamic postures; shoulder length, interscye front and back, and biacromion length were the items most affected. Changes of linear body surface were very similar for the two measuring methods within the same posture. The repeatability of data taken from the 3-D scans using virtual tools was satisfactory: scan measurements repeated three times for the scapula protraction and scapula elevation postures were statistically the same for all measurement items. Measurements from automatic measuring software, which measured the 3-D scans with no manual intervention, were compared with the measurements using virtual tools; many measurements from the automatic program were larger and showed quite different values.

Development of Model Plans in Three Dimensional Conformal Radiotherapy for Brain Tumors (뇌종양 환자의 3차원 입체조형 치료를 위한 뇌내 주요 부위의 모델치료계획의 개발)

  • Pyo Hongryull;Lee Sanghoon;Kim GwiEon;Keum Kichang;Chang Sekyung;Suh Chang-Ok
    • Radiation Oncology Journal
    • /
    • v.20 no.1
    • /
    • pp.1-16
    • /
    • 2002
  • Purpose : Three-dimensional conformal radiotherapy planning is widely used for the treatment of patients with brain tumors. However, developing an optimal treatment plan takes considerable time, so it is difficult to apply this technique to every patient. To increase the efficiency of the technique, standard radiotherapy plans for each site of the brain are needed. We therefore developed several 3-dimensional conformal radiotherapy plans (3D plans) for tumors at each site of the brain, compared them with each other and with 2-dimensional radiotherapy plans, and finally selected model plans for each site. Materials and Methods : Imaginary tumors, with sizes commonly observed in the clinic, were designed for each site of the brain and drawn on CT images. The planning target volumes (PTVs) were as follows: temporal tumor, 5.7 × 8.2 × 7.6 cm; suprasellar tumor, 3 × 4 × 4.1 cm; thalamic tumor, 3.1 × 5.9 × 3.7 cm; frontoparietal tumor, 5.5 × 7 × 5.5 cm; and occipitoparietal tumor, 5 × 5.5 × 5 cm. Plans using parallel opposed 2 portals and/or 3 portals (fronto-vertex and 2 lateral fields) were developed manually as the conventional 2D plans, and 3D noncoplanar conformal plans were developed using beam's eye view and the automatic block drawing tool. The total tumor dose was 54 Gy for the suprasellar tumor, and 59.4 Gy and 72 Gy for the other tumors. All dose plans (including the 2D plans) were calculated using 3D planning software. The developed plans were compared using dose-volume histograms (DVH), normal tissue complication probabilities (NTCP), and various dose statistics (minimum, maximum, and mean dose, D5, V83, V85, and V95). Finally, the best radiotherapy plan for each site of the brain was selected.
Results : 1) Temporal tumor: the NTCPs and DVHs of normal tissue for all 3D plans were superior to those of the 2D plans, and this trend was more definite when the total dose was escalated to 72 Gy (NTCP of normal brain, 2D plans: 27 %, 8 %; 3D plans: 1 %, 1 %). The various dose statistics did not show any consistent trend. A 3D plan using 3 noncoplanar portals was selected as the model plan. 2) Suprasellar tumor: the NTCPs of the 3D and 2D plans did not differ significantly because the total dose for this tumor was only 54 Gy, but the DVHs of normal brain and brainstem differed significantly between plans. D5, V85, V95, and mean dose showed a consistent trend compatible with the DVH. All 3D plans were superior to the 2D plans, even when 3 portals (fronto-vertex and 2 lateral fields) were used for the 2D plans. A 3D plan using 7 portals was worse than plans using fewer portals; a 3D plan using 5 noncoplanar portals was selected as the model plan. 3) Thalamic tumor: the NTCPs of all 3D plans were lower than those of the 2D plans when the total dose was raised to 72 Gy, and the DVHs of normal tissues showed similar results. V83, V85, and V95 showed some consistent differences between plans, but not among the 3D plans. A 3D plan using 5 noncoplanar portals was selected as the model plan. 4) Parietal (fronto- and occipito-) tumors: all NTCPs of normal brain in the 3D plans were lower than in the 2D plans, and the DVHs showed the same result; V83, V85, and V95 showed trends consistent with NTCP and DVH. 3D plans using 5 portals for the frontoparietal tumor and 6 portals for the occipitoparietal tumor were selected as model plans. Conclusion : NTCP and DVH showed reasonable differences between plans and were thought to be useful for comparing plans. All 3D plans were superior to the 2D plans. The best 3D plan for each site of the brain was selected using NTCP and DVH, and finally by the planner's decision.
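
The DVH statistics compared above (V83, V85, V95 and the like) are points on a cumulative dose-volume histogram: the percentage of a structure's volume receiving at least a given dose. A minimal sketch of that computation over equal-volume voxels (the doses below are toy values, not the study's plans):

```python
def cumulative_dvh(voxel_doses, dose_levels):
    """Percent of structure volume receiving at least each dose level.

    voxel_doses: dose (Gy) in each equal-volume voxel of the structure.
    Returns one V_d value (%) per entry of dose_levels, i.e. points on a
    cumulative dose-volume histogram.
    """
    n = len(voxel_doses)
    return [100.0 * sum(1 for d in voxel_doses if d >= level) / n
            for level in dose_levels]

# Toy structure of 4 voxels receiving 10, 50, 55 and 60 Gy.
dvh = cumulative_dvh([10, 50, 55, 60], [0, 40, 54])
```

V95 of a 72 Gy prescription, for instance, would be `cumulative_dvh(doses, [0.95 * 72])[0]`; comparing such curves between plans is what let the authors rank the 2D and 3D candidates.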