• Title/Abstract/Keyword: Histogram analysis

Search Results: 490

Comparative Study Between Respiratory Gated Conventional 2-D Plan and 3-D Conformal Plan for Predicting Radiation Hepatitis (간암에서 호흡주기를 고려한 2-차원 방사선 치료 방법과 3-차원 입체조형 치료방법에서 방사선 간염 예측의 비교연구)

  • Lee Sang-wook;Kim Gwi Eon;Chung Kap Soo;Lee Chang Geol;Seong Jinsil;Suh Chang Ok
    • Radiation Oncology Journal
    • /
    • v.16 no.4
    • /
    • pp.455-467
    • /
    • 1998
  • Purpose: To evaluate the influences on radiation treatment planning when plans are obtained with the patient breathing freely. Materials and Methods: We compared the reduction or elimination of planning target volume (PTV) margins in respiratory gated conventional 2-D plans with the inclusion of breathing-related PTV margins in 3-D conformal therapy. Respiratory non-gated 3-D conformal treatment plans were compared with respiratory gated conventional 2-D plans in 4 patients with hepatocellular carcinoma. Isodose distributions, dose statistics, and dose volume histograms (DVHs) of the PTVs were used to evaluate differences between the two planning approaches. In addition, the risk of radiation exposure to the surrounding normal liver and organs was evaluated by means of DVHs and normal tissue complication probabilities (NTCPs). Results: The vertical movement of the liver ranged from 2 to 3 cm in all patients. We found no difference between respiratory gated 2-D plans and 3-D conformal treatment plans with the patients breathing freely. DVH analysis of the PTV and the normal liver was used for treatment planning in all patients. DVHs and calculated NTCPs showed no difference between respiratory gated 2-D plans and respiratory non-gated 3-D conformal treatment plans. Conclusion: Respiratory gated radiation therapy is very important in hepatic tumors because radiation-induced hepatitis depends on the remaining normal liver volume. Further investigation of respiratory gated radiation therapy is needed.
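As a rough illustration of the dose volume histogram and NTCP quantities compared in this study, the following Python sketch computes a cumulative DVH from a voxel dose array and a Lyman-Kutcher-Burman style NTCP via the generalized equivalent uniform dose. The parameter values (td50, m, n) and the toy dose distribution are illustrative assumptions, not values from the paper.

```python
import numpy as np
from math import erf, sqrt

def cumulative_dvh(dose, bin_width=0.5):
    """Cumulative DVH: fraction of the structure's volume receiving at least
    each dose level (equal voxel volumes assumed)."""
    levels = np.arange(0.0, dose.max() + bin_width, bin_width)
    volume_fraction = np.array([(dose >= d).mean() for d in levels])
    return levels, volume_fraction

def lyman_ntcp(dose, td50=40.0, m=0.15, n=0.32):
    """Lyman-Kutcher-Burman NTCP from a voxel dose array via the generalized
    equivalent uniform dose.  The td50/m/n values here are placeholders."""
    geud = np.mean(dose ** (1.0 / n)) ** n
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))      # standard normal CDF

# toy usage: a random dose distribution over 10,000 equal-volume liver voxels
dose = np.random.default_rng(0).uniform(0.0, 50.0, 10_000)   # Gy
levels, vf = cumulative_dvh(dose)
print(f"V30Gy = {vf[np.searchsorted(levels, 30.0)]:.1%}")
print(f"NTCP  = {lyman_ntcp(dose):.1%}")
```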


Data Mining Algorithm Based on Fuzzy Decision Tree for Pattern Classification (퍼지 결정트리를 이용한 패턴분류를 위한 데이터 마이닝 알고리즘)

  • Lee, Jung-Geun;Kim, Myeong-Won
    • Journal of KIISE:Software and Applications
    • /
    • v.26 no.11
    • /
    • pp.1314-1323
    • /
    • 1999
  • With the extended use of computers, it has become easy to generate and collect data, and there is a corresponding need to acquire useful knowledge from data automatically. In data mining, the acquired knowledge needs to be both accurate and comprehensible. In this paper, we propose an efficient fuzzy rule generation algorithm based on a fuzzy decision tree for data mining. The fuzzy decision tree combines the comprehensibility of rules generated by decision trees such as ID3 and C4.5 with the expressive power and inference capability of fuzzy theory. In particular, fuzzy rules can efficiently classify patterns whose decision boundaries are not parallel to the attribute axes, which is difficult for methods that place decision boundaries parallel to the attribute axes. In the proposed algorithm, we first determine an appropriate set of membership functions for each attribute of the data using histogram analysis. Given this set of membership functions, we then construct a fuzzy decision tree in a manner similar to ID3 and C4.5, and a genetic algorithm is used to tune the membership functions. Experiments on benchmark data sets, including the IRIS data, the Wisconsin breast cancer data, and the credit screening data, show that the proposed method is more efficient than other methods, including C4.5, in both performance and comprehensibility of rules.
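The abstract's first step, deriving membership functions for each attribute from its histogram, might look roughly like the sketch below. Treating each histogram peak as the apex of a triangular membership function is an illustrative assumption, not necessarily the authors' construction.

```python
import numpy as np
from scipy.signal import find_peaks

def histogram_membership_functions(values, bins=20):
    """Derive triangular fuzzy membership functions from histogram peaks.

    Each local maximum of the attribute histogram becomes the apex of a
    triangular membership function whose support reaches the neighbouring
    apexes (or the data range at the ends)."""
    counts, edges = np.histogram(values, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    peak_idx, _ = find_peaks(counts)
    if len(peak_idx) == 0:                      # monotone histogram: single set
        peak_idx = [int(np.argmax(counts))]
    apexes = centers[peak_idx]
    supports = np.concatenate(([values.min()], apexes, [values.max()]))
    return [(supports[k], apexes[k], supports[k + 2]) for k in range(len(apexes))]

def triangular_membership(x, a, b, c):
    """Degree of membership of x in the triangular fuzzy set (a, b, c)."""
    if x == b:
        return 1.0
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# toy usage on one bimodal attribute
values = np.random.default_rng(0).normal(loc=[0, 5], scale=1.0, size=(500, 2)).ravel()
for a, b, c in histogram_membership_functions(values):
    print(f"fuzzy set: a={a:.2f}, b={b:.2f}, c={c:.2f}")
```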

Development of Recognition Application of Facial Expression for Laughter Theraphy on Smartphone (스마트폰에서 웃음 치료를 위한 표정인식 애플리케이션 개발)

  • Kang, Sun-Kyung;Li, Yu-Jie;Song, Won-Chang;Kim, Young-Un;Jung, Sung-Tae
    • Journal of Korea Multimedia Society
    • /
    • v.14 no.4
    • /
    • pp.494-503
    • /
    • 2011
  • In this paper, we propose a facial expression recognition application for laughter therapy on smartphones. It detects the face region from the front camera image of a smartphone using the AdaBoost face detection algorithm and then detects the lip region within the detected face. From the next frame onward, it does not re-detect the face but tracks the lip region detected in the previous frame using a three-step block matching algorithm. Because the size of the detected lip image varies with the distance between the camera and the user, the lip image is scaled to a fixed size. The effect of illumination variation is then minimized by applying bilateral symmetry and histogram matching for illumination normalization. After that, lip eigenvectors are computed using PCA (Principal Component Analysis), and the laughter expression is recognized with a multilayer perceptron neural network. The experimental results show that the proposed method processes 16.7 frames/s and that the proposed illumination normalization reduces illumination variation better than existing methods, yielding better recognition performance.
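A minimal OpenCV/NumPy sketch of two of the steps described here, AdaBoost (Viola-Jones) face detection and histogram-matching illumination normalization, is given below. The lower-third lip heuristic, the 64x32 patch size, and the reference patch are illustrative assumptions rather than the authors' exact pipeline (which also includes three-step block matching, bilateral symmetry, PCA, and an MLP).

```python
import cv2
import numpy as np

def match_histogram(src, ref):
    """Map the gray levels of src so that its histogram matches that of ref
    (both 8-bit grayscale patches), using the two cumulative histograms."""
    src_hist, _ = np.histogram(src.ravel(), bins=256, range=(0, 256))
    ref_hist, _ = np.histogram(ref.ravel(), bins=256, range=(0, 256))
    src_cdf = np.cumsum(src_hist) / src.size
    ref_cdf = np.cumsum(ref_hist) / ref.size
    lut = np.interp(src_cdf, ref_cdf, np.arange(256)).astype(np.uint8)
    return lut[src]

def detect_lip_patch(frame_bgr, cascade, patch_size=(64, 32)):
    """Detect the largest face with the AdaBoost (Viola-Jones) cascade and
    return the lower third of the face box, rescaled to a fixed size, as a
    rough lip patch."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    lip = gray[y + 2 * h // 3 : y + h, x : x + w]
    return cv2.resize(lip, patch_size)

# usage: lip = detect_lip_patch(frame, cascade); norm = match_histogram(lip, reference_lip)
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
```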

Development of Landslide Detection Algorithm Using Fully Polarimetric ALOS-2 SAR Data (Fully-Polarimetric ALOS-2 자료를 이용한 산사태 탐지 알고리즘 개발)

  • Kim, Minhwa;Cho, KeunHoo;Park, Sang-Eun;Cho, Jae-Hyoung;Moon, Hyoi;Han, Seung-hoon
    • Economic and Environmental Geology
    • /
    • v.52 no.4
    • /
    • pp.313-322
    • /
    • 2019
  • SAR (Synthetic Aperture Radar) remote sensing data are a very useful tool for near-real-time identification of landslide-affected areas that can occur over a large region due to heavy rain or typhoons. This study aims to develop an effective algorithm for automatically delineating landslide areas from polarimetric SAR data acquired after a landslide event. To detect landslides from SAR observations, reducing speckle effects in the estimation of polarimetric SAR parameters and orthorectifying geometric distortions on sloping terrain are essential processing steps. Experimental analysis showed that the IDAN filter provides a better estimate of the polarimetric parameters and that it is appropriate to apply orthorectification after estimating the polarimetric parameters in the slant range domain. Furthermore, polarimetric entropy was found to be the most suitable of the various polarimetric parameters. Based on these analyses, we propose an automatic landslide detection algorithm that applies histogram thresholding to the polarimetric parameters with the aid of terrain slope information. The algorithm was applied to ALOS-2 PALSAR-2 data covering landslide areas in Japan triggered by a typhoon in September 2011. Experimental results showed that the landslide areas were successfully identified, with a detection rate of about 82% and a false alarm rate of about 3%.
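The final detection step, histogram thresholding of a polarimetric parameter combined with terrain slope information, could be prototyped along the lines below. Otsu's method, the slope limits, and the assumption that landslide scars show lower entropy than the surrounding forest are illustrative choices, not necessarily those of the paper.

```python
import numpy as np
from skimage.filters import threshold_otsu

def detect_landslides(entropy, slope_deg, min_slope=15.0, max_slope=45.0):
    """Flag landslide candidates from a polarimetric entropy image (values in
    [0, 1]) and a co-registered terrain slope map (degrees).

    Pixels on slopes outside [min_slope, max_slope] are excluded, then Otsu's
    histogram threshold is computed over the remaining pixels; freshly exposed
    landslide scars are assumed to scatter more deterministically than forest,
    i.e. to have lower entropy."""
    candidate = (slope_deg >= min_slope) & (slope_deg <= max_slope)
    thr = threshold_otsu(entropy[candidate])
    return candidate & (entropy < thr), thr

# toy usage with random rasters standing in for real data
rng = np.random.default_rng(0)
h = rng.uniform(0.0, 1.0, size=(512, 512))       # polarimetric entropy
slope = rng.uniform(0.0, 60.0, size=(512, 512))  # terrain slope in degrees
mask, thr = detect_landslides(h, slope)
print(f"threshold = {thr:.3f}, flagged pixels = {mask.sum()}")
```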

The Effect of MLC Leaf Motion Constraints on Plan Quality and Delivery Accuracy in VMAT (체적조절호형방사선치료 시 갠트리 회전과 다엽콜리메이터의 이동 속도에 따른 선량분포 평가)

  • Kim, Yon-Lae;Chung, Jin-Beom;Lee, Jeong-woo;Shin, Young-Joo;Kang, Dong-Jin;Jung, Jae-Yong
    • Journal of radiological science and technology
    • /
    • v.42 no.3
    • /
    • pp.217-222
    • /
    • 2019
  • The purpose of this study is to evaluate the dose distribution according to gantry rotation and MLC leaf speed on a treatment planning system (TPS) and linear accelerator. A dose analyzer phantom (Delta4) was scanned with a CT simulator for treatment planning. The planning target volumes (PTVs) of the prostate and pancreas were prescribed 6,500 cGy and 5,000 cGy, respectively, in VMAT (Volumetric Modulated Arc Therapy) plans generated on the TPS while the MLC speed was varied. The analyzer phantom was then irradiated on the linear accelerator using the planned parameters. The dose distributions of the PTVs were evaluated with the homogeneity index, the conformity index, and the dose volume histograms of the organs at risk (rectum, bladder, spinal cord, kidneys), and the delivered dose was evaluated for dose distribution and conformity with the gamma index. The PTV dose for the pancreas was 4,993 cGy at a leaf speed of 0.1 cm/deg of gantry rotation, which was closest to the prescribed dose (5,000 cGy); the doses to the spinal cord, left kidney, and right kidney were lowest at 0.1, 1.5, and 0.3 cm/deg, respectively. The PTV dose for the prostate was 6,466 cGy at 0.1 cm/deg, which was closest to the prescribed dose (6,500 cGy); the doses to the bladder and rectum were lowest at 0.3 and 2.0 cm/deg, respectively. In the gamma index analysis, the pancreas and prostate plans showed the lowest errors, with passing rates of 100% at 0.8 and 1.0 cm/deg and 99.6% at 0.3 and 0.5 cm/deg, respectively. The optimal leaf speed relative to gantry rotation should therefore be used when treatment is delivered with VMAT.
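For reference, the two plan-quality metrics named here are commonly computed as in the sketch below (an ICRU-83 style homogeneity index and an RTOG style conformity index). The exact definitions used in the paper are not stated in the abstract, so these are assumptions, and the synthetic dose grid is purely illustrative.

```python
import numpy as np

def homogeneity_index(ptv_dose):
    """ICRU-83 style homogeneity index: (D2% - D98%) / D50% over the PTV,
    where Dx% is the dose received by at least x% of the PTV volume."""
    d2, d50, d98 = np.percentile(ptv_dose, [98, 50, 2])
    return (d2 - d98) / d50

def conformity_index(dose, ptv_mask, prescription):
    """RTOG conformity index: volume enclosed by the prescription isodose
    divided by the PTV volume (voxel counts stand in for absolute volumes)."""
    v_ri = np.count_nonzero(dose >= prescription)
    return v_ri / np.count_nonzero(ptv_mask)

# toy usage on a synthetic dose grid
rng = np.random.default_rng(0)
dose = rng.normal(6500.0, 150.0, size=(40, 40, 40))   # cGy
ptv = np.zeros_like(dose, dtype=bool)
ptv[10:30, 10:30, 10:30] = True
print(f"HI = {homogeneity_index(dose[ptv]):.3f}")
print(f"CI = {conformity_index(dose, ptv, 6500.0):.2f}")
```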

Phase Segmentation of PVA Fiber-Reinforced Cementitious Composites Using U-net Deep Learning Approach (U-net 딥러닝 기법을 활용한 PVA 섬유 보강 시멘트 복합체의 섬유 분리)

  • Jeewoo Suh;Tong-Seok Han
    • Journal of the Computational Structural Engineering Institute of Korea
    • /
    • v.36 no.5
    • /
    • pp.323-330
    • /
    • 2023
  • The development of an analysis model that reflects the microstructure characteristics of polyvinyl alcohol (PVA) fiber-reinforced cementitious composites, which have a highly complex microstructure, enables synergy between efficient material design and real experiments. PVA fiber orientation is an important factor that influences the mechanical behavior of PVA fiber-reinforced cementitious composites. Owing to the difficulty of distinguishing the gray level of PVA fibers in micro-CT images from that of adjacent phases, fiber segmentation is time-consuming. In this study, a micro-CT test with a voxel size of 0.65 ㎛³ was performed to investigate the three-dimensional distribution of fibers. To segment the fibers and generate training data, histogram-, morphology-, and gradient-based phase-segmentation methods were used. A U-net model was proposed to segment fibers from micro-CT images of PVA fiber-reinforced cementitious composites. Data augmentation was applied to increase the accuracy of the training, using a total of 1,024 images as training data. The performance of the model was evaluated using accuracy, precision, recall, and F1 score. The trained model achieved high fiber segmentation performance and efficiency, and the approach can be applied to other specimens as well.
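A compact PyTorch sketch of a U-Net style encoder-decoder for binary fiber/matrix segmentation of grayscale CT slices is shown below. The two-level depth, channel widths, and 128x128 input size are illustrative and much smaller than a typical published U-net configuration.

```python
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.BatchNorm2d(c_out), nn.ReLU(inplace=True),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.BatchNorm2d(c_out), nn.ReLU(inplace=True))

class SmallUNet(nn.Module):
    """Two-level U-Net for binary fiber/matrix segmentation of grayscale
    CT slices (input height and width must be divisible by 4)."""
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(1, 16)
        self.enc2 = conv_block(16, 32)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(32, 64)
        self.up2 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec2 = conv_block(64, 32)
        self.up1 = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec1 = conv_block(32, 16)
        self.head = nn.Conv2d(16, 1, 1)   # logits; apply sigmoid for fiber probability

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))  # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        return self.head(d1)

model = SmallUNet()
logits = model(torch.randn(1, 1, 128, 128))   # e.g. one 128x128 CT patch
```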

Mechanical Properties Evaluation of 3D Printing Recycled Concrete utilizing Wasted Shell Aggregate (패각 잔골재를 활용한 3D 프린팅 자원순환 콘크리트의 역학적 성능 평가)

  • Jeewoo Suh;Ju-Hyeon Park;Tong-Seok Han
    • Journal of the Computational Structural Engineering Institute of Korea
    • /
    • v.37 no.1
    • /
    • pp.33-40
    • /
    • 2024
  • The volume of shells, a prominent form of marine waste, is steadily increasing each year. However, a significant portion of these shells is either discarded or left near coastlines, posing environmental and social concerns. Utilizing shells as a substitute for traditional aggregates presents a potential solution, especially considering the diminishing availability of natural aggregates. This approach could effectively reduce transportation logistics costs, thereby promoting resource recycling. In this study, we explore the feasibility of employing wasted shell aggregates in 3D concrete printing technology for marine structures. Despite the advantages, it is observed that 3D printing concrete with wasted shells as aggregates results in lower strength compared to ordinary concrete, attributed to pores at the interface of shells and cement paste. Microstructure characterization becomes essential for evaluating mechanical properties. We conduct an analysis of the mechanical properties and microstructure of 3D printing concrete specimens incorporating wasted shells. Additionally, a mix design is proposed, taking into account flowability, extrudability, and buildability. To assess mechanical properties, compression and bonding strength specimens are fabricated using a 3D printer, and subsequent strength tests are conducted. Microstructure characteristics are analyzed through scanning electron microscope tests, providing high-resolution images. A histogram-based segmentation method is applied to segment pores, and porosity is compared based on the type of wasted shell. Pore characteristics are quantified using a probability function, establishing a correlation between the mechanical properties and microstructure characteristics of the specimens according to the type of wasted shell.
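The histogram-based pore segmentation and porosity comparison described here might be prototyped as below. Otsu thresholding and the assumption that pores appear darker than the cement matrix in SEM images are illustrative stand-ins, not necessarily the segmentation method used in the study.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def pore_statistics(sem_gray):
    """Histogram-based pore segmentation of a grayscale SEM image.

    Pixels darker than the Otsu threshold are treated as the pore phase;
    porosity is the pore pixel fraction and the returned sizes are the areas
    (in pixels) of the connected pore regions."""
    thr = threshold_otsu(sem_gray)
    pores = sem_gray < thr
    porosity = pores.mean()
    pore_sizes = np.array([r.area for r in regionprops(label(pores))])
    return porosity, pore_sizes

# toy usage, e.g. one image per waste-shell type for comparison
img = np.random.default_rng(0).integers(0, 256, size=(512, 512), dtype=np.uint8)
phi, sizes = pore_statistics(img)
print(f"porosity = {phi:.2%}, mean pore size = {sizes.mean():.1f} px")
```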

Rough Set Analysis for Stock Market Timing (러프집합분석을 이용한 매매시점 결정)

  • Huh, Jin-Nyung;Kim, Kyoung-Jae;Han, In-Goo
    • Journal of Intelligence and Information Systems
    • /
    • v.16 no.3
    • /
    • pp.77-97
    • /
    • 2010
  • Market timing is an investment strategy used to obtain excess return from the financial market. In general, market timing means determining when to buy and sell in order to obtain excess return from trading. In many market timing systems, trading rules have been used as an engine to generate trading signals. On the other hand, some researchers have proposed rough set analysis as a proper tool for market timing because, by using its control function, it does not generate a trading signal when the market pattern is uncertain. Numeric data for rough set analysis must be discretized because rough sets only accept categorical data. Discretization searches for proper "cuts" in the numeric data that determine intervals, and all values that lie within an interval are transformed into the same value. In general, there are four methods for data discretization in rough set analysis: equal frequency scaling, expert's knowledge-based discretization, minimum entropy scaling, and naïve and Boolean reasoning-based discretization. Equal frequency scaling fixes the number of intervals, examines the histogram of each variable, and then determines cuts so that approximately the same number of samples falls into each interval. Expert's knowledge-based discretization determines cuts according to the knowledge of domain experts, gathered through literature review or interviews. Minimum entropy scaling recursively partitions the value set of each variable so that a local measure of entropy is optimized. Naïve and Boolean reasoning-based discretization finds categorical values by naïve scaling of the data and then finds optimized discretization thresholds through Boolean reasoning. Although rough set analysis is promising for market timing, there is little research on how the various data discretization methods affect trading performance. In this study, we compare stock market timing models using rough set analysis with various data discretization methods. The research data are the KOSPI 200 from May 1996 to October 1998. The KOSPI 200 is the underlying index of the KOSPI 200 futures, the first derivative instrument in the Korean stock market; it is a market-value-weighted index consisting of 200 stocks selected by criteria on liquidity and status in the corresponding industry, including manufacturing, construction, communication, electricity and gas, distribution and services, and finance. The total number of samples is 660 trading days. In addition, this study uses popular technical indicators as independent variables. The experimental results show that the most profitable method for the training sample is naïve and Boolean reasoning, but expert's knowledge-based discretization is the most profitable method for the validation sample and produced robust performance for both the training and validation samples. We also compared rough set analysis and a decision tree, using C4.5 for the comparison. The results show that rough set analysis with expert's knowledge-based discretization produced more profitable rules than C4.5.
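Of the four discretization methods compared, equal frequency scaling is the most mechanical; a minimal NumPy sketch is given below, with the number of intervals and the toy indicator series chosen arbitrarily for illustration.

```python
import numpy as np

def equal_frequency_cuts(values, n_intervals=4):
    """Equal frequency scaling: place cuts at quantile boundaries so that
    roughly the same number of samples falls into each interval."""
    quantiles = np.linspace(0, 100, n_intervals + 1)[1:-1]
    return np.percentile(values, quantiles)

def discretize(values, cuts):
    """Map each numeric value to the index (0 .. len(cuts)) of its interval."""
    return np.searchsorted(cuts, values, side="right")

# toy usage: 660 values of one technical indicator split into 4 categories
x = np.random.default_rng(1).normal(size=660)
cuts = equal_frequency_cuts(x, n_intervals=4)
print("cuts:", np.round(cuts, 3))
print("samples per interval:", np.bincount(discretize(x, cuts)))
```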

Image Analysis of Electrophoresis Gels by using Region Growing with Multiple Peaks (다중 피크의 영역 성장 기법에 의한 전기영동 젤의 영상 분석)

  • 김영원;전병환
    • Journal of KIISE:Software and Applications
    • /
    • v.30 no.5_6
    • /
    • pp.444-453
    • /
    • 2003
  • Recently, great interest has been concentrated on bio-technology (BT), and image analysis techniques for electrophoresis gels are in high demand for analyzing genetic information or searching for new bio-active materials. For this purpose, the location and quantity of each band in a lane should be measured. Most existing techniques search for peaks in the profile of a lane, but such a peak is an improper representative of a band because its location corresponds neither to the brightest pixel nor to the center of gravity. These approaches are also improper for measuring band quantity because various enhancement processes are commonly applied to the original images to make peaks easier to extract. In this paper, we measure the accumulated brightness of each band region as the band quantity, extracting the region without any process that changes relative brightness, and we take the center of gravity of the region as the band location. We first extract lanes with an entropy-based threshold calculated from the gel-image histogram, and then propose and apply three methods to extract bands. In the MER method, peaks and valleys are searched along a vertical line that bisects each lane, and the minimum enclosing rectangle of each band is set between successive valleys. In the RG-1 method, each band is extracted by region growing with a peak as a seed, separating overlapping neighbor bands. In the RG-2 method, peaks and valleys are searched along two vertical lines that trisect each lane; the left and right peaks may be paired up if they appear to belong to the same band, and each band region is then grown from one or both peaks. To compare the three methods, we measured the location and amount of the bands. The average errors in band location of MER, RG-1, and RG-2 were 6%, 3%, and 1%, respectively, when the lane length is normalized to a unit value, and the average errors in band amount were 8%, 5%, and 2%, respectively, when the sum of band amounts is normalized to a unit value. In conclusion, RG-2 was shown to be more reliable in measuring the location and amount of bands.
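The lane-extraction step relies on an entropy-based threshold computed from the gel-image histogram. The sketch below implements Kapur's maximum-entropy thresholding as one plausible reading of that step; the paper's exact entropy criterion is not spelled out in the abstract, so treat this as an assumption.

```python
import numpy as np

def kapur_threshold(gray_uint8):
    """Kapur's maximum-entropy threshold computed from an 8-bit image histogram.

    The gray levels are split at every candidate threshold t, and the sum of
    the entropies of the two normalized sub-histograms is maximized over t."""
    hist, _ = np.histogram(gray_uint8.ravel(), bins=256, range=(0, 256))
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 256):
        pb, pf = p[:t].sum(), p[t:].sum()
        if pb == 0.0 or pf == 0.0:
            continue
        b, f = p[:t] / pb, p[t:] / pf
        hb = -np.sum(b[b > 0] * np.log(b[b > 0]))
        hf = -np.sum(f[f > 0] * np.log(f[f > 0]))
        if hb + hf > best_h:
            best_h, best_t = hb + hf, t
    return best_t

# toy usage: a random image standing in for a gel scan
gel = np.random.default_rng(0).integers(0, 256, size=(256, 256), dtype=np.uint8)
print("entropy-based threshold:", kapur_threshold(gel))
```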

Assessment of Fire-Damaged Mortar using Color image Analysis (색도 이미지 분석을 이용한 화재 피해 모르타르의 손상 평가)

  • Park, Kwang-Min;Lee, Byung-Do;Yoo, Sung-Hun;Ham, Nam-Hyuk;Roh, Young-Sook
    • Journal of the Korea institute for structural maintenance and inspection
    • /
    • v.23 no.3
    • /
    • pp.83-91
    • /
    • 2019
  • The purpose of this study is to assess fire-damaged concrete structures using a digital camera and image processing software. To simulate fire damage, mortar and paste samples with W/C = 0.5 (normal strength) and 0.3 (high strength) were heated in an electric furnace from 100 °C to 1000 °C. The paste was processed into a powder to measure CIELAB chromaticity, the samples were photographed with a digital camera, and the RGB chromaticity was measured with color intensity analyzer software. At a heating temperature of 400 °C, the residual compressive strength of the W/C = 0.5 and 0.3 samples was 87.2% and 86.7%, respectively. However, there was a sudden decrease in strength above 500 °C, where the residual compressive strength of W/C = 0.5 and 0.3 fell to 55.2% and 51.9%. At 700 °C or higher, W/C = 0.5 and W/C = 0.3 retained only 26.3% and 27.8% of their strength, so the durability of the structure could not be secured. The L*a*b* color analysis shows that b* increases rapidly after 700 °C, indicating that the intensity of yellow becomes strong beyond that temperature. Furthermore, the RGB analysis found that the histogram kurtosis and frequency of red and green increase after 700 °C, i.e., the number of red and green pixels increases. Therefore, it is deemed possible to estimate the degree of fire damage by checking the change in yellow (b* or R+G) when analyzing the chromaticity of fire-damaged concrete structures.
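The two chromaticity indicators discussed here (CIELAB b* and the red/green content of the RGB histogram) can be extracted from a photo roughly as follows. scikit-image and SciPy are assumed for the color conversion and kurtosis, and the code is an illustration rather than the study's color intensity analyzer software.

```python
import numpy as np
from scipy.stats import kurtosis
from skimage.color import rgb2lab

def fire_damage_indicators(rgb_image):
    """Chromaticity indicators from an RGB photo of a mortar surface.

    Returns the mean CIELAB b* (yellowness), the mean R+G intensity, and the
    kurtosis of the red and green channel distributions, the quantities the
    abstract reports to change sharply above 700 degrees C."""
    lab = rgb2lab(rgb_image)                     # expects RGB order, uint8 or float
    b_star = lab[..., 2].mean()
    r = rgb_image[..., 0].astype(float)
    g = rgb_image[..., 1].astype(float)
    return {
        "mean_b_star": b_star,
        "mean_r_plus_g": (r + g).mean(),
        "kurtosis_r": kurtosis(r.ravel()),
        "kurtosis_g": kurtosis(g.ravel()),
    }

# toy usage with a random image standing in for a photograph
img = np.random.default_rng(0).integers(0, 256, size=(480, 640, 3), dtype=np.uint8)
print(fire_damage_indicators(img))
```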