• Title/Summary/Keyword: weighted histogram

Search results: 54

Automated Brain Region Extraction Method in Head MR Image Sets (머리 MR영상에서 자동화된 뇌영역 추출)

  • Cho, Dong-Uk;Kim, Tae-Woo;Shin, Seung-Soo
    • The Journal of the Korea Contents Association / v.2 no.3 / pp.1-15 / 2002
  • A novel automated brain region extraction method for single-channel MR images, intended for visualization and analysis of the human brain, is presented. The method generates a volume of brain masks by automatic thresholding using a dual curve fitting technique and by 3D morphological operations. The dual curve fitting can reduce the error in fitting a curve to the histogram of MR images. The 3D morphological operations, including erosion, labeling of connected components, a max-feature operation, and dilation, are applied to the cubic volume of masks reconstructed from the thresholded brain masks. The method can automatically extract a brain region from any displayed type of sequence, including extreme slices, of SPGR, T1-, T2-, and PD-weighted MR image data sets, which are not required to contain the entire brain. In the experiments, the algorithm was applied to 20 sets of MR images and achieved a similarity index of over 0.97 in comparison with manual drawing.

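The thresholding-plus-morphology pipeline the abstract describes can be sketched as follows. This is an illustrative SciPy sketch, not the authors' implementation: the `threshold` argument stands in for the dual-curve-fitting step, and the iteration counts are placeholders.

```python
import numpy as np
from scipy import ndimage

def extract_brain_mask(volume, threshold):
    """Sketch of the pipeline: threshold, erode, keep the largest
    connected component ("max-feature"), then dilate."""
    mask = volume > threshold  # stands in for dual-curve-fitting-based thresholding
    mask = ndimage.binary_erosion(mask, iterations=2)  # detach thin skull/scalp bridges
    labels, n = ndimage.label(mask)  # 3D connected-component labeling
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    mask = labels == (np.argmax(sizes) + 1)  # max-feature: largest component = brain
    mask = ndimage.binary_dilation(mask, iterations=2)  # recover the eroded boundary
    return mask
```

Erosion followed by dilation with the same structuring element approximately restores the brain's boundary while small disconnected structures removed in between stay removed.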

Object Tracking Algorithm Using Weighted Color Centroids Shifting (가중 컬러 중심 이동을 이용한 물체 추적 알고리즘)

  • Choi, Eun-Cheol;Lee, Suk-Ho;Kang, Moon-Gi
    • Journal of Broadcast Engineering / v.15 no.2 / pp.236-247 / 2010
  • Recently, mean shift tracking algorithms have been proposed that use color histogram information together with some spatial information provided by the kernel. In spite of their fast speed, these algorithms suffer from an inherent instability problem due to the use of an isotropic kernel for spatiality and of the Bhattacharyya coefficient as a similarity function. In this paper, we analyze how the kernel and the Bhattacharyya coefficient can give rise to the instability problem. Based on this analysis, we propose a novel tracking scheme that uses a new representation of the target location, constrained by the color, area, and spatiality information of the target in a more stable way than the mean shift algorithm. With this representation, target localization in the next frame is achieved by a one-step computation, which makes the tracking stable even in difficult situations such as low-frame-rate environments and partial occlusion.
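The idea of one-step localization from color-weighted centroids can be roughly illustrated as below. This is a sketch of the general technique, not the paper's exact representation or weighting: each candidate pixel votes for its position with a weight derived from the target's color histogram, so the new location is obtained in a single computation rather than by iterative mean shift.

```python
import numpy as np

def weighted_color_centroid(positions, bin_ids, target_hist):
    """One-step localization sketch: each pixel votes for its (x, y)
    position, weighted by how strongly its color bin belongs to the
    target's color model (illustrative weights, not the paper's)."""
    weights = target_hist[bin_ids]  # per-pixel weight from the target color model
    total = weights.sum()
    if total == 0:
        return positions.mean(axis=0)  # no target color found; fall back to region center
    return (weights[:, None] * positions).sum(axis=0) / total
```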

Comparison of Dose Distribution in Spine Radiosurgery Plans: Simultaneously Integrated Boost and RTOG 0631 Protocol (척추뼈전이암 환자의 체부정위방사선치료계획 비교: 동시통합추가치료법 대 RTOG 0631 프로토콜)

  • Park, Su Yeon;Oh, Dongryul;Park, Hee Chul;Kim, Jin Sung;Kim, Jong Sik;Shin, Eun Hyuk;Kim, Hye Young;Jung, Sang Hoon;Han, Youngyih
    • Progress in Medical Physics / v.25 no.3 / pp.176-184 / 2014
  • In this study, we compared dose distributions from the simultaneously integrated boost (SIB) method and the RTOG 0631 protocol for spine radiosurgery. Spine radiosurgery plans were generated for five patients with localized spinal metastases from hepatocellular carcinoma. Computed tomography (CT) and T1- and T2-weighted magnetic resonance imaging (MRI) were fused for delineation of the GTV and spinal cord. In the SIB plan, the clinical target volume (CTV1) included the whole compartments of the involved spine, while the RTOG 0631 protocol defines the CTV2 as the involved vertebral body and both the left and right pedicles; the CTV2 also includes the transverse process and posterior element according to the extent of the GTV. In the SIB plan, 18 Gy was prescribed to the GTV and 10 Gy to CTV1, while under the RTOG 0631 protocol 18 Gy was prescribed to the CTV2. The dose-volume histogram (DVH) results showed that the two plans were competitive in target coverage, while the doses to the spinal cord and other normal organs were lower with the SIB method than with the RTOG 0631 protocol. The 85% irradiated volume of the vertebral body in the RTOG 0631 protocol was similar to that in the SIB plan. However, the dose to normal organs in RTOG 0631 tended to be higher than that in the SIB plan. The SIB plan might be an alternative method when serious complications of the surrounding normal organs are anticipated. In conclusion, although both the SIB and RTOG 0631 approaches showed competitive planning results, tumor control probability (TCP) and normal tissue complication probability (NTCP) should be analyzed through diverse clinical research in the future.
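For reference, a cumulative dose-volume histogram of the kind compared in this abstract can be computed as in this minimal sketch (the dose grid, structure mask, and dose levels here are illustrative placeholders, not the study's data):

```python
import numpy as np

def cumulative_dvh(dose, mask, levels):
    """Cumulative DVH: for each dose level, the fraction of the
    structure's volume receiving at least that dose. `dose` is a dose
    grid in Gy, `mask` selects the structure (e.g. spinal cord)."""
    d = dose[mask]  # doses at voxels belonging to the structure
    return np.array([(d >= lv).mean() for lv in levels])
```

Plan comparison then reduces to comparing these curves per structure, e.g. the fractional volume of the spinal cord at or above a tolerance dose under each plan.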

Rough Set Analysis for Stock Market Timing (러프집합분석을 이용한 매매시점 결정)

  • Huh, Jin-Nyung;Kim, Kyoung-Jae;Han, In-Goo
    • Journal of Intelligence and Information Systems / v.16 no.3 / pp.77-97 / 2010
  • Market timing is an investment strategy used for obtaining excess return from the financial market. In general, detection of market timing means determining when to buy and sell to get excess return from trading. In many market timing systems, trading rules have been used as an engine to generate trade signals. On the other hand, some researchers have proposed rough set analysis as a proper tool for market timing because, by using a control function, it does not generate a trade signal when the pattern of the market is uncertain. Numeric values in the data should be discretized for rough set analysis because rough sets only accept categorical data. Discretization searches for proper "cuts" in numeric data that determine intervals; all values that lie within an interval are transformed into the same value. In general, there are four methods for data discretization in rough set analysis: equal frequency scaling, expert's knowledge-based discretization, minimum entropy scaling, and naïve and Boolean reasoning-based discretization. Equal frequency scaling fixes a number of intervals, examines the histogram of each variable, and then determines cuts so that approximately the same number of samples falls into each interval. Expert's knowledge-based discretization determines cuts according to the knowledge of domain experts, gathered through literature review or interviews with experts. Minimum entropy scaling recursively partitions the value set of each variable so that a local measure of entropy is optimized. Naïve and Boolean reasoning-based discretization searches for categorical values by naïvely scaling the data, then finds the optimized discretization thresholds through Boolean reasoning.
Although rough set analysis is promising for market timing, there is little research on how the various data discretization methods affect trading performance with rough set analysis. In this study, we compare stock market timing models using rough set analysis with various data discretization methods. The research data used in this study are the KOSPI 200 from May 1996 to October 1998. The KOSPI 200 is the underlying index of the KOSPI 200 futures, the first derivative instrument in the Korean stock market. The KOSPI 200 is a market-value-weighted index consisting of 200 stocks selected by criteria on liquidity and their status in the corresponding industry, including manufacturing, construction, communication, electricity and gas, distribution and services, and financing. The total number of samples is 660 trading days. In addition, this study uses popular technical indicators as independent variables. The experimental results show that the most profitable method for the training sample is naïve and Boolean reasoning-based discretization, but expert's knowledge-based discretization is the most profitable method for the validation sample. In addition, expert's knowledge-based discretization produced robust performance for both the training and validation samples. We also compared rough set analysis and decision trees, experimenting with C4.5 for comparison. The results show that rough set analysis with expert's knowledge-based discretization produced more profitable rules than C4.5.
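Of the four discretization methods this abstract describes, equal frequency scaling is the simplest to sketch: the cuts are quantile boundaries, so each interval receives roughly the same number of samples. This is an illustrative sketch under that definition, not the study's implementation:

```python
import numpy as np

def equal_frequency_cuts(values, n_intervals):
    """Equal frequency scaling: choose cuts at quantile boundaries so
    roughly the same number of samples falls into each interval."""
    qs = np.linspace(0, 1, n_intervals + 1)[1:-1]  # interior quantile levels
    return np.quantile(values, qs)

def discretize(values, cuts):
    """Map each numeric value to the index of its interval,
    turning a numeric variable into a categorical one."""
    return np.searchsorted(cuts, values, side="right")
```

Applying this to a technical-indicator series would replace each numeric reading with its interval index before feeding it to the rough set analysis.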