• Title/Summary/Keyword: Normalization


Region-Segmental Scheme in Local Normalization Process of Digital Image (디지털영상 국부정규화처리의 영역분할 구도)

  • Hwang, Jung-Won;Hwang, Jae-Ho
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.44 no.4 s.316
    • /
    • pp.78-85
    • /
    • 2007
  • This paper presents a segmental scheme for region-composed images in a local normalization process. The scheme is based on local statistics computed over a moving window. The normalization algorithm uses linear or nonlinear functions to transform the pixel distribution and the homogeneous affinity of regions corrupted by additive noise. It adjusts the mean and standard deviation of the nearest-neighbor interpoint distance between the current and normalized image signals, and adaptively changes the segmentation performance according to local statistics and parameter variation. The performance of the newly advanced local normalization algorithm is evaluated and compared with that of conventional normalization methods. Experimental results are presented to show the region segmentation properties of these approaches.
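The moving-window statistics the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the function name, window size, and target mean/standard deviation are illustrative assumptions.

```python
import numpy as np

def local_normalize(image, win=3, target_mean=0.5, target_std=0.1):
    """Normalize each pixel using the mean/std of a moving window.
    A minimal sketch of local normalization; parameters are illustrative."""
    h, w = image.shape
    pad = win // 2
    padded = np.pad(image, pad, mode="reflect")
    out = np.empty_like(image, dtype=float)
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + win, j:j + win]
            mu, sigma = patch.mean(), patch.std()
            sigma = max(sigma, 1e-8)  # guard against flat (zero-variance) regions
            out[i, j] = target_mean + target_std * (image[i, j] - mu) / sigma
    return out
```

An adaptive variant, as the abstract suggests, would vary `target_mean`/`target_std` with the local statistics rather than keeping them fixed.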

Print-tip Normalization for DNA Microarray Data (DNA 마이크로어레이 자료의 PRINT-TIP별 표준화(NORMALIZATION) 방법)

  • Yi Sung-Gon;Park Taesung;Kang Sung Hyun;Lee Seung-Yeaun;Lee Yang Sung
    • The Korean Journal of Applied Statistics
    • /
    • v.18 no.1
    • /
    • pp.115-127
    • /
    • 2005
  • DNA microarray experiments allow us to study the expression of thousands of genes simultaneously. Normalization is a process for removing noise introduced during the microarray experiment, and the print-tip is regarded as one of the main sources of such noise. In this paper, we review the normalization methods most commonly used in microarray experiments. In particular, we investigate the effects of print-tips through simulated data sets.
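The simplest form of per-print-tip normalization is to center each tip group's log-ratios on zero. The sketch below does this with a per-tip median; real print-tip normalization (reviewed in the paper) typically fits an intensity-dependent lowess curve per tip instead of subtracting a constant.

```python
import statistics

def print_tip_normalize(log_ratios, tip_ids):
    """Subtract the per-print-tip median log-ratio from each spot.
    A location-only sketch; per-tip lowess would replace the median."""
    medians = {
        tip: statistics.median(m for m, t in zip(log_ratios, tip_ids) if t == tip)
        for tip in set(tip_ids)
    }
    return [m - medians[t] for m, t in zip(log_ratios, tip_ids)]
```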

URL Signatures for Improving URL Normalization (URL 정규화 향상을 위한 URL 서명)

  • Soon, Lay-Ki;Lee, Sang-Ho
    • Journal of KIISE:Databases
    • /
    • v.36 no.2
    • /
    • pp.139-149
    • /
    • 2009
  • In the standard URL normalization mechanism, URLs are normalized syntactically by a set of predefined steps. In this paper, we propose to complement the standard URL normalization by incorporating the semantically meaningful metadata of the web pages. The metadata taken into consideration are the body texts and the page size of the web pages, which can be extracted during HTML parsing. The results from our first exploratory experiment indicate that the body texts are effective in identifying equivalent URLs. Hence, given a URL which has undergone the standard normalization, we construct its URL signature by hashing the body text of the associated web page using Message-Digest algorithm 5 in the second experiment. URLs which share identical signatures are considered to be equivalent in our scheme. The results in the second experiment show that our proposed URL signatures were able to further reduce redundant URLs by 32.94% in comparison with the standard URL normalization.
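The signature construction the abstract describes (hash the body text with MD5, then treat URLs with identical signatures as equivalent) can be sketched directly. The whitespace collapsing step is an illustrative assumption about canonicalizing the extracted body text; the paper's exact preprocessing may differ.

```python
import hashlib

def url_signature(body_text):
    """Hash a page's body text with MD5 to form a URL signature.
    Whitespace collapsing is an assumed canonicalization step."""
    canonical = " ".join(body_text.split())
    return hashlib.md5(canonical.encode("utf-8")).hexdigest()
```

Two syntactically different URLs that serve the same body text then collapse to one signature, which is how the scheme catches duplicates that syntactic normalization misses.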

Theoretical Investigation of Metal Artifact Reduction Based on Sinogram Normalization in Computed Tomography (컴퓨터 단층영상에서 사이노그램 정규화를 이용한 금속 영상왜곡 저감 방법의 이론적 고찰)

  • Jeon, Hosang;Youn, Hanbean;Nam, Jiho;Kim, Ho Kyung
    • Progress in Medical Physics
    • /
    • v.24 no.4
    • /
    • pp.303-314
    • /
    • 2013
  • Image quality of computed tomography (CT) is very vulnerable to metal artifacts. Recently, the thickness and background normalization techniques have been introduced. Since they provide flat sinograms, it is easy to determine metal traces, and a simple linear interpolation is enough to describe the missing data in sinograms. In this study, we have developed a theory describing the two normalization methods and compared them with respect to various sizes and numbers of metal inserts by using simple numerical simulations. The developed theory showed that the background normalization provides flatter sinograms than the thickness normalization, which was validated with the simulation results. Numerical simulation results with respect to various sizes and numbers of metal inserts showed that the background normalization was better than the thickness normalization for metal artifact correction. Although residual artifacts still existed, we showed that the background normalization without the segmentation procedure outperformed the thickness normalization. Since the background normalization without the segmentation procedure is simple and does not require any user intervention, it can be readily installed in conventional CT systems.
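The correction pipeline the abstract outlines (normalize the sinogram so it becomes flat, interpolate linearly across the metal trace, then undo the normalization) can be sketched row-wise. This is a simplified one-dimensional-per-row illustration; the function name and the assumption that the background sinogram and metal mask are already available are ours, not the paper's.

```python
import numpy as np

def normalize_and_inpaint(sino, background, metal_mask):
    """Background normalization + linear interpolation over the metal
    trace in each detector row, then de-normalization. A simplified
    sketch: background and metal_mask are assumed given."""
    safe_bg = np.maximum(background, 1e-8)
    norm = sino / safe_bg               # flattened sinogram
    out = norm.copy()
    x = np.arange(norm.shape[1])
    for r in range(norm.shape[0]):
        bad = metal_mask[r]
        if bad.any() and not bad.all():
            out[r, bad] = np.interp(x[bad], x[~bad], norm[r, ~bad])
    return out * safe_bg                # undo the normalization
```

The flatter the normalized sinogram, the better a linear interpolant describes the missing data, which is why the abstract favors the (flatter) background normalization.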

Normalization of XQuery Queries for Efficient XML Query Processing (효율적인 XML질의 처리를 위한 XQuery 질의의 정규화)

  • 김서영;이기훈;황규영
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.10 no.5
    • /
    • pp.419-433
    • /
    • 2004
  • As XML becomes a standard for data representation, integration, and exchange on the Web, several XML query languages have been proposed. The World Wide Web Consortium (W3C) has proposed XQuery as a standard XML query language. Like SQL, XQuery allows nested queries. Thus, normalization rules have been proposed to transform nested XQuery queries into semantically equivalent ones that can be executed more efficiently. However, previous normalization rules are applicable only to restricted forms of nested XQuery queries. Specifically, they cannot handle FLWR expressions having nested expressions in the where clause. In this paper, we propose normalization rules for XQuery queries by extending those for SQL queries. Our proposed rules can handle FLWR expressions having nested expressions in every clause. The major contributions of this paper are as follows. First, we classify the nesting types of XQuery queries according to the existence of correlation and aggregation. We then propose normalization rules for each nesting type. Second, we propose detailed algorithms that apply the normalization rules to nested XQuery queries.

Comparison of Three Normalization Methods for 3D Joint Moment in the Asymmetric Rotational Human Movements in Golf Swing Analysis

  • Lee, Dongjune;Oh, Seung Eel;Lee, In-Kwang;Sim, Taeyong;Joo, Su-bin;Park, Hyun-Joon;Mun, Joung Hwan
    • Journal of Biosystems Engineering
    • /
    • v.40 no.3
    • /
    • pp.289-295
    • /
    • 2015
  • Purpose: From the perspective of biomechanics, joint moments quantitatively show a subject's ability to perform actions. In this study, the effect of normalization in the fast and asymmetric motions of a golf swing was investigated by applying three different normalization methods to the raw joint moment. Methods: The study included 13 subjects with no previous history of musculoskeletal diseases. Golf swing analyses were performed with six infrared cameras and two force plates. The majority of the raw peak joint moments showed a significant correlation at p < 0.05. Additionally, the resulting effects after applying body weight (BW), body weight multiplied by height (BWH), and body weight multiplied by leg length (BWL) normalization methods were analyzed through correlation and regression analysis. Results: The BW, BWH, and BWL normalization methods normalized 8, 10, and 11 peak joint moments out of 18, respectively. The best method for normalizing the golf swing was found to be the BWL method, which showed significant statistical differences. Several raw peak joint moments showed no significant correlation with measured anthropometrics, which was considered to be related to the muscle coordination that occurs in the swing of skilled professional golfers. Conclusions: The results of this study show that the BWL normalization method can effectively remove differences due to physical characteristics in the golf swing analysis.
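The three normalization methods compared in the study share one arithmetic form: divide the raw joint moment by body weight (BW), or by body weight times a length, either height (BWH) or leg length (BWL). A minimal sketch (function name and units are illustrative):

```python
def normalize_moment(moment_nm, body_weight_n, length_m=None):
    """Normalize a joint moment by BW, or by BW times a length
    (height for BWH, leg length for BWL). Illustrative units:
    moment in N*m, weight in N, length in m; result is
    dimensionless for BWH/BWL."""
    denom = body_weight_n if length_m is None else body_weight_n * length_m
    return moment_nm / denom
```

Passing leg length as `length_m` gives the BWL scheme that the study found most effective at removing anthropometric differences.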

Dynamic Contrast Enhanced MRI and Intravoxel Incoherent Motion to Identify Molecular Subtypes of Breast Cancer with Different Vascular Normalization Gene Expression

  • Wan-Chen Tsai;Kai-Ming Chang;Kuo-Jang Kao
    • Korean Journal of Radiology
    • /
    • v.22 no.7
    • /
    • pp.1021-1033
    • /
    • 2021
  • Objective: To assess the expression of vascular normalization genes in different molecular subtypes of breast cancer and to determine whether molecular subtypes with a higher vascular normalization gene expression can be identified using dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) and intravoxel incoherent motion (IVIM) diffusion-weighted imaging (DWI). Materials and Methods: This prospective study evaluated 306 female patients (mean age ± standard deviation, 50 ± 10 years), recruited between January 2014 and August 2017, who had de novo breast cancer larger than 1 cm in diameter (308 tumors). DCE MRI followed by IVIM DWI studies using 11 different b-values (0 to 1200 s/mm2) were performed on a 1.5T MRI system. The Tofts model and segmented biexponential IVIM analysis were used. For each tumor, the molecular subtype (according to six [I-VI] subtypes and PAM50 subtypes) and the expression profiles of genes for vascular normalization, pericytes, and normal vascular signatures were determined using freshly frozen tissue. Statistical associations between imaging parameters and molecular subtypes were examined using logistic regression or linear regression with a significance level of p = 0.05. Results: Breast cancer subtypes III and VI and PAM50 subtypes luminal A and normal-like exhibited a higher expression of genes for vascular normalization, pericyte markers, and the normal vessel function signature (p < 0.001 for all) compared to other subtypes. Subtypes III and VI and PAM50 subtypes luminal A and normal-like, versus the remaining subtypes, showed significant associations with Ktrans, kep, vp, and IAUGCBN90 on DCE MRI, with relatively smaller values in the former. The subtype grouping was significantly associated with D, with relatively less restricted diffusion in subtypes III and VI and PAM50 subtypes luminal A and normal-like.
Conclusion: DCE MRI and IVIM parameters may identify molecular subtypes of breast cancers with a different vascular normalization gene expression.
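The biexponential IVIM analysis mentioned in the methods models the diffusion-weighted signal as a perfusion (pseudo-diffusion) compartment plus a tissue diffusion compartment. A sketch of the standard signal equation (any parameter values used with it here are illustrative, not values from the study):

```python
import math

def ivim_signal(b, S0, f, D_star, D):
    """Biexponential IVIM model: S(b) = S0 * (f*exp(-b*D*) + (1-f)*exp(-b*D)),
    with perfusion fraction f, pseudo-diffusion D*, and tissue diffusion D.
    b is the diffusion weighting in s/mm^2; D, D* in mm^2/s."""
    return S0 * (f * math.exp(-b * D_star) + (1 - f) * math.exp(-b * D))
```

"Segmented" fitting, as used in the study, first estimates D from high b-values (where the fast D* term has decayed away) and then fits f and D* with D held fixed.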

New Normalization Methods using Support Vector Machine Regression Approach in cDNA Microarray Analysis

  • Sohn, In-Suk;Kim, Su-Jong;Hwang, Chang-Ha;Lee, Jae-Won
    • Proceedings of the Korean Society for Bioinformatics Conference
    • /
    • 2005.09a
    • /
    • pp.51-56
    • /
    • 2005
  • There are many sources of systematic variation in cDNA microarray experiments which affect the measured gene expression levels, such as differences in labeling efficiency between the two fluorescent dyes. Print-tip lowess normalization is used in situations where dye biases can depend on spot overall intensity and/or spatial location within the array. However, print-tip lowess normalization performs poorly in situations where the error variability for each gene is heterogeneous over intensity ranges. We propose new print-tip normalization methods based on support vector machine regression (SVMR) and support vector machine quantile regression (SVMQR). SVMQR was derived by employing the basic principle of the support vector machine (SVM) for the estimation of linear and nonlinear quantile regressions. We applied the proposed methods to previous cDNA microarray data of apolipoprotein-AI-knockout (apoAI-KO) mice, diet-induced obese mice, and genistein-fed obese mice. From our statistical analysis, we found that the proposed methods perform better than the existing print-tip lowess normalization method.
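The normalization step these methods share is to fit a smooth trend of the log-ratio M against the average log-intensity A and subtract it. In the sketch below a NumPy polynomial fit stands in for the support vector machine (quantile) regression the authors propose; only the overall scheme, not the regression engine, matches the paper.

```python
import numpy as np

def intensity_trend_normalize(A, M, degree=2):
    """Remove intensity-dependent dye bias: fit a smooth trend M(A)
    and subtract it. The polynomial fit is a stand-in for the
    paper's SVM regression."""
    coeffs = np.polyfit(A, M, degree)
    return M - np.polyval(coeffs, A)
```

Swapping the polynomial for SVMR/SVMQR (or per-print-tip lowess) changes how the trend is estimated, which is exactly where the paper claims its advantage under heterogeneous error variability.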


Non-Synonymously Redundant Encodings and Normalization in Genetic Algorithms (비유사 중복 인코딩을 사용하는 유전 알고리즘을 위한 정규화 연산)

  • Choi, Sung-Soon;Moon, Byung-Ro
    • Journal of KIISE:Software and Applications
    • /
    • v.34 no.6
    • /
    • pp.503-518
    • /
    • 2007
  • Normalization transforms one parent genotype to be consistent with the other before crossover. In this paper, we explain how normalization alleviates the difficulties caused by non-synonymously redundant encodings in genetic algorithms. We define encodings with the maximally non-synonymous property and prove that such encodings induce uncorrelated search spaces. Extensive experiments on a number of problems show that normalization transforms the uncorrelated search spaces into correlated ones and leads to significant improvement in performance.
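For redundant encodings such as group-number (e.g. graph partitioning) genotypes, normalization typically relabels one parent so its labels agree with the other's as much as possible before crossover. The sketch below uses a greedy label matching; the grouping-style encoding and the greedy strategy are illustrative assumptions (an optimal matching would use the Hungarian algorithm), not the paper's exact operator.

```python
from collections import Counter

def normalize_parent(p1, p2):
    """Relabel genotype p2 so its group labels agree with p1 as much as
    possible before crossover. Greedy sketch for group-number encodings;
    unmatched labels are left unchanged."""
    overlap = Counter(zip(p2, p1))  # (label in p2, label in p1) -> count
    mapping, used = {}, set()
    for (old, new), _ in overlap.most_common():
        if old not in mapping and new not in used:
            mapping[old] = new
            used.add(new)
    return [mapping.get(g, g) for g in p2]
```

After this step, crossover mixes genotypes that describe the same phenotypic groupings with the same symbols, which is how normalization restores correlation in the search space.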