• Title/Summary/Keyword: Size normalization

Search Results: 109

A Square Root Normalized LMS Algorithm for Adaptive Identification with Non-Stationary Inputs

  • Alouane Monia Turki-Hadj
    • Journal of Communications and Networks
    • /
    • v.9 no.1
    • /
    • pp.18-27
    • /
    • 2007
  • The conventional normalized least mean square (NLMS) algorithm is the most widely used for adaptive identification in a non-stationary input context. The convergence of the NLMS algorithm is independent of environmental changes. However, its steady-state performance is impaired during input sequences with low dynamics. In this paper, we propose a new NLMS algorithm which is, in the steady state, insensitive to time variations of the input dynamics. The square root (SR)-NLMS algorithm is based on a normalization of the LMS adaptive filter input by the Euclidean norm of the tap-input. The tap-input power of the SR-NLMS adaptive filter is then equal to one, even during sequences with low dynamics. Therefore, the amplification of the observation noise power by the tap-input power is cancelled in the misadjustment time evolution. The harmful effect of low-dynamics input sequences on the steady-state performance of the LMS adaptive filter is then reduced. In addition, the square-root-normalized input is more stationary than the base input, so the robustness of the LMS adaptive filter with respect to input non-stationarity is enhanced. A performance analysis of the first- and second-order statistical behavior of the proposed SR-NLMS adaptive filter is carried out. In particular, an analytical expression of the step size ensuring stability and mean convergence is derived. The results of an experimental study demonstrating the good performance of the SR-NLMS algorithm are also given and compared with those obtained from a standard NLMS algorithm. It is shown that, in a non-stationary input context, the SR-NLMS algorithm outperforms the NLMS algorithm.
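The normalization contrast described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's code: a standard NLMS tap update next to a square-root-normalized variant in which the tap-input is first scaled by its Euclidean norm, fixing the effective input power at one. As a simplifying assumption of this sketch, the desired sample is scaled by the same norm so the unknown system is still identified exactly; filter length, step size, and the identification setup are likewise assumed.

```python
import math

def nlms_update(w, x, d, mu=0.5, eps=1e-8):
    """Standard NLMS: step scaled by the instantaneous tap-input power."""
    e = d - sum(wi * xi for wi, xi in zip(w, x))
    p = sum(xi * xi for xi in x) + eps   # input power varies with input dynamics
    return [wi + mu * e * xi / p for wi, xi in zip(w, x)], e

def sr_nlms_update(w, x, d, mu=0.5, eps=1e-8):
    """SR-NLMS sketch: LMS run on the tap-input scaled to unit norm.

    Scaling the desired sample by the same norm (a simplification of
    this sketch) keeps the identification problem unchanged while the
    effective tap-input power stays equal to one.
    """
    nrm = math.sqrt(sum(xi * xi for xi in x)) + eps
    xn = [xi / nrm for xi in x]          # unit-norm tap-input
    e = d / nrm - sum(wi * xi for wi, xi in zip(w, xn))
    return [wi + mu * e * xi for wi, xi in zip(w, xn)], e
```

In a noiseless identification run both updates converge to the unknown taps; the difference the paper targets (steady-state behavior under low-dynamics input) only shows up with noisy, non-stationary input.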

Comparison of Geomorphological Parameters Derived from Different Digital Elevation Model Resolutions in Chuncheon, South Korea (수치표고모델 해상도에 따라 도출된 춘천지역의 지형학적 매개변수 비교)

  • LEE, Jun-Gu;SUH, Young-Cheol;LEE, Dong-Ha
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.21 no.1
    • /
    • pp.106-114
    • /
    • 2018
  • DEMs (Digital Elevation Models) are now easily produced with advancing remote sensing technology; depending on the desired task, UAVs can produce high-resolution DEMs. However, high resolution brings issues of data storage, processing time, and cost. To check the effect of DEM resolution, this study compares six geomorphological parameters derived from DEMs of different resolutions in a test area around Chuncheon, Korea. The comparative analysis was based on statistics of the derived slope, curvature, flow direction, flow accumulation, flow length, and basins. As a result, the flow accumulation area remained unchanged across resolutions, whereas slope, curvature, flow length, and the number of basins decreased as pixel size increased. DEM resolution should therefore be selected carefully, depending on the precision the application requires.
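As a rough illustration of why derived parameters depend on resolution, the sketch below computes slope from a toy DEM grid and from a block-averaged (coarser) version of it. The elevations, cell sizes, and block-mean aggregation rule are made up for the example; the study's actual DEM processing is not described at this level of detail.

```python
import math

def slope_deg(dem, cell):
    """Maximum-gradient slope (degrees) for interior cells of a 2-D DEM list."""
    rows, cols = len(dem), len(dem[0])
    out = []
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            dzdx = (dem[i][j + 1] - dem[i][j - 1]) / (2 * cell)
            dzdy = (dem[i + 1][j] - dem[i - 1][j]) / (2 * cell)
            out.append(math.degrees(math.atan(math.hypot(dzdx, dzdy))))
    return out

def aggregate(dem, k):
    """Coarsen a DEM by averaging k x k blocks (resolution degradation)."""
    rows, cols = len(dem) // k, len(dem[0]) // k
    return [[sum(dem[i*k + a][j*k + b] for a in range(k) for b in range(k)) / (k * k)
             for j in range(cols)] for i in range(rows)]
```

For a uniform ramp the slope survives aggregation unchanged, but for rough terrain block averaging smooths short-wavelength relief, which is consistent with the decrease in slope and curvature the study reports at coarser resolutions.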

Monte Carlo Simulation for the Measurement of Entrance Skin Dose on Newborn and Infants (영·유아의 입사피부선량 측정을 위한 몬테카를로 시뮬레이션)

  • Kim, Sang-Tae
    • The Journal of the Korea Contents Association
    • /
    • v.12 no.6
    • /
    • pp.346-352
    • /
    • 2012
  • Unlike for adults, radiation dose estimation for newborns and infants during radiographic examinations has not been actively studied. Therefore, entrance skin dose was measured as an index of exposure dose during radiographic examinations of newborns and infants, and the results were compared with Monte Carlo simulation to improve the reproducibility of entrance skin dose measurement; it was also shown that various geometries could be implemented. The values obtained through Monte Carlo simulation were estimated using normalization factors for entrance skin dose to calibrate the radiation dose, and were then normalized to a unit X-ray field size. The average entrance skin dose per exposure was 78.41 μGy, and the percentage error between measurement by dosimeter and by Monte Carlo simulation was -4.77%. Entrance skin dose assessment by Monte Carlo simulation thus provides a possible alternative for the difficult dose estimation in newborns and infants who visit the hospital for diagnosis.

Natural Background Level Analysis of Heavy Metal Concentration in Korean Coastal Sediments (한국 연안 퇴적물 내 중금속 원소의 자연적 배경농도 연구)

  • Lim, Dhong-Il;Choi, Jin-Yong;Jung, Hoi-Soo;Choi, Hyun-Woo;Kim, Young-Ok
    • Ocean and Polar Research
    • /
    • v.29 no.4
    • /
    • pp.379-389
    • /
    • 2007
  • This paper presents an attempt to determine natural background levels of heavy metals that can be used for assessing heavy metal contamination. For this study, a large archive dataset of heavy metal concentrations (Cu, Cr, Ni, Pb, Zn) for more than 900 surface sediment samples from various Korean coastal environments was newly compiled. These data were normalized to aluminum (a grain-size normalizer) concentration to isolate natural factors from anthropogenic ones. The normalization was based on the hypothesis that heavy metal concentrations vary consistently with the concentration of aluminum unless the metals are of anthropogenic origin. Samples (outliers) suspected of receiving any anthropogenic input were therefore removed from the regression to ascertain the "background" relationship between the metals and aluminum. Identification of these outliers was tested using a model of 95% prediction limits. The process of testing for normality (Kolmogorov-Smirnov test) and selecting outliers was iterated until a normal distribution was achieved. On the basis of linear regression analysis of the large archive dataset, background levels applicable to heavy metal assessment of Korean coastal sediments were successfully developed for Cu, Cr, Ni, and Zn. As an example, we tested the applicability of this baseline for metal pollution assessment of Masan Bay sediments.
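The iterative outlier-screening procedure described above can be sketched as follows. This is a simplified stand-in, not the authors' code: the 95% prediction limits are approximated by a fixed two-residual-standard-deviation band, and the Kolmogorov-Smirnov normality check is omitted for brevity.

```python
import math

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return b, my - b * mx

def background_fit(xs, ys, z=2.0):
    """Refit metal-vs-Al regression after dropping residual outliers (> z sigma),
    iterating until no point falls outside the band."""
    pts = list(zip(xs, ys))
    while True:
        b, a = fit_line([p[0] for p in pts], [p[1] for p in pts])
        res = [y - (a + b * x) for x, y in pts]
        s = math.sqrt(sum(r * r for r in res) / (len(res) - 2))
        kept = [p for p, r in zip(pts, res) if abs(r) <= z * s]
        if len(kept) == len(pts):
            return b, a, pts        # converged background relationship
        pts = kept
```

With a synthetic "background" line plus one contaminated sample, the contaminated point is rejected and the recovered slope/intercept match the natural relationship.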

Genetic Algorithm for Node Pruning of Neural Networks (신경망의 노드 가지치기를 위한 유전 알고리즘)

  • Heo, Gi-Su;Oh, Il-Seok
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.46 no.2
    • /
    • pp.65-74
    • /
    • 2009
  • In optimizing a neural network structure, there are two approaches: the pruning scheme and the constructive scheme. In this paper we use the pruning scheme to optimize the network structure, and a genetic algorithm to find the optimal node pruning. In conventional research, the input and hidden layers were optimized separately. In contrast, we attempt to optimize the two layers simultaneously by encoding both layers in a single chromosome. The offspring networks inherit their weights from the parents. For learning, we used the existing error back-propagation algorithm. In experiments with various databases from the UCI Machine Learning Repository, we obtained optimal performance when the network size was reduced by about 8~25%. A t-test showed that the proposed method performs better than other pruning and constructive methods under cross-validation.
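The joint encoding described above (both layers in one chromosome) might be sketched like this. Layer sizes, operators, and rates are hypothetical, and fitness evaluation (training each pruned network with back-propagation and inherited weights) is omitted; the point is only that one bit-string covers input and hidden nodes together.

```python
import random

N_IN, N_HID = 8, 6   # assumed layer sizes for the example

def random_chromosome():
    """One bit per node: 1 = keep the node, 0 = prune it."""
    return [random.randint(0, 1) for _ in range(N_IN + N_HID)]

def decode(ch):
    """Split one chromosome into an input-node mask and a hidden-node mask."""
    return ch[:N_IN], ch[N_IN:]

def crossover(a, b):
    """Single-point crossover; the cut may fall in either layer's segment."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(ch, rate=0.05):
    """Flip each keep/prune bit with a small probability."""
    return [g ^ 1 if random.random() < rate else g for g in ch]
```

Because the crossover point can land in either segment, selection pressure acts on input and hidden pruning jointly rather than on each layer in isolation.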

Classification Prediction Error Estimation System of Microarray for a Comparison of Resampling Methods Based on Multi-Layer Perceptron (다층퍼셉트론 기반 리 샘플링 방법 비교를 위한 마이크로어레이 분류 예측 에러 추정 시스템)

  • Park, Su-Young;Jeong, Chai-Yeoung
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.14 no.2
    • /
    • pp.534-539
    • /
    • 2010
  • In genomic studies, thousands of features are collected on relatively few samples. One goal of these studies is to build classifiers to predict the outcome of future observations. There are three inherent steps in building classifiers: significant gene selection, model selection, and prediction assessment. In this paper, with a focus on prediction assessment, we normalize microarray data with quantile normalization, which adjusts the quantiles of all slides equally, and then design a system that compares several methods for estimating the 'true' prediction error of a prediction model in the presence of feature selection, analyzing and comparing their prediction errors. LOOCV generally performs very well, with small MSE and bias; the split-sample method and 2-fold CV perform very poorly with small sample sizes. For computationally burdensome analyses, 10-fold CV may be preferable to LOOCV.
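Quantile normalization itself is a standard procedure; a minimal sketch (the general algorithm, not the paper's implementation) replaces each value by the across-slide mean of its rank, so every slide ends up with the same distribution.

```python
def quantile_normalize(columns):
    """columns: list of equal-length lists, one per slide/array.

    Each column is sorted, the per-rank mean across columns is computed,
    and every value is replaced by the mean for its rank.
    """
    n = len(columns[0])
    sorted_cols = [sorted(c) for c in columns]
    rank_means = [sum(sc[i] for sc in sorted_cols) / len(columns)
                  for i in range(n)]
    out = []
    for col in columns:
        order = sorted(range(n), key=lambda i: col[i])  # indices by ascending value
        normed = [0.0] * n
        for rank, idx in enumerate(order):
            normed[idx] = rank_means[rank]
        out.append(normed)
    return out
```

After normalization every slide contains exactly the same set of values, only in its own original rank order, which removes slide-to-slide distributional differences before error estimation.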

Implementation of Digitizing System for Sea Level Measurements Record (조위관측 기록 디지타이징 시스템 구현)

  • Yu, Young-Jung;Park, Seong-Ho
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.14 no.8
    • /
    • pp.1907-1917
    • /
    • 2010
  • A system that effectively extracts and digitizes sea level records accumulated in the past is much needed by ocean scientists. The main difficulty of such a system is the huge amount of data to be processed. In this paper, we implement a digitizing system to handle such mass data of sea level records. The system consists of a pre-processing step, a digitizing step, and a post-processing step. In the pre-processing step, the system corrects the skew of scanned images and normalizes the image size automatically. In the digitizing step, it extracts the graph area from the images and thins it. Finally, in the post-processing step, the system tests the reliability of the result. It is cost-effective, labour-saving software that frees scientists from tedious manual digitizing work.
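The size-normalization part of the pre-processing step could be as simple as resampling every scanned image to a fixed matrix. The sketch below uses nearest-neighbour resampling, which is an assumption of this example; the paper does not specify its interpolation method.

```python
def resize_nearest(img, out_h, out_w):
    """Resample a 2-D list of pixels to (out_h, out_w) by nearest neighbour,
    so all scanned charts share one normalized size before digitizing."""
    in_h, in_w = len(img), len(img[0])
    return [[img[i * in_h // out_h][j * in_w // out_w]
             for j in range(out_w)] for i in range(out_h)]
```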

Long-term outcomes of surgery and radiotherapy for secreting and non-secreting pituitary adenoma

  • Kim, Mi Young;Kim, Jin Hee;Oh, Young Kee;Kim, El
    • Radiation Oncology Journal
    • /
    • v.34 no.2
    • /
    • pp.121-127
    • /
    • 2016
  • Purpose: To investigate treatment outcomes and long-term complications after surgery and radiotherapy (RT) for pituitary adenoma. Materials and Methods: From 1990 to 2009, 73 patients who underwent surgery and RT for pituitary adenoma were analyzed. Median age was 51 years (range, 25 to 71 years). Median tumor size was 3 cm (range, 1 to 5 cm), with suprasellar extension (n = 21), cavernous sinus extension (n = 14), or both (n = 5). Hormone-secreting tumors were diagnosed in 29 patients: 16 secreting prolactin, 12 growth hormone, and 1 adrenocorticotrophic hormone. Impairment of visual acuity or visual field was present in 33 patients at first diagnosis. Most patients (n = 64) received RT in the postoperative adjuvant setting. Median RT dose was 45 Gy (range, 45 to 59.4 Gy). Results: Median follow-up was 8 years (range, 3 to 22 years). In secreting tumors, the hormone normalization rate was 55% (16 of 29 patients). Of 25 patients with evaluable visual field and visual acuity tests, 21 (84%) showed improvement of visual disturbance after treatment. The 10-year tumor control rate for non-secreting and secreting adenomas was 100% and 58%, respectively (p < 0.001). Progression-free survival at 10 years was 98%. Only 1 patient experienced endocrinological recurrence. Following surgery, 60% (n = 44) suffered from pituitary function deficit. A late complication associated with RT occurred in only 1 patient, who developed a cataract. Conclusion: Surgery and RT are very effective and safe for hormonal and tumor growth control in secreting and non-secreting pituitary adenoma.

Out-Boundary Rectangle Detection in Comic Images Using the Gradient Radon Transform (그래디언트 라돈변환을 이용한 만화영상의 외곽 경계사각형 검출)

  • Kim, Dong-Keun;Yang, Seung-Beom;Hwang, Chi-Jung
    • Journal of Korea Multimedia Society
    • /
    • v.14 no.4
    • /
    • pp.538-545
    • /
    • 2011
  • Today, there is a wide variety of digital content on the Internet, and comic images are among the most popular. Most are scanned from comic books by digital scanners, but they are not normalized in terms of size, skew, and boundary margin. Normalization is a very important step in comic image analysis, and it can be achieved by finding the out-boundary rectangle of each comic image. In this paper, we propose a method for detecting the out-boundary rectangle in comic images using the gradient Radon transform. We apply the Radon transform to image gradients to extract line segments that are candidates for the sides of the out-boundary rectangle. The final out-boundary rectangle is then detected from the candidate line segments using local histograms. Experimental results show that the proposed method effectively detects the out-boundary rectangle in comic images.
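The line-extraction idea can be illustrated with a toy Radon/Hough-style accumulator. This is a plain voting scheme over binary pixels, not the paper's gradient-weighted Radon transform; the angle count and integer rounding of the offset are arbitrary choices for the example.

```python
import math

def line_accumulate(img, n_angles=4):
    """img: 2-D 0/1 list. Returns a dict {(angle_index, rho): votes}.

    Each foreground pixel votes for every (angle, offset) line through it,
    where rho = x*cos(theta) + y*sin(theta); strong peaks correspond to
    long straight segments such as a page's boundary rectangle sides.
    """
    acc = {}
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            if not v:
                continue
            for a in range(n_angles):
                th = math.pi * a / n_angles
                rho = round(x * math.cos(th) + y * math.sin(th))
                acc[a, rho] = acc.get((a, rho), 0) + 1
    return acc
```

For a horizontal line of pixels the accumulator peaks at the angle/offset of that line, which is the signal a boundary-rectangle detector would threshold.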

Application of Deep Learning-Based Nuclear Medicine Lung Study Classification Model (딥러닝 기반의 핵의학 폐검사 분류 모델 적용)

  • Jeong, Eui-Hwan;Oh, Joo-Young;Lee, Ju-Young;Park, Hoon-Hee
    • Journal of radiological science and technology
    • /
    • v.45 no.1
    • /
    • pp.41-47
    • /
    • 2022
  • The purpose of this study is to apply a deep learning model that can distinguish lung perfusion and lung ventilation images in nuclear medicine, and to evaluate its image classification ability. Image data pre-processing was performed in the following order: image matrix size adjustment, min-max normalization, image center position adjustment, train/validation/test data set splitting, and data augmentation. The convolutional neural network (CNN) architectures VGG-16, ResNet-18, Inception-ResNet-v2, and SE-ResNeXt-101 were used. For evaluation, classification performance indices, class activation maps (CAM), and a statistical image evaluation method were applied. On the classification performance indices, SE-ResNeXt-101 and Inception-ResNet-v2 showed the highest, identical performance. In the CAM results, the cardiac and right lung regions were highly activated in lung perfusion, and the upper lung and neck regions were highly activated in lung ventilation. Statistical image evaluation showed a meaningful difference between SE-ResNeXt-101 and Inception-ResNet-v2. The results confirm the applicability of CNN models to lung scintigraphy classification. In the future, this work is expected to serve as basic data for research on new artificial intelligence models and to help stable image management in clinical practice.
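The min-max normalization step in the pre-processing pipeline is a standard rescaling; a minimal per-image sketch (pixel values hypothetical):

```python
def min_max_normalize(img):
    """img: 2-D list of pixel values; rescale each image to the range [0, 1]."""
    flat = [v for row in img for v in row]
    lo, hi = min(flat), max(flat)
    span = hi - lo or 1   # guard against constant images (all pixels equal)
    return [[(v - lo) / span for v in row] for row in img]
```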