• Title/Summary/Keyword: Maximization methods

Comparison of Estimation Methods in NONMEM 7.2: Application to a Real Clinical Trial Dataset (실제 임상 데이터를 이용한 NONMEM 7.2에 도입된 추정법 비교 연구)

  • Yun, Hwi-Yeol;Chae, Jung-Woo;Kwon, Kwang-Il
    • Korean Journal of Clinical Pharmacy
    • /
    • v.23 no.2
    • /
    • pp.137-141
    • /
    • 2013
  • Purpose: This study compared the performance of the new NONMEM estimation methods using a population analysis dataset collected from a clinical study consisting of 40 individuals and 567 observations after a single oral dose of glimepiride. Method: The NONMEM 7.2 estimation methods tested were first-order conditional estimation with interaction (FOCEI), importance sampling (IMP), importance sampling assisted by mode a posteriori estimation (IMPMAP), iterative two stage (ITS), stochastic approximation expectation-maximization (SAEM), and Markov chain Monte Carlo Bayesian analysis (BAYES), using a two-compartment open model. Results: The parameters estimated by IMP, IMPMAP, ITS, SAEM, and BAYES were similar to those estimated using FOCEI, and the objective function value (OFV), used as a model diagnostic criterion, was significantly lower with FOCEI, IMPMAP, SAEM, and BAYES than with IMP. Parameters were estimated precisely, in terms of the estimated standard errors, with FOCEI, IMP, IMPMAP, and BAYES. The run time for the model analysis was shortest with BAYES. Conclusion: The new estimation methods in NONMEM 7.2 performed similarly in terms of parameter estimation, but in terms of parameter precision and model run time, BAYES was the most suitable for analyzing this dataset.
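
The entry above compares likelihood-based and sampling-based estimators in NONMEM 7.2. As a loose illustration of the idea behind the importance-sampling (IMP) objective only, the following Python sketch approximates one subject's marginal likelihood for a toy nonlinear mixed-effects model by importance sampling; the model, parameter values, observations, and proposal are hypothetical and have no connection to the glimepiride dataset or to NONMEM's actual implementation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical toy model for one subject: y_ij = theta * exp(eta_i) + eps_ij
theta, omega2, sigma2 = 2.0, 0.09, 0.04      # fixed effect, IIV variance, residual variance (assumed)
y = np.array([2.3, 2.1, 2.4])                # illustrative observations, not real data

def loglik_given_eta(eta):
    """Residual-error log-likelihood of the observations given the random effect."""
    pred = theta * np.exp(eta)
    return stats.norm.logpdf(y, loc=pred, scale=np.sqrt(sigma2)).sum()

# Importance sampling: draw eta from a proposal and reweight by prior/proposal densities.
proposal = stats.norm(loc=0.1, scale=0.3)    # proposal location/scale are assumptions
etas = proposal.rvs(size=5000, random_state=rng)
log_w = (np.array([loglik_given_eta(e) for e in etas])
         + stats.norm.logpdf(etas, scale=np.sqrt(omega2))
         - proposal.logpdf(etas))
log_marginal = np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()
print("approximate -2*log-likelihood contribution:", -2 * log_marginal)
```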

Experimental study of noise level optimization in brain single-photon emission computed tomography images using non-local means approach with various reconstruction methods

  • Seong-Hyeon Kang;Seungwan Lee;Youngjin Lee
    • Nuclear Engineering and Technology
    • /
    • v.55 no.5
    • /
    • pp.1527-1532
    • /
    • 2023
  • The noise reduction algorithm using the non-local means (NLM) approach is very efficient in nuclear medicine imaging. In this study, the applicability of the NLM noise reduction algorithm to single-photon emission computed tomography (SPECT) images of a brain phantom was investigated, and the algorithm was optimized by changing the smoothing factor for various reconstruction methods. Brain phantom images were reconstructed using filtered back projection (FBP) and ordered subset expectation maximization (OSEM). A smoothing factor of 0.020 gave the optimal coefficient of variation (COV) and contrast-to-noise ratio (CNR) results for both the FBP and OSEM reconstruction methods. We confirmed that the FBP- and OSEM-based SPECT images processed with the algorithm at the optimal smoothing factor improved the COV and CNR by 66.94% and 8.00% on average, respectively, compared with the original images. In conclusion, an optimized smoothing factor was derived for the NLM-based algorithm in brain SPECT images, and the approach may be applicable to various nuclear medicine imaging techniques in the future.
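
The abstract above reports an optimal NLM smoothing factor of 0.020 for FBP- and OSEM-reconstructed brain phantom images. The sketch below shows, under stated assumptions, how such an evaluation might look in Python using scikit-image's denoise_nl_means on a synthetic 2-D image; the image, ROI positions, and the COV/CNR definitions used here are illustrative stand-ins, not the paper's phantom or its exact metrics.

```python
import numpy as np
from skimage.restoration import denoise_nl_means

rng = np.random.default_rng(1)

# Synthetic stand-in for a reconstructed SPECT slice: a bright disc on a
# uniform background, plus Gaussian noise (the paper used a brain phantom).
img = np.full((128, 128), 0.2)
yy, xx = np.mgrid[:128, :128]
img[(yy - 64) ** 2 + (xx - 64) ** 2 < 20 ** 2] = 1.0
noisy = img + rng.normal(scale=0.08, size=img.shape)

# NLM denoising; h plays the role of the smoothing factor. The value 0.020 is the
# one reported in the abstract, not a verified optimum for this toy image.
denoised = denoise_nl_means(noisy, patch_size=5, patch_distance=6, h=0.020)

def cov_and_cnr(image):
    """One common definition: COV = sigma/mean of a background ROI,
    CNR = |mean_signal - mean_background| / sigma_background."""
    bg = image[5:30, 5:30]
    sig = image[54:74, 54:74]
    return bg.std() / bg.mean(), abs(sig.mean() - bg.mean()) / bg.std()

print("noisy    COV, CNR:", cov_and_cnr(noisy))
print("denoised COV, CNR:", cov_and_cnr(denoised))
```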

Increase of Tc-99m RBC SPECT Sensitivity for Small Liver Hemangioma using Ordered Subset Expectation Maximization Technique (Tc-99m RBC SPECT에서 Ordered Subset Expectation Maximization 기법을 이용한 작은 간 혈관종 진단 예민도의 향상)

  • Jeon, Tae-Joo;Bong, Jung-Kyun;Kim, Hee-Joung;Kim, Myung-Jin;Lee, Jong-Doo
    • The Korean Journal of Nuclear Medicine
    • /
    • v.36 no.6
    • /
    • pp.344-356
    • /
    • 2002
  • Purpose: RBC blood pool SPECT has been used to diagnose focal liver lesions such as hemangioma owing to its high specificity. However, low spatial resolution is a major limitation of this modality. Recently, ordered subset expectation maximization (OSEM) has been introduced to obtain tomographic images for clinical applications. We compared this modified iterative reconstruction method, OSEM, with conventional filtered back projection (FBP) in the imaging of liver hemangioma. Materials and Methods: Sixty-four projection data sets were acquired with a dual-head gamma camera for 28 lesions in 24 patients with cavernous hemangioma of the liver, and these raw data were transferred to a LINUX-based personal computer. After replacing the header file with an Interfile header, OSEM was performed under various combinations of subsets (1, 2, 4, 8, 16, and 32) and iteration numbers (1, 2, 4, 8, and 16) to find the best setting for liver imaging; the best condition in our investigation was considered to be 4 iterations and 16 subsets. All images were then processed by both FBP and OSEM, and three experts reviewed them blindly. Results: According to the blind review of 28 lesions, OSEM images showed the same or better image quality than FBP in nearly all cases. Although there was no significant difference in the detection of large lesions (more than 3 cm), 5 lesions with diameters of 1.5 to 3 cm were detected by OSEM only. However, both techniques failed to depict 4 small lesions less than 1.5 cm. Conclusion: OSEM provided better contrast and delineation in the depiction of liver hemangioma as well as higher sensitivity in the detection of small lesions. Furthermore, this reconstruction method does not require a high-performance computer system or long reconstruction times; therefore, OSEM appears to be a good method that can be applied to RBC blood pool SPECT for the diagnosis of liver hemangioma.
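
For readers unfamiliar with the reconstruction method compared above, the following Python sketch implements a generic ordered subset expectation maximization (OSEM) update on a toy emission problem with a random system matrix; the geometry and data are invented, and only the subset/iteration counts (16 subsets, 4 iterations) echo the setting the abstract found best.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy emission problem: x is the unknown activity (n_pix), A maps it to projections (n_proj).
# The random geometry is purely illustrative; a real SPECT system matrix encodes
# detector/collimator geometry and attenuation.
n_pix, n_proj = 64, 256
A = rng.uniform(0.0, 1.0, size=(n_proj, n_pix))
x_true = rng.uniform(0.5, 2.0, size=n_pix)
y = rng.poisson(A @ x_true).astype(float)          # noisy projection data

def osem(y, A, n_subsets=16, n_iter=4):
    """Ordered-subset EM: each sub-iteration uses only one block of projections.
    n_subsets=16, n_iter=4 mirror the abstract's best setting, but the optimum is
    data- and geometry-dependent."""
    x = np.ones(A.shape[1])
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iter):
        for s in subsets:
            As, ys = A[s], y[s]
            ratio = ys / np.clip(As @ x, 1e-12, None)
            x *= (As.T @ ratio) / np.clip(As.T @ np.ones(len(s)), 1e-12, None)
    return x

x_hat = osem(y, A)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```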

Infrared Image Segmentation by Extracting and Merging Region of Interest (관심영역 추출과 통합에 의한 적외선 영상 분할)

  • Yeom, Seokwon
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.26 no.6
    • /
    • pp.493-497
    • /
    • 2016
  • Infrared (IR) imaging can detect targets that are not visible at night, so it has been widely used in security and defense systems. However, the quality of IR images is often degraded by low resolution and noise corruption. This paper addresses target segmentation in IR images. Multiple regions of interest (ROIs) are extracted by multi-level segmentation, and targets are segmented from the individual ROIs. Each level of the multi-level segmentation is composed of a k-means clustering algorithm, an expectation-maximization (EM) algorithm, and a decision process. The k-means clustering algorithm initializes the parameters of the Gaussian mixture model (GMM), and the EM algorithm iteratively estimates those parameters. Each pixel is assigned to one of the clusters in the decision process. This paper proposes the selection and merging of the extracted ROIs; ROIs are selectively merged so as to cover overlapping ROI windows. In the experiments, the proposed method is tested on an IR image capturing two pedestrians at night, and its performance is compared with that of conventional methods, showing that the proposed method outperforms the others.
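
The pipeline described above (k-means initialization, EM refinement of a GMM, then a per-pixel decision) can be sketched in Python with scikit-learn as below; the synthetic "ROI" array, the choice of two components, and the assumption that the warmer component is the target are illustrative stand-ins, not the paper's data or its full multi-level procedure.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)

# Synthetic stand-in for one ROI of a night-time IR frame: warm "pedestrian" pixels
# on a cooler background (the paper's images are not reproduced here).
roi = rng.normal(80, 8, size=(60, 40))
roi[20:50, 15:25] = rng.normal(140, 10, size=(30, 10))
pixels = roi.reshape(-1, 1)

# k-means provides the initial means for the Gaussian mixture model, and EM then
# refines the means, variances, and weights, as described in the abstract.
k = 2
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pixels)
gmm = GaussianMixture(n_components=k, means_init=km.cluster_centers_,
                      random_state=0).fit(pixels)

# Decision step: each pixel is assigned to the component with the highest posterior.
labels = gmm.predict(pixels).reshape(roi.shape)
target_label = int(np.argmax(gmm.means_.ravel()))   # assume the warmer component is the target
print("segmented target pixels:", int((labels == target_label).sum()))
```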

Comparison of Three Parameter Estimation Methods for Mixture Distributions (혼합분포모형의 매개변수 추정방법 비교)

  • Shin, Ju-Young;Kim, Sooyoung;Kim, Taereem;Heo, Jun-Haeng
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2017.05a
    • /
    • pp.45-45
    • /
    • 2017
  • Data generated by different natural phenomena sometimes have statistically distinct characteristics, and such data can be assumed to have arisen from two or more different populations. Conventional distribution models, developed under the assumption that the data come from a single population, cannot adequately represent data of this kind, so mixture distribution models were developed to model data originating from distinct populations. Because extreme events such as floods and droughts arise from various natural processes, applying a mixture distribution model allows more accurate simulation. A mixture distribution model is constructed as a weighted sum of two or more non-mixture distributions. Owing to this form, it is not easy to estimate its parameters with the methods widely used for conventional distribution models, such as the maximum likelihood method, the method of moments, and the probability weighted moment method; instead, the Expectation-Maximization (EM) algorithm, the Meta-Heuristic Maximum Likelihood (MHML) method, and the Markov Chain Monte Carlo (MCMC) method are applied. To date, however, the characteristics of these parameter estimation methods have not been studied for the extreme-value data used in water resources when modeled with mixture distributions. In this study, we compared and analyzed the characteristics of the mixture distribution parameter estimation methods (the EM algorithm, the MHML method, and the MCMC method) using annual maximum rainfall data for Korea, with a Gumbel-Gumbel mixture distribution as the model. The results of this study are expected to serve as useful basic material for future studies that use mixture distribution models.
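
A minimal sketch of one of the three estimation methods compared above, the EM algorithm for a two-component Gumbel-Gumbel mixture, is given below in Python; the sample is synthetic rather than Korean annual maximum rainfall, and because the Gumbel M-step has no closed form, each component is refit by weighted numerical maximum likelihood.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(4)

# Synthetic annual-maximum-like sample from a two-component Gumbel mixture
# (illustrative values only, not the paper's rainfall data).
x = np.concatenate([stats.gumbel_r.rvs(loc=60, scale=15, size=120, random_state=rng),
                    stats.gumbel_r.rvs(loc=130, scale=25, size=80, random_state=rng)])

def em_gumbel_mixture(x, n_iter=50):
    """EM for a two-component Gumbel mixture. The M-step has no closed form for
    Gumbel parameters, so each component is refit by weighted maximum likelihood."""
    w = 0.5
    p1 = (np.quantile(x, 0.25), x.std() / 2)
    p2 = (np.quantile(x, 0.75), x.std() / 2)
    for _ in range(n_iter):
        # E-step: posterior probability that each observation belongs to component 1
        f1 = w * stats.gumbel_r.pdf(x, *p1)
        f2 = (1 - w) * stats.gumbel_r.pdf(x, *p2)
        r = f1 / (f1 + f2)
        # M-step: update the weight and refit each component by weighted MLE
        w = r.mean()
        def wmle(weights, start):
            nll = lambda p: -np.sum(weights * stats.gumbel_r.logpdf(x, p[0], abs(p[1])))
            res = optimize.minimize(nll, start, method="Nelder-Mead")
            return res.x[0], abs(res.x[1])
        p1, p2 = wmle(r, p1), wmle(1 - r, p2)
    return w, p1, p2

print(em_gumbel_mixture(x))
```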

Newly-designed adaptive non-blind deconvolution with structural similarity index in single-photon emission computed tomography

  • Kyuseok Kim;Youngjin Lee
    • Nuclear Engineering and Technology
    • /
    • v.55 no.12
    • /
    • pp.4591-4596
    • /
    • 2023
  • Single-photon emission computed tomography (SPECT) image reconstruction methods have a significant influence on image quality, with filtered back projection (FBP) and ordered subset expectation maximization (OSEM) being the most commonly used methods. In this study, we proposed a newly designed adaptive non-blind deconvolution with a structural similarity (SSIM) index that can take advantage of both the FBP and OSEM image reconstruction methods. After acquiring brain SPECT images, the proposed image was obtained using an algorithm that applies the SSIM metric, defined by predicting the distribution and amount of blurring. In the contrast-to-noise ratio (CNR) and coefficient of variation (COV) evaluations, the image produced by the proposed algorithm showed a spatial-resolution trend similar to that of FBP while obtaining values similar to those of OSEM. In addition, we confirmed that the CNR and COV values of the proposed algorithm improved by approximately 1.69 and 1.59 times, respectively, compared with those of an algorithm using an inappropriate deblurring process. In summary, we proposed a new type of algorithm that combines the advantages of SPECT image reconstruction techniques and is expected to be applicable in various fields.
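
The sketch below is one possible reading of the idea in the entry above: non-blind (Richardson-Lucy) deconvolution combined with an SSIM-based choice among candidate blur levels. It is not the authors' algorithm; the synthetic image, the candidate sigma values, and the use of SSIM between the re-blurred estimate and the observed image are all assumptions made for illustration.

```python
import numpy as np
from scipy.signal import fftconvolve
from skimage.restoration import richardson_lucy
from skimage.metrics import structural_similarity

rng = np.random.default_rng(5)

def gaussian_psf(sigma, size=15):
    """Normalized 2-D Gaussian point-spread function."""
    ax = np.arange(size) - size // 2
    g = np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2 * sigma ** 2))
    return g / g.sum()

# Synthetic stand-in for a SPECT slice: a simple object, blurred and noise-corrupted.
truth = np.zeros((96, 96))
truth[30:60, 40:70] = 1.0
observed = fftconvolve(truth, gaussian_psf(2.0), mode="same")
observed = np.clip(observed + rng.normal(scale=0.02, size=observed.shape), 0, 1)

# Rough interpretation of the paper's idea: deconvolve with several candidate blur
# levels and use SSIM (here, of the re-blurred estimate against the observation)
# to pick one. This is illustrative, not the authors' exact adaptive scheme.
best = None
for sigma in (1.0, 1.5, 2.0, 2.5, 3.0):
    psf = gaussian_psf(sigma)
    estimate = richardson_lucy(observed, psf, 30)       # non-blind deconvolution
    reblurred = fftconvolve(estimate, psf, mode="same")
    score = structural_similarity(reblurred, observed, data_range=1.0)
    if best is None or score > best[0]:
        best = (score, sigma, estimate)

print(f"selected blur sigma = {best[1]} (SSIM = {best[0]:.3f})")
```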

An estimation method for non-response model using Monte-Carlo expectation-maximization algorithm (Monte-Carlo expectation-maximaization 방법을 이용한 무응답 모형 추정방법)

  • Choi, Boseung;You, Hyeon Sang;Yoon, Yong Hwa
    • Journal of the Korean Data and Information Science Society
    • /
    • v.27 no.3
    • /
    • pp.587-598
    • /
    • 2016
  • In predicting the outcome of an election using a variety of methods ahead of the election, non-response is one of the major issues. A variety of non-response imputation methods may be employed to address it, but the forecasting results tend to vary with the method. In this study, in order to improve electoral forecasts, we studied a model-based method of non-response imputation that applies the Monte Carlo Expectation Maximization (MCEM) algorithm introduced by Wei and Tanner (1990). The MCEM algorithm using maximum likelihood estimates (MLEs) is applied to solve the boundary solution problem under a non-ignorable non-response mechanism. We performed simulation studies to compare estimation performance among the MCEM, maximum likelihood, and Bayesian estimation methods. The results of the simulation studies showed that the MCEM method can be a reasonable candidate for non-response model estimation. We also applied the MCEM method to the Korean presidential election exit poll data of 2012 and investigated prediction performance using the modified within-precinct error (MWPE) criterion (Bautista et al., 2007).
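
To illustrate the MCEM machinery cited above (Wei and Tanner, 1990) without reproducing the authors' non-response model, the Python sketch below applies MCEM to a much simpler toy problem: a normal sample whose values above a known cut-off are unobserved. The model, cut-off, and Monte Carlo sizes are assumptions made for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Toy missing-data problem (not the authors' exit-poll model): every value above a
# known cut-off c is unobserved; only the count of such cases is known.
mu_true, sigma_true, c, n = 5.0, 2.0, 6.0, 2000
z = rng.normal(mu_true, sigma_true, size=n)
observed = z[z <= c]
n_missing = int((z > c).sum())

mu, sigma = observed.mean(), observed.std()     # crude starting values
for _ in range(50):
    # Monte Carlo E-step: impute the unobserved values with draws from the current
    # conditional distribution (a normal truncated below at c), in the spirit of MCEM.
    a = (c - mu) / sigma
    imputed = stats.truncnorm.rvs(a, np.inf, loc=mu, scale=sigma,
                                  size=(50, n_missing), random_state=rng)
    completed = np.concatenate([np.tile(observed, (50, 1)), imputed], axis=1)
    # M-step: closed-form normal MLE over the pooled Monte Carlo completions.
    mu = completed.mean()
    sigma = completed.std()

print("MCEM estimates:", round(mu, 3), round(sigma, 3))
```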

ICAIM: An Improved CAIM Algorithm for Knowledge Discovery

  • Yaowapanee, Piriya;Pinngern, Ouen
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2004.08a
    • /
    • pp.2029-2032
    • /
    • 2004
  • The quantity of data has increased rapidly in recent years, causing data overload and making it difficult to find the required data, so methods of eliminating redundant data are needed. One of the efficient methods is Knowledge Discovery in Databases (KDD). Generally, data can be separated into two types: continuous data and discrete data. This paper describes an algorithm that transforms continuous attributes into discrete ones. We present an Improved Class-Attribute Interdependence Maximization (ICAIM) algorithm, designed to work with supervised data, for the discretization process. The algorithm does not require the user to predefine the number of intervals. ICAIM improves CAIM by using a significance test to determine which intervals should be merged into one. Our goal is to generate a minimal number of discrete intervals and improve classification accuracy. We used the iris plant dataset (IRIS) to test this algorithm and compared it with the CAIM algorithm.
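
As background for the entry above, the sketch below computes the CAIM criterion (the class-attribute interdependence maximization measure of Kurgan and Cios) for a candidate discretization and performs one greedy cut-point selection; the tiny dataset is invented, and the ICAIM significance test described in the paper is not implemented.

```python
import numpy as np

def caim_value(values, labels, boundaries):
    """CAIM criterion of a discretization scheme: the mean over intervals of
    (max class count in the interval)^2 / (total count in the interval)."""
    edges = np.concatenate(([-np.inf], np.sort(boundaries), [np.inf]))
    score, classes = 0.0, np.unique(labels)
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_interval = (values > lo) & (values <= hi)
        total = in_interval.sum()
        if total == 0:
            continue
        counts = [(labels[in_interval] == c).sum() for c in classes]
        score += max(counts) ** 2 / total
    return score / (len(edges) - 1)

# Tiny illustrative example (not the IRIS experiment from the paper):
# one continuous attribute with two classes.
values = np.array([0.5, 0.9, 1.2, 1.8, 2.3, 2.9, 3.4, 3.9, 4.4, 5.0])
labels = np.array([0,   0,   0,   0,   1,   1,   1,   1,   1,   1])

# Greedy step shared by the CAIM family: among candidate cut points (midpoints
# between sorted values), keep the one that maximizes the criterion.
candidates = (values[:-1] + values[1:]) / 2
best = max(candidates, key=lambda b: caim_value(values, labels, [b]))
print("best single cut point:", best)
```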

The Minimization of Tolerance Cost and Quality Loss Cost by the Statistical Tolerance Allocation Method (Statistical Tolerance Allocation을 이용한 제조비용과 품질손실비용의 최소화)

  • Kim, Sunn-Ho;Kwon, Yong-Sung;Lee, Byong-Ki;Kang, Kyung-Sik
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.24 no.2
    • /
    • pp.175-183
    • /
    • 1998
  • When a product is designed, tolerances must be assigned to it so that the required functions are guaranteed and production costs are minimized. In this research, a model is suggested that allocates tolerances to components optimally according to the STA (Statistical Tolerance Allocation) method. Taking into account the fact that dimensional errors follow statistical distributions, this model presents a discrete pseudo-Boolean approach to tolerance optimization that minimizes the tolerance cost and the quality loss cost. In this approach, two methods are proposed for reducing the problem scale: 1) a method for converting the cost minimization model into a cost-savings maximization model, and 2) procedures to reduce the number of constraints and variables.
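
The trade-off the paper optimizes, manufacturing (tolerance) cost against quality loss under a statistical stack-up, can be illustrated with the brute-force Python sketch below; the discrete tolerance alternatives, costs, and Taguchi-style loss coefficient are hypothetical, and the paper's pseudo-Boolean formulation and scale-reduction methods are not reproduced.

```python
import itertools

# Hypothetical discrete tolerance alternatives (mm) and their manufacturing costs for
# three components of an assembly; the numbers are illustrative, not the paper's data.
alternatives = [
    [(0.01, 12.0), (0.02, 7.0), (0.05, 3.0)],
    [(0.02, 10.0), (0.04, 6.0), (0.08, 2.5)],
    [(0.01, 15.0), (0.03, 8.0), (0.06, 4.0)],
]
k_loss = 4000.0    # assumed quadratic (Taguchi-style) quality-loss coefficient

def total_cost(choice):
    """Manufacturing cost plus quadratic quality loss under a statistical (RSS) stack-up,
    treating each tolerance as 3 standard deviations of the component dimension."""
    tol_cost = sum(cost for _, cost in choice)
    assembly_var = sum((t / 3.0) ** 2 for t, _ in choice)
    return tol_cost + k_loss * assembly_var

best = min(itertools.product(*alternatives), key=total_cost)
print("selected tolerances:", [t for t, _ in best])
print("total cost:", round(total_cost(best), 2))
```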

Hybrid Artificial Immune System Approach for Profit Based Unit Commitment Problem

  • Lakshmi, K.;Vasantharathna, S.
    • Journal of Electrical Engineering and Technology
    • /
    • v.8 no.5
    • /
    • pp.959-968
    • /
    • 2013
  • This paper presents a new approach based on an artificial immune system algorithm to solve the profit-based unit commitment problem. The objective of this work is to find the optimal generation schedule and to maximize the profit of generation companies (Gencos) when subjected to various constraints such as power balance, spinning reserve, minimum up/down times, and ramp rate limits. The proposed hybrid method is developed through an adaptive search inspired by the artificial immune system and the genetic algorithm to carry out profit maximization for generation companies. The effectiveness of the proposed approach has been tested for different Gencos consisting of 3, 10, and 36 generating units, and the results are compared with those of existing methods.
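
As a toy companion to the entry above, the Python sketch below evaluates the profit of a price-taking Genco with linear fuel costs hour by hour; the unit data and price forecast are assumed, and the constraints the paper handles (spinning reserve, minimum up/down times, ramp limits) as well as the artificial immune system search itself are deliberately omitted.

```python
# Minimal profit evaluation for a price-taking Genco with linear fuel costs a_i + b_i * P.
# Unit data and hourly price forecast are assumptions for illustration only.
units = [  # (Pmin MW, Pmax MW, a $/h, b $/MWh)
    (50, 200, 400.0, 18.0),
    (30, 150, 250.0, 22.0),
    (20, 100, 150.0, 30.0),
]
prices = [21.0, 26.0, 33.0, 19.0]   # assumed forecast market prices per hour ($/MWh)

schedule, profit = [], 0.0
for price in prices:
    hour = []
    for pmin, pmax, a, b in units:
        # With a linear cost, the best output is Pmax whenever the price exceeds
        # the marginal cost b; otherwise the unit runs at Pmin if committed at all.
        p = pmax if price > b else pmin
        unit_profit = (price - b) * p - a
        if unit_profit > 0:          # commit only if the hour is profitable on its own
            hour.append(p)
            profit += unit_profit
        else:
            hour.append(0)
    schedule.append(hour)

print("dispatch per hour:", schedule)
print("total profit ($):", round(profit, 2))
```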