• Title/Summary/Keyword: R-SVD

Search Results: 19

Quantification of Cerebral Blood Flow Measurements by Magnetic Resonance Imaging Bolus Tracking

  • Park Byung-Rae
    • Biomedical Science Letters
    • /
    • v.11 no.2
    • /
    • pp.129-134
    • /
    • 2005
  • Three different deconvolution techniques for quantifying cerebral blood flow (CBF) from whole-brain $T_2^*$-weighted bolus tracking images were implemented: parametric Fourier transform (P-FT), parametric singular value decomposition (P-SVD), and nonparametric singular value decomposition (NP-SVD). The techniques were tested on 206 regions from 38 hyperacute stroke patients. In the P-FT and P-SVD techniques, the tissue and arterial concentration time curves were fit to a gamma variate function, and the resulting CBF values correlated very well ($CBF_{P\text{-}FT} = 1.02 \cdot CBF_{P\text{-}SVD}$, $r^2 = 0.96$). The NP-SVD CBF values correlated well with the P-FT CBF values only when a sufficient number of time series volumes were acquired to minimize tracer time curve truncation ($CBF_{P\text{-}FT} = 0.92 \cdot CBF_{NP\text{-}SVD}$, $r^2 = 0.88$). The correlation between the fitted and unfitted CBV values was also maximized in regions with minimal tracer time curve truncation ($CBV_{fit} = 1.00 \cdot CBV_{unfit}$, $r^2 = 0.89$). When a sufficient number of time series volumes could not be acquired (due to scanner limitations) to avoid tracer time curve truncation, the P-FT and P-SVD techniques gave more reliable CBF estimates than the NP-SVD technique.

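The deconvolution step the abstract describes can be sketched as follows — a minimal NP-SVD-style example with assumed toy data (an exponential arterial input function and residue curve, noise-free), not the paper's clinical pipeline. The tissue curve obeys $c = \Delta t \, A r$, where $A$ is the lower-triangular Toeplitz matrix built from the AIF and $r(t) = CBF \cdot R(t)$; small singular values are truncated to stabilize the inversion, and CBF is read off as $\max r$.

```python
import numpy as np

def svd_deconvolve(aif, tissue, dt, rel_threshold=1e-6):
    n = len(aif)
    # Lower-triangular Toeplitz convolution matrix: A[i, j] = dt * aif[i - j]
    A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                       for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    # Truncate small singular values to stabilize the inversion
    s_inv = np.where(s > rel_threshold * s[0], 1.0 / s, 0.0)
    r = Vt.T @ (s_inv * (U.T @ tissue))
    return r.max()  # CBF estimate

dt = 1.0
t = np.arange(30) * dt
aif = np.exp(-t / 3.0)                   # toy arterial input function
cbf_true = 0.6
residue = cbf_true * np.exp(-t / 4.0)    # CBF * residue function
tissue = dt * np.convolve(aif, residue)[:t.size]
cbf_est = svd_deconvolve(aif, tissue, dt)
```

With noise-free data the truncation threshold is inactive and the true CBF is recovered; with noisy clinical data the threshold is what distinguishes the SVD approach from a naive matrix inverse.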

Missing Data Modeling based on Matrix Factorization of Implicit Feedback Dataset (암시적 피드백 데이터의 행렬 분해 기반 누락 데이터 모델링)

  • Ji, JiaQi;Chung, Yeongjee
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.23 no.5
    • /
    • pp.495-507
    • /
    • 2019
  • Data sparsity is one of the main challenges for recommender systems. A recommender system contains massive data of which only a small part is observed; the rest is missing. Most studies assume that data are missing at random from the dataset and therefore use only the observed data to train the recommendation model before recommending items to users. In practice, however, data are not missing at random. In our research, we treat these missing data as negative examples of users' interest. Three sampling methods are seamlessly integrated into the SVD++ algorithm, yielding the proposed SVD++_W, SVD++_R, and SVD++_KNN algorithms. Experimental results show that the proposed sampling methods effectively improve the precision of Top-N recommendation over the baseline algorithms. Among the three improved algorithms, SVD++_KNN performs best, which shows that the KNN sampling method is a more effective way to extract negative examples of users' interest.
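The negative-sampling idea can be sketched with a plain matrix factorization model (an assumed simplification, not the paper's exact SVD++ variants): observed interactions are positives (label 1), and a random sample of the missing entries is treated as negatives (label 0) instead of being ignored during training.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, k = 20, 15, 4
R = (rng.random((n_users, n_items)) < 0.2).astype(float)  # implicit feedback

P = 0.1 * rng.standard_normal((n_users, k))   # user factors
Q = 0.1 * rng.standard_normal((n_items, k))   # item factors
lr, reg = 0.05, 0.01

pos = list(zip(*np.nonzero(R == 1)))
miss = list(zip(*np.nonzero(R == 0)))

for epoch in range(50):
    # Sample as many negatives from the missing entries as there are positives
    neg = [miss[j] for j in rng.choice(len(miss), size=len(pos), replace=False)]
    for (u, i), y in [(p, 1.0) for p in pos] + [(m, 0.0) for m in neg]:
        err = y - P[u] @ Q[i]
        P[u] += lr * (err * Q[i] - reg * P[u])
        Q[i] += lr * (err * P[u] - reg * Q[i])

score_pos = np.mean([P[u] @ Q[i] for u, i in pos])
score_miss = np.mean([P[u] @ Q[i] for u, i in miss])
```

After training, observed items score higher on average than missing ones, which is the separation that drives Top-N precision.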

Study on Volume Measurement of Cerebral Infarct using SVD and the Bayesian Algorithm (SVD와 Bayesian 알고리즘을 이용한 뇌경색 부피 측정에 관한 연구)

  • Kim, Do-Hun;Lee, Hyo-Young
    • Journal of the Korean Society of Radiology
    • /
    • v.15 no.5
    • /
    • pp.591-602
    • /
    • 2021
  • Acute ischemic stroke (AIS) should be diagnosed within a few hours of the onset of cerebral infarction symptoms using diagnostic radiology. In this study, we evaluated the clinical usefulness of SVD and the Bayesian algorithm for measuring the volume of cerebral infarction using computed tomography perfusion (CTP) imaging and magnetic resonance diffusion-weighted imaging (MR DWI). We retrospectively included 50 patients (male : female = 33 : 17) who visited the emergency department with symptoms of AIS from September 2017 to September 2020. The cerebral infarct volume measured by SVD and the Bayesian algorithm was analyzed using the Wilcoxon signed rank test and expressed as a median value and an interquartile range of 25-75%. The core volume measured by SVD and the Bayesian algorithm using CTP imaging was 18.07 (7.76-33.98) cc and 47.3 (23.76-79.11) cc, respectively, while the penumbra volume was 140.24 (117.8-176.89) cc and 105.05 (72.52-141.98) cc, respectively. The mismatch ratio was 7.56% (4.36-15.26%) and 2.08% (1.68-2.77%) for SVD and the Bayesian algorithm, respectively, and all the measured values had statistically significant differences (p < 0.05). Spearman's correlation analysis showed that the correlation between the cerebral infarct volumes measured with CTP imaging and MR DWI was higher for the Bayesian algorithm than for SVD (r = 0.915 vs. r = 0.763; p < 0.01). Furthermore, Bland-Altman plot analysis showed that the slope of the scatter plot of CTP-derived versus MR DWI-derived infarct volume was closer to zero for the Bayesian algorithm than for SVD (y = -0.065 vs. y = -0.749), indicating that the Bayesian algorithm was more reliable. In conclusion, the Bayesian algorithm is more accurate than SVD in measuring cerebral infarct volume and can therefore be clinically useful.

Large Solvent and Noise Peak Suppression by Combined SVD-Harr Wavelet Transform

  • Kim, Dae-Sung;Kim, Dai-Gyoung;Lee, Yong-Woo;Won, Ho-Shik
    • Bulletin of the Korean Chemical Society
    • /
    • v.24 no.7
    • /
    • pp.971-974
    • /
    • 2003
  • By utilizing singular value decomposition (SVD) and a shift-averaged Haar wavelet transform (WT) with the set of Daubechies wavelet coefficients (1/2, -1/2), a method that can simultaneously eliminate an unwanted large solvent peak and noise peaks from NMR data has been developed. Noise elimination was accomplished by shift-averaging the time-domain NMR data after the large solvent peak was suppressed by SVD. The algorithm takes advantage of the WT: it gives excellent noise elimination in the Gaussian-type spectral lines of NMR data pretreated with SVD, and superb results in the adjustment of the phase and magnitude of the spectrum. The SVD and shift-averaged Haar wavelet methods were quantitatively evaluated in terms of threshold values and signal-to-noise (S/N) ratio values.
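The shift-averaged Haar denoising step can be sketched as follows, on assumed toy data (a Gaussian "NMR" line plus white noise); the SVD solvent-suppression stage is omitted for brevity. One Haar level splits the signal into (1/2, 1/2) averages and (1/2, -1/2) differences, thresholds the differences, reconstructs, and averages the result over a one-sample shift.

```python
import numpy as np

def haar_denoise(x, thresh):
    a = 0.5 * (x[0::2] + x[1::2])              # approximation coefficients
    d = 0.5 * (x[0::2] - x[1::2])              # detail coefficients
    d = np.where(np.abs(d) > thresh, d, 0.0)   # kill small (noise) details
    y = np.empty_like(x)
    y[0::2] = a + d
    y[1::2] = a - d
    return y

def shift_averaged_denoise(x, thresh):
    y0 = haar_denoise(x, thresh)
    y1 = np.roll(haar_denoise(np.roll(x, 1), thresh), -1)
    return 0.5 * (y0 + y1)                     # average over the two shifts

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 256)
clean = np.exp(-0.5 * ((t - 0.5) / 0.02) ** 2)     # Gaussian spectral line
noisy = clean + 0.05 * rng.standard_normal(t.size)
denoised = shift_averaged_denoise(noisy, thresh=0.1)

mse_noisy = np.mean((noisy - clean) ** 2)
mse_denoised = np.mean((denoised - clean) ** 2)
```

Shift-averaging (cycle spinning) suppresses the blocking artifacts that a single Haar decomposition would otherwise leave at even/odd pair boundaries.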

Analysis of Characteristics of Satellite-derived Air Pollutant over Southeast Asia and Evaluation of Tropospheric Ozone using Statistical Methods (통계적 방법을 이용한 동남아시아지역 위성 대기오염물질 분석과 검증)

  • Baek, K.H.;Kim, Jae-Hwan
    • Journal of Korean Society for Atmospheric Environment
    • /
    • v.27 no.6
    • /
    • pp.650-662
    • /
    • 2011
  • Statistical tools such as the empirical orthogonal function (EOF) and singular value decomposition (SVD) were applied to analyze the characteristics of air pollutants over Southeast Asia and to evaluate Ziemke's tropospheric column ozone (ZTO), determined by the tropospheric residual method. In this study, we found that EOF and SVD analyses are useful methods for extracting the most significant temporal and spatial patterns from enormous amounts of satellite data. EOF analyses of OMI $NO_2$ and OMI HCHO over Southeast Asia revealed that their spatial patterns correlated highly with fire counts (r=0.8) and with the EOF analysis of CO (r=0.7), suggesting that biomass burning drives the major seasonal variability of $NO_2$ and HCHO over this region. The EOF analysis of ZTO indicated that the location of maximum ZTO was shifted considerably westward from the location of the maximum fire count, and that the month of maximum ZTO occurred one month later than the maximum month (March) of $NO_2$, HCHO, and CO. For further analysis, we performed SVD analyses between ZTO and its ozone precursors to examine their correlation and to check the temporal and spatial consistency between the two variables. The spatial pattern of ZTO showed a latitudinal gradient that could result from the latitudinal gradient of stratospheric ozone, and the temporal maximum of ZTO in March appears to be associated with stratospheric ozone variability, which also peaks in March. These results suggest that the tropospheric residual method has sources of error associated with cloud-height errors, low retrieval efficiency for tropospheric ozone, and low accuracy in lower-stratospheric ozone.
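EOF analysis of this kind is simply the SVD of a space-time anomaly matrix; the sketch below uses an assumed synthetic dataset (one dominant spatial mode with an annual-cycle amplitude plus noise), not satellite retrievals. Rows are time steps and columns are grid points; right singular vectors are the spatial EOFs, left singular vectors the principal-component time series, and squared singular values give the variance explained by each mode.

```python
import numpy as np

rng = np.random.default_rng(2)
n_time, n_space = 48, 100
months = np.arange(n_time)
pattern = np.sin(np.linspace(0.0, np.pi, n_space))        # spatial mode
pc_true = np.cos(2.0 * np.pi * months / 12.0)             # annual cycle
X = np.outer(pc_true, pattern) + 0.1 * rng.standard_normal((n_time, n_space))

A = X - X.mean(axis=0)                 # anomalies: remove the time mean
U, s, Vt = np.linalg.svd(A, full_matrices=False)
explained = s**2 / np.sum(s**2)        # variance fraction per EOF mode
eof1, pc1 = Vt[0], U[:, 0] * s[0]      # leading spatial pattern and PC
```

The leading mode recovers the planted spatial pattern (up to sign, since EOFs are defined only up to an overall sign) and dominates the explained variance.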

ANALYSIS OF EIGEN VALUES FOR EFFECTIVE CHOICE OF SNAPSHOT DATA IN PROPER ORTHOGONAL DECOMPOSITION (적합직교분해 기법에서의 효율적인 스냅샷 선정을 위한 고유값 분석)

  • Kang, H.M.;Jun, S.O.;Yee, K.
    • Journal of computational fluids engineering
    • /
    • v.22 no.1
    • /
    • pp.59-66
    • /
    • 2017
  • A guideline for selecting the number of snapshot datasets, $N_s$, in proper orthogonal decomposition (POD) is presented via an analysis of the eigenvalues obtained from the singular value decomposition (SVD). In POD, snapshot datasets from the solutions of the Euler or Navier-Stokes equations are fed into the SVD, and a reduced order model (ROM) is constructed as a combination of eigenvectors. The ROM is subsequently applied to reconstruct flowfield data for new flow conditions, thereby enhancing computational efficiency. The overall computational efficiency and accuracy of POD depend on the number of snapshot datasets; however, there is no reliable guideline for determining $N_s$. To resolve this problem, the order of the maximum-to-minimum eigenvalue ratio, O(R), from the SVD was analyzed and presented for the choice of $N_s$: for steady flow, $N_s$ should be chosen so that O(R) is $10^9$; for unsteady flow, $N_s$ should be increased so that O(R) is $10^{11{\sim}12}$. This strategy for selecting the snapshot dataset was applied to two-dimensional NACA0012 airfoil and vortex flow problems, including steady and unsteady cases, and the numerical accuracy as a function of $N_s$ and O(R) is discussed.
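The criterion above can be sketched with an assumed toy snapshot set (a few smooth modes rather than a real flow solver): stack the snapshots as columns, take the SVD, and inspect the order of the maximum-to-minimum ratio, $O(R) = \log_{10}(s_{\max}/s_{\min})$, to judge whether $N_s$ is sufficient.

```python
import numpy as np

rng = np.random.default_rng(3)
n_dof, n_snap = 200, 12
x = np.linspace(0.0, 1.0, n_dof)
# Toy "flowfield": three smooth modes whose amplitudes vary with a parameter mu
snapshots = np.column_stack([
    np.sin(np.pi * x)
    + 0.3 * mu * np.sin(2 * np.pi * x)
    + 0.05 * mu**2 * np.sin(3 * np.pi * x)
    for mu in np.linspace(0.5, 1.5, n_snap)
])
snapshots += 1e-10 * rng.standard_normal(snapshots.shape)  # solver-level noise

s = np.linalg.svd(snapshots, compute_uv=False)
order_R = np.log10(s[0] / s[-1])   # order of the eigenvalue spread, O(R)
```

Because the toy snapshots are essentially rank-3, the trailing singular values sit at the noise floor and O(R) lands in the $10^{10}$-$10^{11}$ range, the regime the paper associates with a sufficiently rich snapshot set.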

Dynamic Modeling of Eigenbackground for Object Tracking (객체 추적을 위한 고유 배경의 동적 모델링)

  • Kim, Sung-Young
    • Journal of the Korea Society of Computer and Information
    • /
    • v.17 no.4
    • /
    • pp.67-74
    • /
    • 2012
  • In this paper, we propose an efficient dynamic background modelling method that uses an eigenbackground to extract moving objects from a video stream. Even after a background model has been created, it has to be updated to adapt to changes due to factors such as weather or lighting. We update the background model based on the R-SVD method: we define a change ratio between images and update the model dynamically according to this value. An accurate eigenbackground also needs to be modelled from a sufficient number of training images, but we reorganize the input images to reduce the number required for training. Through simulation, we show that the proposed method improves performance over the traditional eigenbackground method without background updating and over a previous method.
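The underlying eigenbackground subtraction can be sketched as follows, on assumed toy frames; the paper's R-SVD incremental update is replaced here by simply recomputing the SVD, to keep the example short. Training frames are stacked as columns, the SVD gives the background subspace, and a new frame is classified per pixel by its distance to its projection onto that subspace.

```python
import numpy as np

rng = np.random.default_rng(4)
h, w, n_train = 8, 8, 10
bg = rng.random((h, w))                                   # static background
frames = [bg + 0.01 * rng.standard_normal((h, w)) for _ in range(n_train)]

F = np.column_stack([f.ravel() for f in frames])
mean = F.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(F - mean, full_matrices=False)
B = U[:, :3]                                              # eigenbackgrounds

new = bg.copy()
new[2:5, 2:5] += 0.8                                      # foreground object
x = new.ravel()[:, None] - mean
residual = x - B @ (B.T @ x)                              # distance to subspace
mask = (np.abs(residual).ravel() > 0.3).reshape(h, w)     # foreground mask
```

Pixels belonging to the object fall far from the background subspace and are flagged, while background pixels project almost perfectly and stay below the threshold.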

A Study on the Condition Monitoring for GIS Using SVD in an Attractor of Chaos Theory

  • J.S. Kang;Kim, C.H.;R.K. Aggarwal
    • KIEE International Transactions on Power Engineering
    • /
    • v.4A no.1
    • /
    • pp.33-41
    • /
    • 2004
  • Knowledge of partial discharge (PD) is important to accurately diagnose and predict the condition of insulation. The PD phenomenon is highly complex and seems random in its occurrence. This paper indicates the possible use of chaos theory for the recognition and discrimination of PD signals. Chaos refers to a state where the predictive ability over a system's future is lost and the system is rendered aperiodic. The analysis of PD using deterministic chaos comprises the study of the basic system dynamics of the PD phenomenon. This involves the construction of the PD attractor in state space. The simulation results show that the variance along an orthogonal axis of the chaos-theory attractor increases with the magnitude and number of PDs; however, it is difficult to clearly identify the characteristics of the PDs from this alone. We therefore calculated the magnitude along each orthogonal axis of the attractor using singular value decomposition (SVD) and principal component analysis (PCA) to extract numerical characteristics. In this paper, we propose a condition monitoring method for gas insulated switchgear (GIS) that uses SVD for efficient calculation of the variance. Thousands of simulations have proven the accuracy and effectiveness of the proposed algorithm.
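The attractor-variance computation can be sketched as follows, with assumed toy signals standing in for real PD waveforms: reconstruct a state-space attractor from a scalar signal by time-delay embedding, then use the SVD to measure the variance along each orthogonal axis of the attractor.

```python
import numpy as np

def attractor_axis_variances(x, dim=3, delay=5):
    # Time-delay embedding: rows are points [x(t), x(t+d), x(t+2d), ...]
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
    emb = emb - emb.mean(axis=0)
    s = np.linalg.svd(emb, compute_uv=False)
    return s**2 / n          # variance along each principal attractor axis

rng = np.random.default_rng(5)
t = np.arange(2000)
weak = 0.1 * np.sin(0.2 * t) + 0.01 * rng.standard_normal(t.size)
strong = 0.5 * np.sin(0.2 * t) + 0.01 * rng.standard_normal(t.size)
v_weak = attractor_axis_variances(weak)
v_strong = attractor_axis_variances(strong)
```

Stronger activity stretches the attractor, so the variance along its principal axes grows with signal magnitude, the behavior the paper exploits for condition monitoring.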

Mode-SVD-Based Maximum Likelihood Source Localization Using Subspace Approach

  • Park, Chee-Hyun;Hong, Kwang-Seok
    • ETRI Journal
    • /
    • v.34 no.5
    • /
    • pp.684-689
    • /
    • 2012
  • A mode-singular-value-decomposition (SVD) maximum likelihood (ML) estimation procedure is proposed for the source localization problem under an additive measurement error model. In practical situations, the noise variance is usually unknown; we therefore propose an algorithm that does not require the noise covariance matrix as a priori knowledge. In the proposed method, the weight in the ML criterion is derived as the inverse of the squared noise magnitude. The proposed method outperforms existing methods and approaches the Taylor-series ML estimator and the Cramér-Rao lower bound.
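As a stand-in for the mode-SVD ML estimator (whose details the abstract does not give), a linearized least-squares source localization from range measurements can be sketched; it is solved through the SVD-based pseudoinverse and, like the paper's method, needs no prior knowledge of the noise covariance. Anchor positions, the source location, and the noise level below are assumptions for illustration.

```python
import numpy as np

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
source = np.array([3.0, 7.0])
rng = np.random.default_rng(6)
ranges = np.linalg.norm(anchors - source, axis=1) + 0.01 * rng.standard_normal(4)

# Subtracting the first range equation removes the ||x||^2 term:
#   2 (a_i - a_1)^T x = r_1^2 - r_i^2 + ||a_i||^2 - ||a_1||^2
A = 2.0 * (anchors[1:] - anchors[0])
b = (ranges[0]**2 - ranges[1:]**2
     + np.sum(anchors[1:]**2, axis=1) - np.sum(anchors[0]**2))
estimate, *_ = np.linalg.lstsq(A, b, rcond=None)  # SVD-based solve
```

`np.linalg.lstsq` computes the minimum-norm least-squares solution via the SVD, which is what makes the solve robust when the anchor geometry is poorly conditioned.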

A Study of High Precision Position Estimator Using GPS/INS Sensor Fusion (GPS/INS센서 융합을 이용한 고 정밀 위치 추정에 관한 연구)

  • Lee, Jeongwhan;Kim, Hansil
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.49 no.11
    • /
    • pp.159-166
    • /
    • 2012
  • There are several ways, such as GPS (Global Positioning System) and INS (Inertial Navigation System), to track the location of a moving vehicle. The GPS has the advantage that its errors do not accumulate, although errors do occur; to obtain position information, it must receive signals from at least three satellites. Its weak point is that GPS is not usable when the satellite signal is weak or in incommunicable regions such as tunnels. The INS records position and attitude information of the vehicle, including velocity and direction, at data rates from several Hz to several hundred Hz. The INS shows very precise navigational performance over short periods, but it has the disadvantage that errors in the velocity components accumulate during integration over time. In this paper, a sensor fusion algorithm is applied to both INS and GPS position information to overcome these drawbacks. Experiments show that the proposed system obtains accurate position information using SVD even in terrain where GPS is inaccessible.
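The complementary error behavior described above can be sketched with a generic 1-D Kalman filter under assumed parameters (not the paper's SVD-based fusion): the filter dead-reckons with biased INS acceleration between fixes, so its drift is corrected whenever a noisy GPS position arrives, while an INS-only track accumulates error without bound.

```python
import numpy as np

rng = np.random.default_rng(7)
dt, n = 0.1, 200
true_pos = dt * np.arange(1, n + 1)        # constant 1 m/s motion

F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition for [pos, vel]
B = np.array([0.5 * dt**2, dt])            # acceleration input
Q = np.diag([1e-4, 1e-3])                  # process noise
H = np.array([[1.0, 0.0]])                 # GPS measures position only
R = np.array([[4.0]])                      # GPS noise variance (2 m std)

x, P = np.zeros(2), np.eye(2)              # fused estimate
ins = np.zeros(2)                          # INS-only dead reckoning
fused_err, ins_err = [], []
for k in range(n):
    accel = 0.05 + 0.02 * rng.standard_normal()  # biased INS reading (true a = 0)
    x = F @ x + B * accel                  # INS-driven prediction
    P = F @ P @ F.T + Q
    ins = F @ ins + B * accel              # uncorrected dead reckoning
    if k % 10 == 0:                        # GPS fix once per second
        z = true_pos[k] + 2.0 * rng.standard_normal()
        y = z - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
    fused_err.append(x[0] - true_pos[k])
    ins_err.append(ins[0] - true_pos[k])

rmse_fused = np.sqrt(np.mean(np.square(fused_err)))
rmse_ins = np.sqrt(np.mean(np.square(ins_err)))
```

The fused track stays near the GPS noise floor, while the INS-only track drifts quadratically from the acceleration bias, which is exactly why the two sensors are combined.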