• Title/Summary/Keyword: Matrix factorization

Study of Spectral Factorization using Circulant Matrix Factorization to Design the FIR/IIR Lattice Filters (FIR/IIR Lattice 필터의 설계를 위한 Circulant Matrix Factorization을 사용한 Spectral Factorization에 관한 연구)

  • Kim Sang-Tae;Park Jong-Won
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.7 no.3
    • /
    • pp.437-447
    • /
    • 2003
  • We propose methods to design finite impulse response (FIR) and infinite impulse response (IIR) lattice filters with the Schur algorithm, through spectral factorization of the covariance matrix by circulant matrix factorization (CMF). Circulant matrix factorization is a powerful tool for spectral factorization of the covariance polynomial in the matrix domain: it yields the minimum-phase polynomial without the polynomial root-finding problem. The Schur algorithm is a fast Cholesky factorization of a Toeplitz matrix, from which the lattice filter parameters are easily determined (see the sketch below). Examples for the FIR filter case and the IIR filter case are included, and the performance of our method is checked by comparing it with other methods (polynomial root finding and cepstral deconvolution).
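
The lattice filter parameters mentioned above are the reflection coefficients computed from the covariance (autocorrelation) sequence. As a rough illustration of that final step only, here is a minimal Python sketch using the Levinson-Durbin recursion, a classical alternative to the Schur recursion that also produces the reflection coefficients; it is not the paper's circulant-matrix spectral factorization, and the autocorrelation values are hypothetical.

    import numpy as np

    def reflection_coefficients(r):
        # Levinson-Durbin recursion: autocorrelation r[0..p] -> lattice (PARCOR) coefficients k_1..k_p
        p = len(r) - 1
        a = np.zeros(p + 1)      # prediction-error filter, a[0] = 1
        a[0] = 1.0
        E = r[0]                 # prediction-error power
        k = np.zeros(p)
        for m in range(1, p + 1):
            acc = r[m] + np.dot(a[1:m], r[m - 1:0:-1])
            km = -acc / E
            a_prev = a.copy()
            for i in range(1, m):
                a[i] = a_prev[i] + km * a_prev[m - i]
            a[m] = km
            k[m - 1] = km
            E *= 1.0 - km * km
        return k

    r = np.array([1.0, 0.5, 0.2, 0.05])    # hypothetical autocorrelation sequence
    print(reflection_coefficients(r))       # reflection coefficients of the lattice filter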

BLOCK INCOMPLETE FACTORIZATION PRECONDITIONERS FOR A SYMMETRIC H-MATRIX

  • Yun, Jae-Heon;Kim, Sang-Wook
    • Bulletin of the Korean Mathematical Society
    • /
    • v.37 no.3
    • /
    • pp.551-568
    • /
    • 2000
  • We propose new parallelizable block incomplete factorization preconditioners for a symmetric block-tridiagonal H-matrix. Theoretical properties of these block preconditioners are compared with those of block incomplete factorization preconditioners for the corresponding comparison matrix. Numerical results of the preconditioned CG (PCG) method using these block preconditioners are compared with those of the PCG method using a standard incomplete factorization preconditioner to show the effectiveness of the block incomplete factorization preconditioners.
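
As background on the iterative solver used in the comparison, the preconditioned conjugate gradient method has the generic form below. This is a minimal sketch with a simple diagonal (Jacobi) preconditioner standing in for the block incomplete factorization preconditioners studied in the paper, and the tridiagonal test matrix is a hypothetical example of a symmetric H-matrix.

    import numpy as np

    def pcg(A, b, M_solve, tol=1e-10, maxit=1000):
        # Preconditioned conjugate gradient for SPD A; M_solve(r) applies the preconditioner inverse.
        x = np.zeros_like(b)
        r = b - A @ x
        z = M_solve(r)
        p = z.copy()
        rz = r @ z
        for _ in range(maxit):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol:
                break
            z = M_solve(r)
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    n = 100
    A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # SPD tridiagonal M-matrix (hence an H-matrix)
    b = np.ones(n)
    x = pcg(A, b, lambda r: r / np.diag(A))                  # Jacobi preconditioner as a stand-in
    print(np.linalg.norm(A @ x - b))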

A FAST FACTORIZATION ALGORITHM FOR A CONFLUENT CAUCHY MATRIX

  • KIM KYUNGSUP
    • Journal of the Korean Mathematical Society
    • /
    • v.42 no.1
    • /
    • pp.1-16
    • /
    • 2005
  • This paper presents a fast factorization algorithm for confluent Cauchy-like matrices. The algorithm consists of two parts. First, a confluent Cauchy-like matrix is transformed into a Cauchy-like matrix that can be pivoted without changing its structure. Second, a fast partial-pivoting factorization algorithm for the Cauchy-like matrix is presented. The new displacement structure cannot generate all entries of the transformed matrix; such a matrix is called 'partially reconstructible'. This paper also discusses how the proposed factorization algorithm can be applied to partially reconstructible matrices in general.
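
As background, a Cauchy-like matrix is conventionally defined through a low-rank Sylvester displacement equation; in one standard notation (not necessarily the paper's),

\[
D_x C - C D_y = G H^{\top}, \qquad C_{ij} = \frac{g_i^{\top} h_j}{x_i - y_j},
\]

where D_x = diag(x_1, ..., x_n) and D_y = diag(y_1, ..., y_n) hold the nodes, and G, H are n-by-r generator matrices with small displacement rank r. When the node sets {x_i} and {y_j} intersect, the displacement operator has a nontrivial kernel, so some entries of the matrix cannot be recovered from the generators alone; this is the 'partially reconstructible' situation the abstract refers to.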

Feature Parameter Extraction and Speech Recognition Using Matrix Factorization (Matrix Factorization을 이용한 음성 특징 파라미터 추출 및 인식)

  • Lee Kwang-Seok;Hur Kang-In
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.10 no.7
    • /
    • pp.1307-1311
    • /
    • 2006
  • In this paper, we propose a new speech feature parameter that uses matrix factorization to capture part-based features of the speech spectrum. The proposed parameter is an effective dimension-reduced representation of multi-dimensional feature data, obtained through a matrix factorization procedure under the constraint that all matrix elements are non-negative; the reduced feature data exhibit part-based features of the input data. We verify the usefulness of the NMF (Non-negative Matrix Factorization) algorithm for speech feature extraction by applying NMF to the Mel-scaled filter bank output and using the result as the feature parameter (a generic update sketch follows below). Recognition experiments confirm that the proposed feature parameter outperforms the commonly used MFCC (Mel-Frequency Cepstral Coefficient) features.
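
For reference, one common way to carry out the factorization step described above is the standard Lee-Seung multiplicative update rule for NMF under the Euclidean (Frobenius) cost; the sketch below uses random placeholder data in place of actual Mel filter bank outputs, so it illustrates the procedure rather than the paper's experiments.

    import numpy as np

    def nmf(V, rank, iters=200, eps=1e-9):
        # Non-negative matrix factorization V ~ W H via multiplicative updates (Frobenius cost).
        n, m = V.shape
        rng = np.random.default_rng(0)
        W = rng.random((n, rank)) + eps
        H = rng.random((rank, m)) + eps
        for _ in range(iters):
            H *= (W.T @ V) / (W.T @ W @ H + eps)
            W *= (V @ H.T) / (W @ H @ H.T + eps)
        return W, H

    # placeholder non-negative "filter bank" matrix: 40 Mel bands x 200 frames
    V = np.random.default_rng(1).random((40, 200))
    W, H = nmf(V, rank=10)    # W: part-based spectral bases, H: activations usable as features
    print(np.linalg.norm(V - W @ H) / np.linalg.norm(V))   # relative reconstruction error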

Design of FIR/IIR Lattice Filters using the Circulant Matrix Factorization (Circulant Matrix Factorization을 이용한 FIR/IIR Lattice 필터의 설계)

  • Kim Sang-Tae;Lim Yong-Kon
    • Journal of the Institute of Electronics Engineers of Korea TC
    • /
    • v.41 no.1
    • /
    • pp.35-44
    • /
    • 2004
  • We propose methods to design finite impulse response (FIR) and infinite impulse response (IIR) lattice filters with the Schur algorithm, through spectral factorization of the covariance matrix by circulant matrix factorization (CMF). Circulant matrix factorization is a powerful tool for spectral factorization of the covariance polynomial in the matrix domain: it yields the minimum-phase polynomial without the polynomial root-finding problem (the circulant eigenstructure noted below is what makes this possible). The Schur algorithm is a fast Cholesky factorization of a Toeplitz matrix, from which the lattice filter parameters are easily determined. Examples for the FIR filter case and the IIR filter case are included, and the performance of our method is checked by comparing it with other methods (polynomial root finding and cepstral deconvolution).
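
The reason a circulant matrix is convenient here is its closed-form eigendecomposition: a circulant matrix is diagonalized by the discrete Fourier transform, so its spectrum is obtained by an FFT rather than by polynomial root finding. In standard notation (general background, not the paper's specific construction),

\[
C = F^{*}\,\mathrm{diag}(\lambda_0, \dots, \lambda_{N-1})\,F, \qquad
\lambda_k = \sum_{j=0}^{N-1} c_j\, e^{-2\pi i jk/N},
\]

where c = (c_0, ..., c_{N-1}) is the first column of the circulant matrix C and F is the unitary DFT matrix. This gives direct access to the spectral values of the covariance sequence, which the CMF approach exploits to construct the minimum-phase factor without root finding.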

A VARIANT OF BLOCK INCOMPLETE FACTORIZATION PRECONDITIONERS FOR A SYMMETRIC H-MATRIX

  • Yun, Jae-Heon;Kim, Sang-Wook
    • Journal of applied mathematics & informatics
    • /
    • v.8 no.3
    • /
    • pp.705-720
    • /
    • 2001
  • We propose a variant of parallel block incomplete factorization preconditioners for a symmetric block-tridiagonal H-matrix. Theoretical properties of these block preconditioners are compared with those of block incomplete factorization preconditioners for the corresponding comparison matrix. Numerical results of the preconditioned CG (PCG) method using these block preconditioners are compared with those of the PCG method using other types of block incomplete factorization preconditioners. Lastly, parallel computations of the block incomplete factorization preconditioners are carried out on the Cray C90.

Compare to Factorization Machines Learning and High-order Factorization Machines Learning for Recommend system (추천시스템에 활용되는 Matrix Factorization 중 FM과 HOFM의 비교)

  • Cho, Seong-Eun
    • Journal of Digital Contents Society
    • /
    • v.19 no.4
    • /
    • pp.731-737
    • /
    • 2018
  • Recommendation systems are actively researched in many fields, such as content services, online commerce, social networks, and advertising, with the aim of suggesting information that users may be interested in. However, many recommendation systems make suggestions based on past preference data, so it is difficult to serve users with little or no history. Interest in analyzing higher-order interactions is therefore increasing, and matrix factorization is attracting attention. In this paper, we compare and reproduce the Factorization Machines (FM) model, which is attracting attention in recommendation systems, and Higher-Order Factorization Machines (HOFM), its extension to higher-order feature interactions (the FM prediction formula is sketched below).
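
To make the objects being compared concrete: a second-order FM predicts with a linear term plus pairwise feature interactions whose weights factorize through low-rank embeddings, and the pairwise sum can be evaluated in O(kn) time; HOFM generalizes the pairwise term to interactions of order three and higher with analogous factorized parameters. The sketch below shows only the standard FM prediction formula with hypothetical parameters; it is not the paper's training setup.

    import numpy as np

    def fm_predict(x, w0, w, V):
        # Second-order Factorization Machine: y = w0 + <w, x> + sum_{i<j} <V_i, V_j> x_i x_j,
        # with the pairwise term computed as 0.5 * sum_f [(sum_i V_if x_i)^2 - sum_i V_if^2 x_i^2].
        s = V.T @ x                    # shape (k,)
        s2 = (V ** 2).T @ (x ** 2)     # shape (k,)
        return w0 + w @ x + 0.5 * np.sum(s ** 2 - s2)

    rng = np.random.default_rng(0)
    n_features, k = 6, 3                                # hypothetical sizes
    w0, w, V = 0.1, rng.normal(size=n_features), rng.normal(size=(n_features, k))
    x = np.array([1.0, 0.0, 1.0, 0.0, 0.5, 0.0])        # sparse user/item/context encoding
    print(fm_predict(x, w0, w, V))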

LU-FACTORIZATION OF THE SQUARE-TYPE MATRIX OF THE STIRLING MATRIX

  • Ji-Hwan Jung
    • East Asian mathematical journal
    • /
    • v.39 no.5
    • /
    • pp.523-528
    • /
    • 2023
  • Let S_n = [S(i, j)]_{1≤i,j≤n} and S*_n = [S(i + j, j)]_{1≤i,j≤n}, where S(i, j) is the Stirling number of the second kind. Choi and Jo [On the determinants of the square-type Stirling matrix and Bell matrix, Int. J. Math. Math. Sci. 2021] obtained the diagonal entries of the matrix U in the LU-factorization of S*_n with L = S_n, in order to calculate the determinant of S*_n. In this paper, we compute all entries of U in the LU-factorization of S*_n. This yields identities relating Stirling numbers of both kinds.
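
The factorization described above can be checked numerically for small n: S_n is unit lower triangular (S(i, j) = 0 for j > i and S(i, i) = 1), so taking L = S_n means U = S_n^{-1} S*_n should come out upper triangular, and the product of its diagonal entries equals det S*_n. A small verification sketch, written for illustration and not taken from the paper:

    from functools import lru_cache
    import numpy as np

    @lru_cache(maxsize=None)
    def stirling2(n, k):
        # Stirling number of the second kind S(n, k).
        if n == k:
            return 1
        if k == 0 or k > n:
            return 0
        return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

    n = 6
    S  = np.array([[stirling2(i, j)     for j in range(1, n + 1)] for i in range(1, n + 1)], float)
    Ss = np.array([[stirling2(i + j, j) for j in range(1, n + 1)] for i in range(1, n + 1)], float)
    U = np.linalg.solve(S, Ss)                    # U = S_n^{-1} S*_n
    print(np.allclose(np.tril(U, k=-1), 0.0))     # True: U is upper triangular, so S*_n = S_n U
    print(np.diag(U))                             # diagonal entries of U studied by Choi and Jo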

Robust Non-negative Matrix Factorization with β-Divergence for Speech Separation

  • Li, Yinan;Zhang, Xiongwei;Sun, Meng
    • ETRI Journal
    • /
    • v.39 no.1
    • /
    • pp.21-29
    • /
    • 2017
  • This paper addresses the problem of unsupervised speech separation based on robust non-negative matrix factorization (RNMF) with β-divergence, when neither speech nor noise training data is available beforehand. We propose a robust version of non-negative matrix factorization, inspired by the recently developed sparse and low-rank decomposition, in which the data matrix is decomposed into the sum of a low-rank matrix and a sparse matrix. Efficient multiplicative update rules to minimize the β-divergence-based cost function are derived. A convolutional extension of the algorithm is also developed, which accounts for the time dependency of the non-negative noise bases. Experimental speech separation results show that the proposed convolutional RNMF successfully separates the repeating time-varying spectral structures from the magnitude spectrum of the mixture, and does so without any prior training.
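
For reference, the β-divergence referred to above is the standard one-parameter family that interpolates between the Itakura-Saito divergence (β → 0), the Kullback-Leibler divergence (β → 1), and the squared Euclidean distance (β = 2); the robust model splits the data matrix into a low-rank non-negative part plus a sparse part. In generic notation (a schematic, not the paper's exact cost function):

\[
d_\beta(x \mid y) = \frac{1}{\beta(\beta - 1)}\left(x^{\beta} + (\beta - 1)\,y^{\beta} - \beta\, x\, y^{\beta - 1}\right), \quad \beta \notin \{0, 1\},
\qquad V \approx W H + S, \ \ S \ \text{sparse}.
\]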

Improving on Matrix Factorization for Recommendation Systems by Using a Character-Level Convolutional Neural Network (문자 수준 컨볼루션 뉴럴 네트워크를 이용한 추천시스템에서의 행렬 분해법 개선)

  • Son, Donghee;Shim, Kyuseok
    • KIISE Transactions on Computing Practices
    • /
    • v.24 no.2
    • /
    • pp.93-98
    • /
    • 2018
  • Recommendation systems are used to provide items of interest to users and thereby maximize a company's profit. Matrix factorization, based on an incomplete user-item rating matrix, is frequently used in recommendation systems. However, as the numbers of items and users increase, it becomes difficult to make accurate recommendations because of the sparsity of the data. To overcome this drawback, the use of text data related to items was recently suggested for matrix factorization algorithms. Among these algorithms, a word-level convolutional neural network was shown to be effective for extracting word-level features from the text data. However, a word-level convolutional neural network involves a large number of parameters to learn. We therefore propose a matrix factorization algorithm that uses a character-level convolutional neural network to extract character-level features from the text data (a schematic objective is given below). We also conducted a performance study with real-life datasets to show the effectiveness of the proposed matrix factorization algorithm.
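
One common way to couple matrix factorization with a text CNN, shown here as a schematic rather than the paper's exact formulation, is to keep the usual rating-reconstruction loss while tying each item factor to the CNN output for that item's text; the proposed method uses a character-level CNN in the role written here as cnn_θ:

\[
\min_{P,\,Q,\,\theta}\ \sum_{(u,i)\in\Omega} \bigl(r_{ui} - p_u^{\top} q_i\bigr)^2
\;+\; \lambda_P \lVert P \rVert_F^2
\;+\; \lambda_Q \sum_i \bigl\lVert q_i - \mathrm{cnn}_{\theta}(\mathrm{text}_i) \bigr\rVert^2,
\]

where Ω is the set of observed user-item ratings, p_u and q_i are the user and item latent factors, and λ_P, λ_Q are regularization weights.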