EXPLICIT MINIMUM POLYNOMIAL, EIGENVECTOR AND INVERSE FORMULA OF DOUBLY LESLIE MATRIX

  • Received : 2014.09.02
  • Accepted : 2015.03.20
  • Published : 2015.05.30

Abstract

The special form of the Schur complement is extended to a Schur formula in order to obtain explicit determinant, inverse, and eigenvector formulas for the doubly Leslie matrix, which is a generalized form of the Leslie matrix. It is also a generalized form of the doubly companion matrix and, in turn, of the companion matrix. The doubly Leslie matrix is a nonderogatory matrix.

1. Introduction

One of the most popular models of population growth is a matrix-based model first introduced by P. H. Leslie. In 1945, he published his most famous article in the journal Biometrika, entitled On the use of matrices in certain population mathematics [1, pp. 117–120]. The Leslie model describes the growth of the female portion of a population which is assumed to have a maximum lifespan. The females are divided into age classes, all of which span an equal number of years. Using data about the average birthrates and survival probabilities of each class, the model is then able to determine the growth of the population over time [11,7].

Chen and Li [5] asserted that Leslie matrix models are discrete models for the development of age-structured populations. It is known that the eigenvalues of a Leslie matrix are important in describing the asymptotic behavior of the corresponding population model. It is also known that the ratio of the spectral radius to the second largest (subdominant) eigenvalue in modulus of a nonperiodic Leslie matrix determines the rate of convergence of the corresponding population distributions to a stable age distribution.

A Leslie matrix arises in a discrete, age-dependent model for population growth. It is a matrix of the form

$$L=\begin{bmatrix} r_1 & r_2 & \cdots & r_{n-1} & r_n\\ s_1 & 0 & \cdots & 0 & 0\\ 0 & s_2 & \cdots & 0 & 0\\ \vdots & \vdots & \ddots & \vdots & \vdots\\ 0 & 0 & \cdots & s_{n-1} & 0 \end{bmatrix},\tag{1}$$

where rj ≥ 0, j = 1, 2, . . . , n, are the birthrates and 0 < sj ≤ 1, j = 1, 2, . . . , n − 1, are the survival probabilities.
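
The projection step of the Leslie model is just a matrix–vector product. The following sketch (illustrative birthrates and survival probabilities, not taken from the paper; the helper name leslie is our own) builds a Leslie matrix as in (1) and advances a population vector a few time steps.

```python
import numpy as np

def leslie(r, s):
    """Build the n x n Leslie matrix (1) from birthrates r (length n)
    and survival probabilities s (length n - 1)."""
    n = len(r)
    L = np.zeros((n, n))
    L[0, :] = r                                 # first row: birthrates
    L[np.arange(1, n), np.arange(n - 1)] = s    # subdiagonal: survival rates
    return L

L = leslie([0.0, 1.5, 0.8], [0.9, 0.6])
x = np.array([100.0, 60.0, 20.0])   # initial female population by age class
for _ in range(5):
    x = L @ x                       # one projection = one time step
print(x)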

For a given field 𝔽, the set of all polynomials in x over 𝔽 is denoted by 𝔽[x]. For a positive integer n, let Mn(𝔽) be the set of all n × n matrices over 𝔽. The set of all vectors, or n × 1 matrices, over 𝔽 is denoted by 𝔽n. A nonzero vector v ∈ 𝔽n is called an eigenvector of A ∈ Mn(𝔽) corresponding to a scalar λ ∈ 𝔽 if Av = λv, and the scalar λ is an eigenvalue of the matrix A. The set of eigenvalues of A is called the spectrum of A and is denoted by σ(A). In the most common case, in which 𝔽 = ℂ, the complex numbers, Mn(ℂ) is abbreviated to Mn.

Doubly companion matrices C ∈ Mn were first introduced by Butcher and Chartier in [4, pp. 274–276], given by

$$C=\begin{bmatrix} -a_1 & -a_2 & \cdots & -a_{n-1} & -a_n-b_n\\ 1 & 0 & \cdots & 0 & -b_{n-1}\\ 0 & 1 & \cdots & 0 & -b_{n-2}\\ \vdots & \vdots & \ddots & \vdots & \vdots\\ 0 & 0 & \cdots & 1 & -b_1 \end{bmatrix};\tag{2}$$

that is, an n × n matrix C with n > 1 is called a doubly companion matrix if its entries cij satisfy cij = 1 on the subdiagonal of C, while cij = 0 for every entry with i ≠ 1 and j ≠ n that does not lie on the subdiagonal; the entries of the first row and the last column are arbitrary.

We define a doubly Leslie matrix analogously to the doubly companion matrix, by replacing the subdiagonal 1, 1, . . . , 1 of the doubly companion matrix by s1, s2, . . . , sn−1. Denoting it by L, a doubly Leslie matrix is defined to be a matrix of the form

$$L=\begin{bmatrix} -a_1 & -a_2 & \cdots & -a_{n-1} & -a_n-b_n\\ s_1 & 0 & \cdots & 0 & -b_{n-1}\\ 0 & s_2 & \cdots & 0 & -b_{n-2}\\ \vdots & \vdots & \ddots & \vdots & \vdots\\ 0 & 0 & \cdots & s_{n-1} & -b_1 \end{bmatrix},\tag{3}$$

where aj, bj ∈ ℝ, the real numbers, j = 1, 2, . . . , n. As with the Leslie matrix, we restrict attention to sj > 0, j = 1, 2, . . . , n − 1.

For convenience, the matrix L can be written in the partitioned form

$$L=\begin{bmatrix} -p^T & -(a_n+b_n)\\ \Lambda & -q \end{bmatrix},$$

where p = [ a1 a2 . . . an−1 ]T, q = [ bn−1 bn−2 . . . b1 ]T, and Λ = diag(s1, s2, . . . , sn−1) is a diagonal matrix of order n − 1.
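
The block layout above is easy to materialize numerically. A minimal sketch (illustrative values, not from the paper; the helper name doubly_leslie is our own) that builds the doubly Leslie matrix (3):

```python
import numpy as np

def doubly_leslie(a, b, s):
    """Build the doubly Leslie matrix (3) from a = [a_1, ..., a_n],
    b = [b_1, ..., b_n], and the subdiagonal s = [s_1, ..., s_{n-1}]."""
    n = len(a)
    L = np.zeros((n, n))
    L[0, :-1] = -np.asarray(a[:-1])             # first row: -a_1, ..., -a_{n-1}
    L[0, -1] = -(a[-1] + b[-1])                 # corner entry: -(a_n + b_n)
    L[np.arange(1, n), np.arange(n - 1)] = s    # subdiagonal: s_1, ..., s_{n-1}
    L[1:, -1] = -np.asarray(b[-2::-1])          # last column: -b_{n-1}, ..., -b_1
    return L

a, b, s = [1.0, 2.0, 3.0, 4.0], [0.5, 0.25, 0.125, 1.0], [0.9, 0.8, 0.7]
print(doubly_leslie(a, b, s))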

Note: If we define the doubly Leslie matrix in another partitioned form, with the blocks arranged differently but all symbols as above, then some of the consequent results take more complicated forms.

We recall some well-known results from linear algebra and matrix analysis.

Definition 1.1 ([6], Definition 1.3.1). A matrix B ∈ Mn is said to be similar to a matrix A ∈ Mn if there exists a nonsingular matrix S ∈ Mn such that B = S−1 AS.

Theorem 1.2 ([6], Theorem 1.4.8). Let A, B ∈ Mn. If x ∈ ℂn is an eigenvector corresponding to λ ∈ σ(B) and B is similar to A via S, then Sx is an eigenvector of A corresponding to the eigenvalue λ.

Theorem 1.3 ([6], Theorem 3.3.15). A matrix A ∈ Mn is similar to the companion matrix of its characteristic polynomial if and only if the minimal and characteristic polynomials of A are identical.

Definition 1.4 ([9], p. 664). A matrix A ∈ Mn for which the characteristic polynomial ∆A(x) is equal to the minimum polynomial mA(x) is said to be a nonderogatory matrix.

In the present paper we give explicit determinant, inverse, and eigenvector formulae for the doubly Leslie matrix and discuss some related topics.

 

2. Some Properties of the Schur Complement

Let M be a matrix partitioned into four blocks

$$M=\begin{bmatrix} A & B\\ C & D \end{bmatrix},\tag{4}$$

where the submatrix C is assumed to be square and nonsingular. Brezinski in [3, p. 232] asserted that the Schur complement of C in M, denoted by (M/C), is defined by

$$(M/C)=B-AC^{-1}D,\tag{5}$$

which is related to Gaussian elimination by

$$\begin{bmatrix} A & B\\ C & D \end{bmatrix}=\begin{bmatrix} AC^{-1} & I_k\\ I_{n-k} & 0 \end{bmatrix}\begin{bmatrix} C & D\\ 0 & (M/C) \end{bmatrix}.\tag{6}$$

Suppose that B and C are k × k and (n − k) × (n − k) matrices, respectively, with k < n, and that C is nonsingular. As in [8, p. 39], we have the following theorem.

Theorem 2.1 (Schur's formula). Let M be a square matrix of order n partitioned as

$$M=\begin{bmatrix} A & B\\ C & D \end{bmatrix},$$

where B and C are k × k and (n − k) × (n − k) matrices, respectively, k < n. If C is nonsingular, then

$$\det M=(-1)^{(n+1)k}\det C\,\det(M/C).\tag{7}$$

Proof. From (6),

$$M=\begin{bmatrix} AC^{-1} & I_k\\ I_{n-k} & 0 \end{bmatrix}\begin{bmatrix} C & D\\ 0 & (M/C) \end{bmatrix}.$$

The identity (7) follows by taking the determinant of both sides. Then,

$$\det M=\det\begin{bmatrix} AC^{-1} & I_k\\ I_{n-k} & 0 \end{bmatrix}\det\begin{bmatrix} C & D\\ 0 & (M/C) \end{bmatrix}=\det\begin{bmatrix} AC^{-1} & I_k\\ I_{n-k} & 0 \end{bmatrix}\det C\,\det(M/C),$$

since the second factor is block upper triangular with determinant det C det(M/C). By Laplace's theorem, we expand the remaining determinant by the first k rows, i.e., rows {1, 2, . . . , k}. The only nonvanishing minor from these rows is det Ik, taken from columns {n − k + 1, . . . , n}; its complementary minor is det In−k = 1, and its sign factor is (−1)^{(1+2+···+k)+((n−k+1)+···+n)} = (−1)^{(n+1)k}. We have

$$\det\begin{bmatrix} AC^{-1} & I_k\\ I_{n-k} & 0 \end{bmatrix}=(-1)^{(n+1)k}.$$

Therefore

det M = (−1)^{(n+1)k} det C det(M/C). This completes the proof. □
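
Schur's formula (7) is easy to sanity-check numerically for this block layout. A small sketch with randomly generated blocks (not part of the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 2
A = rng.normal(size=(k, n - k))       # k x (n-k) block
B = rng.normal(size=(k, k))           # k x k block
C = rng.normal(size=(n - k, n - k))   # square, generically nonsingular
D = rng.normal(size=(n - k, k))       # (n-k) x k block

M = np.block([[A, B], [C, D]])
schur = B - A @ np.linalg.inv(C) @ D  # (M/C) as in (5)
lhs = np.linalg.det(M)
rhs = (-1) ** ((n + 1) * k) * np.linalg.det(C) * np.linalg.det(schur)
print(np.isclose(lhs, rhs))           # True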

The following useful formula presents the inverse of a matrix in terms of Schur complements; analogous to [14, p. 19], we obtain the next theorem.

Theorem 2.2. Let M be partitioned as in (4) and suppose both M and C are nonsingular. Then (M/C) is nonsingular and

$$M^{-1}=\begin{bmatrix} -C^{-1}D(M/C)^{-1} & \;C^{-1}+C^{-1}D(M/C)^{-1}AC^{-1}\\ (M/C)^{-1} & \;-(M/C)^{-1}AC^{-1} \end{bmatrix}.\tag{8}$$

Proof. The Schur complement (M/C) is nonsingular by virtue of (7). Under the given hypotheses, from (6) one checks that

$$M=\begin{bmatrix} AC^{-1} & I_k\\ I_{n-k} & 0 \end{bmatrix}\begin{bmatrix} C & D\\ 0 & (M/C) \end{bmatrix}.$$

Inverting both sides yields

$$M^{-1}=\begin{bmatrix} C & D\\ 0 & (M/C) \end{bmatrix}^{-1}\begin{bmatrix} AC^{-1} & I_k\\ I_{n-k} & 0 \end{bmatrix}^{-1}=\begin{bmatrix} C^{-1} & -C^{-1}D(M/C)^{-1}\\ 0 & (M/C)^{-1} \end{bmatrix}\begin{bmatrix} 0 & I_{n-k}\\ I_k & -AC^{-1} \end{bmatrix},$$

from which the identity (8) follows. □
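
The block formula (8) can likewise be compared against a direct inverse. A sketch under the same random-block setup as above:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 5, 2
A = rng.normal(size=(k, n - k))
B = rng.normal(size=(k, k))
C = rng.normal(size=(n - k, n - k))
D = rng.normal(size=(n - k, k))
M = np.block([[A, B], [C, D]])

Ci = np.linalg.inv(C)
Si = np.linalg.inv(B - A @ Ci @ D)    # (M/C)^{-1}
Minv = np.block([[-Ci @ D @ Si, Ci + Ci @ D @ Si @ A @ Ci],
                 [Si,           -Si @ A @ Ci]])
print(np.allclose(Minv, np.linalg.inv(M)))   # True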

 

3. Inverse Formula of Doubly Leslie Matrix

The following theorem follows from Theorem 2.1.

Theorem 3.1 (Determinant of the doubly Leslie matrix). Let L be a doubly Leslie matrix as in (3), partitioned as

$$L=\begin{bmatrix} -p^T & -(a_n+b_n)\\ \Lambda & -q \end{bmatrix},\tag{9}$$

where p = [ a1 a2 . . . an−1 ]T, q = [ bn−1 bn−2 . . . b1 ]T, and Λ = diag(s1, s2, . . . , sn−1), sj > 0, j = 1, 2, . . . , n − 1, is a diagonal matrix of order n − 1. Then

$$\det L=(-1)^{n}\left(\prod_{j=1}^{n-1}s_j\right)\left(a_n+b_n+p^T\Lambda^{-1}q\right).$$

Proof. Since Λ is a nonsingular (n − 1) × (n − 1) submatrix of the matrix L, we can apply Schur's formula (7) with k = 1. As in (5), the Schur complement of Λ in L, denoted by (L/Λ), is a 1 × 1 matrix, or a scalar:

$$(L/\Lambda)=-(a_n+b_n)-(-p^T)\Lambda^{-1}(-q)=-\left(a_n+b_n+p^T\Lambda^{-1}q\right).\tag{10}$$

Now, from (9) it is easy to see that det Λ = s1s2 ⋯ sn−1. Therefore

$$\det L=(-1)^{(n+1)\cdot 1}\det\Lambda\,(L/\Lambda)=(-1)^{n}\left(\prod_{j=1}^{n-1}s_j\right)\left(a_n+b_n+p^T\Lambda^{-1}q\right).$$

This completes the proof. □
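
A quick numerical check of Theorem 3.1, reusing the doubly_leslie sketch from Section 1 (illustrative values, not from the paper):

```python
import numpy as np

a, b, s = [1.0, 2.0, 3.0, 4.0], [0.5, 0.25, 0.125, 1.0], [0.9, 0.8, 0.7]
n = len(a)
L = doubly_leslie(a, b, s)            # the sketch defined after (3)

p = np.array(a[:-1])                  # [a_1, ..., a_{n-1}]
q = np.array(b[-2::-1])               # [b_{n-1}, ..., b_1]
pLq = p @ (q / np.array(s))           # p^T Lambda^{-1} q
det_formula = (-1) ** n * np.prod(s) * (a[-1] + b[-1] + pLq)
print(np.isclose(np.linalg.det(L), det_formula))   # True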

Immediately, we have the following corollaries.

Corollary 3.2. Let L be a Leslie matrix defined as in (1), that is, the doubly Leslie matrix (3) with bj = 0, j = 1, 2, . . . , n, partitioned as

$$L=\begin{bmatrix} -p^T & -a_n\\ \Lambda & 0 \end{bmatrix},$$

where p = [ a1 a2 . . . an−1 ]T and Λ = diag(s1, s2, . . . , sn−1), sj > 0, j = 1, 2, . . . , n − 1, is a diagonal matrix of order n − 1. Then

$$\det L=(-1)^{n}\left(\prod_{j=1}^{n-1}s_j\right)a_n.$$

Corollary 3.3. Let

$$C=\begin{bmatrix} -p^T & -(a_n+b_n)\\ I_{n-1} & -q \end{bmatrix}$$

be a doubly companion matrix, where p = [ a1 a2 . . . an−1 ]T and q = [ bn−1 bn−2 . . . b1 ]T. Then

$$\det C=(-1)^{n}\left(a_n+b_n+p^Tq\right)=(-1)^{n}\left(a_n+b_n+\sum_{i=1}^{n-1}a_ib_{n-i}\right).$$

Corollary 3.4. Let

$$C=\begin{bmatrix} -p^T & -a_n\\ I_{n-1} & 0 \end{bmatrix}$$

be a companion matrix, where p = [ a1 a2 . . . an−1 ]T. Then det C = (−1)^n an.

Now we wish to find the inverse of the doubly Leslie matrix.

Theorem 3.5. Let L be a doubly Leslie matrix partitioned as in (9), where p = [ a1 a2 . . . an−1 ]T, q = [ bn−1 bn−2 . . . b1 ]T, and Λ = diag(s1, s2, . . . , sn−1), sj > 0, j = 1, 2, . . . , n − 1, is a diagonal matrix of order n − 1. If det L ≠ 0, then

$$L^{-1}=\begin{bmatrix} \dfrac{1}{(L/\Lambda)}\Lambda^{-1}q & \;\Lambda^{-1}+\dfrac{1}{(L/\Lambda)}\Lambda^{-1}qp^T\Lambda^{-1}\\[2mm] \dfrac{1}{(L/\Lambda)} & \;\dfrac{1}{(L/\Lambda)}p^T\Lambda^{-1} \end{bmatrix},$$

where (L/Λ) = −(an + bn + p^TΛ^{−1}q), as in (10), and Λ^{−1} = diag(1/s1, 1/s2, . . . , 1/sn−1).

Proof. Applying the identity (8) to the matrix L, we have

$$L^{-1}=\begin{bmatrix} -\Lambda^{-1}(-q)(L/\Lambda)^{-1} & \;\Lambda^{-1}+\Lambda^{-1}(-q)(L/\Lambda)^{-1}(-p^T)\Lambda^{-1}\\ (L/\Lambda)^{-1} & \;-(L/\Lambda)^{-1}(-p^T)\Lambda^{-1} \end{bmatrix}.$$

The Schur complement of Λ in L is (L/Λ); in (10) it was shown that (L/Λ) is a scalar. Cancelling the signs gives the stated formula. □
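
The inverse formula of Theorem 3.5 can be checked against a direct numerical inverse; a sketch reusing doubly_leslie and the same illustrative values:

```python
import numpy as np

a, b, s = [1.0, 2.0, 3.0, 4.0], [0.5, 0.25, 0.125, 1.0], [0.9, 0.8, 0.7]
L = doubly_leslie(a, b, s)

p = np.array(a[:-1])
q = np.array(b[-2::-1])
Li = np.diag(1.0 / np.array(s))               # Lambda^{-1}
S = -(a[-1] + b[-1] + p @ Li @ q)             # (L/Lambda), a scalar
Linv = np.block([[(Li @ q / S)[:, None], Li + np.outer(Li @ q, p @ Li) / S],
                 [np.array([[1.0 / S]]), (p @ Li / S)[None, :]]])
print(np.allclose(Linv, np.linalg.inv(L)))    # True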

Immediately, we have the following corollaries.

Corollary 3.6. Let L be a Leslie matrix defined as in Corollary 3.2. If det L ≠ 0, then

$$L^{-1}=\begin{bmatrix} 0 & \Lambda^{-1}\\[1mm] -\dfrac{1}{a_n} & \;-\dfrac{1}{a_n}p^T\Lambda^{-1} \end{bmatrix},$$

where (L/Λ) = −an.

Corollary 3.7. Let C be a doubly companion matrix defined as in Corollary 3.3. If det C ≠ 0, then

$$C^{-1}=\begin{bmatrix} \dfrac{1}{(C/I_{n-1})}q & \;I_{n-1}+\dfrac{1}{(C/I_{n-1})}qp^T\\[2mm] \dfrac{1}{(C/I_{n-1})} & \;\dfrac{1}{(C/I_{n-1})}p^T \end{bmatrix},$$

where (C/In−1) = −(an + bn + p^Tq).

Corollary 3.8. Let C be a companion matrix defined as in Corollary 3.4. If det C ≠ 0, then

$$C^{-1}=\begin{bmatrix} 0 & I_{n-1}\\[1mm] -\dfrac{1}{a_n} & \;-\dfrac{1}{a_n}p^T \end{bmatrix}.$$

 

4. Explicit Minimum Polynomial of Doubly Leslie Matrix

The author in [12, Theorem 33] asserted that the doubly companion matrix is nonderogatory. Now we wish to show that any doubly Leslie matrix L in (3) is similar to a companion matrix, that is, it is nonderogatory.

Theorem 4.1. The doubly Leslie matrix L defined in (3) is nonderogatory, and its characteristic polynomial and explicit minimum polynomial are

$$\Delta_L(x)=m_L(x)=x^{n}+\sum_{m=1}^{n}\left(c_m+d_m+\sum_{i=1}^{m-1}c_id_{m-i}\right)x^{n-m},\tag{11}$$

where c1 = a1, ci = ai s1s2 ⋯ si−1, and d1 = b1, di = bi sn−i+1sn−i+2 ⋯ sn−1, for i = 2, 3, . . . , n.

Proof. Let L be the doubly Leslie matrix (3).

First we show that L is similar to a doubly companion matrix. By a similarity transformation with the diagonal matrix

$$D=\mathrm{diag}\left(1,\;s_1,\;s_1s_2,\;\ldots,\;s_1s_2\cdots s_{n-1}\right),$$

L can be transformed to a doubly companion matrix D^{−1}LD.

For convenience, let us denote the doubly companion matrix D^{−1}LD by

$$\Gamma=\begin{bmatrix} -c_1 & -c_2 & \cdots & -c_{n-1} & -c_n-d_n\\ 1 & 0 & \cdots & 0 & -d_{n-1}\\ 0 & 1 & \cdots & 0 & -d_{n-2}\\ \vdots & \vdots & \ddots & \vdots & \vdots\\ 0 & 0 & \cdots & 1 & -d_1 \end{bmatrix},\tag{12}$$

where c1 = a1, ci = ai s1s2 ⋯ si−1, and d1 = b1, di = bi sn−i+1 ⋯ sn−1, for i = 2, 3, . . . , n.

Let J be the backward identity matrix of order n × n (or the reversal matrix of order n × n), so that J = J^{−1}. Conjugating by J reverses the order of both rows and columns, which shows that

$$J\Gamma J=\begin{bmatrix} -d_1 & 1 & 0 & \cdots & 0\\ -d_2 & 0 & 1 & \cdots & 0\\ \vdots & \vdots & \vdots & \ddots & \vdots\\ -d_{n-1} & 0 & 0 & \cdots & 1\\ -c_n-d_n & -c_{n-1} & -c_{n-2} & \cdots & -c_1 \end{bmatrix}.$$

To show that the matrix JΓJ is similar to a companion matrix, we prove by explicit construction the existence of an invertible matrix M such that M^{−1}(JΓJ)M is a companion matrix. Now choose the matrix M of size n × n,

$$M=\begin{bmatrix} 1 & 0 & \cdots & 0 & 0\\ d_1 & 1 & \cdots & 0 & 0\\ d_2 & d_1 & \cdots & 0 & 0\\ \vdots & \vdots & \ddots & \vdots & \vdots\\ d_{n-1} & d_{n-2} & \cdots & d_1 & 1 \end{bmatrix}.$$

Then M is a nonsingular matrix. In fact, the matrix M is a lower triangular Toeplitz matrix with diagonal constant 1, whose first column is

$$Me_1=\begin{bmatrix} 1 & d_1 & d_2 & \cdots & d_{n-1} \end{bmatrix}^T,$$

where e1 = [ 1 0 . . . 0 ]T ∈ ℂn is the unit column vector.

Computation shows that

$$M^{-1}(J\Gamma J)M=\begin{bmatrix} 0 & 1 & 0 & \cdots & 0\\ 0 & 0 & 1 & \cdots & 0\\ \vdots & \vdots & \vdots & \ddots & \vdots\\ 0 & 0 & 0 & \cdots & 1\\ -f_n & -f_{n-1} & -f_{n-2} & \cdots & -f_1 \end{bmatrix},\tag{13}$$

where

$$f_m=c_m+d_m+\sum_{i=1}^{m-1}c_id_{m-i},\qquad m=1,2,\ldots,n.$$

The matrix

$$C=M^{-1}(J\Gamma J)M=(DJM)^{-1}L(DJM)\tag{14}$$

is the desired companion matrix. Then we have that the doubly Leslie matrix L is similar to the companion matrix C. By Theorem 1.3, the characteristic polynomial ∆L(x) equals the minimum polynomial mL(x), and we have

$$\Delta_L(x)=m_L(x)=\det(xI-C)=x^{n}+f_1x^{n-1}+f_2x^{n-2}+\cdots+f_n.$$

That is,

$$\Delta_L(x)=m_L(x)=x^{n}+\sum_{m=1}^{n}\left(c_m+d_m+\sum_{i=1}^{m-1}c_id_{m-i}\right)x^{n-m}.\qquad\Box$$
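
The coefficient formula (11) can be verified numerically by comparing it with the characteristic polynomial computed from the matrix itself; a sketch reusing doubly_leslie (np.poly returns the coefficients of det(xI − L), highest power first):

```python
import numpy as np

a, b, s = [1.0, 2.0, 3.0, 4.0], [0.5, 0.25, 0.125, 1.0], [0.9, 0.8, 0.7]
n = len(a)
L = doubly_leslie(a, b, s)

c = [a[0]] + [a[i] * np.prod(s[:i]) for i in range(1, n)]          # c_1, ..., c_n
d = [b[0]] + [b[i] * np.prod(s[n - i - 1:]) for i in range(1, n)]  # d_1, ..., d_n
coeffs = [1.0] + [c[m - 1] + d[m - 1]
                  + sum(c[i - 1] * d[m - i - 1] for i in range(1, m))
                  for m in range(1, n + 1)]                        # formula (11)
print(np.allclose(coeffs, np.poly(L)))   # True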

 

5. Explicit Eigenvector Formula of Doubly Leslie Matrix

Now, analogous to the eigenvector of a companion matrix in [2, pp. 630–631] and in [10, p. 6], we obtain the following.

Theorem 5.1. Let λ be an eigenvalue of a doubly Leslie matrix L defined in (3). Then

$$v=\begin{bmatrix} \lambda^{n-1}+d_1\lambda^{n-2}+d_2\lambda^{n-3}+\cdots+d_{n-1}\\ s_1\left(\lambda^{n-2}+d_1\lambda^{n-3}+\cdots+d_{n-2}\right)\\ \vdots\\ s_1s_2\cdots s_{n-2}\left(\lambda+d_1\right)\\ s_1s_2\cdots s_{n-1} \end{bmatrix}$$

is an eigenvector of L corresponding to the eigenvalue λ, where d1 = b1 and di = bi sn−i+1 ⋯ sn−1, for i = 2, 3, . . . , n.

Proof. From Theorem 4.1, L is similar to the companion matrix C as in (14); hence they have the same eigenvalues in common. Let λ be an eigenvalue of L; then λ is also an eigenvalue of C. Since λ is a root of the characteristic polynomial ∆L(x), we have

$$\lambda^{n}+f_1\lambda^{n-1}+f_2\lambda^{n-2}+\cdots+f_n=0,$$

with the coefficients fm as in (13). From (13), we have

$$-f_n-f_{n-1}\lambda-\cdots-f_1\lambda^{n-1}=\lambda^{n}.$$

Therefore, we put the vector u = [ 1 λ · · · λn−2 λn−1 ]T; we must show that this vector u is an eigenvector of C corresponding to the eigenvalue λ. From equation (14), C = (DJM)^{−1}L(DJM), we have

$$Cu=\begin{bmatrix} \lambda\\ \lambda^{2}\\ \vdots\\ \lambda^{n-1}\\ -f_n-f_{n-1}\lambda-\cdots-f_1\lambda^{n-1} \end{bmatrix}=\begin{bmatrix} \lambda\\ \lambda^{2}\\ \vdots\\ \lambda^{n-1}\\ \lambda^{n} \end{bmatrix}=\lambda u;$$

it is easy to see that the first component of the vector u cannot be zero, so the vector u is not a zero vector, and it is an eigenvector of C corresponding to λ.

Since (DJM)^{−1}L(DJM) = C, Theorem 1.2 asserts that (DJM)u is an eigenvector of L corresponding to the eigenvalue λ. Hence, the explicit form of an eigenvector corresponding to an eigenvalue λ of the matrix L is

$$v=DJMu,$$

that is,

$$v=\begin{bmatrix} \lambda^{n-1}+d_1\lambda^{n-2}+d_2\lambda^{n-3}+\cdots+d_{n-1}\\ s_1\left(\lambda^{n-2}+d_1\lambda^{n-3}+\cdots+d_{n-2}\right)\\ \vdots\\ s_1s_2\cdots s_{n-2}\left(\lambda+d_1\right)\\ s_1s_2\cdots s_{n-1} \end{bmatrix};$$

it is easy to see that the last component of the vector v, s1s2 ⋯ sn−1 > 0, cannot be zero, which proves the assertion. □
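
Theorem 5.1 is straightforward to test numerically; a sketch reusing doubly_leslie and the same illustrative values:

```python
import numpy as np

a, b, s = [1.0, 2.0, 3.0, 4.0], [0.5, 0.25, 0.125, 1.0], [0.9, 0.8, 0.7]
n = len(a)
L = doubly_leslie(a, b, s)

d = [b[0]] + [b[i] * np.prod(s[n - i - 1:]) for i in range(1, n)]  # d_1, ..., d_n
lam = np.linalg.eigvals(L)[0]             # any eigenvalue of L
poly = [1.0] + d[: n - 1]                 # coefficients 1, d_1, ..., d_{n-1}
v = np.array([np.prod(s[:i]) * np.polyval(poly[: n - i], lam)
              for i in range(n)])         # the eigenvector of Theorem 5.1
print(np.allclose(L @ v, lam * v))        # True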

The following corollaries are particular cases of Theorem 5.1.

If b1 = b2 = · · · = bn = 0, then the matrix becomes a Leslie matrix, and we have the following corollary.

Corollary 5.2. Let λ be an eigenvalue of a Leslie matrix L defined in Corollary 3.2. Then

$$v=\begin{bmatrix} \lambda^{n-1}\\ s_1\lambda^{n-2}\\ s_1s_2\lambda^{n-3}\\ \vdots\\ s_1s_2\cdots s_{n-1} \end{bmatrix}$$

is an eigenvector of L corresponding to the eigenvalue λ. A nonzero scalar multiple of v, namely (for λ ≠ 0)

$$\frac{1}{\lambda^{n-1}}\,v=\begin{bmatrix} 1\\ s_1/\lambda\\ s_1s_2/\lambda^{2}\\ \vdots\\ s_1s_2\cdots s_{n-1}/\lambda^{n-1} \end{bmatrix},$$

is also an eigenvector of L corresponding to the eigenvalue λ.
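
For the dominant eigenvalue of a population Leslie matrix, the normalized eigenvector of Corollary 5.2 is the classical stable age distribution; the formula depends only on λ and the survival rates, so it applies to the nonnegative form (1) as well. A sketch using the leslie helper and the illustrative values from Section 1:

```python
import numpy as np

s = [0.9, 0.6]
L = leslie([0.0, 1.5, 0.8], s)                 # the sketch defined after (1)
lam = max(np.linalg.eigvals(L), key=abs).real  # dominant (Perron) eigenvalue
v = np.array([1.0, s[0] / lam, s[0] * s[1] / lam**2])
print(np.allclose(L @ v, lam * v))             # True
print(v / v.sum())                             # stable age distribution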

If s1 = s2 = · · · = sn−1 = 1, then we have the following corollary, as in [13, pp. 270–272].

Corollary 5.3. Let λ be an eigenvalue of a doubly companion matrix C defined in Corollary 3.3. Then

$$v=\begin{bmatrix} \lambda^{n-1}+b_1\lambda^{n-2}+b_2\lambda^{n-3}+\cdots+b_{n-1}\\ \lambda^{n-2}+b_1\lambda^{n-3}+\cdots+b_{n-2}\\ \vdots\\ \lambda+b_1\\ 1 \end{bmatrix}$$

is an eigenvector of C corresponding to the eigenvalue λ.

Corollary 5.4. Let λ be an eigenvalue of a companion matrix C defined in Corollary 3.4. Then

$$v=\begin{bmatrix} \lambda^{n-1}\\ \lambda^{n-2}\\ \vdots\\ \lambda\\ 1 \end{bmatrix}$$

is an eigenvector of C corresponding to the eigenvalue λ.

 

6. Conclusion

The doubly Leslie matrix is a nonderogatory matrix. This paper has used a special form of the Schur complement to obtain the determinant, inverse, and explicit eigenvector formulas for the doubly Leslie matrix, which is a generalized form of the Leslie matrix. It is also a generalized form of the doubly companion matrix and, in turn, of the companion matrix.

References

  1. N. Bacaër, A Short History of Mathematical Population Dynamics, Springer, New York, 2011.
  2. L. Brand, The companion matrix and its properties, The American Mathematical Monthly, 71(6) (1964) 629-634. https://doi.org/10.2307/2312322
  3. C. Brezinski, Other Manifestations of the Schur Complement, Linear Algebra Appl., 111 (1988) 231-247. https://doi.org/10.1016/0024-3795(88)90062-6
  4. J.C. Butcher and P. Chartier, The effective order of singly-implicit Runge-Kutta methods, Numerical Algorithms, 20 (1999), 269-284. https://doi.org/10.1023/A:1019176422613
  5. M.Q. Chen and X. Li, Spectral properties of a near-periodic row-stochastic Leslie matrix, Linear Algebra Appl., 409 (2005), 166-186. https://doi.org/10.1016/j.laa.2005.07.005
  6. R.A. Horn and C.R. Johnson, Matrix Analysis, Cambridge University Press, Cambridge, UK, 1996.
  7. S.J. Kirkland and M. Neumann, Convexity and concavity of the Perron root and vector of Leslie matrices with applications to a population model, SIAM J. Matrix Anal. Appl., 15(4) (1994), 1092-1107. https://doi.org/10.1137/S0895479893249228
  8. P. Lancaster and M. Tismenetsky, The Theory of Matrices, Second Edition with Applications, Academic Press, San Diego, 1985.
  9. C.D. Meyer, Matrix Analysis and Applied Linear Algebra, SIAM, Philadelphia, 2000.
  10. S. Moritsugu and K. Kuriyama, A linear algebra method for solving systems of algebraic equations, J. Jap. Soc. Symb. Alg. Comp. (J. JSSAC), 7/4 (2000), 2-22.
  11. D. Poole, Linear Algebra: A Modern Introduction, Second Edition, Thomson Learning, London, 2006.
  12. W. Wanicharpichat, Nonderogatory of sum and product of doubly companion matrices, Thai J. Math., 9(2) (2011), 337-348.
  13. W. Wanicharpichat, Explicit eigenvectors formulae for lower doubly companion matrices, Thai J. Math., 11(2) (2013), 261-274.
  14. F. Zhang, The Schur Complement and Its Applications, Numerical Methods and Algorithms, Springer, New York, 2005.
