Method of Eigenvalues and Eigenvectors
The Concept of Eigenvalues and Eigenvectors
Consider a linear homogeneous system of \(n\) differential equations with constant coefficients, which can be written in matrix form as
\[\mathbf{X}'\left( t \right) = A\mathbf{X}\left( t \right),\]
where the following notation is used:
\[\mathbf{X}\left( t \right) = \begin{pmatrix} {x_1}\left( t \right)\\ {x_2}\left( t \right)\\ \vdots\\ {x_n}\left( t \right) \end{pmatrix},\;\; A = \begin{pmatrix} {a_{11}}&{a_{12}}& \cdots &{a_{1n}}\\ {a_{21}}&{a_{22}}& \cdots &{a_{2n}}\\ \vdots & \vdots & \ddots & \vdots\\ {a_{n1}}&{a_{n2}}& \cdots &{a_{nn}} \end{pmatrix}.\]
We look for non-trivial solutions of the homogeneous system in the form
\[\mathbf{X}\left( t \right) = {e^{\lambda t}}\mathbf{V},\]
where \(\mathbf{V} \ne \mathbf{0}\) is a constant \(n\)-dimensional vector, which will be determined later.
Substituting the above expression for \(\mathbf{X}\left( t \right)\) into the system of equations, we obtain:
\[\lambda {e^{\lambda t}}\mathbf{V} = A{e^{\lambda t}}\mathbf{V},\;\; \Rightarrow A\mathbf{V} = \lambda \mathbf{V}.\]
This equation means that under the action of a linear operator \(A\) the vector \(\mathbf{V}\) is converted to a collinear vector \(\lambda \mathbf{V}.\) Any vector with this property is called an eigenvector of the linear transformation \(A,\) and the number \(\lambda\) is called an eigenvalue.
Thus, we conclude that for the vector function \(\mathbf{X}\left( t \right) = {e^{\lambda t}}\mathbf{V}\) to be a solution of the homogeneous linear system, it is necessary and sufficient that the number \(\lambda\) be an eigenvalue of the matrix \(A,\) and the vector \(\mathbf{V}\) be the corresponding eigenvector of this matrix.
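As a quick numerical check of this equivalence, here is a minimal NumPy sketch; the \(2 \times 2\) matrix and the eigenpair are a small example of our own, not from the original text:

```python
import numpy as np

# Hypothetical example matrix (ours, not from the text) and one of its
# eigenpairs: A V = 2 V for V = (1, -2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam = 2.0
V = np.array([1.0, -2.0])

for t in np.linspace(0.0, 1.0, 5):
    X = np.exp(lam * t) * V           # candidate solution X(t) = e^{lam*t} V
    dX = lam * np.exp(lam * t) * V    # its derivative X'(t)
    assert np.allclose(dX, A @ X)     # X'(t) = A X(t) holds at each sample t
```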
As can be seen, the solution of a linear system of equations can be constructed by an algebraic method. Therefore, we first provide some necessary background from linear algebra.
Finding Eigenvalues and Eigenvectors of a Linear Transformation
Let's go back to the matrix-vector equation obtained above:
\[A\mathbf{V} = \lambda \mathbf{V}.\]
It can be rewritten as
\[A\mathbf{V} - \lambda \mathbf{V} = \mathbf{0},\]
where \(\mathbf{0}\) is the zero vector.
Recall that the product of the identity matrix \(I\) of order \(n\) and an \(n\)-dimensional vector \(\mathbf{V}\) is equal to the vector itself:
\[I\mathbf{V} = \mathbf{V}.\]
Therefore, our equation becomes
\[\left( {A - \lambda I} \right)\mathbf{V} = \mathbf{0}.\]
It follows from this relationship that the determinant of \({A - \lambda I}\) is zero:
\[\det \left( {A - \lambda I} \right) = 0.\]
Indeed, if we assume that \(\det \left( {A - \lambda I} \right) \ne 0,\) then the matrix \(A - \lambda I\) has an inverse \({\left( {A - \lambda I} \right)^{ - 1}}.\) Multiplying both sides of the equation on the left by this inverse matrix, we get:
\[\mathbf{V} = {\left( {A - \lambda I} \right)^{ - 1}}\mathbf{0} = \mathbf{0}.\]
This, however, contradicts the definition of an eigenvector, which must be different from zero. Consequently, the eigenvalues \(\lambda\) must satisfy the equation
\[\det \left( {A - \lambda I} \right) = 0,\]
which is called the auxiliary or characteristic equation of the linear transformation \(A.\) The polynomial on the left side of the equation is called the characteristic polynomial of the linear transformation (or linear operator) \(A.\) The set of all eigenvalues \({\lambda _1},{\lambda _2}, \ldots ,{\lambda _n}\) forms the spectrum of the operator \(A.\)
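For illustration, consider again the \(2 \times 2\) matrix from the numerical sketch above (our running example, not from the original text). Its characteristic equation is
\[\det \left( {A - \lambda I} \right) = \left| {\begin{array}{*{20}{c}} {4 - \lambda }&1\\ 2&{3 - \lambda } \end{array}} \right| = {\lambda ^2} - 7\lambda + 10 = \left( {\lambda - 2} \right)\left( {\lambda - 5} \right) = 0,\]
so the spectrum consists of the two eigenvalues \({\lambda _1} = 2,\) \({\lambda _2} = 5.\)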
So the first step in finding the solution of a system of linear differential equations is solving the auxiliary equation and finding all eigenvalues \({\lambda _1},{\lambda _2}, \ldots ,{\lambda _n}.\)
Next, substituting each eigenvalue \({\lambda _i}\) into the system of equations
\[\left( {A - {\lambda _i}I} \right)\mathbf{V} = \mathbf{0}\]
and solving it, we find the eigenvectors corresponding to the given eigenvalue \({\lambda _i}.\) Note that after the substitution of an eigenvalue the system becomes singular, i.e. some of its equations are linearly dependent. This follows from the fact that the determinant of the system is zero. As a result, the system of equations has an infinite set of solutions, i.e. the eigenvectors are determined only up to a constant factor.
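In practice, both steps can be delegated to a linear-algebra library. A minimal NumPy sketch, again using our hypothetical running example:

```python
import numpy as np

# np.linalg.eig returns the eigenvalues and, as columns, unit-length
# eigenvectors of A.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)    # e.g. [5. 2.] (the order is not guaranteed)

# Each eigenvector is determined only up to a constant factor;
# NumPy's unit-length normalization is one arbitrary choice of it.
for lam, V in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ V, lam * V)    # A V = lambda V
```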
Fundamental System of Solutions of a Linear Homogeneous System
Expanding the determinant in the characteristic equation of the \(n\)th order, we obtain, in general, an equation of the form
\[{\left( {\lambda - {\lambda _1}} \right)^{{k_1}}}{\left( {\lambda - {\lambda _2}} \right)^{{k_2}}} \cdots {\left( {\lambda - {\lambda _m}} \right)^{{k_m}}} = 0,\]
where
\[{k_1} + {k_2} + \cdots + {k_m} = n.\]
Here the number \({k_i}\) is called the algebraic multiplicity of the eigenvalue \({\lambda_i}.\) For each such eigenvalue, there exist \({s_i}\) linearly independent eigenvectors. The number \({s_i}\) is called the geometric multiplicity of the eigenvalue \({\lambda_i}.\)
It is proved in linear algebra that the geometric multiplicity \({s_i}\) does not exceed the algebraic multiplicity \({k_i},\) i.e. the following relation holds:
\[1 \le {s_i} \le {k_i}.\]
The geometric multiplicity can be computed directly as \({s_i} = n - \text{rank}\left( {A - {\lambda _i}I} \right).\)
It turns out that the general solution of the homogeneous system essentially depends on the multiplicity of the eigenvalues. Consider the possible cases that arise here.
\(1.\) Case \({s_i} = {k_i} = 1.\) All Roots of the Auxiliary Equation are Real and Distinct.
In this simplest case, each eigenvalue \({\lambda _i}\) has one associated eigenvector \({\mathbf{V}_i}.\) These vectors form a set of \(n\) linearly independent solutions
\[{\mathbf{X}_1} = {e^{{\lambda _1}t}}{\mathbf{V}_1},\;\;{\mathbf{X}_2} = {e^{{\lambda _2}t}}{\mathbf{V}_2},\; \ldots ,\;{\mathbf{X}_n} = {e^{{\lambda _n}t}}{\mathbf{V}_n},\]
that is, a fundamental system of solutions of the homogeneous system.
By the linear independence of the eigenvectors, the corresponding Wronskian is different from zero:
\[W\left( t \right) = \det \left[ {{e^{{\lambda _1}t}}{\mathbf{V}_1},\;{e^{{\lambda _2}t}}{\mathbf{V}_2},\; \ldots ,\;{e^{{\lambda _n}t}}{\mathbf{V}_n}} \right] = {e^{\left( {{\lambda _1} + {\lambda _2} + \cdots + {\lambda _n}} \right)t}}\det \left[ {{\mathbf{V}_1},{\mathbf{V}_2}, \ldots ,{\mathbf{V}_n}} \right] \ne 0.\]
The general solution is given by
\[\mathbf{X}\left( t \right) = {C_1}{e^{{\lambda _1}t}}{\mathbf{V}_1} + {C_2}{e^{{\lambda _2}t}}{\mathbf{V}_2} + \cdots + {C_n}{e^{{\lambda _n}t}}{\mathbf{V}_n},\]
where \({C_1},\) \({C_2}, \ldots ,\) \({C_n}\) are arbitrary constants.
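Continuing our running example: for \({\lambda _1} = 2\) the system \(\left( {A - 2I} \right)\mathbf{V} = \mathbf{0}\) reduces to \(2{V_1} + {V_2} = 0,\) giving \({\mathbf{V}_1} = {\left( {1, - 2} \right)^T};\) similarly \({\lambda _2} = 5\) gives \({\mathbf{V}_2} = {\left( {1,1} \right)^T}.\) Hence the general solution is
\[\mathbf{X}\left( t \right) = {C_1}{e^{2t}}\begin{pmatrix} 1\\ -2 \end{pmatrix} + {C_2}{e^{5t}}\begin{pmatrix} 1\\ 1 \end{pmatrix}.\]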
The auxiliary equation may have complex roots. If all the entries of the matrix \(A\) are real, then the complex roots always appear in pairs of complex conjugate numbers. Suppose that we have a pair of complex eigenvalues \({\lambda _i} = \alpha \pm \beta i.\) This pair of complex conjugate numbers is associated with a pair of linearly independent real solutions of the form
\[{\mathbf{X}_1}\left( t \right) = \text{Re} \left[ {{e^{\left( {\alpha + \beta i} \right)t}}\mathbf{V}} \right],\;\;{\mathbf{X}_2}\left( t \right) = \text{Im} \left[ {{e^{\left( {\alpha + \beta i} \right)t}}\mathbf{V}} \right],\]
where \(\mathbf{V}\) is an eigenvector corresponding to the eigenvalue \(\alpha + \beta i.\)
Thus, the real and imaginary parts of the complex solution form a pair of real solutions.
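For instance (again a worked example of ours, not from the original text), the matrix \(A = \begin{pmatrix} 1&-2\\ 2&1 \end{pmatrix}\) has the eigenvalues \(\lambda = 1 \pm 2i.\) For \(\lambda = 1 + 2i\) one can take the eigenvector \(\mathbf{V} = {\left( {1, - i} \right)^T},\) and Euler's formula \({e^{\left( {1 + 2i} \right)t}} = {e^t}\left( {\cos 2t + i\sin 2t} \right)\) splits the complex solution \({e^{\left( {1 + 2i} \right)t}}\mathbf{V}\) into the two real solutions
\[{\mathbf{X}_1}\left( t \right) = {e^t}\begin{pmatrix} \cos 2t\\ \sin 2t \end{pmatrix},\;\; {\mathbf{X}_2}\left( t \right) = {e^t}\begin{pmatrix} \sin 2t\\ -\cos 2t \end{pmatrix}.\]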
\(2.\) Case \({s_i} = {k_i} \gt 1.\) The Auxiliary Equation Has Multiple Roots, Whose Geometric and Algebraic Multiplicities are Equal.
This case is similar to the previous one. Despite the existence of eigenvalues of multiplicity greater than \(1,\) we can still find \(n\) linearly independent eigenvectors. In particular, any symmetric matrix with real entries has \(n\) linearly independent (in fact, orthogonal) eigenvectors; unitary matrices have the same property. In general, an \(n \times n\) matrix has \(n\) linearly independent eigenvectors if and only if it is diagonalizable.
The general solution of the system of \(n\) differential equations can be represented as
\[\mathbf{X}\left( t \right) = \sum\limits_{i = 1}^m {{e^{{\lambda _i}t}}\sum\limits_{j = 1}^{{k_i}} {{C_{ij}}{\mathbf{V}_{ij}}} },\]
where \({\mathbf{V}_{i1}}, \ldots ,{\mathbf{V}_{i{k_i}}}\) are linearly independent eigenvectors associated with \({\lambda _i}.\)
Here the total number of terms is \(n,\) and \({C_{ij}}\) are arbitrary constants.
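As an illustration (our example, not from the original text), the real symmetric matrix
\[A = \begin{pmatrix} 2&1&1\\ 1&2&1\\ 1&1&2 \end{pmatrix}\]
has the simple eigenvalue \({\lambda _1} = 4\) with eigenvector \({\left( {1,1,1} \right)^T}\) and the double eigenvalue \({\lambda _2} = 1\) with \({s_2} = {k_2} = 2,\) for which \({\left( {1, - 1,0} \right)^T}\) and \({\left( {1,0, - 1} \right)^T}\) are independent eigenvectors. The general solution therefore contains \(n = 3\) terms:
\[\mathbf{X}\left( t \right) = {C_{11}}{e^{4t}}\begin{pmatrix} 1\\1\\1 \end{pmatrix} + {C_{21}}{e^t}\begin{pmatrix} 1\\-1\\0 \end{pmatrix} + {C_{22}}{e^t}\begin{pmatrix} 1\\0\\-1 \end{pmatrix}.\]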
\(3.\) Case \({s_i} \lt {k_i}.\) The Auxiliary Equation Has Multiple Roots, Whose Geometric Multiplicity is Less Than the Algebraic Multiplicity.
For some matrices \(A\) (such matrices are called defective), an eigenvalue \({\lambda_i}\) of multiplicity \({k_i}\) may have fewer than \({k_i}\) linearly independent eigenvectors. In this case, instead of the missing eigenvectors we can find so-called generalized eigenvectors, so as to get a set of \(n\) linearly independent vectors and construct the corresponding fundamental system of solutions (a small numerical check of such a defect is sketched after this list). Two methods are usually used for this purpose:
- Construction of the General Solution of a System of Equations Using the Method of Undetermined Coefficients;
- Construction of the General Solution of a System of Equations Using the Jordan Form.
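To see Case \(3\) concretely, here is a minimal NumPy sketch (with a hypothetical defective matrix of our own) that detects the defect by comparing the geometric multiplicity \(s = n - \text{rank}\left( {A - \lambda I} \right)\) with the algebraic multiplicity:

```python
import numpy as np

# Hypothetical defective matrix (ours, not from the text): lambda = 1 has
# algebraic multiplicity k = 2, but A - I has rank 1, so s = 1 < k.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0

n = A.shape[0]
s = n - np.linalg.matrix_rank(A - lam * np.eye(n))
print(s)    # 1 -> one eigenvector is "missing" and must be replaced
            # by a generalized eigenvector
```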
A detailed description of these methods is presented separately on the specified web pages. Below we consider examples of systems of differential equations corresponding to Cases \(1\) and \(2.\)