Do We Really Need Eigen Values?
Many problems in mathematics and physics reduce to "eigenvalue
problems", and people often expend a great deal of effort trying
to determine the eigenvalues of a general NxN system, a task
that is sometimes reduced to finding the roots of the Nth-degree
characteristic polynomial. Efficiently finding all the complex
roots of a high-degree polynomial is non-trivial, and is
complicated by the need to consider various special cases, such
as repeated roots.
However, depending on what we're really trying to accomplish,
we may not need to find the eigenvalues at all. Of course, the
description of an action is usually simplest when its components
are resolved along the directions of the eigenvectors; e.g., the
real eigenvector of a rotation matrix defines its axis of rotation.
As a result, it's easy to write equations for the positions of
points on a rotating sphere if we work in coordinates that are
aligned with the eigenvectors of the rotation, whereas if we work
in some other arbitrary coordinates the equations describing the
motion will be more complicated.
However, there are many situations in which the characteristic
axes of the phenomenon can be exploited without ever explicitly
determining the eigenvalues. For example, suppose we have a
system of three continuous variables x(t), y(t), and z(t)
(primes denoting derivatives with respect to t) satisfying the
equations
   ax + by + cz  =  x'
   dx + ey + fz  =  y'
   gx + hy + iz  =  z'
and we want to determine the exact root-matched recurrence formulas
for x(nT), y(nT), and z(nT) for a given discrete time interval T.
The characteristic polynomial is

   r^3 - A r^2 + B r - C = 0

where

   A = a + e + i                                (the trace)
   B = (ae - bd) + (ai - cg) + (ei - fh)        (the principal 2x2 minors)
   C = aei + bfg + cdh - ceg - bdi - afh        (the determinant)
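As a concrete sketch (in Python, with illustrative names), these
coefficients can be computed directly from the matrix entries:

   def char_poly_coeffs(M):
       # M is the 3x3 coefficient matrix [[a,b,c],[d,e,f],[g,h,i]]
       (a, b, c), (d, e, f), (g, h, i) = M
       A = a + e + i                                        # trace
       B = (a*e - b*d) + (a*i - c*g) + (e*i - f*h)          # 2x2 minors
       C = a*e*i + b*f*g + c*d*h - c*e*g - b*d*i - a*f*h    # determinant
       return A, B, C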
The root-matched recurrence coefficients are proportional to the
elementary symmetric functions (with alternating signs) of the
quantities exp(r_k T), where r_k, k=1,2,3 are the eigenvalues, so
a common approach is to solve the characteristic equation for the
eigenvalues and then compute the recurrence coefficients. Even for
a simple cubic the precise determination of the (complex) roots in
embedded code, covering all possible cases, can be tricky and
time-consuming. For higher-order systems it's even worse.
However, we don't really need to solve for the eigenvalues. Let
w(k) denote the sum of the kth powers of the roots r_1, r_2, r_3.
Clearly w(0) = 3, and by Newton's identities the values of w(k)
for k = 1, 2, ... are given by

   w(1) - A = 0
   w(2) - A w(1) + 2B = 0

and

   w(k) - A w(k-1) + B w(k-2) - C w(k-3) = 0

for all k > N-1 (here N = 3).
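In code this is just a three-term linear recurrence; a hedged
Python sketch (illustrative names, continuing char_poly_coeffs
above):

   def power_sums(A, B, C, kmax):
       # w[k] = r_1^k + r_2^k + r_3^k, via Newton's identities
       w = [3.0, A, A*A - 2*B]          # w(0), w(1), w(2)
       for k in range(3, kmax + 1):     # full recurrence for k > N-1
           w.append(A*w[k-1] - B*w[k-2] + C*w[k-3])
       return w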
Now let E(k) denote the sum of the kth powers of the values
exp(r_j T), j = 1, 2, 3. Expanding each of the exponential
functions as a Taylor series and combining terms by powers of T
gives

                 inf   (kT)^j
        E(k)  =  SUM   ------  w(j)          k = 1, 2, 3
                 j=0     j!
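The factorial in the denominator makes this series converge for
any k and T, so a simple truncated sum suffices; a minimal
sketch, assuming a list w of power sums as computed above:

   def E_of(k, T, w, tol=1e-16):
       # E(k) = sum over j of (kT)^j / j! * w(j), truncated once
       # the running term no longer affects the total
       total, term = 0.0, 1.0           # term = (kT)^j / j!
       for j in range(len(w)):
           contrib = term * w[j]
           total += contrib
           if j > 5 and abs(contrib) <= tol * abs(total):
               break
           term *= k*T / (j + 1)
       return total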
Letting s_k denote (-1)^k times the kth elementary symmetric
function of the quantities exp(r_j T), we can now compute s_0,
s_1, s_2, and s_3 using the identities

   s_0 = 1
   E(1) + s_1 = 0
   E(2) + E(1) s_1 + 2 s_2 = 0
   E(3) + E(2) s_1 + E(1) s_2 + 3 s_3 = 0
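These triangular relations can be solved for s_1, s_2, s_3 by
forward substitution; a minimal sketch:

   def recurrence_coeffs(E1, E2, E3):
       # signed elementary symmetric functions of exp(r_j T)
       s1 = -E1
       s2 = -(E2 + E1*s1) / 2.0
       s3 = -(E3 + E2*s1 + E1*s2) / 3.0
       return 1.0, s1, s2, s3           # s_0, s_1, s_2, s_3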
Then the exact homogeneous recurrence formula for x(t) is

   s_0 x(nT) + s_1 x((n-1)T) + s_2 x((n-2)T) + s_3 x((n-3)T) = 0

and the recurrences for y(t) and z(t) are identical. The
summation for E(k) is well-behaved and easy to implement in
embedded code. This method, which applies directly to systems of
any order, requires no root-finding algorithm and no special
consideration of repeated roots.
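For illustration, the sketches above can be chained together on
an arbitrary example matrix and step size (hypothetical values),
with an eigenvalue-based computation used purely as a cross-check:
numpy's poly function returns exactly the coefficients
[s_0, s_1, s_2, s_3] of the monic polynomial whose roots are the
values exp(r_k T).

   import numpy as np

   M = [[ 0.0,  1.0,  0.0],
        [ 0.0,  0.0,  1.0],
        [-2.0, -3.0, -1.0]]
   T = 0.1

   A, B, C = char_poly_coeffs(M)
   w = power_sums(A, B, C, kmax=60)     # plenty of terms for the series
   E1, E2, E3 = (E_of(k, T, w) for k in (1, 2, 3))
   s = recurrence_coeffs(E1, E2, E3)

   # cross-check against explicit eigenvalues (verification only;
   # the method itself never finds a root)
   s_ref = np.poly(np.exp(np.linalg.eigvals(np.array(M)) * T))
   print(s)
   print(s_ref)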