EE301 Introduction to System Theory - PowerPoint PPT Presentation

ECE301, Fall 2006, Copyright P. B. Luh.

1
EE301 Introduction to System Theory
  • Reading Assignment: 3.5, 3.6, 3.8, 3.9; Brogan Ch. 7, 8
  • Problem Set No. 4: Ch. 3 problems 3.13, 3.14, 3.18, 3.22, 3.30, 3.32, 3.37
  • Last Time: Linear Spaces and Linear Operators
  • Linear Operators and Representations
  • Matrix Representation of Linear Operators
  • Change of Basis
  • Norm of a Linear Operator
  • Adjoint Transformation

2
  • Properties of A
  • Systems of Linear Algebraic Equations
  • Orthogonal Complement
  • Pseudo Inverse
  • Term project proposal due Monday, 10/16
  • What is the problem?
  • Why do you want to work on it?
  • What are the major difficulties?
  • What is the method to be investigated, and what are its key ideas?
  • What is your plan of attack?
  • What new results do you expect to get, and why are they novel?
  • Expected results, insights, and significance
  • Numerical implementation and testing are crucial

3
  • Theorem. Let {x1, x2, .., xn} be a basis of X and {w1, .., wm} a basis of Y. Then a linear operator L: (X, F) → (Y, F) is uniquely determined by the n mappings yi = Lxi, i = 1, 2, .., n. Furthermore, let
  • ai be the representation of yi w.r.t. {w1, w2, .., wm}
  • A be the matrix formed as [a1, a2, .., an]
  • α be the representation of any x ∈ X w.r.t. {x1, .., xn}
  • Then the representation β of y = Lx w.r.t. {w1, .., wm} is β = Aα
  • Suppose that L: (X, F) → (X, F), and the basis is changed from {e1, e2, .., en} to {ē1, ē2, .., ēn}
  • ith column of P = representation of ei w.r.t. {ē1, .., ēn}

4
Adjoint Transformation
  • Suppose that X and Y are pre-Hilbert spaces
  • For x ∈ X, y ∈ Y, Ax ∈ Y, <Ax, y>_Y is well defined
  • The adjoint operator A*: (Y, F) → (X, F) is defined by <Ax, y>_Y = <x, A*y>_X for all x ∈ X, y ∈ Y

To preserve orthogonality and norm: in matrix form, A* is the complex conjugate transpose of A
5
Orthogonal Complement
  • Given a set S, the set of all vectors ⊥ (orthogonal) to S is called the orthogonal complement of S, and is denoted as S⊥

R(A)⊥ = N(A*), R(A*)⊥ = N(A)
6
Pseudo Inverse
  • Consider Ax = y. Key results:
  • If ρ(A) ≠ ρ([A y]), then there is no solution
  • If ρ(A) = ρ([A y]) = n, then there is a unique solution
  • If ρ(A) = ρ([A y]) < n, there are an infinite number of sols.
  • Definition. A is full rank. Then among all x1 ∈ X satisfying
  • ||Ax1 - y|| = min over x of ||Ax - y||,
  • let x0 be the unique one with minimum norm
  • The pseudo inverse A⁺ of A is the operator mapping y into x0 as y varies over Y
  • For m ≥ n, A⁺ = (A*A)⁻¹A*
  • For n ≥ m, A⁺ = A*(AA*)⁻¹
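The tall-matrix formula above can be checked numerically. A minimal sketch, assuming a hypothetical full-column-rank A (not one from the slides); for a real matrix, A* is just the transpose:

```python
import numpy as np

# Hypothetical tall matrix (m = 3 > n = 2) with full column rank.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# For m >= n and full rank: A+ = (A*A)^-1 A*   (A* = A.T for real A).
A_plus = np.linalg.inv(A.T @ A) @ A.T
assert np.allclose(A_plus, np.linalg.pinv(A))   # matches numpy's pseudoinverse

# x0 = A+ y is the minimum-norm least-squares solution of Ax = y.
y = np.array([1.0, 2.0, 4.0])
x0 = A_plus @ y
```

For a wide matrix (n ≥ m) the roles swap and A⁺ = A*(AA*)⁻¹ would be used instead.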

7
  • Today: Linear Spaces and Linear Operators
  • Eigenvalues and Eigenvectors
  • Case 1: All Eigenvalues are Distinct
  • Case 2: Eigenvalues with Multiplicity > 1
  • Functions of a Square Matrix
  • Polynomials of a Square Matrix
  • Cayley-Hamilton Theorem and Minimal Polynomial
  • General Functions of a Square Matrix
  • Next Time: Sections 4.1 - 4.4

8
Eigenvalues and Eigenvectors
  • Definition. Let A be a linear operator from (Cn, C) to (Cn, C). A scalar λ is called an eigenvalue of A if ∃ a nonzero x ∈ Cn such that Ax = λx
  • (λI - A)x = 0 has a non-trivial sol. iff Δ(λ) ≡ |λI - A| = 0
  • Δ(λ): the characteristic polynomial of A, with degree n
  • A has n eigenvalues, not necessarily distinct, and some of them could be complex → generally want to have F = C
  • x is the eigenvector associated with λ. What can be said?
  • (λI - A)x = 0 ⟺ x ∈ N(λI - A)
  • The set of eigenvalues of A is called the spectrum of A

9
  • Example

find λ1, λ2, x1, and x2
10
  • We shall see later that
  • Eigenvalues are associated with system stability
  • Eigenvectors form a convenient basis
  • Shall now examine two cases of eigenvalues and eigenvectors
  • Case 1: All eigenvalues are distinct
  • Case 2: Eigenvalues with multiplicity > 1

11
Case 1: All Eigenvalues are Distinct
  • Consider first the case where all the eigenvalues of A are distinct, i.e., λi ≠ λj for i ≠ j
  • Let vi be the eigenvector associated with λi
  • What can we say about {v1, v2, .., vn}?
  • Theorem. {v1, v2, .., vn} are linearly independent
  • How to prove this theorem?
  • Proof. By contradiction
  • Suppose that they are linearly dependent; then assume without loss of generality that

12
  • Σi αivi = 0, with α1 ≠ 0
  • (A - λ2I)(Σi αivi) = 0
  • Σi αi(A - λ2I)vi
  • = Σi αi(λi - λ2)vi
  • = Σi≠2 αi(λi - λ2)vi = 0 ← the second term drops out
  • (A - λ3I) Σi≠2 αi(λi - λ2)vi = 0
  • Σi≠2 αi(λi - λ2)(A - λ3I)vi
  • = Σi≠2 αi(λi - λ2)(λi - λ3)vi
  • = Σi≠2,3 αi(λi - λ2)(λi - λ3)vi = 0
  • ← the third term drops out
  • Finally, α1(λ1 - λ2)(λ1 - λ3) .. (λ1 - λn)v1 = 0
  • Since λi ≠ λj for i ≠ j and α1 ≠ 0, the above implies v1 = 0
  • Contradiction ⇒ {v1, v2, .., vn} are LI

13
  • SGD. What happens if we represent A in terms of them?
  • ai = the representation of yi = Lxi w.r.t. {w1, w2, .., wm}
  • A = the matrix formed as [a1, a2, .., an]
  • Now with {v1, v2, .., vn} as the basis, the ith column of Ā = representation of Lvi w.r.t. {v1, v2, .., vn}
  • Lv1 = Av1 = λ1v1
  • Av2 = λ2v2

Ā is a diagonal matrix
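This change of basis is easy to check numerically. A small sketch, assuming a hypothetical A with distinct eigenvalues (the slide's own A lives in an image); the columns of Q are the eigenvectors:

```python
import numpy as np

# Hypothetical A with distinct eigenvalues -1 and -2.
A = np.array([[ 0.0,  1.0],
              [-2.0, -3.0]])

lam, Q = np.linalg.eig(A)          # columns of Q are the eigenvectors v_i
A_bar = np.linalg.inv(Q) @ A @ Q   # representation of A in the basis {v_i}

# A_bar is diagonal, with the eigenvalues on the diagonal.
assert np.allclose(A_bar, np.diag(lam))
```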
14
  • Example (Continued)

find Ā
  • First by inspection
  • Then by similarity transformation (ith column of Q = representation of ēi w.r.t. {e1, e2, .., en})

Q = [v1 v2]
15
  • Example.

Represent the system dynamics in terms of v1, v2
16
  • What is the system dynamics in terms of v1, v2?
  • Two decoupled modes, which can be easily analyzed
  • The system is stable since Re(λi) < 0 ∀ i

17
  • Theorem. All similar matrices have the same eigenvalues
  • How to prove this?
  • |λI - Q⁻¹AQ| = |λQ⁻¹Q - Q⁻¹AQ|
  • = |Q⁻¹(λI - A)Q|
  • = |Q⁻¹| · |λI - A| · |Q|
  • = |λI - A|
  • The two matrices have the same characteristic polynomial, and therefore have the same set of eigenvalues
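A quick numerical check of this theorem, with random stand-in matrices (a generic random Q is invertible with probability 1); same characteristic polynomial, hence same eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
Q = rng.standard_normal((4, 4))   # generic, hence invertible

B = np.linalg.inv(Q) @ A @ Q      # B is similar to A

# np.poly returns the characteristic-polynomial coefficients of |lI - M|.
assert np.allclose(np.poly(A), np.poly(B))
```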

18
  • Today: Linear Spaces and Linear Operators
  • Eigenvalues and Eigenvectors
  • Case 1: All Eigenvalues are Distinct
  • Case 2: Eigenvalues with Multiplicity > 1
  • Functions of a Square Matrix
  • Polynomials of a Square Matrix
  • Cayley-Hamilton Theorem and Minimal Polynomial
  • General Functions of a Square Matrix

19
Case 2: Eigenvalues with Multiplicity > 1
  • What may happen when the multiplicity of an
    eigenvalue is greater than 1?
  • The matrix may not be diagonalizable
  • Example.

20
  • What is v3?
  • v3 = v2
  • v1, v2, v3 are not LI, and cannot be used as a basis
  • Q formed by them is not invertible, and there is no similarity transformation to diagonalize A. What then?
  • Have to try something different for v2 and v3
  • Let us find v3 such that

Different from the previous v2
  • It will be shown that v1, v2, v3 are LI. What is Ā?

21
  • The ith column of Ā = representation of Lvi w.r.t. {v1, v2, .., vn}

22
  • For this particular example, how to get v2 and v3?
  • What is Q for the similarity transformation? (ith column of Q = representation of ēi w.r.t. {e1, e2, .., en})

23
as expected
  • What are the eigenvalues?

0, 1, 1, as expected
  • A matrix with an eigenvalue of multiplicity > 1 could still be diagonalizable

24
  • Example.

2 LI eigenvectors!
  • A is diagonalizable even with multiplicity > 1

25
  • Definition. A vector v is a generalized eigenvector of grade k associated with λ iff (A - λI)ᵏv = 0 and (A - λI)ᵏ⁻¹v ≠ 0
  • What is the new representation w.r.t. {v1, v2, .., vk}?

26
  • Theorem. The generalized eigenvectors associated
    with a particular eigenvalue are LI
  • Theorem. The generalized eigenvectors associated
    with different eigenvalues are LI
  • The eigenvectors and generalized eigenvectors
    span Cn
  • With this good basis, Ā is the Jordan canonical form

27
  • Today: Linear Spaces and Linear Operators
  • Pseudo Inverse
  • Eigenvalues and Eigenvectors
  • Case 1: All Eigenvalues are Distinct
  • Case 2: Eigenvalues with Multiplicity > 1
  • Functions of a Square Matrix
  • Polynomials of a Square Matrix
  • Cayley-Hamilton Theorem and Minimal Polynomial
  • General Functions of a Square Matrix

28
Functions of a Square Matrix
Polynomials of a Square Matrix
Example.
  • What is A¹? A²? A³? A⁰?

29
  • In general, suppose A: (Cn, C) → (Cn, C)
  • A¹ = A, A² = A·A, A³ = A·A·A
  • Aᵏ = A·A···A (k terms), k ≥ 1
  • A⁰ = I
  • Let f(λ) be a polynomial, e.g.,
  • f(λ) = 5λ³ + 4λ² + 7λ - 2
  • What is f(A)?
  • f(A) = 5A³ + 4A² + 7A - 2A⁰
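Evaluated directly, f(A) is just matrix arithmetic. A sketch with the slide's polynomial and a hypothetical 2x2 triangular A (chosen so the answer is easy to verify on the diagonal):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])   # hypothetical upper-triangular example

def f(M):
    """f(lambda) = 5*lambda^3 + 4*lambda^2 + 7*lambda - 2, applied to a matrix M."""
    I = np.eye(M.shape[0])
    return (5 * np.linalg.matrix_power(M, 3)
            + 4 * np.linalg.matrix_power(M, 2)
            + 7 * M
            - 2 * I)          # the constant term multiplies A^0 = I

fA = f(A)
# For triangular A, the diagonal of f(A) is f applied entrywise to diag(A):
# f(1) = 14, f(3) = 190.
assert np.allclose(np.diag(fA), [14.0, 190.0])
```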

30
  • Is there an easier way to compute f(A)?
  • Would the process be easier for a diagonal or
    block diagonal matrix? How to proceed?

f(A) = 5A³ + 4A² + 7A - 2A⁰
31
  • Example (Continued)

as expected
32
  • In general,
  • What are the advantages of using a diagonal or Jordan canonical form?

33
Cayley-Hamilton Theorem and Minimal Polynomial
Δ(λ) = |λI - A|: a polynomial of degree n; ni = multiplicity of λi
  • What is Δ(A)?
  • Δ(A) = 0

The Cayley-Hamilton Theorem
  • Will prove it in several steps. What is its
    significance?
  • Example (Continued)

Δ(A) = 0, as expected
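A numerical sanity check of Δ(A) = 0, with a hypothetical 2x2 A (np.poly returns the coefficients of the characteristic polynomial):

```python
import numpy as np

A = np.array([[ 0.0,  1.0],
              [-2.0, -3.0]])        # hypothetical; Delta(l) = l^2 + 3l + 2

c = np.poly(A)                      # [1, 3, 2]: coefficients of |lI - A|
Delta_A = (np.linalg.matrix_power(A, 2)
           + c[1] * A
           + c[2] * np.eye(2))      # Delta evaluated at the matrix A

assert np.allclose(Delta_A, np.zeros((2, 2)))   # Cayley-Hamilton
```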
34
  • Any polynomial can be expressed as a polynomial of degree n-1
  • This comes from the fact that Δ(A) = 0, where Δ(λ) is a polynomial of degree n (proof to follow)
  • If there is a polynomial ψ(λ) of degree m < n such that ψ(A) = 0, then any polynomial can be expressed as a polynomial of degree m-1
  • The minimal polynomial ψ(λ) of A is the monic polynomial (with highest-power coefficient 1) of least degree such that ψ(A) = 0
  • What is ψ(λ)?

n̄i = order of the largest Jordan block associated with λi, or the index of λi
35
  • Theorem. Similar matrices have the same minimal polynomial
  • Proof.
  • f(A) = 0 iff f(Ā) = 0, since f(A) = Q f(Ā) Q⁻¹
  • ⇒ We can use Ā to find ψ(λ)
  • Consider a third-order Jordan block
36
  • Therefore
  • In general,
  • It is clear that ψ(Ā) = 0
  • There is no other monic polynomial ψ'(λ) of lower order such that ψ'(Ā) = 0 ⇒ ψ(λ) is the minimal polynomial
  • As a by-product, Δ(Ā) = 0
Therefore the Cayley-Hamilton Theorem is proved
37
  • Example (Continued). Find ψ(λ) for the following
  • Recall that A is diagonalizable

38
  • Example
  • How to solve this problem?
  • We should be able to represent f(A) as
  • A^85 = α0I + α1A ≡ g(A)
  • Much easier to compute
  • What are α0 and α1? How to obtain them?
  • A general problem: find g(A) that is equivalent to f(A) but simpler to evaluate

39
  • Under what conditions would f(A) = g(A)?
  • Theorem. Let f and g be two polynomials. Then the following statements are equivalent:
  • (1) f(A) = g(A)
  • (2) f = g + h1ψ or g = f + h2ψ, where h1 and h2 are some polynomials
  • (3) f^(l)(λi) = g^(l)(λi), l = 0, 1, .., n̄i - 1, i = 1, .., m

where m is the number of distinct eigenvalues
  • Proof
  • (2) ⇒ (1) is okay since ψ(A) = 0
  • To see (2) ⇒ (3), suppose the following

40
  • Suppose f = g + h1ψ, and ψ = (λ - λi)³; then
  • f^(0)(λi) = g^(0)(λi) + h1^(0)(λi)ψ^(0)(λi) = g^(0)(λi)
  • f^(1)(λi) = g^(1)(λi) + h1^(1)(λi)ψ(λi) + h1(λi)ψ^(1)(λi) = g^(1)(λi)
  • f^(2)(λi) = g^(2)(λi) + h1^(2)(λi)ψ(λi) + 2h1^(1)(λi)ψ^(1)(λi) + h1(λi)ψ^(2)(λi) = g^(2)(λi)
  • Corollary.
  • If f^(l)(λi) = g^(l)(λi), l = 0, 1, .., ni - 1, i = 1, .., m, then f(A) = g(A)
  • If f = g + h1Δ or g = f + h2Δ, then f(A) = g(A)
  • Definition. f^(l)(λi), l = 0, 1, .., ni - 1, i = 1, .., m, are called the values of f on the spectrum of A
  • Any two polynomials having the same values on the spectrum of A define the same matrix function

41
  • Example.
  • Performing long division to obtain
  • f(λ) = (λ³ + 5λ² + 28λ + 150)Δ(λ) + 807λ + 301
  • f(A) = g(A) = 807A + 301I

42
  • Example (continued)
  • What is a good g?
  • g(λ) = α0 + α1λ
  • g^(0)(λ1) = α0 + α1λ1 = α0 + 2α1 = 2^85
  • g^(0)(λ2) = α0 + α1λ2 = α0 + α1 = 1
  • α1 = 2^85 - 1, α0 = 2 - 2^85 ⇒ g(λ) = (2 - 2^85) + (2^85 - 1)λ

43
  • g(A) = (2 - 2^85)I + (2^85 - 1)A
  • One way to compute f(A):
  • Form Δ(λ) (or ψ(λ)), and find λi and f^(l)(λi)
  • Construct an (n-1)th (or (Σn̄i - 1)th) order polynomial
  • g(λ) = α0 + α1λ + α2λ² + .. + αn-1λⁿ⁻¹
  • s.t. f and g have the same values on the spectrum of A
  • f(A) = g(A)
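The A^85 example can be reproduced end to end with exact integer arithmetic. The 2x2 A below is a hypothetical matrix with the slide's eigenvalues λ1 = 2, λ2 = 1 (the slide's own A is not in the transcript):

```python
# Hypothetical A with eigenvalues 2 and 1 (lower triangular, so the
# eigenvalues are the diagonal entries).
A = [[2, 0],
     [1, 1]]

# Match g(l) = a0 + a1*l to f(l) = l**85 on the spectrum:
#   a0 + 2*a1 = 2**85,   a0 + 1*a1 = 1
a1 = 2**85 - 1
a0 = 2 - 2**85

def matmul(X, Y):
    """2x2 integer matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Brute-force A**85 by repeated multiplication (exact Python ints).
P = [[1, 0], [0, 1]]
for _ in range(85):
    P = matmul(P, A)

# g(A) = a0*I + a1*A
g_of_A = [[a0 * (i == j) + a1 * A[i][j] for j in range(2)] for i in range(2)]

assert g_of_A == P   # the degree-1 polynomial reproduces A**85 exactly
```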

44
  • Another use of the Cayley-Hamilton Theorem: find A⁻¹
  • Δ(λ) = λⁿ + αn-1λⁿ⁻¹ + αn-2λⁿ⁻² + .. + α1λ + α0
  • Δ(A) = Aⁿ + αn-1Aⁿ⁻¹ + αn-2Aⁿ⁻² + .. + α1A + α0I = 0
  • Multiplying by A⁻¹ (if A⁻¹ exists): Aⁿ⁻¹ + αn-1Aⁿ⁻² + αn-2Aⁿ⁻³ + .. + α1I + α0A⁻¹ = 0
  • A⁻¹ = -(Aⁿ⁻¹ + αn-1Aⁿ⁻² + αn-2Aⁿ⁻³ + .. + α1I)/α0
  • Converts inversion to multiplication, assuming α0 ≠ 0
  • Example.
  • Example.

α1 = -3, α0 = 2
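With α1 = -3 and α0 = 2 (so Δ(λ) = λ² - 3λ + 2), the formula reads A⁻¹ = -(A - 3I)/2. A sketch with a hypothetical A having that characteristic polynomial:

```python
import numpy as np

# Hypothetical A with Delta(l) = l^2 - 3l + 2 (eigenvalues 1 and 2).
A = np.array([[2.0, 0.0],
              [1.0, 1.0]])
assert np.allclose(np.poly(A), [1.0, -3.0, 2.0])   # alpha1 = -3, alpha0 = 2

# A^-1 = -(A + alpha1*I)/alpha0 = -(A - 3I)/2
A_inv = -(A - 3.0 * np.eye(2)) / 2.0

assert np.allclose(A @ A_inv, np.eye(2))   # it really is the inverse
```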
45
  • Under what condition does the inverse exist?
  • α0 ≠ 0 (Δ(λ) = λⁿ + αn-1λⁿ⁻¹ + .. + α1λ + α0)
  • What does this mean? Since α0 = Δ(0) = |0·I - A| = (-1)ⁿ|A|, α0 ≠ 0 iff 0 is not an eigenvalue of A
  • Consider a matrix in diagonal form
  • Similar things can be said for A in Jordan canonical form or in other forms

46
  • Today: Linear Spaces and Linear Operators
  • Eigenvalues and Eigenvectors
  • Case 1: All Eigenvalues are Distinct
  • Case 2: Eigenvalues with Multiplicity > 1
  • Functions of a Square Matrix
  • Polynomials of a Square Matrix
  • Cayley-Hamilton Theorem and Minimal Polynomial
  • General Functions of a Square Matrix

47
General Functions of a Square Matrix
  • Previously we studied polynomials of a square
    matrix. How about non-polynomial functions?
  • Suppose f(λ) = e^λ, sin λ, or 1/(s - λ). What is f(A)?
  • Two definitions
  • By means of a polynomial g(?) having the same
    values on the spectrum of A
  • By an infinite series
  • It can be shown that the two are equivalent

48
  • Definition 1. Let f(λ) be a general function with f^(l)(λi) well defined, and let g(λ) be a polynomial with g^(l)(λi) = f^(l)(λi) for all i and l.
  • Then f(A) ≡ g(A)
  • Example: f(λ) = e^(λt)

λ1 = 2, λ2 = 3; f^(0)(λ1) = e^(2t), f^(0)(λ2) = e^(3t)
49
  • Now let g(λ) = α0 + α1λ
  • g^(0)(λ1) = α0 + 2α1 = e^(2t)
  • g^(0)(λ2) = α0 + 3α1 = e^(3t)
  • α1 = e^(3t) - e^(2t), α0 = e^(2t) - 2α1 = 3e^(2t) - 2e^(3t)
  • g(λ) = (3e^(2t) - 2e^(3t)) + (-e^(2t) + e^(3t))λ
  • f(A) = g(A) = (3e^(2t) - 2e^(3t))I + (-e^(2t) + e^(3t))A
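The closed form above can be checked against a direct eigendecomposition of e^(At). The A below is a hypothetical matrix with eigenvalues 2 and 3 (the slide's A is in an image):

```python
import numpy as np

t = 0.7
A = np.array([[2.0, 0.0],
              [1.0, 3.0]])   # hypothetical; eigenvalues 2 and 3

# f(A) = (3e^{2t} - 2e^{3t}) I + (e^{3t} - e^{2t}) A
g_of_A = ((3*np.exp(2*t) - 2*np.exp(3*t)) * np.eye(2)
          + (np.exp(3*t) - np.exp(2*t)) * A)

# Reference: e^{At} = Q e^{Lambda t} Q^{-1} via eigendecomposition.
lam, Q = np.linalg.eig(A)
expAt = Q @ np.diag(np.exp(lam * t)) @ np.linalg.inv(Q)

assert np.allclose(g_of_A, expAt)
```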

50
  • Thus to calculate f(A) given f(λ) and A:
  • Form Δ(λ) (or ψ(λ)), and find λi and f^(l)(λi)
  • Construct an (n-1)th (or (Σn̄i - 1)th) order polynomial such that g^(l)(λi) = f^(l)(λi) for all i and l
  • f(A) = g(A)
  • Definition 2. Let f(λ) ≡ Σ (i from 0 to ∞) αiλⁱ with radius of convergence ρ. Then
  • f(A) ≡ Σ (i from 0 to ∞) αiAⁱ
  • if |λj| < ρ for all j, or Aᵏ = 0 for some positive k (in this case, a finite sum)
  • It can be shown that the two definitions are equivalent

51
  • Example. Find eAt for a diagonal A and for A in
    Jordan canonical form

52
  • Now suppose that A is a Jordan block. Find e^(At)
  • Δ(λ) = (λ - λ1)⁴, with λ1 of multiplicity 4
  • f^(0)(λ1) = e^(λ1t), f^(1)(λ1) = te^(λ1t)
  • f^(2)(λ1) = t²e^(λ1t), f^(3)(λ1) = t³e^(λ1t)
  • g(λ) = α0 + α1(λ - λ1) + α2(λ - λ1)² + α3(λ - λ1)³
  • g^(0)(λ1) = α0 = e^(λ1t), g^(1)(λ1) = α1 = te^(λ1t)
  • g^(2)(λ1) = 2α2 = t²e^(λ1t), g^(3)(λ1) = 6α3 = t³e^(λ1t)

53
  • α0 = e^(λ1t), α1 = te^(λ1t), α2 = 0.5t²e^(λ1t), α3 = t³e^(λ1t)/6
  • g(λ) = e^(λ1t) + te^(λ1t)(λ - λ1) + 0.5t²e^(λ1t)(λ - λ1)² + t³e^(λ1t)(λ - λ1)³/6
  • f(A) = g(A) = e^(λ1t)I + te^(λ1t)(A - λ1I) + 0.5t²e^(λ1t)(A - λ1I)² + t³e^(λ1t)(A - λ1I)³/6

Components: tᵏe^(λ1t), 0 ≤ k ≤ n-1
54
  • The above process can be easily extended to
    matrices in Jordan canonical form
  • For a non-diagonal but diagonalizable matrix

55
  • Example. f(λ) = sin λt = λt - (λt)³/3! + (λt)⁵/5! - .. Find f(A)
  • f(A) = sin At = tA - (tA)³/3! + (tA)⁵/5! - ..
  • If A is diagonal, then f(A) can be easily computed
  • Otherwise, may use f(A) = Q f(Ā) Q⁻¹, or find g(λ) so that f and g have the same values on the spectrum of A
  • Similarly, cos At = I - (tA)²/2! + (tA)⁴/4! - ..
  • It can also be shown that
  • sin²At + cos²At = I
  • sin At = (e^(jAt) - e^(-jAt))/2j, ...

56
  • Example.

57
  • Example.

Assuming that A⁻¹ exists
Assuming λ ≠ 0
58
  • Example. Laplace Transform of eAt

Assuming ||A/s|| < 1
Assuming s is sufficiently large
  • How to compute (sI - A)-1?

59
  • Example. f(λ) = (s - λ)⁻¹. Compute f(A) = (sI - A)⁻¹
  • Δ(λ) = (λ - λ1)³, with λ1 of multiplicity 3
  • f^(0)(λ1) = (s - λ1)⁻¹, f^(1)(λ1) = (s - λ1)⁻², f^(2)(λ1) = 2(s - λ1)⁻³
  • g(λ) = α0 + α1(λ - λ1) + α2(λ - λ1)²
  • g^(0)(λ1) = α0 = (s - λ1)⁻¹, g^(1)(λ1) = α1 = (s - λ1)⁻²
  • g^(2)(λ1) = 2α2 = 2(s - λ1)⁻³
  • g(λ) = (s - λ1)⁻¹ + (s - λ1)⁻²(λ - λ1) + (s - λ1)⁻³(λ - λ1)²
  • g(A) = (s - λ1)⁻¹I + (s - λ1)⁻²(A - λ1I) + (s - λ1)⁻³(A - λ1I)²
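As a check, this g(A) matches a direct inversion of (sI - A) for a hypothetical 3x3 Jordan block (the values of s and λ1 are arbitrary, with s ≠ λ1):

```python
import numpy as np

lam, s, n = 2.0, 5.0, 3
A = lam * np.eye(n) + np.diag(np.ones(n - 1), 1)   # 3x3 Jordan block
N = A - lam * np.eye(n)                            # (A - lam I), nilpotent

# g(A) = (s-lam)^-1 I + (s-lam)^-2 (A-lam I) + (s-lam)^-3 (A-lam I)^2
g_of_A = sum((s - lam) ** (-(k + 1)) * np.linalg.matrix_power(N, k)
             for k in range(n))

assert np.allclose(g_of_A, np.linalg.inv(s * np.eye(n) - A))
```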

60
61
  • Today: Linear Spaces and Linear Operators
  • Eigenvalues and Eigenvectors
  • Case 1: All Eigenvalues are Distinct
  • Case 2: Eigenvalues with Multiplicity > 1
  • Functions of a Square Matrix
  • Polynomials of a Square Matrix
  • Cayley-Hamilton Theorem and Minimal Polynomial
  • General Functions of a Square Matrix
  • Next Time: Sections 4.1 - 4.4