Transcript and Presenter's Notes

Title: Lecture 8 Gaussian Elimination


1
Lecture 8 - Gaussian Elimination
  • CVEN 302
  • September 12, 2001

2
Lecture Goals
  • Discuss how to solve systems of linear equations
  • Gaussian Elimination
  • Gaussian Elimination with Pivoting
  • Tridiagonal Solver
  • Problems with the technique
  • Examples

3
Gaussian Elimination
  • There are two phases to the solving technique
  • Elimination --- use row operations to convert the
    matrix into an upper triangular matrix. (The
    elimination phase takes the most effort and is the
    most susceptible to corruption by round-off.)
  • Back substitution --- solve for x by back
    substitution.

4
Gaussian Elimination Algorithm
  • Ax = b
  • Augment the n x n coefficient matrix with the
    vector of right-hand sides to form an n x (n+1)
    augmented matrix
  • Interchange rows if necessary to make the value
    a11 the largest in magnitude of any coefficient
    in the first column
  • Create zeros in the 2nd through nth rows of the
    first column by subtracting ai1 / a11 times the
    first row from the ith row

5
Gaussian Elimination Algorithm
  • Repeat (2) (3) for second through the (n-1)th
    rows, putting the largest magnitude coefficient
    in the diagonal by interchanging rows (consider
    only row j to n ) and then subtract times the
    jth row from the ith row so as to create zeros in
    all positions of jth column below the diagonal at
    conclusion of this step the system is upper
    triangular
  • Solve for n from nth equation xn an,n1 / ann
  • Solve for xn-1 , xn-2 , ...x1 from the (n-1)th
    through the first xi (ai,n1 - Sji1,n aj1
    xj ) / aii
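A minimal Python sketch of the procedure just described (interchange rows to get the largest pivot, eliminate below the diagonal, then back-substitute). It is an illustration only; gauss_solve is a placeholder name, and this is not the course's GEdemo/GEPivotdemo code.

    import numpy as np

    def gauss_solve(A, b):
        """Solve Ax = b by Gaussian elimination with partial pivoting (sketch)."""
        A = np.asarray(A, dtype=float)
        b = np.asarray(b, dtype=float)
        n = len(b)
        aug = np.hstack([A, b.reshape(n, 1)])      # n x (n+1) augmented matrix

        # Elimination phase
        for j in range(n - 1):
            # Step 2: interchange rows j..n-1 so the largest-magnitude
            # coefficient of column j sits on the diagonal
            p = j + np.argmax(np.abs(aug[j:, j]))
            if p != j:
                aug[[j, p]] = aug[[p, j]]
            # Step 3: create zeros below the diagonal in column j
            for i in range(j + 1, n):
                m = aug[i, j] / aug[j, j]          # multiplier aij / ajj
                aug[i, j:] -= m * aug[j, j:]

        # Back-substitution phase: xi = (ai,n+1 - sum of aij xj) / aii
        x = np.zeros(n)
        for i in range(n - 1, -1, -1):
            x[i] = (aug[i, n] - aug[i, i + 1:n] @ x[i + 1:n]) / aug[i, i]
        return x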

6
Example 1
  • X1 + 3X2 = 5
  • 2X1 + 4X2 = 6
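As a usage check of the gauss_solve sketch above (by hand: subtracting 2 times the first row from the second gives -2X2 = -4, so X2 = 2 and X1 = 5 - 3(2) = -1):

    # Uses the gauss_solve sketch defined above (illustrative only).
    A = [[1, 3],
         [2, 4]]
    b = [5, 6]
    print(gauss_solve(A, b))   # expected: [-1.  2.]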

7
Example 2
  • -3X1 + 2X2 - X3 = -1
  • 6X1 - 6X2 + 7X3 = -7
  • 3X1 - 4X2 + 4X3 = -6

8
Computer Program
  • The program GEdemo(A,b) does the Gaussian
    elimination for a square matrix (n x n). It
    does not do any pivoting and works for only
    one b vector.
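For comparison, a sketch of a no-pivot routine of the kind described here (again an illustration, not the actual GEdemo code). The only difference from the earlier sketch is that rows are never interchanged, so a zero landing on the diagonal produces nans or infs:

    import numpy as np

    def gauss_solve_nopivot(A, b):
        """Gaussian elimination without pivoting, single b vector (sketch)."""
        A = np.asarray(A, dtype=float)
        b = np.asarray(b, dtype=float)
        n = len(b)
        aug = np.hstack([A, b.reshape(n, 1)])
        for j in range(n - 1):
            for i in range(j + 1, n):
                # No row interchange: fails if aug[j, j] happens to be zero
                aug[i, j:] -= (aug[i, j] / aug[j, j]) * aug[j, j:]
        x = np.zeros(n)
        for i in range(n - 1, -1, -1):
            x[i] = (aug[i, n] - aug[i, i + 1:n] @ x[i + 1:n]) / aug[i, i]
        return x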

9
Test the Program
  • Example 1
  • Example 2
  • New Matrix
  • 2X1 + 4X2 - 2X3 - 2X4 = -4
  • X1 + 2X2 + 4X3 - 3X4 = 5
  • -3X1 - 3X2 + 8X3 - 2X4 = 7
  • -X1 + X2 + 6X3 - 3X4 = 7
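As a check on the new 4 x 4 system: after the first elimination pass the (2,2) entry becomes zero, so the no-pivot sketch above would fail on this system, while the pivoting version handles it.

    # Uses the gauss_solve sketch defined earlier (illustrative only).
    A = [[ 2,  4, -2, -2],
         [ 1,  2,  4, -3],
         [-3, -3,  8, -2],
         [-1,  1,  6, -3]]
    b = [-4, 5, 7, 7]
    print(gauss_solve(A, b))   # expected: [1. 2. 3. 4.]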

10
Problem with Gaussian Elimination
  • The problem can occur when a zero appears on the
    diagonal and makes simple Gaussian elimination
    impossible.
  • Pivoting rearranges the rows of the matrix so that
    it becomes more nearly diagonally dominant, reducing
    the round-off and truncation errors in solving the
    system.

11
Example of Pivoting
  • 2X1 + 4X2 - 2X3 = 10
  • X1 + 2X2 + 4X3 = 6
  • 2X1 + 2X2 + X3 = 2
  • Answer: (X1, X2, X3) = (-3.40, 4.30, 0.20)
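Eliminating X1 here without interchanging rows makes the (2,2) entry zero (row 2 minus one half of row 1 leaves 5X3 = 1), which is why this example calls for pivoting. Any solver reproduces the stated answer; a minimal check with NumPy:

    import numpy as np

    A = np.array([[2.0, 4.0, -2.0],
                  [1.0, 2.0,  4.0],
                  [2.0, 2.0,  1.0]])
    b = np.array([10.0, 6.0, 2.0])
    print(np.linalg.solve(A, b))   # approximately [-3.4  4.3  0.2]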

12
Computer Program
  • GEPivotdemo(A,b) is a program that performs
    Gaussian elimination on matrix A with a pivoting
    technique to make the matrix diagonally dominant.
  • The program is a modification that handles a single
    b vector.

13
Question?
  • How would you modify the programs to handle
    multiple inputs?
  • What is a diagonal matrix, an upper triangular
    matrix, and a lower triangular matrix?
  • Can you do a column exchange, and how would you
    handle the solution if you do?

14
Question?
  • What happens with the following example?
  • 0.0001X1 + 0.5X2 = 0.5
  • 0.4000X1 - 0.3X2 = 0.1
  • What happens is that, when the first equation is
    used to eliminate X1, the multiplier is
    0.4000 / 0.0001 = 4000 and the second equation
    becomes approximately -2000X2 = -2000

15
Gaussian Elimination
  • If the diagonal is not dominant, the problem can
    suffer from round-off and truncation errors.
  • Poor scaling will also result in problems

16
Question?
  • What happens with the following example for
    values with two significant figures?
  • 0.4000X1 - 0.3X2 = 0.1
  • 0.0001X1 + 0.5X2 = 0.5
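The following sketch mimics two-significant-figure arithmetic with a crude round_sig helper (the helper and the precision model are assumptions for illustration, not the course's code). With the tiny pivot 0.0001 first, the computed X1 comes out 0 even though the true solution has X1 and X2 both close to 1; with the rows interchanged so the large pivot comes first, both unknowns come out close to 1:

    from math import floor, log10

    def round_sig(x, sig=2):
        """Crude model: keep only `sig` significant figures of x."""
        if x == 0.0:
            return 0.0
        return round(x, -int(floor(log10(abs(x)))) + (sig - 1))

    # Small pivot first: 0.0001 X1 + 0.5 X2 = 0.5, then 0.4000 X1 - 0.3 X2 = 0.1
    m   = round_sig(0.4000 / 0.0001)            # multiplier 4000
    a22 = round_sig(-0.3 - m * 0.5)             # -2000.3 -> -2000
    r2  = round_sig(0.1 - m * 0.5)              # -1999.9 -> -2000
    x2  = round_sig(r2 / a22)                   # 1.0
    x1  = round_sig((0.5 - 0.5 * x2) / 0.0001)  # 0.0, far from the true value near 1
    print("small pivot first:", x1, x2)

    # Rows interchanged: 0.4000 X1 - 0.3 X2 = 0.1 on top
    m   = round_sig(0.0001 / 0.4000)            # multiplier 0.00025
    a22 = round_sig(0.5 - m * (-0.3))           # 0.500075 -> 0.5
    r2  = round_sig(0.5 - m * 0.1)              # 0.499975 -> 0.5
    x2  = round_sig(r2 / a22)                   # 1.0
    x1  = round_sig((0.1 + 0.3 * x2) / 0.4)     # 1.0, close to the true solution
    print("large pivot first:", x1, x2)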

17
Scaling
  • Scaling is the operation of adjusting the
    coefficients of a set of equations so that they
    are all of the same order of magnitude.
  • A set of equations may involve relationships
    between quantities measured in widely different
    units (N vs. kN, sec vs. hrs, etc.). This may
    result in some equations having very large numbers
    and others very small ones. If we then select
    pivots, we may put numbers on the diagonal that
    are not large in comparison to the rest of their
    rows and create the round-off errors that pivoting
    was supposed to avoid.

18
Scaling
  • What happens with the following example?
  • 3X1 + 2X2 + 100X3 = 105
  • -X1 + 3X2 + 100X3 = 102
  • X1 + 2X2 - X3 = 2

19
Scaling
  • The best way to handle the problem is to
    normalize the equations (divide each row by its
    largest-magnitude coefficient).
  • 0.03X1 + 0.02X2 + 1.00X3 = 1.05
  • -0.01X1 + 0.03X2 + 1.00X3 = 1.02
  • 0.50X1 + 1.00X2 - 0.50X3 = 1.00
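A short sketch of that normalization: divide each row of the augmented matrix by its largest-magnitude coefficient. It reproduces the scaled system above (illustrative code, not the course's):

    import numpy as np

    # Augmented matrix [A | b] for the system on slide 18
    aug = np.array([[ 3.0, 2.0, 100.0, 105.0],
                    [-1.0, 3.0, 100.0, 102.0],
                    [ 1.0, 2.0,  -1.0,   2.0]])

    # Divide every row by the largest-magnitude coefficient in that row
    scale = np.abs(aug[:, :-1]).max(axis=1)
    print(aug / scale[:, None])
    # [[ 0.03  0.02  1.    1.05]
    #  [-0.01  0.03  1.    1.02]
    #  [ 0.5   1.   -0.5   1.  ]]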

20
Gauss-Jordan Method
  • The Gauss-Jordan method is similar to
    Gaussian elimination.
  • The method requires almost 50% more operations.

21
Gauss-Jordan Method
The Gauss-Jordan method changes the matrix into
the identity matrix.
22
Gauss-Jordan Method
  • There is only one phase to the solving technique
  • Elimination --- use row operations to convert the
    matrix into an identity matrix.
  • The transformed b vector is then the solution
    vector x.

23
Gauss-Jordan Algorithm
  • Ax = b
  • Augment the n x n coefficient matrix with the
    vector of right-hand sides to form an n x (n+1)
    augmented matrix
  • Interchange rows if necessary to make the value
    a11 the largest in magnitude of any coefficient
    in the first column
  • Create zeros in the 2nd through nth rows of the
    first column by subtracting ai1 / a11 times the
    first row from the ith row

24
Gauss-Jordan Elimination Algorithm
  • Repeat steps (2) and (3) for the first through the
    nth rows, putting the largest-magnitude coefficient
    on the diagonal by interchanging rows (consider
    only rows j to n), and then subtract aij / ajj
    times the jth row from the ith row so as to create
    zeros in all positions of the jth column; the
    diagonal becomes all ones
  • Solve for all of the unknowns: xi = ai,n+1
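A minimal Gauss-Jordan sketch along the lines of the algorithm above; gauss_jordan_solve is a placeholder name, and the call at the end reuses Example 1:

    import numpy as np

    def gauss_jordan_solve(A, b):
        """Reduce [A | b] to [I | x] by Gauss-Jordan elimination (sketch)."""
        A = np.asarray(A, dtype=float)
        b = np.asarray(b, dtype=float)
        n = len(b)
        aug = np.hstack([A, b.reshape(n, 1)])
        for j in range(n):
            # Put the largest-magnitude coefficient of column j on the diagonal
            p = j + np.argmax(np.abs(aug[j:, j]))
            if p != j:
                aug[[j, p]] = aug[[p, j]]
            aug[j] /= aug[j, j]                  # scale the diagonal entry to 1
            for i in range(n):
                if i != j:                       # zero column j above and below
                    aug[i] -= aug[i, j] * aug[j]
        return aug[:, n]                         # xi = ai,n+1; no back substitution

    print(gauss_jordan_solve([[1, 3], [2, 4]], [5, 6]))   # expected: [-1.  2.]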

25
Example 1
  • X1 + 3X2 = 5
  • 2X1 + 4X2 = 6

26
Example 2
  • -3X1 + 2X2 - X3 = -1
  • 6X1 - 6X2 + 7X3 = -7
  • 3X1 - 4X2 + 4X3 = -6

27
Band Solver
  • Large matrices tend to be banded, which means
    that the matrix has a band of non-zero
    coefficients and zeros outside the band.
  • The simplest of the methods is the Thomas method,
    which is used for a tridiagonal matrix.

28
Advantages of Band Solvers
  • The method reduces the number of operations and
    stores the matrix in a smaller amount of memory.
  • The band solver is faster and is useful for
    large-scale matrices.

29
Thomas Method
  • The method takes advantage of the bandedness of
    the matrix.
  • The technique uses a two-phase process.
  • The first phase obtains the modified coefficients
    from the forward sweep.
  • The second phase solves for the x values by back
    substitution.

30
Thomas Method
  • The first phase starts with the first row of
    coefficients and scales the a and r coefficients.
  • The second phase solves for the x values using the
    a and r coefficients.

31
Thomas Method
  • The program for the method is given as
    demoThomas(a,d,b,r)
  • The algorithm is from the textbook, where a,d,b,
    r are vectors from the matrix.
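Below is a sketch of the Thomas algorithm for a tridiagonal system. The vector naming (sub-diagonal a, main diagonal d, super-diagonal b, right-hand side r) is an assumption chosen to mirror the call demoThomas(a,d,b,r); this is not the textbook's code, and the forward sweep here modifies the d and r vectors:

    import numpy as np

    def thomas_solve(a, d, b, r):
        """Solve a tridiagonal system (sketch of the Thomas algorithm).

        a : sub-diagonal (a[0] unused), d : main diagonal,
        b : super-diagonal (b[n-1] unused), r : right-hand side.
        """
        a = np.asarray(a, dtype=float)
        d = np.asarray(d, dtype=float).copy()
        b = np.asarray(b, dtype=float)
        r = np.asarray(r, dtype=float).copy()
        n = len(d)

        # Phase 1: forward sweep eliminates the sub-diagonal
        for i in range(1, n):
            m = a[i] / d[i - 1]
            d[i] -= m * b[i - 1]
            r[i] -= m * r[i - 1]

        # Phase 2: back substitution for the x values
        x = np.zeros(n)
        x[n - 1] = r[n - 1] / d[n - 1]
        for i in range(n - 2, -1, -1):
            x[i] = (r[i] - b[i] * x[i + 1]) / d[i]
        return x

    # Small test: 2x1 - x2 = 1, -x1 + 2x2 - x3 = 0, -x2 + 2x3 = 1
    print(thomas_solve(a=[0, -1, -1], d=[2, 2, 2], b=[-1, -1, 0], r=[1, 0, 1]))
    # expected: [1. 1. 1.]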

32
Summary
  • Scaling of the problem will help the accuracy of
    the solution.
  • The Gauss-Jordan method is more computationally
    intensive and does not improve the round-off
    errors. However, it is useful for finding matrix
    inverses.
  • Banded matrix solvers are faster and use less
    memory.

33
Homework
  • Check the Homework webpage