Linear Algebra and Matrix Math

forbitals:
Linear algebra and its applications / Gilbert Strang (1976 first edition)

I hope to write a lot about this, but for now I want to start recording the references.  These references are of great interest to me.  I like the old textbooks, and the math books are always good, no matter how old.

ABSTRACT LINEAR ALGEBRA

F. R. Gantmacher, "The Theory of Matrices", Chelsea, New York, 1959

The theory of matrices / by F.R. Gantmacher ; [translation by K.A. Hirsch] (2 volumes)

P. R. Halmos, "Finite-Dimensional Vector Spaces", Van Nostrand Reinhold, Princeton, 1958

Finite-dimensional vector spaces, Paul Richard Halmos

K. Hoffman and R. Kunze, "Linear Algebra" 1971
Linear algebra, by Kenneth Hoffman and Ray Kunze.

T. Muir, "Determinants" Dover, 1960, 4 volumes, originally 1923

https://www.amazon.com/Treatise-Theory-Determinants-Thomas-Muir/dp/1245077570

The theory of determinants in the historical order of development / by Sir Thomas Muir (yes, 4 volumes, from 1923)

forbitals:
Gilbert Strang, Linear Algebra and Its Applications

more references:
APPLIED LINEAR ALGEBRA

B. Noble, "Applied Linear Algebra" 1969


NUMERICAL LINEAR ALGEBRA

G. Forsythe and C. Moler, "Computer Solution of Linear Algebraic Systems" 1967

C. L. Lawson and R. J. Hanson, "Solving Least Squares Problems" 1974

G. W. Stewart, "Introduction to Matrix Computations", 1973

R. S. Varga, "Matrix Iterative Analysis", 1962

J. H. Wilkinson, "Rounding Errors in Algebraic Processes", 1963

J. H. Wilkinson, "The Algebraic Eigenvalue Problem", 1965

J. H. Wilkinson and C. Reinsch, eds., "Handbook for Automatic Computation II, Linear Algebra", Springer, 1971

D. M. Young, "Iterative Solution of Large Linear Systems", 1971




Gilbert Strang
Linear Algebra and Its Applications
1976, 1st edition
Strang was at MIT

Before I forget, let me say that Strang does talk some about Regression Analysis, Factor Analysis, and Principal Component Analysis.

So he starts out explaining that linear algebra is primarily about simultaneous equations and Gaussian elimination.  The second idea will be determinants and Cramer's rule.

He will show Gaussian Elimination and talk about zero pivots and when you have a singular matrix.  The book gets into LU factorization, which results from Gaussian Elimination.  And then you use back substitution.

Gaussian Elimination tends to take on the order of n^3 operations (about n^3/3 multiplications); the back substitution step is only about n^2/2.

So he talks about Matrix Multiplication.

So you will be doing Gaussian Elimination, and you will be logging each elimination step in an Elementary Matrix, E.

So you premultiply both sides of Ax = b by E, and since matrix multiplication is associative, the solution is unchanged.

So you get this upper triangular matrix, and these various E matrices which log what was done to get it, because you will want to back substitute for your solution.
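To make the elimination-then-back-substitution idea concrete for myself, here is a minimal NumPy sketch.  The 3x3 system is my own toy example, not one taken from the book, and there is no pivoting, so it assumes the pivots never come out zero:

import numpy as np

def solve_by_elimination(A, b):
    """Forward eliminate to an upper triangular system, then back substitute.
    No pivoting in this sketch -- it assumes every pivot is nonzero."""
    U = A.astype(float)
    c = b.astype(float)
    n = len(c)
    for k in range(n - 1):                 # eliminate below pivot U[k, k]
        for i in range(k + 1, n):
            m = U[i, k] / U[k, k]          # the multiplier l_ik
            U[i, k:] -= m * U[k, k:]
            c[i] -= m * c[k]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):         # back substitution, bottom row up
        x[i] = (c[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
    return x

# made-up example system; solution is (1, 1, 2)
A = np.array([[2.0, 1.0, 1.0],
              [4.0, -6.0, 0.0],
              [-2.0, 7.0, 2.0]])
b = np.array([5.0, -2.0, 9.0])
print(solve_by_elimination(A, b))          # matches np.linalg.solve(A, b)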

Usually you will want to do row exchanges in order to get bigger pivots, to avoid a zero pivot but also to minimize round-off errors.  The rule is to take the candidate pivot with the biggest absolute value (partial pivoting).

And so a P matrix, a Permutation Matrix, is introduced to log the row exchanges.  You also premultiply both sides by this.
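The book of course predates this software, but SciPy has a routine that produces the P, L, U factors with partial pivoting, which is a quick way to look at them.  The 3x3 matrix here is made up so that a row exchange is forced:

import numpy as np
from scipy.linalg import lu

# A matrix whose first pivot would be zero without a row exchange.
A = np.array([[0.0, 2.0, 1.0],
              [1.0, 1.0, 3.0],
              [4.0, 1.0, 2.0]])

P, L, U = lu(A)            # scipy's convention is A = P @ L @ U,
print(P)                   # so P.T @ A = L @ U (premultiplying by a permutation)
print(U)                   # pivots are the largest remaining |entry| in each column
print(np.allclose(A, P @ L @ U))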

So you can also find the inverse of the original A matrix.  (The other way of doing this, via the determinant and the adjugate matrix, is extremely slow.)

So you use the Gauss-Jordan method:

https://en.wikipedia.org/wiki/Gaussian_elimination
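Here is a little sketch of the Gauss-Jordan idea as I understand it: augment A with the identity, reduce until the left half is the identity, and the right half is then the inverse.  The 2x2 example is my own:

import numpy as np

def gauss_jordan_inverse(A):
    """Reduce the augmented block [A | I] until the left half is I;
    the right half is then A^-1.  Partial pivoting on the largest |pivot|."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for k in range(n):
        p = k + np.argmax(np.abs(M[k:, k]))   # row with the largest pivot
        M[[k, p]] = M[[p, k]]                 # row exchange
        M[k] /= M[k, k]                       # scale so the pivot is 1
        for i in range(n):
            if i != k:
                M[i] -= M[i, k] * M[k]        # clear the rest of column k
    return M[:, n:]

A = np.array([[2.0, 1.0], [5.0, 3.0]])        # made-up example
print(gauss_jordan_inverse(A))                # [[ 3. -1.], [-5.  2.]]
print(np.linalg.inv(A))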

And then he talks about Band Matrices, which are matrices where the only non-zero elements are close to the diagonal.  I guess this is a particular form of sparse matrix.

Good links on this:
Sparse Matrix
https://en.wikipedia.org/wiki/Sparse_matrix

Band Matrix
https://en.wikipedia.org/wiki/Band_matrix

He looks at a differential equation with two-point boundary conditions.  A grid spacing, h, is introduced.  This makes the problem discrete, and how small h is determines the number of equations and the number of unknowns, and of course this results in a band matrix.
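To see the band structure appear, here is a sketch of the standard model problem of this kind, -u'' = f on [0, 1] with u(0) = u(1) = 0.  I am assuming that particular equation just for illustration; the point is the tridiagonal matrix with 2 on the diagonal and -1 beside it:

import numpy as np

# Discretize -u''(x) = f(x) on [0, 1] with u(0) = u(1) = 0, spacing h = 1/(n+1).
n = 5
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)               # interior grid points

A = (np.diag(2 * np.ones(n)) +
     np.diag(-np.ones(n - 1), 1) +
     np.diag(-np.ones(n - 1), -1)) / h**2  # tridiagonal band matrix

f = np.ones(n)                             # take f(x) = 1 as a test case
u = np.linalg.solve(A, f)

print(u)
print(x * (1 - x) / 2)   # exact solution of -u'' = 1; the discrete answer matches
                         # at the grid points because centered differences are
                         # exact for quadratics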

So he goes into a more formal theory of simultaneous linear equations, gets into Vector Spaces and Subspaces, and gives graphical representations.

And so we are talking about the row space of A, the nullspace of A, the column space of A, and the left nullspace of A.

Talks about Orthogonality of Vectors and Subspaces.

Fundamental Theorem of Linear Algebra, Part 1 and Part 2.
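A quick numerical sanity check of the dimension counts and the orthogonality, on a made-up rank-2 matrix (my example, not the book's):

import numpy as np

# A 3 x 4 matrix of rank r = 2 (the third row is the sum of the first two).
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])
m, n = A.shape
r = np.linalg.matrix_rank(A)

print("rank r              =", r)       # dim(row space) = dim(column space) = r
print("dim(nullspace)      =", n - r)   # Part 1: the nullspace has dimension n - r
print("dim(left nullspace) =", m - r)   # ... and the left nullspace has dimension m - r

# Part 2: the nullspace is orthogonal to the row space.  The SVD gives
# orthonormal bases for both.
U, s, Vt = np.linalg.svd(A)
row_basis  = Vt[:r]                      # first r rows of V^T span the row space
null_basis = Vt[r:]                      # remaining rows span the nullspace
print(np.allclose(row_basis @ null_basis.T, 0))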

So, talks about Orthogonal Projections and Least Squares, and starts off talking about Inner Products and Transposes, and the Schwarz Inequality

Projections onto Subspaces and Least Squares

Least Squares solution satisfies "normal equations".

Projection Matrices, P

Least Squares Fitting of Data
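Here is a small sketch of fitting a straight line by least squares (the data points are made up): solve the normal equations (A^T A)x = A^T b directly, compare with the library least-squares routine, and also form the projection matrix P = A(A^T A)^-1 A^T mentioned above:

import numpy as np

# Fit y = c + d*t to a few made-up data points by least squares.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.1, 2.9, 4.2])

A = np.column_stack([np.ones_like(t), t])   # columns: [1, t]

# Normal equations: (A^T A) x = A^T y
x_normal = np.linalg.solve(A.T @ A, A.T @ y)

# Same answer from the library least-squares routine
x_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)

print(x_normal)       # [c, d]
print(x_lstsq)

# The projection of y onto the column space of A: p = A (A^T A)^-1 A^T y
P = A @ np.linalg.inv(A.T @ A) @ A.T
print(P @ y)          # equals A @ x_normal, the closest point in the column space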

Orthogonal Bases, Orthogonal Matrices, and Gram-Schmidt Orthogonalization
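A sketch of classical Gram-Schmidt on the columns of a small made-up matrix (it assumes the columns are independent):

import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A (assumed independent), one at a time:
    subtract off the components along the earlier q's, then normalize."""
    Q = np.zeros_like(A, dtype=float)
    for j in range(A.shape[1]):
        v = A[:, j].astype(float)
        for i in range(j):
            v -= (Q[:, i] @ A[:, j]) * Q[:, i]   # remove the projection onto q_i
        Q[:, j] = v / np.linalg.norm(v)
    return Q

A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(3)))   # the columns come out orthonormal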

Hilbert Space
Fourier Series
Legendre Polynomials

Pseudoinverse and the Singular Value Decomposition
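A sketch of building the pseudoinverse from the SVD by inverting only the nonzero singular values; the rank-deficient matrix below is made up:

import numpy as np

# Build the pseudoinverse A+ from the SVD A = U S V^T: invert the nonzero
# singular values, leave the zero ones alone, and transpose the pattern.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])        # rank 1: the second column is twice the first

U, s, Vt = np.linalg.svd(A, full_matrices=False)
s_inv = np.where(s > 1e-12 * s.max(), 1.0 / s, 0.0)   # zero singular values stay zero
A_pinv = Vt.T @ np.diag(s_inv) @ U.T

print(np.allclose(A_pinv, np.linalg.pinv(A, rcond=1e-12)))  # matches the library routine
print(np.allclose(A @ A_pinv @ A, A))                       # one of the Penrose conditions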

Weighted Least Squares
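Weighted least squares just replaces A^T A x = A^T b by A^T W A x = A^T W b for a diagonal matrix of weights W.  A quick sketch with made-up data and weights:

import numpy as np

# Weighted least squares: minimize (b - Ax)^T W (b - Ax) for a diagonal W.
# The normal equations become  A^T W A x = A^T W b.
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.0, 2.1, 2.9, 4.2])
A = np.column_stack([np.ones_like(t), t])

W = np.diag([1.0, 1.0, 1.0, 10.0])   # weight the last measurement ten times as heavily
x_w = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)
print(x_w)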

Now there is a big change in the book as he shifts to the discussion of Determinants.

pg 146:

"The determinant provides and explicit "formula," a concise and definite expression in closed form, for quantities such as A^-1"

gives test for invertibility

gives volume of parallelepiped
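For example, the absolute value of a 3x3 determinant whose rows are three edge vectors is the volume of the parallelepiped they span (the vectors are made up):

import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([1.0, 2.0, 0.0])
w = np.array([0.0, 1.0, 3.0])

vol = abs(np.linalg.det(np.array([u, v, w])))   # |det| of the matrix of edge vectors
print(vol)                                      # 6.0
print(abs(np.dot(u, np.cross(v, w))))           # same thing via the triple product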

Jacobian Determinant
https://en.wikipedia.org/wiki/Jacobian_matrix_and_determinant

Finding determinants straight from the definition (the full expansion in cofactors) needs on the order of n! computations!
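Here is a tiny recursive cofactor-expansion determinant to see why: each n x n determinant calls n determinants of size n-1, so the work grows like n!  Compare with the elimination-based library routine:

import numpy as np

def det_by_cofactors(A):
    """Determinant by expansion in cofactors along the first row.
    Clear, but it does on the order of n! work -- hopeless beyond small n."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)
        total += (-1) ** j * A[0, j] * det_by_cofactors(minor)
    return total

A = np.random.default_rng(0).standard_normal((6, 6))   # random test matrix
print(det_by_cofactors(A))
print(np.linalg.det(A))     # elimination-based, only about n^3 work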

Cramer's Rule
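A sketch of Cramer's rule on a made-up 2x2 system: each unknown is a ratio of determinants, with b replacing one column of A:

import numpy as np

def cramer_solve(A, b):
    """Cramer's rule: x_j = det(A with column j replaced by b) / det(A)."""
    d = np.linalg.det(A)
    x = np.empty(len(b))
    for j in range(len(b)):
        Aj = A.copy()
        Aj[:, j] = b            # replace column j by the right-hand side
        x[j] = np.linalg.det(Aj) / d
    return x

A = np.array([[2.0, 1.0], [5.0, 3.0]])
b = np.array([3.0, 8.0])
print(cramer_solve(A, b))        # [1., 1.]
print(np.linalg.solve(A, b))     # elimination gives the same answer far more cheaply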

Expansion in cofactors, and finding inverse from adjugate matrix

Then the book makes a big shift to Eigenvalues and Eigenvectors.
