# Linear Algebra (Lamar University)

**Content URL:** Link To Content

### About *Linear Algebra (Lamar University)*

Here is a listing of all the material that is currently available online.

## Systems of Equations and Matrices

**Systems of Equations** In this section we’ll introduce most of the basic topics that we’ll need in order to solve systems of equations, including augmented matrices and row operations.

**Solving Systems of Equations** Here we will look at the Gaussian Elimination and Gauss-Jordan methods of solving systems of equations.
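
The two stages of Gaussian elimination can be sketched in a few lines of Python. This is an illustrative implementation, not code from the notes: it reduces the augmented matrix to triangular form with partial pivoting, then back-substitutes.

```python
def solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    # Form the augmented matrix [A | b] so row operations act on both sides.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting: bring up the row with the largest entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate everything below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back substitution on the now upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x
```

For example, `solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 4.0])` solves 2x + y = 3, x + 3y = 4. Carrying the elimination all the way to reduced row-echelon form instead gives the Gauss-Jordan method.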

**Matrices** We will introduce many of the basic ideas and properties involved in the study of matrices.

**Matrix Arithmetic & Operations** In this section we’ll take a look at matrix addition, subtraction and multiplication. We’ll also take a quick look at the transpose and trace of a matrix.
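
As a quick illustration of these operations (a pure-Python sketch, not the notes’ own examples):

```python
def mat_mul(A, B):
    """Matrix product: entry (i, j) is the dot product of row i of A with column j of B."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

def transpose(A):
    """The transpose swaps rows and columns."""
    return [list(row) for row in zip(*A)]

def trace(A):
    """The trace is the sum of the diagonal entries of a square matrix."""
    return sum(A[i][i] for i in range(len(A)))
```

For instance, `mat_mul([[1, 2], [3, 4]], [[5, 6], [7, 8]])` gives `[[19, 22], [43, 50]]`, while addition and subtraction simply combine matrices entry by entry.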

**Properties of Matrix Arithmetic** We will take a more in-depth look at many of the properties of matrix arithmetic and the transpose.

**Inverse Matrices and Elementary Matrices** Here we’ll define the inverse and take a look at some of its properties. We’ll also introduce the idea of elementary matrices.

**Finding Inverse Matrices** In this section we’ll develop a method for finding inverse matrices.

**Special Matrices** We will introduce diagonal, triangular and symmetric matrices in this section.

**LU-Decompositions** In this section we’ll introduce the LU-Decomposition, a way of “factoring” certain kinds of matrices.
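
A small sketch of the Doolittle variant of the LU-Decomposition (illustrative only; it assumes no zero pivot arises, which is where the “certain kinds of matrices” caveat shows up):

```python
def lu_decompose(A):
    """Factor A = LU with L unit lower triangular and U upper triangular.
    Doolittle method, no pivoting: assumes every pivot U[i][i] is nonzero."""
    n = len(A)
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        # Row i of U, using the rows of L and U already computed.
        for j in range(i, n):
            U[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        # Column i of L below the diagonal.
        for r in range(i + 1, n):
            L[r][i] = (A[r][i] - sum(L[r][k] * U[k][i] for k in range(i))) / U[i][i]
    return L, U
```

Once A = LU is known, Ax = b splits into two easy triangular solves, Ly = b and then Ux = y, which is how the factorization helps with systems later on.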

**Systems Revisited** Here we will revisit solving systems of equations. We will take a look at how inverse matrices and LU-Decompositions can help with the solution process. We’ll also take a look at a couple of other ideas in the solution of systems of equations.

## Determinants

**The Determinant Function** We will give the formal definition of the determinant in this section. We’ll also give formulas for computing determinants of 2×2 and 3×3 matrices.

**Properties of Determinants** Here we will take a look at quite a few properties of the determinant function. Included are formulas for determinants of triangular matrices.

**The Method of Cofactors** In this section we’ll take a look at the first of two methods for computing determinants of general matrices.
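
A recursive sketch of cofactor expansion along the first row (illustrative; practical only for small matrices, since the cost grows factorially with the size):

```python
def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j; the sign alternates across the row.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total
```

For example, `det([[1, 2], [3, 4]])` returns `-2`, matching the 2×2 formula ad - bc.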

**Using Row Reduction to Find Determinants** Here we will take a look at the second method for computing determinants in general.

**Cramer’s Rule** We will take a look at yet another method for solving systems. This method will involve the use of determinants.
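
For a 2×2 system, Cramer’s Rule reads each unknown off as a ratio of determinants. A hedged sketch of just that case (the general rule replaces one column of the coefficient matrix at a time):

```python
def cramer_2x2(a, b, c, d, e, f):
    """Solve  a x + b y = e,  c x + d y = f  by Cramer's Rule:
    each unknown is the determinant of the coefficient matrix with one
    column replaced by (e, f), divided by D = ad - bc."""
    D = a * d - b * c
    if D == 0:
        raise ValueError("zero coefficient determinant: Cramer's Rule does not apply")
    return (e * d - b * f) / D, (a * f - e * c) / D
```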

## Euclidean n-Space

**Vectors** In this section we’ll introduce vectors in 2-space and 3-space as well as some of the important ideas about them.

**Dot Product & Cross Product** Here we’ll look at the dot product and the cross product, two important products for vectors. We’ll also take a look at an application of the dot product.
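
Both products, plus the angle application of the dot product, fit in a short sketch (illustrative Python, not the notes’ code):

```python
import math

def dot(u, v):
    """Dot product of two vectors of the same length."""
    return sum(a * b for a, b in zip(u, v))

def cross(u, v):
    """Cross product of two 3-space vectors; the result is perpendicular to both."""
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def angle_between(u, v):
    """Application of the dot product: u . v = |u| |v| cos(theta)."""
    return math.acos(dot(u, v) / math.sqrt(dot(u, u) * dot(v, v)))
```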

**Euclidean n-Space** We’ll introduce the idea of Euclidean n-space in this section and extend many of the ideas of the previous two sections.

**Linear Transformations** In this section we’ll introduce the topic of linear transformations and look at many of their properties.
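
A standard first example of a linear transformation is rotation of the plane, which is just multiplication by a matrix (a sketch under that standard convention):

```python
import math

def rotation_matrix(theta):
    """Matrix of the transformation rotating 2-space counterclockwise by theta."""
    return [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta),  math.cos(theta)]]

def apply(T, x):
    """Apply the matrix transformation T to the vector x."""
    return [sum(T[i][j] * x[j] for j in range(len(x))) for i in range(len(T))]
```

Linearity means T(u + v) = T(u) + T(v) and T(cu) = cT(u), exactly the properties the section develops.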

**Examples of Linear Transformations** We’ll take a look at quite a few examples of linear transformations in this section.

## Vector Spaces

**Vector Spaces** In this section we’ll formally define vectors and vector spaces.

**Subspaces** Here we will be looking at vector spaces that live inside of other vector spaces.

**Span** The concept of the span of a set of vectors will be investigated in this section.

**Linear Independence** Here we will take a look at what it means for a set of vectors to be linearly independent or linearly dependent.

**Basis and Dimension** We’ll be looking at the idea of a set of basis vectors and the dimension of a vector space.

**Change of Basis** In this section we will see how to change the set of basis vectors for a vector space.

**Fundamental Subspaces** Here we will take a look at some of the fundamental subspaces of a matrix, including the row space, column space and null space.

**Inner Product Spaces** We will be looking at a special kind of vector space in this section and will also define the inner product.

**Orthonormal Basis** In this section we will develop and use the Gram-Schmidt process for constructing an orthogonal/orthonormal basis for an inner product space.
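
The process itself is short enough to sketch (illustrative Python using the standard dot product; it assumes the input vectors are linearly independent):

```python
import math

def gram_schmidt(vectors):
    """Gram-Schmidt: subtract from each vector its projections onto the
    unit vectors already built, then normalize what is left."""
    basis = []
    for v in vectors:
        w = v[:]
        for q in basis:
            coeff = sum(a * b for a, b in zip(w, q))  # <w, q>, with q unit length
            w = [a - coeff * b for a, b in zip(w, q)]
        norm = math.sqrt(sum(a * a for a in w))
        basis.append([a / norm for a in w])
    return basis
```

Skipping the final normalization step leaves an orthogonal (rather than orthonormal) basis.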

**Least Squares** In this section we’ll take a look at an application of some of the ideas that we will be discussing in this chapter.
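
A typical least squares application is fitting a line y = c0 + c1 t to data with more points than unknowns; the normal equations (A^T A)c = A^T y reduce the 2-unknown case to a 2×2 system (a hedged sketch, not the notes’ worked example):

```python
def fit_line(ts, ys):
    """Least squares fit of y = c0 + c1*t via the normal equations,
    where A has rows [1, t]. Solves the resulting 2x2 system directly."""
    n = len(ts)
    s_t = sum(ts)
    s_tt = sum(t * t for t in ts)
    s_y = sum(ys)
    s_ty = sum(t * y for t, y in zip(ts, ys))
    # Normal equations: [[n, s_t], [s_t, s_tt]] times (c0, c1) = (s_y, s_ty)
    D = n * s_tt - s_t * s_t
    return (s_y * s_tt - s_t * s_ty) / D, (n * s_ty - s_t * s_y) / D
```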

**QR-Decomposition** Here we will take a look at the QR-Decomposition for a matrix and how it can be used in the least squares process.

**Orthogonal Matrices** We will take a look at a special kind of matrix, the orthogonal matrix, in this section.

## Eigenvalues and Eigenvectors

**Review of Determinants** In this section we’ll do a quick review of determinants.

**Eigenvalues and Eigenvectors** This is the main section of the chapter. Here we will look at the concepts of eigenvalues and eigenvectors.

**Diagonalization** We’ll be looking at diagonalizable matrices in this section.
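
For a 2×2 matrix the eigenvalues come straight from the characteristic polynomial t^2 - (trace)t + det = 0 (a sketch assuming real eigenvalues):

```python
import math

def eig_2x2(a, b, c, d):
    """Real eigenvalues of [[a, b], [c, d]] from the characteristic polynomial
    t^2 - (a + d)t + (ad - bc) = 0; assumes the discriminant is nonnegative."""
    tr = a + d
    det = a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2
```

When the matrix is diagonalizable, these eigenvalues are exactly the diagonal entries of D in a factorization A = PDP^-1, with the corresponding eigenvectors as the columns of P.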
