Introduction to Linear Algebra, 5th Edition (2016)

by Gilbert Strang (gilstrang@gmail.com)
ISBN: 978-0-9802327-7-6


Go to Introduction to Linear Algebra (6th Edition) website

Wellesley-Cambridge Press

Book Order from Wellesley-Cambridge Press
Book Order for SIAM members

Book Order from American Mathematical Society

Book Order from Cambridge University Press (outside North America)

Introduction to Linear Algebra, Indian edition, is available at Wellesley Publishers

Review of the 5th edition by Professor Farenick for the International Linear Algebra Society

Book review by insideBIGDATA (2016)

Related websites:
Linear Algebra for Everyone (new textbook, September 2020) SEE NOTE BELOW
Other books by Gilbert Strang
OpenCourseWare
Gilbert Strang's Home Page

I hope this website will become a valuable resource for everyone learning and doing linear algebra. Here are key links:

  • Table of Contents
  • Preface
  • Section 1.3 of the book: Matrices
  • Section 2.5 of the book: Inverse Matrices
  • Section 3.5 of the book: Dimensions of the Four Subspaces
  • Section 6.1 of the book: Introduction to Eigenvalues
  • Section 7.1 of the book: Image Processing by Linear Algebra
  • Section 12.1 of the book: Mean, Variance, and Probability
  • Matrix Factorizations
  • Index
  • 6 Great Theorems
  • The Transpose of a Derivative
  • Eigshow in MATLAB
  • 18.06 OpenCourseWare site with video lectures: 18.06 on OCW
  • The publisher's site for the textbook: www.wellesleycambridge.com

  • Solution Manual for the Textbook (updated November 2023)
  • Matrix World: The Picture of All Matrices, by Kenji Hiranabe
  • LU and CR Elimination (To appear in the Education Section of SIAM Review)

Each section in the Table of Contents links to problem sets, solutions, other websites, and all material related to the topic of that section. Readers are invited to propose possible links.

Table of Contents for Introduction to Linear Algebra (5th edition 2016)

  • 1 Introduction to Vectors
    • 1.1 Vectors and Linear Combinations
    • 1.2 Lengths and Dot Products
    • 1.3 Matrices
  • 2 Solving Linear Equations
    • 2.1 Vectors and Linear Equations
    • 2.2 The Idea of Elimination
    • 2.3 Elimination Using Matrices
    • 2.4 Rules for Matrix Operations
    • 2.5 Inverse Matrices
    • 2.6 Elimination = Factorization: A = LU
    • 2.7 Transposes and Permutations
  • 3 Vector Spaces and Subspaces
    • 3.1 Spaces of Vectors
    • 3.2 The Nullspace of A: Solving Ax = 0 and Rx = 0
    • 3.3 The Complete Solution to Ax = b
    • 3.4 Independence, Basis and Dimension
    • 3.5 Dimensions of the Four Subspaces
  • 4 Orthogonality
    • 4.1 Orthogonality of the Four Subspaces
    • 4.2 Projections
    • 4.3 Least Squares Approximations
    • 4.4 Orthonormal Bases and Gram-Schmidt
  • 5 Determinants
    • 5.1 The Properties of Determinants
    • 5.2 Permutations and Cofactors
    • 5.3 Cramer’s Rule, Inverses, and Volumes
  • 6 Eigenvalues and Eigenvectors
    • 6.1 Introduction to Eigenvalues
    • 6.2 Diagonalizing a Matrix
    • 6.3 Systems of Differential Equations
    • 6.4 Symmetric Matrices
    • 6.5 Positive Definite Matrices
  • 7 The Singular Value Decomposition (SVD)
    • 7.1 Image Processing by Linear Algebra
    • 7.2 Bases and Matrices in the SVD
    • 7.3 Principal Component Analysis (PCA by the SVD)
    • 7.4 The Geometry of the SVD
  • 8 Linear Transformations
    • 8.1 The Idea of a Linear Transformation
    • 8.2 The Matrix of a Linear Transformation
    • 8.3 The Search for a Good Basis
  • 9 Complex Vectors and Matrices
    • 9.1 Complex Numbers
    • 9.2 Hermitian and Unitary Matrices
    • 9.3 The Fast Fourier Transform
  • 10 Applications
    • 10.1 Graphs and Networks
    • 10.2 Matrices in Engineering
    • 10.3 Markov Matrices, Population, and Economics
    • 10.4 Linear Programming
    • 10.5 Fourier Series: Linear Algebra for Functions
    • 10.6 Computer Graphics
    • 10.7 Linear Algebra for Cryptography
  • 11 Numerical Linear Algebra
    • 11.1 Gaussian Elimination in Practice
    • 11.2 Norms and Condition Numbers
    • 11.3 Iterative Methods and Preconditioners
  • 12 Linear Algebra in Probability & Statistics
    • 12.1 Mean, Variance, and Probability
    • 12.2 Covariance Matrices and Joint Probabilities
    • 12.3 Multivariate Gaussian and Weighted Least Squares
  • Matrix Factorizations
  • Index
  • Six Great Theorems / Linear Algebra in a Nutshell


Each section of the book has a Problem Set.

In the following videos, click the 'Play' ► icon. While a video is playing, click the word 'YouTube' to watch a larger version in a separate tab.

Linear transformations of a house

Eigenvalues don't quite meet

Download Selected Solutions (small differences from the solutions above)

Practice Exam Questions

Links to websites for each semester at MIT: web.mit.edu/18.06

  • Exam 1 (1997-2009)
  • Exam 1 (2010-2015)
  • Exam 2 (1997-2009)
  • Exam 2 (2010-2015)
  • Exam 3 (1997-2009)
  • Exam 3 (2010-2015)
  • Final (1998-2009)
  • Final (2010-2015)

Linear Algebra Problems in Lemma

My friend Pavel Grinfeld at Drexel has sent me a collection of interesting problems -- mostly elementary, but each one with a small twist. These are part of his larger teaching site called LEM.MA, and he built the page http://lem.ma/LAProb/ especially for this website, linked to the 5th edition.

The H.264 Video Standard (promised in Section 7.1 of the book)

This video standard describes a system for encoding and decoding (a "codec") that engineers have defined for applications like High Definition TV. You are not expected to know the meaning of every word -- your book author does not know either. The point is to see an important example of a "standard" that is created by an industry after years of development -- so that all companies will know what coding system their products must be consistent with.

The words "motion compensation" refer to a way to estimate each video image from the previous one. The simplest would be to guess that successive video images are the same. Then we would only need the changes between frames -- hopefully small. But if the camera is following the action, the whole scene will shift slightly and need correction. A better idea is to see which way the scene is moving and build that change into the next scene. This is MOTION COMPENSATION. In fact the motion is allowed to be different on different parts of the screen.

It is ideas like this -- easy to talk about but taking years of effort to perfect -- that make video technology and other technologies possible and successful. Engineers do their job. I hope these links give an idea of the detail needed.

Notes on Linear Algebra


Proof of Schur's Theorem

Singular Value Decomposition of Real Matrices (Prof. Jugal Verma, IIT Bombay, March 2020)

Our recent textbook Linear Algebra for Everyone starts with the idea of independent columns. This leads to a factorization A = CR, where C contains those independent columns from A. The matrix R tells how to combine those columns of C to produce all columns of A. Then Section 3.2 explains how to solve Rx = 0. This gives the nullspace of A!

Here is that new section: A = CR and Computing the Nullspace by Elimination
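As a small illustration (my sketch, not code from the book or from the new section), the factorization A = CR can be computed with SymPy: the pivot columns of A go into C, and the nonzero rows of rref(A) form R. The 3-by-3 example matrix below is my own choice.

```python
import sympy as sp

# A has rank 2: its third column is 2*(column 1) + 2*(column 2).
A = sp.Matrix([[1, 3, 8],
               [1, 2, 6],
               [0, 1, 2]])

R_full, pivot_cols = A.rref()      # rref(A) and the pivot column indices
r = len(pivot_cols)                # rank of A

C = sp.Matrix.hstack(*[A.col(j) for j in pivot_cols])   # independent columns of A
R = R_full[:r, :]                  # nonzero rows of rref(A)

assert C * R == A                  # the factorization A = CR

# Solving Rx = 0 gives the nullspace of A: here it is spanned by (-2, -2, 1).
print(A.nullspace())
```

The special solution (-2, -2, 1) records exactly how the dependent third column is built from the two independent columns.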


Other books by Gilbert Strang

  • Linear Algebra for Everyone (new textbook, September 2020)

  • Linear Algebra and Learning from Data

  • Differential Equations and Linear Algebra

  • Computational Science and Engineering

  • Calculus

