Projection Matrices and Least Squares

Taught by OCW
Rated 4.0/5 stars (1 rating) | 4362 views
Lesson Summary:

In this lesson on Projection Matrices and Least Squares, the professor reviews the formula for a projection matrix and shows how it projects a vector onto the nearest point in the column space. The lesson then turns to least squares: fitting the best straight line to data by minimizing the sum of the squares of the errors. The professor emphasizes the central role in statistics and estimation of the equation A transpose A x-hat equals A transpose b, and works through computing the matrix A transpose A for a concrete data set. Overall, this lesson provides a good understanding of projection matrices and least squares, and of their application to finding the best fit line.
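The projection formula the summary refers to can be sketched numerically. This is an illustrative example, not from the lecture itself: a small matrix A with independent columns, the projection matrix P = A(AᵀA)⁻¹Aᵀ, and a check that the error b − Pb is orthogonal to the column space.

```python
import numpy as np

# Illustrative matrix A with two independent columns (values chosen for the sketch).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])

# Projection matrix onto the column space: P = A (A^T A)^{-1} A^T
P = A @ np.linalg.inv(A.T @ A) @ A.T

b = np.array([6.0, 0.0, 0.0])
p = P @ b          # projection of b: the nearest point in the column space
e = b - p          # error vector

# The error is orthogonal to every column of A:
print(A.T @ e)     # ~ [0, 0]

# Projecting twice changes nothing: P is idempotent (P @ P == P).
print(np.allclose(P @ P, P))
```

Orthogonality of the error to the column space is exactly what makes Pb the nearest point, and it is where the normal equations in the least-squares discussion come from.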

Lesson Description:

Projection Matrices and Least Squares -- Lecture 16. Review of projection matrices, orthogonality, and a discussion of least squares matrices and computation of best straight line.

Gilbert Strang, 18.06 Linear Algebra, Spring 2005. (Massachusetts Institute of Technology: MIT OpenCourseWare), http://ocw.mit.edu (Accessed November 22, 2008). License: Creative Commons BY-NC-SA.
More info at: http://ocw.mit.edu/terms

Additional Resources:
Questions answered by this video:
  • How do you find a least squares matrix?
  • What is a best fit line?
  • What is a best straight line?
  • How do you find the best fit line for a data set?
  • What is x hat?
Staff Review:
This video goes deeper into projections and orthogonal subspaces as a lead-in to building the least-squares matrix for a data set. By minimizing the error of a straight line through the data points, it shows how linear regression produces the best fit line. A great explanation of what is going on, why it works, and how to find the best straight line.
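The best-fit-line computation described above can be sketched with the normal equations AᵀA x̂ = Aᵀb. The three data points below are an assumption for illustration (a standard small example for this topic), not necessarily the ones used in the video.

```python
import numpy as np

# Illustrative data points (t, b): (1,1), (2,2), (3,2).
t = np.array([1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 2.0])

# Model the line as b = C + D*t, so A has a column of ones and a column of t.
A = np.column_stack([np.ones_like(t), t])

# Normal equations: A^T A x_hat = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
C, D = x_hat
print(C, D)   # best line: b = 2/3 + t/2
```

Solving the normal equations is the same as projecting b onto the column space of A; in practice one would usually call a dedicated routine such as `np.linalg.lstsq(A, b)` rather than forming AᵀA explicitly.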