In this lesson on Projection Matrices and Least Squares, the professor reviews the formula for a projection matrix and how it projects a vector onto the nearest point in the column space. The lesson then explains the idea of least squares and how it can be used to compute the best straight line through data by minimizing the sum of the squares of the errors. The professor also emphasizes the importance of the normal equation, A transpose A x-hat = A transpose b, in statistics and estimation, computing the matrix A transpose A for a concrete example. Overall, this lesson provides a good understanding of projection matrices and least squares, and of their application to finding the best-fit line.
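The least-squares procedure described above can be sketched numerically. The short NumPy program below is an illustrative sketch, not part of the lecture itself: it fits a line b = C + D t to three sample data points (chosen here for illustration) by solving the normal equation A transpose A x-hat = A transpose b, and then checks that the projection matrix P = A (A^T A)^(-1) A^T satisfies P^2 = P.

```python
import numpy as np

# Illustrative data points (t_i, b_i); any three non-collinear points work
t = np.array([1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 2.0])

# Model b ~ C + D*t, so A has a column of ones and a column of t-values
A = np.column_stack([np.ones_like(t), t])

# Normal equation: (A^T A) x_hat = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
C, D = x_hat

# Projection matrix onto the column space of A
P = A @ np.linalg.inv(A.T @ A) @ A.T

# A projection matrix is idempotent: applying it twice changes nothing
print("best line: b =", C, "+", D, "* t")
print("P^2 == P:", np.allclose(P @ P, P))
```

Solving the 2-by-2 normal equation directly mirrors the hand computation in the lecture; in practice one would typically call a library least-squares routine (such as `np.linalg.lstsq`) instead of forming A transpose A explicitly.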
Projection Matrices and Least Squares -- Lecture 16. Review of projection matrices and orthogonality, and a discussion of the least squares problem and the computation of the best straight line.
Gilbert Strang, 18.06 Linear Algebra, Spring 2005. (Massachusetts Institute of Technology: MIT OpenCourseWare), http://ocw.mit.edu (Accessed November 22, 2008). License: Creative Commons BY-NC-SA.
More info at: http://ocw.mit.edu/terms