
Linear Algebra: Orthogonal Complements, Part II
Lecture Description
Previously we introduced the notion of orthogonal complements in a vector space; now we discuss some applications and intriguing facts about the spaces associated with a matrix and how they are complements of one another. The fact that certain spaces are orthogonal to each other paves the way to many other mathematical phenomena that we won't get into in this series. For now, all you need to know is what orthogonal complements are and the methods used to check for orthogonality.
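As a minimal sketch of the orthogonality check the description refers to (the vectors here are hypothetical examples, not from the lecture): two vectors are orthogonal exactly when their dot product is zero.

```python
import numpy as np

# Two vectors are orthogonal when their dot product is zero.
u = np.array([1.0, 2.0, -1.0])
v = np.array([3.0, -1.0, 1.0])

# 1*3 + 2*(-1) + (-1)*1 = 0, so u is orthogonal to v
print(np.dot(u, v))  # 0.0
```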
2 answers
Fri Dec 13, 2019 9:18 AM
Post by Sungmin Lee on December 9, 2019
Hi, Professor Raff,
Your lectures are always helpful to me. I'm taking this course as preparation for college (please excuse my English; I'm working on it). I have some ideas I'd like checked.
First, for an m by n matrix A, it was obvious to me that (rank of A) + (nullity of A) = n, because the rank of A equals the number of leading entries in the matrix obtained by transforming A to reduced row-echelon form (let's call that matrix B), and the nullity equals the number of arbitrary constants (free variables) in the solution of Ax = 0.
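The rank-nullity relation in the comment can be checked numerically; a small sketch with a hypothetical 3x3 matrix, taking the rank from NumPy and the nullity from the count of numerically zero singular values:

```python
import numpy as np

# Rank-nullity check for an m x n matrix A: rank(A) + nullity(A) = n.
# Hypothetical example: the second row is twice the first, so rank is 2.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])

rank = np.linalg.matrix_rank(A)

# Nullity = n - (number of nonzero singular values), via the SVD.
n = A.shape[1]
s = np.linalg.svd(A, compute_uv=False)
nullity = n - int(np.sum(s > 1e-10))

print(rank, nullity, rank + nullity)  # 2 1 3
```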
Second, I think the transition matrix P(S←T) = S^(-1)·T. The reasoning: first turn a coordinate vector with respect to T into standard coordinates (with respect to I_n) by multiplying by T, and then turn that vector into coordinates with respect to S by multiplying by S^(-1). (Here S and T are the matrices whose columns are the basis vectors of the bases S and T, respectively.)
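The commenter's formula can be verified on a small example. A sketch with hypothetical bases of R^2: if the columns of S and T are the basis vectors, then v = T·[v]_T and [v]_S = S^(-1)·v, so P(S←T) = S^(-1)·T maps T-coordinates to S-coordinates.

```python
import numpy as np

# Columns of S and T are the basis vectors of two (hypothetical) bases of R^2.
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])
T = np.array([[2.0, 0.0],
              [1.0, 1.0]])

# Transition matrix from T-coordinates to S-coordinates:
P = np.linalg.inv(S) @ T

# Check on a sample coordinate vector: v = T @ [v]_T, and [v]_S = S^{-1} v.
v_T = np.array([3.0, -2.0])
v = T @ v_T                   # the actual vector in standard coordinates
v_S = np.linalg.solve(S, v)   # its S-coordinates

print(np.allclose(P @ v_T, v_S))  # True
```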
Lastly, the row space and null space must naturally be orthogonal: by definition the null space is the set of vectors satisfying Ax = 0, and we can regard the matrix product AB as a collection of dot products between the row vectors of A and the columns of B, so Ax = 0 means x is orthogonal to every row of A.
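That row-space/null-space argument can also be illustrated numerically; a sketch with a hypothetical 2x3 matrix and a vector from its null space:

```python
import numpy as np

# For any x in the null space of A (Ax = 0), each row of A dots to zero
# with x, so every row-space vector is orthogonal to every null-space vector.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])     # hypothetical 2x3 example

x = np.array([-1.0, -1.0, 1.0])     # Ax = 0: -1-2+3 = 0 and 0-1+1 = 0
print(A @ x)                        # [0. 0.]

# A row-space vector is a combination c @ A of the rows of A;
# its dot product with x is c @ (A @ x) = c @ 0 = 0.
c = np.array([4.0, -5.0])           # arbitrary coefficients
row_vec = c @ A
print(np.dot(row_vec, x))           # 0.0
```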
Thank you for reading this long message, and thank you again for bearing with my English.
0 answers
Post by Manfred Berger on June 21, 2013
I've been thinking a bit about Example 1. If I were to use Gram-Schmidt to expand this into a full basis of R^3 and then take the coordinates of v with respect to my new basis, the projection would be the first 2 components, correct?
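A sketch of the idea in this question, with hypothetical vectors (not the actual Example 1 from the lecture): extend two orthonormal vectors to a full orthonormal basis of R^3 with one Gram-Schmidt step; the projection of v onto the original span is then built from exactly the first two coordinates of v in the new basis.

```python
import numpy as np

# Hypothetical orthonormal pair u1, u2 in R^3 (stand-ins for Example 1).
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])

# Extend with a third vector and orthonormalize it against u1, u2
# (one Gram-Schmidt step):
w = np.array([1.0, 1.0, 1.0])
u3 = w - np.dot(w, u1) * u1 - np.dot(w, u2) * u2
u3 = u3 / np.linalg.norm(u3)

# Coordinates of v in the orthonormal basis {u1, u2, u3} are dot products.
v = np.array([2.0, -1.0, 4.0])
coords = np.array([np.dot(v, u1), np.dot(v, u2), np.dot(v, u3)])

# Projection of v onto span(u1, u2) uses only the first two coordinates:
proj = coords[0] * u1 + coords[1] * u2
print(proj)  # [ 2. -1.  0.]
```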