  • Linear Algebra II: Oxford Mathematics 1st Year Student Lecture

Okay, good morning everyone, I think maybe we'll get started. Welcome to Linear Algebra II; if you weren't expecting Linear Algebra II, then at least one of us is in the wrong room.

I'll start off with a few practical bits of information. This is Linear Algebra II, which builds on the Linear Algebra I that you saw last term. We're meeting Mondays and Wednesdays from 10 to 11 a.m., always in this room, so hopefully the timetable is clear. One thing to note is that this is a short course: we're only having lectures between weeks 1 and 4.

As you saw in Linear Algebra I, the idea of linear algebra is really to study functions with the property that

    f(ax + by) = a f(x) + b f(y),

where I'm thinking of x and y as vectors and a and b as scalars, though you could imagine slightly more general situations where this makes sense. Any function that satisfies this we call linear, and linear algebra is the study of these sorts of functions. It turns out that this is one of the most important and best understood areas of mathematics. There's a very beautiful theory here, and one quick summary is that you can understand a lot of it by looking at matrices and vector spaces, objects that we encountered in Linear Algebra I. A huge amount of modern mathematics consists of taking some very complicated situation and trying to approximate it in some way by a linear system: when you do calculus, you're approximating a more general function by something that looks linear on a small scale. So linear algebra really lies at the very heart of a huge amount of mathematics.
You've maybe already seen that linear algebra has various different applications. If you want to solve simultaneous equations, you can cast that as a problem in linear algebra, just as you can understand linear algebra as the study of vector spaces and more geometric problems.

One philosophy for the way I like to think about linear algebra, which I think will be useful in this course, is this: when you're thinking about statements, always imagine them geometrically, as questions about 2-by-2 matrices and transformations of the plane, but always write down proofs and arguments much more abstractly, using algebra. So you think about linear algebra geometrically, in particular as transformations of the plane. I'm not too good at visualizing things in three dimensions, but I can visualize things okay in two dimensions, and often it's very easy to visualize things if you just imagine that everything is some operation on the plane. So I like to think about what's going on in a geometric way, but we're going to prove rather more general statements. One of the really powerful things about linear algebra is that you can take this geometric intuition but then write things down formally so that they work in arbitrary dimensions in a very nice way, and often the algebraic way of doing the calculations is much more convenient than the geometric way, even if we were originally motivated by the geometry. So: do calculations and proofs abstractly, using algebra. If you do it this way, the proofs and the statements often hold in much more generality, and it's also much more convenient to do the calculations.

Okay, so I've maybe said all this in very vague terms. Let's think about a concrete example of how this might work, and this gets us to the first thing that I'd like to talk about.
The first topic is the determinant, and I'd like to demonstrate it through this sort of philosophy. Think of some linear transformation of the plane. What do I mean by this? Well, some combination of a rotation, or a stretching in one direction, or maybe a shear, where I shift things. A linear transformation preserves lines, so I can think about taking the unit box, and this will then be mapped to some skewed box, some sort of parallelogram, and maybe the sides won't be orthogonal anymore. I want to understand transformations like this. So this is just the unit square in R^2, and it gets mapped to some parallelogram. The unit square has area 1, and for the parallelogram the area won't necessarily be preserved, so it will have some different area C.

But I can think about the linear transformation acting on other shapes too. I could have a triangle, say, and it will get mapped to some strangely skewed triangle. If I draw a triangle with area 2, it's a fact, and it's an exercise if you want to work this out, that the image of the triangle will have area 2C. In fact, maybe we would guess just from doing a couple of examples like this that every shape (I'm not going to be precise about what I mean by "shape") has its area scaled by a factor of C. Often it's unwise to make a general guess just from doing two examples, but you can do a few more if you like, and it turns out this guess is true. This constant C depends on the linear transformation, and this constant C is essentially the determinant of the transformation. So it's clearly going to be some important property of any linear transformation.
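This area-scaling claim is easy to check numerically. A minimal sketch: the matrix A and the triangle below are arbitrary choices of mine, purely for illustration, and I assume the familiar 2-by-2 formula ad - bc for C from Linear Algebra I.

```python
# Check: a linear map of the plane scales the area of a triangle by C = |ad - bc|.
# The matrix A and the triangle are arbitrary illustrative choices.

def apply(A, p):
    """Apply the 2x2 matrix A (given as a list of rows) to the point p = (x, y)."""
    x, y = p
    return (A[0][0] * x + A[0][1] * y, A[1][0] * x + A[1][1] * y)

def triangle_area(p, q, r):
    """Area of the triangle pqr, via the shoelace formula."""
    return abs((q[0] - p[0]) * (r[1] - p[1]) - (r[0] - p[0]) * (q[1] - p[1])) / 2

A = [[2.0, 1.0],
     [0.5, 3.0]]
C = abs(A[0][0] * A[1][1] - A[0][1] * A[1][0])   # = 5.5

tri = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]       # a triangle of area 2
image = [apply(A, p) for p in tri]

print(triangle_area(*tri))      # 2.0
print(triangle_area(*image))    # 11.0, i.e. 2 * C
```

The image triangle has area exactly 2C, as the exercise predicts; any other polygon would behave the same way.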
Even if you're working in n dimensions, any reasonable shape has its volume scaled by a constant factor, and this constant factor is called the determinant. This is very much what I mean by thinking about things geometrically. But it turns out that it's really quite difficult to make a lot of this precise, and the calculations get a bit horrible, if you try to think about things just in terms of volumes. So, very much in the spirit of this philosophy, I want to do the calculations by abstracting out properties of the scaling. Let's think about a few properties that this scaling of areas would have in R^2, and try to write them down as abstract properties; maybe I can then understand properties of the linear transformation just from these abstract ideas.

First, note that the volume of the image under a linear transformation is linear in the entries of the transformation, column by column. So let A be a matrix with columns a_1 up to a_n; we can think of these as vectors in R^n, so A is an n-by-n matrix, and whenever a matrix has columns a_1 up to a_n I'll write A = [a_1 ... a_n]. The claim is that the volume of the image of the unit cube in R^n is linear in the columns: the volume when the j-th column is the vector b_j + c_j equals the volume when the j-th column is b_j plus the volume when the j-th column is c_j. This is based on the picture of two vectors, b and c: keeping the other column fixed, the area I get with column b plus the area I get with column c equals the area I get with column b + c.
So that's one property. A second, easy property is that if a_j = a_k for some j not equal to k, so two of the columns are the same, then the image of the unit cube collapses. Maybe it's easiest if we just think about this for two consecutive columns, so that we don't need to worry about the indices: if I have two consecutive columns which are exactly the same, then the volume of the image of the unit cube is equal to zero. Again, I'm really thinking about the picture here: if two of the vectors are the same, say columns a_j and a_{j+1}, then the whole unit cube just degenerates down to something flat, with no volume.

A third property, which we get just from looking at transformations of R^2, is that if A is the identity matrix then it preserves the unit cube in R^n, and the volume equals 1.

Now, I haven't actually proved any of these statements, but maybe they're all believable. What I want to say is that these are properties that the determinant should satisfy, and they are all algebraic properties, so I want to study the determinant through just these algebraic properties. We're going to define what it means for a map to be determinantal, which means it satisfies the analogues of these properties 1, 2 and 3. So we're now going to start doing some concrete mathematics, where before was some geometric motivation. Take any function D which takes as inputs n-by-n matrices with real entries and outputs a real number.
I'm going to say this map D is determinantal, meaning it behaves the way I expect the determinant to behave, if it satisfies the analogous properties:

(1) If I apply D to a matrix, written in column form, whose j-th column is b_j + c_j, then I want this to equal D of the matrix with j-th column b_j plus D of the matrix with j-th column c_j, for any choice of column index j between 1 and n. Similarly, I want it to scale: if I scale one column by a factor lambda, then the value of D scales by the factor lambda. So the first property is that D is linear in each column. This is basically the property we guessed the volume of the image of the unit cube should satisfy for any linear transformation.

(2) The second property is the analogue of the second property above: if I evaluate D at a matrix in which two columns, say two consecutive columns, are equal to each other, then I want D to equal zero.

(3) Finally, if I evaluate D at the identity matrix, I just want this to equal 1.

So this is a completely abstract definition of what it means for a function on matrices to be determinantal. Based on the motivation above, we expect the volume of the image of the unit cube to satisfy it, and so we think it should capture an important property of any linear map; but the definition now applies to matrices in general.

The first thing I want to do is show you that if a map satisfies these properties, then it actually satisfies slightly stronger versions of them, so we can automatically upgrade the definition.

Proposition. If D is a determinantal map, then:
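Before the proposition, the three defining properties can be sanity-checked against the familiar 2-by-2 formula det [a b; c d] = ad - bc, which this abstract definition is modelled on. A sketch with arbitrary test values of my own choosing:

```python
# Check properties 1-3 for the 2x2 formula det2([[a, b], [c, d]]) = a*d - b*c.
# The matrix A, vectors b, c and scalar lam are arbitrary test values.

def det2(A):
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def with_col(A, j, v):
    """Return a copy of the 2x2 matrix A with column j replaced by the vector v."""
    B = [row[:] for row in A]
    B[0][j], B[1][j] = v
    return B

A = [[3, 1], [4, 2]]
b, c, lam = (5, -2), (1, 7), 3

for j in (0, 1):
    # Property 1: linearity in column j (additivity and scaling).
    s = (b[0] + c[0], b[1] + c[1])
    assert det2(with_col(A, j, s)) == det2(with_col(A, j, b)) + det2(with_col(A, j, c))
    assert det2(with_col(A, j, (lam * b[0], lam * b[1]))) == lam * det2(with_col(A, j, b))

# Property 2: two equal columns give zero.
assert det2([[5, 5], [9, 9]]) == 0
# Property 3: the identity matrix maps to 1.
assert det2([[1, 0], [0, 1]]) == 1
print("properties 1, 2, 3 hold for det2")
```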
(i) I can swap two consecutive columns at the cost of switching the sign of the determinant, so D is alternating in consecutive columns: D of a matrix with j-th column b_j and (j+1)-th column b_{j+1} equals minus D of the same matrix with those two columns swapped.

(ii) We said the determinant is zero when two consecutive columns are the same, but actually they don't have to be consecutive: D evaluated at a matrix with columns b_i and b_j equal is zero whenever any two distinct columns are the same.

(iii) A stronger version of (i): we said D is alternating in consecutive columns, but in fact whenever I swap any two columns I get exactly the same answer with the opposite sign.

So let's go ahead and prove this proposition from the properties we used to define a determinantal map. Let's start with part (i), because that really holds the key to the whole thing. I want to think about D evaluated at a matrix whose j-th column and (j+1)-th column are both equal to b_j + b_{j+1}. By linearity I can expand this out, so what do I get? I'll get one term with columns b_j and b_j, one term with columns b_j and b_{j+1}, one term with j-th column b_{j+1} and (j+1)-th column b_j, and a final term with both the j-th and (j+1)-th columns equal to b_{j+1}. That's property 1.
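The expansion just described can be written out once, showing only columns j and j+1 and suppressing all the other (fixed) columns:

```latex
\begin{aligned}
D(\ldots,\; b_j + b_{j+1},\; b_j + b_{j+1},\; \ldots)
  ={}& D(\ldots,\, b_j,\, b_j,\, \ldots) + D(\ldots,\, b_j,\, b_{j+1},\, \ldots) \\
  {}+{}& D(\ldots,\, b_{j+1},\, b_j,\, \ldots) + D(\ldots,\, b_{j+1},\, b_{j+1},\, \ldots).
\end{aligned}
```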
But by property 2, the left-hand side is equal to zero, because its j-th and (j+1)-th columns are the same. Moreover, in the expansion into four terms on the right-hand side, the first term and the last term must also be zero, because they too have two equal columns. Substituting in these zeros, I find that the term with j-th column b_j and (j+1)-th column b_{j+1} equals the negative of exactly the same term with the two columns swapped, and this gives the first part.

For part (ii), I can now repeatedly use part (i) to move column j next to column i, changing the determinant only by a factor of plus or minus one. But then the determinant has to be zero, because I would have two consecutive columns the same, and that instantly gives part (ii).

For part (iii) there are a few different ways I could do it, but I can essentially follow exactly the same proof as in part (i). I think about evaluating D at a matrix with column b_i + b_j in position i and b_i + b_j also in position j, and I follow exactly the same argument as for part (i): I expand this out as four terms; the left-hand side must vanish because two of the columns are the same, since by part (ii) the whole thing vanishes whenever any two columns are the same; and the first term and the last term in the expansion must also vanish, again using part (ii). What's left is exactly the statement that swapping columns i and j switches the sign.

Okay, so far what we've seen is that we had this geometric guess that there's an important property of any matrix or linear transformation, which is how it scales volumes.
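Part (iii) is also easy to sanity-check numerically. A self-contained sketch, using the standard 3-by-3 determinant formula from Linear Algebra I (det3 below) and an arbitrarily chosen matrix:

```python
# Check part (iii): swapping ANY two columns flips the sign.
# det3 is the standard 3x3 cofactor formula; the matrix A is arbitrary.

def det3(A):
    (a, b, c), (d, e, f), (g, h, i) = A
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def swap_cols(A, j, k):
    """Return a copy of A with columns j and k swapped."""
    B = [row[:] for row in A]
    for row in B:
        row[j], row[k] = row[k], row[j]
    return B

A = [[2, 5, 1],
     [0, 9, 4],
     [7, 3, 6]]

print(det3(A))                       # 161
print(det3(swap_cols(A, 0, 2)))      # -161: a non-consecutive swap, part (iii)
print(det3(swap_cols(A, 0, 1)))      # -161: a consecutive swap, part (i)
```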
This is an important constant that you associate to any matrix. I then defined the abstract property of a function being determinantal, namely that it satisfies the properties we expect this geometric constant to satisfy, and we've seen in the proposition that any such function actually satisfies slightly stronger versions of those properties.

So we've now done enough of the basic important properties of the determinant, but we haven't yet seen that there actually is any map that satisfies these properties, and that's what I'd like to do now: I want to show you that there is at least some function which satisfies all of them. In the back of my mind, a function that satisfies these properties is the volume of the image of the unit cube, but again I don't want to do any calculations geometrically; I want to do them all in an algebraic way, because that makes everything much clearer. This is maybe the first important result of the course, so let's call it a theorem.

Theorem. A determinantal map on n-by-n matrices exists for each n >= 1.

It's not at all obvious from the definition that such a map should exist. To prove this, I want to do a proof by induction on the size of the matrices.

First let's think about the case n = 1, so we're just looking at 1-by-1 matrices, and I can define D of the 1-by-1 matrix (a) to be maybe the most obvious thing, just its entry a. Since the identity function is certainly linear, it's very easy to check that this D satisfies properties 1, 2 and 3. So we've shown that something exists for 1-by-1 matrices.

Now consider the case where n is bigger than 1, and assume the inductive hypothesis:
some determinantal map D_{n-1} exists for (n-1)-by-(n-1) matrices. We somehow want to construct a function on n-by-n matrices out of this function that we're assuming to exist on (n-1)-by-(n-1) matrices, so I want to cook up some candidate determinantal map, satisfying the three properties, which makes use of D_{n-1}.

So think of a matrix A, an n-by-n matrix with entries a_ij. If I'm going to use D_{n-1}, which is clearly going to be important in the argument, I have to think about how to construct an (n-1)-by-(n-1) matrix out of this n-by-n matrix, and maybe there's one easy way to do this, which is just to throw out a row and a column. So let A_ij (capital A) be the (n-1)-by-(n-1) matrix formed by removing the i-th row and j-th column. This is one way of constructing various (n-1)-by-(n-1) matrices from my n-by-n matrix A.

Now let's choose some i between 1 and n, and define my attempt at a determinantal function on n-by-n matrices as follows. I'll start with the term (-1)^(i+1) a_i1 D_{n-1}(A_i1). On its own this is some function on n-by-n matrices, but it ignores lots of the elements in the i-th row, so instead I want to add lots of similar terms together, to get something that's sensitive to every single element in the matrix. It turns out that a good way to do this is to keep on multiplying by -1 as I go through all the different possible choices of column in the i-th row:

    D(A) = sum over j from 1 to n of (-1)^(i+j) a_ij D_{n-1}(A_ij).
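This inductive construction can be transcribed almost literally into code. A sketch, with the one caveat that I use 0-indexed rows and columns, so the sign stays (-1)^(i+j) but with i and j starting from 0; the test matrices are arbitrary.

```python
# Sketch of the inductive construction: an n x n determinantal map built
# by expansion along row i, in terms of the (n-1) x (n-1) map on minors.

def minor(A, i, j):
    """A_ij: the matrix A with row i and column j removed (0-indexed)."""
    return [row[:j] + row[j + 1:] for k, row in enumerate(A) if k != i]

def det(A, i=0):
    """D(A) = sum over j of (-1)**(i+j) * a_ij * D_{n-1}(A_ij); D([[a]]) = a."""
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** (i + j) * A[i][j] * det(minor(A, i, j)) for j in range(n))

print(det([[1, 0], [0, 1]]))                    # 1: property 3 for n = 2
print(det([[2, 1], [0, 3]]))                    # 6
print(det([[2, 5, 5], [0, 9, 9], [4, 1, 1]]))   # 0: two equal columns
```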
I've just plucked this formula from thin air, but I think if you play around with it a little, and decide that you want to come up with some way of defining a function on n-by-n matrices in terms of a function on (n-1)-by-(n-1) matrices that depends on each element, you'll see that it isn't such an unnatural thing to pick. Maybe you wouldn't have guessed the factors of -1, but it should be clear once we go through a bit more of the proof why they are there. Anyway, I've defined this function on n-by-n matrices, and I want to show you that it's determinantal, so I want to verify that it satisfies properties 1, 2 and 3; once I've done this, I have completed the proof.

I'm just going to verify the properties in order. Let's start with property 1: we'd like to show that D is linear in each column. Since D is a sum of terms (-1)^(i+j) a_ij D_{n-1}(A_ij), it is sufficient to show that each of these is linear in the columns. So fix a column index k of my n-by-n matrix A and consider these terms.

If j = k, then A_ij doesn't depend on the k-th column of A at all, because A_ij is formed by removing the k-th column; and a_ij is just the i-th element of the k-th column, which is certainly linear in that column. Putting these two things together, we see that a_ij D_{n-1}(A_ij) is linear in the k-th column when j = k. What if j is not equal to k?
Well, then a_ij doesn't depend on the k-th column at all, because it's just some entry in the j-th column; but D_{n-1}(A_ij) is linear in the k-th column of A, since we're assuming that D_{n-1} is determinantal and hence linear in each of its columns. Therefore in this case too a_ij D_{n-1}(A_ij) is linear in the k-th column. So regardless of the value of j, these terms are always linear in the k-th column, and so D is as well. We've therefore verified property 1 for the function D that I've written down, and we're left to verify properties 2 and 3.

So let's go for property 2. Suppose that in my matrix A two consecutive columns are equal; I want to show that the determinant then has to be zero. My notation here is slightly bad, so, just so I don't get confused by all the indices, let me say instead that the k-th column equals the (k+1)-th column, a_k = a_{k+1}, and apologies for the switch. Now I'm looking at one of these matrices A_ij, where I'm removing the i-th row and the j-th column. If I haven't removed either the k-th column or the (k+1)-th column, then the columns inherited from a_k and a_{k+1} will still be the same, and still consecutive, so A_ij has two consecutive columns equal.
Therefore D_{n-1}(A_ij) = 0 for all of these (n-1)-by-(n-1) matrices, and this means that when I'm evaluating D(A), virtually all of the terms vanish: I'm only going to get the terms from A_ik and A_i(k+1). I can write out precisely what this is:

    D(A) = (-1)^(i+k) a_ik D_{n-1}(A_ik) + (-1)^(i+k+1) a_i(k+1) D_{n-1}(A_i(k+1)).

But if the k-th column equals the (k+1)-th column, then A_ik = A_i(k+1) and a_ik = a_i(k+1), so in particular these two terms are equal up to sign and cancel, and the determinant vanishes.

So now, finally, I just need to check property 3. If A = I_n, then a_ij = 0 unless j = i, and so all but one term in this huge expansion vanishes. I'm left with D(A) = (-1)^(i+i) a_ii D_{n-1}(A_ii). But this is equal to 1, because (-1)^(i+i) = (-1)^(2i) = 1, a_ii = 1 whenever A is the identity matrix, and A_ii, formed by removing the i-th row and i-th column of the identity, is the (n-1)-by-(n-1) identity matrix, which evaluates to 1 by the assumption that D_{n-1} is determinantal.

So the function D satisfies properties 1, 2 and 3, and so I've shown it's determinantal; therefore, by induction, we know that there exists at least one determinantal map on n-by-n matrices for every n. Okay, so I'll stop here.
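One last numerical footnote: the construction began with an arbitrary choice of row i, so a natural worry is whether different rows give different functions. Numerically they agree, which is consistent with the uniqueness result promised for the next lecture. A self-contained sketch (the matrix is an arbitrary choice, and minor/det repeat the construction so the snippet stands alone):

```python
# Check: expanding along different rows i gives the same value of D(A).

def minor(A, i, j):
    """A with row i and column j removed (0-indexed)."""
    return [row[:j] + row[j + 1:] for k, row in enumerate(A) if k != i]

def det(A, i=0):
    """Expansion along row i, with D([[a]]) = a as the base case."""
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** (i + j) * A[i][j] * det(minor(A, i, j)) for j in range(n))

A = [[2, 5, 1],
     [0, 9, 4],
     [7, 3, 6]]

print([det(A, i) for i in range(3)])    # [161, 161, 161]
```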
What we've seen is that we guessed there's an important geometric property of matrices, which is how they scale volumes, but it turns out to be rather awkward to work with directly, so we abstracted it out into some algebraic properties; and then, working with those algebraic properties, we've shown that there's something that exists that behaves in exactly the same way. Next time we'll show that this is in fact the unique map that has all of these properties. Okay, thanks a lot.