Math 2210Q-001, Fall 2011


Week                         Sections in text (estimated)  Quiz/Test    Administrivia
Week 1: Aug. 29 - Sept. 2    1.1-1.4                       None!
Week 2: Sept. 5-9            1.4, 1.5, 1.7                 Quiz #1      No class on Monday (Labor Day)
Week 3: Sept. 12-16          1.8, 1.9, 2.1                 Quiz #2      Monday: last day to drop without a W or change grading option to pass/fail
Week 4: Sept. 19-23          2.1-2.4                       Quiz #3
Week 5: Sept. 26-30          2.4-2.5                       Quiz #4
Week 6: Oct. 3-7             3.1-3.2                       Midterm #1
Week 7: Oct. 10-14           3.2-3.3                       Quiz #5
Week 8: Oct. 17-21           4.1-4.2                       Quiz #6
Week 9: Oct. 24-28           4.2-4.5                       Quiz #7
Week 10: Oct. 31 - Nov. 4    4.5-4.7                       Quiz #8      Monday: last day to drop or change a pass/fail option to letter grade
Week 11: Nov. 7-11           5.1-5.2                       Midterm #2
Week 12: Nov. 14-18          5.2-5.4                       Quiz #9
Week 13: Nov. 21-25          Thanksgiving break
Week 14: Nov. 28 - Dec. 2    6.1-6.4                       Quiz #10
Week 15: Dec. 5-9            6.4, 7.1, 7.4                 Quiz #11

August 31

The only way to get comfortable with applying these elementary row operations is to do it. Practice this! Also, convince yourself (if you aren't already convinced) that applying the elementary row operations we talked about in class doesn't affect the solution set of the linear system.
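
As a quick sanity check, here is a minimal sketch (with a made-up 2x2 system) of the point above: a replacement row operation doesn't disturb a solution of the system.

```python
# A hypothetical 2x2 system: x + 2y = 5 and 3x + y = 5, with solution (1, 2).
# Each augmented row [a, b, c] encodes the equation a*x + b*y = c.
rows = [[1, 2, 5], [3, 1, 5]]

def satisfies(rows, x, y):
    return all(abs(a * x + b * y - c) < 1e-9 for a, b, c in rows)

# Elementary row operation: replace R2 with R2 - 3*R1.
new_rows = [rows[0], [r2 - 3 * r1 for r1, r2 in zip(rows[0], rows[1])]]

# The solution (1, 2) satisfies the system both before and after the operation.
print(satisfies(rows, 1, 2), satisfies(new_rows, 1, 2))  # True True
```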

September 2

Today's challenge is to understand the difference between row echelon form and reduced row echelon form. If I gave you a matrix, could you recognize whether it was in REF, RREF, or neither?
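
One way to internalize the definitions is to write them down as checks. This sketch (using the REF/RREF conditions from class, with hypothetical example matrices) tests a matrix stored as a list of rows:

```python
def leading_index(row):
    # Column of the first nonzero entry, or None for an all-zero row.
    return next((j for j, v in enumerate(row) if v != 0), None)

def is_ref(M):
    # REF: zero rows at the bottom; each leading entry strictly to the
    # right of the leading entry in the row above.
    seen_zero, prev = False, -1
    for L in (leading_index(r) for r in M):
        if L is None:
            seen_zero = True
        else:
            if seen_zero or L <= prev:
                return False
            prev = L
    return True

def is_rref(M):
    # RREF: additionally, each leading entry is 1 and is the only
    # nonzero entry in its column.
    if not is_ref(M):
        return False
    for i, row in enumerate(M):
        L = leading_index(row)
        if L is None:
            continue
        if row[L] != 1 or any(M[k][L] != 0 for k in range(len(M)) if k != i):
            return False
    return True

print(is_rref([[1, 0, 2], [0, 1, 3]]))                       # True
print(is_ref([[2, 1], [0, 3]]), is_rref([[2, 1], [0, 3]]))   # True False
```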

September 7

Think about what it means to be in the span of a set of vectors. How would you find out if one vector was in the span of a set of vectors?
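
The standard approach is to row reduce an augmented matrix and check for consistency. Here is a sketch with hypothetical vectors (b happens to be 3*v1 + 4*v2, so the answer should be yes):

```python
# Is b in Span{v1, v2}? Row reduce the augmented matrix [v1 v2 | b] and
# look for an inconsistent row [0 0 | nonzero].
v1, v2, b = [1.0, 0.0, 2.0], [0.0, 1.0, 1.0], [3.0, 4.0, 10.0]
M = [[v1[i], v2[i], b[i]] for i in range(3)]  # columns v1, v2, b

def eliminate(M):
    # Gaussian elimination with partial pivoting on all but the last column.
    M, r = [row[:] for row in M], 0
    for c in range(len(M[0]) - 1):
        if r == len(M):
            break
        piv = max(range(r, len(M)), key=lambda k: abs(M[k][c]))
        if abs(M[piv][c]) < 1e-12:
            continue
        M[r], M[piv] = M[piv], M[r]
        for k in range(len(M)):
            if k != r and abs(M[k][c]) > 1e-12:
                f = M[k][c] / M[r][c]
                M[k] = [a - f * p for a, p in zip(M[k], M[r])]
        r += 1
    return M

R = eliminate(M)
consistent = all(any(abs(x) > 1e-9 for x in row[:-1]) or abs(row[-1]) < 1e-9
                 for row in R)
print(consistent)  # True: b is in Span{v1, v2}
```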

September 9

The theorem I gave at the end of class (Theorem 4, section 1.4) is a good way for you to answer difficult-sounding questions by doing something you've been doing for a while. If I ask if the columns of A span R^m, all you have to do is find out if A has a pivot position in every row!
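
In code, the pivot-in-every-row test is just a zero-row check once you have an echelon form. A tiny sketch, assuming B is an echelon form of some hypothetical matrix A:

```python
# Theorem 4 in action: the columns of A span R^m exactly when an echelon
# form of A has a pivot position in every row, i.e. no all-zero rows.
B = [[1, 2, 0], [0, 0, 3], [0, 0, 0]]  # hypothetical echelon form, one zero row
spans = all(any(v != 0 for v in row) for row in B)
print(spans)  # False: the zero row means the columns do not span R^3
```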

September 12

Today we talked about different kinds of linear systems. Make sure you understand how the solution sets for a nonhomogeneous system and the corresponding homogeneous system are related to each other.

September 14

Think about the theorem I gave you at the end of class today. Keep in mind that the statements don't necessarily reverse in the way you'd hope. For instance, the theorem states that a set of vectors that has more vectors than each of the vectors has entries must be linearly dependent. One of you asked if a set of vectors with fewer elements than each of the vectors has entries must be linearly independent. We showed that this wasn't the case by producing a linearly dependent set of two vectors with three entries each. Can you produce a linearly independent set of two vectors with three entries each?
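
For the special case of exactly two vectors, there is a quick criterion you can check by hand or in code: two vectors are linearly dependent precisely when one is a scalar multiple of the other. A sketch with hypothetical vectors in R^3:

```python
# Two vectors are linearly dependent exactly when they are parallel,
# i.e. all 2x2 "cross" determinants u_i*v_j - u_j*v_i vanish.
def parallel(u, v):
    return all(u[i] * v[j] - u[j] * v[i] == 0
               for i in range(3) for j in range(i + 1, 3))

print(parallel([1, 0, 2], [0, 1, 1]))   # False: this pair is independent
print(parallel([1, 2, 3], [2, 4, 6]))   # True: this pair is dependent
```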

September 16

Can you prove that any 2x2 matrix transformation is linear?
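
A numerical spot check is not a proof, but it can guide one. This sketch (with a hypothetical matrix and vectors) tests the two linearity properties T(u + v) = T(u) + T(v) and T(cu) = cT(u) for T(x) = Ax:

```python
A = [[2, 1], [0, 3]]  # hypothetical 2x2 matrix

def T(x):
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

u, v, c = [1, 4], [-2, 5], 7
print(T([u[0] + v[0], u[1] + v[1]]) == [a + b for a, b in zip(T(u), T(v))])  # True
print(T([c * u[0], c * u[1]]) == [c * t for t in T(u)])                      # True
```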

September 19

I recommend that you look at the tables on pp. 73-75 of your book to gain a better understanding of how certain 2x2 matrices act on vectors.

September 21

I want to reiterate that I don't care what technique you use for multiplying matrices. The one I presented in class breaks down the process into more steps and is, I think, a bit more intuitive and foolproof, but if you know another method, you're welcome to use it.
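
Whatever technique you use, every method computes the same thing: entry (i, j) of AB is the dot product of row i of A with column j of B. A sketch with hypothetical matrices:

```python
def matmul(A, B):
    # Entry (i, j) of the product is (row i of A) . (column j of B).
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]  # swapping the columns of A
print(matmul(A, B))   # [[2, 1], [4, 3]]
```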

September 23

At this point, we have a quick calculation to tell us whether a 2x2 matrix is invertible and a formula for calculating the inverse if it is. We also have a method for finding the inverse of an nxn matrix if one exists: row reduce the augmented matrix [A I], and if A transforms into I, the right half is the inverse. If you find out at some point in the computation that you can't possibly transform A into I, you can stop right there and say that the inverse does not exist.
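
The 2x2 case can be packaged into a few lines: the matrix is invertible exactly when ad - bc is nonzero, and then the inverse is (1/(ad-bc)) times [[d, -b], [-c, a]]. A sketch with hypothetical matrices:

```python
def inverse2x2(M):
    (a, b), (c, d) = M
    det = a * d - b * c
    if det == 0:
        return None  # not invertible
    return [[d / det, -b / det], [-c / det, a / det]]

print(inverse2x2([[2, 1], [5, 3]]))  # [[3.0, -1.0], [-5.0, 2.0]]
print(inverse2x2([[1, 2], [2, 4]]))  # None: the determinant is 0
```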

September 26

The Invertible Matrix Theorem I talked about in class is Theorem 8 on p. 112. I recommend reading through the proof and looking at the diagrams on the left showing the implications. Can you combine them all into one big diagram?

September 28

Convince yourself that multiplying two partitioned matrices gives the same answer as multiplying them without the partitions.

September 30

Think about why you would want to find an LU-factorization. As a challenge, read the section about permuted LU-factorizations on p. 127.

October 3

Try reading the very first part of Section 3.1 to understand where the concept for the determinant comes from.

October 7

Today's exercise: calculate the determinants I described in class and convince yourself that they all give the same answer.

October 10

If you want to know more about proof by induction, here's a nice example on YouTube. If you just want to watch dominoes falling, there's a YouTube video for that, too.

October 12

I suggest reading the proofs of Theorems 3 and 6 on p. 174. See me if you have questions!

October 14

My challenge for you for the day: compute the determinant of a general 2x2 matrix using the adjugate method we talked about in class today and show that we get the formula in Theorem 4, Section 2.2. Remember that the determinant of a 1x1 matrix is just the entry in the matrix.
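
The cofactor expansion underlying this can be sketched recursively (expanding along the first row, with the 1x1 base case mentioned above); the example matrices are hypothetical:

```python
def det(M):
    # Base case: the determinant of a 1x1 matrix is its single entry.
    n = len(M)
    if n == 1:
        return M[0][0]
    # Cofactor expansion along the first row.
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

print(det([[3, 1], [4, 2]]))                    # 3*2 - 1*4 = 2
print(det([[1, 0, 2], [0, 1, 0], [3, 0, 4]]))   # -2
```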

October 17

Take a look at Example 5 on p. 183 to see how linear algebra can give you the formula for the area of an ellipse! It would also be good for you to come up with an example of a set with addition and scalar multiplication that is not a vector space.

October 19

Find two examples of subsets of R^3 that are vector spaces and two examples that are not.

October 21

Can you show that V = Span{sin^2(x),cos^2(x)} contains all constant functions?
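
The key identity is sin^2(x) + cos^2(x) = 1, so c*sin^2(x) + c*cos^2(x) is the constant function c. A quick numerical spot check (not a proof) at a few hypothetical sample points:

```python
import math

c = 5.0
f = lambda x: c * math.sin(x) ** 2 + c * math.cos(x) ** 2
print(all(abs(f(x) - c) < 1e-12 for x in [0.0, 1.0, 2.5, -3.7]))  # True
```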

October 24

Think about Col A and Nul A. If A is 3x4, what is Col A a subspace of and what is Nul A a subspace of?

October 26

The Spanning Set Theorem really just says that if you have a spanning set and you find out that one of its vectors is a linear combination of the others, you can delete it and the remaining vectors still span the same subspace.

October 28

Come up with a nonstandard basis for R^3 and find the coordinate vector for the vector [1 1 1] (given in terms of the standard basis) in your basis!
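
Here is one hypothetical choice (yours will differ): a triangular basis, which makes the coordinates easy to solve for by back-substitution.

```python
# A hypothetical nonstandard basis for R^3, chosen triangular on purpose.
b1, b2, b3 = [1, 0, 0], [1, 1, 0], [1, 1, 1]
x = [1, 1, 1]  # given in standard coordinates

# Solve c1*b1 + c2*b2 + c3*b3 = x from the bottom entry up.
c3 = x[2] / b3[2]
c2 = (x[1] - c3 * b3[1]) / b2[1]
c1 = (x[0] - c3 * b3[0] - c2 * b2[0]) / b1[0]
print([c1, c2, c3])  # [0.0, 0.0, 1.0]: here x IS the third basis vector
```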

October 31

Remember that the dimension is fixed for any given vector space: every basis has the same size.

If you're up for a bit of mathematical philosophy and you want to know more about the Axiom of Choice, check here. If you want to talk about it, come find me!

November 2

To find a basis for Nul A, Col A, or Row A, put A into REF (and get, say, B). The bases for Nul A and Row A are the same as the bases for Nul B and Row B. Why is the basis for Col A not necessarily the same as the basis for Col B, and how can you find a basis for Col A if you know a basis for Col B?

November 4

When computing the change-of-coordinates matrices, just remember that the matrix transforming C-coordinates to B-coordinates is the inverse of the one that transforms B-coordinates to C-coordinates.

November 7

Remember that for the purposes of Chapter 5, all matrices are square. (To find eigenvalues, we will need to be able to compute determinants.)

November 11

Some people seemed confused about multiplicities of eigenvalues. If I ask how many eigenvalues a matrix has, I want to know how many DIFFERENT eigenvalues it has. However, some of them may correspond to repeated factors of the characteristic polynomial. If the characteristic polynomial is (x-5)^2(x-1)(x+6)(x-pi), we say that there are four eigenvalues: one with multiplicity 2 and three with multiplicity 1.

November 14

The example I came up with in class of two matrices with the same eigenvalues that are not similar is correct: the matrix whose first row is 2 1 and second row is 0 2 is not similar to twice the 2x2 identity matrix. Can you prove that?

Also, if you want to know more about Markov chains, take a look at Section 4.9.

November 16

Today we found another use for eigenvalues and eigenvectors: figuring out if a matrix is diagonalizable. We know this helps us speed up computations of powers of matrices. What else might it help us do?

November 18

Here's a practice problem for you: Suppose you have a linear transformation T from the vector space V to the vector space W. B={b1,b2,b3} is a basis for V, and C={c1,c2,c3,c4} is a basis for W. If you know that
T(b1) = 3c1 + c2 + 4c3 - c4,
T(b2) = c1 + 2c2 - c3 + 2c4, and
T(b3) = -2c1 - c2 + 2c3,
what is the matrix for T relative to the bases B and C?

November 28

If you want to know more about the fun you can have with complex eigenvalues, read section 5.5. Remember that to think about this, we have to work with vectors of complex numbers instead of just vectors of real numbers. The problem I was doing in class today is #2 in 5.5: try applying that matrix to the vector [1 0] over and over and plot the vectors you get as points in R^2.
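
You can do this kind of experiment in a few lines. This sketch uses a hypothetical rotation-scaling matrix (not necessarily the one from Exercise 2) with scaling factor 0.95, so the iterates spiral inward:

```python
import math

# Hypothetical rotation-scaling matrix: rotate by pi/6, scale by r = 0.95.
r, theta = 0.95, math.pi / 6
A = [[r * math.cos(theta), -r * math.sin(theta)],
     [r * math.sin(theta),  r * math.cos(theta)]]

x, pts = [1.0, 0.0], [[1.0, 0.0]]
for _ in range(8):
    x = [A[0][0] * x[0] + A[0][1] * x[1], A[1][0] * x[0] + A[1][1] * x[1]]
    pts.append(x)

# The lengths shrink by a factor of r each step: an inward spiral in R^2.
mags = [math.hypot(*p) for p in pts]
print(all(mags[i + 1] < mags[i] for i in range(len(mags) - 1)))  # True
```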

November 30

W and W(perp) must have one vector in common. What is it? Is it possible for them to have any other vectors in common?

Convince yourself that (Row A)(perp) is Nul A if you aren't convinced already!

December 2

When calculating the orthogonal projection of a vector y onto a line parallel to a vector u, you scale u by (y.u)/(u.u). Do you get to save time on calculations if u is a unit vector?
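
The formula proj_u(y) = ((y.u)/(u.u)) u fits in a few lines; notice in the code exactly where u.u = 1 would let you skip the division. The example vectors are hypothetical:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def project(y, u):
    # If u were a unit vector, dot(u, u) would be 1 and this division
    # could be skipped.
    t = dot(y, u) / dot(u, u)
    return [t * x for x in u]

print(project([4, 2], [1, 0]))  # [4.0, 0.0]: projection onto the x-axis
print(project([3, 4], [1, 1]))  # [3.5, 3.5]
```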

December 5

Now you know at least three standard ways of factoring a matrix: LU-factorization, the diagonal factorization, and QR-factorization. Think about what must be known about a matrix to be able to factorize it in each of these three ways.

Remember to bring a #2 pencil on Wednesday!