Math 3160-003, Spring 2014

Schedule

Week                       Sections in text (estimated)   Test         Administrivia
Week 1:  Jan. 20-24        1.1-1.5                                     No class Monday
Week 2:  Jan. 27-31        2.1-2.5
Week 3:  Feb. 3-7          2.7, 3.1, 3.2                               Monday: last day to drop without a "W" or choose the P/F option
Week 4:  Feb. 10-14        3.2-3.4
Week 5:  Feb. 17-21        3.4, 3.5                       Midterm #1
Week 6:  Feb. 24-28        4.1-4.5
Week 7:  March 3-7         4.6-4.8
Week 8:  March 10-14       4.8-4.9, 5.1-5.4
Week 9:  March 17-21       Spring break!
Week 10: March 24-28       5.5, 5.6                       Midterm #2
Week 11: March 31-April 4  5.7, 6.1, 6.2                               Monday: last day to drop or choose to get a letter grade
Week 12: April 7-11        6.3-6.5, 6.7
Week 13: April 14-18       7.1-7.4
Week 14: April 21-25       7.7, 7.8
Week 15: April 28-May 2    8.1-8.4

Tuesday, January 21

One of you asked for additional sources for the course. Grinstead and Snell is a very good book, and you can find it here for free.

On Thursday we'll talk more about the number of ways to rearrange a word when some of the letters are duplicated.
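If you want to experiment before then, here's a quick numerical check in Python. The word "PEPPER" and the brute-force comparison are just my illustration, not how we'll do it in class:

```python
from itertools import permutations
from math import factorial

word = "PEPPER"  # 6 letters: three P's, two E's, one R

# Brute force: generate all 6! orderings and keep only the distinct ones.
distinct = len(set(permutations(word)))

# The counting formula we'll discuss: 6! / (3! * 2! * 1!)
formula = factorial(6) // (factorial(3) * factorial(2) * factorial(1))

print(distinct, formula)  # both print 60
```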

Thursday, January 23

One thing to keep in mind when you read the textbook is that there will sometimes be long examples in a section followed by new, useful material, so don't stop reading a section just because the example is complicated! Skip the example and see what's next. (I don't think there are any outrageously complicated examples in the sections we covered this week, but it will happen. Read Section 1.6 if you want a challenge instead!)

Tuesday, January 28

Remember that Axiom 3 may be simpler to think about if you consider only two mutually exclusive events. In that case, it turns into P(E union F) = P(E)+P(F). Try to think of an example of this that involves something straightforward like rolling a die.
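If it helps, here's a minimal sketch in Python of one such die example (the events E and F are my own choices):

```python
from fractions import Fraction

outcomes = set(range(1, 7))       # one roll of a fair die
E = {1, 2}                        # "roll a 1 or a 2"
F = {6}                           # "roll a 6" -- mutually exclusive with E

def prob(event):
    return Fraction(len(event), len(outcomes))

# Axiom 3 for two mutually exclusive events: P(E union F) = P(E) + P(F)
print(prob(E | F))                # 1/2
print(prob(E) + prob(F))          # 1/2
```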

Thursday, January 30

The trick we used to find the probabilities of unions of events is called the Inclusion-Exclusion Principle. Prop. 4.4 in Section 2.4 has a more general version if you'd like to read it, but it amounts to what I said in class: sum the probabilities of the individual events, subtract the probabilities of the intersections of the pairs, add the probabilities of the intersections of the triples, etc.
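If you'd like to see the alternating sum in action, here's a rough Python sketch that checks it against a direct computation (the three events on a die roll are just my example):

```python
from fractions import Fraction
from itertools import combinations

outcomes = set(range(1, 7))                    # one roll of a fair die
events = [{1, 2, 3}, {2, 4, 6}, {3, 6}]        # arbitrary example events

def prob(event):
    return Fraction(len(event), len(outcomes))

# Direct computation of P(E1 union E2 union E3).
direct = prob(set().union(*events))

# Inclusion-exclusion: add singles, subtract pairs, add triples, ...
ie = Fraction(0)
for k in range(1, len(events) + 1):
    sign = (-1) ** (k + 1)
    for combo in combinations(events, k):
        ie += sign * prob(set.intersection(*combo))

print(direct, ie)   # the two agree: 5/6 and 5/6
```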

Examples 5n and 5o are somewhat more complicated than the rest of the examples in Section 2.5.

Tuesday, February 4

We went through Example 5i in class today (Section 2.5) and concluded that you only need 23 people for the probability that at least two of them share a birthday to be bigger than .5. How many people do you need for the probability that at least two of them share a particular birthday (for example, yours) to be bigger than .5?
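If you want to check the in-class number numerically, here's a small Python sketch (it assumes 365 equally likely birthdays, as in the book):

```python
# P(no two of n people share a birthday)
def p_all_distinct(n):
    p = 1.0
    for k in range(n):
        p *= (365 - k) / 365
    return p

# Smallest n with P(some shared birthday) > 0.5 -- should be 23
n = 1
while 1 - p_all_distinct(n) <= 0.5:
    n += 1
print(n)  # 23
```

The same search works for the question above once you write down the right "no match" probability.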

Thursday, February 6

Lots of links today! If you want to run a simulator of the Monty Hall problem, you can use this one on the New York Times website.
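If the simulator won't load for you, here's a rough Monte Carlo sketch in Python (it assumes the standard rules: the host always opens a goat door you didn't pick):

```python
import random

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a door that is neither your pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("stay:  ", play(switch=False))   # about 1/3
print("switch:", play(switch=True))    # about 2/3
```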

Your first midterm will take place on February 18. You can use the midterm I gave last spring as a practice midterm. I expect you to be able to do all the problems on it by the time you take yours (we haven't discussed how to do the last one yet, but we will next Tuesday).

Tuesday, February 11

Here's a Bayesian problem with three options:

A general knows that an attack is coming, and he knows that the enemy will attack on the left, the right, or the center. One of his lieutenants tells him that the probability of an attack on the left is 1/5, the probability of an attack on the right is 3/10, and the probability of an attack on the center is 1/2. His communications officer has been hearing radio traffic, and he knows from experience that the probability of hearing this chatter given that the enemy will attack on the left is 1/5, the probability given that the enemy will attack on the center is 7/10, and the probability given that the enemy will attack on the right is 1/10. Given that this chatter is, in fact, being heard, what are the probabilities of an attack on the left, the right, and the center?
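After you've worked it out by hand, here's a small Python sketch you could use to check your numbers (it just implements Bayes' formula with the values from the problem):

```python
from fractions import Fraction as F

prior      = {"left": F(1, 5), "right": F(3, 10), "center": F(1, 2)}
likelihood = {"left": F(1, 5), "right": F(1, 10), "center": F(7, 10)}  # P(chatter | attack there)

# P(chatter) by the law of total probability
p_chatter = sum(prior[d] * likelihood[d] for d in prior)

# Bayes' formula: P(direction | chatter)
for d in prior:
    print(d, prior[d] * likelihood[d] / p_chatter)
```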

Thursday, February 13

Snow day!

Tuesday, February 18

Snow day!

Thursday, February 20

Finally, a midterm!

Tuesday, February 25

Make sure you understand the table we put together in class so you can tell the difference between disjoint events and independent events.
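If you'd like to test yourself, here's a rough Python check on a die roll (the two pairs of events are my own examples, not the ones from the class table):

```python
from fractions import Fraction

S = set(range(1, 7))  # one roll of a fair die

def P(A):
    return Fraction(len(A), 6)

def report(E, F):
    disjoint    = not (E & F)
    independent = P(E & F) == P(E) * P(F)
    print(f"disjoint={disjoint}, independent={independent}")

report({1, 2}, {2, 4, 6})   # not disjoint, but independent: 1/6 = (1/3)(1/2)
report({1}, {2})            # disjoint, but NOT independent: 0 != 1/36
```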

Thursday, February 27

Try figuring out the probability mass function and graphing the cumulative distribution function for the following experiment and random variables: Roll a fair 6-sided die. Let X be a random variable whose value is the number showing on the die, and let Y be a random variable whose value is 1 if the number showing is odd and 0 if the number showing is even.
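Once you've tried it on paper, here's a hedged Python sketch you could use to check your pmf values (the cdf helper is my own shorthand):

```python
from fractions import Fraction

die = range(1, 7)

# X = number showing; Y = 1 if the number is odd, 0 if even
pmf_X = {x: Fraction(1, 6) for x in die}
pmf_Y = {}
for x in die:
    y = x % 2
    pmf_Y[y] = pmf_Y.get(y, Fraction(0)) + Fraction(1, 6)

def cdf(pmf, t):
    return sum(p for v, p in pmf.items() if v <= t)

print(pmf_Y)                         # P(Y=0) and P(Y=1) are each 1/2
print([cdf(pmf_X, t) for t in die])  # 1/6, 2/6, ..., 6/6 -- a staircase
```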

Tuesday, March 4

Examples 4b and 4c in Section 4.4 are rather complicated, so proceed at your own risk.

Thursday, March 6

I really suggest reading all the examples in Section 4.6, though 6f is a little complicated. You should also calculate the variance of the 1/3/4/5/6/8 die we talked about in class for a bit more practice.
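Here's a small Python helper you could use to check your arithmetic. It's shown on a standard die; pass in the faces from class to check your answer:

```python
from fractions import Fraction

def mean_and_variance(faces):
    n = len(faces)
    ex  = sum(Fraction(x, 1) for x in faces) / n       # E[X]
    ex2 = sum(Fraction(x * x, 1) for x in faces) / n   # E[X^2]
    return ex, ex2 - ex**2                             # Var(X) = E[X^2] - (E[X])^2

print(mean_and_variance([1, 2, 3, 4, 5, 6]))   # 7/2 and 35/12
# To check your work, call it with the die from class: [1, 3, 4, 5, 6, 8]
```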

Tuesday, March 11

Pages 138-145 in the text are a technical discussion of Poisson random variables. You can find, among other things, calculations of the expected value and variance and a discussion of its derivation from the binomial. I don't expect you to read them, though you can if you want!

Keep in mind that while Poisson and binomial random variables both count the number of events/successes, the events for binomial random variables are triggered by something particular (flipping a coin, rolling a die, etc.) while the events for Poisson random variables happen at random (earthquakes, typos, etc.).
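One way to see the connection numerically -- this is just an illustrative sketch, with lambda = 2 as my arbitrary choice -- is to watch Binomial(n, lambda/n) approach Poisson(lambda) as n grows:

```python
from math import comb, exp, factorial

lam = 2.0    # average number of events

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# With n = 1000 trials of probability lam/n each, the two pmfs nearly match.
for k in range(5):
    print(k, binom_pmf(k, 1000, lam / 1000), poisson_pmf(k, lam))
```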

Thursday, March 13

We talked about four more distributions today: geometric, negative binomial, hypergeometric, and Benford's law. The last one was just for fun -- you won't be tested on it. For each distribution we've discussed, I expect you to know the formula for the probability mass function, the scenario in which it is used, and the formulas for the expected value and the variance (if I gave them to you). Example 8h in Section 4.8 is rather complicated, but the others are fine.
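If you want to quiz yourself on the formulas, here's a Python sketch of the three pmfs in the textbook's conventions (the parameter values in the sanity checks are arbitrary):

```python
from math import comb

def geometric_pmf(n, p):
    # P(X = n): first success occurs on trial n
    return (1 - p)**(n - 1) * p

def negative_binomial_pmf(n, r, p):
    # P(X = n): r-th success occurs on trial n
    return comb(n - 1, r - 1) * p**r * (1 - p)**(n - r)

def hypergeometric_pmf(i, N, m, n):
    # P(X = i): i special items in a sample of n, drawn without
    # replacement from N items of which m are special
    return comb(m, i) * comb(N - m, n - i) / comb(N, n)

# Sanity check: each pmf should sum to (essentially) 1 over its support.
print(sum(geometric_pmf(n, 0.3) for n in range(1, 500)))
print(sum(negative_binomial_pmf(n, 3, 0.3) for n in range(3, 500)))
print(sum(hypergeometric_pmf(i, 20, 5, 8) for i in range(0, 6)))
```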

Last year's second midterm can be found here. I think you can do all the problems on it!

Thursday, March 27

Example 9e in Section 4.9 is a rather complicated calculation, but the others are reasonable.

The main thing you should remember about the formulas for continuous random variables is that if you would use a sum in the discrete case, you need to integrate in the continuous case, and the pdf plays the same role in the computations that the pmf did in the discrete case.
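Here's a tiny side-by-side in Python (using sympy; the die and the Uniform(0,1) example are my own choices) showing the sum and the integral playing the same role:

```python
import sympy as sp

x = sp.Symbol('x')

# Discrete: E[X] for a fair die is a sum over the pmf.
E_discrete = sum(sp.Rational(1, 6) * k for k in range(1, 7))

# Continuous: E[X] for Uniform(0,1) is the same computation with
# the pdf f(x) = 1 and an integral in place of the sum.
E_continuous = sp.integrate(x * 1, (x, 0, 1))

print(E_discrete, E_continuous)   # 7/2 and 1/2
```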

Tuesday, April 1

So far we haven't done anything complicated with continuous random variables, so just focus on remembering the distributions we're talking about. I'll expect you to know the same things about the continuous distributions that I expected you to know about the discrete ones.

Thursday, April 3

Here is the Galton board applet I was trying to run in class. I hope it works better for you -- it's a really interesting thing to play with.
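In case the applet misbehaves again, here's a rough text-mode simulation in Python (the row and ball counts are arbitrary). Each ball's bin is its number of rightward bounces, so the bins follow a Binomial(rows, 1/2) distribution:

```python
import random
from collections import Counter

def galton(rows=10, balls=2000):
    # Each ball bounces left or right at every peg with probability 1/2.
    bins = Counter(sum(random.random() < 0.5 for _ in range(rows))
                   for _ in range(balls))
    for b in range(rows + 1):
        print(f"{b:2d} {'#' * (bins[b] // 10)}")

galton()   # prints a rough bell shape
```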

Tuesday, April 8

I don't expect you to know anything from Section 5.5.1 (hazard rate functions). It might be useful to the engineers among you, so if you want to talk about it, I'm happy to go over it with you, but it won't be an official part of the class.

Thursday, April 10

Sections 5.7 and 6.1 are good reads. The main thing to remember now is that when you compute the joint p.m.f. of two random variables, you need to calculate P(X=x,Y=y) directly and not just multiply P(X=x) and P(Y=y). We'll talk about when you can do the latter on Tuesday!
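Here's a minimal example of why the product can go wrong, sketched in Python (X and Y are my own toy random variables):

```python
from fractions import Fraction

# Roll one die.  X = the number showing, Y = 1 if X >= 4, else 0.
joint = {}
for x in range(1, 7):
    y = 1 if x >= 4 else 0
    joint[(x, y)] = Fraction(1, 6)

pX = Fraction(1, 6)                                    # P(X = 5)
pY = sum(p for (x, y), p in joint.items() if y == 1)   # P(Y = 1) = 1/2

print(joint.get((5, 1)))   # the real P(X=5, Y=1) = 1/6
print(pX * pY)             # the (wrong) product = 1/12
```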

Tuesday, April 15

Section 6.2 is pretty straightforward: Example 2d is a classic, but 2e and 2g are a little more complicated than they need to be. The example on the handout I gave you (which can be downloaded here) is actually Example 3a in Section 6.3. The particular kinds of sums I mentioned can all be found in that section, though I talked about them in reverse order (starting with discrete distributions makes more sense to me).
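If you want to check one of those sums by hand, here's a rough convolution sketch in Python (two fair dice are my example, not one from the handout):

```python
from fractions import Fraction
from collections import defaultdict

def convolve(pmf1, pmf2):
    # pmf of X + Y for independent X and Y:
    # P(X+Y = s) = sum over x of P(X = x) * P(Y = s - x)
    out = defaultdict(Fraction)
    for x, px in pmf1.items():
        for y, py in pmf2.items():
            out[x + y] += px * py
    return dict(out)

die = {k: Fraction(1, 6) for k in range(1, 7)}
print(convolve(die, die)[7])   # P(sum of two dice = 7) = 1/6
```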

Thursday, April 17

I think that for this section, the notation is the hardest part to get used to. Make sure you know the difference between f(x,y), f_X(x), and f_(X|Y)(x|y).
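Here's a small sympy sketch that keeps the three objects separate (the joint density f(x,y) = x + y on the unit square is a toy example of my own):

```python
import sympy as sp

x, y = sp.symbols('x y')

# Joint density f(x, y) = x + y on the unit square
f = x + y

# Marginal of X: integrate y out
f_X = sp.integrate(f, (y, 0, 1))          # x + 1/2

# Conditional density of X given Y = y: joint divided by the marginal of Y
f_Y = sp.integrate(f, (x, 0, 1))          # y + 1/2
f_X_given_Y = sp.simplify(f / f_Y)        # (x + y) / (y + 1/2)

print(f_X, f_X_given_Y)
```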

Tuesday, April 22

The bivariate normal distribution is described in Example 5d in Section 6.5. It's a good one to know about even though I won't cover it in this class.

All but the first example in Section 6.7 are more complicated than anything we'll go through in this class.

Thursday, April 24

Most of Section 7.2 describes things you already know. For instance, it calculates the expected values of binomial and negative binomial random variables.

We're skipping Section 7.3 even though it's on the schedule. It's by far the least important for our purposes.

Most of the examples in Section 7.4 are quite reasonable.

Tuesday, April 29

A few of you asked how to use joint moment-generating functions. If you have the joint moment generating function M(t_1,t_2) of X and Y, take the partial derivative with respect to t_1 n times and with respect to t_2 m times, then plug in 0 for both t_1 and t_2; the result is E[(X^n)(Y^m)].
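Here's a hedged sympy sketch that checks this recipe on one joint m.g.f. (two independent standard normals are my choice; their joint m.g.f. is e^((t_1^2 + t_2^2)/2)):

```python
import sympy as sp

t1, t2 = sp.symbols('t1 t2')

# Joint mgf of two independent standard normals
M = sp.exp(t1**2 / 2 + t2**2 / 2)

def moment(n, m):
    # differentiate n times in t1 and m times in t2, then set both to 0
    d = sp.diff(M, t1, n) if n else M
    d = sp.diff(d, t2, m) if m else d
    return d.subs({t1: 0, t2: 0})

print(moment(2, 0))   # E[X^2] = 1
print(moment(2, 2))   # E[X^2 Y^2] = 1 (independence: 1 * 1)
```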

Here's the final exam I gave last spring.

Thursday, May 1

All the examples in Sections 8.2 and 8.3 seem reasonable. If I put a question about Chebyshev's inequality or Markov's inequality on the exam, you won't have to guess what kind of problem it is. I'll tell you in the problem which inequality I expect you to use.
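If you want to see the bounds in action, here's a small Python check on a single die roll (the cutoffs a and k are arbitrary):

```python
from fractions import Fraction

faces = range(1, 7)                        # one roll of a fair die
P = Fraction(1, 6)
mu  = sum(P * x for x in faces)            # E[X] = 7/2
var = sum(P * (x - mu)**2 for x in faces)  # Var(X) = 35/12

# Markov: P(X >= a) <= E[X]/a, for nonnegative X
a = 5
print(sum(P for x in faces if x >= a), "<=", mu / a)                # 1/3 <= 7/10

# Chebyshev: P(|X - mu| >= k) <= Var(X)/k^2
k = 2
print(sum(P for x in faces if abs(x - mu) >= k), "<=", var / k**2)  # 1/3 <= 35/48
```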

Good luck studying! I'll have your last homeworks graded by the start of office hours on Monday.