Math 3160-005, Spring 2013

Schedule

Week                       Sections in text (estimated)   Test         Administrivia
Week 1: Jan. 21-25         1.1-1.5                                     No class Monday
Week 2: Jan. 28 - Feb. 1   2.1-2.5
Week 3: Feb. 4-8           2.7, 3.1, 3.2                               Monday: last day to drop without a "W" or choose the P/F option
Week 4: Feb. 11-15         3.2-3.4
Week 5: Feb. 18-22         3.4, 3.5                       Midterm #1
Week 6: Feb. 25 - March 1  4.1-4.5
Week 7: March 4-8          4.6-4.8
Week 8: March 11-15        4.8-4.9, 5.1-5.4
Week 9: March 18-22        Spring break!
Week 10: March 25-29       5.5, 5.6                       Midterm #2
Week 11: April 1-5         5.7, 6.1, 6.2                               Monday: last day to drop or choose to get a letter grade
Week 12: April 8-12        6.3-6.5, 6.7
Week 13: April 15-19       7.1-7.4
Week 14: April 22-26       7.7, 7.8
Week 15: April 29 - May 3  8.1-8.4

January 23

We said today in class that if you have n different possibilities for one space on a license plate and m for the next, you have nm possibilities total for those two spaces. Your book calls this the basic principle of counting (followed by the generalized basic principle of counting).

You can think about how you would draw a tree for permutations. If repetitions are allowed, you know how to draw each level. How is the tree affected if repetitions aren't allowed?
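
If you'd like to see the two branching patterns side by side, here's a minimal Python sketch (the three-symbol alphabet is just an illustration):

```python
from itertools import permutations, product

symbols = ["a", "b", "c"]

# With repetitions allowed, every level of the tree has the same
# branching factor: 3 * 3 = 9 two-symbol strings.
print(len(list(product(symbols, repeat=2))))   # 9

# Without repetitions, each level has one fewer branch than the
# level above it: 3 * 2 = 6 two-symbol strings.
print(len(list(permutations(symbols, 2))))     # 6
```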

January 25

Example 4c in Section 1.4 is more complicated than the other ones. I particularly recommend that you read 5b and 5c in Section 1.5: the difference between them is very important. I'll talk about that more in class on Monday.

January 28

We talked in class about infinite sample spaces, and one of the examples we found was an experiment in which a coin is flipped until it comes up tails. The main characters in Rosencrantz and Guildenstern Are Dead perform this experiment here.

January 30

The most important practical things to remember from today's class are the following:

February 1

Remember that when drawing a Venn diagram for multiple events, it's normally best to fill in the numbers from the inside out.

Here's the problem I promised you: If you roll three fair, six-sided dice, it makes sense at first that a sum of 9 and a sum of 10 have equal probability, since each can be obtained in six ways:

9 = 1+2+6, 1+3+5, 1+4+4, 2+2+5, 2+3+4, 3+3+3

10 = 1+3+6, 1+4+5, 2+4+4, 2+3+5, 2+2+6, 3+3+4

Compute the actual probabilities of these sums and show that a sum of 10 is more likely than a sum of 9. Do you understand why?
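
If you'd like to check your answer by brute force, here's a minimal Python sketch that enumerates all 216 equally likely ordered outcomes (the key point is that the outcomes are ordered):

```python
from fractions import Fraction
from itertools import product

# All 6^3 = 216 equally likely ordered rolls of three dice.
rolls = list(product(range(1, 7), repeat=3))

for target in (9, 10):
    count = sum(1 for roll in rolls if sum(roll) == target)
    print(target, Fraction(count, len(rolls)))
```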

February 4

I consider examples 5n and 5o in section 2.5 to be challenging examples: they're good to work through if you want to study the ideas in the section in great depth. Example 5m is the one I was working on at the end of today's class.

We've been using Venn diagrams for about a week now, and I mentioned that I found a diagram with 4 sets very difficult to draw. Someone has made a Venn diagram with 7 sets. Take a look at it, and don't forget to flip it over!

February 6

Today's main lesson is that problems involving conditional probabilities come in two basic types: if you're given the conditional probabilities, you can use them to calculate the probabilities of intersections of events; and if you know (or can calculate) the probabilities of the intersections of some events, you can calculate the corresponding conditional probabilities.
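
For example, to see the first type in action: if you draw two cards from a standard deck without replacement, then P(both are aces) = P(first is an ace) * P(second is an ace | first is an ace) = (4/52)(3/51) = 1/221.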

February 13

If you want to run a simulation of the Monty Hall problem a few times, try this one.
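
If you'd rather write the simulation yourself, here's a minimal Python sketch (the trial count is arbitrary, and the host's tie-breaking rule is simplified to "open the lowest-numbered allowable door", which doesn't change the probabilities):

```python
import random

def monty_hall(switch, trials=100_000):
    """Estimate the win probability of the stay or switch strategy."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # The host opens a door that hides a goat and isn't your pick.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("stay:  ", monty_hall(switch=False))  # about 1/3
print("switch:", monty_hall(switch=True))   # about 2/3
```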

Remember that you use Bayes's Theorem when you have P(E) and P(F|E) and you want to find P(E|F).
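
In symbols: P(E|F) = P(F|E)P(E)/P(F), where you can expand the denominator as P(F) = P(F|E)P(E) + P(F|E^c)P(E^c).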

February 15

Most of the examples in section 3.3 are useful, but I think 3g, 3h, and 3o are more challenging. Example 3m will give you an example of a three-level tree. Try drawing it out!

The New York Times article I mentioned can be found here.

February 18

Examples 4h, 4i, 4j, 4k, and 4l are the most challenging examples in Section 3.4. (I do recommend looking at 4j, though: it's a very nice classical problem.)

February 22

The main word for the day is "trial": it's what you call a subexperiment in an experiment. For instance, if you flip a coin 12 times, each individual flip can be called a trial.

February 25

Remember that while the set of outcomes of an experiment is fixed, we can use different random variables to answer different questions about the outcomes. We also defined two kinds of functions today. The probability mass function gives you the probability that the random variable is EXACTLY a given value, and the cumulative distribution function gives you the probability that the random variable is LESS THAN OR EQUAL TO a given value. Make sure you know which is which!
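
For example, if X is the value of a single roll of a fair die, the probability mass function gives p(3) = P(X = 3) = 1/6, while the cumulative distribution function gives F(3) = P(X <= 3) = 3/6 = 1/2.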

February 27

Remember that if g(x) is not a linear function, you need to calculate E[g(X)] directly, but if g(x) = ax+b, then you can just calculate E[X] and find that E[g(X)] = E[aX+b] = aE[X]+b.
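
For example, if X is the value of one roll of a fair die, then E[X] = 7/2. For the nonlinear function g(x) = x^2, you have to compute E[X^2] = (1 + 4 + 9 + 16 + 25 + 36)/6 = 91/6 directly (note that this is not (E[X])^2 = 49/4), but for g(x) = 2x + 1, you get E[2X + 1] = 2(7/2) + 1 = 8 with no extra work.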

Example 4c is easily the longest and most challenging example in Section 4.4.

March 1

Remember the formulas for E[aX], Var(aX), and SD(aX) and E[X+b], Var(X+b), and SD(X+b)! (If you're having trouble memorizing them, memorize only the E[X] and Var(X) formulas: the SD(X) formulas can be derived easily from the Var(X) formulas.)
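
For reference, here they all are in one place:

E[aX] = aE[X], Var(aX) = a^2 Var(X), SD(aX) = |a| SD(X)
E[X+b] = E[X] + b, Var(X+b) = Var(X), SD(X+b) = SD(X)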

March 4

Remember that a Bernoulli random variable measures success or failure of ONE trial and that a binomial random variable counts the number of successes in a set number of trials.
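
If you want to experiment with these numbers, here's a minimal Python sketch of the binomial p.m.f. (the parameters are just examples):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) when X counts successes in n independent trials,
    each succeeding with probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# P(exactly 7 heads in 12 flips of a fair coin).
print(binomial_pmf(7, 12, 0.5))

# Sanity check: the pmf sums to 1 over k = 0, ..., n.
print(sum(binomial_pmf(k, 12, 0.5) for k in range(13)))
```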

Example 6g is more challenging than the rest of the examples in Section 4.6.

March 6

I talked about multinomial random variables in class today. A description of them can be found on p. 240 (Example 1f, Section 6.1). I will expect you to know what kind of situations they can be used to model and what the formula is, but the related theory will come later in the course.

March 11

The derivations of the p.m.f., expectation, and variance of a Poisson random variable are given in Section 4.7, so if you want to test your knowledge of series and limits, reading them would help.

Example 7b is a good example of a Poisson random variable being used to approximate a binomial, and Example 7d is a very challenging one.
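
If you want to see the approximation in action, here's a minimal Python sketch comparing the two p.m.f.s for one illustrative choice of parameters (n large, p small, so lambda = np):

```python
from math import comb, exp, factorial

n, p = 100, 0.02
lam = n * p  # lambda = 2

for k in range(6):
    binom = comb(n, k) * p**k * (1 - p)**(n - k)
    poisson = exp(-lam) * lam**k / factorial(k)
    print(k, round(binom, 5), round(poisson, 5))
```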

March 13

First of all, a reminder about cumulative distribution functions: they are defined at every real number, and they never decrease!

Second of all, we now have four basic kinds of discrete random variables, so it might help you to start making a table of them. With each, you could include (1) the general kind of situation it applies to, (2) a particular situation it applies to, (3) the parameters you need to calculate its p.m.f., (4) the formula for its p.m.f., (5) the formula for its expected value, and (6) the formula for its variance. Are there any other features that might help you learn about these distributions?

March 15

Example 8h in Section 4.8 is a nice example of another application of a hypergeometric random variable: estimating animal populations. We won't discuss the zeta distribution (subsection 4.8.4).

We didn't officially cover Section 4.10 -- it's basic facts about cumulative distribution functions, most of which I mentioned briefly in class when I defined c.d.f.s in the first place. Reading it might help you study for the midterm, though! Enjoy your spring break.

March 29

The main thing to remember about continuous random variables is that essentially the same formulas hold for them that hold for discrete random variables -- you just have to change the sum to an integral and trade the pmf for a pdf.
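
In symbols: a discrete X has E[X] = sum over x of x*p(x), while a continuous X has E[X] = integral of x*f(x) dx, and probabilities come from integrating the density: P(a <= X <= b) = integral from a to b of f(x) dx.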

I really recommend reading Example 1d in Section 5.1.

April 1

Take a look at Example 3c in Section 5.3 -- there aren't many ways to set up a problem that could sensibly be modeled using a uniform distribution, but this is one of them.

April 3

You can read a very good description of a Galton board (and see a picture of the original that Galton built!) here, and you can try running a simulation of one yourself here. Try changing the parameters and see what happens! We'll talk about approximating a binomial random variable using the normal distribution on Friday.

April 5

Example 4i in Section 5.4 would be a very good one to read. (I won't expect you to know about hazard rate functions.)

Remember that the continuity correction is only used when approximating a binomial random variable with a normal random variable!
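
Here's a minimal Python sketch of the correction for one illustrative case, approximating P(X <= 55) for X ~ Binomial(100, 1/2):

```python
from math import comb, erf, sqrt

n, p = 100, 0.5
mu, sigma = n * p, sqrt(n * p * (1 - p))

def normal_cdf(x):
    """Standard normal c.d.f. via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

# Exact P(X <= 55) for X ~ Binomial(100, 0.5).
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(56))

# Normal approximation with the continuity correction: use 55.5, not 55.
approx = normal_cdf((55.5 - mu) / sigma)

print(exact, approx)  # the two values agree to about three decimal places
```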

April 8

Example 6b in Section 5.6 gives another way of looking at the Cauchy distribution: first, you calculate its distribution function, and then you take its derivative to get the density function. You should read the rest of Section 5.6, too, but I won't ask you to apply it in this course.
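
(For the standard Cauchy distribution, that computation gives F(x) = 1/2 + arctan(x)/pi, and differentiating yields the density f(x) = 1/(pi(1 + x^2)).)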

April 10

I recommend that you work out the joint cumulative distribution function of X and Y, where f(x, y) = 1 on the square 0 < x < 1, 0 < y < 1 (and 0 elsewhere).

I also recommend that you work through Example 1e in Section 6.1. It gives a really good example of how to work with nonrectangular regions in the plane.
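
(Once you've worked it out, you can check your answer: F(x, y) should be 0 if either coordinate is negative and min(x, 1) * min(y, 1) otherwise -- in particular, F(x, y) = xy on the square itself.)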

April 12

The handout I referenced in class can be found here. It's essentially a more detailed version of Example 3a in Section 6.3. I recommend that you read it carefully, learn the procedure, understand how to find the different cases for z, and find the bounds for each case.

April 15

The proofs of most of the results I mentioned in class today can be found in Section 6.3 (along with some other results, like the formula for the sum of n independent geometric random variables, all with different probabilities).

I recommend working through a few more cases of the conditional distribution we considered in class today. What's P_(X|Y)(1,1)? P_(Y|X)(1,0)?

If you want a harder discrete conditional distribution problem to work on, here's one: Find the joint p.m.f., marginal p.m.f.s, and conditional distributions for X and Y, where X is the number of aces in a standard poker hand and Y is the number of kings in that hand.
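
If you want to check your hand-derived answers, here's a minimal Python sketch of the joint mass function (assuming a 5-card hand):

```python
from math import comb

def joint_pmf(i, j):
    """P(X = i aces and Y = j kings) in a 5-card hand from a 52-card deck."""
    if i + j > 5:
        return 0.0
    return comb(4, i) * comb(4, j) * comb(44, 5 - i - j) / comb(52, 5)

# Sanity check: the joint pmf sums to 1 over all feasible (i, j).
print(sum(joint_pmf(i, j) for i in range(5) for j in range(5)))

print(joint_pmf(1, 1))  # P(exactly one ace and exactly one king)
```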

April 17

You can look at the definition of the bivariate normal distribution in Example 5c of Section 6.5.

April 19

If you'd like to see an example of a joint distribution of a function of three random variables, Example 7d in Section 6.7 is a good one. Examples 2f and 2g in Section 7.2 use the techniques from this section on distributions you already know, but a lot of the later examples are rather complicated and may not be the best use of your time.

April 22

To make sure you understand the method for finding the moments of the numbers of events that occur, try working out the third moment of a binomial random variable without looking at the book. You could also try this method on a hypergeometric random variable (Example 3b in Section 7.3).
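
Once you have a formula, you can check it numerically; here's a minimal Python sketch that computes E[X^3] by brute force for one choice of parameters:

```python
from math import comb

def binomial_third_moment(n, p):
    """Brute-force E[X^3] for X ~ Binomial(n, p)."""
    return sum(k**3 * comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1))

# Compare this value against your hand-derived formula.
print(binomial_third_moment(10, 0.3))
```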

April 24

We talked about covariance today and showed that Cov(aX,Y) = aCov(X,Y). What do you think the formula for Cov(X+b,Y) is?
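
If you want to test your conjecture before proving it, here's a minimal Python sketch that estimates both covariances from simulated data (the distributions and the shift b are arbitrary choices):

```python
import random

def sample_cov(xs, ys):
    """Sample covariance of two equal-length lists of numbers."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

random.seed(0)
xs = [random.gauss(0, 1) for _ in range(100_000)]
ys = [x + random.gauss(0, 1) for x in xs]
b = 5.0

print(sample_cov(xs, ys))                    # estimates Cov(X, Y)
print(sample_cov([x + b for x in xs], ys))   # estimates Cov(X + b, Y)
```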

April 26

Remember that if you just want to know whether Y tends to increase when X does, all you need is the sign of Cov(X,Y), but if you want to know how strong that tendency is, you need to calculate rho(X,Y).

We also started talking about moment-generating functions today and worked through finding one for a binomial random variable. Poisson and exponential random variables are naturals for moment-generating functions due to the presence of an e: try calculating those!
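
For reference, the binomial computation comes out to (using the binomial theorem in the last step):

M(t) = E[e^(tX)] = sum over k of C(n,k)(pe^t)^k (1-p)^(n-k) = (pe^t + 1 - p)^n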

April 29

I was asked in class today how to use joint moment-generating functions. The answer is that if M(s,t) is the joint moment-generating function of X and Y (so M(s,t) = E[e^(sX+tY)]), then the partial derivative of M(s,t) taken n times with respect to s and m times with respect to t, evaluated at (0,0), is E[X^n Y^m]. (Note that if the function M is nice enough, the order in which you take these derivatives won't matter.)
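
In symbols: E[X^n Y^m] = (d^(n+m) M / ds^n dt^m)(0, 0).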

May 1

First of all, the Connecticut state fossil is Eubrontes.

I am delighted to say that all the examples in Section 8.3 are quite readable!

Also, the Connecticut state shellfish is the Eastern Oyster.

May 3

Today was just review. Thank you for being a lovely class all term. There will be a review session in MSB 211 on Sunday night from 7:00 to 9:00, and then your exam is next Monday at 3:30! See you then.