This lecture leads us from discrete random variables to continuous random variables. Recall that a binomial random variable, such as the number of heads in ten coin tosses, can only take a discrete number of values: here 0, 1, 2, up to 10. For a binomial random variable in which the probability of success is p and the number of trials is n, the probability that the random variable takes the value k, for k equals 0, 1, 2, up to n, is n choose k, times p raised to the power k, times 1 - p raised to the power n - k. This formula is called the probability mass function for the binomial. The probability mass function can be visualized as a histogram. The area under the histogram is one, and the area of each bar is the probability of seeing a binomial random variable whose value is equal to the x-value at the center of the bar's base.

In contrast, the normal distribution, also called the Gaussian distribution or the bell-shaped curve, can take any numerical value between negative infinity and positive infinity. Since it can take a continuum of values, it is a continuous random variable. In general, if the set of possible values a random variable can take are separated points, it is a discrete random variable. But if it can take any value in some possibly infinite interval, then it is a continuous random variable.

When the random variable is discrete, it has a probability mass function, or pmf. That pmf tells us the probability that the random variable takes each of the possible values. But when the random variable is continuous, it has probability zero of taking any single value. We can only talk about the probability of a continuous random variable lying within some interval. For example, suppose that heights are approximately normally distributed. The probability of finding someone who is exactly 6 feet tall, at 0.0000... inches, with an infinite number of 0s after the decimal point, is 0.
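The binomial pmf described above can be sketched in a few lines of Python. This is an illustrative sketch, not code from the lecture; the function name binomial_pmf is my own.

```python
# Sketch of the binomial pmf: P(X = k) = C(n, k) * p**k * (1 - p)**(n - k)
from math import comb

def binomial_pmf(k, n, p):
    """Probability that a Binomial(n, p) random variable equals k."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Ten tosses of a fair coin: the pmf over k = 0, 1, ..., 10.
n, p = 10, 0.5
pmf = [binomial_pmf(k, n, p) for k in range(n + 1)]

# The probabilities act like the bar areas of the histogram: they sum to one.
print(sum(pmf))               # 1.0 (up to floating-point rounding)
print(binomial_pmf(5, n, p))  # P(exactly 5 heads) = 252/1024 = 0.24609375
```

Plotting pmf as a bar chart over k = 0 to 10 gives exactly the histogram picture described in the lecture.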
But we can easily calculate the probability of finding someone who is between 5'11" tall and 6'1" tall. Continuous random variables have probability density functions, or pdfs, instead of probability mass functions. The probability of finding someone whose height lies between 5'11" and 6'1" is the area under the pdf curve for height between those two values. For example, a normal distribution with mean mu and standard deviation sigma has a pdf curve that is defined by 1 over the square root of 2 pi, times 1 over sigma, times e raised to the power -1 over 2 sigma squared, times the quantity x minus mu, squared. Here, x is any value the random variable can take.

Recall that a probability mass function assigns the probability that a random variable takes a specific value, over the discrete set of possible values. The sum of those probabilities over all possible values must equal one. Similarly, a probability density function is any function of x that is non-negative and that has area one underneath its curve. The pdf can be thought of as the limit of histograms made from its sample data: as the sample size becomes infinitely large, the bin widths of the histogram shrink to zero.

There are an infinite number of pmfs and an infinite number of pdfs. Some, such as the binomial and the normal, are so important that they have been given names. For the rest of this week, we shall talk about three more named continuous distributions: the uniform, the beta, and the gamma distributions. We shall also talk about a new discrete distribution, the Poisson distribution.

Before closing, let's review the key ideas to take away. First, continuous random variables exist, and they can take any value within some possibly infinite range. Second, the probability that a continuous random variable takes a specific value is zero. Third, probabilities for a continuous random variable are determined by the density function, which is non-negative and has area one beneath it.
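The height calculation above can be sketched as follows. The mean of 69 inches and standard deviation of 3 inches are illustrative assumptions of mine, not figures from the lecture, and the interval probability uses the exact normal CDF via the error function rather than direct integration of the pdf.

```python
# Sketch: the normal pdf formula from the lecture, and the probability that
# a height falls between 5'11" (71") and 6'1" (73").
# ASSUMPTION: mean 69 inches, standard deviation 3 inches (made-up numbers).
from math import sqrt, pi, exp, erf

def normal_pdf(x, mu, sigma):
    """(1 / (sigma * sqrt(2*pi))) * e^(-(x - mu)^2 / (2 * sigma^2))"""
    return (1.0 / (sigma * sqrt(2.0 * pi))) * exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

def normal_cdf(x, mu, sigma):
    """P(X <= x) for X ~ Normal(mu, sigma), via the error function."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

mu, sigma = 69.0, 3.0   # hypothetical height distribution, in inches
lo, hi = 71.0, 73.0     # 5'11" and 6'1"

# P(5'11" < height < 6'1") is the area under the pdf between those values.
p = normal_cdf(hi, mu, sigma) - normal_cdf(lo, mu, sigma)
print(p)  # roughly 0.16 under these assumed parameters

# P(height is exactly 6'0") is zero: the area above a single point vanishes.
print(normal_cdf(72.0, mu, sigma) - normal_cdf(72.0, mu, sigma))  # 0.0
```

Note that normal_pdf(72.0, mu, sigma) itself is positive; a positive density at a point is consistent with zero probability of hitting that exact point, because probability is area, not height.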
And fourth, we can find the probability that a random variable lies between two values, say c and d, as the area under the density function that lies between c and d.
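The fourth takeaway, that P(c < X < d) is the area under the density between c and d, can be checked numerically. This sketch (my own, not from the lecture) approximates that area with a midpoint Riemann sum on the standard normal pdf and compares it with the exact answer from the error function.

```python
# Sketch: P(c < X < d) as the area under the pdf between c and d,
# approximated by a midpoint Riemann sum on the standard normal density.
from math import sqrt, pi, exp, erf

def std_normal_pdf(x):
    return exp(-0.5 * x * x) / sqrt(2.0 * pi)

def area_under_pdf(pdf, c, d, n=100_000):
    """Midpoint Riemann sum of pdf over the interval [c, d]."""
    width = (d - c) / n
    return sum(pdf(c + (i + 0.5) * width) for i in range(n)) * width

# P(-1 < X < 1) for a standard normal: about 0.68 (the familiar 68% rule).
p = area_under_pdf(std_normal_pdf, -1.0, 1.0)
print(p)

# Exact value via the error function, for comparison.
exact = erf(1.0 / sqrt(2.0))
print(exact)

# Any pdf has total area one; [-10, 10] captures essentially all of it here.
total = area_under_pdf(std_normal_pdf, -10.0, 10.0)
print(total)
```

Shrinking the interval [c, d] toward a single point drives this area to zero, which is the second takeaway restated: single points carry no probability for a continuous random variable.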