Welcome back to the second lecture of Probability Distributions. Now we're going to talk about the expected value. Why do we even need this number? I showed you before the graphical representations of probability distributions. As soon as a random variable has more and more possible values, these graphical representations get messy. It's difficult for us humans, with our limited brain capacity, to really grasp and understand these random variables. So what we like to do is summarize the information into a few numbers, so-called summary measures. And the first summary measure is the so-called mean, average, or expected value.

Let me show you the definition. The first time you look at this definition, you think, what the heck are they thinking? The mean or expected value is a probability-weighted sum of all possible values. Here's the formula: the expected value E of X, also written mu, the Greek letter for m, as in mean, is equal to x1 times the probability of x1, plus x2 times the probability of x2, and so on, plus xk, the last value, times its probability. The first time you look at this formula, you say, why, this looks so complicated. But let me assure you, this is something you have done before. It is actually very natural, a natural extension of what we do all the time when we average numbers. Let me show this to you in a spreadsheet, where we look at a really simple example, the fair die. So let's have a look at this spreadsheet.

Now that we have understood the formula and seen it in action, let's talk about the interpretation. Many people find an expected value of 3.5 really strange. Because in everyday language, when I say I expect something to happen, I mean I think there's a very high probability it will happen. But the probability of rolling a 3.5 is zero. So here we have a clash between the everyday language we use when we say, I expect to meet someone, I expect it to rain.
I expect to see something. And on the other side, the mathematical concept of expected value. So how can we understand this idea of expected value, mean, average? That is best explained using a Monte Carlo simulation.

Here I have prepared such a Monte Carlo simulation of a fair die for you. Let's look at the spreadsheet. I have ten columns of a thousand numbers each. So here I have, essentially, a fair die rolled ten thousand times. Then I average them out the everyday, gut-feeling way: add up the 10,000 numbers and divide the sum by 10,000. And here's what I get. In this particular example of 10,000 random numbers, we have an average of 3.4843. Ha! That's very close to the 3.5.

And this gives us the key to understanding an expected value, a mean, an average. It does not mean that on the next roll of a fair die I will get a 3.5. That's impossible. What this number means is that if I repeat this experiment of rolling a fair die many, many times, then the long-run average will get close to 3.5. And that's the key interpretation. I ask you to play around with the spreadsheet, which you can download. Press F9 to refresh the numbers, and you will see the means fluctuate, but they fluctuate very closely around 3.5. This will hopefully give you some feeling, some intuition, for the concept of expected value, mean, or average.

So let's go back to the slides, summarize what we learned about the expected value of a fair die, and move on to some more examples. Here is a summary of what we just did in the spreadsheet. The expected value of a fair die, of the random variable D, is 3.5. And we would now call this either the mean, the average, or the expected value.

Let's now look at another example, back to the roulette game. We recall there are 37 numbers. Actually, these numbers are also color-coded. 0 is special and gets the color green. In addition, we separate the other 36 numbers into 18 red numbers and 18 black numbers.
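Before we work through the roulette example, the fair-die calculation and the Monte Carlo check described above can be sketched in a few lines of Python. This is just a sketch of the idea, not the course spreadsheet; the variable names are my own.

```python
from fractions import Fraction
import random

# Exact expected value of a fair die: the probability-weighted sum of outcomes.
values = [1, 2, 3, 4, 5, 6]
exact = sum(x * Fraction(1, 6) for x in values)
print(exact)  # 7/2, i.e. 3.5

# Monte Carlo check, mirroring the spreadsheet: average many simulated rolls.
random.seed(42)  # fixed seed so the sketch is reproducible
rolls = [random.randint(1, 6) for _ in range(10_000)]
average = sum(rolls) / len(rolls)
print(average)  # fluctuates closely around 3.5
```

Rerunning without the fixed seed plays the same role as pressing F9 in the spreadsheet: each run gives a slightly different average, always close to 3.5.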
And here in the picture of a roulette table, you can see the colors of the various numbers. And now, in this game, it's actually possible not just to bet a chip on a particular number; you can also bet on a color. So let's say you want to bet on the color red. What this means is that if one of the 18 numbers that are color-coded red appears, you win an extra chip: you get two chips back, for a net win of 1. If, however, the green 0 or one of the 18 black numbers appears, then you lose your chip.

Question: what is the expected value of this color bet? How much money will you win or lose on average? So let's translate this game into our language of probability. Here we have a random variable B, for your bet. And there are two possible values: you win one, or you lose one, minus one for the loss. The probability of you winning is 18 divided by 37. Why? Because there are 18 red numbers out of 37. So you see, the probability is 48.65 percent. The probability that B takes on the value negative one, that is, that you lose your chip, is 19/37. Why 19? With the 18 black numbers plus the 1 green number, you lose. So 19/37 is the probability of losing, that is, 51.35 percent.

Now, with these probabilities in hand, we can calculate the expected value of the bet: E(B) = 1 times 18/37, plus negative 1 times 19/37. In the end, it comes out to a loss of 0.027. So, on average, you will lose money with this bet. Let's think back to the interpretation we had with the die. If you play this color bet many, many times over a long visit to the casino, then on average you will lose about 0.027 chips per bet. In the language of gambling, this is the so-called house advantage: on average, the casino wins money.

Now, I understand some people don't care much about gambling. So let's look at a real-life application where the expected value is really important. Recently, I was on a business trip, and I had to rent a car.
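Before the car-rental story continues, the color-bet calculation above can be checked with a short computation. This is a sketch using Python's `fractions` module so the result comes out exactly.

```python
from fractions import Fraction

# Color bet on red: +1 chip with probability 18/37, -1 chip with probability 19/37.
p_win = Fraction(18, 37)
p_lose = Fraction(19, 37)
expected = 1 * p_win + (-1) * p_lose
print(expected)         # -1/37
print(float(expected))  # about -0.027, the house advantage per chip bet
```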
And with the car rental, I was told that my insurance, in case I have an accident, carries a deductible of 800 Euros. That means if I have an accident, the first 800 Euros of damage I have to pay; any damage above that is paid by the insurance. Then I was offered the following extra insurance: for an extra fee of 84 Euros, I could buy the deductible down to 100 Euros. So I could decrease my deductible by 700 Euros. Here was my decision problem: should I buy this insurance?

So let's translate this real-world problem into a probability problem. There's a random variable, which is my accident payment. Let's make a simplifying assumption: as soon as there's an accident, the damage exceeds 800 Euros, so I will certainly have to pay the full 800 in that case. We neglect small accidents like bumper damage of, say, 500. So either I have to pay 800 Euros, or zero, because I have no accident. If I buy the extra insurance, these numbers will be 100 and zero.

So, which is the larger cost for me? Here, let's look at the expected value. What is the expected payment without the extra insurance, and what is the expected payment with the extra insurance? Now I have a problem: I don't know the probability of an accident. So let's call it p for the moment. Without the extra insurance, with probability p I have to pay 800 Euros, and with probability one minus p I pay nothing, no accident. We only have the two outcomes, accident and no accident. So the expected payment is 800 times p. With the extra insurance, I have to pay the 84 Euros right away. Then with probability p I have to pay the deductible of 100, and with probability 1 minus p I pay nothing. So in this case, as you see on the slide, the expected payment will be 100 times p plus 84.

Now, some of you may right away ask: at which probability would these two expected payments be the same? With a little bit of algebra, we get 800p = 84 + 100p, so p = 84/700, which is twelve percent.

So, if I think that my probability of an accident is larger than twelve percent, I should buy this insurance. If I think the probability of an accident is smaller, I should not buy the insurance. But I don't know the probability. Ha! But now, let's think about the insurance company. They are not forced to offer me this extra insurance; they are freely willing to give it to me. In fact, the car rental agent tried to talk me into buying that insurance. So what does that tell me? Obviously, it's profitable for them to sell me this insurance, and for me to buy it is not. So clearly, they have data, and they know the probability of an accident is less than twelve percent. Think about this: roughly one accident in every eight rentals? Let me knock on wood, I hope not; I don't think I have that high an accident rate. So my conclusion was: I believe the car rental agency thinks it will make money on this extra insurance. They may think the probability of an accident is actually very, very low, and they are happy to take the 84 Euros from me. So in the end, I decided not to buy the insurance. And luckily, I didn't have any accident and returned the car without any damage. And this concludes the real-world application of the expected value.

So to summarize: we learned about our first summary measure, the expected value, the mean, the average. We saw in the spreadsheet the interpretation as a long-run average over many repetitions. And in the end, we saw a real-world application, namely this small insurance problem. Thanks for your attention, and please come back for more in the next lecture. Thank you.
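As a footnote, the break-even calculation from the insurance example boils down to a few lines. This is a sketch with the numbers from the example; the function names are just for illustration.

```python
from fractions import Fraction

# Expected payment as a function of the unknown accident probability p.
def cost_without(p):
    return 800 * p          # 800-euro deductible, paid only if there is an accident

def cost_with(p):
    return 84 + 100 * p     # 84-euro fee up front, then only a 100-euro deductible

# Break-even probability: 800p = 84 + 100p  =>  p = 84/700 = 12%.
p_star = Fraction(84, 700)
print(p_star)                                   # 3/25
print(cost_without(p_star), cost_with(p_star))  # 96 96
```

Below this probability the extra insurance costs more, in expectation, than it saves.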