Welcome back to the intuitive introduction to probability theory. Today, in the third lecture on conditional probabilities, I want to tell you about the multiplication rules in probability. When can we multiply probabilities, what does that mean, and where do we have to be careful?

Recall the definition of conditional probability from the previous lecture: the probability of A given B is equal to the probability of the intersection divided by the probability of B. If you want to think of the probability of B given A, it's the intersection probability divided by P of A. Now if you do a little algebra and multiply both sides of these equations by the denominators, so the first equation for example by P of B, we get this multiplication rule: the probability of the intersection of A and B equals the conditional probability of A given B times the probability of B. You look at this math and you say: "Oh, this looks kind of abstract." Actually, I'm sure you have used it, perhaps without knowing it; we use this rule all the time. So, let me try to convince you that this is actually rather intuitive.

Please recall the spreadsheet on Swiss population data that we have. If you compare the population of the canton of Zurich and the population of Switzerland, you will notice that 17.6% of all Swiss residents live in the canton of Zurich. And in that data set you will also see that 19.6% of all Zurich residents are under 20; they are between 0 and 19 years old. And here is the question for you: what proportion of Swiss residents both live in the canton of Zurich and are under 20 years old? So, look at the numbers. If I tell you 17.6% live in Zurich, and of those people 19.6% are young, what would you do? You would say: "Well, 0.176 times 0.196, that's the fraction of people that satisfy both: they are young and they live in the canton of Zurich." You do this and you get 3.45%. We have all done this kind of calculation many times in our lives.
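The "proportion of a proportion" calculation from the Zurich example can be sketched in a few lines of Python, using the percentages quoted above:

```python
# Multiplication rule as "a proportion of a proportion":
# P(Zurich and under 20) = P(under 20 | Zurich) * P(Zurich).
# The two input percentages are the ones quoted in the lecture.

p_zurich = 0.176               # P(resident lives in canton of Zurich)
p_young_given_zurich = 0.196   # P(aged 0-19 | lives in Zurich)

p_zurich_and_young = p_young_given_zurich * p_zurich
print(f"{p_zurich_and_young:.4f}")  # about 0.0345, i.e. roughly 3.45%
```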
If I give you half of a cake, and I cut that half into five pieces and give you one piece, how much of the total cake did you get? We started out with half, I gave you one fifth of that, so you would say 0.2 times 0.5 is 0.1: I just gave you 10% of the cake. So we multiply these proportions all the time, just as in the Zurich example or in this cake example.

But now, let's look at this everyday calculation in the context of conditional probabilities. If we take these proportions as our definition of probability, the empirical probability concept, concept number two from the previous module, then we would say the probability that someone is from Zurich is 0.176, and the probability of someone in the canton of Zurich being young, that is, the probability of being 0 to 19 years old given the person is from Zurich, is 0.196. And now if we use the multiplication rule, the probability of "Zurich and 0 to 19 years old" is the probability of Zurich times the conditional probability of "0 to 19 years given Zurich". You do the math and get 3.45%, exactly what you did with your gut feeling before. So this abstract-looking multiplication rule for conditional probabilities is actually an everyday concept: a proportion of a proportion. And then we multiply.

Let me remind you once more of the concept of independence. Independence means the occurrence of one event does not affect the chances of another event occurring. That was: the probability of A is equal to the probability of A given B. If that's not the case, if they are unequal, we speak of dependence. Now, what happens if we take our multiplication rule and assume that A and B are independent? In that case, the conditional probability of A given B is just the original probability of A. We replace the conditional probability in the general multiplication rule and get a specialized multiplication rule, and that's the multiplication rule I showed you in the previous module when we talked about independent events.
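The general multiplication rule can be verified exactly on a tiny made-up population (the counts below are hypothetical, chosen only to mirror the Zurich proportions):

```python
from fractions import Fraction

# A tiny check, with made-up counts, that the general multiplication rule
# P(A and B) = P(A | B) * P(B) is just "a proportion of a proportion".
# Hypothetical population: 1000 people, 176 living in Zurich, of whom
# 34 are under 20 years old.
total = 1000
in_zurich = 176
young_and_zurich = 34

p_b = Fraction(in_zurich, total)                      # P(Zurich)
p_a_given_b = Fraction(young_and_zurich, in_zurich)   # P(young | Zurich)
p_a_and_b = Fraction(young_and_zurich, total)         # P(young and Zurich)

# The rule holds exactly, by construction of the proportions.
assert p_a_and_b == p_a_given_b * p_b
```

Using `Fraction` keeps the arithmetic exact, so the identity holds with equality rather than up to rounding error.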
So, the probability of A intersected with B is equal to P of A times P of B. Look, that's much simpler than the general multiplication rule. That's why people like the independence assumption and this rule.

Let's look at an example where we can easily use this. Say you play a dice game with three dice. What's the probability of three ones: a one on the first roll, on the second roll, and on the third? So, you want the probability of one and one and one. Three dice rolls are independent, so I'm allowed to multiply the probabilities: 1/6 times 1/6 times 1/6 is 1/216. And there's nothing special about one, one, one. I can ask you: what's the probability of first a one, then a three, then a five? Same math. So you see, rather complex events like three specific numbers in a row, and you could do this for twenty numbers in a row or two hundred numbers in a row, suddenly become very, very easy under the assumption of independence. We can just multiply the probabilities.

As simple as this is, we have to be careful in real-world applications. There, we often have to ask ourselves: "Is that assumption reasonable? Can we really assume independence?" If yes, great for you: we can use the independence multiplication rule. If, however, the answer is no, you are not allowed to use this rule, and you may get into real trouble. Later on in this course I will show you some devastating applications where people assumed independence and terrible real-world things happened. Here, I want to give you a very simple example. Say you have a machine in an assembly line. That machine carries a heavy load, and as a result it breaks down on average on one out of ten days. On nine out of ten days it can handle the workload and works fine. So, if we use this historical data and the empirical probability definition, concept number two, we can say the probability of a good day, that is, of the machine working, is 0.9, and of a breakdown is 0.1.
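The three-dice calculation can be checked by brute force: enumerate all equally likely outcomes and count, then compare with the independence multiplication rule.

```python
from fractions import Fraction
from itertools import product

# Enumerate all 6^3 = 216 equally likely outcomes of three fair dice
# and count those matching a specific sequence.
outcomes = list(product(range(1, 7), repeat=3))

p_three_ones = Fraction(sum(1 for o in outcomes if o == (1, 1, 1)), len(outcomes))
p_one_three_five = Fraction(sum(1 for o in outcomes if o == (1, 3, 5)), len(outcomes))

# Independence rule: multiply the per-die probabilities.
assert p_three_ones == Fraction(1, 6) ** 3 == Fraction(1, 216)
# Any other specific sequence is equally likely.
assert p_one_three_five == Fraction(1, 216)
```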
And here's the question now: what is the probability that this machine works well two days in a row? If I don't know anything more, I have to use the general multiplication rule. The probability of working well on the first day and working well on the second day is the probability of working well on the first day times the conditional probability of working well on the second day given it worked well on the first day. One probability I know: P of working well on the first day is 0.9, that's from my data. But what's the conditional probability? Now I'm in trouble. So I would love to assume independence. If I have independence, then I can use the simpler rule for independent events at the bottom of the slide: 0.9 times 0.9, that is, 0.9 squared, is 0.81. But if I cannot assume independence, then I need more data.

So which is it? If you ask engineers, they like to talk about the so-called bathtub curve. A new machine often has breakdowns because it isn't perfectly calibrated, so the breakdown probability is a little elevated at first. A machine working well for a day or two or three indicates that it is calibrated, so the probability drops, reaches a low, remains constant for a while, and eventually, due to wear and tear, goes up again. In the middle range, independence is an okay assumption, but at the front end and at the back end it isn't. So here we already see how a very simple application of multiplying probabilities can raise rather tricky issues. Do we have independence, or do we not? And that is crucial for our calculations. As I said, we will see more such examples in the lectures to come.

Let me wrap up this lecture. We have seen the general multiplication rule for conditional probabilities. Under the special case of independence, this multiplication rule simplifies greatly, but be careful when assuming independence. Thanks for your attention, and please come back for more fun with probabilities.
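The two-days-in-a-row calculation can be sketched for both cases. Only the 0.9 comes from the lecture's data; the conditional probability 0.95 below is a made-up illustration of dependence (a machine that ran well yesterday is likely well calibrated today):

```python
# Probability the machine works two days in a row, with and without
# the independence assumption.

p_work = 0.9  # P(works well on a given day), from the historical data

# Independence assumed: simple rule P(W1 and W2) = P(W1) * P(W2).
p_two_days_indep = p_work * p_work
print(f"assuming independence: {p_two_days_indep:.2f}")  # 0.81

# Dependence: general rule P(W1 and W2) = P(W1) * P(W2 | W1).
# The conditional probability is hypothetical; estimating it needs
# real data on consecutive days.
p_work_given_worked = 0.95
p_two_days_dep = p_work * p_work_given_worked
print(f"with dependence:       {p_two_days_dep:.3f}")    # 0.855
```

The gap between 0.81 and 0.855 shows concretely why the independence question matters: the wrong assumption silently shifts the answer.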