Discrete Probability

We may not realize it, but probability is something we all have an intuitive "feel" for already. We commonly ask questions that no one can answer for certain, usually because answering would involve predicting the future. For example, each week I wonder, "Is it going to rain on Saturday morning?" This question cannot be answered for certain, partly because there is a random aspect to when and where rain falls. For this reason, we refer to things like this as random occurrences. Instead of a sure answer, the best we can get is when the meteorologist says, "There is a 30% chance of rain on Saturday morning." This "educated guess" of 30% is a measure of probability.

Here is the generic set-up:

• An experiment produces outcomes determined (at least in part) by chance. The set of all possible outcomes is called the sample space of the experiment. An event is a subset of the sample space.

• We create a function that assigns each outcome of the experiment a real number. Thus we have a (dependent) variable whose value is determined (at least in part) by chance. We call such a variable a random variable. The mathematical definition of a random variable (R.V.) is a function that maps outcomes of a random experiment into the real numbers.

• Random variables for which we can list all possible outputs are called discrete random variables. (Note: being able to list the outputs does not necessarily mean there are finitely many of them.)

• To each possible output of the random variable, we assign a number between 0 and 1 to measure the likelihood of that outcome. This number is called the probability of that outcome. Mathematically, we define a probability as ANY assignment p(E) of real numbers to events that satisfies the following three properties:
  (i) 0 ≤ p(E) ≤ 1
  (ii) p(Sample Space) = 1
  (iii) If E1 and E2 are nonintersecting events (called mutually exclusive), then p(E1 ∪ E2) = p(E1) + p(E2).

• The probability distribution (PDF) of a discrete random variable X is a function that gives, for each possible output k of X, the probability that X equals k. The sum of the PDF over all possible values of k must be 1.

Example 1: In the rain example above, the random variable records whether it rains on Saturday morning. There are two possible outcomes for this R.V.: 1 if it rains and 0 if it doesn't rain. This table gives the probability distribution for the R.V. Notice that the sum of the last column is 1.

  Possible Outcomes   Probability
  1                   0.3
  0                   0.7

Example 2: Consider the experiment of rolling a pair of fair dice. Let X = the sum of the dots on the two face-up sides of the dice. We can make a list of the possible outcomes of X and the corresponding probabilities; this is the PDF for X.

  k     2     3     4     5     6     7     8     9     10    11    12
  p(k)  1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

How did we come up with these probabilities? We broke the experiment down into equally likely outcomes. Suppose one of the dice is purple and one is red. (The color of the dice should not affect the probabilities, so we can add the color without changing the experiment.) Denote the outcome "purple die has 1 dot on its face-up side and red die has 2 dots on its face-up side" as (1,2). Then the following are equally likely outcomes: (1,1), (1,2), (1,3), (1,4), (1,5), (1,6), (2,1), (2,2), …, (6,6). There are 36 of them. Of these 36 equally likely outcomes of rolling two dice, 4 have a total of 5 dots showing on the face-up sides: (1,4), (2,3), (3,2), and (4,1). Hence p(X = 5) = 4/36.
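The counting argument in Example 2 is easy to verify by brute force. The following short Python sketch (not part of the original notes; all names are ours) enumerates the 36 equally likely ordered outcomes, tallies how many give each sum, and reproduces the PDF table above, including p(X = 5) = 4/36.

    from fractions import Fraction
    from collections import Counter

    # Enumerate the 36 equally likely ordered outcomes (purple die, red die).
    outcomes = [(p, r) for p in range(1, 7) for r in range(1, 7)]

    # Count how many outcomes give each sum X = p + r.
    counts = Counter(p + r for p, r in outcomes)

    # The PDF: p(X = k) = (# outcomes with sum k) / 36.
    pdf = {k: Fraction(counts[k], 36) for k in range(2, 13)}

    for k, prob in pdf.items():
        print(f"p(X = {k}) = {prob}")   # e.g. p(X = 5) = 1/9, i.e. 4/36 in lowest terms

    # The probabilities of a PDF must sum to 1.
    assert sum(pdf.values()) == 1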
We can talk about sets of possible values of X. For example, in Example 2 above, consider the subset of outcomes for which X ≥ 10. The probability that X ≥ 10, denoted p(X ≥ 10), is p(X=10) + p(X=11) + p(X=12) = 6/36. Also from Example 2, consider the set of possible values of X such that X is odd and X > 8: p(X is odd and X > 8) = p(X=9) + p(X=11) = 6/36.

Glossary of Terms Associated with a Discrete Random Variable

• If A and B are events, then we can form new events A∩B (read "A and B"), A∪B (read "A or B"), and the complement of A (read "not A"). A pair of events A and B for which A∩B is empty (both events cannot happen simultaneously) are called mutually exclusive.

• The expected value E(X) of a discrete random variable X is the weighted average of its possible outcomes, where the weight multiplying k is p(X=k). The expected value of the R.V. X in Example 2 is given by E(X) = 2(1/36) + 3(2/36) + 4(3/36) + 5(4/36) + 6(5/36) + 7(6/36) + 8(5/36) + 9(4/36) + 10(3/36) + 11(2/36) + 12(1/36) = 7.

• The conditional probability of an event A given another event B, denoted p(A|B), is defined to be p(A∩B)/p(B), provided that p(B) is not 0. In Example 2, the probability that X is 7 given that X is odd is p(X=7 | X is odd) = 6/18. (The denominator is obtained by counting the number of outcomes for which X is odd, and the numerator by counting the number of outcomes for which X is odd and equal to 7.)

• Addition Rule: p(A∪B) = p(A or B) = p(A) + p(B) - p(A∩B). If A and B are mutually exclusive events, then p(A∪B) = p(A or B) = p(A) + p(B).

• Multiplication Rule: p(A∩B) = p(A)p(B|A). If A and B are independent events, then p(A∩B) = p(A)p(B).

• The cumulative distribution function (CDF) of a random variable X is the function F defined by F(t) = p(X ≤ t).
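As a quick check of the glossary computations, here is another minimal Python sketch (again ours, not part of the notes) that rebuilds the PDF from Example 2 and recomputes p(X ≥ 10), E(X), p(X = 7 | X is odd), and one value of the CDF.

    from fractions import Fraction
    from collections import Counter

    # Rebuild the dice PDF from Example 2: p(X = k) = (# ordered rolls summing to k) / 36.
    counts = Counter(p + r for p in range(1, 7) for r in range(1, 7))
    pdf = {k: Fraction(counts[k], 36) for k in range(2, 13)}

    # Event probability: p(X >= 10) = p(10) + p(11) + p(12) = 6/36.
    p_at_least_10 = sum(pdf[k] for k in pdf if k >= 10)

    # Expected value: the weighted average E(X) = sum over k of k * p(X = k).
    expected = sum(k * pdf[k] for k in pdf)

    # Conditional probability: p(X = 7 | X odd) = p(X = 7 and X odd) / p(X odd) = 6/18.
    p_odd = sum(pdf[k] for k in pdf if k % 2 == 1)
    p_7_given_odd = pdf[7] / p_odd

    # CDF: F(t) = p(X <= t).
    def cdf(t):
        return sum(pdf[k] for k in pdf if k <= t)

    print(p_at_least_10, expected, p_7_given_odd, cdf(4))
    # 1/6 7 1/3 1/6   (i.e. 6/36, 7, 6/18, and F(4) = 1/36 + 2/36 + 3/36 = 6/36)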