Chapter 1 – Discrete Probability Distributions
1.1 Simulation of Discrete Probabilities
Probability
In this chapter, we shall first consider chance experiments with a finite number of possible outcomes ω1, ω2, . . . , ωn. For example, we roll a die and the possible outcomes are 1, 2, 3, 4, 5, 6 corresponding to the side that turns up. We toss a coin with possible outcomes H (heads) and T (tails).
It is frequently useful to be able to refer to an outcome of an experiment. For example, we might want to write the mathematical expression which gives the sum of four rolls of a die. To do this, we could let Xi, i = 1, 2, 3, 4, represent the values of the outcomes of the four rolls, and then we could write the expression
X1 + X2 + X3 + X4
for the sum of the four rolls. The Xi's are called random variables. A random variable is simply an expression whose value is the outcome of a particular experiment. Just as in the case of other types of variables in mathematics, random variables can take on different values.
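As a concrete illustration, the four rolls are easy to simulate on a computer. The following lines of Python are a minimal sketch (only the standard library's random module is assumed, and the variable names are ours):

    import random

    # Simulate the four random variables X1, X2, X3, X4: each is the
    # outcome of one roll of a fair die.
    rolls = [random.randint(1, 6) for _ in range(4)]

    # The expression X1 + X2 + X3 + X4 is then the sum of the four rolls.
    print(rolls, sum(rolls))

Each run of the program performs the experiment once, so the printed sum changes from run to run, just as a random variable should.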
Let X be the random variable which represents the roll of one die. We shall assign probabilities to the possible outcomes of this experiment. We do this by assigning to each outcome ωj a nonnegative number m(ωj) in such a way that
m(ω1) + m(ω2) + · · · + m(ω6) = 1 .
The function m(ωj) is called the distribution function of the random variable X. For the case of the roll of the die we would assign an equal probability of 1/6 to each of the six outcomes. With this assignment of probabilities, one could write
P(X ≤ 4) = 2/3
to mean that the probability is 2/3 that a roll of a die will have a value which does not exceed 4.
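This claim can be checked experimentally. The following sketch in Python (again assuming only the standard library; the number of trials is an arbitrary choice) rolls the die a large number of times and reports the fraction of rolls whose value does not exceed 4:

    import random

    trials = 100_000
    # Count the rolls of a fair die whose value does not exceed 4.
    count = sum(1 for _ in range(trials) if random.randint(1, 6) <= 4)

    # The observed fraction should be close to P(X <= 4) = 2/3.
    print(count / trials)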
Let Y be the random variable which represents the toss of a coin. In this case, there are two possible outcomes, which we can label as H and T. Unless we have reason to suspect that the coin comes up one way more often than the other way, it is natural to assign the probability of 1/2 to each of the two outcomes.
In both of the above experiments, each outcome is assigned an equal probability. This would certainly not be the case in general. For example, if a drug is found to be effective 30 percent of the time it is used, we might assign a probability .3 that the drug is effective the next time it is used and .7 that it is not effective. This last example illustrates the intuitive frequency concept of probability. That is, if we have a probability p that an experiment will result in outcome A, then if we repeat this experiment a large number of times we should expect that the fraction of times that A will occur is about p. To check intuitive ideas like this, we shall find it helpful to look at some of these problems experimentally. We could, for example, toss a coin a large number of times and see if the fraction of times heads turns up is about 1/2. We could also simulate this experiment on a computer.
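Such a simulation takes only a few lines. Here is one possible sketch in Python (the toss count of 10,000 is an arbitrary choice):

    import random

    tosses = 10_000
    # Each toss comes up H or T, each with probability 1/2.
    heads = sum(1 for _ in range(tosses) if random.choice("HT") == "H")

    # By the frequency interpretation, this fraction should be about 1/2.
    print(heads / tosses)

On a typical run the printed fraction lies near 1/2, though not exactly at it; this variation from run to run is itself a characteristic feature of chance experiments.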
We want to be able to perform an experiment that corresponds to a given set of probabilities; for example, m(ω1) = 1/2, m(ω2) = 1/3, and m(ω3) = 1/6. In this case, one could mark three faces of a six-sided die with an ω1, two faces with an ω2, and one face with an ω3.
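The marked die translates directly into a program. A minimal sketch in Python (the labels ω1, ω2, ω3 are written as strings):

    import random

    # Label the six faces as in the text: three faces carry ω1,
    # two faces carry ω2, and one face carries ω3.
    faces = ["ω1", "ω1", "ω1", "ω2", "ω2", "ω3"]

    # Rolling the die and reading the label produces ω1, ω2, ω3 with
    # probabilities 1/2, 1/3, 1/6, respectively.
    print(random.choice(faces))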
In the general case we assume that m(ω1), m(ω2), . . . , m(ωn) are all rational numbers, with least common denominator d. If d > 2, we can imagine a long cylindrical die with a cross-section that is a regular d-gon. If m(ωj) = nj/d, then we can label nj of the long faces of the cylinder with an ωj, and if one of the end faces comes up, we can just roll the die again. If d = 2, a coin could be used to perform the experiment.
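The cylindrical die can likewise be simulated. In the sketch below (the function name roll_cylinder is ours), we choose one of the d long faces directly, so the re-roll for an end face never arises:

    import random

    def roll_cylinder(counts):
        # counts[j-1] is the number nj of long faces labeled ωj, so
        # outcome ωj occurs with probability nj/d, where d = sum(counts).
        d = sum(counts)
        face = random.randint(1, d)   # choose one of the d long faces
        for j, nj in enumerate(counts, start=1):
            if face <= nj:
                return j              # the chosen face is labeled ωj
            face -= nj

    # Example: m(ω1) = 1/2, m(ω2) = 1/3, m(ω3) = 1/6, with d = 6.
    print(roll_cylinder([3, 2, 1]))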
We will be particularly interested in repeating a chance experiment a large number of times. Although the cylindrical die would be a convenient way to carry out a few repetitions, it would be difficult to carry out a large number of experiments. Since the modern computer can do a large number of operations in a very short time, it is natural to turn to the computer for this task.