1. ECE 109 DISCUSSION 3

1.1. Geometric Probability Law. Consider a sequential experiment in which we repeat independent Bernoulli trials until the occurrence of the first success.

Remark 1.1 (Geometric probability law). Let $A_i$ be the event of success on the $i$th trial. Then the probability that the first $(m-1)$ trials result in failures and the $m$th trial in success is
\[
p(m) = P\!\left[A_1^c A_2^c \cdots A_{m-1}^c A_m\right] \quad \text{for } m = 1, 2, \ldots.
\]
Let $p$ be the probability of success. Since the $A_i$, $i = 1, 2, \ldots$, are independent,
\[
p(m) = P[A_1^c] \cdot P[A_2^c] \cdots P[A_{m-1}^c] \cdot P[A_m] = (1-p)^{m-1} p \quad \text{for } m = 1, 2, \ldots.
\]
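The geometric law above can be checked by simulation. The sketch below (function names are my own, not from the notes) runs Bernoulli trials until the first success and compares the empirical frequency of "first success on trial $m$" against $(1-p)^{m-1}p$:

```python
import random

def trials_until_success(p, rng):
    """Run independent Bernoulli(p) trials; return the index of the first success."""
    m = 1
    while rng.random() >= p:  # failure: keep trying
        m += 1
    return m

def empirical_pmf(p, m, n_samples=200_000, seed=0):
    """Estimate P[first success on trial m] by Monte Carlo."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_samples) if trials_until_success(p, rng) == m)
    return hits / n_samples

if __name__ == "__main__":
    p = 0.3
    for m in (1, 2, 3):
        exact = (1 - p) ** (m - 1) * p
        print(f"m={m}: exact={exact:.4f}, simulated={empirical_pmf(p, m):.4f}")
```

For $p = 0.3$ the exact values are $0.3$, $0.21$, $0.147$, and the simulated frequencies agree to within Monte Carlo error.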
Remark 1.2. The probability that more than $M$ trials are required before a success occurs is given by
\[
P[\{m > M\}] = \sum_{m=M+1}^{\infty} p (1-p)^{m-1},
\]
which, letting $l = m - M - 1$, becomes
\[
\sum_{l=0}^{\infty} p (1-p)^{l+M} = p (1-p)^M \sum_{l=0}^{\infty} (1-p)^l,
\]
and since $0 < 1-p < 1$, the geometric series converges, giving
\[
p (1-p)^M \cdot \frac{1}{1-(1-p)} = (1-p)^M.
\]
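The closed form $(1-p)^M$ can be verified numerically against a truncated version of the infinite sum. A minimal sketch (helper names are illustrative):

```python
def tail_prob_sum(p, M, terms=10_000):
    """Truncated sum of p*(1-p)**(m-1) over m = M+1, M+2, ..."""
    return sum(p * (1 - p) ** (m - 1) for m in range(M + 1, M + 1 + terms))

def tail_prob_closed(p, M):
    """Closed form (1-p)**M from the geometric-series argument."""
    return (1 - p) ** M

if __name__ == "__main__":
    for p, M in [(0.5, 3), (0.2, 5)]:
        print(p, M, tail_prob_sum(p, M), tail_prob_closed(p, M))
```

For $p = 0.5$, $M = 3$ both give $0.125$; the truncation error is negligible because the omitted tail is itself $(1-p)^{M+\text{terms}}$.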
1.2. Examples.

Example 1. Let $P[A] = 0.8$, $P[B^c] = 0.6$, and $P[A \cup B] = 0.8$. Show that (a) $P[A^c \mid B^c] = \frac{1}{3}$ and (b) $P[B^c \mid A] = \frac{1}{2}$.
Proof. (a) By the definition of conditional probability,
\[
P[A^c \mid B^c] = \frac{P[A^c \cap B^c]}{P[B^c]} = \frac{P[(A \cup B)^c]}{P[B^c]} = \frac{1 - P[A \cup B]}{P[B^c]} = \frac{1 - 0.8}{0.6} = \frac{1}{3},
\]
as required.
(b) By the definition of conditional probability,
\[
P[B^c \mid A] = \frac{P[A \cap B^c]}{P[A]} = \frac{P[A] - P[A \cap B]}{P[A]} = 1 - \frac{P[A \cap B]}{P[A]},
\]
and by inclusion-exclusion $P[A \cap B] = P[A] + P[B] - P[A \cup B]$, with $P[B] = 1 - P[B^c] = 0.4$, so
\[
P[B^c \mid A] = 1 - \frac{P[A] + P[B] - P[A \cup B]}{P[A]} = 1 - \frac{0.8 + 0.4 - 0.8}{0.8} = \frac{1}{2},
\]
as required.
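The arithmetic in Example 1 can be double-checked by coding the two identities directly (function and variable names below are my own):

```python
def example1_posteriors(P_A=0.8, P_Bc=0.6, P_AuB=0.8):
    """Recompute the two conditional probabilities of Example 1."""
    P_B = 1 - P_Bc
    P_AnB = P_A + P_B - P_AuB            # inclusion-exclusion
    P_Ac_given_Bc = (1 - P_AuB) / P_Bc   # since A^c ∩ B^c = (A ∪ B)^c
    P_Bc_given_A = (P_A - P_AnB) / P_A   # since A ∩ B^c = A \ (A ∩ B)
    return P_Ac_given_Bc, P_Bc_given_A

if __name__ == "__main__":
    print(example1_posteriors())
```

This reproduces $(\frac{1}{3}, \frac{1}{2})$ up to floating-point rounding.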
Example 2. A ternary communication channel is shown in Fig. P2.4 (not reproduced here); the transition probabilities used below are $P[Y = i \mid X = i] = 1 - \varepsilon$ and $P[Y = (i+1) \bmod 3 \mid X = i] = \varepsilon$. Suppose that the input symbols 0, 1, and 2 occur with probabilities 1/2, 1/4, and 1/4, respectively. (a) Find the probabilities of the output symbols. (b) Suppose that a 1 is observed at the output. What is the probability that the input was 0? 1? 2? Also, find the range of $\varepsilon$ for which 1 is the most probable input.

Proof. Denote the input by $X$ and the output by $Y$. (a) By the law of total probability,
\[
P[Y=0] = P[Y=0 \mid X=0] P[X=0] + P[Y=0 \mid X=2] P[X=2] = \frac{1}{2}(1-\varepsilon) + \frac{1}{4}\varepsilon = \frac{1}{2} - \frac{1}{4}\varepsilon.
\]
Similarly,
\[
P[Y=1] = \frac{1}{2}\varepsilon + \frac{1}{4}(1-\varepsilon) = \frac{1}{4} + \frac{1}{4}\varepsilon, \qquad
P[Y=2] = \frac{1}{4}\varepsilon + \frac{1}{4}(1-\varepsilon) = \frac{1}{4}.
\]
Note that $P[Y=0] + P[Y=1] + P[Y=2] = 1$.
(b) Using conditional probability (Bayes' rule),
\[
P[X=0 \mid Y=1] = \frac{P[X=0 \cap Y=1]}{P[Y=1]} = \frac{P[Y=1 \mid X=0] P[X=0]}{P[Y=1]} = \frac{\frac{1}{2}\varepsilon}{\frac{1}{4} + \frac{1}{4}\varepsilon} = \frac{2\varepsilon}{1+\varepsilon}.
\]
Similarly,
\[
P[X=1 \mid Y=1] = \frac{\frac{1}{4}(1-\varepsilon)}{\frac{1}{4} + \frac{1}{4}\varepsilon} = \frac{1-\varepsilon}{1+\varepsilon}, \qquad
P[X=2 \mid Y=1] = 0.
\]
For 1 to be the most probable input given output 1, we need $P[X=1 \mid Y=1] > P[X=0 \mid Y=1]$ and $P[X=1 \mid Y=1] > P[X=2 \mid Y=1]$, i.e., $1 - \varepsilon > 2\varepsilon$. Hence, if $\varepsilon < \frac{1}{3}$, we can conclude that the input was most likely 1.
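Both parts of Example 2 can be verified with exact rational arithmetic. The sketch below assumes the cyclic-shift transition structure inferred from the computations above (input $i$ is received correctly with probability $1-\varepsilon$, shifted to $(i+1) \bmod 3$ with probability $\varepsilon$); the function names are my own:

```python
from fractions import Fraction

def channel_posteriors(eps):
    """Output distribution and posterior P[X=x | Y=1] for the ternary channel."""
    prior = {0: Fraction(1, 2), 1: Fraction(1, 4), 2: Fraction(1, 4)}

    def lik(y, x):  # P[Y=y | X=x]: correct w.p. 1-eps, shifted w.p. eps
        if y == x:
            return 1 - eps
        if y == (x + 1) % 3:
            return eps
        return Fraction(0)

    out = {y: sum(lik(y, x) * prior[x] for x in prior) for y in (0, 1, 2)}
    post1 = {x: lik(1, x) * prior[x] / out[1] for x in prior}  # Bayes' rule
    return out, post1

if __name__ == "__main__":
    out, post1 = channel_posteriors(Fraction(1, 5))
    print(out)    # output symbol probabilities
    print(post1)  # posteriors given Y = 1
```

For $\varepsilon = \frac{1}{5}$ the formulas give $P[X=0 \mid Y=1] = \frac{2\varepsilon}{1+\varepsilon} = \frac{1}{3}$ and $P[X=1 \mid Y=1] = \frac{1-\varepsilon}{1+\varepsilon} = \frac{2}{3}$, consistent with the threshold $\varepsilon < \frac{1}{3}$.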
Example 3. An unbiased die is rolled repeatedly until a 6 has shown up twice. (a) Find the probability that the first 6 shows on the $k$th roll. (b) Find the probability that $m$ rolls are required until a 6 has shown up twice.

Proof. The probability that a 6 shows on any given roll is $p = \frac{1}{6}$. (a) By the geometric probability law,
\[
P[\{\text{first 6 on roll } k\}] = (1-p)^{k-1} p \quad \text{for } k = 1, 2, \ldots.
\]
(b) Let $A$ and $B$ be the events "the $m$th roll is a 6" and "exactly one 6 occurs in the first $m-1$ rolls", respectively. Since $A$ depends only on roll $m$ and $B$ only on the earlier rolls, $P[A \mid B] = P[A] = p$. Hence,
\[
P[\{m \text{ rolls required until 6 shows up twice}\}] = P[A \cap B] = P[A \mid B] P[B] = p \cdot \binom{m-1}{1} p (1-p)^{m-2} = (m-1) p^2 (1-p)^{m-2}
\]
for $m = 2, 3, \ldots$.
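The formula $(m-1)p^2(1-p)^{m-2}$ for part (b) can be checked by simulating the repeated rolls (helper names below are illustrative):

```python
import random

def rolls_until_second_six(rng, p=1 / 6):
    """Roll a die until a 6 appears for the second time; return the roll count."""
    sixes = rolls = 0
    while sixes < 2:
        rolls += 1
        if rng.random() < p:  # this roll is a 6
            sixes += 1
    return rolls

def exact_pmf(m, p=1 / 6):
    """(m-1) * p**2 * (1-p)**(m-2): second 6 occurs on roll m."""
    return (m - 1) * p ** 2 * (1 - p) ** (m - 2)

rng = random.Random(1)
n = 200_000
counts = {}
for _ in range(n):
    m = rolls_until_second_six(rng)
    counts[m] = counts.get(m, 0) + 1

for m in (2, 5, 10):
    print(f"m={m}: exact={exact_pmf(m):.5f}, simulated={counts.get(m, 0) / n:.5f}")
```

The exact value for $m = 2$ is $p^2 = \frac{1}{36} \approx 0.0278$, which the simulation reproduces to within sampling error.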