MATH TRIPOS: PART II
2011
YMS
APPLIED PROBABILITY Example Sheet 1 (with comments)

1. Suppose X, Y and Z are exponential random variables of parameters α, β and γ respectively. What is the distribution of W = min{X, Y, Z}? What is the probability that Z ≤ Y ≤ X? Show that the random variable W and the event {Z ≤ Y ≤ X} are independent. State and prove a similar result for n random variables. Hint: Work with tail probabilities.
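The claims in Question 1 are easy to check numerically. The following Python sketch (not part of the original sheet; the parameter values are illustrative) estimates the mean of W = min{X, Y, Z} and the probability of {Z ≤ Y ≤ X}, and compares them with 1/(α + β + γ) and the product formula (γ/(α + β + γ))·(β/(α + β)).

```python
import random

random.seed(1)
alpha, beta, gamma = 1.0, 2.0, 3.0   # illustrative parameters
n = 200_000

count_order = 0
total_w = 0.0
for _ in range(n):
    x = random.expovariate(alpha)
    y = random.expovariate(beta)
    z = random.expovariate(gamma)
    total_w += min(x, y, z)
    if z <= y <= x:
        count_order += 1

mean_w = total_w / n      # should be near 1/(alpha + beta + gamma) = 1/6
p_order = count_order / n # should be near (gamma/6) * (beta/(alpha+beta)) = 1/3
print(mean_w, p_order)
```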
2. Let T_1, T_2, ... be independent exponential random variables of parameter λ and let N be an independent geometric random variable with

P(N = n) = β(1 − β)^{n−1}, n = 1, 2, ... .

Show that T = Σ_{i=1}^{N} T_i has exponential distribution of parameter λβ. Show that, ∀ n ≥ 1, the sum S = Σ_{i=1}^{n} T_i has Gamma distribution Gamma(n, λ), with the PDF

f_S(x) = λ^n x^{n−1} e^{−λx}/(n − 1)!, x > 0.

Hint: For S you may use induction in n. For T, calculating the moment-generating function can be useful, along with the fact that it determines the distribution uniquely.
[For S: use induction in n. For n = 1: S = T_1 ∼ Exp(λ) = Gamma(1, λ), trivially. Make the induction hypothesis for n and pass to n + 1: S_{n+1} = S_n + T_{n+1}, independently. Thus, the PDF f_{S_{n+1}} is given by

f_{S_{n+1}}(x) = ∫_0^x f_{S_n}(s) f_{T_{n+1}}(x − s) ds

(the convolution formula). Now use the induction hypothesis:

f_{S_{n+1}}(x) = ∫_0^x (λ^n s^{n−1}/(n − 1)!) e^{−λs} λ e^{−λ(x−s)} ds
             = e^{−λx} (λ^{n+1}/(n − 1)!) ∫_0^x s^{n−1} ds
             = λ^{n+1} x^n e^{−λx}/n!.
For T: use the above fact and sum a geometric progression, or proceed with the MGFs (also arriving at a geometric progression):

φ_T(θ) = E e^{θT} = Σ_{n≥1} E[e^{θT} | N = n] P(N = n)
       = Σ_{n≥1} (E e^{θT_1})^n β(1 − β)^{n−1}
       = (β/(1 − β)) Σ_{n≥1} ((1 − β)λ/(λ − θ))^n
       = (β/(1 − β)) · (1 − β)λ/(λ − θ − (1 − β)λ)
       = λβ/(λβ − θ).

As the MGF determines the PDF uniquely, conclude that T ∼ Exp(βλ).]
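The conclusion T ∼ Exp(λβ) lends itself to a quick Monte Carlo sanity check (a sketch, not part of the sheet; λ and β below are arbitrary choices):

```python
import random

random.seed(2)
lam, beta = 3.0, 0.25   # illustrative parameters; T should be Exp(lam * beta)
n = 100_000

def sample_T():
    # geometric number of terms (always at least one), each Exp(lam)
    total = random.expovariate(lam)
    while random.random() >= beta:   # continue with probability 1 - beta
        total += random.expovariate(lam)
    return total

mean_T = sum(sample_T() for _ in range(n)) / n
print(mean_T)   # should be close to 1/(lam * beta) = 4/3
```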
3. Prove that the determinant det e^{tQ} is of the form e^{tq} and hence is > 0 for any finite Q-matrix. Can you determine q? Hint: Use the semigroup property to calculate det e^{(t+s)Q}; by analysing the expansion at t ≈ 0 conclude that q = tr Q.

[det e^{(t+s)Q} = det e^{sQ} det e^{tQ}, and det e^{tQ} is continuous (and even differentiable) in t, with det e^{0·Q} = det I = 1. Hence, det e^{tQ} = e^{tq} for some (real) q. To find q, one can observe that

q = (d/dt) det e^{tQ} |_{t=0} = (d/dt) det(I + tQ) |_{t=0}.

That is, q is the first-order coefficient of the polynomial det(I + tQ), which is precisely tr Q. Note that unless Q = 0, det e^{tQ} → 0 as t → ∞.]
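The identity det e^{tQ} = e^{t tr Q} is easy to confirm numerically. The sketch below (not part of the sheet) computes e^{tQ} by its power series in pure Python for an illustrative 3 × 3 Q-matrix (the one from Question 13, but any Q-matrix would do) and compares the two sides:

```python
import math

# illustrative 3x3 Q-matrix: off-diagonal entries >= 0, rows sum to 0
Q = [[-2.0, 1.0, 1.0],
     [4.0, -4.0, 0.0],
     [2.0, 1.0, -3.0]]
t = 0.7

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_exp(A, terms=40):
    # exp(A) via the power series sum_k A^k / k!  (adequate for small matrices)
    result = [[float(i == j) for j in range(3)] for i in range(3)]
    term = [row[:] for row in result]
    for k in range(1, terms):
        term = [[x / k for x in row] for row in mat_mul(term, A)]
        result = [[result[i][j] + term[i][j] for j in range(3)] for i in range(3)]
    return result

def det3(M):
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
            - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
            + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

P = mat_exp([[t * x for x in row] for row in Q])
trace_Q = Q[0][0] + Q[1][1] + Q[2][2]
print(det3(P), math.exp(t * trace_Q))   # the two numbers should agree
```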
(b) Prove that the following transition probability matrices cannot be written in the form e^Q where Q is a 3 × 3 Q-matrix:

(i) P =
1 0 0
1 0 0
0 1 0

(ii) P =
0 1 0
0 0 1
1 0 0

Hint: it is instructive to consider the matrix P^3.
4. State the definition of a Poisson process in terms of independent increments. Show directly from this definition that the first jump time of a Poisson process of rate λ is exponential of parameter λ. Furthermore, show that the holding times of the process are IID exponential variables of rate λ. Hint: Deduce the characterisation of a Poisson process through its holding times from its characterisation through independent increments.

[For Poisson processes, then for birth-and-death processes, and finally for continuous-time Markov chains, three definitions were given: (a) in terms of the transition probabilities (in the case of a Poisson process, in terms of increments), (b) in terms of the infinitesimal probabilities, (c) in terms of the jump chain and holding times. Many of the standard problems revolve around the equivalence of these various definitions, so it is necessary to make explicit which is to be regarded as the starting point. If none is mentioned the student is free to choose the most convenient definition.]
5. Assume the infinitesimal definition of the Poisson process. Let p(t) denote the probability that the first jump time exceeds t. By considering the difference p(t + h) − p(t), deduce a linear differential equation for p(t) and solve it.

[Set p(t) = P(X_t = 0); then

p(t + h) = P(X_t = 0 and X_{t+h} − X_t = 0) = P(X_t = 0) P(X_{t+h} − X_t = 0) = p(t)(1 − λh + o(h)),

so p(t + h) − p(t) = −λh p(t) + o(h), uniformly in t ≥ 0. Hence p(t) is differentiable and

p′(t) = −λ p(t), p(0) = 1,

which shows p(t) = e^{−λt}. This is a simple version of an argument used in lectures to get the distribution of the whole process.]
6. Arrivals of the Number 1 bus form a Poisson process of rate 1 bus per hour, and arrivals of the Number 7 bus form an independent Poisson process of rate 7 buses per hour. (a) What is the probability that exactly 5 buses pass by in 1 hour? (b) What is the probability that exactly 3 Number 7 buses pass by while I am waiting for a Number 1? (c) When the maintenance depot goes on strike half the buses break down before they reach my stop. What then is the probability that I wait for 30 minutes without seeing a single bus?

Answers: (a) e^{−8} 8^5/5!, (b) 7^3/8^4, (c) e^{−2}. [Done by using standard properties of the Poisson process.]
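Answer (b) can be confirmed by simulation. The sketch below (not part of the sheet) uses the fact that, at each arrival, the bus is a Number 7 with probability 7/8 independently, and estimates the chance of exactly three Number 7 buses before the first Number 1:

```python
import random

random.seed(6)
n = 200_000
hits = 0
for _ in range(n):
    # count Number 7 arrivals until a Number 1 wins an inter-arrival race
    sevens = 0
    while random.random() < 7 / 8:
        sevens += 1
    if sevens == 3:
        hits += 1

p_hat = hits / n
print(p_hat)   # should be near 7**3 / 8**4 = 343/4096
```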
7. Customers arrive in a supermarket as a Poisson process of rate N. There are N aisles in the supermarket and each customer selects one aisle at random, independently of the other customers. Let X_t^N denote the proportion of aisles which remain empty at time t and let T^N denote the time until half the aisles are busy. Show that

X_t^N → e^{−t},  T^N → log 2,

in probability as N → ∞. Hint: Identify the processes of customers' arrivals in each aisle and relate them to the event {T^N > t}.

[Customers arrive in each aisle as independent Poisson processes of rate 1. Let Y_t^i be the indicator function of the event that the i-th aisle is empty at time t. Then Y_t^1, ..., Y_t^N are independent Bernoulli with P(Y_t^i = 1) = e^{−t}. By the law of large numbers,

X_t^N = N^{−1}(Y_t^1 + ··· + Y_t^N) → e^{−t}

almost surely as N → ∞. Since {T^N > t} = {X_t^N ≥ 1/2}, it follows that

P(T^N > t) → 1 if t < ln 2, and → 0 if t > ln 2,

so T^N → ln 2 in probability.]
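The limit X_t^N → e^{−t} can be illustrated numerically (a sketch, not part of the sheet; N and t are arbitrary choices). Each aisle receives a rate-1 Poisson stream, so it is empty at time t iff its first arrival exceeds t, an event of probability e^{−t}:

```python
import math
import random

random.seed(7)
N, t = 100_000, 0.5
empty = sum(1 for _ in range(N) if random.expovariate(1.0) > t)
X_tN = empty / N
print(X_tN, math.exp(-t))   # the two values should be close
```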
8. A pedestrian wishes to cross a single lane of fast-moving traffic. Suppose the number of vehicles that have passed by time t is a Poisson process of rate λ, and suppose it takes time a to walk across the lane. Assuming the pedestrian can foresee correctly the times at which vehicles will pass by, how long on average does it take to cross over safely? Hint: Let T be the time to cross and J_1 the time at which the first car passes. Identify the contributions to E(T) from the events {J_1 > a} and {J_1 < a}. How long on average does it take to cross two similar lanes (a) when one must walk straight across, (b) when an island in the middle of the road makes it safe to stop half way?

[Let T be the time to cross and J_1 the time at which the first car passes. Then

E(T) = a P(J_1 > a) + ∫_0^a (s + E(T)) λ e^{−λs} ds = (1 − e^{−λa})(E(T) + 1/λ).

Hence E(T) = (e^{λa} − 1)/λ. Check this is correct as λ → 0. Walking straight across replaces a by 2a and λ by 2λ, because the sum of two independent Poisson processes of rate λ is a Poisson process of rate 2λ. Using an island, it takes twice as long on average as a single lane. How much quicker is it to use an island?]
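The formula E(T) = (e^{λa} − 1)/λ can be checked by simulating the prescient pedestrian directly (a sketch, not from the sheet; λ and a are arbitrary choices): wait through each inter-car gap shorter than a, and cross in the first gap of length at least a.

```python
import math
import random

random.seed(8)
lam, a = 1.0, 1.5
n = 100_000

total = 0.0
for _ in range(n):
    t = 0.0
    while True:
        j = random.expovariate(lam)   # time until next car
        if j > a:
            t += a                    # a gap of length > a: cross now
            break
        t += j                        # wait for this car, then reassess
    total += t

mean_T = total / n
expected = (math.exp(lam * a) - 1) / lam
print(mean_T, expected)
```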
9. Customers enter a supermarket as a Poisson process of rate 2. There are two salesmen near the door who offer passing customers samples of a new product. Each customer takes an exponential time of parameter 1 to think about the new product, and during this time occupies the full attention of one salesman. Having tried the product, customers proceed into the store and leave by another door. When both salesmen are occupied, customers walk straight in. Assuming that both salesmen are free at time 0, find the probability that both are busy at a later time t. Hint: Find the 3 × 3 Q-matrix and work with its eigenvalues. Viz., q_{21} = 2µ.

[There are three states: 0 (both salesmen are free), 1 (one busy, one free) and 2 (both busy). The non-zero jump rates are q_{01} = λ, q_{10} = µ, q_{12} = λ, q_{21} = 2µ, with λ = 2, µ = 1. This leads to the Q-matrix

Q =
−2  2  0
 1 −3  2
 0  2 −2

with q_{00}^{(2)} = (Q^2)_{00} = 6 and the eigenvalues 0, −2, −5. Then

p_{00}(t) = A + B e^{−2t} + C e^{−5t}, t ≥ 0,

where A + B + C = 1, −2B − 5C = −2, 4B + 25C = 6. Hence, A = 1/5, B = 2/3, C = 2/15, and

P_0(both free at time t) = 1/5 + (2/3) e^{−2t} + (2/15) e^{−5t},
P_0(both busy at time t) = 2/5 − (2/3) e^{−2t} + (4/15) e^{−5t}.]
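The closed form for P_0(both busy at time t) can be tested by simulating the three-state chain (a sketch, not from the sheet; t below is an arbitrary choice):

```python
import math
import random

random.seed(9)
lam, mu = 2.0, 1.0
t_end = 1.0
n = 100_000

busy_both = 0
for _ in range(n):
    state, t = 0, 0.0
    while True:
        # total exit rates: state 0 -> lam; state 1 -> lam + mu; state 2 -> 2*mu
        rate = lam if state == 0 else (lam + mu if state == 1 else 2 * mu)
        t += random.expovariate(rate)
        if t > t_end:
            break
        if state == 0 or state == 2:
            state = 1
        else:
            state = 2 if random.random() < lam / (lam + mu) else 0
    if state == 2:
        busy_both += 1

p_hat = busy_both / n
p_formula = 2/5 - (2/3) * math.exp(-2 * t_end) + (4/15) * math.exp(-5 * t_end)
print(p_hat, p_formula)
```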
10. Let (X_t)_{t≥0} be a Markov chain on the integers with transition rates q_{i,i+1} = λ q_i, q_{i,i−1} = µ q_i, and q_{ij} = 0 if |j − i| ≥ 2, where λ + µ = 1 and q_i > 0 for all i. Find, for all i:

(a) The probability, starting from 0, that X_t hits i. Hint: The answer depends on the value of λ/µ; viz., for λ/µ ≤ 1 it equals (λ/µ)^i for i ≥ 0 and 1 for i ≤ 0.

(b) The expected total time spent in state i, starting from 0. Answer: h_0/(q_i |λ − µ|), where h_0 = P_0(hit i).

In the case µ = 0 (jumps to the right), write down a necessary and sufficient condition for (X_t)_{t≥0} to be explosive. Why is this condition necessary for (X_t)_{t≥0} to be explosive for all µ ∈ [0, 1/2)? Show that, in general, (X_t)_{t≥0} is non-explosive if and only if one of the following conditions holds:

(i) λ = µ; (Hint: In this case the jump chain is recurrent.)
(ii) λ > µ and Σ_{i=1}^{∞} 1/q_i = ∞;
(iii) λ < µ and Σ_{i=−∞}^{−1} 1/q_i = ∞.
[If h_k = P_k(hit i) then, as usual, h_i = 1 and (Qh)_k = 0 for k ≠ i. That is, λ q_k h_{k+1} + µ q_k h_{k−1} = q_k h_k, or

λ(h_{k+1} − h_k) = µ(h_k − h_{k−1}), k ≠ i.

Then, with h_+ = h_{i+1},

h_k = h_i + Σ_{l=1}^{k−i} (h_{i+l} − h_{i+l−1}) = 1 + (h_+ − 1) Σ_{l=1}^{k−i} (µ/λ)^{l−1}, k ≥ i.

Similarly, with h_− = h_{i−1},

h_k = h_i + Σ_{l=1}^{i−k} (h_{i−l} − h_{i−l+1}) = 1 + (h_− − 1) Σ_{l=1}^{i−k} (λ/µ)^{l−1}, k ≤ i.

The parameters h_± have to be fixed so that the solution is minimal non-negative. This yields

h_k = (µ/λ)^{k−i}, k ≥ i, if λ ≥ µ;
h_k = (λ/µ)^{i−k}, k ≤ i, if λ ≤ µ;
h_k = 1, otherwise.
(Sketches omitted: for λ > µ, h_k = (µ/λ)^{k−i} for k ≥ i and h_k = 1 for k ≤ i; for µ > λ, h_k = (λ/µ)^{i−k} for k ≤ i and h_k = 1 for k ≥ i; for µ = λ, h_k = 1 for all k.)
Substituting k = 0:

h_0 = P_0(hit i) = (λ/µ)^i, i ≤ 0, if λ ≥ µ; (λ/µ)^i, i ≥ 0, if λ ≤ µ; 1, otherwise.

It is useful to observe that, ∀ k, the return probability to a state k equals

λ P_{k+1}(hit k) + µ P_{k−1}(hit k) = 2 min[λ, µ].

Next, E_0(time at i) = q_i^{−1} E_0(# of visits to i), so introduce

E_k = E_k(# of visits to i) = P_k(hit i)(1 + E_i) = h_k(1 + E_i),

where E_i = E_i(# of returns to i), so that, ∀ i,

1 + E_i = Σ_{n≥0} P_i(# of returns to i is ≥ n) = Σ_{n≥0} (2 min[λ, µ])^n = 1/(1 − 2 min[λ, µ]) = 1/|λ − µ|,

using λ + µ = 1. So,

E_0(time at i) = h_0/(q_i |λ − µ|).
Now assume that µ ≤ λ, i.e., 0 ≤ µ ≤ 1/2. If µ = 0, then (X_t)_{t≥0} is explosive if and only if Σ_{i=1}^{∞} 1/q_i < ∞. If 0 ≤ µ < 1/2 then certainly the jump chain Y_n → ∞ as n → ∞. Hence, (X_t)_{t≥0} cannot explode to −∞, and comparison with the case µ = 0 shows that if Σ_{i=1}^{∞} 1/q_i = ∞ then (X_t)_{t≥0} cannot explode to +∞ either. To see that Σ_{i=1}^{∞} 1/q_i = ∞ is also necessary for non-explosion when 0 ≤ µ < 1/2, observe that (X_t)_{t≥0} makes only finitely many visits to states i ≤ 0 and

E_0(total time at states i > 0) = Σ_{i=1}^{∞} 1/((λ − µ) q_i).

Finally, in the case λ = µ, (Y_n)_{n≥0} is recurrent, so (X_t)_{t≥0} does not explode. This covers cases (i) and (ii). Case (iii) is considered similarly.]
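Part (a) can be spot-checked by simulating the jump chain, which steps right with probability λ and left with probability µ (a sketch, not part of the sheet; a distant barrier at −50 stands in for "escaped to −∞", a truncation whose tiny bias the tolerance absorbs):

```python
import random

random.seed(10)
lam, mu, i_target = 0.3, 0.7, 3   # illustrative: lam < mu, so P_0(hit 3) = (lam/mu)**3
n = 50_000
hits = 0
for _ in range(n):
    k = 0
    while -50 < k < i_target:
        k += 1 if random.random() < lam else -1
    if k == i_target:
        hits += 1

p_hat = hits / n
print(p_hat)   # should be near (0.3/0.7)**3
```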
11. Let (X_t)_{t≥0} be a birth-and-death process with rates λ_n = nλ and µ_n = nµ, and suppose X_0 = 1. Show that h(t) = P(X_t = 0) satisfies

h(t) = ∫_0^t e^{−(λ+µ)s} {µ + λ h(t − s)^2} ds

and deduce that if λ ≠ µ then

h(t) = (µ e^{µt} − µ e^{λt})/(µ e^{µt} − λ e^{λt}).

Hint: If the population reaches level n then it will become extinct iff each of the n independent descendants' trees terminates.
[A crucial point is that, with λ_n = nλ and µ_n = nµ, every member of the population produces an offspring or dies independently. Therefore, the population with n members becomes extinct iff the descendants' trees generated by each member terminate, which occurs independently. Thus we can write:

h(t) = P(X_t = 0 | X_0 = 1)
     = ∫_0^t e^{−(λ+µ)s} (λ + µ) [µ/(λ + µ) + (λ/(λ + µ)) P(X_t = 0 | X_s = 2)] ds
     = ∫_0^t e^{−(λ+µ)s} [µ + λ P(X_t = 0 | X_s = 2)] ds.

Now P(X_t = 0 | X_s = 2) = h(t − s)^2, and, substituting u = t − s,

h(t) = ∫_0^t e^{−(λ+µ)s} [µ + λ h(t − s)^2] ds
     = (µ/(λ + µ))(1 − e^{−(λ+µ)t}) + λ e^{−(λ+µ)t} ∫_0^t e^{(λ+µ)u} h(u)^2 du.

Multiplying by e^{(λ+µ)t} and differentiating gives

(λ + µ)h + h′ = µ + λh^2,

whence

dh/(µ + λh^2 − (λ + µ)h) = dt, or [λ/(λh − µ) − 1/(h − 1)] dh = (µ − λ) dt,

so that

log((λh − µ)/(h − 1)) = (µ − λ)t + c, i.e. λh − µ = A(h − 1) e^{(µ−λ)t}.

From h(0) = 0: A = µ, and

h(λ − µ e^{(µ−λ)t}) = µ − µ e^{(µ−λ)t},

which immediately leads to the answer.]
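The extinction-probability formula can be tested against a direct event-by-event simulation of the linear birth-and-death process (a sketch, not from the sheet; λ, µ and t are arbitrary choices):

```python
import math
import random

random.seed(11)
lam, mu, t_end = 1.0, 2.0, 1.0
n = 50_000

extinct = 0
for _ in range(n):
    x, t = 1, 0.0
    while x > 0:
        t += random.expovariate((lam + mu) * x)   # total event rate is (lam+mu)*x
        if t > t_end:
            break
        x += 1 if random.random() < lam / (lam + mu) else -1
    if x == 0:
        extinct += 1

h_hat = extinct / n
h_formula = (mu * math.exp(mu * t_end) - mu * math.exp(lam * t_end)) / \
            (mu * math.exp(mu * t_end) - lam * math.exp(lam * t_end))
print(h_hat, h_formula)
```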
12. Each bacterium in a colony splits into two identical bacteria after an exponential time of parameter λ, which then split in the same way but independently. Let X_t denote the size of the colony at time t, and suppose X_0 = 1. Show that the probability generating function φ(t) = E(z^{X_t}) satisfies

φ(t) = z e^{−λt} + ∫_0^t λ e^{−λs} φ(t − s)^2 ds

and deduce that, for q = 1 − e^{−λt} and n = 1, 2, ...,

P(X_t = n) = q^{n−1}(1 − q).

Hint: Take 0 ≤ s ≤ t. Conditional on the first split time J_1 = s, X_t is decomposed as X_{t−s} + X̃_{t−s}.

[For 0 ≤ s ≤ t, the conditional distribution of X_t, given that the first split J_1 occurs at s, is the same as the distribution of X_{t−s} + X̃_{t−s}, where X̃ is an independent copy of X. So

E(z^{X_t} | J_1 = s) = E(z^{X_{t−s} + X̃_{t−s}}) = φ(t − s)^2 for 0 ≤ s ≤ t, and = z for s ≥ t,

and

φ(t) = ∫_0^∞ λ e^{−λs} E(z^{X_t} | J_1 = s) ds = z e^{−λt} + ∫_0^t λ e^{−λs} φ(t − s)^2 ds.

Put u = t − s, multiply by e^{λt} and differentiate to see that dφ/dt = λφ(φ − 1), and so φ e^{λt} = z(φ − 1)/(z − 1). Then

φ(t) = Σ_{n=1}^{∞} (1 − q(t)) q(t)^{n−1} z^n,

where q = 1 − e^{−λt}.]
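The geometric law P(X_t = n) = q^{n−1}(1 − q) can be verified by simulating the splitting colony (a sketch, not part of the sheet; λ and t are arbitrary choices):

```python
import math
import random

random.seed(12)
lam, t_end = 1.0, 0.7
n = 100_000

counts = {}
for _ in range(n):
    x, t = 1, 0.0
    while True:
        t += random.expovariate(lam * x)   # some bacterium splits at total rate lam*x
        if t > t_end:
            break
        x += 1
    counts[x] = counts.get(x, 0) + 1

q = 1 - math.exp(-lam * t_end)
p1_hat = counts.get(1, 0) / n
p2_hat = counts.get(2, 0) / n
print(p1_hat, 1 - q)        # P(X_t = 1) = 1 - q
print(p2_hat, q * (1 - q))  # P(X_t = 2) = q(1 - q)
```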
13. Compute p_{11}(t) for P(t) = e^{tQ}, where

Q =
−2  1  1
 4 −4  0
 2  1 −3

Find an invariant distribution λ for Q and verify that p_{11}(t) → λ_1 as t → ∞.

[The eigenvalues of Q are 0, −4, −5, so p_{11}(t) = A + B e^{−4t} + C e^{−5t} for some constants A, B, C which may be determined from p_{11}(0) = 1, p′_{11}(0) = q_{11} = −2, p″_{11}(0) = q_{11}^{(2)} = 10. In fact, A = 3/5 = λ_1, B = 0, C = 2/5.]

14. Two fleas are bound together to take part in a nine-legged race on the vertices A, B, C of a triangle. Flea 1 hops at random times in the clockwise direction; each hop takes the pair from one vertex to the next and the times between successive hops of Flea 1 are independent random variables, each with exponential distribution, mean 1/λ. Flea 2 behaves similarly, but hops in the anti-clockwise direction, the times between his hops having mean 1/µ. Show that the probability that they are at A at a given time t > 0 is
1/3 + (2/3) exp(−3(λ + µ)t/2) cos(√3 (λ − µ)t/2).
[The position evolves as a Markov chain with Q-matrix

Q =
−(λ+µ)   λ      µ
  µ    −(λ+µ)   λ
  λ      µ    −(λ+µ)

and the problem is to compute p_{11}(t). The method of the previous example applies. Convergence to equilibrium gives p_{11}(∞) = 1/3, which is useful in fixing the constants.]
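The stated answer can be checked numerically by simulating the pair's position on the triangle (a sketch, not from the sheet; λ, µ and t are arbitrary choices):

```python
import math
import random

random.seed(14)
lam, mu, t_end = 2.0, 1.0, 0.4
n = 100_000

at_A = 0
for _ in range(n):
    pos, t = 0, 0.0   # vertices A, B, C coded 0, 1, 2; clockwise hop = +1 mod 3
    while True:
        t += random.expovariate(lam + mu)
        if t > t_end:
            break
        pos = (pos + (1 if random.random() < lam / (lam + mu) else -1)) % 3
    if pos == 0:
        at_A += 1

p_hat = at_A / n
p_formula = 1/3 + (2/3) * math.exp(-3 * (lam + mu) * t_end / 2) \
                * math.cos(math.sqrt(3) * (lam - mu) * t_end / 2)
print(p_hat, p_formula)
```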