Solutions to Stochastic Processes Ch.3

Solutions to Stochastic Processes, Sheldon M. Ross, Second Edition (pdf)
Since there is no official solution manual for this book, I handcrafted the solutions myself. Some solutions were adapted from the web; most of their sources are unclear and cannot be credited individually. Many thanks to those authors! I hope these solutions are helpful, but correctness is not guaranteed. Comments are welcome. Excerpts and links may be used, provided that full and clear credit is given.

3.1 Is it true that:
(a) \(N(t) < n\) if and only if \(S_n > t\) ?
(b) \(N(t) \leq n\) if and only if \(S_n \geq t\) ?
(c) \(N(t) > n\) if and only if \(S_n < t\) ?


(a) True: \(N(t) < n\) if and only if the \(n\)th renewal occurs after \(t\), i.e., \(S_n > t\).
(b) False: \(N(t) \leq n\) if and only if \(S_{n+1} > t\).
(c) False: \(N(t) > n\) if and only if \(S_{n+1} \leq t\).


3.2 In defining a renewal process we suppose that \(F(\infty)\), the probability that an interarrival time is finite, equals 1. If \(F(\infty) < 1\), then after each renewal there is a positive probability \(1 - F(\infty)\) that there will be no further renewals. Argue that when \(F(\infty) < 1\) the total number of renewals, call it \(N(\infty)\), is such that \(1 + N(\infty)\) has a geometric distribution with mean \(1/(1 - F(\infty))\).


$$P\{1 + N(\infty) = k\} = F^{k-1}(\infty)(1 - F(\infty))$$


3.3 Express in words what the random variable \(X_{N(t) + 1}\) represents (Hints: It is the length of which renewal interval?). Show that$$P\{X_{N(t)+1} \geq x\} \geq \bar{F}(x).$$
Compute the above exactly when \(F(x) = 1 – e^{-\lambda x}\).


\(X_{N(t)+1}\) is the length of the renewal interval containing \(t\), that is, the interval from the last renewal at or before \(t\) to the first renewal after \(t\).
$$\begin{align}
P\{X_{N(t)+1} \geq x\} &= \int_0^{\infty} P\{X_{N(t)+1} \geq x \mid S_{N(t)} = s \}dF_{S_{N(t)}}(s)\\
&= \int_0^{\infty} P\{X_{N(t)+1} \geq x \mid X_{N(t) + 1} > t - s \}dF_{S_{N(t)}}(s) \\
&= \int_0^{\infty} \frac{P\{X_{N(t)+1} \geq x, X_{N(t) + 1} > t - s \}}{P\{X_{N(t) + 1} > t - s \}}dF_{S_{N(t)}}(s) \\
&= \int_0^{\infty} \frac{1 - F(\max\{x, t-s\})}{1 - F(t-s)}dF_{S_{N(t)}}(s) \\
&= \int_0^{\infty} \min\Big\{\frac{1 - F(x)}{1 - F(t-s)}, 1\Big\}dF_{S_{N(t)}}(s) \\
&\geq \int_0^{\infty} (1 - F(x))dF_{S_{N(t)}}(s) \\
&= \bar{F}(x)
\end{align}$$
When \(F(x) = 1 - e^{-\lambda x}\), \(S_{N(t)}\) has an atom of mass \(e^{-\lambda t}\) at 0 and density \(\lambda e^{-\lambda(t-s)}\) on \((0, t]\), so
$$P\{X_{N(t)+1} \geq x\} = e^{-\lambda t}\min\{e^{-\lambda(x-t)}, 1\} + \int_0^{t} \min\{e^{-\lambda(x-t+s)}, 1\}\lambda e^{-\lambda(t-s)}ds.$$
For \(x \leq t\) this equals
$$e^{-\lambda t} + \int_0^{t-x}\lambda e^{-\lambda(t-s)}ds + \int_{t-x}^{t}\lambda e^{-\lambda x}ds = e^{-\lambda x} + \lambda x e^{-\lambda x},$$
and for \(x > t\) it equals \(e^{-\lambda x} + \lambda t e^{-\lambda x}\). Hence
$$P\{X_{N(t)+1} \geq x\} = (1 + \lambda \min\{x, t\})e^{-\lambda x},$$
which indeed exceeds \(\bar{F}(x) = e^{-\lambda x}\): the interval containing \(t\) is stochastically larger than an ordinary interarrival time (the inspection paradox).
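A quick Monte Carlo sketch of the exact formula (not part of the original solution); the values of `lam`, `t`, and `x` below are arbitrary test choices.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, x = 1.5, 2.0, 1.0
trials, hits = 100_000, 0
for _ in range(trials):
    s = 0.0
    while True:
        gap = rng.exponential(1 / lam)
        if s + gap > t:          # this interarrival interval contains t
            hits += gap >= x
            break
        s += gap
print(hits / trials)                             # estimated P{X_{N(t)+1} >= x}
print((1 + lam * min(x, t)) * np.exp(-lam * x))  # (1 + lambda*min(x,t)) e^{-lambda x}
```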


3.4 Prove the renewal equation
$$m(t) = F(t) + \int_0^t m(t-x)dF(x)$$


$$\begin{align}
m(t) &= E[N(t)] = \int_0^{\infty} E[N(t) \mid X_1 = x]dF(x) \\
&= \int_0^t E[1 + N(t-x)]dF(x) \quad (\text{since } N(t) = 0 \text{ when } X_1 = x > t)\\
&= F(t) + \int_0^t m(t-x)dF(x)
\end{align}$$


3.5 Prove that the renewal function \(m(t), 0 \leq t < \infty\) uniquely determines the interarrival distribution \(F\).



Let \(\phi_F\) denote the Laplace transform of \(F\). Since \(S_n\) is the sum of \(n\) i.i.d. interarrival times, the transform of \(F_n\) is
$$\phi_{F_n}(\lambda) = \int_0^{\infty} e^{-\lambda t}dF_n(t) = E[e^{-\lambda S_n}] = [\phi_F(\lambda)]^n $$
The Laplace transform of \(m(t)\) is
$$\begin{align}
U(\lambda) &= \int_0^{\infty} e^{-\lambda t}dm(t) \\
&= \sum_{n=1}^{\infty} \int_0^{\infty} e^{-\lambda t}dF_n(t) \\
&= \sum_{n=1}^{\infty}[\phi_F(\lambda)]^n = \frac{\phi_F(\lambda)}{1 - \phi_F(\lambda)}, \\
\text{so}\quad \phi_F(\lambda) &= \frac{U(\lambda)}{1 + U(\lambda)}
\end{align}$$
The distribution \(F\) is uniquely determined by its Laplace transform \(\phi_F\), which is uniquely determined by the renewal function.
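As a sanity check on the transform identity (not a replacement for the proof), the exponential case can be verified symbolically: there \(m(t) = at\) and \(\phi_F(s) = a/(a+s)\); the symbols `a` and `s` below are arbitrary.

```python
import sympy as sp

a, s, t = sp.symbols("a s t", positive=True)
U = sp.integrate(sp.exp(-s * t) * a, (t, 0, sp.oo))                      # dm(t) = a dt
phiF = sp.integrate(sp.exp(-s * t) * a * sp.exp(-a * t), (t, 0, sp.oo))  # dF(t) = a e^{-at} dt
print(sp.simplify(U / (1 + U) - phiF))   # prints 0: phi_F = U/(1+U)
```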


3.6 Let \(\{N(t), t \geq 0\}\) be a renewal process and suppose that for all \(n\) and \(t\), conditional on the event that \(N(t) = n\), the event times \(S_1, \dots, S_n\) are distributed as the order statistics of a set of independent uniform \((0, t)\) random variables. Show that \(\{N(t), t \geq 0\}\) is a Poisson process.
(Hint: Consider \(E[N(s)|N(t)]\) and then use the result of Problem 3.5)


$$\begin{align}
E[N(s) | N(t) = n] &= E[\sum_{i=1}^n I(U_{(i)} \leq s)] \\
&= E[\sum_{i=1}^n I(U_{i} \leq s)] \\
&= \sum_{i=1}^n P\{U_i \leq s\} = \frac{ns}{t}\\
m(s) &= E[E[N(s)|N(t)]] = \frac{s}{t}m(t)\\
\end{align}$$Since this holds for all \(s \leq t\) and \(t\) is arbitrary, \(m(s)/s\) is constant, say \(m(s) = \lambda s\). This is the renewal function of a Poisson process with rate \(\lambda\) (interarrival distribution \(F(x) = 1 - e^{-\lambda x}\)), and by Problem 3.5 the renewal function uniquely determines the interarrival distribution. Hence \(\{N(t), t \geq 0\}\) is a Poisson process.


3.8 The random variables \(X_1, \dots, X_n\) are said to be exchangeable if \(X_{i_1}, \dots, X_{i_n}\) has the same joint distribution as \(X_1, \dots, X_n\) whenever \(i_1, \dots, i_n\) is a permutation of \(1, \dots, n\). That is, they are exchangeable if the joint distribution function \(P\{X_1 \leq x_1, \dots, X_n \leq x_n \}\) is a symmetric function of \(x_1, \dots, x_n\). Let \(X_1, X_2, \dots\) denote the interarrival times of a renewal process.
(a) Argue that conditional on \(N(t) = n, X_1, \dots, X_n\) are exchangeable. Would \(X_1, \dots, X_n, X_{n+1}\) be exchangeable (conditional on \(N(t) = n\)) ?
(b) Use (a) to prove that for \(n > 0\)
$$E[\frac{X_1 + \dots + X_{N(t)}}{N(t)}|N(t) = n] = E[X_1|N(t) = n].$$
(c) Prove that
$$E[\frac{X_1 + \dots + X_{N(t)}}{N(t)}|N(t) > 0] = E[X_1|X_1 < t] .$$


(a) $$\begin{align}
&P\{X_1 \leq x_1, \dots, X_n \leq x_n, N(t) = n\} \\
=& \int_{y_1 \leq x_1}\dots \int_{y_n \leq x_n} I\Big\{\sum_{i=1}^n y_i \leq t\Big\}\bar{F}\Big(t - \sum_{i=1}^n y_i\Big)dF(y_1)\dots dF(y_n)
\end{align}$$The integrand is a symmetric function of \(y_1, \dots, y_n\), so the joint distribution is invariant under permutations of the indices: conditional on \(N(t) = n\), the variables \(X_1, \dots, X_n\) are exchangeable.
Including \(X_{n+1}\) breaks exchangeability: conditional on \(N(t) = n\), \(X_{n+1}\) is constrained to exceed \(t - \sum_{i=1}^n X_i\), so its conditional distribution is not identical to that of the others.
(b) Since they’re exchangeable,
$$\begin{align}
&E[\frac{X_1 + \dots + X_{N(t)}}{N(t)}|N(t) = n]\\
=&E[\frac{X_1 + \dots + X_n}{n}|N(t) = n] \\
=&\frac{1}{n}\sum_{i=1}^{n} E[X_i|N(t) = n] \\
=& E[X_1|N(t) = n]
\end{align}$$

(c) $$\begin{align}
&E[\frac{X_1 + \dots + X_{N(t)}}{N(t)}|N(t) > 0] \\
=& \sum_{n=1}^{\infty} E[\frac{X_1 + \dots + X_{N(t)}}{N(t)}|N(t) = n]P\{N(t) = n | N(t) > 0\}\\
=& \sum_{n=1}^{\infty} E[X_1|N(t) = n]P\{N(t) = n | N(t) > 0\}\\
=& E[X_1|N(t) > 0] = E[X_1|X_1 < t]
\end{align}$$


3.9 Consider a single-server bank in which potential customers arrive at a Poisson rate \(\lambda\). However, an arrival only enters the bank if the server is free when he or she arrives. Let \(G\) denote the service distribution.
(a) At what rate do customers enter the bank?
(b) What fraction of potential customers enter the bank?
(c) What fraction of time is the server busy?


(a) This is a renewal process whose renewals occur when a customer enters the bank. The mean interarrival time of this renewal process is the mean service time \(\mu_G\) plus the mean time from a service completion until the next potential arrival, which is \(1/\lambda\) by the memoryless property. Thus customers enter at rate \(1/(\mu_G + 1/\lambda) = \lambda/(1 + \lambda\mu_G)\).
(b) During a service period of mean length \(\mu_G\), an average of \(\lambda\mu_G\) potential customers arrive and are turned away. Thus each renewal cycle contains on average \(1 + \lambda\mu_G\) potential customers, of whom exactly one enters, so the fraction is \(1/(1 + \lambda\mu_G)\).
(c) \(\mu_G/(1/\lambda + \mu_G) = \lambda\mu_G/(1 + \lambda\mu_G)\)
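A short simulation sketch of all three answers, assuming (arbitrarily) that \(G\) is exponential with mean `mu_G`; `lam`, `mu_G`, and `horizon` are test values, and edge effects at the horizon are negligible.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, mu_G, horizon = 2.0, 0.3, 100_000.0
clock, busy_until, busy_time = 0.0, 0.0, 0.0
arrivals = entered = 0
while clock < horizon:
    clock += rng.exponential(1 / lam)   # next potential customer
    arrivals += 1
    if clock >= busy_until:             # server free: this customer enters
        entered += 1
        service = rng.exponential(mu_G)
        busy_time += service
        busy_until = clock + service
print(entered / horizon, lam / (1 + lam * mu_G))            # (a) entry rate
print(entered / arrivals, 1 / (1 + lam * mu_G))             # (b) fraction entering
print(busy_time / horizon, lam * mu_G / (1 + lam * mu_G))   # (c) fraction busy
```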


3.10 Let \(X_1, X_2, \dots\) be independent and identically distributed with \(E[X_i] < \infty\). Also let \(N_1, N_2, \dots\) be independent and identically distributed stopping times for the sequence \(X_1, X_2, \dots\) with \(E[N_i] < \infty\). Observe the \(X_i\) sequentially, stopping at \(N_1\). Now start sampling the remaining \(X_i\)—acting as if the sample was just beginning with \(X_{N_1 + 1}\)—stopping after an additional \(N_2\). (Thus, for instance, \(X_1 + \dots + X_{N_1}\) has the same distribution as \(X_{N_1+1} + \dots + X_{N_1 + N_2}\).) Now start sampling the remaining \(X_i\)—again acting as if the sample was just beginning—and stop after an additional \(N_3\), and so on.
(a) Let
$$S_1 = \sum_{i=1}^{N_1}X_i, \quad S_2 = \sum_{i=N_1 + 1}^{N_1 + N_2}X_i, \quad\dots,\quad S_m = \sum_{i=N_1+\dots+N_{m-1}+1}^{N_1 + \dots + N_m}X_i$$
Use the strong law of large numbers to compute
$$\lim_{m \to \infty} (\frac{S_1 + \dots + S_m}{N_1 + \dots + N_m}).$$
(b) Writing
$$\frac{S_1 + \dots +S_m }{N_1 + \dots + N_m } = \frac{S_1 + \dots +S_m}{m}\frac{m}{N_1 + \dots + N_m},$$
derive another expression for the limit in part (a).
(c) Equate the two expression to obtain Wald’s equation.


(a) Since \(S_1 + \dots + S_m = \sum_{i=1}^{N_1 + \dots + N_m} X_i\) and \(N_1 + \dots + N_m \to \infty\), the strong law of large numbers gives
$$\lim_{m \to \infty} \Big(\frac{S_1 + \dots + S_m}{N_1 + \dots + N_m}\Big) = E[X] $$
(b) The \(S_i\) are i.i.d., as are the \(N_i\), so applying the strong law to each factor,
$$\lim_{m \to \infty} \frac{S_1 + \dots + S_m}{m}\cdot\frac{m}{N_1 + \dots + N_m} = \frac{E[S]}{E[N]} $$
(c) Equating the two limits: \(E[S] = E[N]E[X]\), which is Wald's equation.


3.11 Consider a miner trapped in a room that contains three doors. Door 1 leads her to freedom after two days' travel; door 2 returns her to her room after four days' journey; and door 3 returns her to her room after eight days' journey. Suppose at all times she is equally likely to choose any of the three doors, and let \(T\) denote the time it takes the miner to become free.
(a) Define a sequence of independent and identically distributed random variables \(X_1, X_2, \dots\) and a stopping time \(N\) such that
$$T = \sum_{i=1}^N X_i.$$
Note: You may have to imagine that the miner continues to randomly choose doors even after she reaches safety.
(b) Use Wald’s equation to find \(E[T]\).
(c) Compute \(E[\sum_{i=1}^N X_i|N=n]\) and note that it is not equal to \(E[\sum_{i=1}^n X_i]\).
(d) Use part (c) for a second derivation of \(E[T]\).


(a) \(P\{X=2\} = P\{X=4\} = P\{X=8\} = 1/3\) (i.i.d. door choices), and \(N = \min\{n : X_n = 2\}\).
(b) Since \(N \sim Geometric(1/3)\), then \(E[N] = 3\). Hence \(E[T] = E[N]E[X] = 14\).
(c) $$\begin{align}
E[\sum_{i=1}^N X_i|N=n ] &= E[\sum_{i=1}^n X_i|X_1 \neq 2, \dots, X_{n-1}\neq 2, X_n = 2] \\&=(n-1)E[X_i|X_i\neq 2] + 2 = 6n - 4 \\
E[\sum_{i=1}^n X_i] &= nE[X_i] = 14n/3
\end{align}$$
(d) $$E[T] = E\big[E[\textstyle\sum_{i=1}^N X_i|N]\big] = 6E[N] - 4 = 14$$
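A minimal simulation of the miner's escape, as a check that \(E[T] = 14\).

```python
import numpy as np

rng = np.random.default_rng(2)
doors = np.array([2.0, 4.0, 8.0])   # door 1 frees her; doors 2 and 3 return her
total, trials = 0.0, 200_000
for _ in range(trials):
    while True:
        d = rng.integers(3)         # equally likely door choice
        total += doors[d]
        if d == 0:                  # door 1: freedom after the 2-day trip
            break
print(total / trials)               # should be close to 14
```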


3.12 Show how Blackwell’s theorem follows from the key renewal theorem.


Let \(h(x) = 1_{\{x \in [0, a]\}}\); it is directly Riemann integrable, and \(\int_0^t h(t-x)dm(x) = m(t) - m(t-a)\). The key renewal theorem then gives
$$\lim_{t \to \infty} \big(m(t) - m(t-a)\big) = \frac{1}{\mu}\int_0^{\infty} h(x)dx = \frac{a}{\mu},$$
which is Blackwell's theorem.


3.13 A process is in one of \(n\) states, \(1, 2, \dots, n\). Initially it is in state 1, where it remains for an amount of time having distribution \(F_1\). After leaving state 1 it goes to state 2, where it remains for a time having distribution \(F_2\). When it leaves 2 it goes to state 3, and so on. From state \(n\) it returns to 1 and starts over. Find
$$\lim_{t \to \infty} P\{\text{process is in state i at time t}\}.$$
Assume that \(H\), the distribution of time between entrances to state 1, is nonlattice and has finite mean.


Here \(H\), the distribution of the time between successive entrances to state 1, is the distribution of a full cycle \(X_1 + \dots + X_n\), where \(X_j \sim F_j\). Let \(\mu_j\) denote the mean of \(F_j\), so \(\mu_H = \sum_{j=1}^n \mu_j\). The process regenerates at each entrance to state 1, and the expected time spent in state \(i\) per cycle is \(\mu_i\). Since \(H\) is nonlattice with finite mean, the limiting probability equals the expected time per cycle spent in state \(i\) divided by the expected cycle length:
$$
\lim_{t \to \infty} P\{\text{process is in state i at time t}\} = \frac{\mu_i}{\sum_{j=1}^n \mu_j}
$$
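A quick check of the limiting fractions by simulating whole cycles; taking \(F_i = Gamma(\text{shape}_i, 1)\) is an arbitrary assumption for the test.

```python
import numpy as np

rng = np.random.default_rng(3)
shapes = np.array([1.0, 2.0, 3.0])   # F_i = Gamma(shape_i, 1), so mu_i = shape_i
cycles = 100_000                     # simulate many full cycles 1 -> 2 -> 3 -> 1
time_in = np.array([rng.gamma(k, size=cycles).sum() for k in shapes])
print(time_in / time_in.sum())       # ~ mu_i / sum_j mu_j = (1/6, 2/6, 3/6)
```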


3.14 Let \(A(t)\) and \(Y(t)\) denote the age and excess at \(t\) of a renewal process. Fill in the missing terms:
(a) \(A(t) > x \leftrightarrow 0\) events in the interval____?
(b) \(Y(t) > x \leftrightarrow 0\) events in the interval____?
(c) \(P\{Y(t) > x\} = P\{A(\ ) > \ \ \}\)
(d) Compute the joint distribution of \(A(t)\) and \(Y(t)\) for a Poisson process.


(a) \((t-x, t)\)
(b) \((t, t+x)\)
(c) \(P\{Y(t) > x\} = P\{A(t+x) > x \}\)
(d) For \(x < t\): \(P\{A(t) > x, Y(t) > y\} = P\{\text{no events in } (t-x, t+y]\} = e^{-\lambda(x+y)}\) by independent increments; for \(x \geq t\) the probability is 0, since \(A(t) \leq t\).
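A Monte Carlo sketch of (d); `lam`, `t`, `x`, `y` are arbitrary values with \(x < t\).

```python
import numpy as np

rng = np.random.default_rng(4)
lam, t, x, y = 1.0, 5.0, 0.7, 1.2
trials, hits = 200_000, 0
for _ in range(trials):
    s = 0.0
    while True:
        nxt = s + rng.exponential(1 / lam)
        if nxt > t:                        # s is the last renewal time <= t
            age, excess = t - s, nxt - t
            hits += (age > x) and (excess > y)
            break
        s = nxt
print(hits / trials, np.exp(-lam * (x + y)))   # both ~ e^{-lambda(x+y)}
```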


3.15 Let \(A(t)\) and \(Y(t)\) denote respectively the age and excess at \(t\). Find:
(a) \(P\{Y(t) > x|A(t) = s\}\).
(b) \(P\{Y(t) > x|A(t + x/2) = s\}\).
(c) \(P\{Y(t) > x|A(t + x) > s\}\) for a Poisson process.
(d) \(P\{Y(t) > x, A(t) > y\}\).
(e) If \(\mu < \infty\), show that, with probability 1, \(A(t)/t \to 0\) as \(t \to \infty\).


(a) \(\bar{F}(x+s)/\bar{F}(s)\): given that the current interarrival interval has already lasted \(s\), it exceeds \(s + x\) with this probability. (A fully rigorous derivation conditions on \(S_{N(t)}\), as in Problem 3.3.)
(b) If \(s < x/2\), a renewal occurred in \((t, t+x/2]\), so the probability is 0. If \(s \geq x/2\), the current interval at \(t + x/2\) began at \(t + x/2 - s \leq t\), and \(Y(t) > x\) requires it to exceed \(x/2 + s\); hence \(P\{Y(t) > x|A(t + x/2) = s\} = \bar{F}(x/2+s)/\bar{F}(s)\).
(c) \(\{Y(t) > x\}\) means no events in \((t, t+x]\), and \(\{A(t+x) > s\}\) means no events in \((t+x-s, t+x]\). By independent increments, for \(s \leq x\), \(P\{Y(t) > x|A(t + x) > s\} = P\{\text{no events in } (t, t+x-s]\} = e^{-\lambda(x-s)}\); for \(s > x\) the conditioning event implies \(Y(t) > x\), so the probability is 1.
(d) \(e^{-\lambda(x+y)}\)
(e) $$\begin{align}
\lim_{t \to \infty} \frac{A(t)}{t} &= \lim_{t \to \infty} \frac{t - S_{N(t)}}{t} \\
&= 1 - \lim_{t \to \infty} \frac{S_{N(t)}}{N(t)}\frac{N(t)}{t} \\
&= 1 - \mu\cdot\frac{1}{\mu} = 0
\end{align}$$
since \(S_{N(t)}/N(t) \to \mu\) and \(N(t)/t \to 1/\mu\) with probability 1.


3.16 Consider a renewal process whose interarrival distribution is the gamma distribution with parameters \((n, \lambda)\). Use Proposition 3.4.6 to show that
$$\lim_{t \to \infty} E[Y(t)] = \frac{n+1}{2\lambda}.$$
Now explain how this could have been obtained without any computations.


If \(X \sim Gamma(n, \lambda)\), then \(E[X] = n/\lambda\) and \(E[X^2] = Var(X) + (E[X])^2 = n(n+1)/\lambda^2\), so by Proposition 3.4.6,
$$\lim_{t \to \infty} E[Y(t)] = \frac{E[X^2]}{2E[X]} = \frac{n+1}{2\lambda}. $$
A \(Gamma(n, \lambda)\) interarrival time is the sum of \(n\) independent exponentials with rate \(\lambda\), so the renewal process consists of every \(n\)th event of a Poisson process with rate \(\lambda\). In the limit, the Poisson event in progress at time \(t\) is equally likely to be the 1st, 2nd, ..., or \(n\)th of its renewal cycle, and by memorylessness the time from \(t\) to the next Poisson event is exponential with rate \(\lambda\). Hence the excess is the sum of \(K\) such exponentials with \(K\) uniform on \(\{1, \dots, n\}\):
$$\lim_{t \to \infty} E[Y(t)] = \frac{E[K]}{\lambda} = \frac{n+1}{2\lambda}$$
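A simulation sketch of the limiting mean excess; `n`, `lam`, `t`, and the trial count are arbitrary test values.

```python
import numpy as np

rng = np.random.default_rng(5)
n, lam, t, trials = 3, 2.0, 50.0, 20_000
excess = np.empty(trials)
for i in range(trials):
    s = 0.0
    while s <= t:
        s += rng.gamma(n, 1 / lam)   # Gamma(n, lambda) interarrival time
    excess[i] = s - t
print(excess.mean(), (n + 1) / (2 * lam))   # both ~ (n+1)/(2*lambda) = 1.0
```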


3.18 In Problem 3.9 suppose that potential customers arrive in accordance with a renewal process having interarrival distribution \(F\). Would the number of events by time \(t\) constitute a (possibly delayed) renewal process if an event corresponds to a customer:
(a) entering the bank?
(b) leaving the bank?
What if \(F\) were exponential?


Let \(X_i\) denote the length of the \(i\)th service and \(Y_i\) the time from the end of the \(i\)th service until the start of the \((i+1)\)th service. Let \(Y_0\) denote the time until the first arrival enters the bank (and begins service). Note that \(X_i\) and \(Y_i\) may be dependent when the arrival process is not Poisson.
(a) In this case each cycle is \(Z_i = X_i + Y_i, i = 1, 2, \dots\), with initial delay \(Z_0 = Y_0\). Since \((X_i, Y_i)\) is independent of \((X_j, Y_j), j = 1, \dots, i - 1\), the \(Z_i\) are i.i.d. copies. We thus have a delayed renewal process.
(b) In this case \(Z_i = Y_{i-1} + X_i\). When \(X_i\) and \(Y_i\) are dependent, consecutive cycles \(Z_i\) and \(Z_{i+1}\) are dependent (through \((X_i, Y_i)\)), so the \(Z_i\) are not i.i.d. copies and we do not have a (delayed) renewal process.
If \(F\) is exponential, the memoryless property makes \(Y_i\) independent of \(X_i\), so (a) is still a delayed renewal process and (b) is a renewal process.


3.19 Prove Equation (3.5.3).


$$\begin{align}
&P\{S_{N(t)} \leq s\} \\
=& \sum_{n=0}^{\infty} P\{S_n \leq s, S_{n+1} > t\} \\
=& \bar{G}(t) + \sum_{n=1}^{\infty} \int_0^{\infty} P\{S_n \leq s, S_{n+1} > t \mid S_n = y\}d(G * F_{n-1})(y) \\
=& \bar{G}(t) + \sum_{n=1}^{\infty} \int_0^s \bar{F}(t-y)d(G * F_{n-1})(y) \\
=& \bar{G}(t) + \int_0^s \bar{F}(t-y)d\Big(\sum_{n=1}^{\infty} (G * F_{n-1})(y)\Big) \\
=& \bar{G}(t) + \int_0^s \bar{F}(t-y)dm_D(y)
\end{align}$$
where \(G * F_{n-1}\) is the distribution of \(S_n\) and \(m_D(y) = \sum_{n=1}^{\infty}(G * F_{n-1})(y)\).


3.20 Consider successive flips of a fair coin.
(a) Compute the mean number of flips until the pattern HHTHHTT appears.
(b) Which pattern requires a larger expected time to occur: HHTT or HTHT?


(a) HHTHHTT has no proper prefix that is also a suffix, so the expected number of flips is \(2^7 = 128\).
(b) HTHT: \(E[N_{HHTT}] = 2^4 = 16\), while \(E[N_{HTHT}] = 2^4 + 2^2 = 20\) (HTHT overlaps itself in HT), so HTHT requires a larger expected time.
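Both answers use the standard fact that, for a fair coin, the mean waiting time of a pattern is the sum of \(2^k\) over every \(k\) for which the length-\(k\) prefix equals the length-\(k\) suffix (including the whole pattern). A small helper function (added here only as a check):

```python
def mean_flips(pattern: str) -> int:
    """Expected fair-coin flips until `pattern` first appears."""
    return sum(2 ** k for k in range(1, len(pattern) + 1)
               if pattern[:k] == pattern[-k:])

print(mean_flips("HHTHHTT"))   # 128 = 2^7
print(mean_flips("HHTT"))      # 16
print(mean_flips("HTHT"))      # 20, so HTHT takes longer
```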


3.21 On each bet a gambler, independently of the past, either wins or loses 1 unit with respective probabilities \(p\) and \(1-p\). Suppose the gambler’s strategy is to quit playing the first time she wins \(k\) consecutive bets. At the moment she quits:
(a) find her expected winnings.
(b) find the expected number of bets she has won.


Let \(X_i\) denote the gambler's net gain on the \(i\)th bet, \(Y_i\) the indicator that she wins the \(i\)th bet, and \(N\) the number of bets until she quits. Then \(N\) is a stopping time for both sequences, and \(E[N] = \sum_{i=1}^k p^{-i}\) (the standard formula for the mean waiting time for \(k\) consecutive successes). By Wald's equation:
(a) \(E[N]E[X] = (2p-1)\sum_{i=1}^k p^{-i} \)
(b) \(E[N]E[Y] = p\sum_{i=1}^k p^{-i} = \sum_{i=0}^{k-1}p^{-i} \)
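A simulation of the stopping rule as a check on both answers; `p` and `k` are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(6)
p, k, trials = 0.6, 3, 100_000
tot_bets = tot_wins = 0
for _ in range(trials):
    streak = bets = wins = 0
    while streak < k:                  # play until k consecutive wins
        bets += 1
        if rng.random() < p:
            wins += 1; streak += 1
        else:
            streak = 0
    tot_bets += bets; tot_wins += wins
print(tot_bets / trials, sum(p ** -i for i in range(1, k + 1)))   # E[N]
print(tot_wins / trials, sum(p ** -i for i in range(k)))          # (b)
```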


3.22 Consider successive flips of a coin having probability \(p\) of landing heads. Find the expected number of flips until the following sequences appear.
(a) A = HHTTHH.
(b) B = HTHTT.
Suppose now that \(p = 1/2\).
(c) Find \(P\{A \text{ occurs before } B\}\).
(d) Find the expected number of flips until either A or B occurs.


(a) \(p^{-1} + p^{-2} + p^{-4}(1-p)^{-2}\)
(b) \(p^{-2}(1-p)^{-3}\)
(c)(d) Let \(M = \min(N_A, N_B)\), \(P_A = P\{\text{A before B}\}\), and let \(N_{B|A}\) denote the additional number of flips needed for B given that A has just occurred (similarly \(N_{A|B}\)). Since A ends in H, which is a prefix of B, while B ends in TT, which overlaps no prefix of A,
$$\begin{align}
E[N_{B|A}] &= E[N_{HTHTT}] - E[N_{H}] = p^{-2}q^{-3} - p^{-1} \\
E[N_{A|B}] &= E[N_{A}] = p^{-1} + p^{-2} + p^{-4}q^{-2} \\
E[N_{A}] &= E[M] + E[N_A - M] \\
&= E[M] + E[N_A - M \mid \text{B before A}](1 - P_A) \\
&= E[M] + E[N_{A|B}](1-P_A) \\
E[N_B] &= E[M] + E[N_{B|A}]P_A
\end{align}$$where \(p = q = 1/2\). Solving the equations above gives
$$
P_A = \frac{E[N_B] + E[N_{A|B}] - E[N_A]}{E[N_{A|B}] + E[N_{B|A}]} = \frac{8}{25}, \qquad
E[M] = E[N_B] - E[N_{B|A}]P_A = \frac{112}{5}
$$
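A simulation of the race between the two patterns, checking \(P_A = 8/25\) and \(E[M] = 112/5\).

```python
import numpy as np

rng = np.random.default_rng(7)
A, B = "HHTTHH", "HTHTT"
trials, a_first, total = 100_000, 0, 0
for _ in range(trials):
    s = ""
    while True:
        s += "H" if rng.random() < 0.5 else "T"
        if s.endswith(A):
            a_first += 1
            break
        if s.endswith(B):
            break
    total += len(s)
print(a_first / trials, 8 / 25)   # P{A before B}
print(total / trials, 112 / 5)    # E[min(N_A, N_B)]
```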


3.23 A coin having probability \(p\) of landing heads is flipped \(k\) times. Additional flips of the coin are then made until the pattern of the first \(k\) is repeated (possibly by using some of the first \(k\) flips). Show that the expected number of additional flips after the initial \(k\) is \(2^k\).


Let \(m\) denote the number of heads in the first \(k\) flips. Given the initial pattern, the expected number of additional flips \(T\) until that pattern reappears equals the mean time between successive occurrences of the pattern, which is the reciprocal of its probability \(p^m(1-p)^{k-m}\). Hence
$$\begin{align}
E[T] &= \sum_{i=0}^k E[T|m=i]P\{m=i\} \\
&= \sum_{i=0}^k p^{-i}(1-p)^{-(k-i)}{k \choose i} p^i(1-p)^{k-i} \\
&= \sum_{i=0}^k {k \choose i} = 2^k
\end{align}$$
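A simulation sketch; note the answer \(2^k\) should not depend on \(p\) (`p` and `k` below are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(8)
p, k, trials = 0.3, 4, 50_000
total = 0
for _ in range(trials):
    flips = [bool(b) for b in rng.random(k) < p]   # True = heads; the initial k flips
    pattern = tuple(flips)
    extra = 0
    while True:
        flips.append(rng.random() < p)
        extra += 1
        if tuple(flips[-k:]) == pattern:           # may reuse some of the first k flips
            break
    total += extra
print(total / trials, 2 ** k)                      # ~ 2^k, independently of p
```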


3.25 Consider a delayed renewal process \(\{N_D(t), t \geq 0\}\) whose first interarrival has distribution \(G\) and the others have distribution \(F\). Let \(m_D(t) = E[N_D(t)]\).
(a) Prove that
$$m_D(t) = G(t) + \int_0^t m(t-x)dG(x),$$
where \(m(t) = \sum_{n=1}^{\infty}F_n(t)\)
(b) Let \(A_D(t)\) denote the age at time \(t\). Show that if \(F\) is nonlattice with \(\int x^2 dF(x) < \infty\) and \(t\bar{G}(t) \to 0\) as \(t \to \infty\), then
$$E[A_D(t)] \to \frac{\int_0^{\infty} x^2dF(x)}{2\int_0^{\infty}xdF(x)}$$
(c) Show that if \(G\) has a finite mean, then \(t\bar{G}(t) \to 0\) as \(t \to \infty\).


(a) $$\begin{align}
m_D(t) &= \int_0^{\infty} E[N_D(t) \mid X_1 = x]dG(x)\\
&= \int_0^t \big(1 + m(t - x)\big)dG(x) \\
&= G(t) + \int_0^t m(t - x)dG(x)
\end{align}$$
since \(N_D(t) = 0\) when \(X_1 > t\), and after a first renewal at \(x \leq t\) the process continues as an ordinary renewal process with distribution \(F\).
(b) Conditioning on \(S_{N(t)}\) via (3.5.3) (there is an atom of mass \(\bar{G}(t)\) at 0, where the age equals \(t\)),
$$E[A_D(t)] = t\bar{G}(t) + \int_0^t (t-y)\bar{F}(t-y)dm_D(y),$$
since the age at \(t\) equals \(t - y\) when the last renewal by time \(t\) occurred at \(y\). The first term tends to 0 by assumption. The function \(x\bar{F}(x)\) is directly Riemann integrable because \(\int x^2 dF(x) < \infty\), so the key renewal theorem for delayed processes gives
$$\int_0^t (t-y)\bar{F}(t-y)dm_D(y) \to \frac{1}{\mu}\int_0^{\infty}x\bar{F}(x)dx = \frac{\int_0^{\infty} x^2dF(x)}{2\int_0^{\infty}xdF(x)},$$
using \(\int_0^{\infty}x\bar{F}(x)dx = E[X^2]/2\).
(c) Since \(G\) has finite mean, \(\int_0^{\infty}xdG(x) < \infty\), and
$$0 \leq t\bar{G}(t) = t\int_t^{\infty}dG(x) \leq \int_t^{\infty}xdG(x) \to 0 \quad \text{as } t \to \infty.$$


3.26 Prove Blackwell’s theorem for renewal reward processes. That is, assuming that the cycle distribution is not lattice, show that, as \(t \to \infty\),
$$E[\text{reward in } (t, t+a)] \to a\frac{E[\text{reward in cycle}]}{E[\text{time of cycle}]}$$
Assume that any relevant function is directly Riemann integrable.


Assume for simplicity that the reward \(R_n\) of the \(n\)th cycle is earned at the end of the cycle. Since \(N(t) + 1\) is a stopping time, Wald's equation gives
$$E[R(t)] = E\Big[\sum_{n=1}^{N(t)+1}R_n\Big] - E[R_{N(t)+1}] = E[R]\big(m(t)+1\big) - E[R_{N(t)+1}],$$
so
$$E[\text{reward in } (t, t+a)] = E[R]\big(m(t+a) - m(t)\big) + E[R_{N(t)+1}] - E[R_{N(t+a)+1}].$$
By Blackwell's theorem the first term converges to \(aE[R]/E[X]\). Conditioning on \(S_{N(t)}\) shows that \(E[R_{N(t)+1}]\) satisfies a renewal equation whose kernel is directly Riemann integrable by assumption, so by the key renewal theorem it converges to a finite limit; the last two terms therefore cancel as \(t \to \infty\). Hence
$$E[\text{reward in } (t, t+a)] \to a\frac{E[\text{reward in cycle}]}{E[\text{time of cycle}]}$$


3.28 In Example 3.6(C) suppose that the renewal process of arrivals is a Poisson process with mean \(\mu\). Let \(N^*\) denote that value of \(N\) that minimizes the long-run average cost if a train leaves whenever there are \(N\) travelers waiting. Another type of policy is to have a train depart every \(T\) time units. Compute the long-run average cost under this policy and let \(T^*\) denote the value of \(T\) that minimizes it. Show that the policy that departs whenever \(N^*\) travelers are waiting leads to a smaller average cost than the one that departs every \(T^*\) time units.


From Example 3.6(C), the long-run average cost under the \(N\)-policy is
$$E_1(N) = \frac{c(N-1)}{2} + \frac{K\mu}{N}$$
Treating \(N\) as continuous and setting the derivative to zero gives \(N^* = \sqrt{2K\mu/c}\), with minimum value \(E_1^* = \sqrt{2Kc\mu} - c/2\).
Under the \(T\)-policy, each cycle has length \(T\) and dispatch cost \(K\), and the expected waiting cost per cycle is \(c\mu T^2/2\) (travelers arrive at rate \(\mu\), and one arriving at time \(s\) waits \(T - s\)), so
$$E_2(T) = \frac{c\mu T}{2} + \frac{K}{T}$$
Setting the derivative to zero gives \(T^* = \sqrt{2K/(c\mu)}\), with minimum value \(E_2^* = \sqrt{2Kc\mu}\).
Since \(c > 0\), \(E_1^* = E_2^* - c/2 < E_2^*\), so the \(N^*\)-policy has the smaller average cost.
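A numeric check of the two optima (with \(N\) restricted to integers, so the first comparison is only approximate); `c`, `K`, `mu` are arbitrary positive values.

```python
import numpy as np

c, K, mu = 2.0, 5.0, 3.0
N = np.arange(1, 1000)
E1 = c * (N - 1) / 2 + K * mu / N            # N-policy average cost
T = np.linspace(0.001, 20, 200_000)
E2 = c * mu * T / 2 + K / T                  # T-policy average cost
print(E1.min(), np.sqrt(2 * K * c * mu) - c / 2)   # close; N is integer-valued
print(E2.min(), np.sqrt(2 * K * c * mu))
```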


3.29 The life of a car is a random variable with distribution \(F\). An individual has a policy of trading in his car either when it fails or reaches the age of \(A\). Let \(R(A)\) denote the resale value of an A-year-old car. There is no resale value of a failed car. Let \(C_1\) denote the cost of a new car and suppose that an additional cost \(C_2\) is incurred whenever the car fails.
(a) Say that a cycle begins each time a new car is purchased. Compute the long-run average cost per unit time.
(b) Say that a cycle begins each time a car in use fails. Compute the long-run average cost per unit time.
Note. In both (a) and (b) you are expected to compute the ratio of the expected cost incurred in a cycle to the expected time of a cycle. The answer should, of course, be the same in both parts.


(a) $$E[\text{time per cycle}] = \int_0^A xdF(x) + A(1 - F(A)) \\
E[\text{cost per cycle}] = C_1 - \bar{F}(A)R(A) + F(A)C_2\\
\lim_{t \to \infty} \frac{E[\text{total cost}]}{t} = \frac{C_1 - \bar{F}(A)R(A) + F(A)C_2 }{\int_0^A xdF(x) + A(1 - F(A)) }$$
(b) Each car fails before age \(A\) with probability \(F(A)\), so the number of cars \(N\) in a cycle is geometric with parameter \(F(A)\) and \(E[N] = 1/F(A)\). The \(N - 1\) cars that reach age \(A\) contribute \(A\) each, and the failed car contributes \(\int_0^A x dF(x)/F(A)\):
$$\begin{align}
E[\text{time per cycle}] &= E[(N-1)A] + \int_0^A xd\frac{F(x)}{F(A)}\\
&= \frac{A\bar{F}(A)}{F(A)} + \int_0^A xd\frac{F(x)}{F(A)} \\
E[\text{cost per cycle}] &= E[NC_1 - (N-1)R(A) + C_2]\\
&= \frac{C_1 - \bar{F}(A)R(A)}{F(A)} + C_2\end{align}$$
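A numeric sanity check that the ratios in (a) and (b) coincide, assuming (arbitrarily) an exponential life distribution; all parameter values are made up.

```python
import numpy as np

C1, C2, A, rate, R_A = 10.0, 3.0, 2.0, 0.7, 4.0
F_A = 1 - np.exp(-rate * A)
Fbar_A = 1 - F_A
mean_trunc = (1 - np.exp(-rate * A) * (1 + rate * A)) / rate   # int_0^A x dF(x)
ratio_a = (C1 - Fbar_A * R_A + F_A * C2) / (mean_trunc + A * Fbar_A)
ratio_b = ((C1 - Fbar_A * R_A) / F_A + C2) / ((A * Fbar_A + mean_trunc) / F_A)
print(ratio_a, ratio_b)   # the two long-run average costs agree
```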


3.30 Suppose in Example 3.3(A) that a coin’s probability of landing heads is a beta random variable with parameters \(n\) and \(m\), that is, the probability density is
$$f(p) = Cp^{n-1}(1 - p)^{m-1} \quad 0 \leq p \leq 1.$$
Consider the policy that flips each newly chosen coin until \(m\) consecutive flips land tails, then discards that coin and does the same with a new one. For this policy show that, with probability 1, the long-run proportion of flips that land heads is 1.


As in Example 3.3(A), for a coin with head-probability \(p\) the expected number of flips until \(m\) consecutive tails is \(\sum_{i=1}^m (1-p)^{-i} = \frac{(1-p)^{-m} - 1}{p}\). Averaging over the beta density,
$$\begin{align}
&E[\text{number of flips per coin}] \\
= &\int_0^1 \frac{(1-p)^{-m} - 1}{p}\, Cp^{n-1}(1-p)^{m-1} dp \\
= &C\int_0^1 \Big(\frac{p^{n-2}}{1-p} - p^{n-2}(1-p)^{m-1}\Big) dp = \infty,
\end{align}$$
since the second term integrates to a finite constant, while on \([1/2, 1)\) the first integrand is bounded below by a positive multiple of \((1-p)^{-1}\), whose integral diverges. On the other hand, by Wald's equation the expected number of tails per coin is \((1-p)E[N]\) given \(p\), and
$$\int_0^1 (1-p)\frac{(1-p)^{-m}-1}{p}\,Cp^{n-1}(1-p)^{m-1}dp = C\int_0^1 p^{n-2}\big(1 - (1-p)^m\big)dp < \infty.$$
Taking a cycle to be the flips of one coin, the renewal reward theorem gives, with probability 1, that the long-run proportion of flips landing tails is \(E[\text{tails per coin}]/E[\text{flips per coin}] = 0\); hence the long-run proportion of heads is 1.


3.31 A system consisting of four components is said to work whenever both at least one of components 1 and 2 work and at least one of components 3 and 4 work. Suppose that component \(i\) alternates between working and being failed in accordance with a nonlattice alternating renewal process with distributions \(F_i\) and \(G_i, i = 1, 2, 3, 4\). If these alternating renewal processes are independent, find \(\lim_{t \to \infty} P\{\text{system is working at time } t\}\).


Let \(\mu_{F_i}, \mu_{G_i}\) denote the expected working and failed times of the \(i\)th component. As \(t \to \infty\), the probability that at least one of components 1 and 2 works is
$$P_{12} = 1 - \frac{\mu_{G_1}\mu_{G_2}}{(\mu_{G_1} + \mu_{F_1})(\mu_{G_2} + \mu_{F_2})}$$
Similarly,$$P_{34} = 1 - \frac{\mu_{G_3}\mu_{G_4}}{(\mu_{G_3} + \mu_{F_3})(\mu_{G_4} + \mu_{F_4})}$$
By independence of the four processes,
$$\lim_{t \to \infty} P\{\text{system is working at time } t\} = P_{12}P_{34} $$


3.32 Consider a single-server queueing system having Poisson arrivals at rate \(\lambda\) and service distribution \(G\) with mean \(\mu_G\). Suppose that \(\lambda\mu_G < 1\).
(a) Find \(P_0\), the proportion of time the system is empty.
(b) Say the system is busy whenever it is nonempty (and so the server is busy). Compute the expected length of a busy period.
(c) Use part (b) and Wald’s equation to compute the expected number of customers served in a busy period.


(a) \(P_0 = 1 - \lambda\mu_G\)
(b) \(E[B] = \frac{\mu_G}{1 - \lambda\mu_G}\)
(c) \(E[C] = \frac{1}{1 - \lambda\mu_G}\)
For details, see "Introduction to Probability Models" by Sheldon M. Ross, 10th edition, Section 8.5.
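A sketch of an M/G/1 simulation checking (a)-(c), assuming (arbitrarily) exponential service; it uses the FIFO departure recursion \(D_i = \max(A_i, D_{i-1}) + S_i\), and the idle-fraction estimate ignores the negligible tail past the last arrival.

```python
import numpy as np

rng = np.random.default_rng(9)
lam, mu_G, n = 1.0, 0.5, 500_000               # lam * mu_G = 0.5 < 1
arr = np.cumsum(rng.exponential(1 / lam, n))   # Poisson arrival times
svc = rng.exponential(mu_G, n)                 # exponential service times (a choice of G)
periods, start, d = [], arr[0], arr[0] + svc[0]
for a, s in zip(arr[1:], svc[1:]):
    if a >= d:                 # server idle at this arrival: busy period ends at d
        periods.append(d - start)
        start, d = a, a + s
    else:                      # arrival joins the queue; departure recursion
        d += s
periods.append(d - start)
periods = np.array(periods)
print(1 - periods.sum() / arr[-1], 1 - lam * mu_G)        # (a) P_0
print(periods.mean(), mu_G / (1 - lam * mu_G))            # (b) E[busy period]
print(n / len(periods), 1 / (1 - lam * mu_G))             # (c) customers per busy period
```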


3 Replies to “Solutions to Stochastic Processes Ch.3”

  1. By YT,Su:
    For the second part of Problem 3.3, when computing the exact answer, I believe it should be \(P = (1 + \lambda \min\{t, x\})e^{-\lambda x}\).

  2. In 3.29 (b), the numerator of the fraction in the last formula should have \(C_1\) minus, not plus. A small slip; it doesn't affect much.

  3. I feel these solutions all have problems: the author keeps assuming that \(X_{N(t)+1}\) has the same distribution as \(X\), but their distributions are different.
