Solutions to Stochastic Processes Ch.2

Solutions to Stochastic Processes, Sheldon M. Ross, Second Edition (pdf)

Since there is no official solution manual for this classic textbook, I worked out these solutions myself while studying. Some solutions draw on material found on the web; the sources are many and often unclear, so they cannot be credited individually. Many thanks to those authors! I hope these solutions are helpful, but no correctness or accuracy is guaranteed. Comments and corrections are welcome. Excerpts and links may be used, provided that full and clear credit is given.

2.1 Show that Definition 2.1.1 of a Poisson process implies Definition 2.1.2.


$$\begin{align}
&\ \forall t_1 < t_2,\ s > 0 \\
&P\{N(t_2+s) - N(t_1+s) = n\}\\
= &P\{N(t_2) - N(t_1) = n\} = e^{-\lambda(t_2-t_1)}\frac{\lambda^n(t_2-t_1)^n}{n!}
\end{align}$$
Thus, the process has stationary increments.
$$P\{N(h) = 1\} = \lambda h e^{-\lambda h} = \lambda h + \lambda h(e^{-\lambda h} - 1)\\
\lim_{h \to 0} \frac{\lambda h(e^{-\lambda h} - 1) }{h} = 0$$
Thus, \( P\{N(h) = 1\} = \lambda h + o(h)\).
$$P\{N(h) \geq 2\} = 1 - P\{N(h) = 1\} - P\{N(h) = 0\} = 1 - \lambda he^{-\lambda h} - e^{-\lambda h} = o(h) $$
Thus, \( P\{N(h) \geq 2\} = o(h)\).
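As a quick numerical sanity check of the two \(o(h)\) claims (not part of the proof), the following Python snippet, with an arbitrary test value \(\lambda = 2\), shows both remainder terms vanishing faster than \(h\):

```python
# Check that P{N(h)=1} - lam*h and P{N(h)>=2} are both o(h).
import numpy as np

lam = 2.0
for h in [1e-1, 1e-2, 1e-3, 1e-4]:
    p1 = lam * h * np.exp(-lam * h)        # P{N(h) = 1}
    p2 = 1 - np.exp(-lam * h) - p1         # P{N(h) >= 2}
    print(f"h={h:.0e}  (p1 - lam*h)/h = {(p1 - lam * h) / h:+.2e}  p2/h = {p2 / h:.2e}")
```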


2.2 For another approach to proving that Definition 2.1.2 implies Definition 2.1.1:
(a) Prove, using Definition 2.1.2, that
$$P_0(t + s) = P_0(t)P_0(s)$$
(b) Use (a) to infer that the interarrival times \(X_1, X_2, \dots\) are independent exponential random variables with rate \(\lambda\).
(c) Use (b) to show that \(N(t)\) is Poisson distributed with mean \(\lambda t\).


(a)$$P_0(t+s) = P\{N(t) = 0,\ N(t+s) - N(t) = 0\} = P_0(t)P_0(s) $$ using independent and stationary increments.
(b)$$\begin{align}
1 - F_{X_1}(t) &= P_0(t) = \lim_{\Delta t \to 0} P_0(\Delta t)^{t/\Delta t} \\
&= \lim_{\Delta t \to 0} (1 - \lambda\Delta t + o(\Delta t))^{t/\Delta t} \\
&= e^{-\lambda t}\\
F_{X_1}(t) &= 1 - e^{-\lambda t}\\
\end{align}$$
(Here \(P_0(t) = P_0(\Delta t)^{t/\Delta t}\) follows from (a).) By independent and stationary increments, the process probabilistically restarts after each arrival, so the subsequent interarrival times are independent of \(X_1\) and have the same distribution; hence the \(X_i\) are i.i.d. Exponential(\(\lambda\)).
(c) From Problem 1.29 we know that \(\sum_1^n X_i \sim \Gamma(n, \lambda)\) when the \(X_i\) are i.i.d. \(\text{Exponential}(\lambda)\) random variables. Let \(f_n(x)\) denote the density of \(\sum_1^n X_i\). Thus,
$$\begin{align}
P\{N(t) = n\} &= P\{\sum_1^n X_i < t, \sum_1^{n+1} X_i > t\} \\
&= \int_0^t f_n(x)e^{-\lambda(t-x)} dx \\
&= e^{-\lambda t}\frac{(\lambda t)^n}{n!}
\end{align}$$
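As a sanity check of (b) and (c) together, one can build arrival times from i.i.d. exponential gaps and compare the empirical distribution of \(N(t)\) with the Poisson pmf. A minimal NumPy sketch (the values of \(\lambda\) and \(t\) are arbitrary test choices):

```python
# Arrivals from i.i.d. Exponential(lam) gaps should give N(t) ~ Poisson(lam*t).
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(0)
lam, t, reps = 1.5, 3.0, 200_000

gaps = rng.exponential(1 / lam, size=(reps, 30))  # 30 gaps suffice: lam*t = 4.5
counts = (np.cumsum(gaps, axis=1) < t).sum(axis=1)

for n in range(8):
    theory = exp(-lam * t) * (lam * t) ** n / factorial(n)
    print(f"P{{N(t)={n}}}: simulated {np.mean(counts == n):.4f}, Poisson {theory:.4f}")
```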


2.3 For a Poisson process show, for \(s < t\), that
$$P\{N(s) = k | N(t) = n\} = {n \choose k}(\frac{s}{t})^k(1 - \frac{s}{t})^{n-k}, \quad k=0, 1, \dots, n.$$


$$\begin{align}
P\{N(s)=k|N(t)=n\} &= \frac{ P\{N(s)=k,\ N(t)-N(s)=n-k\} }{ P\{N(t)=n\} } \\
&= \frac{ P\{N(s)=k\}P\{N(t)-N(s)=n-k\} }{ P\{N(t)=n\} } \\
&= \frac{ e^{-\lambda s}\frac{(\lambda s)^k}{k!}\, e^{-\lambda (t-s)}\frac{[\lambda (t-s)]^{n-k}}{(n-k)!} }{ e^{-\lambda t}\frac{(\lambda t)^n}{n!} } \\
&= {n \choose k}(\frac{s}{t})^k(1 - \frac{s}{t})^{n-k}, \quad k=0, 1, \dots, n.
\end{align}$$
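The binomial answer is easy to check by simulation: generate \(N(s)\) and the independent increment \(N(t)-N(s)\), condition on the total, and compare. A small sketch with arbitrary parameter values:

```python
# Conditional on N(t)=n, N(s) should be Binomial(n, s/t).
import numpy as np
from math import comb

rng = np.random.default_rng(1)
lam, s, t, reps = 2.0, 1.0, 4.0, 300_000

Ns = rng.poisson(lam * s, reps)             # N(s)
Nt = Ns + rng.poisson(lam * (t - s), reps)  # N(t), via an independent increment
n = 8
sel = Ns[Nt == n]
for k in range(n + 1):
    theory = comb(n, k) * (s / t) ** k * (1 - s / t) ** (n - k)
    print(f"k={k}: simulated {np.mean(sel == k):.4f}, binomial {theory:.4f}")
```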


2.4 Let \(\{N(t), t \geq 0\}\) be a Poisson process with rate \(\lambda\). Calculate \(E[N(t)N(t+s)]\).


$$\begin{align}
E[N(t)N(t+s)] &= E[N(t)\{N(t) + [N(t+s) - N(t)]\}] \\
&= E[N^2(t)] + E[N(t)]E[N(t+s) - N(t)] \\
&= \sum_{n=0}^{\infty} n^2 e^{-\lambda t}\frac{(\lambda t)^n}{n!} + \lambda t \cdot \lambda s \\
&= \lambda t[\sum_{n=1}^{\infty} (n-1 + 1) e^{-\lambda t}\frac{(\lambda t)^{n-1}}{(n-1)!}] + \lambda^2ts \\
&= \lambda^2t(t+s) + \lambda t \\
\end{align}$$
The second line uses independent increments: \(N(t+s) - N(t)\) is independent of \(N(t)\) and, by stationarity, distributed as \(N(s)\).
The mixed second moment \(E[N(s)N(t)]\) is called the autocorrelation function of the stochastic process. For a Poisson process with rate \(\lambda\) it is
$$E[N(s)N(t)] = \lambda^2 st + \lambda \min\{s, t\}, \quad s,t\geq 0$$
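A quick Monte Carlo check of this formula, exploiting the same increment decomposition (\(\lambda, t, s\) are arbitrary test values):

```python
# E[N(t)N(t+s)] should equal lam^2*t*(t+s) + lam*t.
import numpy as np

rng = np.random.default_rng(2)
lam, t, s, reps = 1.2, 2.0, 1.0, 1_000_000

Nt = rng.poisson(lam * t, reps)      # N(t)
inc = rng.poisson(lam * s, reps)     # N(t+s) - N(t), independent of N(t)
print(np.mean(Nt * (Nt + inc)), " vs ", lam**2 * t * (t + s) + lam * t)
```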


2.5 Suppose that \(\{N_1(t), t \geq 0\}\) and \(\{N_2(t), t \geq 0\}\) are independent Poisson processes with rates \(\lambda_1\) and \(\lambda_2\). Show that \(\{N_1(t) + N_2(t), t \geq 0\}\) is a Poisson process with rate \(\lambda_1 + \lambda_2\). Also, show that the probability that the first event of the combined process comes from \(\{N_1(t), t \geq 0\}\) is \(\lambda_1 /(\lambda_1 + \lambda_2)\), independently of the time of the event.


Let \(N(t) = N_1(t) + N_2(t)\). The independent and stationary increments of \(N\) follow from those of \(N_1\) and \(N_2\) together with their independence, and \(N(t)\) is Poisson distributed:
$$\begin{align}
&P\{N(t) = n\}\\= &\sum_{i=0}^n P\{N_1(t) = i\}P\{N_2(t) = n- i\} \\
=&\sum_{i=0}^n e^{-\lambda_1 t}\frac{(\lambda_1 t)^i}{i!} e^{-\lambda_2 t}\frac{(\lambda_2 t)^{n-i}}{(n-i)!} \\
=&e^{-(\lambda_1 + \lambda_2)t}\frac{(\lambda_1+\lambda_2)^nt^n}{n!}
\sum_{i=0}^n {n \choose i} (\frac{\lambda_1}{\lambda_1 + \lambda_2})^i(\frac{\lambda_2}{\lambda_1 + \lambda_2} )^{n-i} \\
=&e^{-(\lambda_1 + \lambda_2)t}\frac{(\lambda_1+\lambda_2)^nt^n}{n!}
\end{align}$$
Let \(X_i\) denote the waiting time until the first event of process \(i\); then \(X_i \sim \text{Exponential}(\lambda_i),\ i=1,2\), independently. The probability that the first event comes from process 1 is
$$P\{X_1 < X_2\} = \int_0^{\infty} \lambda_1e^{-\lambda_1 t} e^{-\lambda_2 t} dt = \frac{\lambda_1}{\lambda_1 + \lambda_2}$$
By the memoryless property the same computation applies after any elapsed waiting time, so the answer does not depend on the time of the event.
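Both conclusions are easy to verify numerically; here is a minimal sketch with arbitrary rates:

```python
# Superposition: N1(t)+N2(t) has mean (lam1+lam2)*t, and the first event
# comes from process 1 with probability lam1/(lam1+lam2).
import numpy as np

rng = np.random.default_rng(3)
lam1, lam2, t, reps = 1.0, 3.0, 2.0, 300_000

total = rng.poisson(lam1 * t, reps) + rng.poisson(lam2 * t, reps)
print("mean of N1(t)+N2(t):", total.mean(), " vs ", (lam1 + lam2) * t)

x1 = rng.exponential(1 / lam1, reps)  # first arrival time of process 1
x2 = rng.exponential(1 / lam2, reps)  # first arrival time of process 2
print("P{first event from 1}:", np.mean(x1 < x2), " vs ", lam1 / (lam1 + lam2))
```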


2.10 Buses arrive at a certain stop according to a Poisson process with rate \(\lambda\). If you take the bus from that stop then it takes a time \(R\), measured from the time at which you enter the bus, to arrive home. If you walk from the bus stop then it takes a time \(W\) to arrive home. Suppose that your policy when arriving at the bus stop is to wait up to a time s, and if a bus has not yet arrived by that time then you walk home.
(a) Compute the expected time from when you arrive at the bus stop until you reach home.
(b) Show that if \(W < 1/\lambda + R\) then the expected time of part (a) is minimized by letting \(s=0\); if \(W > 1/\lambda + R\) then it is minimized by letting \(s=\infty\) (that is, you continue to wait for the bus), and when \(W = 1/\lambda + R\) all values of \(s\) give the same expected time.
(c) Give an intuitive explanation of why we need only consider the case \(s=0\) and \(s=\infty\) when minimizing the expected time.


(a) $$\begin{align}
E &= (s+W)P\{N(s) = 0\} + \int_0^s (t+R)\lambda e^{-\lambda t} dt \\
&= (W-R-\frac{1}{\lambda})e^{-\lambda s} + R + \frac{1}{\lambda}
\end{align}$$
(b) When \(W < 1/\lambda + R\), the coefficient \(W - R - 1/\lambda\) is negative, so the expectation increases monotonically in \(s\); it is minimized at the smallest value \(s = 0\).
When \(W > 1/\lambda + R\), the expectation decreases monotonically in \(s\), so it is minimized by letting \(s = \infty\).
When \(W = 1/\lambda + R\), the expectation equals \(R + \frac{1}{\lambda}\) for every \(s\).
(c) Because the interarrival times of the buses are exponentially distributed, hence memoryless: after having waited \(s\), the remaining time until the next bus looks exactly as it did when you first arrived at the stop. So if you would leave after waiting \(s\), you should leave at the very beginning; and if you would not leave at the beginning, you should keep waiting for the bus.
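The closed form in (a) and the monotonicity claims in (b) can be checked by simulating the policy directly (a rough sketch; \(\lambda, R, W, s\) are arbitrary test values):

```python
# Simulate the "wait up to s, then walk" policy and compare with the formula.
import numpy as np

rng = np.random.default_rng(4)
lam, R, W, s, reps = 1.0, 0.5, 2.0, 1.0, 500_000

x = rng.exponential(1 / lam, reps)          # first bus arrival time
time_home = np.where(x > s, s + W, x + R)   # walk after s, or ride the bus
formula = (W - R - 1 / lam) * np.exp(-lam * s) + R + 1 / lam
print(time_home.mean(), " vs ", formula)
```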


2.11 Cars pass a certain street location according to a Poisson process with rate \(\lambda\). A person wanting to cross the street at that location waits until she can see that no cars will come by in the next \(T\) time units. Find the expected time that the person waits before starting to cross. (Note, for instance, that if no cars will be passing in the first \(T\) time units then the waiting time is 0).


Condition on the first arrival time \(X\); the expected total waiting time satisfies
$$\begin{align}
E[W] &= E[E[W|X]] \\
&= \int_0^T (E[W] + x)\lambda e^{-\lambda x} dx\\
&= E[W] + \frac{1}{\lambda} -e^{-\lambda T}(T + E[W] + \frac{1}{\lambda}) \\
E[W] &= \frac{e^{\lambda T} -1}{\lambda} - T
\end{align}$$
In the second line we use the memoryless property: if the first car arrives at time \(x \leq T\), the person cannot cross before it passes, and from time \(x\) on the problem restarts, so \(E[W|X=x] = x + E[W]\); if \(x > T\), the wait is 0.
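A direct simulation of the pedestrian's wait agrees with this formula (minimal sketch; \(\lambda\) and \(T\) are arbitrary test values):

```python
# The person crosses at the first moment from which no car arrives for T units.
import numpy as np

rng = np.random.default_rng(5)
lam, T, reps = 1.0, 1.5, 100_000

total = 0.0
for _ in range(reps):
    w = 0.0
    while True:
        gap = rng.exponential(1 / lam)
        if gap > T:      # no car within the next T time units: start crossing
            break
        w += gap         # wait for that car to pass, then reassess
    total += w
print(total / reps, " vs ", (np.exp(lam * T) - 1) / lam - T)
```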


2.12 Events, occurring according to a Poisson process with rate \(\lambda\), are registered by a counter. However, each time an event is registered the counter becomes inoperative for the next \(b\) units of time and does not register any new events that might occur during that interval. Let \(R(t)\) denote the number of events that occur by time \(t\) and are registered.
(a) Find the probability that the first \(k\) events are all registered.
(b) For \(t \geq (n-1)b\), find \(P\{R(t) \geq n\}\).


(a) Since the interarrival times are i.i.d. Exponential(\(\lambda\)), the first \(k\) events are all registered exactly when each of the \(k-1\) interarrival times between them exceeds \(b\); the probability is \(e^{-\lambda b(k-1)}\).
(b) By the memoryless property of the exponential distribution, the time of the \(n\)th registered event is distributed as \((n-1)b\) plus a sum of \(n\) i.i.d. Exponential(\(\lambda\)) variables. Hence \(R(t) \geq n\) if and only if \(N(t-(n-1)b) \geq n\), and
$$P\{R(t) \geq n\} = P\{N(t-(n-1)b) \geq n\} = \sum_{k=n}^{\infty} e^{-\mu}\frac{\mu^k}{k!}$$
where \(\mu = \lambda(t - (n-1)b) \).
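A simulation of the counter supports (b); the modeling assumption here (as in the problem) is that events arriving while the counter is inoperative are simply lost and do not extend the dead time. Parameter values are arbitrary:

```python
# Registered events with dead time b; compare P{R(t) >= n} with the Poisson tail.
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(6)
lam, b, t, n, reps = 2.0, 0.3, 3.0, 4, 100_000

hits = 0
for _ in range(reps):
    clock, registered, blocked_until = 0.0, 0, 0.0
    while True:
        clock += rng.exponential(1 / lam)   # next Poisson event
        if clock > t:
            break
        if clock >= blocked_until:          # counter operative: register it
            registered += 1
            blocked_until = clock + b
    hits += registered >= n

mu = lam * (t - (n - 1) * b)
tail = 1 - sum(exp(-mu) * mu**k / factorial(k) for k in range(n))
print(hits / reps, " vs ", tail)
```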


2.13 Suppose that shocks occur according to a Poisson process with rate \(\lambda\), and suppose that each shock, independently, causes the system to fail with probability \(p\). Let \(N\) denote the number of shocks that it takes for the system to fail and let \(T\) denote the time of failure. Find \(P\{N=n|T=t\}\).


The process can be decomposed into two independent Poisson processes: the failure shocks \(N_1(t)\), with rate \(\lambda p\), and the non-failure shocks \(N_2(t)\), with rate \(\lambda (1-p)\). Given \(T = t\), the shock at time \(t\) caused the failure, and \(N - 1\) equals the number of non-failure shocks in \([0, t)\). Then,
$$P\{N=n|T=t\} = P\{N_2(t) = n-1\} = e^{-\lambda(1-p)t}\frac{(\lambda(1-p)t )^{n-1}}{(n-1)!} $$


2.16 The number of trials to be performed is a Poisson random variable with mean \(\lambda\). Each trial has \(n\) possible outcomes, outcome \(i\) occurring with probability \(P_i, \sum_1^n P_i = 1\). Let \(X_j\) denote the number of outcomes that occur exactly \(j\) times, \(j=0, 1, \dots\). Compute \(E[X_j]\) and \(Var(X_j)\).


Let \(Y_i\) denote the number of occurrences of the \(i\)th outcome. From Proposition 2.3.2 we have \(Y_i \sim \pi(\lambda P_i)\), and the \(Y_i\) are independent. Thus \(X_j = \sum_{i=1}^n \mathbb{1}\{Y_i = j\}\) is a sum of \(n\) independent Bernoulli (0-1) random variables, so
$$ E[X_j] = \sum_{i=1}^n P\{Y_i = j\} = \sum_{i=1}^n e^{-\lambda P_i} \frac{(\lambda P_i)^j}{j!} $$
$$Var(X_j) = \sum_{i=1}^n P\{Y_i = j\}(1 – P\{Y_i = j\} ) $$


2.17 Let \(X_1, X_2, \dots, X_n\) be independent continuous random variables with common density function \(f\). Let \(X_{(i)}\) denote the ith smallest of \(X_1, \dots, X_n\).
(a) Note that in order for \(X_{(i)}\) to equal \(x\), exactly \(i-1\) of the \(X\)’s must be less than \(x\), one must equal \(x\), and the other \(n-i\) must all be greater than \(x\). Using this fact argue that the density function of \(X_{(i)}\) is given by
$$f_{X_{(i)}}(x) = \frac{n!}{(i-1)!(n-i)!}(F(x))^{i-1}(\bar{F}(x))^{n-i}f(x).$$
(b) \(X_{(i)}\) will be less than \(x\) if, and only if, how many of the \(X\)’s are less than \(x\)?
(c) Use (b) to obtain an expression for \(P\{X_{(i)} \leq x\}\).
(d) Using (a) and (c) establish the identity for \(0 \leq y \leq 1\)
$$\sum_{k=i}^n {n \choose k}y^k(1-y)^{n-k} = \int_0^y \frac{n!}{(i-1)!(n-i)!}x^{i-1}(1-x)^{n-i}dx.$$
(e) Let \(S_i\) denote the time of the ith event of the Poisson process \(\{N(t), t \geq 0\}\). Find$$
E[S_i|N(t)=n] = \left\{
\begin{array}{ll}
\underline{\ \ } \quad i \leq n \\
\underline{\ \ } \quad i > n \\
\end{array}
\right. \\
$$


(a) $$\begin{align}
f_{X_{(i)}}(x) &= {n \choose i} (F(x))^{i-1}(\bar{F}(x))^{n-i} {i \choose 1}f(x) \\
&= \frac{n!}{(i-1)!(n-i)!}(F(x))^{i-1}(\bar{F}(x))^{n-i}f(x)\\
\end{align}$$
(b) At least \(i\) of the \(X\)’s must be less than \(x\).
(c) $$ P\{X_{(i)} \leq x\} = \sum_{k=i}^n {n \choose k}(F(x))^k(\bar{F}(x))^{n-k} $$
(d) $$\begin{align}
P\{X_{(i)} \leq x\} &= \sum_{k=i}^n {n \choose k}(F(x))^k(\bar{F}(x))^{n-k} \\
&= \int_{-\infty}^{x} \frac{n!}{(i-1)!(n-i)!}(F(t))^{i-1}(\bar{F}(t))^{n-i}f(t)dt \\
&= \int_0^{F(x)}\frac{n!}{(i-1)!(n-i)!}u^{i-1}(1-u)^{n-i}du
\end{align}$$Substituting \(y = F(x)\), which ranges over \([0, 1]\), yields the identity.
(e) From Theorem 2.3.1 we know that, conditional on \(N(t) = n\), the distribution of \((S_1, \dots, S_n)\) is identical to that of \((U_{(1)}, \dots, U_{(n)})\), the order statistics of \(n\) i.i.d. uniform \((0, t)\) random variables.
Thus, when \(i \leq n\),
$$\begin{align}
E[S_i|N(t)=n] &= E[U_{(i)}] \\
&= \int_0^t x \frac{n!}{(i-1)!(n-i)!}(\frac{x}{t})^{i-1}(1 - \frac{x}{t})^{n-i}\frac{1}{t} dx \\
&= \frac{it}{n + 1} \int_0^1 \frac{(n+1)!}{i!(n-i)!}u^{i}(1 - u)^{n-i} du \\
&= \frac{it}{n + 1} \sum_{k=i+1}^{n+1}{n+1 \choose k} 1^k(1 - 1)^{n+1-k} = \frac{it}{n + 1}
\end{align}$$
Here the third line substitutes \(u = x/t\); the fourth applies the identity from (d) with \(n+1, i+1\) in place of \(n, i\) and \(y = 1\), where only the \(k = n+1\) term, equal to 1, survives.
When \(i > n\): by the memoryless property, the process restarts at time \(t\) regardless of when the \(n\)th event occurred, so \(S_i\) equals \(t\) plus a sum of \(i - n\) i.i.d. Exponential(\(\lambda\)) interarrival times. Thus,
$$
E[S_i|N(t)=n] = \left\{
\begin{array}{ll}
\frac{it}{n + 1} \quad i \leq n \\
t + \frac{i-n}{\lambda} \quad i > n \\
\end{array}
\right. \\
$$
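The \(i \leq n\) case is easy to check against uniform order statistics directly (a minimal sketch; \(t\) and \(n\) are arbitrary test values):

```python
# Given N(t)=n, E[S_i] should be i*t/(n+1); use uniform order statistics.
import numpy as np

rng = np.random.default_rng(7)
t, n, reps = 5.0, 6, 200_000

u = np.sort(rng.uniform(0, t, size=(reps, n)), axis=1)
for i in range(1, n + 1):
    print(f"i={i}: simulated {u[:, i - 1].mean():.4f}, formula {i * t / (n + 1):.4f}")
```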


2.18 Let \(U_{(1)}, \dots, U_{(n)}\) denote the order statistics of a set of \(n\) uniform \((0, 1)\) random variables. Show that given \(U_{(n)} = y, U_{(1)}, \dots, U_{(n-1)}\) are distributed as the order statistics of a set of \(n-1\) uniform \((0, y)\) random variables.


For \(0 \leq y_1 < \dots < y_{n-1} < y\),
$$\begin{align}
&f(y_1, \dots, y_{n-1} | U_{(n)} = y ) \\
=& \frac{f_{U_{(1)}, \dots, U_{(n)}}(y_1, \dots, y_{n-1}, y)}{f_{U_{(n)}}(y)} = \frac{n!}{ny^{n-1}} \\
=&\frac{(n-1)!}{y^{n-1}}
\end{align}$$
which is exactly the joint density of the order statistics of \(n-1\) i.i.d. uniform \((0, y)\) random variables.


2.19 Busloads of customers arrive at an infinite server queue at a Poisson rate \(\lambda\). Let \(G\) denote the service distribution. A bus contains \(j\) customers with probability \(\alpha_j, j=1, \dots\). Let \(X(t)\) denote the number of customers that have been served by time \(t\).
(a) \(E[X(t)] = ?\)
(b) Is \(X(t)\) Poisson distributed?


(a) A bus arriving at time \(s\) carries \(j\) customers with probability \(\alpha_j\), and each of its customers, independently, has completed service by time \(t\) with probability \(G(t-s)\). Given the number of buses arriving in \([0, t]\), their arrival times are i.i.d. uniform on \((0, t)\), so the expected number of served customers per bus is \(\sum_j j\alpha_j \cdot \frac{1}{t}\int_0^t G(y)dy\). Hence
$$E[X(t)] = \lambda t \cdot \sum_j j\alpha_j \cdot \frac{1}{t}\int_0^t G(y)dy = \lambda \int_0^t G(y)dy \sum_j j\alpha_j $$
(b) No. \(X(t)\) is a sum over buses of binomial contributions, so whenever \(\alpha_j > 0\) for some \(j \geq 2\) we have \(Var(X(t)) > E[X(t)]\), whereas a Poisson random variable has variance equal to its mean.


2.20 Suppose that each event of a Poisson process with rate \(\lambda\) is classified as being either type \(1, 2, \dots, k\). If the event occurs at \(s\), then, independently of all else, it is classified as type \(i\) with probability \(P_i(s), i = 1, \dots, k, \sum_1^k P_i(s) = 1\). Let \(N_i(t)\) denote the number of type \(i\) arrivals in \([0, t]\). Show that the \(N_i(t), i = 1, \dots, k\) are independent and \(N_i(t)\) is Poisson distributed with mean \(\lambda \int_0^t P_i(s)ds\).


Similar to the proof of Proposition 2.3.2: given that an event occurs in \([0, t]\), its occurrence time is uniform on \((0, t)\), so it is a type-\(i\) event with probability$$p_i = \frac{1}{t}\int_0^t P_i(s)ds$$ independently of the other events. Hence, conditioning on \(N(t)\),
$$\begin{align}
&P\{N_1(t) = n_1, \dots, N_k(t) = n_k\}\\
=& P\{N_1(t) = n_1, \dots, N_k(t) = n_k | N(t) = n\}P\{N(t) = n\} \\
=& \frac{n!}{n_1!\dots n_k!}p_1^{n_1}\dots p_k^{n_k}e^{-\lambda t}\frac{(\lambda t)^n}{n!}\\
=& \prod_{i=1}^k e^{-\lambda tp_i}\frac{(\lambda tp_i)^{n_i}}{n_i!}
\end{align}$$ where \(n = \sum_{i=1}^k n_i\). Since the joint distribution factors into Poisson marginals, the \(N_i(t)\) are independent and \(N_i(t)\) is Poisson with mean \(\lambda tp_i = \lambda \int_0^t P_i(s)ds\).


2.21 Individuals enter the system in accordance with a Poisson process having rate \(\lambda\). Each arrival independently makes its way through the states of the system. Let \(\alpha_i(s)\) denote the probability that an individual is in state \(i\) a time \(s\) after it arrived. Let \(N_i(t)\) denote the number of individuals in state \(i\) at time \(t\). Show that the \(N_i(t), i \geq 1\), are independent and \(N_i(t)\) is Poisson with mean equal to
\(\lambda E\)[amount of time an individual is in state i during its first t units in the system]


As in Problem 2.20, classify an arrival at time \(s\) as type \(i\) if the individual is in state \(i\) at time \(t\), which happens with probability \(\alpha_i(t-s)\). Hence the \(N_i(t), i \geq 1\), are independent Poisson with mean \(\lambda \int_0^t \alpha_i(t-s)ds = \lambda \int_0^t \alpha_i(u)du\), and \(\int_0^t \alpha_i(u)du\) is exactly the expected amount of time an individual is in state \(i\) during its first \(t\) units in the system.


2.23 For the model of Example 2.3(C), find
(a) \(Var[D(t)] \)
(b) \(Cov[D(t), D(t+s)]\)


(a) Given \(N(t) = n\), let \(X_i = D_ie^{-\alpha(t-S_i)}\) denote the discounted contribution of the \(i\)th shock; the \(X_i\) are i.i.d. (with the unordered \(S_i\) uniform on \((0, t)\)), and
$$E[X_i] = \frac{E[D]}{\alpha t}(1 - e^{-\alpha t}) \\
E[X_i^2] = \frac{E[D^2]}{2\alpha t}(1 - e^{-2\alpha t}) $$ Hence,
$$\begin{align}
E[D^2(t)] &= E[E[D^2(t)|N(t) = n]] \\
&= E[E[\sum_{i=1}^n D_i^2e^{-2\alpha(t-S_i)} + \sum_{i \neq j} D_iD_je^{-\alpha(2t - S_i - S_j)} | N(t) = n]] \\
&= E[N(t)]E[X_i^2] + E[N^2(t) - N(t)]E^2[X_i] \\
\end{align}$$ Since \(E[N(t)] = \lambda t,\ E[N^2(t)] = \lambda^2t^2 + \lambda t\), we have
$$Var[D(t)] = E[D^2(t)] - E^2[D(t)] = \frac{\lambda E[D^2]}{2\alpha}(1 - e^{-2\alpha t})$$
(b) Write \(D(t+s) = D(t)e^{-\alpha s} + \tilde{D}(s)\), where \(\tilde{D}(s)\), the contribution of the shocks arriving in \((t, t+s]\), is independent of \(D(t)\) and distributed as \(D(s)\). Then
$$\begin{align}
&Cov[D(t), D(t+s)]\\
= &E[D(t)D(t+s)] - E[D(t)]E[D(t+s)] \\
= &E[D^2(t)e^{-\alpha s} + \tilde{D}(s)D(t)] - E[D(t)]E[D(t)e^{-\alpha s} + \tilde{D}(s)] \\
= &e^{-\alpha s} (E[D^2(t)] - E^2[D(t)]) \\
= &\frac{\lambda E[D^2]}{2\alpha}(e^{-\alpha s} - e^{-\alpha (2t + s)})
\end{align}$$


2.25 Suppose that events occur in accordance with a Poisson process with rate \(\lambda\), and that an event occurring at time \(s\), independent of the past, contributes a random amount having distribution \(F_s, s \geq 0\). Show that \(W\), the sum of all contributions by time \(t\), is a compound Poisson random variable. That is, show \(W\) has the same distribution as \(\sum_{i=1}^N X_i\), where the \(X_i\) are independent of \(N\), a Poisson random variable. Identify the distribution of the \(X_i\) and the mean of \(N\).


Given \(N(t) = n\), the \(n\) unordered event times are i.i.d. uniform on \((0, t)\), so each contribution is an independent draw from the mixture \(F(x) = \frac{1}{t}\int_0^t F_s(x)ds\). Hence \(W\) has the distribution of \(\sum_{i=1}^N X_i\), where the \(X_i\) have distribution function \(F\) and \(N\) is Poisson with mean \(\lambda t\).


2.27 Compute the moment generating function of \(D(t)\) in Example 2.3(C).
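

The moment generating function can be obtained by conditioning on \(N(t)\); given \(N(t) = n\), the unordered shock times are i.i.d. uniform on \((0, t)\):
$$\begin{align}
E[e^{uD(t)}] &= \sum_{n=0}^{\infty} e^{-\lambda t}\frac{(\lambda t)^n}{n!}\left(E[e^{uDe^{-\alpha(t-U)}}]\right)^n \\
&= exp\{\lambda t(E[e^{uDe^{-\alpha(t-U)}}] - 1)\} \\
&= exp\{\lambda \int_0^t (E[e^{uDe^{-\alpha s}}] - 1)ds\}
\end{align}$$
where \(U\) is uniform on \((0, t)\), independent of the generic damage \(D\).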



2.28 Prove Lemma 2.3.3


Since the \(Y_i\) are i.i.d. (hence exchangeable), \(E[Y_i | Y_1+\dots +Y_n = y]\) is the same for every \(i\). Therefore
$$
E[Y_1+\dots +Y_n | Y_1+\dots +Y_n = y ] = y \\
nE[Y_1| Y_1+\dots +Y_n = y ] = y \\
E[Y_1+\dots +Y_k | Y_1+\dots +Y_n = y ] = \frac{k}{n} y \\
$$


2.29 Complete the proof that for nonhomogeneous Poisson process \(N(t+s) -N(t)\) is Poisson with mean \(m(t+s) – m(t)\).


Let \(P_n(s) = P\{N(t+s) - N(t) = n\}\). Then
$$\begin{align}
P_n(s+h) &= P_n(s)[1 - \lambda(t+s)h + o(h)] + P_{n-1}(s)[\lambda(t+s)h + o(h)] \\
P_n^{\prime}(s) &= -\lambda(t+s)P_n(s) + \lambda(t+s)P_{n-1}(s) \\
\frac{d}{ds}\left(e^{m(t+s)-m(t)}P_n(s)\right) &= e^{m(t+s)-m(t)}\lambda(t+s)P_{n-1}(s)
\end{align}$$
For \(n=0\) (with \(P_{-1} \equiv 0\)) this gives \(P_0(s) = e^{-[m(t+s)-m(t)]}\). Assume \(P_{k-1}(s) = e^{-[m(t+s)-m(t)]}\frac{[m(t+s)-m(t)]^{k-1}}{(k-1)!}\); substituting into the last equation and integrating (note \(\frac{d}{ds}[m(t+s)-m(t)] = \lambda(t+s)\) and \(P_k(0) = 0\)) gives
$$P_k(s) = e^{-[m(t+s)-m(t)]}\frac{[m(t+s)-m(t)]^{k}}{k!}$$
By induction, \(N(t+s) - N(t)\) is Poisson with mean \(m(t+s) - m(t)\).


2.30 Let \(T_1, T_2, \dots\) denote the interarrival times of events of a nonhomogeneous Poisson process having intensity function \(\lambda(t)\)
(a) Are the \(T_i\) independent?
(b) Are the \(T_i\) identically distributed?
(c) Find the distribution of \(T_1\).
(d) Find the distribution of \(T_2\).


(a) $$\begin{align}
P\{T_1 > t\} &= P\{N(t) = 0\} = e^{-m(t)} \\
P\{T_2 > t | T_1 = s\} &= P\{N(t+s) - N(s) = 0 | T_1 = s\} \\
&= e^{-[m(t+s) - m(s)]}
\end{align}$$Since this conditional probability depends on \(s\), the \(T_i\) are not independent.
(b) No. The \(T_i\) are not identically distributed in general: compare the distributions of \(T_1\) and \(T_2\) in (c) and (d), which coincide only when \(\lambda(t)\) is constant.
(c)$$F_{T_1}(t) = 1-e^{-m(t)}$$
(d) Conditioning on \(T_1\), with \(f_{T_1}(s) = \lambda(s)e^{-m(s)}\):
$$F_{T_2}(t) = 1 - \int_0^{\infty} f_{T_1}(s)P\{T_2 > t | T_1 = s\}ds = 1 - \int_0^{\infty} \lambda(s)e^{-m(t+s)}ds$$


2.31 Consider a nonhomogeneous Poisson process \(\{N(t), t \geq 0\}\), where \(\lambda(t) > 0\) for all \(t\). Let
$$N^{*}(t) = N(m^{-1}(t))$$
Show that \(\{N^{*}(t), t \geq 0\}\) is a Poisson process with rate \(\lambda = 1\).


Let \(u = m^{-1}(t)\). Since \(\lambda(t) > 0\) for all \(t\), \(m\) is strictly increasing, and so is \(m^{-1}\); thus a sequence \(0 \leq t_1 < t_2 < \dots\) in \([0, \infty)\) maps to \(0 \leq u_1 < u_2 < \dots\), where \(u_i = m^{-1}(t_i)\), and disjoint intervals map to disjoint intervals. By the independent increments of the nonhomogeneous Poisson process, \(\{N^{*}(t)\}\) has independent increments as well.
Now take \(t, h \in [0, \infty)\) and let \(u_1 = m^{-1}(t),\ u_2 = m^{-1}(t+h)\). Then \(N^*(t+h) - N^*(t) = N(u_2) - N(u_1)\) has the Poisson distribution with mean \(m(u_2) - m(u_1) = (t+h) - t = h\). Hence \(\{N^{*}(t), t \geq 0\}\) is a Poisson process with rate \(\lambda = 1\).


2.32 (a) Let \(\{N(t), t \geq 0\}\) be a nonhomogeneous Poisson process with mean value function \(m(t)\). Given \(N(t) = n\), show that the unordered set of arrival times has the same distribution as \(n\) independent and identically distributed random variables having distribution function$$
F(x) = \left\{
\begin{array}{ll}
\frac{m(x)}{m(t)} \quad x \leq t \\
1 \quad x > t \\
\end{array}
\right. $$
(b) Suppose that workers incur accidents in accordance with a nonhomogeneous Poisson process with mean value function \(m(t)\). Suppose further that each injured person is out of work for a random amount of time having distribution \(F\). Let \(X(t)\) be the number of workers who are out of work at time \(t\). Compute \(E[X(t)]\) and \(Var(X(t))\).


(a) Similarly to the proof of Theorem 2.3.1, let \(S_i\) denote the time of the \(i\)th event. For \(0 \leq t_1 < t_1 + h_1 < t_2 < \dots < t_n + h_n \leq t\),
$$\begin{align}
&P\{t_i < S_i < t_i + h_i, i = 1, \dots, n|N(t) = n\} \\
=&\frac{e^{-m(t)}\prod_{i=1}^n[m(t_i + h_i) - m(t_i)]}{e^{-m(t)}[m(t)]^n/n!} = \frac{n!\prod_{i=1}^n[m(t_i + h_i) - m(t_i)]}{[m(t)]^n}
\end{align}$$Dividing both sides by \(h_1 \cdots h_n\) and letting \(h_i \to 0\):
$$f_{S_1\dots S_n}(t_1,\dots,t_n|N(t) = n) = n!\prod_{i=1}^n[\lambda(t_i)/m(t)]$$
which is the joint density of the order statistics of \(n\) i.i.d. random variables with distribution function \(F\) above.
(b) By part (a), given \(N(t)\) the accident times are i.i.d. with distribution \(m(s)/m(t)\) on \([0, t]\), so the probability that a worker injured in \([0, t]\) is still out of work at time \(t\) is
$$p = \int_0^t \bar{F}(t-s)d\frac{m(s)}{m(t)} = \int_0^t \bar{F}(t-s)\frac{\lambda(s)}{m(t)}ds$$Given \(N(t)\), \(X(t)\) is thus Binomial\((N(t), p)\), and we have
$$\begin{align}
E[X(t)] &= E[E[X(t)|N(t)]] \\
&= m(t)\int_0^t \bar{F}(t-s)\frac{\lambda(s)}{m(t)}ds \\
&= \int_0^t \bar{F}(t-s)\lambda(s)ds \\
Var(X(t)) &= E[Var(X(t)|N(t))] + Var(E[X(t)|N(t)]) \\
&= m(t)p(1-p) + p^2Var(N(t)) \\
&= m(t)p(1-p) + p^2m(t) = m(t)p \\
&= \int_0^t \bar{F}(t-s)\lambda(s)ds
\end{align}$$


2.33 A two-dimensional Poisson process is a process of events in the plane such that (i) for any region of area \(A\), the number of events in \(A\) is Poisson distributed with mean \(\lambda A\), and (ii) the numbers of events in nonoverlapping regions are independent. Consider a fixed point, and let \(X\) denote the distance from that point to its nearest event, where distance is measured in the usual Euclidean manner. Show that
(a) \(P\{X > t\} = e^{-\lambda\pi t^2}\).
(b) \(E[X] = 1 / (2\sqrt{\lambda})\).
(c) Let \(R_i, i \geq 1\) denote the distance from an arbitrary point to the ith closest event to it. Show that, with \(R_0 = 0, \pi R_i^2 – \pi R_{i-1}^2, i \geq 1\) are independent exponential random variables, each with rate \(\lambda\).


(a) \(X > t\) exactly when the disk of radius \(t\) centered at the fixed point, of area \(A = \pi t^2\), contains no events:
$$P\{X > t\} = e^{-\lambda A}\frac{(\lambda A)^0}{0!} = e^{-\lambda\pi t^2} $$
(b) $$\begin{align}
E[X] &= \int_0^{\infty} t d(1 – e^{-\lambda\pi t^2} ) \\
&= -te^{-\lambda \pi t^2}|_0^{\infty} + \frac{1}{\sqrt{\lambda \pi}}\int_0^{\infty} e^{-x^2} dx \\
&= \frac{1}{2\sqrt{\lambda}}
\end{align} $$
(c) Let \(D_i = \pi R_i^2 - \pi R_{i-1}^2\). Given \(R_1, \dots, R_{i-1}\), the event \(\{D_i > x\}\) says that the annulus of area \(x\) just outside the disk of radius \(R_{i-1}\) contains no events, so by (i) and (ii)
$$P\{D_i > x | R_1, \dots, R_{i-1}\} = e^{-\lambda x}\frac{(\lambda x)^0}{0!} = e^{-\lambda x}$$
independently of \(R_1, \dots, R_{i-1}\). Hence the \(D_i, i \geq 1\), are independent Exponential(\(\lambda\)) random variables.
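A planar-Poisson simulation supports (c); here events are generated in a large disk so that edge effects are negligible for the first few \(R_i\) (a rough sketch with arbitrary \(\lambda\)):

```python
# The area increments pi*R_i^2 - pi*R_{i-1}^2 should average 1/lam.
import numpy as np

rng = np.random.default_rng(8)
lam, R_big, reps = 1.0, 5.0, 50_000

sums, m = np.zeros(3), 0
for _ in range(reps):
    npts = rng.poisson(lam * np.pi * R_big**2)   # points in the big disk
    if npts < 3:                                 # virtually impossible here
        continue
    r = R_big * np.sqrt(rng.uniform(size=npts))  # radii of uniform points in a disk
    r3 = np.sort(r)[:3]                          # R_1, R_2, R_3
    sums += np.diff(np.concatenate(([0.0], np.pi * r3**2)))
    m += 1
print(sums / m, " each should be near 1/lam =", 1 / lam)
```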


2.34 Repeat Problem 2.25 when the events occur according to a nonhomogeneous Poisson process with intensity function \(\lambda(t), t \geq 0\).


As shown in Problem 2.32, the distribution function of the \(X_i\) is \(F(x) = \int_0^t F_s(x)\frac{\lambda(s)}{m(t)} ds\), and the mean of \(N\) is \(m(t)\).


2.35 Let \(\{N(t), t \geq 0\}\) be a nonhomogeneous Poisson process with intensity function \(\lambda(t), t \geq 0\). However, suppose one starts observing the process at a random time \(\tau\) having distribution function \(F\). Let \(N^*(t) = N(\tau + t) – N(\tau)\) denote the number of events that occur in the first \(t\) time units of observation
(a) Does the process \(\{N^*(t), t \geq 0\}\) process independent increments?
(b) Repeat (a) when \(\{N(t), t \geq 0\}\) is a Poisson process.


(a) Not in general. Conditional on \(\tau\), increments over disjoint observation intervals are independent, but unconditionally they are dependent: the conditional mean of each increment, e.g. \(E[N^*(t)|\tau] = m(\tau+t) - m(\tau)\), varies with \(\tau\), so distinct increments are correlated through \(\tau\) whenever \(\lambda(\cdot)\) is not constant.
(b) Yes. For a homogeneous Poisson process, given \(\tau\) the increments over disjoint intervals are independent Poisson variables whose means (e.g. \(\lambda s\)) do not depend on \(\tau\); hence they are unconditionally independent as well.


2.36 Let \(C\) denote the number of customers served in an \(M/G/1\) busy period. Find
(a) \(E[C]\).
(b) \(Var(C)\).


Let \(S\) denote the initial service time of the busy period and \(N\) the number of arrivals during \(S\). Each such arrival initiates its own sub-busy period, independent of the others, so \(C = 1 + \sum_{i=1}^{N}C_i\), where the \(C_i\) are i.i.d. copies of \(C\), independent of \(N\).
(a) Taking expectations, \(E[C] = 1 + E[N]E[C] = 1 + \lambda\mu_G E[C]\), hence
$$E[C] = \frac{1}{1 - \lambda\mu_G}$$ (assuming \(\lambda\mu_G < 1\)).
(b) Conditioning on \(S\), \(E[N] = \lambda\mu_G\) and \(Var(N) = \lambda\mu_G + \lambda^2\sigma_G^2\), where \(\mu_G, \sigma_G^2\) are the mean and variance of \(G\). By the compound-sum variance formula,
$$Var(C) = E[N]Var(C) + Var(N)E^2[C] \implies Var(C) = \frac{\lambda\mu_G + \lambda^2\sigma_G^2}{(1-\lambda\mu_G)^3}$$


2.37 Let \(\{X(t), t \geq 0\}\) be a compound Poisson process with \(X(t) = \sum_{i=1}^{N(t)}X_i\), and suppose that the \(X_i\) can only assume a finite set of possible values. Argue that, for \(t\) large, the distribution of \(X(t)\) is approximately normal.


(i) A standardized Poisson variable converges to a standard normal as \(\lambda \to \infty\):
Let \(X \sim \pi(\lambda)\) and \(Y = \frac{X-\lambda}{\sqrt{\lambda}}\). Then
$$\begin{align}
M_Y &= E[exp\{t\frac{X-\lambda}{\sqrt{\lambda}} \}] \\
&= exp\{-t\sqrt{\lambda}\}exp\{\lambda(e^{t/\sqrt{\lambda}} - 1)\} \\
&= exp\{-t\sqrt{\lambda} + \lambda(\sum_{i=0}^{\infty} \frac{(t/\sqrt{\lambda})^i}{i!} - 1)\} \\
&= exp\{t^2/2 + t^3/(6\sqrt{\lambda}) + \dots\}\\
\lim_{\lambda \to \infty} M_Y &= exp\{t^2/2\}
\end{align}$$
which is the moment generating function of \(N(0, 1)\). Thus \(Y\) converges in distribution to \(N(0, 1)\); equivalently, \(X\) is approximately \(N(\lambda, \lambda)\) for large \(\lambda\).
(ii) Since the \(X_i\) assume only a finite set of values \(\alpha_1, \dots, \alpha_m\), \(X(t)\) can be decomposed as \(X(t) = \sum_j \alpha_j N_j(t)\), where the \(N_j(t)\) are independent Poisson random variables with means \(\lambda p_j t,\ p_j = P\{X_i = \alpha_j\}\). When \(t\) is large each \(N_j(t)\) is approximately normal by (i), and a linear combination of independent normal variables is normal; hence \(X(t)\) is approximately normal.


2.38 Let \(\{X(t), t \geq 0\}\) be a compound Poisson process with \(X(t) = \sum_{i=1}^{N(t)}X_i\), and suppose that \(\lambda = 1\) and \(P\{X_i = j\} = j/10, j = 1,2,3,4\). Calculate \(P\{X(4) = 20\}\).


\(X(4)\) is a compound Poisson random variable with parameter \(\lambda t = 4\). From Corollary 2.5.4 we know,
$$\begin{align}
P\{ X(4) = 20 \} &= P_{20} = \frac{1}{5} \sum_{j = 1}^{4} \frac{j^2}{10} P_{20 - j} \\
&= \dots
\end{align}$$
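Carrying out the recursion numerically is straightforward; a small sketch of the recursion used above, started from \(P_0 = e^{-\lambda t}\):

```python
# P_0 = e^{-lam*t}; P_n = (lam*t/n) * sum_j j*alpha_j*P_{n-j}.
from math import exp

lam_t = 4.0
alpha = {j: j / 10 for j in range(1, 5)}   # P{X_i = j} = j/10, j = 1,...,4

P = [exp(-lam_t)]
for n in range(1, 21):
    P.append(lam_t / n * sum(j * alpha[j] * P[n - j] for j in alpha if j <= n))
print(P[20])   # P{X(4) = 20}
```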


2.39 Compute \(Cov(X(s), X(t))\) for a compound Poisson process.


Suppose \(s \leq t\), then
$$\begin{align}
Cov(X(s), X(t)) &= Cov(X(s), X(t)-X(s)) + Cov(X(s), X(s)) \\
&= Var(X(s)) = \lambda s E[X^2]
\end{align}$$ The first covariance vanishes by independent increments. By symmetry the same argument applies when \(t \leq s\); hence
$$Cov(X(s), X(t)) = \lambda \min(s,t)E[X^2] $$
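A simulation check, using the decomposition of a compound Poisson variable by jump size from Problem 2.37 (the jump distribution \(P\{X=j\} = j/6,\ j=1,2,3\), is an arbitrary test choice):

```python
# Cov(X(s), X(t)) should equal lam*min(s,t)*E[X^2].
import numpy as np

rng = np.random.default_rng(9)
lam, s, t, reps = 2.0, 1.0, 3.0, 500_000
p = np.array([1 / 6, 2 / 6, 3 / 6])                 # jump size probabilities
EX2 = sum((j + 1) ** 2 * p[j] for j in range(3))    # E[X^2] = 6

def compound(duration):
    # in distribution: X = sum_j j * Poisson(lam * duration * p_j), independently
    return sum((j + 1) * rng.poisson(lam * duration * p[j], reps) for j in range(3))

Xs = compound(s)
Xt = Xs + compound(t - s)                           # independent increment on (s, t]
print(np.mean(Xs * Xt) - Xs.mean() * Xt.mean(), " vs ", lam * min(s, t) * EX2)
```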


2.40 Give an example of a counting process \(\{N(t), t \geq 0\}\) that is not a Poisson process but which has the property that conditional on \(N(t) = n\) the first \(n\) event times are distributed as the order statistics from a set of \(n\) independent uniform \((0,t)\) random variables.


A conditional Poisson process (one whose random rate is chosen once and then held fixed) does not have independent increments, so it is not a Poisson process. But given \(N(t) = n\), its arrival times are distributed as the order statistics of a set of \(n\) independent uniform \((0,t)\) random variables. See the solution of Problem 2.41 in the textbook for details.


6 Replies to “Solutions to Stochastic Processes Ch.2”

  1. You’re a legend! Thank you for making it possible for me to study this course 🙏🏻

  2. Comment from “Beneldor” on Zhihu:
    For 2.4, I think writing N(t+s) = N(t) + N(s) directly is not valid: equality in distribution would also require the two Poisson random variables on the right to be independent. N(t+s) should instead be decomposed as [N(t+s) - N(t)] + N(t).

  3. In step (c) of 2.2 a coefficient $\lambda$ is missing:
    $$
    P(N(t)= n ) = \int_0^t f_n(x) \lambda exp\{- \lambda(t - x) \} dx = \frac{\lambda^{n+1}}{\Gamma(n + 1)} t^n e^{-nt} = \Gamma(n+1, \lambda)
    $$

  4. Correction: 2.12(a) should be
    \begin{align*}
    &\mathbb{P} \{X_1 > b, X_2 > b, \cdots, X_{k-1} > b \} \\
    =&\prod_{i=1}^{k-1}\mathbb{P} \{X_i > b \} = \prod_{i=1}^{k-1} (1 - \mathbb{P} \{X_i \leq b \}) \\
    =& exp\{-\lambda b(k-1)\}
    \end{align*}

  5. What 2.13 presumably means is that, conditional on T = t, the two Poisson processes satisfy
    $$
    N_1(t) \sim P(\lambda pt), \qquad N_2(t) \sim P(\lambda (1-p) t)
    $$
