Solutions to Stochastic Processes Ch.8

Solutions to Stochastic Processes, Sheldon M. Ross, Second Edition (pdf)
Since there is no official solution manual for this book, I handcrafted the solutions myself. Some solutions are adapted from the web; their original authors often cannot be identified, so they cannot all be credited individually. Many thanks to those authors! I hope these solutions are helpful, but no correctness or accuracy is guaranteed. Comments are welcome. Excerpts and links may be used, provided that full and clear credit is given.

In Problems 8.1, 8.2, and 8.3, let $$\{X(t), t \geq 0\}$$ denote a Brownian motion process.

8.1 Let $$Y(t) = tX(1/t)$$.
(a) What is the distribution of $$Y(t)$$?
(b) Compute $$Cov(Y(s), Y(t))$$.
(c) Argue that $$\{Y(t), t \geq 0\}$$ is also Brownian motion.
(d) Let $$T = \inf\{t>0: X(t)=0\}$$. Using (c) present an argument that $$P\{T = 0\} = 1$$.

(a) Since $$X(1/t) \sim N(0, 1/t)$$ for $$t > 0$$, we have $$Y(t) = tX(1/t) \sim N(0, t^2 \cdot 1/t) = N(0, t)$$.
(b) \begin{align} Cov(Y(s), Y(t)) &= Cov(sX(1/s), tX(1/t)) \\ &= st\cdot Cov(X(1/s), X(1/t)) \\ &= st\cdot \min(1/s, 1/t) = \min(s, t) \end{align}
(c) Since $$\{X(t)\}$$ is a Gaussian process, so is $$\{Y(t)\}$$. Together with the mean and covariance computed in parts (a) and (b), this shows $$\{Y(t)\}$$ is a Brownian motion.
(d) Since $$\{Y(t)\}$$ is a Brownian motion, $$T_1 \equiv \sup\{t: Y(t) = 0\} = \infty$$ with probability 1 (Brownian motion has zeros at arbitrarily large times). Because $$Y(t) = tX(1/t)$$, zeros of $$Y$$ at large times correspond to zeros of $$X$$ near 0, so $$\{T = 0\} = \{T_1 = \infty\}$$. Thus $$P\{T = 0\} = 1$$.
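As a sanity check on parts (a) and (b), here is a quick Monte Carlo sketch; the test points $$s = 1, t = 2$$ and the sample size are arbitrary choices:

```python
# Monte Carlo check: for Y(t) = t X(1/t), Cov(Y(s), Y(t)) should equal min(s, t).
import numpy as np

rng = np.random.default_rng(0)
n = 400_000
s, t = 1.0, 2.0   # arbitrary test points with s < t, so 1/t < 1/s

# Build X at times 1/t < 1/s from independent Gaussian increments.
x_inv_t = rng.normal(0.0, np.sqrt(1 / t), n)                     # X(1/t) ~ N(0, 1/t)
x_inv_s = x_inv_t + rng.normal(0.0, np.sqrt(1 / s - 1 / t), n)   # X(1/s)

ys = s * x_inv_s   # Y(s) = s X(1/s)
yt = t * x_inv_t   # Y(t) = t X(1/t)

cov_est = np.cov(ys, yt)[0, 1]   # should be close to min(s, t) = 1
```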

8.2 Let $$W(t) = X(a^2t)/a$$ for $$a > 0$$. Verify that $$W(t)$$ is also Brownian motion.

$$W(0) = X(0)/a = 0$$. Non-overlapping increments of $$W(t)$$ map to non-overlapping increments of $$X(t)$$, so the increments of $$W(t)$$ are independent. Further, for $$s < t$$,
$$W(t) - W(s) = \frac{X(a^2t) - X(a^2s)}{a} \sim N\left(0, \frac{a^2(t-s)}{a^2}\right) = N(0, t-s)$$
Thus $$W(t)$$ has stationary increments with the required distribution. Therefore, $$W(t)$$ is a Brownian motion.
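A quick numerical sketch of the scaling property; the values of $$a, s, t$$ and the sample size are arbitrary:

```python
# Monte Carlo check: for W(t) = X(a^2 t)/a, the increment W(t) - W(s)
# should have mean 0 and variance t - s.
import numpy as np

rng = np.random.default_rng(1)
n = 400_000
a, s, t = 3.0, 0.5, 2.0   # arbitrary test values

x_s = rng.normal(0.0, np.sqrt(a**2 * s), n)                # X(a^2 s)
x_t = x_s + rng.normal(0.0, np.sqrt(a**2 * (t - s)), n)    # X(a^2 t)

w_incr = (x_t - x_s) / a
var_est = np.var(w_incr)   # should be close to t - s = 1.5
```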

8.5 A stochastic process $$\{X(t), t \geq 0\}$$ is said to be stationary if $$X(t_1), \dots, X(t_n)$$ has the same joint distribution as $$X(t_1+a), \dots, X(t_n +a)$$ for all $$n, a, t_1, \dots, t_n$$.
(a) Prove that a necessary and sufficient condition for a Gaussian process to be stationary is that $$Cov(X(s), X(t))$$ depends only on $$t-s, s \leq t$$, and $$E[X(t)] = c$$.
(b) Let $$\{X(t), t \geq 0\}$$ be Brownian motion and define
$$V(t) = e^{-\alpha t/2}X(\alpha e^{\alpha t})$$
Show that $$\{V(t), t \geq 0\}$$ is a stationary Gaussian process. It is called the Ornstein-Uhlenbeck process.

(a) If the Gaussian process is stationary, then for $$t > s$$, $$(X(t), X(s))$$ and $$(X(t-s), X(0))$$ have the same distribution. Thus $$E[X(s)] = E[X(0)]$$ for all $$s$$, and $$Cov(X(t), X(s)) = Cov(X(t-s), X(0))$$ for all $$t > s$$. Conversely, assume $$E[X(t)] = c$$ and $$Cov(X(t), X(s)) = h(t-s)$$. For any $$T = (t_1, \dots, t_k)$$ define the vector $$X_T \equiv (X(t_1), \dots, X(t_k))$$, and let $$\tilde{T} = (t_1-a, \dots, t_k -a)$$. If $$\{X(t)\}$$ is a Gaussian process then both $$X_T$$ and $$X_{\tilde{T}}$$ are multivariate normal, so it suffices to show that they have the same mean vector and covariance matrix. This follows directly from the constant element-wise mean $$c$$ and the equal pairwise covariances, $$Cov(X(t_i-a), X(t_j -a)) = h(t_i-t_j) = Cov(X(t_i), X(t_j))$$.
(b) Since all finite dimensional distributions of $$\{V(t)\}$$ are normal, it is a Gaussian process. Thus from part (a) it suffices to show the following:
(i) $$E[V(t)] = e^{-\alpha t/2}E[X(\alpha e^{\alpha t})] = 0$$. Thus $$E[V(t)]$$ is constant.
(ii) For $$s \leq t$$, \begin{align} Cov(V(s), V(t)) &= e^{-\alpha(t+s)/2}Cov(X(\alpha e^{\alpha s}), X(\alpha e^{\alpha t}))\\ &= e^{-\alpha(t+s)/2}\min(\alpha e^{\alpha s}, \alpha e^{\alpha t}) = e^{-\alpha(t+s)/2}\alpha e^{\alpha s} = \alpha e^{-\alpha(t-s)/2} \end{align}
which depends only on $$t-s$$.
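The covariance formula can be checked numerically; $$\alpha, s, t$$ and the sample size below are arbitrary test values:

```python
# Monte Carlo check: for V(t) = e^{-alpha t/2} X(alpha e^{alpha t}),
# Cov(V(s), V(t)) should equal alpha * e^{-alpha (t - s)/2}.
import numpy as np

rng = np.random.default_rng(2)
n = 400_000
alpha, s, t = 1.0, 0.5, 1.0   # arbitrary test values with s < t

u, v = alpha * np.exp(alpha * s), alpha * np.exp(alpha * t)   # u < v
x_u = rng.normal(0.0, np.sqrt(u), n)               # X(u)
x_v = x_u + rng.normal(0.0, np.sqrt(v - u), n)     # X(v)

vs = np.exp(-alpha * s / 2) * x_u
vt = np.exp(-alpha * t / 2) * x_v

cov_est = np.cov(vs, vt)[0, 1]
cov_true = alpha * np.exp(-alpha * (t - s) / 2)    # about 0.7788
```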

8.8 Suppose $$X(1) = B$$. Characterize, in the manner of Proposition 8.1.1, $$\{X(t), 0 \leq t \leq 1\}$$ given that $$X(1) = B$$.

Conditional on $$X(1) = B$$, $$X(t) \sim N(Bt, t(1-t))$$ for $$0 \leq t \leq 1$$. Equivalently, $$X(t) = Bt + Z(t)$$ where $$Z(t) = X(t) - tX(1)$$; $$\{Z(t), 0 \leq t \leq 1\}$$ is a Brownian bridge, i.e., a mean-zero Gaussian process with $$Cov(Z(s), Z(t)) = s(1-t)$$ for $$s \leq t$$, independent of $$X(1)$$.
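The bridge decomposition can be sketched numerically; $$t$$ and the sample size are arbitrary test values:

```python
# Monte Carlo check: Z(t) = X(t) - t X(1) should have mean 0,
# variance t(1 - t), and zero covariance with X(1).
import numpy as np

rng = np.random.default_rng(3)
n = 400_000
t = 0.3   # arbitrary test point in (0, 1)

x_t = rng.normal(0.0, np.sqrt(t), n)               # X(t)
x_1 = x_t + rng.normal(0.0, np.sqrt(1 - t), n)     # X(1)

z = x_t - t * x_1                   # bridge part of X(t)
var_est = np.var(z)                 # should be about t(1-t) = 0.21
cov_est = np.cov(z, x_1)[0, 1]      # should be about 0
```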

8.9 Let $$M(t) = \max_{0 \leq s \leq t} X(s)$$ and show that
$$P\{M(t) > a| M(t) = X(t)\} = e^{-a^2/2t}, \quad a > 0$$

From Section 8.3.1, we get
$$P\{M(t) > y, X(t) < x\} = \int_{2y-x}^{\infty}\frac{1}{\sqrt{2\pi t}}e^{-u^2/2t}du$$
By differentiating, we obtain the joint density of $$(M(t), X(t))$$ and hence, with the change of variables $$W(t) = M(t) - X(t)$$ (whose Jacobian is 1), the joint density of $$(M(t), W(t))$$:
$$f_{MW}(m, w) = \frac{2(m+w)}{t\sqrt{2\pi t}}e^{-(m+w)^2/2t}, \quad m, w \geq 0$$
Thus
$$f_W(w) = \int_0^{\infty}f_{MW}(m, w)dm = \frac{2}{\sqrt{2\pi t}}e^{-w^2/2t}$$
and, since $$\{M(t) = X(t)\} = \{W(t) = 0\}$$,
$$P\{M(t) > a | W(t) = 0\} = 1 - \int_0^a \frac{f_{MW}(m, 0)}{f_W(0)}dm = 1 - \int_0^a \frac{m}{t}e^{-m^2/2t}dm = e^{-a^2/2t}$$

8.10 Compute the density function of $$T_x$$, the time until Brownian motion hits $$x$$.

\begin{align} f_{T_x}(t) &= F_{T_x}^{\prime}(t) = \left(\frac{2}{\sqrt{2\pi}}\int_{|x|/\sqrt{t}}^{\infty}e^{-y^2/2}dy\right)^{\prime} \\ &= -\frac{2}{\sqrt{2\pi}} \cdot e^{-x^2/2t} \cdot \left(-\frac{|x|}{2}t^{-3/2}\right)\\ &= \frac{|x|}{\sqrt{2\pi}}t^{-3/2}e^{-x^2/2t} \end{align}
The two minus signs (one from differentiating with respect to the lower limit, one from $$\frac{d}{dt}|x|t^{-1/2}$$) cancel, so the density is positive, as it must be.
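As a deterministic check, integrating this density over $$(0, T]$$ should reproduce $$P\{T_x \leq T\} = 2P\{X(T) > |x|\}$$; the values $$x = 1, T = 1$$ are arbitrary:

```python
# Numerical check: integrate f(t) = |x| t^{-3/2} e^{-x^2/(2t)} / sqrt(2*pi)
# over (0, T] and compare with the reflection-principle CDF erfc(|x|/sqrt(2T)).
import math
import numpy as np

x, T = 1.0, 1.0   # arbitrary test values
ts = np.linspace(1e-8, T, 200_001)
density = (abs(x) / math.sqrt(2 * math.pi)) * ts**-1.5 * np.exp(-x**2 / (2 * ts))

# Trapezoid rule; the integrand vanishes extremely fast near t = 0.
cdf_from_density = float(np.sum((density[1:] + density[:-1]) / 2 * np.diff(ts)))
cdf_reflection = math.erfc(abs(x) / math.sqrt(2 * T))   # about 0.3173
```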

8.11 Let $$T_1$$ denote the largest zero of $$X(t)$$ that is less than $$t$$ and let $$T_2$$ be the smallest zero greater than $$t$$. Show that
(a) $$P\{T_2 < s\} = (2/\pi)\arccos\sqrt{t/s}, s> t$$.
(b) $$P\{T_1 < s, T_2 > y\} = (2/\pi)\arcsin\sqrt{s/y}, s < t< y$$.

(a) \begin{align} P\{T_2 < s\} &= 1 - P\{\text{no zeros in } (t, s)\} \\ &= 1 - \frac{2}{\pi}\arcsin\sqrt{t/s} \\ &= (2/\pi)\arccos\sqrt{t/s} \end{align}
using the identity $$\arcsin\theta + \arccos\theta = \pi/2$$.
(b) $$P\{T_1 < s, T_2 > y\} = P\{\text{no zeros in } (s, y)\} = \frac{2}{\pi}\arcsin\sqrt{s/y}$$

8.12 Verify the formulas given in (8.3.4) for the mean and variance of $$|X(t)|$$.

$$f_Z(y) = \left(\frac{2}{\sqrt{2\pi t}}\int_{-\infty}^y e^{-x^2/2t}dx - 1\right)^{\prime} = \frac{2}{\sqrt{2\pi t}}e^{-y^2/2t}, \quad y \geq 0\\ E[Z(t)] = \int_{0}^{\infty}yf_Z(y)dy = -\frac{2t}{\sqrt{2\pi t}}e^{-y^2/2t}\Big|_0^{\infty} = \sqrt{2t/\pi} \\ Var(Z(t)) = E[Z^2(t)] - E^2[Z(t)] = E[X^2(t)] - E^2[Z(t)] = t - \frac{2t}{\pi} = \left(1 - \frac{2}{\pi}\right)t$$
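A quick Monte Carlo sketch of both formulas; $$t$$ and the sample size are arbitrary:

```python
# Monte Carlo check: for Z(t) = |X(t)|, E[Z(t)] = sqrt(2t/pi)
# and Var(Z(t)) = (1 - 2/pi) t.
import math
import numpy as np

rng = np.random.default_rng(4)
n = 400_000
t = 2.0   # arbitrary test value

z = np.abs(rng.normal(0.0, math.sqrt(t), n))
mean_est, var_est = z.mean(), z.var()

mean_true = math.sqrt(2 * t / math.pi)   # about 1.1284
var_true = (1 - 2 / math.pi) * t         # about 0.7268
```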

8.13 For Brownian motion with drift coefficient $$\mu$$, show that for $$x>0$$
$$P\{\max_{0 \leq s \leq h} |X(s)| > x\} = o(h).$$

8.18 Let $$\{X(t), t \geq 0\}$$ be a Brownian motion with drift coefficient $$\mu, \mu < 0$$, which is not allowed to become negative. Find the limiting distribution of $$X(t)$$.

8.19 Consider Brownian motion with reflecting barriers of $$-B$$ and $$A, A >0, B > 0$$. Let $$p_t(x)$$ denote the density function of $$X_t$$.
(a) Compute a differential equation satisfied by $$p_t(x)$$.
(b) Obtain $$p(x) = \lim_{t \to \infty} p_t(x)$$.

8.20 Prove that, with probability 1, for Brownian motion with drift $$\mu$$.
$$\frac{X(t)}{t} \to \mu, \quad \text{ as } t \to \infty$$

8.21 Verify that if $$\{B(t), t \geq 0\}$$ is standard Brownian motion then $$\{Y(t), t \geq 0\}$$ is a martingale with mean 1, when $$Y(t) = \exp\{cB(t) - c^2t/2\}$$.

\begin{align} E[Y(t)] &= \int_{-\infty}^{\infty} \exp\{cx - c^2t/2\}\frac{1}{\sqrt{2\pi t}}\exp\{-x^2/2t\}dx\\ &= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi t}}\exp\{-(x-ct)^2/2t\}dx = 1 \end{align} \begin{align} E[Y(t)|Y(u), 0 \leq u \leq s] &= E[Y(s)\exp\{c(B(t) - B(s)) - c^2(t-s)/2\}|Y(u), 0 \leq u \leq s]\\ &= Y(s) \cdot E[\exp\{c(B(t) - B(s)) - c^2(t-s)/2\}] \\ &= Y(s) \cdot E[Y(t-s)] = Y(s) \end{align}
where the second step uses the independence of $$B(t) - B(s)$$ from the past.
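The mean-1 property is easy to verify by simulation; $$c, t$$ and the sample size are arbitrary:

```python
# Monte Carlo check: E[exp{c B(t) - c^2 t / 2}] = 1 for the exponential martingale.
import math
import numpy as np

rng = np.random.default_rng(5)
n = 400_000
c, t = 0.5, 1.0   # arbitrary test values

b_t = rng.normal(0.0, math.sqrt(t), n)          # B(t)
y_t = np.exp(c * b_t - c**2 * t / 2)            # Y(t)
mean_est = y_t.mean()                           # should be close to 1
```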

8.22 In Problem 8.16, find $$Var(T_a)$$ by using a martingale argument.

8.23 Show that
$$p(x,t;y) \equiv \frac{1}{\sqrt{2\pi t}}e^{-(x – y – \mu t)^2/2t}$$
satisfies the backward and forward diffusion equations (8.5.1) and (8.5.2).

Just do it : )
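For the skeptical, the differentiation can be handed to a computer algebra system. This sketch uses sympy, with the sign conventions $$p_t = -\mu p_x + \frac{1}{2}p_{xx}$$ (forward, in the final point $$x$$) and $$p_t = \mu p_y + \frac{1}{2}p_{yy}$$ (backward, in the initial point $$y$$), inferred from the form of the exponent:

```python
# Symbolic verification that p(x, t; y) satisfies the forward and
# backward diffusion equations.
import sympy as sp

x, y, mu = sp.symbols('x y mu', real=True)
t = sp.Symbol('t', positive=True)

p = sp.exp(-(x - y - mu * t)**2 / (2 * t)) / sp.sqrt(2 * sp.pi * t)

# Each residual is divided by p (> 0), so sympy only needs to cancel
# a rational expression; both should simplify to 0.
forward = sp.simplify((sp.diff(p, t) + mu * sp.diff(p, x) - sp.diff(p, x, 2) / 2) / p)
backward = sp.simplify((sp.diff(p, t) - mu * sp.diff(p, y) - sp.diff(p, y, 2) / 2) / p)
```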

8.24 Verify Equation (8.7.2)

Let $$f(s, y) = \phi(se^{-\alpha y}) - 1$$, where $$\phi(u) = E[e^{uX}]$$ is the moment generating function of $$X$$, so that $$E[\exp\{sX(t)\}] = \exp\{\lambda\int_0^t f(s, y)dy\}$$. Then
\begin{align} E[X(t)] &=\frac{d}{ds}E[\exp\{sX(t)\}]\Big|_{s=0} \\ &= \exp\{\lambda\int_0^t f(0, y)dy\} \cdot \lambda \int_0^t \frac{d}{ds}f(s, y)\Big|_{s=0} dy \\ &= \lambda E[X](1 - e^{-\alpha t})/\alpha\\ Var(X(t)) &= E[X^2(t)] - E^2[X(t)] \\ &= \frac{d^2}{ds^2}E[\exp\{sX(t)\}]\Big|_{s=0} - E^2[X(t)] \\ &= \lambda E[X^2](1 - e^{-2\alpha t})/2\alpha \end{align}
where we used $$f(0, y) = 0$$ (so the exponential factor equals 1) and $$\frac{d}{ds}f(s, y)\big|_{s=0} = \phi^{\prime}(0)e^{-\alpha y} = E[X]e^{-\alpha y}$$.

8.25 Verify that $$\{X(t) = N(t + L) – N(t), t \geq 0\}$$ is stationary when $$\{N(t)\}$$ is a Poisson process.

For any $$t$$, $$X(t) = N(t + L) - N(t)$$ has the same distribution as $$N(L)$$ by stationary increments. More generally, since $$\{N(t)\}$$ has stationary and independent increments, the joint distribution of $$(X(t_1+a), \dots, X(t_n+a))$$ does not depend on $$a$$, so $$\{X(t)\}$$ is stationary. In particular,
$$E[X(t)] = E[N(L)] = \lambda L\\ Cov(X(t), X(t+s)) = \lambda(L-s)^+$$
since for $$0 \leq s \leq L$$ the two windows overlap on an interval of length $$L - s$$, and they are disjoint for $$s > L$$.
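The overlap covariance $$\lambda(L-s)$$ for $$0 \leq s \leq L$$ can be checked by simulating the three disjoint pieces of the two windows; $$\lambda, L, s$$ are arbitrary test values:

```python
# Monte Carlo check: for X(t) = N(t+L) - N(t) with a rate-lam Poisson process,
# E[X(t)] = lam * L and Cov(X(t), X(t+s)) = lam * (L - s) for 0 <= s <= L.
import numpy as np

rng = np.random.default_rng(6)
n = 400_000
lam, L, s = 2.0, 1.0, 0.4   # arbitrary test values with 0 <= s <= L

# Disjoint pieces of (t, t+s+L): (t, t+s), (t+s, t+L), (t+L, t+s+L).
a = rng.poisson(lam * s, n)          # counts in (t, t+s)
b = rng.poisson(lam * (L - s), n)    # counts in the shared window (t+s, t+L)
c = rng.poisson(lam * s, n)          # counts in (t+L, t+s+L)

x1 = a + b        # X(t)
x2 = b + c        # X(t+s)

mean_est = x1.mean()                 # should be about lam * L = 2
cov_est = np.cov(x1, x2)[0, 1]       # should be about lam * (L - s) = 1.2
```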

8.26 Let $$U$$ be uniformly distributed over $$(-\pi, \pi)$$, and let $$X_n = \cos(nU)$$. By using the trigonometric identity
$$\cos x \cos y = \frac{1}{2} [\cos(x+y) + \cos(x-y)]$$
verify that $$\{X_n, n \geq 1\}$$ is a second-order stationary process.

\begin{align} E[X_n] &= \frac{1}{2n\pi}\int_{-n\pi}^{n\pi} \cos x\,dx = 0\\ Cov(X_{n+L}, X_n) &= E[X_{n+L}X_n] - E[X_{n+L}]E[X_n] \\ &= \frac{1}{2}E[X_{2n+L} + X_L] = \begin{cases} 1/2 & L = 0\\ 0 & L \geq 1\end{cases} \end{align}
since $$E[X_m] = 0$$ for $$m \geq 1$$ while $$X_0 \equiv 1$$. The mean is constant and the covariance depends only on $$L$$, so $$\{X_n\}$$ is second-order stationary.
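The orthogonality behind this can be checked deterministically; the frequencies $$n = 3, m = 5$$ are arbitrary:

```python
# Deterministic check: with U uniform on (-pi, pi) and X_n = cos(nU),
# E[X_n X_m] = (1/2pi) * integral of cos(nu) cos(mu) du
# equals 1/2 when n = m and 0 otherwise.
import math
import numpy as np

u = np.linspace(-math.pi, math.pi, 200_001)

def second_moment(n: int, m: int) -> float:
    """(1/2pi) * integral of cos(nu) cos(mu) over (-pi, pi), trapezoid rule."""
    f = np.cos(n * u) * np.cos(m * u)
    return float(np.sum((f[1:] + f[:-1]) / 2 * np.diff(u)) / (2 * math.pi))

r0 = second_moment(3, 3)   # R(0): should be about 1/2
r2 = second_moment(3, 5)   # R(2): should be about 0
```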

8.27 Show that
$$\sum_{i=1}^n \frac{R(i)}{n} \to 0 \quad \text{implies} \quad {\sum\sum}_{i < j < n}\frac{R(j-i)}{n^2} \to 0$$
thus completing the proof of Proposition 8.8.1.

8.28 Prove the Cauchy-Schwarz inequality:
$$(E[XY])^2 \leq E[X^2]E[Y^2]$$
(Hint: Start with the inequality $$2|xy| \leq x^2 + y^2$$ and then substitute $$X/\sqrt{E[X^2]}$$ for $$x$$ and $$Y/\sqrt{E[Y^2]}$$ for $$y$$)

Assume $$E[X^2] > 0$$ and $$E[Y^2] > 0$$ (otherwise the inequality is trivial). Since $$2|xy| \leq x^2 + y^2$$, then
\begin{align} 2\frac{|X|}{\sqrt{E[X^2]}}\frac{|Y|}{\sqrt{E[Y^2]}} &\leq \frac{X^2}{E[X^2]} + \frac{Y^2}{E[Y^2]} \\ E\left[2\frac{|XY|}{\sqrt{E[X^2]E[Y^2]}}\right] &\leq E\left[\frac{X^2}{E[X^2]} + \frac{Y^2}{E[Y^2]}\right] = 2\\ \frac{E[|XY|]}{\sqrt{E[X^2]E[Y^2]}} &\leq 1\\ (E[XY])^2 \leq (E[|XY|])^2 &\leq E[X^2]E[Y^2] \end{align}

8.29 For a second-order stationary process with mean $$\mu$$ for which $$\sum_{i=0}^{n-1}R(i)/n \to 0$$, show that for any $$\varepsilon > 0$$
$$P\{|\bar{X}_n - \mu| > \varepsilon \} \to 0 \quad \text{as } n \to \infty$$