Arithmetic Brownian Motion Process and SDEs

We discuss several topics related to the Arithmetic Brownian Motion process: the solution of its SDE, the derivation of its characteristic function and moment generating function, the derivation of its mean, variance, and covariance, and the calibration and simulation of the process.

Moment Generating Function

Let’s derive the expression for the moment generating function. Recall that the moment generating function of a real-valued random variable \(Y\) with probability distribution function \(P(y)\) is defined as follows,

$$M_{Y} \left( \theta \right)=E \left[e^{\theta Y} \right]=\int_{-\infty}^{\infty}{e^{\theta y}dP\left(y\right)}$$

If the distribution function is absolutely continuous, meaning that a density, which we denote by \(p(y)\), exists, then we can write the moment generating function in terms of the density,

$$M_{Y} \left( \theta \right)=E \left[e^{\theta Y} \right]=\int_{-\infty}^{\infty}{e^{\theta y}p\left(y\right)dy}$$
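To make the definition concrete, here is a minimal sketch in Python (the choice of language and of numpy/scipy is mine, not something discussed in the text) that evaluates the integral above numerically for a standard normal density and compares it with the known closed form \(e^{\theta^2/2}\).

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def mgf_numeric(theta, pdf, lower=-np.inf, upper=np.inf):
    """Approximate M_Y(theta) = integral of exp(theta*y) * p(y) dy."""
    integrand = lambda y: np.exp(theta * y) * pdf(y)
    value, _ = quad(integrand, lower, upper)
    return value

theta = 0.7
numeric = mgf_numeric(theta, norm.pdf)    # definition applied to the standard normal density
closed_form = np.exp(0.5 * theta**2)      # known MGF of a standard normal
print(numeric, closed_form)               # the two should agree closely
```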

As the derivation of the moment generating function involves the very same steps that we used when deriving the characteristic function, let’s deduce the moment generating function of Arithmetic Brownian motion from the expression we already derived for the characteristic function,

$$\Psi_{X_T} \left(t\right)=E \left[ e^{it \left(X_0+\mu T+\sigma \sqrt{T}\,Z\right)}\right]=e^{it \left(X_0+\mu T\right)-\frac{1}{2} t^2\,\sigma^2 T}$$

We just put \(\theta\) in place of \(it\); note that \(-t^2=(it)^2\), so the minus sign in front of the variance term in the exponent becomes a plus,

$$M_{X_T} \left(\theta\right)=E \left[ e^{\theta \left(X_0+\mu T+\sigma \sqrt{T}\,Z\right)}\right]=e^{\theta \left(X_0+\mu T\right)+\frac{1}{2} \theta^2\,\sigma^2 T}$$
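As a quick sanity check on this expression, the sketch below (with arbitrarily chosen, purely illustrative parameter values) simulates \(X_T=X_0+\mu T+\sigma \sqrt{T}\,Z\) and compares the sample average of \(e^{\theta X_T}\) with the closed-form moment generating function.

```python
import numpy as np

# Arbitrary illustrative parameters (not from the text)
X0, mu, sigma, T, theta = 1.0, 0.05, 0.2, 2.0, 0.5
n_paths = 1_000_000

rng = np.random.default_rng(seed=42)
Z = rng.standard_normal(n_paths)
X_T = X0 + mu * T + sigma * np.sqrt(T) * Z   # terminal value of the Arithmetic Brownian Motion

mc_estimate = np.mean(np.exp(theta * X_T))   # Monte Carlo estimate of E[exp(theta * X_T)]
closed_form = np.exp(theta * (X0 + mu * T) + 0.5 * theta**2 * sigma**2 * T)
print(mc_estimate, closed_form)              # should match up to sampling error
```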

The moment generating function is a useful representation of the probability distribution. For instance, we can use it to derive the mean and variance formulae that we saw before. The story goes as follows: to obtain the k-th moment, take the k-th derivative of the moment generating function with respect to \(\theta\) and evaluate it at \(\theta=0\). For example, the first two moments of our Arithmetic Brownian motion can be obtained using the following equalities,

$$\left. \frac{d}{d \theta} M_{X_T} \left( \theta \right)\right|_{\theta=0}=E \left[ X_T\right]$$

$$\left. \frac{d^2}{d \theta^2} M_{X_T} \left( \theta \right)\right|_{\theta=0}=E \left[ X_T^2\right]$$
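The two identities above can also be checked symbolically. The sketch below uses sympy (my choice of tool; any computer algebra system would do) to differentiate the moment generating function and evaluate the derivatives at \(\theta=0\).

```python
import sympy as sp

theta, X0, mu, sigma, T = sp.symbols('theta X0 mu sigma T', positive=True)

# Moment generating function of the Arithmetic Brownian Motion at time T
M = sp.exp(theta * (X0 + mu * T) + sp.Rational(1, 2) * theta**2 * sigma**2 * T)

first_moment = sp.diff(M, theta).subs(theta, 0)       # E[X_T]
second_moment = sp.diff(M, theta, 2).subs(theta, 0)   # E[X_T^2]

print(sp.simplify(first_moment))   # X0 + mu*T
print(second_moment)               # (X0 + mu*T)**2 + sigma**2*T
```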

To see that this produces the correct result, let’s calculate the first derivative of the moment generating function, which comes down to an application of the chain rule,

$$\frac{d}{d \theta} M_{X_T} \left( \theta \right)=\frac{d}{d\theta}\left(e^{\theta \left(X_0+\mu T\right)+\frac{1}{2} \theta^2\,\sigma^2 T}\right)$$ $$=e^{\theta \left(X_0+\mu T\right)+\frac{1}{2} \theta^2\,\sigma^2 T}\frac{d}{d\theta}\left(\theta \left(X_0+\mu T\right)+\frac{1}{2} \theta^2\,\sigma^2 T\right)$$ $$=e^{\theta \left(X_0+\mu T\right)+\frac{1}{2} \theta^2\,\sigma^2 T}\left( \left(X_0+\mu T\right)+ \theta\,\sigma^2 T\right)$$

Setting \(\theta\) equal to zero indeed gives the first moment, which is the expected value/mean of the process,

$$\left. \frac{d}{d \theta} M_{X_T} \left( \theta \right)\right|_{\theta=0}=e^{0}\left( \left(X_0+\mu T\right)+ 0\right)$$ $$=X_0+\mu T=E\left[X_T\right]$$

As an exercise, try to verify the variance formula yourself. Working through it will deepen your understanding of the steps involved in this use of the moment generating function.