Stationary Stochastic Processes

Oh, Hyunzi. (email: wisdom302@naver.com)
Korea University, Graduate School of Economics.


Main References

  • Kim, Dukpa. (2022). "Time Series Econometrics" (2022 Fall) ECON 512, Department of Economics, Korea University.
  • Hamilton, J. D. (1994). "Time Series Analysis". Princeton University Press.

Stochastic Processes

Definition (stochastic process).

A stochastic process $\{X_t : t \in T\}$ is a family of random variables defined on a probability space $(\Omega, \mathcal{F}, P)$.

A stochastic process is a function $X(t, \omega)$ with two arguments, one from the index set $T$ and the other from the sample space $\Omega$. For a given $\omega \in \Omega$, the stochastic process can be viewed as a function of $t$, making it possible to plot $X(t, \omega)$ along the index set $T$. This plot is called a realization or sample path of the process $\{X_t\}$.
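As a brief illustration (a minimal simulation sketch that is not part of the original notes; the iid Gaussian process and all parameter values are arbitrary choices), each row below is one realization $X(t, \omega)$ for a fixed $\omega$, viewed as a function of $t$:

    import numpy as np

    # Each row of `paths` is one sample path X(t, omega) for a fixed omega,
    # viewed as a function of the time index t = 0, 1, ..., T-1.
    rng = np.random.default_rng(0)
    T, n_paths = 100, 3                 # length of the index set, number of draws of omega
    paths = rng.normal(loc=0.0, scale=1.0, size=(n_paths, T))

    for i, path in enumerate(paths):
        print(f"sample path {i}: first five values {np.round(path[:5], 3)}")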

Remark (time index set).

A time index set $T$ is a set of time points, very often defined as either
$$T = \mathbb{Z} = \{0, \pm 1, \pm 2, \dots\} \qquad \text{or} \qquad T = [0, \infty).$$

Depending on the nature of the index set, the stochastic process defined on $T$ is called either a discrete-time or a continuous-time stochastic process. While we mostly define $T$ as $\mathbb{Z}$ or $\{0, 1, 2, \dots\}$, we will focus only on the case where $X_t$ takes real values.

Strict Stationarity

Definition (strict stationarity).

The time series $\{X_t,\ t \in \mathbb{Z}\}$ is said to be strictly stationary if the joint distributions of $(X_{t_1}, \dots, X_{t_k})'$ and $(X_{t_1+h}, \dots, X_{t_k+h})'$ are the same for all positive integers $k$ and for all $t_1, \dots, t_k, h \in \mathbb{Z}$.

Here, the joint distribution can simply be understood as a joint cdf (cumulative distribution function). Note that the indexes $t_1, \dots, t_k$ do not have to be consecutive integers: we can choose the time indexes arbitrarily and shift them all to $t_1+h, \dots, t_k+h$. For the process to be strictly stationary, this shift must not affect the joint distribution.

Remark (implication of strict stationarity).

Strict stationarity is a very strong concept, since it requires that the joint distribution of any finite collection of observations from the stochastic process is invariant to any finite shift of the time origin.

Remark (iid process).

If the random variables $X_t$ are independently and identically distributed with mean zero and variance $\sigma^2$, we denote $X_t \sim \text{iid}(0, \sigma^2)$, and by definition, iid random variables are strictly stationary.

If the random variables are independent, then we only need the marginal distributions to obtain the joint distribution. Furthermore, if the process is iid (independently and identically distributed), then it is intuitively a strictly stationary process, as the display below makes explicit.
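To spell out this intuition (a short derivation added here, not in the original notes): writing $F$ for the common marginal cdf of the iid process, the joint cdf factorizes and does not depend on the time indexes at all,
$$F_{X_{t_1},\dots,X_{t_k}}(x_1,\dots,x_k) = \prod_{i=1}^{k} F(x_i) = F_{X_{t_1+h},\dots,X_{t_k+h}}(x_1,\dots,x_k),$$
so the joint distribution is invariant to the shift $h$, which is exactly the requirement of strict stationarity.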

However, given the characteristics of time series data, a series is rarely a set of independent random variables and typically exhibits significant dependence across different time indexes. Thus, in practice, we need a more flexible concept that embraces the time dependence among the random variables.

Before defining this less strict concept, we need to consider the foremost property of random variables, namely their moments. The (weak) stationarity will be defined using the moments, rather than directly using the distributions.

Weak Stationarity

Definition (autocovariance function).

If $\{X_t,\ t \in \mathbb{Z}\}$ is a process such that $\operatorname{Var}(X_t) < \infty$ for each $t \in \mathbb{Z}$, then the autocovariance function (ACF) $\gamma_X(\cdot, \cdot)$ of $\{X_t\}$ is defined by
$$\gamma_X(r, s) = \operatorname{Cov}(X_r, X_s) = E\big[(X_r - E X_r)(X_s - E X_s)\big], \qquad r, s \in \mathbb{Z}.$$
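As a numerical illustration (a minimal sketch, not from the notes; the iid $N(0,1)$ process and all constants are illustrative assumptions), $\gamma_X(r, s)$ can be approximated by the covariance of $X_r$ and $X_s$ across many independent realizations of the process:

    import numpy as np

    # Estimate gamma_X(r, s) = Cov(X_r, X_s) as an ensemble covariance across
    # many independent realizations.  The process here is iid N(0, 1), so the
    # estimate should be near 1 when r == s and near 0 otherwise.
    rng = np.random.default_rng(1)
    n_reps, T = 10_000, 20
    X = rng.normal(size=(n_reps, T))    # each row is one realization of X_0, ..., X_{T-1}

    r, s = 3, 7
    gamma_rs = np.cov(X[:, r], X[:, s])[0, 1]
    gamma_rr = np.cov(X[:, r], X[:, r])[0, 1]
    print(f"gamma({r},{s}) is approximately {gamma_rs:.3f}, gamma({r},{r}) is approximately {gamma_rr:.3f}")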

Definition (weak stationarity).

The time series $\{X_t,\ t \in \mathbb{Z}\}$ is said to be (weakly) stationary if the following three conditions hold:

  • $E|X_t|^2 < \infty$ for all $t \in \mathbb{Z}$,
  • $E X_t = \mu$ for all $t \in \mathbb{Z}$,
  • $\gamma_X(r, s) = \gamma_X(r+h, s+h)$ for all $r, s, h \in \mathbb{Z}$.

Note that Definition 7 (weak stationarity) is often referred to as second-order stationarity or covariance stationarity. Since the second moment is finite, the first moment of a weakly stationary process is also finite. Here, we exclude the case where the second moment is infinite, so that the process has the same finite first and second moments at every $t$.

Also, remark that the last condition implies that the variance is the same for all $t$: setting $r = s = t$ gives
$$\gamma_X(t, t) = \gamma_X(t+h, t+h) \quad \text{for all } h, \qquad \text{i.e.} \quad \operatorname{Var}(X_t) = \operatorname{Var}(X_{t+h}).$$

Remark (strict and weak stationarity).

Note that strict stationarity does not always imply weak stationarity, since a strictly stationary process need not possess finite second moments. Of course, if a time series is strictly stationary and has finite variance, then it is also weakly stationary.

Remark (Gaussian stationarity).

Note that a strictly stationary Gaussian process is also a weakly stationary process, since all of its moments are finite; conversely, a weakly stationary Gaussian process is strictly stationary, because the Gaussian distribution is completely characterized by its first two moments.
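To see the converse direction explicitly (a short derivation added here, not in the original notes): for a Gaussian process, any finite collection is multivariate normal,
$$(X_{t_1},\dots,X_{t_k})' \sim N\big(\mu \mathbf{1},\ \Sigma\big), \qquad \Sigma_{ij} = \gamma_X(t_i - t_j),$$
and under weak stationarity both $\mu$ and $\Sigma$ are unchanged when every index is shifted by $h$, so the entire joint distribution is shift-invariant.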

Definition (white noise process).

The random variables $\{\varepsilon_t\}$ are said to be white noise, denoted as $\varepsilon_t \sim WN(0, \sigma^2)$, if they have mean zero and autocovariance function
$$\gamma_\varepsilon(h) = \begin{cases} \sigma^2 & \text{if } h = 0, \\ 0 & \text{if } h \neq 0. \end{cases}$$

A white noise process is uncorrelated with itself at any other time point. One can imagine a process that fluctuates around zero without any specific pattern. Of course, a white noise process is a stationary process, as the simulation sketch below illustrates.
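A minimal simulation sketch (not from the notes; the Gaussian draws, $\sigma = 2$, and the sample size are illustrative assumptions) of $\varepsilon_t \sim WN(0, \sigma^2)$ and its sample moments:

    import numpy as np

    # Gaussian white noise and its sample mean and sample autocovariances:
    # the lag-0 value should be close to sigma^2 and nonzero lags close to 0.
    rng = np.random.default_rng(42)
    sigma, T = 2.0, 100_000
    eps = rng.normal(0.0, sigma, size=T)

    def sample_autocov(x, h):
        """Sample autocovariance at lag h >= 0 of the mean-corrected series."""
        x = x - x.mean()
        return (x[h:] * x[:len(x) - h]).mean()

    print("sample mean:", round(eps.mean(), 4))
    for h in range(4):
        print(f"sample autocovariance at lag {h}: {sample_autocov(eps, h):.4f}")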

Autocorrelation Function

Remark (acf under stationarity).

If $\{X_t\}$ is stationary, then $\gamma_X(r, s)$ depends only on the difference $r - s$, since the autocovariance is invariant to any common shift by $h$. Thus we can simply write $\gamma_X(h) := \gamma_X(t+h, t) = \operatorname{Cov}(X_{t+h}, X_t)$.

Definition (autocorrelation function).

Let $\{X_t,\ t \in \mathbb{Z}\}$ be a stationary process. Then the autocorrelation function (acrf) is defined as
$$\rho_X(h) = \frac{\gamma_X(h)}{\gamma_X(0)} = \operatorname{Corr}(X_{t+h}, X_t).$$

Note that we can derive Definition 12 (autocorrelation function) from the definition of the correlation coefficient:
$$\operatorname{Corr}(X_{t+h}, X_t) = \frac{\operatorname{Cov}(X_{t+h}, X_t)}{\sqrt{\operatorname{Var}(X_{t+h})}\sqrt{\operatorname{Var}(X_t)}} = \frac{\gamma_X(h)}{\sqrt{\gamma_X(0)}\sqrt{\gamma_X(0)}} = \frac{\gamma_X(h)}{\gamma_X(0)},$$
since $\{X_t\}$ is a stationary process.
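As a numerical sketch of the autocorrelation function (not from the notes; the dependent series $X_t = \varepsilon_t + 0.5\,\varepsilon_{t-1}$ and all constants are illustrative choices), the sample analogue of $\rho_X(h)$ can be computed as a lagged correlation:

    import numpy as np

    # A simple dependent but stationary series built from white noise:
    # X_t = eps_t + 0.5 * eps_{t-1}, whose theoretical acrf is
    # rho(0) = 1, rho(1) = 0.5 / 1.25 = 0.4, and rho(h) = 0 for |h| >= 2.
    rng = np.random.default_rng(7)
    T = 100_000
    eps = rng.normal(size=T + 1)
    X = eps[1:] + 0.5 * eps[:-1]

    # Sample autocorrelation via the correlation of the series with its lag.
    rho_hat = [np.corrcoef(X[h:], X[:len(X) - h])[0, 1] for h in range(4)]
    print("rho_hat at lags 0..3:", np.round(rho_hat, 3))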

Lemma (properties of autocovariance).

Let $\{X_t,\ t \in \mathbb{Z}\}$ be a stationary process. Then for the autocovariance function, we have

  1. $\gamma(0) \geq 0$;
  2. $|\gamma(h)| \leq \gamma(0)$ for all $h$, by the Cauchy-Schwarz inequality (CSI);
  3. $\gamma(\cdot)$ is even, i.e. $\gamma(h) = \gamma(-h)$ for all $h$.

Proof. The first statement follows directly since $\gamma(0) = \operatorname{Cov}(X_t, X_t) = \operatorname{Var}(X_t) \geq 0$ for any $t$, since $\{X_t\}$ is stationary.

The second statement can be shown using the Cauchy-Schwarz inequality,
$$|\gamma(h)|^2 = \big|\operatorname{Cov}(X_{t+h}, X_t)\big|^2 \leq \operatorname{Var}(X_{t+h}) \operatorname{Var}(X_t) = \gamma(0)^2,$$
thus by taking the square root on both sides, we have $|\gamma(h)| \leq \gamma(0)$.

The third statement directly comes from Definition 7 (weak stationarity): shifting both indexes by $h$ gives $\gamma(-h) = \operatorname{Cov}(X_{t-h}, X_t) = \operatorname{Cov}(X_t, X_{t+h}) = \gamma(h)$.

Lemma (properties of autocorrelation).

Let $\{X_t,\ t \in \mathbb{Z}\}$ be a stationary process. Then for the autocorrelation function, we have:

  1. $\rho(0) = 1$;
  2. $\rho(\cdot)$ is even, i.e. $\rho(h) = \rho(-h)$ for all $h$;
  3. $|\rho(h)| \leq 1$ for all $h$.

Proof. The first statement holds since $\rho(0) = \gamma(0)/\gamma(0) = 1$; the second statement follows directly from the evenness of the autocovariance function; and the last statement is a corollary of Lemma 13 (properties of autocovariance), since $|\rho(h)| = |\gamma(h)|/\gamma(0) \leq 1$.

Martingale Transform

In econometrics, stationarity is often understood in terms of the martingale transformation. Here, we briefly introduce the key definitions related to the martingale.

Definition (filtration and adaptation).

Let $\{\mathcal{F}_t\}$ be a filtration, which is an increasing sequence of $\sigma$-fields, $\mathcal{F}_1 \subseteq \mathcal{F}_2 \subseteq \cdots$. A sequence $\{X_t\}$ is said to be adapted to $\{\mathcal{F}_t\}$ if $X_t \in \mathcal{F}_t$ (i.e., $X_t$ is $\mathcal{F}_t$-measurable) for all $t$.

Definition (martingale).

A sequence $\{X_t\}$ is a martingale with respect to $\{\mathcal{F}_t\}$ if the following three conditions hold:

  1. $\{X_t\}$ is adapted to $\{\mathcal{F}_t\}$;
  2. $E|X_t| < \infty$ for all $t$;
  3. $E[X_t \mid \mathcal{F}_{t-1}] = X_{t-1}$ for all $t$.

If, in the last condition, the equality is replaced by $\leq$ or $\geq$, then $\{X_t\}$ is said to be a supermartingale or a submartingale, respectively.
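A standard example (added here for illustration, not taken from the notes): a random walk $X_t = X_{t-1} + \varepsilon_t$ with iid mean-zero innovations satisfying $E|\varepsilon_t| < \infty$ is a martingale with respect to $\mathcal{F}_t = \sigma(\varepsilon_1, \dots, \varepsilon_t)$, since
$$E[X_t \mid \mathcal{F}_{t-1}] = X_{t-1} + E[\varepsilon_t \mid \mathcal{F}_{t-1}] = X_{t-1} + E[\varepsilon_t] = X_{t-1}.$$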

Definition (predictable sequence).

Let $\{\mathcal{F}_t\}$ be a filtration for $t \geq 0$. Then $\{H_t\}$, for $t \geq 1$, is said to be a predictable sequence if $H_t \in \mathcal{F}_{t-1}$ for all $t$. Intuitively, the value of $H_t$ may be predicted (with certainty) from the information available at time $t-1$.

Definition (martingale transformation).

If $\{H_t\}$ is predictable with $E|H_t| < \infty$ for any $t$, and if $\{X_t\}$ is a bounded martingale, then the martingale transformation of $\{X_t\}$ by $\{H_t\}$ is defined by
$$(H \cdot X)_t = \sum_{s=1}^{t} H_s (X_s - X_{s-1}),$$
and note that it is also a martingale.

Here, one can understand the martingale transformation of $\{X_t\}$ by $\{H_t\}$ as the sum of $H_s$ multiplied by the differences $X_s - X_{s-1}$ of the time series $\{X_t\}$.

For instance, consider the definition of a profit in finance. One can understand $X_s$ as a stock price and $H_s$ as an investment position chosen from the information available at time $s-1$. Thus $(H \cdot X)_t$ can be understood as the cumulative payoff from holding the positions $\{H_s\}$ up to time $t$, as the sketch below illustrates.
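A minimal numerical sketch of this interpretation (not from the notes; the symmetric random-walk price, the particular position rule, and all constants are illustrative assumptions): a bounded, predictable trading rule applied to a martingale price yields an expected payoff of zero.

    import numpy as np

    # Martingale transform (H . X)_t = sum_{s <= t} H_s (X_s - X_{s-1}):
    # X is a symmetric random walk (a martingale) and H_s is a bounded,
    # predictable position that uses only information up to time s-1.
    rng = np.random.default_rng(3)
    n_reps, T = 20_000, 50
    steps = rng.choice([-1.0, 1.0], size=(n_reps, T))   # increments X_s - X_{s-1}
    X = np.cumsum(steps, axis=1)

    # Predictable rule: hold one unit whenever the previous price is below zero.
    H = np.ones((n_reps, T))
    H[:, 1:] = (X[:, :-1] < 0).astype(float)            # H_s depends only on X_{s-1}

    payoff = np.sum(H * steps, axis=1)                  # (H . X)_T for each replication
    print("average payoff across replications:", round(payoff.mean(), 4))  # close to 0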