



These notes cover stationary stochastic processes, their statistical properties, and methods for estimating the mean, autocovariance, and autocorrelation, as well as the relationship between the autocorrelation function and the periodogram.
A discrete time stochastic process is a sequence of random variables $Z_1, Z_2, \ldots$. In practice we will typically analyze a single realization $z_1, z_2, \ldots, z_n$ of the stochastic process and attempt to estimate the statistical properties of the stochastic process from the realization. We will also consider the problem of predicting $z_{n+1}$ from the previous elements of the sequence.

We will begin by focusing on the very important class of stationary stochastic processes. A stochastic process is strictly stationary if its statistical properties are unaffected by shifting the process in time. In particular, this means that if we take a subsequence $Z_{k+1}, \ldots, Z_{k+m}$, then the joint distribution of the $m$ random variables will be the same no matter what $k$ is. Stationarity requires that the mean of the stochastic process be a constant,
$$E[Z_k] = \mu,$$
and that the variance be constant,
$$\mathrm{Var}[Z_k] = \sigma_Z^2.$$
Stationarity also requires that the covariance of two elements separated by a lag $m$ be constant. That is, $\mathrm{Cov}(Z_k, Z_{k+m})$ does not depend on $k$. This covariance is called the autocovariance at lag $m$, and we will use the notation $\gamma_m$. Since $\mathrm{Cov}(Z_k, Z_{k+m}) = \mathrm{Cov}(Z_{k+m}, Z_k)$, we need only find $\gamma_m$ for $m \ge 0$. The correlation of $Z_k$ and $Z_{k+m}$ is the autocorrelation at lag $m$. We will use the notation $\rho_m$ for the autocorrelation. It is easy to show that
$$\rho_k = \frac{\gamma_k}{\gamma_0}.$$
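In practice $\gamma_k$ and $\rho_k$ must be estimated from a single realization. The sketch below shows the usual sample estimators, $c_k = \frac{1}{n} \sum_{t=1}^{n-k} (z_t - \bar{z})(z_{t+k} - \bar{z})$ and $r_k = c_k / c_0$; the function name and the white-noise test data are illustrative only.

```python
import numpy as np

def sample_autocorrelation(z, max_lag):
    """Sample autocorrelations r_0, ..., r_max_lag from one realization.

    Uses c_k = (1/n) * sum_t (z_t - zbar)(z_{t+k} - zbar) and
    r_k = c_k / c_0; dividing by n (not n - k) is the usual convention.
    """
    z = np.asarray(z, dtype=float)
    n = len(z)
    dev = z - z.mean()
    c0 = np.dot(dev, dev) / n            # sample autocovariance at lag 0
    return np.array([np.dot(dev[:n - k], dev[k:]) / (n * c0)
                     for k in range(max_lag + 1)])

# Usage: for white noise, r_k should be near 0 for every k >= 1.
rng = np.random.default_rng(0)
r = sample_autocorrelation(rng.normal(size=200), max_lag=5)
print(r)                                 # r[0] is exactly 1
```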
The covariance matrix for the random variables $Z_1, \ldots, Z_n$ is called an autocovariance matrix:
$$\Gamma_n = \begin{bmatrix} \gamma_0 & \gamma_1 & \gamma_2 & \cdots & \gamma_{n-1} \\ \gamma_1 & \gamma_0 & \gamma_1 & \cdots & \gamma_{n-2} \\ \vdots & \vdots & & \ddots & \vdots \\ \gamma_{n-1} & \gamma_{n-2} & \cdots & \gamma_1 & \gamma_0 \end{bmatrix}$$
Similarly, we can form an autocorrelation matrix
$$P_n = \begin{bmatrix} 1 & \rho_1 & \rho_2 & \cdots & \rho_{n-1} \\ \rho_1 & 1 & \rho_1 & \cdots & \rho_{n-2} \\ \vdots & \vdots & & \ddots & \vdots \\ \rho_{n-1} & \rho_{n-2} & \cdots & \rho_1 & 1 \end{bmatrix}$$
Note that $\Gamma_n = \sigma_Z^2 P_n$. An important property of the autocovariance and autocorrelation matrices is that they are positive semidefinite (PSD). That is, for any vector $x$, $x^T \Gamma_n x \ge 0$ and $x^T P_n x \ge 0$. To prove this, consider the stochastic process $W_k$, where
$$W_k = x_1 Z_k + x_2 Z_{k-1} + \cdots + x_n Z_{k-n+1}.$$
The variance of $W_k$ is given by
$$\mathrm{Var}(W_k) = \sum_{i=1}^{n} \sum_{j=1}^{n} x_i x_j \gamma_{|i-j|} = x^T \Gamma_n x.$$
Since $\mathrm{Var}(W_k) \ge 0$, the matrix $\Gamma_n$ is PSD. In fact, the only way that $\mathrm{Var}(W_k)$ can be zero is if $W_k$ is constant. This rarely happens in practice. Challenge: find a nonconstant stochastic process $Z_k$ for which some linear combination $W_k$ is constant. Notice that by the definition of stationarity, the process $W_k$ is also stationary! An important example of a stationary process that we will work with occurs when the joint distribution of $Z_k, \ldots, Z_{k+n}$ is multivariate normal. In this situation, the autocovariance matrix $\Gamma_n$ is precisely the covariance matrix $C$ for the multivariate normal distribution.
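As a quick numerical check (an illustration, not a proof), the sketch below builds the autocorrelation matrix $P_n$ for a hypothetical process with $\rho_m = 0.5^m$ and confirms that its smallest eigenvalue is nonnegative; the autocorrelation sequence is an assumption chosen purely for the example.

```python
import numpy as np

# Build P_n with entries P[i, j] = rho_|i - j| for an assumed
# autocorrelation sequence rho_m = 0.5**m (illustrative only).
n = 10
rho = 0.5 ** np.arange(n)
idx = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
P = rho[idx]                                   # Toeplitz autocorrelation matrix

# P is symmetric, so eigvalsh applies; PSD means no negative eigenvalues.
print(np.linalg.eigvalsh(P).min() >= -1e-12)   # True

# Equivalently, the quadratic form x^T P x = Var(W_k) / sigma_Z^2 is
# nonnegative for any coefficient vector x.
x = np.random.default_rng(0).normal(size=n)
print(x @ P @ x >= 0)                          # True
```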
[Figure 1: Estimated autocorrelation $r_k$ versus lag $k$ for the example data.]
Just as with the sample mean, the autocorrelation estimate $r_k$ is a random quantity with its own standard deviation. It can be shown that
$$\mathrm{Var}(r_k) \approx \frac{1}{n} \sum_{v=-\infty}^{\infty} \left( \rho_v^2 + \rho_{v+k}\rho_{v-k} - 4\rho_k \rho_v \rho_{v-k} + 2\rho_v^2 \rho_k^2 \right).$$
The autocorrelation function typically decays rapidly, so that we can identify a lag $q$ beyond which $r_k$ is effectively 0. Under these circumstances, the formula simplifies to
$$\mathrm{Var}(r_k) \approx \frac{1}{n} \left( 1 + 2 \sum_{v=1}^{q} \rho_v^2 \right), \quad k > q.$$
In practice we don't know $\rho_v$, but we can use the estimates $r_v$ in the above formula. This provides a statistical test to determine whether or not an autocorrelation $r_k$ is statistically different from 0. An approximate 95% confidence interval for $r_k$ is $r_k \pm 1.96 \sqrt{\mathrm{Var}(r_k)}$. If this confidence interval includes 0, then we can't rule out the possibility that $\rho_k$ really is 0 and that there is no correlation at lag $k$.

Example 2. Returning to our earlier data set, consider the variance of our estimate $r_6$. Using $q = 5$, we estimate that $\mathrm{Var}(r_6) = 0.0225$, so the standard deviation is about 0.15. Since $r_6 = -0.0471$ is considerably smaller than the standard deviation, we will decide to ignore $r_k$ for $k \ge 6$.
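A minimal sketch of this test, assuming the simplified variance formula with the unknown $\rho_v$ replaced by the estimates $r_v$; the data, the cutoff $q = 5$, and all names are placeholders rather than the example's actual data set.

```python
import numpy as np

rng = np.random.default_rng(1)
z = rng.normal(size=200)              # placeholder series
n = len(z)

# Sample autocorrelations r_0, ..., r_20 (dividing by n, as is usual).
dev = z - z.mean()
c0 = np.dot(dev, dev) / n
r = np.array([np.dot(dev[:n - k], dev[k:]) / (n * c0) for k in range(21)])

# Simplified Bartlett variance for k > q, with r_v standing in for rho_v.
q = 5
var_rk = (1.0 + 2.0 * np.sum(r[1:q + 1] ** 2)) / n
half_width = 1.96 * np.sqrt(var_rk)   # 95% interval is r_k +/- half_width

for k in range(q + 1, 11):
    verdict = "significant" if abs(r[k]) > half_width else "consistent with 0"
    print(f"r_{k} = {r[k]:+.4f} ({verdict})")
```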
The major alternative to using autocorrelation/autocovariance to analyze time series is the use of the periodogram. In this section we will show that there is an equivalence between a form of the periodogram and the autocovariance function. For our purposes, the autocorrelation function is more convenient, so we will continue to use it throughout this sequence of lectures. The form of the periodogram that we will use is slightly different from the periodogram that we have previously obtained from the FFT. We will fit a Fourier series of the form
$$z_n = a_0 + \sum_{j=1}^{q} \left( a_j \cos(2\pi j n / N) + b_j \sin(2\pi j n / N) \right)$$
to the time series. Here $N$ is the number of data points (which we will assume is even), and $q = N/2$. The coefficients can be obtained by
$$a_0 = \bar{z}$$
$$a_j = \frac{2}{N} \sum_{k=1}^{N} z_k \cos(2\pi j k / N), \quad j = 1, 2, \ldots, q-1$$
$$b_j = \frac{2}{N} \sum_{k=1}^{N} z_k \sin(2\pi j k / N), \quad j = 1, 2, \ldots, q-1$$
$$a_q = \frac{1}{N} \sum_{k=1}^{N} (-1)^k z_k$$
$$b_q = 0.$$
The relationship between these coefficients and the FFT $Z_n$ is
$$a_j = \frac{2}{N} \, \mathrm{Re}\left( Z_{j+1} e^{-2\pi i j / N} \right), \quad j = 1, 2, \ldots, q-1$$
$$b_j = -\frac{2}{N} \, \mathrm{Im}\left( Z_{j+1} e^{-2\pi i j / N} \right), \quad j = 1, 2, \ldots, q-1$$
$$a_q = -Z_{q+1}/N$$
$$b_q = 0.$$
The periodogram is then defined to be
$$I(j) = \frac{N}{2} \left( a_j^2 + b_j^2 \right), \quad j = 1, 2, \ldots, q-1$$
$$I(q) = N a_q^2.$$
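The coefficient/FFT correspondence is easy to check numerically. The sketch below is an illustration under the assumption that numpy's unnormalized FFT follows the same convention as the text, so that numpy's `Zf[j]` plays the role of $Z_{j+1}$; the data are placeholders.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 64                           # number of data points, assumed even
q = N // 2
z = rng.normal(size=N)           # placeholder data
k = np.arange(1, N + 1)          # the text indexes z_k by k = 1, ..., N

# Fourier coefficients computed directly from the definitions.
a = np.array([2.0 / N * np.sum(z * np.cos(2 * np.pi * j * k / N))
              for j in range(1, q)])
b = np.array([2.0 / N * np.sum(z * np.sin(2 * np.pi * j * k / N))
              for j in range(1, q)])
a_q = np.sum((-1.0) ** k * z) / N

# The same coefficients recovered from the FFT; numpy's unnormalized
# Zf[j] corresponds to Z_{j+1} in the text.
Zf = np.fft.fft(z)
jv = np.arange(1, q)
rot = np.exp(-2.0 * np.pi * 1j * jv / N)
print(np.allclose(a, 2.0 / N * np.real(Zf[jv] * rot)))    # True
print(np.allclose(b, -2.0 / N * np.imag(Zf[jv] * rot)))   # True
print(np.isclose(a_q, -np.real(Zf[q]) / N))               # True

# Periodogram ordinates at the Fourier frequencies j/N.
I = N / 2.0 * (a ** 2 + b ** 2)  # I(j), j = 1, ..., q-1
I_q = N * a_q ** 2               # I(q)
```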
We can go a step further and define the sample spectrum at any frequency $f$, $0 \le f < 1/2$, as
$$I(f) = \frac{N}{2} \left( a_f^2 + b_f^2 \right),$$
where $a_f$ and $b_f$ are given by the coefficient formulas above with $j/N$ replaced by $f$.
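A minimal sketch of evaluating the sample spectrum on a fine frequency grid; the data are placeholders, and the series is mean-centered so that the $a_0$ term does not leak into $I(f)$ at non-Fourier frequencies (a choice made for this sketch).

```python
import numpy as np

def sample_spectrum(z, f):
    """Sample spectrum I(f) = (N/2)(a_f^2 + b_f^2) at a single frequency f,
    with a_f and b_f given by the coefficient formulas with j/N -> f."""
    z = np.asarray(z, dtype=float)
    N = len(z)
    k = np.arange(1, N + 1)
    z = z - z.mean()             # centering removes leakage from a_0
    a_f = 2.0 / N * np.sum(z * np.cos(2 * np.pi * f * k))
    b_f = 2.0 / N * np.sum(z * np.sin(2 * np.pi * f * k))
    return N / 2.0 * (a_f ** 2 + b_f ** 2)

# Usage: evaluate on a fine grid and convert to dB, as in Figure 2.
rng = np.random.default_rng(3)
z = rng.normal(size=64)
freqs = np.linspace(0.01, 0.49, 200)      # 0 <= f < 1/2
spectrum_db = 10.0 * np.log10([sample_spectrum(z, f) for f in freqs])
```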
[Figure 2: Sample spectrum $I(f)$ in dB versus frequency $f$ for the example data.]