Application Center - Maplesoft

Stationary and Invertible AR & MA Models 

 

 

The following was implemented in Maple by Marcus Davidsson (2008), davidsson_marcus@hotmail.com,

and is based on the work of Brooks, C. (2008), Introductory Econometrics for Finance.

 

 

 

 

 

 

This worksheet is divided into four main chapters: 

 

 

 

1) Basic Concepts            

 

2) MA(1) Model          

 

3) MA(2) Model                      

 

4) AR(1) Model                       

 

 

 

 

 

 

1) Basic Concepts 

 

 

 

 

1.1) Proposition-1 

 

restart;

 

We first note that if E(x) is the Arithmetic Average of x, then Covariance(x, x) is given by:

Covariance(x, x) = E[(x - E(x))(x - E(x))] = E[(x - E(x))^2] = Variance(x)

 

 

and the Correlation between x and x is defined as:

Correlation(x, x) = Covariance(x, x) / Variance(x) = Variance(x) / Variance(x) = 1
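The original worksheet verifies these identities in Maple; as a language-neutral sketch (Python, with hypothetical helper names not taken from the worksheet), Proposition-1 can be checked numerically:

```python
import random
import statistics

# Proposition-1 check: Covariance(x, x) equals Variance(x),
# so Correlation(x, x) must be 1.
random.seed(1)
x = [random.gauss(0, 1) for _ in range(500)]

mean_x = statistics.fmean(x)
cov_xx = statistics.fmean([(v - mean_x) ** 2 for v in x])  # E[(x - E(x))^2]
var_x = statistics.pvariance(x, mu=mean_x)                 # the same quantity
corr_xx = cov_xx / var_x

print(corr_xx)  # equals 1 up to floating-point rounding
```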

 

 

 

1.2) Proposition-2 

 

 

If E(y) is the Arithmetic Average of y, then Covariance(x, y) is given by:

Covariance(x, y) = E[(x - E(x))(y - E(y))]

 

 

and the Correlation between x and y is defined as:

Correlation(x, y) = Covariance(x, y) / (StandardDeviation(x) StandardDeviation(y)) = E[(x - E(x))(y - E(y))] / sqrt(E[(x - E(x))^2] E[(y - E(y))^2])
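For two independent series, the covariance (and hence the correlation) should come out close to zero; a minimal Python sketch (not part of the original Maple worksheet):

```python
import random
import statistics

# Proposition-2 check: sample covariance and correlation of
# two independent N(0,1) series are close to zero.
random.seed(2)
n = 5000
x = [random.gauss(0, 1) for _ in range(n)]
y = [random.gauss(0, 1) for _ in range(n)]

mx, my = statistics.fmean(x), statistics.fmean(y)
cov_xy = statistics.fmean([(a - mx) * (b - my) for a, b in zip(x, y)])
corr_xy = cov_xy / (statistics.pstdev(x) * statistics.pstdev(y))

print(corr_xy)  # close to 0 for independent series
```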

 

 

 

1.3) Proposition-3 

 

 

The AutoCovariance for y(t) and y(t) at time t is therefore given by:

AutoCovariance(y(t), y(t)) = E[(y(t) - E(y(t)))(y(t) - E(y(t)))] = E[(y(t) - E(y(t)))^2] = Variance(y(t))
 

 

 

and the AutoCorrelation between y(t) and y(t) is defined as:

AutoCorrelation(y(t), y(t)) = AutoCovariance(y(t), y(t)) / Variance(y(t)) = Variance(y(t)) / Variance(y(t)) = 1

 

 

 

 

1.4) Proposition-4 

 

 

The AutoCovariance for y(t) and y(t-1) at time t is given by:

AutoCovariance(y(t), y(t-1)) = E[(y(t) - E(y(t)))(y(t-1) - E(y(t-1)))]

 

 

The AutoCorrelation for lag L is therefore defined as:

AutoCorrelation(L) = AutoCovariance(y(t), y(t-L)) / AutoCovariance(y(t), y(t))
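This ratio translates directly into a small sample-based function; a Python sketch (the helper name `autocorrelation` is ours, not worksheet code):

```python
import random
import statistics

def autocorrelation(y, lag):
    """Sample AutoCorrelation(lag) =
    AutoCovariance(y(t), y(t-lag)) / AutoCovariance(y(t), y(t))."""
    m = statistics.fmean(y)
    denom = sum((v - m) ** 2 for v in y)
    num = sum((y[t] - m) * (y[t - lag] - m) for t in range(lag, len(y)))
    return num / denom

random.seed(3)
noise = [random.gauss(0, 1) for _ in range(5000)]
print(autocorrelation(noise, 0))  # 1.0: a series is perfectly correlated with itself
print(autocorrelation(noise, 1))  # close to 0 for white noise
```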

 

 

 

 

1.5) Proposition-5 

 

 

The first assumption states that the Mean of the error terms is equal to zero, E(u) = 0  

 

This also makes sense, since the error term is completely random (think of a coin toss that pays +1 for heads and -1 for tails: its average value is zero).

 

This means that the Expected Value (Arithmetic Average) of a random variable with any lag is approximately equal to zero.

 

 

For example, if r ~ i.i.d. N(0, 1) then E[r(t)] = 0, E[r(t-1)] = 0 and E[r(t-2)] = 0

 

 

restart:
randomize():
with(Statistics):
n := 500:
r := Sample(RandomVariable(Normal(0, 1)), n):
Mean([seq(r[i], i = 1 .. n)]);
 

0.5695197572e-2 (1)
 

 

 

restart:
randomize():
with(Statistics):
n := 500:
r := Sample(RandomVariable(Normal(0, 1)), n):
Mean([seq(r[i-1], i = 2 .. n)]);
 

-0.2201708302e-1 (2)
 

 

 

restart:
randomize():
with(Statistics):
n := 500:
r := Sample(RandomVariable(Normal(0, 1)), n):
Mean([seq(r[i-2], i = 3 .. n)]);
 

0.2390784009e-1 (3)
 

 

 

 

 

1.6) Proposition-6 

 

 

We know that the Mean of the error terms is equal to zero, E(u) = 0 

 

The second assumption therefore states that the variance sigma^2 of the error terms is equal to the

Expected Value of the squared error terms (since E(u) = 0):

Variance(u) = sigma^2 = E[(u - E(u))^2] = E[(u - 0)^2] = E[u^2]

 

 

restart:
randomize():
with(Statistics):
n := 500:
r := Sample(RandomVariable(Normal(0, 1)), n):
Variance([seq(r[i], i = 1 .. n)]);
 

.9731764954 (4)
 

 

restart:
randomize():
with(Statistics):
n := 500:
r := Sample(RandomVariable(Normal(0, 1)), n):
ExpectedValue([seq(r[i]^2, i = 1 .. n)]);
 

.9750803709 (5)
 

 

 

This means that the Expected Value of a squared random variable with any lag is approximately equal to the Variance of that random variable.

For example, if r ~ i.i.d. N(0, 1) then E[r(t)^2] = sigma^2, E[r(t-1)^2] = sigma^2 and E[r(t-2)^2] = sigma^2

 

 

restart:
randomize():
with(Statistics):
n := 500:
r := Sample(RandomVariable(Normal(0, 1)), n):
Variance([seq(r[i], i = 1 .. n)]);
 

.9262921950 (6)
 

 

restart:
randomize():
with(Statistics):
n := 500:
r := Sample(RandomVariable(Normal(0, 1)), n):
Mean([seq(r[i-1]^2, i = 2 .. n)]);
 

.9065590715 (7)
 

 

restart:
randomize():
with(Statistics):
n := 500:
r := Sample(RandomVariable(Normal(0, 1)), n):
Mean([seq(r[i-2]^2, i = 3 .. n)]);
 

.9509723990 (8)
 

 

 

 

1.7) Proposition-7 

 

 

We know that the Mean of the error terms is equal to zero, E(u) = 0 

 

The third assumption therefore states that the Covariance between the error terms is zero (since E(u) = 0).

This also makes sense: since the error terms are completely random, there should be no dependency between them.

Covariance(u[i], u[j]) = E[(u[i] - E(u[i]))(u[j] - E(u[j]))] = E[(u[i] - 0)(u[j] - 0)] = E[u[i] u[j]] = 0

 

 

This means that the Expected Value (Arithmetic Average) of the product of any two differently lagged random variables is approximately zero.

For example, if r ~ i.i.d. N(0, 1) then E[r(t) r(t-1)] = 0, E[r(t) r(t-2)] = 0 and E[r(t-1) r(t-2)] = 0

 

 

restart:
randomize():
with(Statistics):
n := 500:
r := Sample(RandomVariable(Normal(0, 1)), n):
Mean([seq(r[i]*r[i-1], i = 2 .. n)]);
 

0.1434372477e-1 (9)
 

 

 

restart:
randomize():
with(Statistics):
n := 500:
r := Sample(RandomVariable(Normal(0, 1)), n):
Mean([seq(r[i]*r[i-2], i = 3 .. n)]);
 

-0.9104557114e-2 (10)
 

 

 

restart:
randomize():
with(Statistics):
n := 500:
r := Sample(RandomVariable(Normal(0, 1)), n):
Mean([seq(r[i-1]*r[i-2], i = 3 .. n)]);
 

0.3146246689e-1 (11)
 

 

 

 

 

 

 

 

 

2) A Moving Average Model of Order One MA(1) 

 

 

 

Our First Order Moving Average Model is given by: 

 

 

y(t) = p1*r(t-1) + r(t)    where r ~ i.i.d. N(0, 1)

 

 

where p1 is a first-order serial correlation parameter. Note that the amount of first-order serial correlation in y(t) is not equal to p1.

 

Note that |p1| < 1 is required for an invertible MA(1) model.

 

For an invertible MA(q) model, all roots of the characteristic equation should lie outside the unit circle.
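For the MA(1) case this condition is easy to check explicitly; a Python sketch (the helper name `ma1_invertible` is ours, not worksheet code):

```python
def ma1_invertible(p1):
    # MA(1): y(t) = p1*r(t-1) + r(t).  Its characteristic equation
    # 1 + p1*z = 0 has the single root z = -1/p1, which lies outside
    # the unit circle exactly when |p1| < 1.
    if p1 == 0:
        return True  # degenerate case: no MA root at all
    return abs(-1.0 / p1) > 1.0

print(ma1_invertible(-0.5))  # True  (root z = 2 lies outside the unit circle)
print(ma1_invertible(2.0))   # False (root z = -0.5 lies inside)
```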

 

 

 

We know that we can calculate the AutoCorrelation for any lag as: 

 

 

AutoCorrelation(L) = AutoCovariance(y(t), y(t-L)) / AutoCovariance(y(t), y(t))

 

 

 

 

 

2.1) AutoCovariance(y(t), y(t))

 

 

 

We can now calculate the denominator in the equation for the AutoCorrelation(L) given by AutoCovariance(y(t), y(t)) 

 

 

If we take the Expectation on both sides of the equation for the First Order Moving Average Model we get: 

 

 

E(y(t)) = E[p1*r(t-1) + r(t)]

 

 

Which can be written as: 

 

 

E(y(t)) = p1*E[r(t-1)] + E[r(t)]

 

 

We know from Proposition-5 that the Expected Value of a random variable with any lag is approximately equal to zero 

 

This means that E[r(t-2)] = 0, E[r(t-1)] = 0 and E[r(t)] = 0

 

 

This gives us: 

 

 

E[y(t)] = 0 

 

 

We now note that the AutoCovariance for y(t) and y(t) is given by 

 

AutoCovariance(y(t), y(t)) = E[(y(t) - E[y(t)])(y(t) - E[y(t)])] = E[(y(t) - E[y(t)])^2]

 

 

Since E[y(t)] = 0 the above equation is reduced to 

 

 

AutoCovariance(y(t), y(t)) = E[y(t)^2]

 

 

We now note that y(t) = z is defined as: 

 

 

z := p1*r[t-1] + r[t];

p1 r[t-1] + r[t] (12)
 

 

expand(z^2);

p1^2 r[t-1]^2 + 2 p1 r[t-1] r[t] + r[t]^2 (13)
 

 

 

If we substitute in that expression we get: 

 

 

AutoCovariance(y(t), y(t)) = E[p1^2 r(t-1)^2 + 2 p1 r(t-1) r(t) + r(t)^2]

 

 

Which can be written as: 

 

 

AutoCovariance(y(t), y(t)) = E[p1^2 r(t-1)^2] + E[r(t)^2] + E[2 p1 r(t-1) r(t)]

 

 

Which can be written as: 

 

 

AutoCovariance(y(t), y(t)) = p1^2 E[r(t-1)^2] + E[r(t)^2] + 2 p1 E[r(t-1) r(t)]

 

 

 

We know from Proposition-6 that the Expected Value of a squared random variable with any lag is approximately equal to the Variance of that variable.
 

 

This means that E[r(t-1)^2] = sigma^2 and E[r(t)^2] = sigma^2

 

 

This gives us: 

 

 

AutoCovariance(y(t), y(t)) = p1^2 sigma^2 + sigma^2 + 2 p1 E[r(t-1) r(t)]

 

 

 

We know from Proposition-7 that the Expected Value of the product of lagged random variables is approximately zero.

 

 

This means that E[r(t-1) r(t)] = 0.

 

 

Note that these terms are called cross products; under our assumptions their expected value is always equal to zero.

 

 

This gives us: 

 

 

AutoCovariance(y(t), y(t)) = p1^2 sigma^2 + sigma^2

 

 

Which can be written as: 

 

 

AutoCovariance(y(t), y(t)) = (1 + p1^2) sigma^2

 

 

 

Which is our final expression for the AutoCovariance(y(t), y(t)) 
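This result can also be confirmed by simulation; a Python sketch (not worksheet code), where sigma^2 = 1 so the variance of y should be close to (1 + p1^2)*sigma^2 = 1.25 for p1 = -0.5:

```python
import random
import statistics

# Simulate a long MA(1) series y(t) = p1*r(t-1) + r(t) with N(0,1) errors
# and compare its sample variance against the derived (1 + p1^2)*sigma^2.
random.seed(4)
n, p1 = 200_000, -0.5
r = [random.gauss(0, 1) for _ in range(n)]
y = [p1 * r[t - 1] + r[t] for t in range(1, n)]

sample_var = statistics.pvariance(y)
theoretical = (1 + p1 ** 2) * 1.0  # sigma^2 = 1, so this is 1.25

print(sample_var, theoretical)  # the two values should nearly agree
```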

 

 

 

 

 

 

2.2) AutoCovariance(y(t), y(t-1))

 

 

restart;

 

 

We can now calculate the numerator in the equation for the AutoCorrelation(L), given by AutoCovariance(y(t), y(t-L))

 

 

We will in this section calculate the AutoCovariance for lag one 

 

 

The AutoCovariance for y(t) and y(t-1), AutoCovariance(y(t), y(t-1)), is given by

 

 

AutoCovariance(y(t), y(t-1)) = E[(y(t) - E[y(t)])(y(t-1) - E[y(t-1)])] = E[(y(t) + 0)(y(t-1) + 0)] = E[y(t) y(t-1)]    (AA)

 

 

We now note that the equation for our First Order Moving Average Model is given by 

 

 

y(t) = p1*r(t-1) + r(t)

 

 

This equation can be generalized. The equation for y(t-1) is therefore given by

 

 

y(t-1) = p1*r(t-2) + r(t-1)

 

 

We now note that y(t) = z is defined as: 

 

 

z := p1*r[t-1] + r[t];

p1 r[t-1] + r[t] (14)
 

 

We now note that y(t-1) = zz is defined as:

 

 

zz := p1*r[t-2] + r[t-1];

p1 r[t-2] + r[t-1] (15)
 

This gives us 

 

expand(z*zz);

p1^2 r[t-1] r[t-2] + p1 r[t-1]^2 + p1 r[t] r[t-2] + r[t] r[t-1] (16)
 

 

 

If we plug this expression into expression AA for the AutoCovariance of y(t) and y(t-1) we get:

 

 

AutoCovariance(y(t), y(t-1)) = E[p1^2 r(t-1) r(t-2) + p1 r(t-1)^2 + p1 r(t) r(t-2) + r(t) r(t-1)]

 

 

We know from Proposition-7 that the Expected Value of the cross products is equal to zero, so we get:

 

 

AutoCovariance(y(t), y(t-1)) = E[p1 r(t-1)^2]

 

 

Which can be written as:

AutoCovariance(y(t), y(t-1)) = p1 E[r(t-1)^2]

 

 

 

We know from Proposition-6 that the Expected Value of a squared random variable with any lag is approximately equal to the Variance of that variable. This gives us:
 

 

 

AutoCovariance(y(t), y(t-1)) = p1 sigma^2

 

 

 

Which is our final expression for AutoCovariance(y(t), y(t-1))
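This result, too, can be confirmed by simulation; a Python sketch (not worksheet code), where sigma^2 = 1 so the lag-1 autocovariance should be close to p1*sigma^2 = -0.5 for p1 = -0.5:

```python
import random
import statistics

# Simulate a long MA(1) series and compute its sample lag-1
# autocovariance E[(y(t) - E[y])(y(t-1) - E[y])].
random.seed(5)
n, p1 = 200_000, -0.5
r = [random.gauss(0, 1) for _ in range(n)]
y = [p1 * r[t - 1] + r[t] for t in range(1, n)]

m = statistics.fmean(y)
autocov1 = statistics.fmean(
    [(y[t] - m) * (y[t - 1] - m) for t in range(1, len(y))]
)

print(autocov1)  # close to p1 * sigma^2 = -0.5
```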

 

 

 

 

 

2.3) AutoCovariance(y(t), y(t-2))

 

 

 

restart;

 

 

We can now calculate the numerator in the equation for the AutoCorrelation(L), given by AutoCovariance(y(t), y(t-L))

 

 

We will in this section calculate the AutoCovariance for lag two 

 

 

The AutoCovariance for y(t) and y(t-2), AutoCovariance(y(t), y(t-2)), is given by

 

 

AutoCovariance(y(t), y(t-2)) = E[(y(t) - E[y(t)])(y(t-2) - E[y(t-2)])] = E[(y(t) + 0)(y(t-2) + 0)] = E[y(t) y(t-2)]    (AA)

 

 

We now note that the equation for our First Order Moving Average Model is given by 

 

 

y(t) = p1*r(t-1) + r(t)

 

 

This equation can be generalized. The equation for y(t-2) is therefore given by

 

 

y(t-2) = p1*r(t-3) + r(t-2)

 

 

We now note that y(t) = z is defined as: 

 

 

z := p1*r[t-1] + r[t];

p1 r[t-1] + r[t] (17)
 

 

We now note that y(t-2) = zz is defined as:

 

 

zz := p1*r[t-3] + r[t-2];

p1 r[t-3] + r[t-2] (18)
 

This gives us 

 

expand(z*zz);

p1^2 r[t-1] r[t-3] + p1 r[t-1] r[t-2] + p1 r[t] r[t-3] + r[t] r[t-2] (19)
 

 

 

If we plug this expression into expression AA for the AutoCovariance of y(t) and y(t-2) we get:

 

 

AutoCovariance(y(t), y(t-2)) = E[p1^2 r(t-1) r(t-3) + p1 r(t-1) r(t-2) + p1 r(t) r(t-3) + r(t) r(t-2)]

 

 

 

We know from Proposition-7 that the Expected Value of the cross products is equal to zero. Every term above is a cross product, so we get:

 

 

AutoCovariance(y(t), y(t-2)) = E[0] = 0

 

 

 

Which is our final expression for AutoCovariance(y(t), y(t-2))

 

 

 

 

 

 

2.4) Summary for Our First Order Moving Average Model  

 

 

 

We have in the previous sections calculated the AutoCovariances of y(t) with y(t), y(t-1) and y(t-2), given by:

 

AutoCovariance(y(t), y(t)) = (1 + p1^2) sigma^2

 

AutoCovariance(y(t), y(t-1)) = p1 sigma^2

 

AutoCovariance(y(t), y(t-2)) = 0

 

 

 

We know that we can calculate the AutoCorrelation for any lag as: 

 

 

AutoCorrelation(L) = AutoCovariance(y(t), y(t-L)) / AutoCovariance(y(t), y(t))

 

 

This means that: 

 

AutoCorrelation(0) = AutoCovariance(y(t), y(t)) / AutoCovariance(y(t), y(t)) = 1

 

 

AutoCorrelation(1) = AutoCovariance(y(t), y(t-1)) / AutoCovariance(y(t), y(t)) = p1 sigma^2 / ((1 + p1^2) sigma^2) = p1 / (1 + p1^2)

 

 

 

AutoCorrelation(2) = AutoCovariance(y(t), y(t-2)) / AutoCovariance(y(t), y(t)) = 0 / ((1 + p1^2) sigma^2) = 0

 

 

 

For example, if we assume that

p1 := -0.5;

then the AutoCorrelation for each lag is given by
 

 

 

 

AutoCorrelation(1) = p1 / (1 + p1^2)

AutoCorrelation(1) = -.4000000000 (20)
 

 

AutoCorrelation(2) = 0 

AutoCorrelation(2) = 0 (21)
 

 

 

 

 

We can show that this is indeed true by simulating an MA(1) process

 

 

restart:
with(Statistics):
randomize():
n := 5000:
p1 := -0.5:
r := Sample(RandomVariable(Normal(0, 1)), n):
y[1] := 0:
for t from 2 to n do
    y[t] := p1*r[t-1] + r[t]:
end do:
Y := [seq(y[t], t = 1 .. n)]:
# sample autocorrelations at lags 1 and 2
Correlation([seq(Y[t], t = 2 .. n)], [seq(Y[t-1], t = 2 .. n)]);
Correlation([seq(Y[t], t = 3 .. n)], [seq(Y[t-2], t = 3 .. n)]);
restart; -1; with(Statistics); -1; randomize(); -1; `:=`(n, 5000); -1; `:=`(p1, -.5); -1; `:=`(r, Sample(RandomVariable(Normal(0, 1)), n)); -1; for t from 2 to n do `:=`(y[1], 0); `:=`(y[t], `+`(`*`(p...
restart; -1; with(Statistics); -1; randomize(); -1; `:=`(n, 5000); -1; `:=`(p1, -.5); -1; `:=`(r, Sample(RandomVariable(Normal(0, 1)), n)); -1; for t from 2 to n do `:=`(y[1], 0); `:=`(y[t], `+`(`*`(p...
restart; -1; with(Statistics); -1; randomize(); -1; `:=`(n, 5000); -1; `:=`(p1, -.5); -1; `:=`(r, Sample(RandomVariable(Normal(0, 1)), n)); -1; for t from 2 to n do `:=`(y[1], 0); `:=`(y[t], `+`(`*`(p...
restart; -1; with(Statistics); -1; randomize(); -1; `:=`(n, 5000); -1; `:=`(p1, -.5); -1; `:=`(r, Sample(RandomVariable(Normal(0, 1)), n)); -1; for t from 2 to n do `:=`(y[1], 0); `:=`(y[t], `+`(`*`(p...
restart; -1; with(Statistics); -1; randomize(); -1; `:=`(n, 5000); -1; `:=`(p1, -.5); -1; `:=`(r, Sample(RandomVariable(Normal(0, 1)), n)); -1; for t from 2 to n do `:=`(y[1], 0); `:=`(y[t], `+`(`*`(p...
 

 

 

 

 

 

[2-D plot]

first order serial correlation = -0.4081605040
second order serial correlation = 0.01660581667
third order serial correlation = -0.01604253316
fourth order serial correlation = 0.01112586367

[2-D plot]
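The worksheet's simulation above is written in Maple; as a cross-check, here is a minimal sketch of the same MA(1) experiment in Python (NumPy assumed; the seed and the helper `acf` are illustrative choices, not part of the worksheet):

```python
import numpy as np

rng = np.random.default_rng(0)   # arbitrary seed for reproducibility
n, p1 = 5000, -0.5
r = rng.standard_normal(n)

# MA(1): y[t] = p1*r[t-1] + r[t]
y = r.copy()
y[1:] += p1 * r[:-1]

def acf(x, lag):
    """Sample autocorrelation at the given positive lag."""
    d = x - x.mean()
    return (d[lag:] * d[:-lag]).sum() / (d * d).sum()

print(acf(y, 1))   # theory: p1/(1 + p1^2) = -0.4
print(acf(y, 2))   # theory: 0
```

For n = 5000 the estimates should land within a few hundredths of the theoretical values, matching the worksheet's output above.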
 

 

 

 

 

 

 

3) A Moving Average Model of Order Two MA(2) 

 

 

 

Our Second Order Moving Average Model is given by: 

 

 

y(t) = p2 r[t-2] + p1 r[t-1] + r[t],   where r(t) is a white noise error term 

 

 

where again p1 is the first order serial correlation parameter and p2 is the second order serial correlation parameter.

Again note that the amount of first and second order serial correlation in y(t) is not equal to p1 and p2, respectively.
 

 

 

We know that we can calculate the AutoCorrelation for any lag as: 

 

 

AutoCorrelation(L) = AutoCovariance(y(t), y(t-L)) / AutoCovariance(y(t), y(t)) 
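This ratio translates directly into code; a minimal sketch in Python (NumPy assumed; the helper names `autocov` and `autocorr` are illustrative):

```python
import numpy as np

def autocov(y, L):
    """Sample estimate of E[(y(t) - E[y]) (y(t-L) - E[y])]."""
    d = y - y.mean()
    return (d[L:] * d[:len(d) - L]).mean()

def autocorr(y, L):
    """AutoCorrelation(L) = AutoCovariance(L) / AutoCovariance(0)."""
    return autocov(y, L) / autocov(y, 0)

# a perfectly alternating series has lag-1 autocorrelation -1
y = np.array([1.0, 2.0, 1.0, 2.0, 1.0, 2.0])
print(autocorr(y, 1))   # -1.0
```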

 

 

 

 

 

3.1) AutoCovariance(y(t), y(t)) 

 

 

 

We can now calculate the denominator in the equation for the AutoCorrelation(L) given by AutoCovariance(y(t), y(t)) 

 

 

If we take the Expectation on both sides of the equation for the Second Order Moving Average Model we get: 

 

 

E(y(t)) = E[p2 r[t-2] + p1 r[t-1] + r[t]] 

 

 

Which can be written as: 

 

 

E(y(t)) = p2 E[r[t-2]] + p1 E[r[t-1]] + E[r[t]] 

 

 

We know from Proposition-5 that the Expected Value of a random variable with any lag is approximately equal to zero 

 

This means that E[r[t-2]] = 0, E[r[t-1]] = 0 and E[r[t]] = 0 

 

 

This gives us: 

 

 

E[y(t)] = 0 

 

 

We now note that the AutoCovariance for y(t) and y(t) is given by 

 

AutoCovariance(y(t), y(t)) = E[(y(t) - E[y(t)]) (y(t) - E[y(t)])] = E[(y(t) - E[y(t)])^2] 

 

 

Since E[y(t)] = 0 the above equation is reduced to 

 

 

AutoCovariance(y(t), y(t)) = E[y(t)^2] 

 

 

We now note that y(t) = z is defined as: 

 

 

z := p2*r[t-2] + p1*r[t-1] + r[t];

p2 r[t-2] + p1 r[t-1] + r[t] (22)
 

 

expand(z^2);

p2^2 r[t-2]^2 + 2 p1 p2 r[t-2] r[t-1] + 2 p2 r[t-2] r[t] + p1^2 r[t-1]^2 + 2 p1 r[t-1] r[t] + r[t]^2 (23)
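The expand step can be reproduced in any computer algebra system; a sketch in SymPy (the symbols r_tm2, r_tm1, r_t stand in for r[t-2], r[t-1], r[t]):

```python
import sympy as sp

p1, p2 = sp.symbols('p1 p2')
r2, r1, r0 = sp.symbols('r_tm2 r_tm1 r_t')   # r[t-2], r[t-1], r[t]

z = p2*r2 + p1*r1 + r0            # y(t) for the MA(2) model
expanded = sp.expand(z**2)
print(expanded)
# three squares (p2^2 r[t-2]^2, p1^2 r[t-1]^2, r[t]^2) plus three cross products
```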
 

 

 

If we substitute in that expression we get: 

 

 

AutoCovariance(y(t), y(t)) = E[p2^2 r[t-2]^2 + 2 p1 p2 r[t-2] r[t-1] + 2 p2 r[t-2] r[t] + p1^2 r[t-1]^2 + 2 p1 r[t-1] r[t] + r[t]^2]
 

 

 

Which can be written as: 

 

 

AutoCovariance(y(t), y(t)) = E[p2^2 r[t-2]^2 + p1^2 r[t-1]^2 + r[t]^2 + 2 p1 p2 r[t-2] r[t-1] + 2 p2 r[t-2] r[t] + 2 p1 r[t-1] r[t]]
 

 

 

Which can be written as: 

 

 

AutoCovariance(y(t), y(t)) = p2^2 E(r[t-2]^2) + p1^2 E(r[t-1]^2) + E(r[t]^2) + 2 p1 p2 E(r[t-2] r[t-1]) + 2 p2 E(r[t-2] r[t]) + 2 p1 E(r[t-1] r[t])
 

 

 

 

We know from Proposition-6 that the Expected Value of a squared random variable at any lag is approximately equal to the Variance of that variable. 

 

This means that E(r[t-2]^2) = sigma^2, E(r[t-1]^2) = sigma^2 and E(r[t]^2) = sigma^2 

 

 

This gives us: 

 

 

AutoCovariance(y(t), y(t)) = p2^2 sigma^2 + p1^2 sigma^2 + sigma^2 + 2 p1 p2 E(r[t-2] r[t-1]) + 2 p2 E(r[t-2] r[t]) + 2 p1 E(r[t-1] r[t])
 

 

 

 

We know from Proposition-7 that the Expected Value of the product of two random variables at different lags is approximately zero. 

 

 

This means that E(r[t-2] r[t-1]) = 0, E(r[t-2] r[t]) = 0 and E(r[t-1] r[t]) = 0. 

 

 

Note that these terms are called cross products; their expected values are zero because the noise terms at different lags are independent. 

 

 

This gives us: 

 

 

AutoCovariance(y(t), y(t)) = p2^2 sigma^2 + p1^2 sigma^2 + sigma^2 

 

 

Which can be written as: 

 

 

AutoCovariance(y(t), y(t)) = (1 + p1^2 + p2^2) sigma^2 

 

 

 

Which is our final expression for the AutoCovariance(y(t), y(t)) 
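This result can be checked by simulation; a sketch in Python (NumPy assumed, sigma = 1, so the predicted variance is 1 + p1^2 + p2^2 = 1.41 for the parameter values used later in the worksheet):

```python
import numpy as np

rng = np.random.default_rng(1)     # arbitrary seed
n, p1, p2 = 200_000, -0.5, 0.4
r = rng.standard_normal(n)         # white noise with sigma = 1

# MA(2): y[t] = p2*r[t-2] + p1*r[t-1] + r[t]
y = r[2:] + p1 * r[1:-1] + p2 * r[:-2]

print(y.var())               # sample AutoCovariance(y(t), y(t))
print(1 + p1**2 + p2**2)     # theoretical value: 1.41
```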

 

 

 

 

 

 

3.2) AutoCovariance(y(t), y(t-1)) 

 

 

restart;

 

 

We can now calculate the numerator in the equation for the AutoCorrelation(L), given by AutoCovariance(y(t), y(t-L)) 

 

 

We will in this section calculate the AutoCovariance for lag one 

 

 

The AutoCovariance for y(t) and y(t-1), AutoCovariance(y(t), y(t-1)), is given by 

 

 

AutoCovariance(y(t), y(t-1)) = E[(y(t) - E[y(t)]) (y(t-1) - E[y(t-1)])] = E[(y(t) + 0) (y(t-1) + 0)] = E[y(t) y(t-1)]      (AA)

 

 

We now note that the equation for our Second Order Moving Average Model is given by 

 

 

y(t) = p2 r[t-2] + p1 r[t-1] + r[t] 

 

 

This equation can be generalized. The equation for y(t-1) is therefore given by 

 

 

y(t-1) = p2 r[t-3] + p1 r[t-2] + r[t-1] 

 

 

We now note that y(t) = z is defined as: 

 

 

z := p2*r[t-2] + p1*r[t-1] + r[t];

p2 r[t-2] + p1 r[t-1] + r[t] (24)
 

 

We now note that y(t-1) = zz is defined as: 

 

 

zz := p2*r[t-3] + p1*r[t-2] + r[t-1];

p2 r[t-3] + p1 r[t-2] + r[t-1] (25)
 

This gives us 

 

expand(z*zz);

p2^2 r[t-2] r[t-3] + p1 p2 r[t-2]^2 + p2 r[t-2] r[t-1] + p1 p2 r[t-1] r[t-3] + p1^2 r[t-1] r[t-2] + p1 r[t-1]^2 + p2 r[t] r[t-3] + p1 r[t] r[t-2] + r[t] r[t-1] (26)
 

 

 

If we plug this expression into the expression for the AutoCovariance for y(t) and y(t-1) given by expression AA we get: 

 

 

AutoCovariance(y(t), y(t-1)) = E[p2^2 r[t-2] r[t-3] + p1 p2 r[t-2]^2 + p2 r[t-2] r[t-1] + p1 p2 r[t-1] r[t-3] + p1^2 r[t-1] r[t-2] + p1 r[t-1]^2 + p2 r[t] r[t-3] + p1 r[t] r[t-2] + r[t] r[t-1]]
 

 

 

We know from Proposition-7 that the expected values of the cross products are zero, so we get: 

 

 

AutoCovariance(y(t), y(t-1)) = E[p1 p2 r[t-2]^2 + p1 r[t-1]^2] 

 

 

Which can be written as 

AutoCovariance(y(t), y(t-1)) = p1 p2 E(r[t-2]^2) + p1 E(r[t-1]^2) 


We know from Proposition-6 that the Expected Value of a squared random variable at any lag is approximately equal to the Variance of that variable. This gives us: 

 

 

AutoCovariance(y(t), y(t-1)) = p1 p2 sigma^2 + p1 sigma^2 

 

 

Which can be written as 

 

AutoCovariance(y(t), y(t-1)) = (p1 + p1 p2) sigma^2 

 

 

 

Which is our final expression for the AutoCovariance(y(t), y(t-1)) 
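The bookkeeping behind this result — squares contribute sigma^2, cross products contribute nothing — can be automated symbolically. A SymPy sketch (the dictionary encoding of the lag coefficients is an illustrative device of this sketch, not worksheet notation):

```python
import sympy as sp

p1, p2, s2 = sp.symbols('p1 p2 sigma2')

# coefficients of y(t) and y(t-1) on the white-noise terms, keyed by lag
y_t   = {-2: p2, -1: p1, 0: 1}    # y(t)   = p2 r[t-2] + p1 r[t-1] + r[t]
y_tm1 = {-3: p2, -2: p1, -1: 1}   # y(t-1) = p2 r[t-3] + p1 r[t-2] + r[t-1]

# E[r_i r_j] = sigma^2 if i == j and 0 otherwise (Propositions 6 and 7),
# so only coefficients at matching lags survive the expectation
autocov = sp.expand(sum(y_t[k] * y_tm1[k] * s2 for k in y_t if k in y_tm1))
print(autocov)   # equals (p1 + p1*p2)*sigma2
```

Replacing y_tm1 with the coefficients of y(t-2) leaves the single match p2*sigma2, and with y(t-3) the dictionaries share no lags at all, giving 0 — the results derived in the next two sections.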

 

 

 

 

 

3.3) AutoCovariance(y(t), y(t-2)) 

 

 

 

restart;

 

 

We can now calculate the numerator in the equation for the AutoCorrelation(L), given by AutoCovariance(y(t), y(t-L)) 

 

 

We will in this section calculate the AutoCovariance for lag two 

 

 

The AutoCovariance for y(t) and y(t-2), AutoCovariance(y(t), y(t-2)), is given by 

 

 

AutoCovariance(y(t), y(t-2)) = E[(y(t) - E[y(t)]) (y(t-2) - E[y(t-2)])] = E[(y(t) + 0) (y(t-2) + 0)] = E[y(t) y(t-2)]      (AA)

 

 

We now note that the equation for our Second Order Moving Average Model is given by 

 

 

y(t) = p2 r[t-2] + p1 r[t-1] + r[t] 

 

 

This equation can be generalized. The equation for y(t-2) is therefore given by 

 

 

y(t-2) = p2 r[t-4] + p1 r[t-3] + r[t-2] 

 

 

We now note that y(t) = z is defined as: 

 

 

z := p2*r[t-2] + p1*r[t-1] + r[t];

p2 r[t-2] + p1 r[t-1] + r[t] (27)
 

 

We now note that y(t-2) = zz is defined as: 

 

 

zz := p2*r[t-4] + p1*r[t-3] + r[t-2];

p2 r[t-4] + p1 r[t-3] + r[t-2] (28)
 

This gives us 

 

expand(z*zz);

p2^2 r[t-2] r[t-4] + p1 p2 r[t-2] r[t-3] + p2 r[t-2]^2 + p1 p2 r[t-1] r[t-4] + p1^2 r[t-1] r[t-3] + p1 r[t-1] r[t-2] + p2 r[t] r[t-4] + p1 r[t] r[t-3] + r[t] r[t-2] (29)
 

 

 

If we plug this expression into the expression for the AutoCovariance for y(t) and y(t-2) given by expression AA we get: 

 

 

AutoCovariance(y(t), y(t-2)) = E[p2^2 r[t-2] r[t-4] + p1 p2 r[t-2] r[t-3] + p2 r[t-2]^2 + p1 p2 r[t-1] r[t-4] + p1^2 r[t-1] r[t-3] + p1 r[t-1] r[t-2] + p2 r[t] r[t-4] + p1 r[t] r[t-3] + r[t] r[t-2]] (30)
 

 

 

We know from Proposition-7 that the expected values of the cross products are zero, so we get: 

 

 

AutoCovariance(y(t), y(t-2)) = E[p2 r[t-2]^2] 

 

 

Which can be written as 

 

 

AutoCovariance(y(t), y(t-2)) = p2 E(r[t-2]^2) 

 

 

 

We know from Proposition-6 that the Expected Value of a squared random variable at any lag is approximately equal to the Variance of that variable. This gives us: 

 

 

AutoCovariance(y(t), y(t-2)) = p2 sigma^2 

 

 

Which is our final expression for the AutoCovariance(y(t), y(t-2)) 

 

 

 

 

 

3.4) AutoCovariance(y(t), y(t-3)) 

 

 

 

restart;

 

 

We can now calculate the numerator in the equation for the AutoCorrelation(L), given by AutoCovariance(y(t), y(t-L)) 

 

 

We will in this section calculate the AutoCovariance for lag three 

 

 

The AutoCovariance for y(t) and y(t-3), AutoCovariance(y(t), y(t-3)), is given by 

 

 

AutoCovariance(y(t), y(t-3)) = E[(y(t) - E[y(t)]) (y(t-3) - E[y(t-3)])] = E[(y(t) + 0) (y(t-3) + 0)] = E[y(t) y(t-3)]      (AA)

 

 

We now note that the equation for our Second Order Moving Average Model is given by 

 

 

y(t) = p2 r[t-2] + p1 r[t-1] + r[t] 

 

 

This equation can be generalized. The equation for y(t-3) is therefore given by 

 

 

y(t-3) = p2 r[t-5] + p1 r[t-4] + r[t-3] 

 

 

We now note that y(t) = z is defined as: 

 

 

z := p2*r[t-2] + p1*r[t-1] + r[t];

p2 r[t-2] + p1 r[t-1] + r[t] (31)
 

 

We now note that y(t-3) = zz is defined as: 

 

 

zz := p2*r[t-5] + p1*r[t-4] + r[t-3];

p2 r[t-5] + p1 r[t-4] + r[t-3] (32)
 

This gives us 

 

expand(z*zz);

p2^2 r[t-2] r[t-5] + p1 p2 r[t-2] r[t-4] + p2 r[t-2] r[t-3] + p1 p2 r[t-1] r[t-5] + p1^2 r[t-1] r[t-4] + p1 r[t-1] r[t-3] + p2 r[t] r[t-5] + p1 r[t] r[t-4] + r[t] r[t-3] (33)
 

 

 

If we plug this expression into the expression for the AutoCovariance for y(t) and y(t-3) given by expression AA we get: 

 

 

AutoCovariance(y(t), y(t-3)) = E[p2^2 r[t-2] r[t-5] + p1 p2 r[t-2] r[t-4] + p2 r[t-2] r[t-3] + p1 p2 r[t-1] r[t-5] + p1^2 r[t-1] r[t-4] + p1 r[t-1] r[t-3] + p2 r[t] r[t-5] + p1 r[t] r[t-4] + r[t] r[t-3]] (34)
 

 

 

We know from Proposition-7 that the expected values of the cross products are zero. Since every term here is a cross product, we get: 

 

 

AutoCovariance(y(t), y(t-3)) = E[0] = 0 

 

 

 

Which is our final expression for the AutoCovariance(y(t), y(t-3)) 

 

 

 

 

 

 

3.5) Summary for Our Second Order Moving Average Model  

 

 

 

 

We have in the previous sections calculated the AutoCovariances for y(t), y(t-1), y(t-2) and y(t-3) given by: 

 

AutoCovariance(y(t), y(t)) = (1 + p1^2 + p2^2) sigma^2 

 

AutoCovariance(y(t), y(t-1)) = (p1 + p1 p2) sigma^2 

 

AutoCovariance(y(t), y(t-2)) = p2 sigma^2 

 

AutoCovariance(y(t), y(t-3)) = 0 

 

 

We know that we can calculate the AutoCorrelation for any lag as: 

 

 

AutoCorrelation(L) = AutoCovariance(y(t), y(t-L)) / AutoCovariance(y(t), y(t)) 

 

 

This means that: 

 

AutoCorrelation(0) = AutoCovariance(y(t), y(t)) / AutoCovariance(y(t), y(t)) = 1 

 

 

AutoCorrelation(1) = AutoCovariance(y(t), y(t-1)) / AutoCovariance(y(t), y(t)) = (p1 + p1 p2) sigma^2 / ((1 + p1^2 + p2^2) sigma^2) = (p1 + p1 p2) / (1 + p1^2 + p2^2)
 

 

 

AutoCorrelation(2) = AutoCovariance(y(t), y(t-2)) / AutoCovariance(y(t), y(t)) = p2 sigma^2 / ((1 + p1^2 + p2^2) sigma^2) = p2 / (1 + p1^2 + p2^2)
 

 

AutoCorrelation(3) = AutoCovariance(y(t), y(t-3)) / AutoCovariance(y(t), y(t)) = 0 / ((1 + p1^2 + p2^2) sigma^2) = 0 

 

 

 

For example, if we assume that

p1 := -0.5;  p2 := 0.4;

then the AutoCorrelation for each lag is given by
 

 

 

AutoCorrelation(1) = (p1 + p1 p2) / (1 + p1^2 + p2^2) 

AutoCorrelation(1) = -0.4964539007 (35)
 

 

AutoCorrelation(2) = p2 / (1 + p1^2 + p2^2) 

AutoCorrelation(2) = 0.2836879433 (36)
 

 

 

AutoCorrelation(3) = 0 

AutoCorrelation(3) = 0 (37)
 

 

 

 

 

We can show that this is indeed true by simulating an MA(2) process 

 

 

restart;
with(Statistics);
randomize();
n := 5000;
p1 := -0.5;
p2 := 0.4;
r := Sample(RandomVariable(Normal(0, 1)), n);
y[1] := 0;
y[2] := 0;
for t from 3 to n do
    y[t] := p2*r[t-2] + p1*r[t-1] + r[t];
end do;
 

 

 

 

 

 

[2-D plot]

first order serial correlation = -0.4990750489
second order serial correlation = 0.2963680643
third order serial correlation = -0.007303409994
fourth order serial correlation = 0.006261242198

[2-D plot]
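The same experiment can be cross-checked outside Maple; a Python sketch (NumPy assumed; the seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n, p1, p2 = 100_000, -0.5, 0.4
r = rng.standard_normal(n)
y = r[2:] + p1 * r[1:-1] + p2 * r[:-2]   # MA(2) with sigma = 1

d = y - y.mean()
var = (d * d).mean()
acf = [(d[L:] * d[:len(d) - L]).mean() / var for L in (1, 2, 3)]
print(acf)   # theory: -0.4965, 0.2837, 0
```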
 

 

 

 

 

 

 

 

 

4) An AutoRegressive Model of Order One AR(1) 

 

 

 

restart;

 

 

Our First Order AutoRegressive Model is given by: 

 

 

y(t) = phi[1] y(t-1) + r(t),   where r(t) is a white noise error term 

 

 

Note that |phi[1]| < 1 for a stationary AR(1) model. The stationarity conditions depend only on the AR part. 

For a stationary AR(p) model, all roots of the characteristic equation must lie outside the unit circle. 
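For AR(1) the characteristic equation is 1 - phi[1] z = 0, so the single root z = 1/phi[1] lies outside the unit circle exactly when |phi[1]| < 1. The general check can be sketched in Python (NumPy assumed; the helper is_stationary is illustrative):

```python
import numpy as np

def is_stationary(phi):
    """AR(p) stationarity check: all roots of the characteristic
    polynomial 1 - phi1*z - ... - phip*z^p lie outside the unit circle."""
    coeffs = [1.0] + [-p for p in phi]                  # ascending powers of z
    roots = np.polynomial.polynomial.polyroots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

print(is_stationary([0.9]))   # AR(1), phi1 = 0.9: root z = 1/0.9 -> True
print(is_stationary([1.1]))   # AR(1), phi1 = 1.1: root z = 1/1.1 -> False
```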

 

 

We know that we can calculate the AutoCorrelation for any lag as: 

 

 

AutoCorrelation(L) = AutoCovariance(y(t), y(t-L)) / AutoCovariance(y(t), y(t)) 

 

 

 

 

 

4.1) AutoCovariance(y(t), y(t)) 

 

 

restart;

 

 

We can now calculate the denominator in the equation for the AutoCorrelation(L) given by AutoCovariance(y(t), y(t)) 

 

 

We again note that our Autoregressive equation of order one AR(1) is given by 

 

y(t) = phi[1]*y[t-1] + r(t);

y(t) = phi[1] y(t-1) + r(t) (38)
 

 

We now note that we can write the above AR(1) model as: 

 

y(t) = phi[1] y(t-1) + r(t) 

 

y(t-1) = phi[1] y(t-2) + r(t-1) 

 

y(t-2) = phi[1] y(t-3) + r(t-2) 

 

y(t-3) = phi[1] y(t-4) + r(t-3) 

 

y(t-4) = phi[1] y(t-5) + r(t-4) 

 

etc etc  

 

 

 

So if we, for example, start from y(t-4) and recursively substitute in the previous expressions we get: 

 

 

y[t-4] := expand(phi[1]*y[t-5] + r[t-4]);

phi[1] y[t-5] + r[t-4] (39)
 

y[t-3] := expand(phi[1]*y[t-4] + r[t-3]);

phi[1]^2 y[t-5] + phi[1] r[t-4] + r[t-3] (40)
 

y[t-2] := expand(phi[1]*y[t-3] + r[t-2]);

phi[1]^3 y[t-5] + phi[1]^2 r[t-4] + phi[1] r[t-3] + r[t-2] (41)
 

`:=`(y[`+`(t, `-`(1))], expand(`+`(`*`(phi[1], `*`(y[`+`(t, `-`(2))])), r[`+`(t, `-`(1))]))) 

`+`(`*`(`^`(phi[1], 4), `*`(y[`+`(t, `-`(5))])), `*`(`^`(phi[1], 3), `*`(r[`+`(t, `-`(4))])), `*`(`^`(phi[1], 2), `*`(r[`+`(t, `-`(3))])), `*`(phi[1], `*`(r[`+`(t, `-`(2))])), r[`+`(t, `-`(1))]) (42)
 

`:=`(y[t], expand(`+`(`*`(phi[1], `*`(y[`+`(t, `-`(1))])), r[t]))) 

`+`(`*`(`^`(phi[1], 5), `*`(y[`+`(t, `-`(5))])), `*`(`^`(phi[1], 4), `*`(r[`+`(t, `-`(4))])), `*`(`^`(phi[1], 3), `*`(r[`+`(t, `-`(3))])), `*`(`^`(phi[1], 2), `*`(r[`+`(t, `-`(2))])), `*`(phi[1], `*`(... (43)
 

 


 

This is the result of Wold's decomposition theorem, which states that any stationary series can be expressed as a deterministic part plus a stochastic part. We now note that since we assume that our AR(1) equation is stationary (|phi[1]| < 1), the phi[1]^5*y[t-5] term would go to zero if we had started the substitution from a more distant value, for example y(t - infinity). 
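The recursive substitution in outputs (39)-(43) can also be replayed numerically. The following is a minimal Python sketch (not part of the original Maple worksheet; the concrete value of phi[1], the shocks, and the starting value are illustrative assumptions):

```python
import random

# Hypothetical numbers chosen only to illustrate the recursion.
random.seed(0)
phi1 = 0.9
shocks = [random.gauss(0.0, 1.0) for _ in range(5)]  # r[t-4], ..., r[t]
y_start = 2.5                                        # plays the role of y[t-5]

# Recursive substitution y[t-k] = phi1*y[t-k-1] + r[t-k], as in (39)-(43)
y = y_start
for r in shocks:
    y = phi1 * y + r

# Closed form from output (43):
# y[t] = phi1^5*y[t-5] + phi1^4*r[t-4] + ... + phi1*r[t-1] + r[t]
closed = phi1**5 * y_start + sum(phi1**i * shocks[4 - i] for i in range(5))
print(abs(y - closed) < 1e-12)
```

Both routes give the same y[t], confirming that repeated substitution produces the geometric weights phi[1]^i on the lagged shocks.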

 

 

For example if we assume that phi[1] = .9 we get: 

 

.9^2 

.81 (44)
 

.9^3 

.729 (45)
 

.9^4 

.6561 (46)
 

.9^5 

.59049 (47)
 

etc etc  

 

 

This means that we are left with  

 

 

restart; 1; y[t] = phi[1]^4*r[t-4] + phi[1]^3*r[t-3] + phi[1]^2*r[t-2] + phi[1]*r[t-1] + r[t] 

y[t] = phi[1]^4*r[t-4] + phi[1]^3*r[t-3] + phi[1]^2*r[t-2] + phi[1]*r[t-1] + r[t] (48)
 

 

Which can be written as
 

y[t] = r[t] + phi[1]*r[t-1] + phi[1]^2*r[t-2] + phi[1]^3*r[t-3] + phi[1]^4*r[t-4] 

 

 

We again note that if we had selected a much older starting value than y(t-4) then our expression would have been infinitely long. This means that we can write the above equation as:
 

y[t] = r[t] + phi[1]*r[t-1] + phi[1]^2*r[t-2] + phi[1]^3*r[t-3] + phi[1]^4*r[t-4] + ... 

We now take the expectation on both sides, so we get: 

 

E(y[t]) = E(r[t] + phi[1]*r[t-1] + phi[1]^2*r[t-2] + phi[1]^3*r[t-3] + phi[1]^4*r[t-4]) 

 

Which can be written as: 

 

E(y[t]) = E(r[t]) + phi[1]*E(r[t-1]) + phi[1]^2*E(r[t-2]) + phi[1]^3*E(r[t-3]) + phi[1]^4*E(r[t-4]) 

 

We know from Proposition-5 that the Expected Value of a random variable with any lag is approximately equal to zero. 

 

This means that E(r[t]) = 0, E(r[t-1]) = 0, etc. 

 

This gives us: 

 

E[y(t)] = 0 

 

 

We now note that the AutoCovariance for y(t) and y(t) is given by 

 

AutoCovariance(y(t), y(t)) = E[(y(t) - E[y(t)]) * (y(t) - E[y(t)])] = E[(y(t) - E[y(t)])^2] 

 

Since E[y(t)] = 0 the above equation is reduced to 

 

AutoCovariance(y(t), y(t)) = E[y(t)^2] 

 

 

 

We now let z denote the right-hand side of the expression for y[t]: 

 

z := r[t] + phi[1]*r[t-1] + phi[1]^2*r[t-2] + phi[1]^3*r[t-3] + phi[1]^4*r[t-4]; 

phi[1]^4*r[t-4] + phi[1]^3*r[t-3] + phi[1]^2*r[t-2] + phi[1]*r[t-1] + r[t] (49)
 

 

expand(z^2); 

phi[1]^8*r[t-4]^2 + phi[1]^6*r[t-3]^2 + phi[1]^4*r[t-2]^2 + phi[1]^2*r[t-1]^2 + r[t]^2 + cross-product terms such as 2*phi[1]^7*r[t-4]*r[t-3], 2*phi[1]*r[t-1]*r[t], ... (50)
 

 

 

If we substitute that expression into the equation for the AutoCovariance(y(t), y(t)) we get 

 

AutoCovariance(y(t), y(t)) = E[ phi[1]^8*r[t-4]^2 + phi[1]^6*r[t-3]^2 + phi[1]^4*r[t-2]^2 + phi[1]^2*r[t-1]^2 + r[t]^2 + cross-product terms ] 

 

Which can be written as: 

 

AutoCovariance(y(t), y(t)) = phi[1]^8*E(r[t-4]^2) + phi[1]^6*E(r[t-3]^2) + phi[1]^4*E(r[t-2]^2) + phi[1]^2*E(r[t-1]^2) + E(r[t]^2) + the Expected Values of the cross products 
 

 

 

We know from Proposition-6 that the Expected Value of a squared random variable with any lag is approximately equal to the Variance of that variable. 

 

This means that   E(r[t-4]^2) = sigma^2    E(r[t-3]^2) = sigma^2   etc etc 

 

 

This gives us: 

 

 

AutoCovariance(y(t), y(t)) = phi[1]^8*sigma^2 + phi[1]^6*sigma^2 + phi[1]^4*sigma^2 + phi[1]^2*sigma^2 + sigma^2 + the Expected Values of the cross products 
 

 

 

 

We know from Proposition-7 that the Expected Value of the product of two random variables at different lags is approximately zero. 

 

This means that  E(r[t-2]*r[t-1]) = 0     E(r[t-2]*r[t]) = 0     etc etc 

 

Note that these terms are called cross products and their Expected Values are equal to zero. 

 

 

This gives us: 

 

 

AutoCovariance(y(t), y(t)) = phi[1]^8*sigma^2 + phi[1]^6*sigma^2 + phi[1]^4*sigma^2 + phi[1]^2*sigma^2 + sigma^2 

 

 

Which can be written as: 

 

AutoCovariance(y(t), y(t)) = sigma^2*(1 + phi[1]^2 + phi[1]^4 + phi[1]^6 + phi[1]^8) 

 

We now note that if we multiply both sides by phi[1]^2 we get: 

 

phi[1]^2 * AutoCovariance(y(t), y(t)) = phi[1]^2 * sigma^2*(1 + phi[1]^2 + phi[1]^4 + phi[1]^6 + phi[1]^8) 

 

Which can be written as: 

 

phi[1]^2 * AutoCovariance(y(t), y(t)) = sigma^2*(phi[1]^2 + phi[1]^4 + phi[1]^6 + phi[1]^8 + phi[1]^10) 

 

We now subtract the second expression from the first, which gives us: 

 

AutoCovariance(y(t), y(t)) - phi[1]^2 * AutoCovariance(y(t), y(t)) = sigma^2*(1 + phi[1]^2 + phi[1]^4 + phi[1]^6 + phi[1]^8) - sigma^2*(phi[1]^2 + phi[1]^4 + phi[1]^6 + phi[1]^8 + phi[1]^10) 

 

Which is given by: 

 

simplify(sigma^2*(1 + phi[1]^2 + phi[1]^4 + phi[1]^6 + phi[1]^8) - sigma^2*(phi[1]^2 + phi[1]^4 + phi[1]^6 + phi[1]^8 + phi[1]^10)); 

sigma^2 - sigma^2*phi[1]^10 (51)
 

 

We now note that since we assume stationarity (|phi[1]| < 1), the term sigma^2*phi[1]^10 goes to zero as we push the starting value further back. 

 

 

For example if we assume that phi[1] = .9 then we get: 

 

sigma^2*.9^10 

.3486784401*sigma^2 (52)
 

sigma^2*.9^12 

.2824295365*sigma^2 (53)
 

sigma^2*.9^14 

.2287679246*sigma^2 (54)
 

etc etc  

 

 

 

This means that we can write the above equation as: 

 

AutoCovariance(y(t), y(t)) - phi[1]^2 * AutoCovariance(y(t), y(t)) = sigma^2 

 

which means that: 

 

AutoCovariance(y(t), y(t)) * (1 - phi[1]^2) = sigma^2 

 

Which can be written as: 

 

AutoCovariance(y(t), y(t)) = sigma^2 / (1 - phi[1]^2) 

 

 

Which is our final expression for the AutoCovariance(y(t), y(t)) 
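The final expression can be sanity-checked numerically outside Maple. The following Python sketch (illustrative, not part of the worksheet) simulates an AR(1) process and compares the sample variance with sigma^2/(1 - phi[1]^2):

```python
import random

# Simulate y(t) = phi1*y(t-1) + r(t) with white-noise shocks and compare the
# sample variance against the theoretical value sigma^2 / (1 - phi1^2).
random.seed(1)
phi1, sigma = 0.9, 1.0
n, burn = 200_000, 1_000

y, ys = 0.0, []
for i in range(n + burn):
    y = phi1 * y + random.gauss(0.0, sigma)
    if i >= burn:              # discard burn-in draws so the series is near-stationary
        ys.append(y)

mean = sum(ys) / len(ys)
sample_var = sum((v - mean) ** 2 for v in ys) / len(ys)
theory = sigma ** 2 / (1 - phi1 ** 2)   # final expression above
print(sample_var, theory)
```

With 200,000 draws the sample variance typically lands within a few percent of sigma^2/(1 - phi[1]^2) = 1/0.19.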

 

 

 

 

 

 

4.2) AutoCovariance(y(t), y(t-1)) 

 

 

restart; 1 

 

 

We can now calculate the numerator in the equation for the AutoCorrelation(L) given by AutoCovariance(y(t), y(t-L)) 

 

 

We will in this section calculate the AutoCovariance for lag one 

 

 

We now note that the equation for our First Order Autoregressive equation AR(1) is given by 

 

 

y(t) = phi[1]*y(t-1) + r(t) 

y(t) = phi[1]*y(t-1) + r(t) (55)
 

 

We now note that we can write the above AR(1) model as: 

 

y(t) = phi[1]*y(t-1) + r(t)   where r(t) is a white-noise error term 

 

y(t-1) = phi[1]*y(t-2) + r(t-1) 

 

y(t-2) = phi[1]*y(t-3) + r(t-2) 

 

y(t-3) = phi[1]*y(t-4) + r(t-3) 

 

y(t-4) = phi[1]*y(t-5) + r(t-4) 

 

etc etc  

 

 

 

So if we, for example, start from y(t-4) and recursively substitute in the previous expressions we get: 

 

 

y[t-4] := expand(phi[1]*y[t-5] + r[t-4]); 

phi[1]*y[t-5] + r[t-4] (56)
 

y[t-3] := expand(phi[1]*y[t-4] + r[t-3]); 

phi[1]^2*y[t-5] + phi[1]*r[t-4] + r[t-3] (57)
 

y[t-2] := expand(phi[1]*y[t-3] + r[t-2]); 

phi[1]^3*y[t-5] + phi[1]^2*r[t-4] + phi[1]*r[t-3] + r[t-2] (58)
 

y[t-1] := expand(phi[1]*y[t-2] + r[t-1]); 

phi[1]^4*y[t-5] + phi[1]^3*r[t-4] + phi[1]^2*r[t-3] + phi[1]*r[t-2] + r[t-1] (59)
 

y[t] := expand(phi[1]*y[t-1] + r[t]); 

phi[1]^5*y[t-5] + phi[1]^4*r[t-4] + phi[1]^3*r[t-3] + phi[1]^2*r[t-2] + phi[1]*r[t-1] + r[t] (60)
 

 

 

Again, this is the result of Wold's decomposition theorem, which states that any stationary series can be expressed as a deterministic part plus a stochastic part. We are now interested in the equations for y(t) and y(t-1), which are given by
 


 

y[t] = phi[1]^5*y[t-5] + phi[1]^4*r[t-4] + phi[1]^3*r[t-3] + phi[1]^2*r[t-2] + phi[1]*r[t-1] + r[t] 

 

y[t-1] = phi[1]^4*y[t-5] + phi[1]^3*r[t-4] + phi[1]^2*r[t-3] + phi[1]*r[t-2] + r[t-1] 

 

or as: 

 

y[t] = r[t] + phi[1]*r[t-1] + phi[1]^2*r[t-2] + phi[1]^3*r[t-3] + phi[1]^4*r[t-4] + phi[1]^5*y[t-5] 

 

y[t-1] = r[t-1] + phi[1]*r[t-2] + phi[1]^2*r[t-3] + phi[1]^3*r[t-4] + phi[1]^4*y[t-5] 

 

 

 

We again note that since we assume that our AR(1) equation is stationary (|phi[1]| < 1), the phi[1]^5*y[t-5] and phi[1]^4*y[t-5] terms would go to zero if we had started the substitution from a more distant value, for example y(t - infinity). 

 

 

This means that we are left with  

 

 

y[t] = r[t] + phi[1]*r[t-1] + phi[1]^2*r[t-2] + phi[1]^3*r[t-3] + phi[1]^4*r[t-4] 

 

y[t-1] = r[t-1] + phi[1]*r[t-2] + phi[1]^2*r[t-3] + phi[1]^3*r[t-4] 

 

 

 

We now take the expectation E on both sides: 

 

E(y[t]) = E(r[t] + phi[1]*r[t-1] + phi[1]^2*r[t-2] + phi[1]^3*r[t-3] + phi[1]^4*r[t-4]) 

 

E(y[t-1]) = E(r[t-1] + phi[1]*r[t-2] + phi[1]^2*r[t-3] + phi[1]^3*r[t-4]) 

 

Which can be written as: 

 

E(y[t]) = E(r[t]) + phi[1]*E(r[t-1]) + phi[1]^2*E(r[t-2]) + phi[1]^3*E(r[t-3]) + phi[1]^4*E(r[t-4]) 

 

E(y[t-1]) = E(r[t-1]) + phi[1]*E(r[t-2]) + phi[1]^2*E(r[t-3]) + phi[1]^3*E(r[t-4]) 

 

 

 

We know from Proposition-5 that the Expected Value of a random variable with any lag is approximately equal to zero. 

 

This means that E(r[t]) = 0, E(r[t-1]) = 0, etc., which gives us: 

 

E(y(t)) = 0 

 

E(y(t-1)) = 0 

 

 

We now note that the AutoCovariance for y(t) and y(t-1) is given by 

 

AutoCovariance(y(t), y(t-1)) = E[(y(t) - E[y(t)]) * (y(t-1) - E[y(t-1)])] 

 

Since E(y(t)) = 0 and E(y(t-1)) = 0 it means that we can write the autocovariance as: 

 

AutoCovariance(y(t), y(t-1)) = E[(y(t) + 0) * (y(t-1) + 0)] = E[y(t) * y(t-1)] 

 

 

So if we substitute in the expressions for y[t] and y[t-1] and multiply them we get: 

 

AutoCovariance(y(t), y(t-1)) = E[ expand( (r[t] + phi[1]*r[t-1] + phi[1]^2*r[t-2] + phi[1]^3*r[t-3] + phi[1]^4*r[t-4]) * (r[t-1] + phi[1]*r[t-2] + phi[1]^2*r[t-3] + phi[1]^3*r[t-4]) ) ] 

 

AutoCovariance(y(t), y(t-1)) = E[ phi[1]*r[t-1]^2 + phi[1]^3*r[t-2]^2 + phi[1]^5*r[t-3]^2 + phi[1]^7*r[t-4]^2 + cross-product terms such as r[t]*r[t-1], phi[1]*r[t]*r[t-2], ... ] (61)
 

 

Which can be written as
 

AutoCovariance(y(t), y(t-1)) = phi[1]*E(r[t-1]^2) + phi[1]^3*E(r[t-2]^2) + phi[1]^5*E(r[t-3]^2) + phi[1]^7*E(r[t-4]^2) + E(r[t]*r[t-1]) + phi[1]*E(r[t]*r[t-2]) + ... (the Expected Values of the cross products) 
 

 

 

 

We know from Proposition-6 that the Expected Value of a squared random variable with any lag is approximately equal to the Variance of that variable. 

 

This means that   E(r[t-4]^2) = sigma^2    E(r[t-3]^2) = sigma^2   etc etc 

 

 

This gives us: 

 

AutoCovariance(y(t), y(t-1)) = phi[1]*sigma^2 + phi[1]^3*sigma^2 + phi[1]^5*sigma^2 + phi[1]^7*sigma^2 + E(r[t]*r[t-1]) + phi[1]*E(r[t]*r[t-2]) + ... (the Expected Values of the cross products) 
 

 

 

 

We know from Proposition-7 that the Expected Value of the product of two random variables at different lags is approximately zero. 

 

This means that  E(r[t-1]*r[t-2]) = 0     E(r[t]*r[t-2]) = 0     etc etc 

 

Again note that these terms are called cross products and their Expected Values are equal to zero. 

 

 

This gives us: 

 

 

AutoCovariance(y(t), y(t-1)) = phi[1]*sigma^2 + phi[1]^3*sigma^2 + phi[1]^5*sigma^2 + phi[1]^7*sigma^2 

 

 

We again note that if we had selected a much older starting value than y(t-4) then our expression would have been infinitely long. This means that we can write the above equation as:
 

AutoCovariance(y(t), y(t-1)) = phi[1]*sigma^2 + phi[1]^3*sigma^2 + phi[1]^5*sigma^2 + phi[1]^7*sigma^2 + ... 

 

 

Which can be written as: 

 

AutoCovariance(y(t), y(t-1)) = sigma^2*(phi[1] + phi[1]^3 + phi[1]^5 + phi[1]^7) 

 

We now note that if we multiply both sides by phi[1]^2 we get: 

 

phi[1]^2 * AutoCovariance(y(t), y(t-1)) = phi[1]^2 * sigma^2*(phi[1] + phi[1]^3 + phi[1]^5 + phi[1]^7) 

 

Which can be written as: 

 

phi[1]^2 * AutoCovariance(y(t), y(t-1)) = sigma^2*(phi[1]^3 + phi[1]^5 + phi[1]^7 + phi[1]^9) 

 

We now subtract the second expression from the first, which gives us: 

 

AutoCovariance(y(t), y(t-1)) - phi[1]^2 * AutoCovariance(y(t), y(t-1)) = sigma^2*(phi[1] + phi[1]^3 + phi[1]^5 + phi[1]^7) - sigma^2*(phi[1]^3 + phi[1]^5 + phi[1]^7 + phi[1]^9) 

 

Which is given by: 

 

simplify(sigma^2*(phi[1] + phi[1]^3 + phi[1]^5 + phi[1]^7) - sigma^2*(phi[1]^3 + phi[1]^5 + phi[1]^7 + phi[1]^9)); 

phi[1]*sigma^2 - sigma^2*phi[1]^9 (62)
 

 

We now note that since we assume stationarity (|phi[1]| < 1), the term sigma^2*phi[1]^9 goes to zero as we push the starting value further back. 

 

 

This means that we can write the above equation as: 

 

 

AutoCovariance(y(t), y(t-1)) - phi[1]^2 * AutoCovariance(y(t), y(t-1)) = phi[1]*sigma^2 

 

which means that: 

 

AutoCovariance(y(t), y(t-1)) * (1 - phi[1]^2) = phi[1]*sigma^2 

 

Which can be written as: 

 

AutoCovariance(y(t), y(t-1)) = phi[1]*sigma^2 / (1 - phi[1]^2) 

 

 

Which is our final expression for the AutoCovariance(y(t), y(t-1)) 
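As before, this lag-1 expression can be sanity-checked by simulation. The Python sketch below (illustrative, not part of the worksheet) compares the sample lag-1 autocovariance of a simulated AR(1) with phi[1]*sigma^2/(1 - phi[1]^2):

```python
import random

# Simulate y(t) = phi1*y(t-1) + r(t) and compare the sample lag-1
# autocovariance with the theoretical value phi1*sigma^2 / (1 - phi1^2).
random.seed(2)
phi1, sigma = 0.8, 1.0
n, burn = 200_000, 1_000

y, ys = 0.0, []
for i in range(n + burn):
    y = phi1 * y + random.gauss(0.0, sigma)
    if i >= burn:
        ys.append(y)

mean = sum(ys) / len(ys)
acov1 = sum((ys[i] - mean) * (ys[i - 1] - mean)
            for i in range(1, len(ys))) / (len(ys) - 1)
theory = phi1 * sigma ** 2 / (1 - phi1 ** 2)
print(acov1, theory)
```

For phi[1] = 0.8 the theoretical value is 0.8/0.36, and the sample estimate typically lands within a few percent of it.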

 

 

 

 

 

 

4.3) AutoCovariance(y(t), y(t-2)) 

 

 

restart; 1 

 

 

We can now calculate the numerator in the equation for the AutoCorrelation(L) given by AutoCovariance(y(t), y(t-L)) 

 

 

We will in this section calculate the AutoCovariance for lag two 

 

 

We now note that the equation for our First Order Autoregressive equation AR(1) is given by 

 

 

y(t) = phi[1]*y(t-1) + r(t) 

y(t) = phi[1]*y(t-1) + r(t) (63)
 

 

We now note that we can write the above AR(1) model as: 

 

y(t) = phi[1]*y(t-1) + r(t)   where r(t) is a white-noise error term 

 

y(t-1) = phi[1]*y(t-2) + r(t-1) 

 

y(t-2) = phi[1]*y(t-3) + r(t-2) 

 

y(t-3) = phi[1]*y(t-4) + r(t-3) 

 

y(t-4) = phi[1]*y(t-5) + r(t-4) 

 

etc etc  

 

 

 

So if we, for example, start from y(t-4) and recursively substitute in the previous expressions we get: 

 

 

y[t-4] := expand(phi[1]*y[t-5] + r[t-4]); 

phi[1]*y[t-5] + r[t-4] (64)
 

y[t-3] := expand(phi[1]*y[t-4] + r[t-3]); 

phi[1]^2*y[t-5] + phi[1]*r[t-4] + r[t-3] (65)
 

y[t-2] := expand(phi[1]*y[t-3] + r[t-2]); 

phi[1]^3*y[t-5] + phi[1]^2*r[t-4] + phi[1]*r[t-3] + r[t-2] (66)
 

y[t-1] := expand(phi[1]*y[t-2] + r[t-1]); 

phi[1]^4*y[t-5] + phi[1]^3*r[t-4] + phi[1]^2*r[t-3] + phi[1]*r[t-2] + r[t-1] (67)
 

y[t] := expand(phi[1]*y[t-1] + r[t]); 

phi[1]^5*y[t-5] + phi[1]^4*r[t-4] + phi[1]^3*r[t-3] + phi[1]^2*r[t-2] + phi[1]*r[t-1] + r[t] (68)
 

 

 

Again, this is the result of Wold's decomposition theorem, which states that any stationary series can be expressed as a deterministic part plus a stochastic part. We are now interested in the equations for y(t) and y(t-2), which are given by
 


 

y[t] = phi[1]^5*y[t-5] + phi[1]^4*r[t-4] + phi[1]^3*r[t-3] + phi[1]^2*r[t-2] + phi[1]*r[t-1] + r[t] 

 

y[t-2] = phi[1]^3*y[t-5] + phi[1]^2*r[t-4] + phi[1]*r[t-3] + r[t-2] 

 

or as: 

 

y[t] = r[t] + phi[1]*r[t-1] + phi[1]^2*r[t-2] + phi[1]^3*r[t-3] + phi[1]^4*r[t-4] + phi[1]^5*y[t-5] 

 

y[t-2] = r[t-2] + phi[1]*r[t-3] + phi[1]^2*r[t-4] + phi[1]^3*y[t-5] 

 

 

We again note that since we assume that our AR(1) equation is stationary (|phi[1]| < 1), the phi[1]^5*y[t-5] and phi[1]^3*y[t-5] terms would go to zero if we had started the substitution from a more distant value, for example y(t - infinity). 

 

 

This means that we are left with  

 

 

y[t] = r[t] + phi[1]*r[t-1] + phi[1]^2*r[t-2] + phi[1]^3*r[t-3] + phi[1]^4*r[t-4] 

 

y[t-2] = r[t-2] + phi[1]*r[t-3] + phi[1]^2*r[t-4] 

 

 

 

We now take the expectation E on both sides: 

 

E(y[t]) = E(r[t] + phi[1]*r[t-1] + phi[1]^2*r[t-2] + phi[1]^3*r[t-3] + phi[1]^4*r[t-4]) 

 

E(y[t-2]) = E(r[t-2] + phi[1]*r[t-3] + phi[1]^2*r[t-4]) 

 

Which can be written as: 

 

E(y[t]) = E(r[t]) + phi[1]*E(r[t-1]) + phi[1]^2*E(r[t-2]) + phi[1]^3*E(r[t-3]) + phi[1]^4*E(r[t-4]) 

 

E(y[t-2]) = E(r[t-2]) + phi[1]*E(r[t-3]) + phi[1]^2*E(r[t-4]) 

 

 

 

We know from Proposition-5 that the Expected Value of a random variable with any lag is approximately equal to zero. 

 

This means that E(r[t]) = 0, E(r[t-2]) = 0, etc., which gives us: 

 

E(y(t)) = 0 

 

E(y(t-2)) = 0 

 

 

We now note that the AutoCovariance for y(t) and y(t-2) is given by 

 

AutoCovariance(y(t), y(t-2)) = E[(y(t) - E[y(t)]) * (y(t-2) - E[y(t-2)])] 

 

Since E(y(t)) = 0 and E(y(t-2)) = 0 it means that we can write the autocovariance as: 

 

AutoCovariance(y(t), y(t-2)) = E[(y(t) + 0) * (y(t-2) + 0)] = E[y(t) * y(t-2)] 

 

 

So if we substitute in the expressions for y[t] and y[t-2] and multiply them we get: 

 

AutoCovariance(y(t), y(t-2)) = E[ expand( (r[t] + phi[1]*r[t-1] + phi[1]^2*r[t-2] + phi[1]^3*r[t-3] + phi[1]^4*r[t-4]) * (r[t-2] + phi[1]*r[t-3] + phi[1]^2*r[t-4]) ) ] 

 

AutoCovariance(y(t), y(t-2)) = E[ phi[1]^2*r[t-2]^2 + phi[1]^4*r[t-3]^2 + phi[1]^6*r[t-4]^2 + cross-product terms such as r[t]*r[t-2], phi[1]*r[t]*r[t-3], ... ] (69)
 

 

Which can be written as
 

AutoCovariance(y(t), y(t-2)) = phi[1]^2*E(r[t-2]^2) + phi[1]^4*E(r[t-3]^2) + phi[1]^6*E(r[t-4]^2) + E(r[t]*r[t-2]) + phi[1]*E(r[t]*r[t-3]) + ... (the Expected Values of the cross products) 
 

 

 

 

We know from Proposition-6 that the Expected Value of a squared random variable with any lag is approximately equal to the Variance of that variable. 

 

This means that   E(r[t-4]^2) = sigma^2    E(r[t-3]^2) = sigma^2   etc etc 

 

 

This gives us: 

 

AutoCovariance(y(t), y(t-2)) = phi[1]^2*sigma^2 + phi[1]^4*sigma^2 + phi[1]^6*sigma^2 + E(r[t]*r[t-2]) + phi[1]*E(r[t]*r[t-3]) + ... (the Expected Values of the cross products) 
 

 

 

 

We know from Proposition-7 that the Expected Value of the product of two random variables at different lags is approximately zero. 

 

This means that  E(r[t-1]*r[t-2]) = 0     E(r[t]*r[t-2]) = 0     etc etc 

 

Again note that these terms are called cross products and their Expected Values are equal to zero. 

 

 

This gives us: 

 

 

AutoCovariance(y(t), y(t-2)) = phi[1]^2*sigma^2 + phi[1]^4*sigma^2 + phi[1]^6*sigma^2 

 

 

We again note that if we had selected a much older starting value than y(t-4) then our expression would have been infinitely long. This means that we can write the above equation as:
 

AutoCovariance(y(t), y(t-2)) = phi[1]^2*sigma^2 + phi[1]^4*sigma^2 + phi[1]^6*sigma^2 + ... 

 

 

Which can be written as: 

 

AutoCovariance(y(t), y(t-2)) = sigma^2*(phi[1]^2 + phi[1]^4 + phi[1]^6) 

 

We now note that if we multiply both sides by phi[1]^2 we get: 

 

phi[1]^2 * AutoCovariance(y(t), y(t-2)) = phi[1]^2 * sigma^2*(phi[1]^2 + phi[1]^4 + phi[1]^6) 

 

Which can be written as: 

 

phi[1]^2 * AutoCovariance(y(t), y(t-2)) = sigma^2*(phi[1]^4 + phi[1]^6 + phi[1]^8) 

 

We now subtract the second expression from the first, which gives us: 

 

AutoCovariance(y(t), y(t-2)) - phi[1]^2 * AutoCovariance(y(t), y(t-2)) = sigma^2*(phi[1]^2 + phi[1]^4 + phi[1]^6) - sigma^2*(phi[1]^4 + phi[1]^6 + phi[1]^8) 

 

Which is given by: 

 

simplify(sigma^2*(phi[1]^2 + phi[1]^4 + phi[1]^6) - sigma^2*(phi[1]^4 + phi[1]^6 + phi[1]^8)); 

phi[1]^2*sigma^2 - sigma^2*phi[1]^8 (70)
 

 

We now note that since we assume stationarity (|phi[1]| < 1), the term sigma^2*phi[1]^8 goes to zero as we push the starting value further back. 

 

 

This means that we can write the above equation as: 

 

 

AutoCovariance(y(t), y(t-2)) - phi[1]^2 * AutoCovariance(y(t), y(t-2)) = phi[1]^2*sigma^2 

 

which means that: 

 

AutoCovariance(y(t), y(t-2)) * (1 - phi[1]^2) = phi[1]^2*sigma^2 

 

Which can be written as: 

 

AutoCovariance(y(t), y(t-2)) = phi[1]^2*sigma^2 / (1 - phi[1]^2) 

 

 

This is our final expression for AutoCovariance(y(t), y(t-2)).
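The closed form can be sanity-checked numerically: for |phi[1]| < 1 the truncated geometric tail sigma^2*(phi[1]^2 + phi[1]^4 + ...) converges to phi[1]^2*sigma^2/(1 - phi[1]^2). A quick sketch in Python (the parameter values are illustrative, not from the worksheet):

```python
# Numerical check: the geometric tail sigma^2 * sum_{k>=1} phi1^(2k)
# equals the closed form phi1^2 * sigma^2 / (1 - phi1^2) when |phi1| < 1.
phi1 = 0.5
sigma2 = 2.0  # illustrative innovation variance

truncated = sigma2 * sum(phi1 ** (2 * k) for k in range(1, 200))
closed_form = phi1 ** 2 * sigma2 / (1 - phi1 ** 2)

print(abs(truncated - closed_form) < 1e-12)  # True: the two agree
```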

 

 

 

 

 

 

4.4) Summary for Our First-Order Autoregressive Model 

 

 

 

In the previous sections we calculated the AutoCovariances between y(t) and each of y(t), y(t-1) and y(t-2): 

 

AutoCovariance(y(t), y(t)) = sigma^2 / (1 - phi[1]^2)

AutoCovariance(y(t), y(t-1)) = phi[1] * sigma^2 / (1 - phi[1]^2)

AutoCovariance(y(t), y(t-2)) = phi[1]^2 * sigma^2 / (1 - phi[1]^2)
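One way to see that these three closed forms hang together: each formula is phi[1] times the previous one, i.e. the autocovariances satisfy the AR(1) recursion gamma(L) = phi[1] * gamma(L-1). A minimal check in Python (illustrative parameter values):

```python
# Check that the closed-form AR(1) autocovariances satisfy the
# recursion gamma(L) = phi1 * gamma(L-1) for L = 1, 2.
phi1, sigma2 = 0.5, 1.0
gamma = [phi1 ** L * sigma2 / (1 - phi1 ** 2) for L in range(3)]

print(all(abs(gamma[L] - phi1 * gamma[L - 1]) < 1e-12 for L in (1, 2)))  # True
```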

 

 

 

We know that we can calculate the AutoCorrelation for any lag as: 

 

 

AutoCorrelation(L) = AutoCovariance(y(t), y(t-L)) / AutoCovariance(y(t), y(t))

 

 

This means that: 

 

AutoCorrelation(0) = AutoCovariance(y(t), y(t)) / AutoCovariance(y(t), y(t)) = 1

AutoCorrelation(1) = AutoCovariance(y(t), y(t-1)) / AutoCovariance(y(t), y(t)) = phi[1]

AutoCorrelation(2) = AutoCovariance(y(t), y(t-2)) / AutoCovariance(y(t), y(t)) = phi[1]^2

since the common factor sigma^2/(1 - phi[1]^2) cancels in each ratio.

 

 

 

For example, if we assume that

phi[1] := 0.5;

then the AutoCorrelation at each lag is given by
 

 

 

AutoCorrelation(1) = phi[1]

AutoCorrelation(1) = 0.5 (71)

AutoCorrelation(2) = phi[1]^2

AutoCorrelation(2) = 0.25 (72)

AutoCorrelation(3) = phi[1]^3

AutoCorrelation(3) = 0.125 (73)

AutoCorrelation(4) = phi[1]^4

AutoCorrelation(4) = 0.0625 (74)

AutoCorrelation(5) = phi[1]^5

AutoCorrelation(5) = 0.03125 (75)
 

 

and so on: AutoCorrelation(L) = phi[1]^L for any lag L. 
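Each value in the table above is just the previous one multiplied by phi[1] = 0.5. The list can be reproduced in one line of Python:

```python
# AutoCorrelation(L) = phi1^L for an AR(1); reproduce lags 1..5 above.
phi1 = 0.5
acf = [phi1 ** L for L in range(1, 6)]
print(acf)  # [0.5, 0.25, 0.125, 0.0625, 0.03125]
```

(Powers of 0.5 are exact in binary floating point, so the printed values match the table exactly.)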

 

 

 

 

We can confirm that this indeed holds by simulating an AR(1) process. 

 

 

restart;
with(Statistics):
randomize():
n := 5000:
phi1 := 0.5:
r := Sample(RandomVariable(Normal(0, 1)), n):
y[1] := 0:
for t from 2 to n do
  y[t] := phi1*y[t-1] + r[t]
end do:
# (remaining statements, truncated in this export, computed the serial
# correlations and the plots shown below)
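For readers without Maple, a plain-Python version of the same experiment is sketched below. Here random.gauss stands in for Maple's Normal(0, 1) sample, and the seed, sample size, and the acf helper are my own choices, not part of the original worksheet; the sample serial correlations should land close to 0.5, 0.25, 0.125, and so on.

```python
import random

# Simulate an AR(1): y(t) = phi1*y(t-1) + u(t), with u(t) ~ N(0, 1).
random.seed(1)            # fixed seed for reproducibility (assumption)
n, phi1 = 50000, 0.5
y = [0.0]
for _ in range(1, n):
    y.append(phi1 * y[-1] + random.gauss(0.0, 1.0))

mean = sum(y) / n
var = sum((v - mean) ** 2 for v in y) / n

def acf(k):
    # Sample autocorrelation at lag k.
    cov = sum((y[t] - mean) * (y[t - k] - mean) for t in range(k, n)) / n
    return cov / var

for k in range(1, 6):
    print(k, round(acf(k), 3))  # each should be near phi1**k
```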
 

 

 

 

 

 

 

[2-D plot of the simulated series]

first order serial correlation = 0.4869985889
second order serial correlation = 0.2293958134
third order serial correlation = 0.1069916824
fourth order serial correlation = 0.05752349755
fifth order serial correlation = 0.01499858304

[2-D plot]
 

 

 

Legal Notice: © Maplesoft, a division of Waterloo Maple Inc. 2009. Maplesoft and Maple are trademarks of Waterloo Maple Inc. Neither Maplesoft nor the authors are responsible for any errors contained within and are not liable for any damages resulting from the use of this material. This application is intended for non-commercial, non-profit use only. Contact the authors for permission if you wish to use this application in for-profit activities.