Solutions Manual to Accompany
Time Series Analysis
with Applications in R, Second Edition
by Jonathan D. Cryer and Kung-Sik Chan
Solutions by Jonathan Cryer and Xuemiao Hao, updated 7/28/08
CHAPTER 1
Exercise 1.1 Use software to produce the time series plot shown in Exhibit (1.2), page 2. The following R code will
produce the graph.
> library(TSA); data(larain); win.graph(width=3,height=3,pointsize=8)
> plot(y=larain,x=zlag(larain),ylab='Inches',xlab='Previous Year Inches')
Exercise 1.2 Produce the time series plot displayed in Exhibit (1.3), page 3. Use the R code
> data(color); plot(color,ylab='Color Property',xlab='Batch',type='o')
Exercise 1.3 Simulate a completely random process of length 48 with independent, normal values. Repeat this exercise several times with a new simulation, that is, a new seed, each time.
> plot(ts(rnorm(n=48)),type='o')
# If you repeat this command, R will use new “random numbers” each time. If you want to
# reproduce the same simulation, first use the command set.seed(#########), where
# ######### is an integer of your choice.
Exercise 1.4 Simulate a completely random process of length 48 with independent, chi-square distributed values each with 2 degrees of freedom. Use the same R code as in the solution of Exercise 1.3 but replace rnorm(n=48) with rchisq(n=48,df=2).
Exercise 1.5 Simulate a completely random process of length 48 with independent, t-distributed values each with 5 degrees of freedom. Construct the time series plot. Use the same R code as in the solution of Exercise 1.3 but replace rnorm(n=48) with rt(n=48,df=5).
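For example, a minimal sketch of both simulations with a seed for reproducibility (the seed value is an arbitrary choice):
> set.seed(15448)
> plot(ts(rchisq(n=48,df=2)),type='o') # Exercise 1.4: chi-square values, 2 df
> plot(ts(rt(n=48,df=5)),type='o') # Exercise 1.5: t-distributed values, 5 df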
Exercise 1.6 Construct a time series plot with monthly plotting symbols for the Dubuque temperature series as in Exhibit (1.7), page 6. (Make the plot full screen so that you can see all of the detail.)
> data(tempdub); plot(tempdub,ylab='Temperature')
> points(y=tempdub,x=time(tempdub), pch=as.vector(season(tempdub)))
CHAPTER 2
Exercise 2.1 Suppose E(X) = 2, Var(X) = 9, E(Y) = 0, Var(Y) = 4, and Corr(X,Y) = 0.25. Find:
(a) Var(X + Y) = Var(X) + Var(Y) + 2Cov(X,Y) = 9 + 4 + 2(3*2*0.25) = 16
(b) Cov(X, X + Y) = Cov(X,X) + Cov(X,Y) = 9 + (3*2*0.25) = 9 + 3/2 = 10.5
(c) Corr(X + Y, X − Y). As in part (a), Var(X − Y) = 9 + 4 − 2(3*2*0.25) = 10. Then Cov(X + Y, X − Y) = Cov(X,X) − Cov(Y,Y) + Cov(X,Y) − Cov(X,Y) = Var(X) − Var(Y) = 9 − 4 = 5. So
Corr(X + Y, X − Y) = Cov(X + Y, X − Y)/√[Var(X + Y)Var(X − Y)] = 5/√(16×10) = 5/(4√10) ≈ 0.39528471
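As a numerical check, here is a simulation sketch; the use of MASS::mvrnorm and the sample size are my own choices, not part of the exercise. Note that Cov(X,Y) = 0.25*3*2 = 1.5.
> library(MASS) # for mvrnorm
> set.seed(1)
> xy=mvrnorm(n=1000000,mu=c(2,0),Sigma=matrix(c(9,1.5,1.5,4),2,2))
> x=xy[,1]; y=xy[,2]
> var(x+y); cov(x,x+y); cor(x+y,x-y) # approximately 16, 10.5, and 0.3953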
Exercise 2.2 If X and Y are dependent but Var(X) = Var(Y), find Cov(X + Y, X − Y).
Cov(X + Y, X − Y) = Cov(X,X) − Cov(Y,Y) + Cov(X,Y) − Cov(Y,X) = Var(X) − Var(Y) = 0
Exercise 2.3 Let X have a distribution with mean μ and variance σ² and let Y_t = X for all t.
(a) Show that {Y_t} is strictly and weakly stationary. Let t_1, t_2,…, t_n be any set of time points and k any time lag. Then
Pr(Y_{t_1} ≤ y_{t_1}, Y_{t_2} ≤ y_{t_2}, …, Y_{t_n} ≤ y_{t_n}) = Pr(X ≤ y_{t_1}, X ≤ y_{t_2}, …, X ≤ y_{t_n}) = Pr(Y_{t_1−k} ≤ y_{t_1}, Y_{t_2−k} ≤ y_{t_2}, …, Y_{t_n−k} ≤ y_{t_n})
as required for strict stationarity. Since the autocovariance clearly exists (see part (b)), the process is also weakly stationary.
(b) Find the autocovariance function for {Y_t}. Cov(Y_t, Y_{t−k}) = Cov(X,X) = σ² for all t and k, free of t (and k).
(c) Sketch a “typical” time plot of Y_t. The plot will be a horizontal “line” (really a discrete-time horizontal line) at the height of the observed X.
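A quick way to draw such a plot (a sketch; the single standard normal draw for X is an arbitrary choice):
> set.seed(2); X=rnorm(1)
> plot(ts(rep(X,48)),type='o',ylab='Y') # constant at the observed value of X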
Exercise 2.4 Let {e_t} be a zero mean white noise process. Suppose that the observed process is Y_t = e_t + θe_{t−1} where θ is either 3 or 1/3.
(a) Find the autocorrelation function for {Y_t} both when θ = 3 and when θ = 1/3. E(Y_t) = E(e_t + θe_{t−1}) = 0. Also Var(Y_t) = Var(e_t + θe_{t−1}) = σ² + θ²σ² = σ²(1 + θ²). Also Cov(Y_t, Y_{t−1}) = Cov(e_t + θe_{t−1}, e_{t−1} + θe_{t−2}) = θσ², free of t. Now for k > 1, Cov(Y_t, Y_{t−k}) = Cov(e_t + θe_{t−1}, e_{t−k} + θe_{t−k−1}) = 0 since all of these error terms are uncorrelated. So
ρ_k = Corr(Y_t, Y_{t−k}) = 1 for k = 0, ρ_k = θσ²/[σ²(1 + θ²)] = θ/(1 + θ²) for k = 1, and ρ_k = 0 for k > 1.
But 3/(1 + 3²) = 3/10 and (1/3)/[1 + (1/3)²] = 3/10. So the autocorrelation functions are identical.
(b) You should have discovered that the time series is stationary regardless of the value of θ and that the autocorrelation functions are the same for θ = 3 and θ = 1/3. For simplicity, suppose that the process mean is known to be zero and the variance of Y_t is known to be 1. You observe the series {Y_t} for t = 1, 2,..., n and suppose that you can produce good estimates of the autocorrelations ρ_k. Do you think that you could determine which value of θ is correct (3 or 1/3) based on the estimate of ρ_k? Why or why not? No: by part (a) the two values of θ produce identical autocorrelation functions, so no estimate of ρ_k, however good, can distinguish between them.
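The identity of the two autocorrelation functions can be confirmed with the built-in ARMAacf function, whose MA sign convention matches Y_t = e_t + θe_{t−1}:
> ARMAacf(ma=3,lag.max=3) # 1.0, 0.3, 0.0, 0.0
> ARMAacf(ma=1/3,lag.max=3) # identical: 1.0, 0.3, 0.0, 0.0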
Exercise 2.5 Suppose Y_t = 5 + 2t + X_t where {X_t} is a zero mean stationary series with autocovariance function γ_k.
(a) Find the mean function for {Y_t}. E(Y_t) = E(5 + 2t + X_t) = 5 + 2t + E(X_t) = 5 + 2t.
(b) Find the autocovariance function for {Y_t}. Cov(Y_t, Y_{t−k}) = Cov(5 + 2t + X_t, 5 + 2(t − k) + X_{t−k}) = Cov(X_t, X_{t−k}) = γ_k, free of t.
(c) Is {Y_t} stationary? (Why or why not?) In spite of part (b), the process {Y_t} is not stationary since its mean varies with time.
Exercise 2.6 Let {X_t} be a stationary time series and define Y_t = X_t for t odd and Y_t = X_t + 3 for t even.
(a) Show that Cov(Y_t, Y_{t−k}) is free of t for all lags k. Since adding constants does not change covariances, Cov(Y_t, Y_{t−k}) = Cov(X_t + 3, X_{t−k} + 3) = Cov(X_t, X_{t−k}), which is free of t since {X_t} is stationary.
(b) Is {Y_t} stationary? {Y_t} is not stationary since E(Y_t) = E(X_t) = μ_X for t odd but E(Y_t) = E(X_t + 3) = μ_X + 3 for t even.
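A simulation sketch of part (b)'s failure of stationarity (the AR(1) choice for {X_t} is arbitrary; any stationary series works):
> set.seed(3); n=100; X=arima.sim(model=list(ar=0.5),n=n)
> Y=X+3*(seq_len(n)%%2==0) # add 3 at the even time points
> plot(ts(Y),type='o') # the level alternates between mu_X and mu_X + 3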
Exercise 2.7 Suppose that {Y_t} is stationary with autocovariance function γ_k.
(a) Show that W_t = ∇Y_t = Y_t − Y_{t−1} is stationary by finding the mean and autocovariance function for {W_t}. E(W_t) = E(Y_t − Y_{t−1}) = E(Y_t) − E(Y_{t−1}) = 0 since {Y_t} is stationary. Also
Cov(W_t, W_{t−k}) = Cov(Y_t − Y_{t−1}, Y_{t−k} − Y_{t−k−1}) = Cov(Y_t, Y_{t−k}) − Cov(Y_t, Y_{t−k−1}) − Cov(Y_{t−1}, Y_{t−k}) + Cov(Y_{t−1}, Y_{t−k−1}) = γ_k − γ_{k+1} − γ_{k−1} + γ_k = 2γ_k − γ_{k+1} − γ_{k−1}, free of t.
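The formula 2γ_k − γ_{k+1} − γ_{k−1} can be checked numerically. In this sketch {Y_t} is taken to be an AR(1) with φ = 0.7 and unit noise variance, so that γ_k = φ^|k|/(1 − φ²); the example model is my choice, not part of the exercise:
> phi=0.7; g=function(k) phi^abs(k)/(1-phi^2) # AR(1) autocovariance function
> gW=function(k) 2*g(k)-g(k+1)-g(k-1) # part (a) formula for W = diff(Y)
> gW(0:4)/gW(0) # theoretical ACF of the differenced series: 1, -0.15, ...
> set.seed(4)
> acf(diff(arima.sim(model=list(ar=phi),n=50000)),lag.max=4,plot=FALSE) # agrees closely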
(b) Show that U_t = ∇²Y_t = ∇[Y_t − Y_{t−1}] = Y_t − 2Y_{t−1} + Y_{t−2} is stationary. (You need not find the mean and autocovariance function for {U_t}.) U_t is the first difference of the process {∇Y_t}. By part (a), {∇Y_t} is stationary. So U_t is the difference of a stationary process and, again by part (a), is itself stationary.
Exercise 2.8 Suppose that {Y_t} is stationary with autocovariance function γ_k. Show that for any fixed positive integer n and any constants c_1, c_2,..., c_n, the process {W_t} defined by W_t = c_1Y_t + c_2Y_{t−1} + … + c_nY_{t−n+1} is stationary. First
E(W_t) = c_1E(Y_t) + c_2E(Y_{t−1}) + … + c_nE(Y_{t−n+1}) = (c_1 + c_2 + … + c_n)μ_Y, free of t. Also
Cov(W_t, W_{t−k}) = Cov(c_1Y_t + c_2Y_{t−1} + … + c_nY_{t−n+1}, c_1Y_{t−k} + c_2Y_{t−1−k} + … + c_nY_{t−n+1−k}) = Σ_{j=1}^{n}Σ_{i=1}^{n} c_j c_i Cov(Y_{t−j+1}, Y_{t−k−i+1}) = Σ_{j=1}^{n}Σ_{i=1}^{n} c_j c_i γ_{k+i−j}, free of t.
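A quick simulation sketch (the weights and the AR(1) base series are arbitrary choices): a fixed linear filter of a stationary series is again stationary.
> set.seed(13); Y=arima.sim(model=list(ar=0.5),n=5000)
> W=filter(Y,filter=c(0.5,0.3,0.2),method='convolution',sides=1) # W_t = 0.5Y_t + 0.3Y_{t-1} + 0.2Y_{t-2}
> acf(W,na.action=na.pass) # a stable sample ACF, as stationarity implies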
Exercise 2.9 Suppose Y_t = β_0 + β_1t + X_t where {X_t} is a zero mean stationary series with autocovariance function γ_k and β_0 and β_1 are constants.
(a) Show that {Y_t} is not stationary but that W_t = ∇Y_t = Y_t − Y_{t−1} is stationary. {Y_t} is not stationary since its mean, β_0 + β_1t, varies with t. However, E(W_t) = E(Y_t − Y_{t−1}) = (β_0 + β_1t) − (β_0 + β_1(t − 1)) = β_1, free of t. The argument in the solution of Exercise 2.7 shows that the covariance function for {W_t} is free of t.
(b) In general, show that if Y_t = μ_t + X_t where {X_t} is a zero mean stationary series and μ_t is a polynomial in t of degree d, then ∇^mY_t = ∇(∇^{m−1}Y_t) is stationary for m ≥ d and nonstationary for 0 ≤ m < d. Use part (a) and proceed by induction: each difference reduces the degree of the polynomial mean by one (∇μ_t is a polynomial of degree d − 1) while keeping the stationary part stationary, so after d differences the mean is constant, but after fewer than d differences it still varies with t.
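A sketch illustrating part (a) by simulation; the values β_0 = 2, β_1 = 0.5 and the AR(1) noise are arbitrary choices:
> set.seed(5); tt=1:200
> Y=2+0.5*tt+arima.sim(model=list(ar=0.6),n=200) # linear trend plus stationary noise
> plot(ts(diff(Y)),type='o') # W fluctuates about beta_1 = 0.5 with stable behavior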
Exercise 2.10 Let {X_t} be a zero-mean, unit-variance stationary process with autocorrelation function ρ_k. Suppose that μ_t is a nonconstant function and that σ_t is a positive-valued nonconstant function. The observed series is formed as Y_t = μ_t + σ_tX_t.
(a) Find the mean and covariance function for the {Y_t} process. Notice that Cov(X_t, X_{t−k}) = Corr(X_t, X_{t−k}) since {X_t} has unit variance. E(Y_t) = E(μ_t + σ_tX_t) = μ_t + σ_tE(X_t) = μ_t. Now Cov(Y_t, Y_{t−k}) = Cov(μ_t + σ_tX_t, μ_{t−k} + σ_{t−k}X_{t−k}) = σ_tσ_{t−k}Cov(X_t, X_{t−k}) = σ_tσ_{t−k}ρ_k. Notice that Var(Y_t) = (σ_t)².
(b) Show that the autocorrelation function for the {Y_t} process depends only on the time lag. Is the {Y_t} process stationary? Corr(Y_t, Y_{t−k}) = σ_tσ_{t−k}ρ_k/[σ_tσ_{t−k}] = ρ_k, but {Y_t} is not necessarily stationary since E(Y_t) = μ_t varies with t.
(c) Is it possible to have a time series with a constant mean and with Corr(Y_t, Y_{t−k}) free of t but with {Y_t} not stationary? Yes: if μ_t is constant but σ_t varies with t, this will be the case, since then Var(Y_t) = (σ_t)² still depends on t.
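A sketch of part (c): constant (zero) mean and lag-free autocorrelation, yet nonstationary because the variance changes; the AR(1) for {X_t} and the tripling of σ_t are arbitrary choices:
> set.seed(14); n=200
> X=arima.sim(model=list(ar=0.7),n=n)/sqrt(1/(1-0.7^2)) # rescaled to unit variance
> s=seq(1,3,length.out=n); Y=s*X # mu_t = 0, sigma_t = s increases over time
> plot(ts(Y),type='o') # the spread grows with t even though the mean is constant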
Exercise 2.11 Suppose Cov(X_t, X_{t−k}) = γ_k is free of t but that E(X_t) = 3t.
(a) Is {X_t} stationary? No, since E(X_t) varies with t.
(b) Let Y_t = 7 − 3t + X_t. Is {Y_t} stationary? Yes, since the covariances are unchanged but now E(Y_t) = 7 − 3t + 3t = 7, free of t.
Exercise 2.12 Suppose that Y_t = e_t − e_{t−12}. Show that {Y_t} is stationary and that, for k > 0, its autocorrelation function is nonzero only for lag k = 12.
E(Y_t) = E(e_t − e_{t−12}) = 0. Also Cov(Y_t, Y_{t−k}) = Cov(e_t − e_{t−12}, e_{t−k} − e_{t−12−k}) = −Cov(e_{t−12}, e_{t−k}) = −(σ_e)² when k = 12. It is nonzero only for k = 12 since, otherwise, all of the error terms involved are uncorrelated.
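Since Var(Y_t) = 2σ_e², the lag 12 autocorrelation is −1/2. A simulation sketch (length and seed arbitrary):
> set.seed(6); e=rnorm(512)
> Y=e[13:512]-e[1:500] # Y_t = e_t - e_{t-12}
> acf(Y,lag.max=24) # only the spike of about -0.5 at lag 12 stands out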
Exercise 2.13 Let Y_t = e_t − θe_{t−1}². For this exercise, assume that the white noise series is normally distributed.
(a) Find the autocorrelation function for {Y_t}. First recall that for a zero-mean normal distribution E(e_{t−1}³) = 0 and E(e_{t−1}⁴) = 3σ_e⁴. Then E(Y_t) = −θVar(e_{t−1}) = −θσ_e², which is constant in t, and
Var(Y_t) = Var(e_t) + θ²Var(e_{t−1}²) = σ_e² + θ²{E(e_{t−1}⁴) − [E(e_{t−1}²)]²} = σ_e² + θ²{3σ_e⁴ − [σ_e²]²} = σ_e² + 2θ²σ_e⁴.
Also Cov(Y_t, Y_{t−1}) = Cov(e_t − θe_{t−1}², e_{t−1} − θe_{t−2}²) = Cov(−θe_{t−1}², e_{t−1}) = −θE(e_{t−1}³) = 0.
All other covariances are also zero.
(b) Is {Y_t} stationary? Yes, in fact, it is a non-normal white noise in disguise!
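A simulation sketch of the “white noise in disguise” claim (θ = 0.8 and the series length are arbitrary choices):
> set.seed(7); theta=0.8; e=rnorm(5001)
> Y=e[-1]-theta*e[-5001]^2 # Y_t = e_t - theta*e_{t-1}^2
> acf(Y) # no significant autocorrelation at any lag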
Exercise 2.14 Evaluate the mean and covariance function for each of the following processes. In each case determine whether or not the process is stationary.
(a) Y_t = θ_0 + te_t. The mean is θ_0 but it is not stationary since Var(Y_t) = t²Var(e_t) = t²σ² is not free of t.
(b) W_t = ∇Y_t where Y_t is as given in part (a). W_t = ∇Y_t = (θ_0 + te_t) − (θ_0 + (t − 1)e_{t−1}) = te_t − (t − 1)e_{t−1}. So the mean of W_t is zero. However, Var(W_t) = [t² + (t − 1)²](σ_e)², which depends on t, so W_t is not stationary.
(c) Y_t = e_te_{t−1}. (You may assume that {e_t} is normal white noise.) The mean of Y_t is clearly zero. Lag one is the only lag at which there might be correlation. However, Cov(Y_t, Y_{t−1}) = E(e_te_{t−1}e_{t−1}e_{t−2}) = E(e_t)E[e_{t−1}²]E(e_{t−2}) = 0. So the process Y_t = e_te_{t−1} is stationary and is a non-normal white noise!
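Part (c) can be checked by simulation (a sketch; length and seed arbitrary):
> set.seed(8); e=rnorm(2001)
> Y=e[-1]*e[-2001] # Y_t = e_t * e_{t-1}
> acf(Y) # sample ACF consistent with white noise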
Exercise 2.15 Suppose that X is a random variable with zero mean. Define a time series by Y_t = (−1)^tX.
(a) Find the mean function for {Y_t}. E(Y_t) = (−1)^tE(X) = 0.
(b) Find the covariance function for {Y_t}. Cov(Y_t, Y_{t−k}) = Cov[(−1)^tX, (−1)^{t−k}X] = (−1)^{2t−k}Cov(X,X) = (−1)^k(σ_X)².
(c) Is {Y_t} stationary? Yes, the mean is constant and the covariance only depends on the lag.
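A sketch of a typical realization (the single standard normal draw for X is an arbitrary choice):
> set.seed(9); X=rnorm(1)
> plot(ts((-1)^(1:48)*X),type='o',ylab='Y') # alternates between -X and X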
Exercise 2.16 Suppose Y_t = A + X_t where {X_t} is stationary and A is random but independent of {X_t}. Find the mean and covariance function for {Y_t} in terms of the mean and autocovariance function for {X_t} and the mean and variance of A. First E(Y_t) = E(A) + E(X_t) = μ_A + μ_X, free of t. Also, since {X_t} and A are independent,
Cov(Y_t, Y_{t−k}) = Cov(A + X_t, A + X_{t−k}) = Cov(A,A) + Cov(X_t, X_{t−k}) = Var(A) + γ_k^X, free of t.
Exercise 2.17 Let {Y_t} be stationary with autocovariance function γ_k. Let Ȳ = (1/n)Σ_{t=1}^{n} Y_t. Show that
Var(Ȳ) = γ_0/n + (2/n)Σ_{k=1}^{n−1}(1 − k/n)γ_k = (1/n)Σ_{k=−n+1}^{n−1}(1 − |k|/n)γ_k.
First,
Var(Ȳ) = (1/n²)Var(Σ_{t=1}^{n} Y_t) = (1/n²)Cov(Σ_{t=1}^{n} Y_t, Σ_{s=1}^{n} Y_s) = (1/n²)Σ_{t=1}^{n}Σ_{s=1}^{n} γ_{t−s}.
Now make the change of variable t − s = k and t = j in the double sum. The range of the summation {1 ≤ t ≤ n, 1 ≤ s ≤ n} is transformed into {1 ≤ j ≤ n, 1 ≤ j − k ≤ n} = {k + 1 ≤ j ≤ n + k, 1 ≤ j ≤ n}, which may be written {k > 0, k + 1 ≤ j ≤ n} ∪ {k ≤ 0, 1 ≤ j ≤ n + k}. Thus
Var(Ȳ) = (1/n²)[Σ_{k=1}^{n−1}Σ_{j=k+1}^{n} γ_k + Σ_{k=−n+1}^{0}Σ_{j=1}^{n+k} γ_k] = (1/n²)[Σ_{k=1}^{n−1}(n − k)γ_k + Σ_{k=−n+1}^{0}(n + k)γ_k] = (1/n)Σ_{k=−n+1}^{n−1}(1 − |k|/n)γ_k.
Use γ_k = γ_{−k} to get the first expression in the exercise.
Exercise 2.18 Let {Y_t} be stationary with autocovariance function γ_k. Define the sample variance as S² = [1/(n − 1)]Σ_{t=1}^{n}(Y_t − Ȳ)².
(a) First show that Σ_{t=1}^{n}(Y_t − μ)² = Σ_{t=1}^{n}(Y_t − Ȳ)² + n(Ȳ − μ)². Writing Y_t − μ = (Y_t − Ȳ) + (Ȳ − μ) and expanding,
Σ_{t=1}^{n}(Y_t − μ)² = Σ_{t=1}^{n}(Y_t − Ȳ)² + Σ_{t=1}^{n}(Ȳ − μ)² + 2Σ_{t=1}^{n}(Y_t − Ȳ)(Ȳ − μ) = Σ_{t=1}^{n}(Y_t − Ȳ)² + n(Ȳ − μ)² + 2(Ȳ − μ)Σ_{t=1}^{n}(Y_t − Ȳ) = Σ_{t=1}^{n}(Y_t − Ȳ)² + n(Ȳ − μ)²,
since Σ_{t=1}^{n}(Y_t − Ȳ) = 0.
(b) Use part (a) to show that E(S²) = [n/(n − 1)]γ_0 − [n/(n − 1)]Var(Ȳ) = γ_0 − [2/(n − 1)]Σ_{k=1}^{n−1}(1 − k/n)γ_k. (Use the results of Exercise 2.17 for the last expression.) Taking expected values in part (a),
E(S²) = [1/(n − 1)]E[Σ_{t=1}^{n}(Y_t − Ȳ)²] = [1/(n − 1)]{Σ_{t=1}^{n}E(Y_t − μ)² − nE(Ȳ − μ)²} = [1/(n − 1)][nγ_0 − nVar(Ȳ)] = [1/(n − 1)]{nγ_0 − n[γ_0/n + (2/n)Σ_{k=1}^{n−1}(1 − k/n)γ_k]} = γ_0 − [2/(n − 1)]Σ_{k=1}^{n−1}(1 − k/n)γ_k.
(c) If {Y_t} is a white noise process with variance γ_0, show that E(S²) = γ_0. This follows since for white noise γ_k = 0 for k > 0.
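The part (b) formula is easy to verify by Monte Carlo. This sketch uses an AR(1) with φ = 0.6 and unit noise variance, so that γ_k = φ^|k|/(1 − φ²); the model, n, and replication count are my own choices:
> set.seed(10); phi=0.6; n=50
> g=function(k) phi^abs(k)/(1-phi^2) # theoretical autocovariances
> g(0)-(2/(n-1))*sum((1-(1:(n-1))/n)*g(1:(n-1))) # theoretical E(S^2)
> mean(replicate(10000,var(arima.sim(model=list(ar=phi),n=n)))) # simulated mean of S^2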
Exercise 2.19 Let Y_1 = θ_0 + e_1 and then for t > 1 define Y_t recursively by Y_t = θ_0 + Y_{t−1} + e_t. Here θ_0 is a constant. The process {Y_t} is called a random walk with drift.
(a) Show that Y_t may be rewritten as Y_t = tθ_0 + e_t + e_{t−1} + … + e_1. Substitute Y_{t−1} = θ_0 + Y_{t−2} + e_{t−1} into Y_t = θ_0 + Y_{t−1} + e_t and repeat until you get back to e_1.
(b) Find the mean function for Y_t. E(Y_t) = E(tθ_0 + e_t + e_{t−1} + … + e_1) = tθ_0.
(c) Find the autocovariance function for Y_t.
Cov(Y_t, Y_{t−k}) = Cov(tθ_0 + e_t + … + e_1, (t − k)θ_0 + e_{t−k} + … + e_1) = Cov(e_{t−k} + … + e_1, e_{t−k} + … + e_1) = Var(e_{t−k} + … + e_1) = (t − k)σ_e² for t ≥ k ≥ 0.
Exercise 2.20 Consider the standard random walk model where Y_t = Y_{t−1} + e_t with Y_1 = e_1.
(a) Use the above representation of Y_t to show that μ_t = μ_{t−1} for t > 1 with initial condition μ_1 = E(e_1) = 0. Hence show that μ_t = 0 for all t. Clearly, μ_1 = E(Y_1) = E(e_1) = 0. Then E(Y_t) = E(Y_{t−1} + e_t) = E(Y_{t−1}) + E(e_t) = E(Y_{t−1}), or μ_t = μ_{t−1} for t > 1, and the result follows by induction.
(b) Similarly, show that Var(Y_t) = Var(Y_{t−1}) + σ_e² for t > 1 with Var(Y_1) = σ_e², and, hence, Var(Y_t) = tσ_e². Var(Y_1) = σ_e² is immediate. Then Var(Y_t) = Var(Y_{t−1} + e_t) = Var(Y_{t−1}) + Var(e_t) = Var(Y_{t−1}) + σ_e². Recursion or induction on t yields Var(Y_t) = tσ_e².
(c) For 0 ≤ t ≤ s, use Y_s = Y_t + e_{t+1} + e_{t+2} + … + e_s to show that Cov(Y_t, Y_s) = Var(Y_t) and, hence, that Cov(Y_t, Y_s) = min(t, s)σ_e². For 0 ≤ t ≤ s,
Cov(Y_t, Y_s) = Cov(Y_t, Y_t + e_{t+1} + e_{t+2} + … + e_s) = Cov(Y_t, Y_t) = Var(Y_t) = tσ_e²,
and hence the result.
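A sketch of Var(Y_t) = tσ_e² by simulation (2000 independent walks of length 100 with standard normal steps; all choices arbitrary):
> set.seed(11); paths=replicate(2000,cumsum(rnorm(100))) # each column is a random walk
> plot(apply(paths,1,var),type='l',xlab='t',ylab='sample Var(Y_t)')
> abline(0,1,lty=2) # the theoretical line Var(Y_t) = t (sigma_e = 1)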
Exercise 2.21 A random walk with random starting value. Let Y_t = Y_0 + e_t + e_{t−1} + … + e_1 for t > 0 where Y_0 has a distribution with mean μ_0 and variance σ_0². Suppose further that Y_0, e_1,..., e_t are independent.
(a) Show that E(Y_t) = μ_0 for all t. E(Y_t) = E(Y_0 + e_t + e_{t−1} + … + e_1) = E(Y_0) + E(e_t) + E(e_{t−1}) + … + E(e_1) = E(Y_0) = μ_0.
(b) Show that Var(Y_t) = tσ_e² + σ_0². Var(Y_t) = Var(Y_0 + e_t + e_{t−1} + … + e_1) = Var(Y_0) + Var(e_t) + Var(e_{t−1}) + … + Var(e_1) = σ_0² + tσ_e².
(c) Show that Cov(Y_t, Y_s) = min(t, s)σ_e² + σ_0². Let t be less than s. Then, as in the previous exercise,
Cov(Y_t, Y_s) = Cov(Y_t, Y_t + e_{t+1} + e_{t+2} + … + e_s) = Var(Y_t) = tσ_e² + σ_0².
(d) Show that Corr(Y_t, Y_s) = √[(tσ_e² + σ_0²)/(sσ_e² + σ_0²)] for 0 ≤ t ≤ s. Just use the results of parts (b) and (c).
Exercise 2.22 Let {e_t} be a zero-mean white noise process and let c be a constant with |c| < 1. Define Y_t recursively by Y_t = cY_{t−1} + e_t with Y_1 = e_1.
This exercise can be solved using the recursive definition of Y_t or by expressing Y_t explicitly using repeated substitution as Y_t = c(cY_{t−2} + e_{t−1}) + e_t = … = e_t + ce_{t−1} + c²e_{t−2} + … + c^{t−1}e_1. Parts (c), (d), and (e) essentially assume you are working with the recursive version of Y_t, but they can also be solved using this explicit representation.
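The recursion is what stats::filter computes with method='recursive', so a realization can be sketched as follows (the value 0.7 for c is an arbitrary choice):
> set.seed(12); cc=0.7; e=rnorm(100)
> Y=filter(e,filter=cc,method='recursive') # Y_t = cc*Y_{t-1} + e_t, with Y_1 = e_1
> plot(ts(Y),type='o')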