of many physical random phenomena. We summarize here the properties of Gaussian random vectors that are of greatest importance in estimation theory, leaving it to the reader to consult, e.g., [1], [4] for a detailed discussion of the Gaussian distribution.
Definition 1: An r-dimensional random vector Z is said to be Gaussian (or normal) with parameters m (an r-vector) and Σ (an r × r positive definite matrix) if its probability density function f_Z(·) is given for all z ∈ R^r by
f_Z(z) = (2\pi)^{-r/2} |\Sigma|^{-1/2} \exp[-\tfrac{1}{2}(z - m)' \Sigma^{-1} (z - m)]   (11)
where |Σ| is the determinant of Σ.
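As an illustrative numerical sketch (the dimension, mean, covariance, and test point below are arbitrary choices, not from the text), the density (11) can be evaluated directly and checked against a library implementation:

    import numpy as np
    from scipy.stats import multivariate_normal

    # Arbitrary illustrative parameters: r = 2, mean m, positive definite Sigma.
    m = np.array([1.0, -2.0])
    Sigma = np.array([[2.0, 0.5],
                      [0.5, 1.0]])
    z = np.array([0.3, -1.5])

    # Direct evaluation of (11).
    r = len(m)
    d = z - m
    quad = d @ np.linalg.solve(Sigma, d)
    pdf_direct = (2 * np.pi) ** (-r / 2) / np.sqrt(np.linalg.det(Sigma)) * np.exp(-0.5 * quad)

    # Library value for comparison.
    pdf_ref = multivariate_normal(mean=m, cov=Sigma).pdf(z)
    assert np.isclose(pdf_direct, pdf_ref)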
The corresponding characteristic function of Z is

\varphi_Z(v) = E\{\exp[jv'Z]\} = \exp[jv'm - \tfrac{1}{2} v'\Sigma v]   (12)
and this constitutes an alternative definition of a Gaussian random vector with parameters m and Σ. In fact, the definition of a Gaussian random vector as one whose characteristic function is given by (12) is more general because it includes the possibility that Z may be degenerate and have its entire density concentrated on a proper subspace of R^r, in which case |Σ| = 0 and Σ is nonnegative definite but not positive definite and not invertible. In any case, we henceforth adopt the shorthand notation N(m, Σ) for the Gaussian (or normal) distribution with parameters m and Σ.
Using the well-known [1], [4] properties of the characteristic function to compute the moments of Z we have, in particular,

E[Z] = \frac{1}{j}\,\frac{d\varphi_Z}{dv}(0) = m, \qquad \mathrm{cov}[Z, Z] = -\frac{d^2\varphi_Z}{dv^2}(0) - mm' = \Sigma.
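These moment formulas can be checked symbolically in the scalar case r = 1; the following is a minimal sketch (symbol names are illustrative, not from the text):

    import sympy as sp

    v, m, sigma = sp.symbols('v m sigma', real=True)

    # Scalar characteristic function phi(v) = exp(j v m - v^2 sigma^2 / 2), cf. (12).
    phi = sp.exp(sp.I * v * m - v**2 * sigma**2 / 2)

    # Mean: (1/j) * dphi/dv evaluated at v = 0 should equal m.
    mean = sp.simplify((sp.diff(phi, v) / sp.I).subs(v, 0))

    # Variance: -d^2 phi / dv^2 at v = 0, minus m^2, should equal sigma^2.
    var = sp.simplify(-sp.diff(phi, v, 2).subs(v, 0) - m**2)

    print(mean)  # m
    print(var)   # sigma**2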
Thus the parameters m and Σ of the Gaussian probability distribution (11) or (12) are, respectively, the mean and the covariance of Z. It is important to note that the probability density function of a Gaussian random vector is therefore completely specified by a knowledge of its mean and covariance. The importance of Gaussian random vectors in estimation and control theory is due largely to this fact and to the following facts.
1) Uncorrelated jointly Gaussian random vectors are independent.
2) Linear functions of Gaussian random vectors are themselves Gaussian random vectors.
3) In particular, sums of jointly Gaussian random vectors are Gaussian random vectors.
4) The conditional expectation of one jointly Gaussian random vector given another is a Gaussian random vector that is a linear function of the conditioning vector.
In particular, let X and Y be jointly distributed random vectors with respective dimensions n and m whose composite vector Z = [X', Y']' is N(m, Σ) with mean
E[Z] = m = \begin{bmatrix} m_X \\ m_Y \end{bmatrix}   (13a)
and covariance

\Sigma = \begin{bmatrix} \mathrm{cov}[X, X] & \mathrm{cov}[X, Y] \\ \mathrm{cov}[Y, X] & \mathrm{cov}[Y, Y] \end{bmatrix} = \begin{bmatrix} \Sigma_{XX} & \Sigma_{XY} \\ \Sigma_{YX} & \Sigma_{YY} \end{bmatrix}.   (13b)
Then the following properties hold.
Property 1: If W = AZ where A is any nonrandom q × r matrix then, from (12),

\varphi_W(v) = E\{\exp(jv'W)\} = E\{\exp[j(v'A)Z]\} = \varphi_Z(A'v)
             = \exp[j(v'A)m - \tfrac{1}{2}(v'A)\Sigma(A'v)]
             = \exp[jv'(Am) - \tfrac{1}{2}v'(A\Sigma A')v]

so that W is N(Am, AΣA').
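A simulation sketch of Property 1 follows; the parameters m, Σ, and A below are arbitrary illustrative choices, not from the text. The empirical mean and covariance of W = AZ should approach Am and AΣA'.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative parameters: Z is N(m, Sigma) with r = 3; A is a nonrandom 2 x 3 matrix.
    m = np.array([1.0, 0.0, -1.0])
    Sigma = np.array([[2.0, 0.3, 0.0],
                      [0.3, 1.0, 0.2],
                      [0.0, 0.2, 1.5]])
    A = np.array([[1.0, 2.0, 0.0],
                  [0.0, 1.0, -1.0]])

    # Sample Z and apply the linear map W = A Z to each sample.
    Z = rng.multivariate_normal(m, Sigma, size=200_000)
    W = Z @ A.T

    # Empirical moments of W approach A m and A Sigma A' (Property 1).
    print(W.mean(axis=0), A @ m)
    print(np.cov(W, rowvar=False), A @ Sigma @ A.T)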
Property 2: In particular, taking A = [I_n  0] and then A = [0  I_m], where I_n is the n × n unit matrix and 0 is a zero matrix of the appropriate dimensions, we see with the use of (13) that the marginal distributions of X and Y are Gaussian, i.e., X and Y are, respectively, N(m_X, Σ_XX) and N(m_Y, Σ_YY).
Property 3: Furthermore, if X and Y have the same dimension, taking A = [I_n, I_n] in Property 1 and using (13) we see that X + Y is N(m_X + m_Y, Σ_XX + Σ_XY + Σ_YX + Σ_YY).
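The following sketch checks Property 3 by simulation; the block means and covariances are arbitrary illustrative values (not from the text), with Z = [X', Y']' built from the partition (13).

    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative joint parameters for Z = [X', Y']' with X and Y both 2-dimensional.
    m_x, m_y = np.array([1.0, 0.0]), np.array([-1.0, 2.0])
    S_xx = np.array([[1.0, 0.2], [0.2, 1.5]])
    S_yy = np.array([[2.0, -0.3], [-0.3, 1.0]])
    S_xy = np.array([[0.4, 0.0], [0.1, 0.2]])

    m = np.concatenate([m_x, m_y])
    Sigma = np.block([[S_xx, S_xy],
                      [S_xy.T, S_yy]])

    # Sample Z, split into X and Y, and form the sum X + Y.
    Z = rng.multivariate_normal(m, Sigma, size=200_000)
    X, Y = Z[:, :2], Z[:, 2:]
    S = X + Y

    # Property 3: X + Y is N(m_X + m_Y, Sigma_XX + Sigma_XY + Sigma_YX + Sigma_YY).
    print(S.mean(axis=0), m_x + m_y)
    print(np.cov(S, rowvar=False), S_xx + S_xy + S_xy.T + S_yy)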
Property 4: If X and Y are uncorrelated, so that Σ_XY = Σ'_YX = 0, then they are also independent, because in this case

(z - m)'\Sigma^{-1}(z - m) = (x - m_X)'\Sigma_{XX}^{-1}(x - m_X) + (y - m_Y)'\Sigma_{YY}^{-1}(y - m_Y)

and (11) reduces to

f_{X,Y}(x, y) = f_X(x) f_Y(y).
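A small numerical check of Property 4, under illustrative block-diagonal parameters that are not from the text: when Σ_XY = 0, the joint density at a test point equals the product of the two marginal densities.

    import numpy as np
    from scipy.stats import multivariate_normal

    # Illustrative parameters with Sigma_XY = 0, so X and Y are uncorrelated.
    m_x, m_y = np.array([1.0, 0.0]), np.array([-1.0])
    S_xx = np.array([[1.0, 0.2], [0.2, 1.5]])
    S_yy = np.array([[2.0]])

    m = np.concatenate([m_x, m_y])
    Sigma = np.block([[S_xx, np.zeros((2, 1))],
                      [np.zeros((1, 2)), S_yy]])

    # Test point z = [x', y']'.
    x, y = np.array([0.5, -0.5]), np.array([0.3])
    z = np.concatenate([x, y])

    # Joint density factors into the product of the marginals (Property 4).
    joint = multivariate_normal(mean=m, cov=Sigma).pdf(z)
    product = (multivariate_normal(mean=m_x, cov=S_xx).pdf(x)
               * multivariate_normal(mean=m_y, cov=S_yy).pdf(y))
    assert np.isclose(joint, product)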
Property 5: Assuming for the moment that m_X = 0 and m_Y = 0, the conditional density of X given Y is, from (11) and (13),

f_{X|Y}(x \mid y) = f_{X,Y}(x, y) / f_Y(y)
                  = K \exp\{-\tfrac{1}{2}[z'\Sigma^{-1}z - y'\Sigma_{YY}^{-1}y]\}
                  = K \exp\{-\tfrac{1}{2}[x'S_{XX}x + x'S_{XY}y + y'S_{YX}x + y'S_{YY}y - y'\Sigma_{YY}^{-1}y]\}

where K is a normalizing constant and S denotes Σ^{-1} partitioned conformably with (13b), so that, expanding SΣ = I, we have

S_{XX}\Sigma_{XX} + S_{XY}\Sigma_{YX} = I   (15a)