Matrix Calculus Operations and Taylor Expansions

Linear Fitting Revisited

Linear fitting solves this problem:
Given n data points p_i = [x_{i1} ⋯ x_{im}]^⊤, 1 ≤ i ≤ n, and their
corresponding values v_i, find a linear function f that minimizes the error

    E = Σ_{i=1}^{n} (f(p_i) − v_i)².  (1)

The linear function f(p_i) has the form

    f(p) = f(x_1, …, x_m) = a_1 x_1 + ⋯ + a_m x_m + a_{m+1}.  (2)
Leow Wee Kheng (NUS) Matrix Differentiation 2 / 34
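The setup of Eqs. (1) and (2) can be sketched in a few lines of numpy. This is a minimal illustration with made-up data points and candidate coefficients (none of which come from the slides):

```python
import numpy as np

# Hypothetical data: n = 4 points in m = 2 dimensions, row i is p_i^T = [x_i1 x_i2].
P = np.array([[1.0, 2.0],
              [2.0, 0.5],
              [3.0, 1.0],
              [4.0, 3.0]])
v = np.array([3.1, 2.4, 4.2, 6.9])   # corresponding values v_i

def f(p, a):
    """Linear model of Eq. (2): f(p) = a_1 x_1 + ... + a_m x_m + a_{m+1}."""
    return p @ a[:-1] + a[-1]

def error(a, P, v):
    """Sum-of-squares error of Eq. (1)."""
    return np.sum((f(P, a) - v) ** 2)

a = np.array([1.0, 1.0, 0.0])   # an arbitrary candidate coefficient vector
print(error(a, P, v))           # the error E this candidate achieves
```

Linear fitting asks for the vector a that makes this `error` as small as possible.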

Linear Fitting Revisited

Denote each row of D as d_i^⊤. Then,

    E = Σ_{i=1}^{n} (d_i^⊤ a − v_i)² = ‖Da − v‖².  (6)

So, the linear least squares problem can be stated very compactly as

    min_a ‖Da − v‖².  (7)

To show that the solution in Eq. 5 minimizes the error E, we need to
differentiate E with respect to a and set the derivative to zero:

    dE/da = 0.  (8)

How to do this differentiation?
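The identity in Eq. (6) and the minimizer of Eq. (7) are easy to check numerically. A small sketch with random data follows; the closed form a = (D^⊤D)^{-1} D^⊤ v is the standard normal-equations solution (presumably what the slides' Eq. 5 states, which is not shown in this excerpt), valid when D^⊤D is invertible:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 2
X = rng.normal(size=(n, m))
D = np.hstack([X, np.ones((n, 1))])   # row i is d_i^T = [x_i1 ... x_im, 1]
v = rng.normal(size=n)
a = rng.normal(size=m + 1)

# Eq. (6): the component-wise sum equals the squared norm ||Da - v||^2.
sum_form  = sum((D[i] @ a - v[i]) ** 2 for i in range(n))
norm_form = np.linalg.norm(D @ a - v) ** 2
print(np.isclose(sum_form, norm_form))

# Minimizer of Eq. (7) via the normal equations, a* = (D^T D)^{-1} D^T v;
# it agrees with numpy's least-squares solver.
a_star = np.linalg.solve(D.T @ D, D.T @ v)
print(np.allclose(a_star, np.linalg.lstsq(D, v, rcond=None)[0]))
```

Both checks print True, which is exactly the compactness Eq. (7) buys: one norm expression replaces the double sum.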

Linear Fitting Revisited

The obvious (but hard) way:

    E = Σ_{i=1}^{n} ( Σ_{j=1}^{m} a_j x_{ij} + a_{m+1} − v_i )².  (9)

Expanding the equation explicitly gives

    ∂E/∂a_k = 2 Σ_{i=1}^{n} ( Σ_{j=1}^{m} a_j x_{ij} + a_{m+1} − v_i ) x_{ik},  for k ≠ m + 1,

    ∂E/∂a_k = 2 Σ_{i=1}^{n} ( Σ_{j=1}^{m} a_j x_{ij} + a_{m+1} − v_i ),  for k = m + 1.

Then, set ∂E/∂a_k = 0 and solve for each a_k.

This is slow, tedious, and error-prone!
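As a sanity check on the tedious component-wise partials above, one can compare them against the well-known matrix form of the same gradient, 2 D^⊤ (Da − v), where D = [X | 1] stacks the data with a column of ones. This sketch uses made-up random data (none of it from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 5, 3
X = rng.normal(size=(n, m))           # X[i, j] is x_{i,j+1} (0-based indexing)
v = rng.normal(size=n)
a = rng.normal(size=m + 1)            # a[:m] are a_1..a_m, a[m] is a_{m+1}

def dE_dak(k):
    """Component-wise partials of Eq. (9), one case per value of k."""
    r = X @ a[:m] + a[m] - v          # residuals r_i = sum_j a_j x_ij + a_{m+1} - v_i
    if k < m:                         # case k != m+1: the partial picks up x_ik
        return 2 * np.sum(r * X[:, k])
    return 2 * np.sum(r)              # case k = m+1: no x_ik factor

grad_componentwise = np.array([dE_dak(k) for k in range(m + 1)])

# Matrix shortcut: with D = [X | 1], the same gradient is 2 D^T (D a - v).
D = np.hstack([X, np.ones((n, 1))])
grad_matrix = 2 * D.T @ (D @ a - v)
print(np.allclose(grad_componentwise, grad_matrix))
```

The two gradients agree, which is the payoff of the matrix-differentiation machinery the rest of the slides develop: one compact expression instead of a case split over k.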