Use gradient descent to minimize f(x) = (x1-2)^4 + (x1-2*x2)^2, where X = mat(x1, x2).T; take the initial point X0 = mat(0, 3).T and termination tolerance 0.01
Posted: 2024-06-06 07:11:19
First, compute the partial derivatives of f(x). Since f is a sum of two terms, differentiate each term separately:
∂f/∂x1 = 4(x1-2)^3 + 2(x1-2*x2)
∂f/∂x2 = -4(x1-2*x2)
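As a sanity check, these partial derivatives can be verified symbolically. A minimal sketch, assuming SymPy is available:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = (x1 - 2)**4 + (x1 - 2*x2)**2

# SymPy may expand the linear factors, so compare up to simplification.
df_dx1 = sp.diff(f, x1)   # equivalent to 4*(x1-2)**3 + 2*(x1-2*x2)
df_dx2 = sp.diff(f, x2)   # equivalent to -4*(x1-2*x2)
```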
Then iterate on X using the gradient-descent update rule:
X(k+1) = X(k) - α∇f(X(k))
where α is the learning rate, which can be tuned to the problem at hand.
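For concreteness, the first update step from X0 = (0, 3)^T can be checked by hand. A short sketch, with α = 0.01 chosen purely for illustration:

```python
import numpy as np

# One step of X(k+1) = X(k) - α∇f(X(k)) at the starting point.
X0 = np.array([0.0, 3.0])
grad0 = np.array([4*(X0[0] - 2)**3 + 2*(X0[0] - 2*X0[1]),   # 4*(-2)^3 + 2*(-6) = -44
                  -4*(X0[0] - 2*X0[1])])                     # -4*(-6) = 24
X1 = X0 - 0.01 * grad0
print(X1)  # [0.44 2.76]
```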
The code is as follows:
import numpy as np

def f(x):
    x1, x2 = x
    return (x1 - 2)**4 + (x1 - 2*x2)**2

def grad_f(x):
    x1, x2 = x
    df_dx1 = 4*(x1 - 2)**3 + 2*(x1 - 2*x2)
    df_dx2 = -4*(x1 - 2*x2)
    return np.array([df_dx1, df_dx2])

def gradient_descent(X0, alpha, epsilon, max_iter=100000):
    X = X0.copy()
    for _ in range(max_iter):
        grad = grad_f(X)
        if np.linalg.norm(grad) < epsilon:  # stop when ||∇f(X)|| < ε
            break
        X = X - alpha*grad
    return X

X0 = np.array([0.0, 3.0])  # the column vector (0, 3)^T, stored as a 1-D array
alpha = 0.01
epsilon = 0.01
X = gradient_descent(X0, alpha, epsilon)
print(X)  # approaches the minimizer (2, 1)
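To guard against a mistyped derivative, the analytic gradient can be compared against a central finite-difference approximation. This check is an addition, not part of the original solution:

```python
import numpy as np

def f(x):
    return (x[0] - 2)**4 + (x[0] - 2*x[1])**2

def grad_f(x):
    return np.array([4*(x[0] - 2)**3 + 2*(x[0] - 2*x[1]),
                     -4*(x[0] - 2*x[1])])

def numeric_grad(f, x, h=1e-6):
    # Central difference in each coordinate: (f(x+h*e_i) - f(x-h*e_i)) / (2h)
    g = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        e = np.zeros_like(x, dtype=float)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2*h)
    return g

x = np.array([0.0, 3.0])
print(grad_f(x))           # [-44.  24.]
print(numeric_grad(f, x))  # should closely match the analytic gradient
```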