Detailed Python code for neural network regression. Basic requirements: ● use sklearn's MLP to implement the neural network ● data: a regression problem, tested on a randomly generated synthetic dataset. Advanced exercise: ● implement gradient descent with PyTorch (first with manual parameter updates, then with the torch.optim module).
In Python, we can use the Multi-Layer Perceptron (MLP) model from the scikit-learn library (sklearn) to implement neural network regression. The following example shows how to build, train, and evaluate a small MLP regression model on a randomly generated dataset:
```python
# Import the required libraries
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Randomly generate a synthetic dataset (two feature columns X, one target column y)
np.random.seed(42)
n_samples = 100
X = np.random.rand(n_samples, 2)
y = X[:, 0] + 2 * X[:, 1] + np.random.randn(n_samples)

# Split the data (80% for training, 20% for testing)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Create and configure the neural network model
mlp = MLPRegressor(hidden_layer_sizes=(10, 10), activation='relu', solver='adam', max_iter=500)
# Train the model
mlp.fit(X_train, y_train)
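# Optional sketch: n_iter_ and loss_ are fitted attributes of MLPRegressor
# that expose convergence information after fit()
print(f"Stopped after {mlp.n_iter_} iterations; final training loss: {mlp.loss_:.4f}")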
# Predict on the test set
y_pred = mlp.predict(X_test)

# Evaluate performance
mse = mean_squared_error(y_test, y_pred)
print(f"Mean Squared Error: {mse}")
# Advanced exercise: implement gradient descent with PyTorch
import torch
import torch.nn as nn
import torch.optim as optim
# Define the network architecture
class NeuralNet(nn.Module):
    def __init__(self, input_dim, output_dim):
        super(NeuralNet, self).__init__()
        self.fc1 = nn.Linear(input_dim, 10)
        self.fc2 = nn.Linear(10, 10)
        self.fc3 = nn.Linear(10, output_dim)

    def forward(self, x):
        x = torch.relu(self.fc1(x))
        x = torch.relu(self.fc2(x))
        return self.fc3(x)
# Convert the data to tensors (once, outside the loop) and create the model
device = torch.device("cpu")  # change to "cuda" for GPU acceleration
inputs = torch.tensor(X_train, dtype=torch.float32).to(device)
targets = torch.tensor(y_train, dtype=torch.float32).unsqueeze(-1).to(device)

input_dim = 2
output_dim = 1
net = NeuralNet(input_dim, output_dim).to(device)

# Manually set the initial weights and biases
for param in net.parameters():
    torch.nn.init.uniform_(param, -0.1, 0.1)

# Part 1: manual gradient-descent updates (no optimizer yet)
loss_fn = nn.MSELoss()
lr = 0.01
for epoch in range(500):  # adjust the number of epochs as needed
    net.zero_grad()                  # clear any accumulated gradients
    outputs = net(inputs)
    loss = loss_fn(outputs, targets)
    loss.backward()                  # compute gradients via autograd
    with torch.no_grad():            # update parameters without tracking the update itself
        for param in net.parameters():
            param -= lr * param.grad
# This is exactly the update step that torch.optim automates (see below)
```
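For the second half of the exercise, here is a minimal sketch of the same training loop using the torch.optim module, where `optimizer.step()` replaces the manual parameter update. It reuses `NeuralNet`, `inputs`, `targets`, `device`, and the dimensions defined above; the final test-set evaluation is an added illustration, not part of the original code:
```python
# Part 2: the same training loop with torch.optim
net = NeuralNet(input_dim, output_dim).to(device)  # fresh model
optimizer = optim.SGD(net.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

for epoch in range(500):
    optimizer.zero_grad()             # clear accumulated gradients
    outputs = net(inputs)
    loss = loss_fn(outputs, targets)
    loss.backward()                   # compute gradients via autograd
    optimizer.step()                  # apply the SGD update: p -= lr * p.grad

# Evaluate on the held-out test data
with torch.no_grad():
    test_inputs = torch.tensor(X_test, dtype=torch.float32).to(device)
    test_targets = torch.tensor(y_test, dtype=torch.float32).unsqueeze(-1).to(device)
    test_mse = loss_fn(net(test_inputs), test_targets).item()
print(f"Test MSE (PyTorch model): {test_mse:.4f}")
```
Swapping `optim.SGD` for `optim.Adam` (same interface) typically converges faster on toy regression problems like this one.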