Complete MATLAB code for classifying the iris dataset with a single-layer perceptron, with the data split into a training set and a test set, without using the built-in Neural Network Toolbox
Sure. Here is complete MATLAB code for classifying the iris dataset with a single-layer perceptron:
First, load the data; the classic iris dataset (iris.csv) is used here.
```matlab
data = csvread('iris.csv'); % load the iris dataset (numeric values only)
X = data(:,1:4); % features
y = data(:,5); % class labels (1, 2, 3)
```
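Note that `csvread` only accepts purely numeric files, so the line above assumes the species column has already been encoded as 1/2/3. If your iris.csv instead stores the species as text (e.g. "setosa"), a minimal sketch for building the numeric labels yourself is shown below; the assumption that the file has four numeric columns followed by a species name is mine, so adjust the column indices to your file.
```matlab
% Sketch: derive numeric labels 1/2/3 from a text species column.
% Assumes four numeric feature columns followed by a species name.
T = readtable('iris.csv');
X = T{:, 1:4};                     % numeric features
names = string(T{:, 5});           % species as text
classes = unique(names, 'stable'); % e.g. ["setosa" "versicolor" "virginica"]
y = zeros(height(T), 1);
for k = 1:numel(classes)
    y(names == classes(k)) = k;    % map each species to 1, 2, 3
end
```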
Next, split the data into a training set and a test set using a random hold-out split.
```matlab
% Randomly split into training and test sets
cv = cvpartition(size(X,1),'HoldOut',0.3); % hold out 30% for testing
idx = cv.test;
X_train = X(~idx,:);
y_train = y(~idx,:);
X_test = X(idx,:);
y_test = y(idx,:);
```
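`cvpartition` comes from the Statistics and Machine Learning Toolbox. If you prefer to avoid that dependency as well, a minimal hold-out split with `randperm` (same 70/30 proportions, variable names are my own) looks like this:
```matlab
% Sketch: hold-out split without cvpartition, using plain randperm.
n = size(X, 1);
perm = randperm(n);            % random ordering of the rows
n_test = round(0.3 * n);       % 30% of the rows for testing
test_idx = perm(1:n_test);
train_idx = perm(n_test+1:end);
X_train = X(train_idx, :);  y_train = y(train_idx);
X_test  = X(test_idx, :);   y_test  = y(test_idx);
```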
Then, standardize the features using the training-set mean and standard deviation.
```matlab
% Standardize features (z-score) using training-set statistics
mu = mean(X_train);
sigma = std(X_train);
X_train = (X_train - mu) ./ sigma;
X_test = (X_test - mu) ./ sigma;
```
Next, define the network structure and hyperparameters: four inputs, a single sigmoid unit, a 3-way softmax output layer, the learning rate, and the number of training epochs.
```matlab
% Network structure and hyperparameters
input_size = size(X_train,2); % number of input features
hidden_size = 1; % number of hidden units
output_size = 3; % number of classes
learning_rate = 0.01; % learning rate
max_epoch = 1000; % maximum number of epochs
```
Then, initialize the weights and biases with random values.
```matlab
% Initialize weights and biases
W1 = randn(hidden_size, input_size);
b1 = randn(hidden_size, 1);
W2 = randn(output_size, hidden_size);
b2 = randn(output_size, 1);
```
Next, define the activation functions and the loss function.
```matlab
% Activation functions and loss function
sigmoid = @(x) 1 ./ (1 + exp(-x)); % sigmoid activation
softmax = @(x) exp(x) ./ sum(exp(x), 1); % column-wise softmax
loss = @(y, t) -sum(sum(t .* log(y))); % total cross-entropy loss
```
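One optional refinement: `exp(x)` can overflow for large logits. Because softmax is invariant to shifting each column by a constant, subtracting the column-wise maximum gives a numerically safer variant that returns the same probabilities:
```matlab
% Sketch: numerically stable column-wise softmax (same output as above).
softmax = @(x) exp(x - max(x, [], 1)) ./ sum(exp(x - max(x, [], 1)), 1);
```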
Then, train the network with full-batch gradient descent. The class labels are turned into one-hot target vectors by plain matrix indexing, so no toolbox function is needed.
```matlab
% Training
I = eye(output_size);
T_train = I(:, y_train'); % one-hot targets, output_size x N (no toolbox needed)
for epoch = 1:max_epoch
    % Forward pass
    z1 = W1 * X_train' + b1;
    a1 = sigmoid(z1);
    z2 = W2 * a1 + b2;
    y_train_pred = softmax(z2);
    % Backward pass
    delta2 = y_train_pred - T_train;
    delta1 = (W2' * delta2) .* a1 .* (1 - a1);
    % Update weights and biases
    W2 = W2 - learning_rate * delta2 * a1';
    b2 = b2 - learning_rate * sum(delta2, 2);
    W1 = W1 - learning_rate * delta1 * X_train;
    b1 = b1 - learning_rate * sum(delta1, 2);
    % Training loss for this epoch
    train_loss = loss(y_train_pred, T_train);
    fprintf('Epoch: %d, Train Loss: %f\n', epoch, train_loss);
end
```
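As a quick sanity check before moving to the test set, you can also report the accuracy on the training data itself (an extra diagnostic, not required by the task); this reuses `y_train_pred` from the final epoch:
```matlab
% Sketch: training-set accuracy after the final epoch.
[~, y_train_hat] = max(y_train_pred, [], 1); % predicted class per column
train_acc = mean(y_train_hat' == y_train);   % fraction of correct predictions
fprintf('Train Accuracy: %f\n', train_acc);
```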
Finally, run the trained network on the test set and compute the classification accuracy.
```matlab
% Test on the held-out set
z1 = W1 * X_test' + b1;
a1 = sigmoid(z1);
z2 = W2 * a1 + b2;
y_test_pred = softmax(z2);
[~, y_test_pred] = max(y_test_pred, [], 1); % predicted class index per column
accuracy = sum(y_test_pred' == y_test) / length(y_test);
fprintf('Test Accuracy: %f\n', accuracy);
```
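Beyond the overall accuracy, a per-class breakdown shows which species get confused. Since `confusionmat` also lives in the Statistics and Machine Learning Toolbox, a minimal confusion matrix can be built with `accumarray` instead (assuming `y_test_pred` is the row of class indices produced above):
```matlab
% Sketch: 3x3 confusion matrix, rows = true class, columns = predicted class.
C = accumarray([y_test, y_test_pred'], 1, [output_size, output_size]);
disp(C);
```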
The complete code is as follows:
```matlab
data = csvread('iris.csv'); % load the iris dataset (numeric values only)
X = data(:,1:4); % features
y = data(:,5); % class labels (1, 2, 3)
% Randomly split into training and test sets
cv = cvpartition(size(X,1),'HoldOut',0.3); % hold out 30% for testing
idx = cv.test;
X_train = X(~idx,:);
y_train = y(~idx,:);
X_test = X(idx,:);
y_test = y(idx,:);
% Standardize features (z-score) using training-set statistics
mu = mean(X_train);
sigma = std(X_train);
X_train = (X_train - mu) ./ sigma;
X_test = (X_test - mu) ./ sigma;
% Network structure and hyperparameters
input_size = size(X_train,2); % number of input features
hidden_size = 1; % number of hidden units
output_size = 3; % number of classes
learning_rate = 0.01; % learning rate
max_epoch = 1000; % maximum number of epochs
% Initialize weights and biases
W1 = randn(hidden_size, input_size);
b1 = randn(hidden_size, 1);
W2 = randn(output_size, hidden_size);
b2 = randn(output_size, 1);
% Activation functions and loss function
sigmoid = @(x) 1 ./ (1 + exp(-x)); % sigmoid activation
softmax = @(x) exp(x) ./ sum(exp(x), 1); % column-wise softmax
loss = @(y, t) -sum(sum(t .* log(y))); % total cross-entropy loss
% Training
I = eye(output_size);
T_train = I(:, y_train'); % one-hot targets, output_size x N (no toolbox needed)
for epoch = 1:max_epoch
    % Forward pass
    z1 = W1 * X_train' + b1;
    a1 = sigmoid(z1);
    z2 = W2 * a1 + b2;
    y_train_pred = softmax(z2);
    % Backward pass
    delta2 = y_train_pred - T_train;
    delta1 = (W2' * delta2) .* a1 .* (1 - a1);
    % Update weights and biases
    W2 = W2 - learning_rate * delta2 * a1';
    b2 = b2 - learning_rate * sum(delta2, 2);
    W1 = W1 - learning_rate * delta1 * X_train;
    b1 = b1 - learning_rate * sum(delta1, 2);
    % Training loss for this epoch
    train_loss = loss(y_train_pred, T_train);
    fprintf('Epoch: %d, Train Loss: %f\n', epoch, train_loss);
end
% Test on the held-out set
z1 = W1 * X_test' + b1;
a1 = sigmoid(z1);
z2 = W2 * a1 + b2;
y_test_pred = softmax(z2);
[~, y_test_pred] = max(y_test_pred, [], 1); % predicted class index per column
accuracy = sum(y_test_pred' == y_test) / length(y_test);
fprintf('Test Accuracy: %f\n', accuracy);
```
Hope this helps!