KNN (K-Nearest Neighbors) algorithm: MATLAB code
Sure! Here is an example implementation of the KNN (K-Nearest Neighbors) algorithm in MATLAB:
```matlab
% Load Fisher's iris data (requires the Statistics and Machine Learning Toolbox)
load fisheriris;
X = meas(:,1:2);                     % use only the first two features (sepal length/width)
[Y, classNames] = grp2idx(species);  % convert string class labels to numeric indices

% Split the data into training (70%) and test (30%) sets
trainIdx = randsample(1:size(X,1), ceil(0.7*size(X,1)));
testIdx  = setdiff(1:size(X,1), trainIdx);
trainX = X(trainIdx,:);
trainY = Y(trainIdx);
testX  = X(testIdx,:);
testY  = Y(testIdx);

% Define the number of neighbors (K)
K = 5;

% Classify each test sample
predictedY = zeros(size(testY));
for i = 1:size(testX,1)
    % Euclidean distance from the i-th test sample to every training sample
    distances = sqrt(sum((trainX - testX(i,:)).^2, 2));
    % Indices of the K closest training samples
    [~, nearestIndices] = mink(distances, K);
    % Majority vote among the labels of the K nearest neighbors
    nearestLabels = trainY(nearestIndices);
    predictedY(i) = mode(nearestLabels);
end

% Calculate accuracy on the test set
accuracy = sum(predictedY == testY) / numel(testY);
disp(['Accuracy: ', num2str(accuracy)]);
```
This code uses Fisher's iris dataset as an example and, for simplicity, keeps only the first two features (sepal length and sepal width). The string class labels are converted to numeric indices with grp2idx (classNames maps the indices back to the species names) so that preallocation, mode, and the equality comparison for accuracy all work on plain numeric arrays. The data is randomly split into a training set (70%) and a test set (30%). For each test sample, the code computes the Euclidean distance to every training sample, selects the K nearest neighbors, and assigns the label that occurs most often among them (a majority vote) as the predicted class. Finally, the classification accuracy on the test set is computed and displayed.
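Since loading fisheriris already requires the Statistics and Machine Learning Toolbox, the same classification can also be done with the toolbox's built-in fitcknn/predict functions. The following is a minimal sketch that reuses the trainX, trainY, testX, and testY variables from the code above:
```matlab
% Train a K-nearest-neighbor model on the training split
mdl = fitcknn(trainX, trainY, 'NumNeighbors', K);

% Predict test labels and compute accuracy
predictedY = predict(mdl, testX);
accuracy = sum(predictedY == testY) / numel(testY);
disp(['Accuracy (fitcknn): ', num2str(accuracy)]);
```
By default fitcknn also uses the Euclidean distance, so the result should be comparable to the manual loop above, up to the randomness of the train/test split.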
Note: This code assumes that you have the Statistics and Machine Learning Toolbox installed, since `fisheriris`, `randsample`, and `grp2idx` come from that toolbox. The `mink` and `mode` functions are part of base MATLAB; `mink` requires R2017b or later, and the implicit expansion used in the distance computation requires R2016b or later.
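If you are unsure whether the toolbox is available, a quick check such as the following can be run before the example ('Statistics_Toolbox' is the license feature name for the Statistics and Machine Learning Toolbox, and 'stats' is its product short name for `ver`):
```matlab
% Stop early with a clear message if the required toolbox is missing
if ~license('test', 'Statistics_Toolbox') || isempty(ver('stats'))
    error('This example requires the Statistics and Machine Learning Toolbox.');
end
```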