Implementing LDA dimensionality reduction in Python without sklearn
Date: 2023-12-04 14:04:45
A Python implementation of an LDA model
LDA (Linear Discriminant Analysis) is a classic supervised dimensionality-reduction method: it projects high-dimensional data onto a lower-dimensional space in a way that preserves class separability, which helps with classification and visualization. Below is a simple Python implementation.
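For reference, LDA chooses the projection matrix W to maximize between-class scatter relative to within-class scatter; one standard statement of the objective is:

```latex
J(W) = \operatorname{tr}\!\left( (W^{\top} S_w W)^{-1}\, (W^{\top} S_b W) \right)
```

Its maximizers are the leading eigenvectors of $S_w^{-1} S_b$, which is exactly what the code below computes.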
1. First, compute the within-class scatter matrix Sw and the between-class scatter matrix Sb.
```python
import numpy as np

def compute_scatter_matrices(X, y):
    # Per-class mean vectors
    class_labels = np.unique(y)
    n_features = X.shape[1]
    mean_vectors = [np.mean(X[y == cl], axis=0) for cl in class_labels]

    # Within-class scatter matrix Sw
    Sw = np.zeros((n_features, n_features))
    for cl, mv in zip(class_labels, mean_vectors):
        class_sc_mat = np.zeros((n_features, n_features))  # scatter matrix for this class
        mv_col = mv.reshape(n_features, 1)  # make column vector
        for row in X[y == cl]:
            row = row.reshape(n_features, 1)  # make column vector
            class_sc_mat += (row - mv_col).dot((row - mv_col).T)
        Sw += class_sc_mat  # sum the per-class scatter matrices

    # Between-class scatter matrix Sb
    overall_mean = np.mean(X, axis=0).reshape(n_features, 1)  # make column vector
    Sb = np.zeros((n_features, n_features))
    for i, mean_vec in enumerate(mean_vectors):
        n = X[y == class_labels[i]].shape[0]  # number of samples in class i
        mean_vec = mean_vec.reshape(n_features, 1)
        Sb += n * (mean_vec - overall_mean).dot((mean_vec - overall_mean).T)
    return Sw, Sb
```
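As a quick sanity check (not part of the original post), the two matrices satisfy the identity St = Sw + Sb, where St is the total scatter matrix. This can be verified on a tiny hand-made dataset, with the scatter matrices computed inline:

```python
import numpy as np

# Tiny 2-class, 2-feature example to sanity-check the scatter matrices
X = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 3.0],
              [6.0, 5.0], [7.0, 8.0], [8.0, 8.0]])
y = np.array([0, 0, 0, 1, 1, 1])

overall_mean = X.mean(axis=0)
Sw = np.zeros((2, 2))
Sb = np.zeros((2, 2))
for cl in np.unique(y):
    Xc = X[y == cl]
    mv = Xc.mean(axis=0)
    d = Xc - mv                        # deviations from the class mean
    Sw += d.T @ d                      # within-class scatter
    m = (mv - overall_mean).reshape(-1, 1)
    Sb += Xc.shape[0] * (m @ m.T)      # between-class scatter

# Identity: total scatter St = Sw + Sb
St = (X - overall_mean).T @ (X - overall_mean)
print(np.allclose(St, Sw + Sb))  # True
```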
2. Next, solve the eigenvalue problem for Sw⁻¹Sb, sort the eigenvectors by eigenvalue magnitude, and project the data onto the top n_components eigenvectors.
```python
def lda(X, y, n_components):
    Sw, Sb = compute_scatter_matrices(X, y)
    # Eigendecomposition of Sw^-1 Sb (Sw must be invertible; use np.linalg.pinv if it is singular)
    eig_vals, eig_vecs = np.linalg.eig(np.linalg.inv(Sw).dot(Sb))
    # Sort eigenvectors by decreasing eigenvalue magnitude
    eig_pairs = [(np.abs(eig_vals[i]), eig_vecs[:, i]) for i in range(len(eig_vals))]
    eig_pairs = sorted(eig_pairs, key=lambda k: k[0], reverse=True)
    # Stack the top n_components eigenvectors into the projection matrix W
    n_features = X.shape[1]
    W = np.hstack([eig_pairs[i][1].reshape(n_features, 1) for i in range(n_components)])
    # np.linalg.eig can return complex values; the informative components are real
    return X.dot(W).real
```
This function returns the projected (dimension-reduced) feature matrix, where n_components is the desired number of output dimensions. Note that because Sb has rank at most (number of classes − 1), LDA yields at most that many meaningful components.
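Putting the pieces together, here is a minimal end-to-end run on synthetic data. This is a vectorized sketch of the two functions above (same names, same logic); the three-cluster dataset is made up purely for illustration:

```python
import numpy as np

def compute_scatter_matrices(X, y):
    # Vectorized version of the scatter-matrix computation above
    n_features = X.shape[1]
    overall_mean = X.mean(axis=0)
    Sw = np.zeros((n_features, n_features))
    Sb = np.zeros((n_features, n_features))
    for cl in np.unique(y):
        Xc = X[y == cl]
        mv = Xc.mean(axis=0)
        d = Xc - mv                    # deviations from the class mean
        Sw += d.T @ d                  # within-class scatter
        m = (mv - overall_mean).reshape(-1, 1)
        Sb += Xc.shape[0] * (m @ m.T)  # between-class scatter
    return Sw, Sb

def lda(X, y, n_components):
    Sw, Sb = compute_scatter_matrices(X, y)
    eig_vals, eig_vecs = np.linalg.eig(np.linalg.inv(Sw) @ Sb)
    order = np.argsort(eig_vals.real)[::-1]        # largest eigenvalues first
    W = eig_vecs[:, order[:n_components]].real     # projection matrix
    return X @ W

# Three Gaussian clusters in 4-D, 50 samples each
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc, 1.0, size=(50, 4))
               for loc in ([0, 0, 0, 0], [4, 4, 0, 0], [0, 4, 4, 0])])
y = np.repeat([0, 1, 2], 50)

X_lda = lda(X, y, n_components=2)
print(X_lda.shape)  # (150, 2)
```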