Using Python and the entropy-weight TOPSIS method, build a multi-indicator composite evaluation for the IEEE 30-bus system from degree centrality, closeness centrality, PageRank, the reciprocal of the desynchronization propagation time, and the critical synchronization coupling strength, and rank the nodes by importance. Please show the code for every step.
Posted: 2024-04-30 12:20:17
Python Code for Entropy-Weight TOPSIS
First, import the required libraries.
```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import minmax_scale
```
Next, read in the indicator data for the IEEE 30-bus system: degree centrality, closeness centrality, PageRank, the reciprocal of the desynchronization propagation time, and the critical synchronization coupling strength.
```python
# One row per bus, one column per indicator (degree centrality, closeness
# centrality, PageRank, reciprocal desynchronization time, critical coupling)
data = pd.read_csv('data.csv')
```
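If the indicator file does not exist yet, the three topological indicators can be computed directly from the network's adjacency matrix. The sketch below uses a hypothetical 4-node graph rather than the actual IEEE 30-bus topology (which would be loaded from case data), and implements degree centrality, closeness centrality via Floyd-Warshall shortest paths, and PageRank via power iteration with pure NumPy:

```python
import numpy as np

# Adjacency matrix of a small hypothetical undirected graph (NOT the IEEE
# 30-bus topology); rows and columns are nodes
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
n = A.shape[0]

# Degree centrality: degree / (n - 1)
degree_c = A.sum(axis=1) / (n - 1)

# Closeness centrality: (n - 1) / (sum of shortest-path distances),
# with distances from the Floyd-Warshall algorithm
D = np.where(A > 0, 1.0, np.inf)
np.fill_diagonal(D, 0.0)
for k in range(n):
    D = np.minimum(D, D[:, [k]] + D[[k], :])
closeness_c = (n - 1) / D.sum(axis=1)

# PageRank by power iteration with damping factor 0.85
M = A / A.sum(axis=1, keepdims=True)   # row-stochastic transition matrix
pr = np.full(n, 1.0 / n)
for _ in range(100):
    pr = 0.15 / n + 0.85 * M.T @ pr

# Stack the indicators into one matrix (rows = nodes, columns = indicators)
indicators = np.column_stack([degree_c, closeness_c, pr])
```

The two synchronization-based indicators (desynchronization time, critical coupling strength) come from dynamic simulation and would be appended as additional columns.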
Then, standardize the data so that every indicator lies in [0, 1].
```python
# Column-wise min-max scaling; returns a NumPy array
data_std = minmax_scale(data)
```
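For reference, `minmax_scale` rescales each column independently to [0, 1]; a NumPy-only sketch of the same transformation on made-up numbers:

```python
import numpy as np

# Toy matrix: 3 alternatives, 2 indicators
X = np.array([[1.0, 10.0],
              [2.0, 30.0],
              [3.0, 20.0]])

# Column-wise min-max scaling: (x - min) / (max - min),
# equivalent to minmax_scale(X)
X_std = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
```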
Next, compute the weight of each indicator with the entropy method.
```python
def entropy_weight(data):
    m, n = data.shape
    # Normalize each column to a probability distribution; a small epsilon
    # avoids log(0) for the zero entries produced by min-max scaling
    p = (data + 1e-12) / (data + 1e-12).sum(axis=0)
    # Entropy of each indicator, normalized by log(m) so it lies in [0, 1]
    entropy = -(p * np.log(p)).sum(axis=0) / np.log(m)
    # The lower the entropy (the more an indicator varies), the larger its weight
    weight = (1 - entropy) / (1 - entropy).sum()
    return weight

weight = entropy_weight(data_std)
```
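A quick check of the entropy-weight step on a toy matrix (made-up numbers): the weights should be non-negative, sum to 1, and favor the indicator that varies more across alternatives. The function is repeated here so the sketch is self-contained:

```python
import numpy as np

def entropy_weight(data):
    """Entropy weights for a min-max-scaled matrix (rows = alternatives)."""
    m, n = data.shape
    # Epsilon shift avoids log(0) on the zeros that min-max scaling produces
    p = (data + 1e-12) / (data + 1e-12).sum(axis=0)
    entropy = -(p * np.log(p)).sum(axis=0) / np.log(m)  # normalized to [0, 1]
    return (1 - entropy) / (1 - entropy).sum()

# Indicator 0 is nearly uniform across alternatives; indicator 1 varies strongly
X = np.array([[0.50, 0.0],
              [0.51, 0.5],
              [0.52, 1.0]])
w = entropy_weight(X)
```

Almost all of the weight should land on the second indicator, since a near-constant indicator carries no discriminating information.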
Then, weight the standardized data by the entropy weights.
```python
data_weighted = data_std * weight  # broadcasts the weight vector over columns
```
Next, determine the positive and negative ideal solutions: the column-wise maximum and minimum of the weighted matrix.
```python
def positive_ideal(data_weighted):
    return np.max(data_weighted, axis=0)

def negative_ideal(data_weighted):
    return np.min(data_weighted, axis=0)

z_positive = positive_ideal(data_weighted)
z_negative = negative_ideal(data_weighted)
```
Then, compute each node's composite score from its Euclidean distances to the two ideal solutions.
```python
def topsis_score(data_weighted, z_positive, z_negative):
    # Euclidean distance of each node to the positive and negative ideal solutions
    distance_positive = np.sqrt(((data_weighted - z_positive) ** 2).sum(axis=1))
    distance_negative = np.sqrt(((data_weighted - z_negative) ** 2).sum(axis=1))
    # Relative closeness: 1 means the node coincides with the positive ideal
    score = distance_negative / (distance_positive + distance_negative)
    return score

score = topsis_score(data_weighted, z_positive, z_negative)
```
Finally, sort the nodes by score in descending order.
```python
node_ranking = np.argsort(-score)  # node indices, most important first
```
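A self-contained sanity check of the TOPSIS scoring step on a toy weighted matrix (made-up numbers): a node that is best on every indicator should score 1, the worst should score 0, and the ranking should follow.

```python
import numpy as np

# Toy weighted decision matrix (rows = nodes, columns = indicators);
# row 0 dominates every other row, so it should rank first
data_weighted = np.array([[0.9, 0.8, 0.7],
                          [0.1, 0.2, 0.3],
                          [0.5, 0.5, 0.5]])

z_pos = data_weighted.max(axis=0)  # positive ideal solution
z_neg = data_weighted.min(axis=0)  # negative ideal solution
d_pos = np.sqrt(((data_weighted - z_pos) ** 2).sum(axis=1))
d_neg = np.sqrt(((data_weighted - z_neg) ** 2).sum(axis=1))
score = d_neg / (d_pos + d_neg)
ranking = np.argsort(-score)
```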
The complete code is as follows:
```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import minmax_scale

# One row per bus, one column per indicator
data = pd.read_csv('data.csv')

# Column-wise min-max scaling to [0, 1]
data_std = minmax_scale(data)

def entropy_weight(data):
    m, n = data.shape
    # Normalize columns to probability distributions; epsilon avoids log(0)
    p = (data + 1e-12) / (data + 1e-12).sum(axis=0)
    # Entropy per indicator, normalized by log(m) so it lies in [0, 1]
    entropy = -(p * np.log(p)).sum(axis=0) / np.log(m)
    # Lower entropy (more variation) -> larger weight
    return (1 - entropy) / (1 - entropy).sum()

weight = entropy_weight(data_std)
data_weighted = data_std * weight

def positive_ideal(data_weighted):
    return np.max(data_weighted, axis=0)

def negative_ideal(data_weighted):
    return np.min(data_weighted, axis=0)

z_positive = positive_ideal(data_weighted)
z_negative = negative_ideal(data_weighted)

def topsis_score(data_weighted, z_positive, z_negative):
    # Euclidean distances to the positive and negative ideal solutions
    distance_positive = np.sqrt(((data_weighted - z_positive) ** 2).sum(axis=1))
    distance_negative = np.sqrt(((data_weighted - z_negative) ** 2).sum(axis=1))
    return distance_negative / (distance_positive + distance_negative)

score = topsis_score(data_weighted, z_positive, z_negative)
node_ranking = np.argsort(-score)  # node indices, most important first
```