Computing Joint Information Entropy in Python
To compute the joint entropy, you do not combine the individual entropies of the variables; you work from their joint probability distribution instead: count how often each combination of values occurs, convert the counts into probabilities, and evaluate H = -Σ p · log2(p) over those joint probabilities. A reference implementation is shown below:
```python
import math

def entropy(data):
    """
    Compute the entropy of a single variable (the label column).
    :param data: dataset (2D list; the last column of each row is the class label)
    :return: entropy in bits
    """
    n = len(data)
    label_counts = {}
    # Count how many samples fall into each class.
    for feat_vec in data:
        current_label = feat_vec[-1]
        label_counts[current_label] = label_counts.get(current_label, 0) + 1
    ent = 0.0
    for count in label_counts.values():
        prob = count / n
        ent -= prob * math.log2(prob)
    return ent

def joint_entropy(data):
    """
    Compute the joint entropy over all columns of the dataset.
    :param data: dataset (2D list; each row is one sample)
    :return: joint entropy in bits
    """
    n = len(data)
    combo_counts = {}
    # Count how often each complete combination of values (a whole row) occurs;
    # these counts define the joint probability distribution.
    for feat_vec in data:
        key = tuple(feat_vec)
        combo_counts[key] = combo_counts.get(key, 0) + 1
    joint_ent = 0.0
    for count in combo_counts.values():
        prob = count / n
        joint_ent -= prob * math.log2(prob)
    return joint_ent
```
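The joint_entropy above treats the whole row as one joint event. If instead you want the joint entropy between a single feature column and the label, which is closer to what the original pairwise counting seemed to aim for, a minimal sketch could look like this; the function name pairwise_joint_entropy and the feat_index parameter are illustrative assumptions, not part of the original code:

```python
def pairwise_joint_entropy(data, feat_index):
    """Joint entropy H(X, Y) of the feature column at feat_index and the label column."""
    n = len(data)
    pair_counts = {}
    for feat_vec in data:
        key = (feat_vec[feat_index], feat_vec[-1])  # one (feature value, label) event per sample
        pair_counts[key] = pair_counts.get(key, 0) + 1
    h = 0.0
    for count in pair_counts.values():
        prob = count / n
        h -= prob * math.log2(prob)
    return h
```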
In the code above, the entropy function computes the entropy of a single variable and the joint_entropy function computes the joint entropy over all columns. Both take the dataset as their argument: a 2D list in which each row is one sample and the last column is the sample's class label.
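As a quick sanity check, here is a minimal usage sketch; the toy dataset below is made up purely for illustration:

```python
# Toy dataset: two feature columns plus a class label in the last column.
data = [
    ["sunny", "hot",  "no"],
    ["sunny", "mild", "no"],
    ["rainy", "mild", "yes"],
    ["rainy", "cool", "yes"],
]

print(entropy(data))        # 1.0 bit: labels split evenly between "no" and "yes"
print(joint_entropy(data))  # 2.0 bits: all four rows are distinct, so p = 1/4 each
```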