Binary-Classification Softmax
Date: 2023-08-24 16:08:51
Binary-classification softmax refers to using a softmax regression model on a two-class problem, where the class label takes one of two values. When the number of classes is k = 2, softmax regression reduces to logistic regression. In the softmax setup, the final fully connected layer has 2 output neurons, one per class, and the softmax function assigns a probability to each. This differs from the sigmoid setup, which uses a single output neuron that scores the target class against the background class. The two approaches therefore differ in the number of output neurons and in how the classes are represented, even though they yield equivalent probabilities in the two-class case. \[1\]\[3\]
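The equivalence can be checked numerically. A minimal NumPy sketch (the two-element `logits` vector stands in for a hypothetical final dense layer with 2 units): the two-class softmax probability for class 0 equals the sigmoid of the difference of the two logits.

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical logits from a final dense layer with 2 output neurons.
logits = np.array([2.0, 0.5])
p_softmax = softmax(logits)          # probabilities for class 0 and class 1

# Single-neuron sigmoid view: the probability of class 0 depends only
# on the difference of the two logits.
p_sigmoid = sigmoid(logits[0] - logits[1])

print(p_softmax[0], p_sigmoid)       # identical up to floating point
```

This is why the k = 2 softmax model is over-parameterized: only the logit difference matters, and logistic regression captures it with a single output.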
#### References
- *1* [【机器学习笔记2.5】用Softmax回归做二分类(Tensorflow实现)](https://blog.csdn.net/u011362297/article/details/87541051)
- *2* [二分类、多分类、多标签、softmax、sigmoid、pytorch实现](https://blog.csdn.net/jzlin1997/article/details/110048060)
- *3* [二分类情况下sigmoid函数和softmax函数区别](https://blog.csdn.net/qq_37006625/article/details/128221584)