tf.nn.sparse_softmax_cross_entropy_with_logits
tf.nn.sparse_softmax_cross_entropy_with_logits is a TensorFlow function that computes the softmax cross-entropy loss between logits and true labels. It is intended for multi-class classification problems where the target labels are integer class indices rather than one-hot encoded vectors. The function takes two arguments: logits, the unnormalized log-probabilities produced by the model (shape [batch_size, num_classes]), and labels, the true class indices in the range [0, num_classes) (shape [batch_size]). Internally, it applies the softmax function to the logits and then computes the negative log-likelihood of the true labels. Note that the result is not a single scalar: the function returns one loss value per example (shape [batch_size]), which is typically reduced with tf.reduce_mean or tf.reduce_sum to obtain a scalar objective for gradient-based optimization.
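A minimal sketch of how this looks in practice, assuming TensorFlow 2.x with eager execution; the batch of two examples and three classes is an illustrative assumption, not from the original:

```python
import tensorflow as tf

# Unnormalized scores from a hypothetical model: shape [batch_size, num_classes].
logits = tf.constant([[2.0, 0.5, -1.0],
                      [0.1, 1.2, 3.0]])

# Integer class indices (not one-hot): shape [batch_size].
labels = tf.constant([0, 2])

# Returns one loss value per example, shape [batch_size].
per_example_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=labels, logits=logits)

# Reduce to a scalar to use as a training objective.
loss = tf.reduce_mean(per_example_loss)
print(loss.numpy())
```

Because the softmax and the log-likelihood are fused inside the op, it is more numerically stable than applying tf.nn.softmax yourself and taking the log of the result.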