tf.nn.softmax_cross_entropy_with_logits(_sentinel=None, labels=None, logits=None, dim=-1, name=None)
Docstring:
Computes softmax cross entropy between logits and labels.
Type: function
Measures the probability error in discrete classification tasks in which the classes are mutually exclusive (each entry is in exactly one class). For example, each CIFAR-10 image is labeled with one and only one label: an image can be a dog or a truck, but not both.
A cross-entropy loss meant to be used together with softmax: the input does not need an extra softmax layer, because softmax_cross_entropy_with_logits applies softmax internally and computes the result in an optimized way. It is suited to classification problems in which the classes are mutually exclusive, i.e. each example carries exactly one label (it is a dog or a cat, never both).
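A minimal usage sketch, assuming TensorFlow 1.x (which this signature belongs to); the tensor values are made up for illustration:

    import tensorflow as tf  # assumes TensorFlow 1.x, matching this signature

    # Unscaled scores straight from the last dense layer, shape [batch_size, num_classes].
    logits = tf.constant([[2.0, 1.0, 0.1],
                          [0.5, 2.5, 0.3]])
    # One-hot rows: each row is a valid probability distribution.
    labels = tf.constant([[1.0, 0.0, 0.0],
                          [0.0, 1.0, 0.0]])

    # Note the named arguments; positional arguments are blocked by _sentinel.
    per_example = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)
    loss = tf.reduce_mean(per_example)  # scalar loss usually handed to an optimizer

    with tf.Session() as sess:
        print(sess.run(per_example))  # one value per example, shape [batch_size]
        print(sess.run(loss))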
NOTE: While the classes are mutually exclusive, their probabilities need not be. All that is required is that each row of labels is a valid probability distribution. If they are not, the computation of the gradient will be incorrect. If using exclusive labels (wherein one and only one class is true at a time), see sparse_softmax_cross_entropy_with_logits.
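When the labels are exclusive class indices, the sparse variant takes the integer ids directly; a small sketch (again assuming TensorFlow 1.x) showing that, for one-hot labels, the two ops agree:

    import tensorflow as tf  # assumes TensorFlow 1.x

    logits = tf.constant([[2.0, 1.0, 0.1],
                          [0.5, 2.5, 0.3]])
    class_ids = tf.constant([0, 1])            # integer class indices, one per example
    one_hot = tf.one_hot(class_ids, depth=3)   # equivalent one-hot rows

    sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=class_ids, logits=logits)
    dense_loss = tf.nn.softmax_cross_entropy_with_logits(labels=one_hot, logits=logits)

    with tf.Session() as sess:
        print(sess.run([sparse_loss, dense_loss]))  # the two results match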
WARNING: This op expects unscaled logits, since it performs a softmax on logits internally for efficiency. Do not call this op with the output of softmax, as it will produce incorrect results.
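A sketch of that pitfall (TensorFlow 1.x assumed): feeding already-softmaxed values applies softmax twice and yields a different, wrong loss:

    import tensorflow as tf  # assumes TensorFlow 1.x

    logits = tf.constant([[2.0, 1.0, 0.1]])
    labels = tf.constant([[1.0, 0.0, 0.0]])

    correct = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)
    # Wrong: softmax is applied here and then again inside the op.
    wrong = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=tf.nn.softmax(logits))

    with tf.Session() as sess:
        print(sess.run([correct, wrong]))  # the values differ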
logits and labels must have the same shape [batch_size, num_classes] and the same dtype (either float16, float32, or float64).
Note that to avoid confusion, it is required to pass only named arguments to this function.
Args:
_sentinel: Used to prevent positional parameters. Internal, do not use.
labels: Each row labels[i] must be a valid probability distribution.
logits: Unscaled log probabilities.
dim: The class dimension. Defaults to -1, which is the last dimension.
name: A name for the operation (optional).
Returns:
A 1-D Tensor of length batch_size, of the same type as logits, with the softmax cross entropy loss.