With a softmax layer added before softmax_cross_entropy_with_logits:
W4 = tf.Variable(tf.truncated_normal([1000,10],stddev=0.1))
b4 = tf.Variable(tf.zeros([10])+0.1)
prediction = tf.nn.softmax(tf.matmul(L3_drop,W4)+b4)
# cost function
loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=y,logits=prediction))
# train with gradient descent
train_step = tf.train.GradientDescentOptimizer(0.2).minimize(loss)
Iter 0: Test Accuracy 0.9172
Iter 1: Test Accuracy 0.932
Iter 2: Test Accuracy 0.9355
Iter 3: Test Accuracy 0.9412
Iter 4: Test Accuracy 0.9441
Iter 5: Test Accuracy 0.9474
Iter 6: Test Accuracy 0.9487
Iter 7: Test Accuracy 0.9515
Iter 8: Test Accuracy 0.9548
Iter 9: Test Accuracy 0.9548
Iter 10: Test Accuracy 0.9559
Iter 11: Test Accuracy 0.9575
Iter 12: Test Accuracy 0.9587
Iter 13: Test Accuracy 0.9606
Iter 14: Test Accuracy 0.9611
Iter 15: Test Accuracy 0.9614
Iter 16: Test Accuracy 0.9623
Iter 17: Test Accuracy 0.9639
Iter 18: Test Accuracy 0.9639
Iter 19: Test Accuracy 0.9649
Iter 20: Test Accuracy 0.9651
Iter 21: Test Accuracy 0.9672
Iter 22: Test Accuracy 0.967
Iter 23: Test Accuracy 0.9675
Iter 24: Test Accuracy 0.9677
Iter 25: Test Accuracy 0.9691
Iter 26: Test Accuracy 0.9686
Iter 27: Test Accuracy 0.9682
Iter 28: Test Accuracy 0.9698
Iter 29: Test Accuracy 0.9693
Iter 30: Test Accuracy 0.9705
Without a softmax layer before softmax_cross_entropy_with_logits:
W4 = tf.Variable(tf.truncated_normal([1000,10],stddev=0.1))
b4 = tf.Variable(tf.zeros([10])+0.1)
prediction = tf.matmul(L3_drop,W4)+b4
# cost function
loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=y,logits=prediction))
# train with gradient descent
train_step = tf.train.GradientDescentOptimizer(0.2).minimize(loss)
Iter 0: Test Accuracy 0.9302
Iter 1: Test Accuracy 0.9431
Iter 2: Test Accuracy 0.9565
Iter 3: Test Accuracy 0.9566
Iter 4: Test Accuracy 0.9629
Iter 5: Test Accuracy 0.9615
Iter 6: Test Accuracy 0.9695
Iter 7: Test Accuracy 0.9692
Iter 8: Test Accuracy 0.9708
Iter 9: Test Accuracy 0.9706
Iter 10: Test Accuracy 0.9725
Iter 11: Test Accuracy 0.9746
Iter 12: Test Accuracy 0.9714
Iter 13: Test Accuracy 0.973
Iter 14: Test Accuracy 0.9779
Iter 15: Test Accuracy 0.9773
Iter 16: Test Accuracy 0.9765
Iter 17: Test Accuracy 0.9751
Iter 18: Test Accuracy 0.977
Iter 19: Test Accuracy 0.9786
Iter 20: Test Accuracy 0.9743
Iter 21: Test Accuracy 0.9779
Iter 22: Test Accuracy 0.9779
Iter 23: Test Accuracy 0.9785
Iter 24: Test Accuracy 0.9796
Iter 25: Test Accuracy 0.9782
Iter 26: Test Accuracy 0.9797
Iter 27: Test Accuracy 0.98
Iter 28: Test Accuracy 0.9817
Iter 29: Test Accuracy 0.9806
Iter 30: Test Accuracy 0.9805
softmax_cross_entropy_with_logits already computes the softmax internally (and does so in a numerically optimized way), so do not gild the lily by adding another softmax layer in front of it. As the results above show, the extra softmax slows convergence and costs some accuracy.
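A minimal sketch of why this happens, assuming TensorFlow 1.x as in the code above (the constant values here are made-up illustration inputs, not from the model): the fused op treats whatever it receives as raw logits, so feeding it probabilities squashes them through softmax a second time.

import tensorflow as tf

logits = tf.constant([[2.0, 1.0, 0.1]])
labels = tf.constant([[1.0, 0.0, 0.0]])

# Fused op: applies softmax to the raw logits internally.
fused = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)

# Equivalent manual computation: explicit softmax followed by cross-entropy.
manual = -tf.reduce_sum(labels * tf.log(tf.nn.softmax(logits)), axis=1)

# What the first code block above effectively does: softmax applied twice.
double = tf.nn.softmax_cross_entropy_with_logits(labels=labels,
                                                 logits=tf.nn.softmax(logits))

with tf.Session() as sess:
    print(sess.run([fused, manual, double]))
    # fused and manual agree (about 0.417 here); double is larger (about 0.80)
    # because the already-normalized probabilities get pushed toward uniform.

The second softmax maps the values into [0, 1], so the effective "logits" have a much smaller spread; the loss surface is flatter and the gradients the optimizer sees are weaker, which matches the slower convergence and lower accuracy in the first run.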