Hello there!
Your code is pretty clear and helpful to others, thank you for your effort!
I'd like to raise a question about 'softmax'. You wrote the following in train_vgg.py:
#fc8
with tf.name_scope('fc8') as scope:
    kernel = weight_variable([4096, 2])
    biases = bias_variable([2])
    output_fc8 = tf.nn.relu(fc(output_fc7, kernel, biases), name=scope)

finaloutput = tf.nn.softmax(output_fc8, name="softmax")
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=finaloutput, labels=y))
Here you used the tf.nn.softmax_cross_entropy_with_logits() function, which, as I understand it, internally applies softmax to normalize the data and then calculates the cross entropy. But you applied it to finaloutput, which has already been passed through 'softmax'. So wouldn't it be more reasonable to use:

cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=output_fc8, labels=y))

That is, use the raw output_fc8 result (the logits) to calculate the cross entropy.
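
For what it's worth, here is a minimal sketch (written for TF 2.x eager mode, with made-up logit and label values) illustrating the difference. It is not your training code, just a demonstration that feeding already-softmaxed values into softmax_cross_entropy_with_logits effectively applies softmax twice and changes the loss:

import tensorflow as tf

logits = tf.constant([[2.0, -1.0]])   # hypothetical raw fc8 output (the logits)
labels = tf.constant([[1.0, 0.0]])    # hypothetical one-hot ground truth

probs = tf.nn.softmax(logits)         # what finaloutput holds after the extra softmax

# Correct usage: the function applies softmax internally, so pass raw logits.
loss_from_logits = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)

# Double-softmax usage: probabilities get normalized a second time.
loss_from_probs = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=probs)

print(loss_from_logits.numpy())  # approx. 0.049, i.e. -log p(correct class)
print(loss_from_probs.numpy())   # approx. 0.340, noticeably inflated

Since softmax maps everything into [0, 1], the second softmax squashes the class scores toward a uniform distribution, so the reported loss no longer equals -log p(correct class) and the gradients become weaker.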