TensorFlow r0.12 softmax_cross_entropy_with_logits AssertionError


I have been trying to get "softmax_cross_entropy_with_logits" working as part of my cost function for a 147-class problem. I have the code working with "sigmoid_cross_entropy_with_logits" but would like to move to softmax.

I have tried a number of different approaches to get the code working, including reshaping from rank 3 to rank 2 (which didn't help), and am stuck. I tried some toy code in a Notebook, and there softmax_cross_entropy_with_logits raises no error. I also tried casting the float32 tensors to float64 (since my Notebook example used 64-bit and worked), but the assertion was still raised.

Here is the toy code:

import numpy as np
import tensorflow as tf

sess = tf.Session()

# Logits; these values reproduce the softmax outputs printed below.
y_hat = tf.convert_to_tensor(np.array([[0.5, 1.5, 0.1], [2.2, 1.3, 1.7]]))

y_hat_softmax = tf.nn.softmax(y_hat)
sess.run(y_hat_softmax)
# array([[ 0.227863  ,  0.61939586,  0.15274114],
#        [ 0.49674623,  0.20196195,  0.30129182]])

y_true = tf.convert_to_tensor(np.array([[0.0, 1.0, 0.0],[0.0, 0.0, 1.0]]))
sess.run(y_true)
# array([[ 0.,  1.,  0.],
#        [ 0.,  0.,  1.]])

loss_per_instance_2 = tf.nn.softmax_cross_entropy_with_logits(y_hat, y_true)
sess.run(loss_per_instance_2)
# array([ 0.4790107 ,  1.19967598])

cross_ent = tf.nn.softmax_cross_entropy_with_logits(y_hat, y_true)
print sess.run(cross_ent)
#[ 0.4790107   1.19967598]
print y_hat
#Tensor("Const:0", shape=(2, 3), dtype=float64)
print y_true
#Tensor("Const_1:0", shape=(2, 3), dtype=float64)
total_loss_2 = tf.reduce_mean(cross_ent)
sess.run(total_loss_2)
# 0.83934333897877922
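
As a sanity check, the same per-instance values fall out of the definition directly, computed by hand in the same session (a sketch using r0.12-era ops; tf.reduce_sum's second positional argument is the reduction axis):

# Cross-entropy by hand: -sum(y_true * log(softmax(y_hat))) per row.
manual_loss = -tf.reduce_sum(y_true * tf.log(y_hat_softmax), 1)
sess.run(manual_loss)
# array([ 0.4790107 ,  1.19967598])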

Here is my code fragment (the shapes are printed in the error below):

        self.error0        = tf.nn.softmax_cross_entropy_with_logits(tf.to_double(self.outputSplit0), tf.to_double(self.YactSplit0), "SoftMax0")
        self.error1        = tf.nn.softmax_cross_entropy_with_logits(self.outputSplit1, self.YactSplit1, "SoftMax1")
        self.error        = self.error0 + self.error1

What I am trying to do: I have two encoded "words" for each result, so I am now calculating the error separately for each word. That still didn't work; the error occurs on the first line above:

self.outputSplit0 Tensor("LSTM/Reshape_2:0", shape=(8000, 147), dtype=float32)
self.YactSplit0 Tensor("LSTM/Reshape_4:0", shape=(8000, 147), dtype=float32)
Traceback (most recent call last):
  File "modelbuilder.py", line 352, in <module>
    brain.create_variables()
  File "/home/greg/Model/LSTM_qnet.py", line 58, in create_variables
    self.error0        = tf.nn.softmax_cross_entropy_with_logits(tf.to_double(self.outputSplit0), tf.to_double(self.YactSplit0), "SoftMax0")
  File "/home/greg/tensorflow/_python_build/tensorflow/python/ops/nn_ops.py", line 1436, in softmax_cross_entropy_with_logits
    precise_logits = _move_dim_to_end(precise_logits, dim, input_rank)
  File "/home/greg/tensorflow/_python_build/tensorflow/python/ops/nn_ops.py", line 1433, in _move_dim_to_end
    0, [math_ops.range(dim_index), math_ops.range(dim_index + 1, rank),
  File "/home/greg/tensorflow/_python_build/tensorflow/python/ops/math_ops.py", line 1094, in range
    assert all(arg.dtype in dtype_hierarchy for arg in [start, limit, delta])
AssertionError

Any ideas what might be happening here? The error seems to come from the range function; I just can't figure out what I have done wrong.


1 Answer

Accepted answer by eager2learn:

The third positional argument of softmax_cross_entropy_with_logits is the dimension (dim), but you pass the name there instead, which is what triggers the assertion. You should pass the name as a keyword argument:

tf.nn.softmax_cross_entropy_with_logits(tf.to_double(self.outputSplit0), tf.to_double(self.YactSplit0), name="SoftMax0")
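
In r0.12 the signature is softmax_cross_entropy_with_logits(logits, labels, dim=-1, name=None), so the positional "SoftMax0" was being bound to dim, and math_ops.range then asserted on the string. Applied to both calls from the question's fragment, it would look roughly like this (a sketch; the self.* attributes are the asker's, and the tf.to_double casts can be dropped, since the dtype was never the problem):

# Pass name as a keyword so the third positional slot (dim) keeps its default.
self.error0 = tf.nn.softmax_cross_entropy_with_logits(
    self.outputSplit0, self.YactSplit0, name="SoftMax0")
self.error1 = tf.nn.softmax_cross_entropy_with_logits(
    self.outputSplit1, self.YactSplit1, name="SoftMax1")
self.error = self.error0 + self.error1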