Python pdb skipping hard-coded breakpoint


I'm running TensorFlow's sequence-to-sequence tutorial under pdb to figure out how it works. I want to hard-code a breakpoint inside a function whose call spans many lines of parameters, so that I don't have to step through each parameter one by one. However, pdb seems to skip right over the hard-coded breakpoint.
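For reference, this is how I expect a hard-coded breakpoint to behave (a minimal standalone sketch, not the tutorial code; the function name and values are made up):

import pdb

def build_model(vocab_size, size):
  # pdb should stop here as soon as the function body starts executing,
  # no matter how many parameters the call site passes.
  pdb.set_trace()
  return vocab_size * size

build_model(40000, 1024)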

The main file is translate.py, and I'm calling a constructor in seq2seq_model.py. The relevant code is shown below:

# Sampled softmax only makes sense if we sample less than vocabulary size.
pdb.set_trace()
if num_samples > 0 and num_samples < self.target_vocab_size:
  w_t = tf.get_variable("proj_w", [self.target_vocab_size, size], dtype=dtype)
  w = tf.transpose(w_t)
  b = tf.get_variable("proj_b", [self.target_vocab_size], dtype=dtype)
  output_projection = (w, b)

  def sampled_loss(labels, inputs):
    labels = tf.reshape(labels, [-1, 1])
    # We need to compute the sampled_softmax_loss using 32bit floats to
    # avoid numerical instabilities.
    local_w_t = tf.cast(w_t, tf.float32)
    local_b = tf.cast(b, tf.float32)
    local_inputs = tf.cast(inputs, tf.float32)
    return tf.cast(
        tf.nn.sampled_softmax_loss(local_w_t, local_b, local_inputs, labels,
                                   num_samples, self.target_vocab_size),
        dtype)
  softmax_loss_function = sampled_loss

UPDATE: When I type "list" in the debugger, the pdb.set_trace() line doesn't appear anywhere in the source listing of the file that contains the called function. I suspect the problem is related to this.
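To rule out the interpreter loading a different copy of the module than the one I edited (e.g., a stale installed version or a cached .pyc file), I could print the path of the module Python actually imported. This is a minimal diagnostic sketch of my own, assuming the tutorial's module name:

import seq2seq_model

# If this path is not the file I added pdb.set_trace() to, then pdb is
# executing a different copy of the source.
print(seq2seq_model.__file__)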
