How to make German to English Translation work?


I tried out the English to German translation in the Colab notebook 'Welcome to the Tensor2Tensor Colab', which works. But I must be missing something in the code to make it work for German to English.

According to the following page, https://github.com/tensorflow/tensor2tensor, I appended '_rev' to the problem name in order to 'reverse' the translation direction. The two changes compared to the original notebook are marked with '# <-------------':

# Fetch the problem
ende_problem = problems.problem("translate_ende_wmt32k_rev") # <------------- 

# Copy the vocab file locally so we can encode inputs and decode model outputs
# All vocabs are stored on GCS
vocab_name = "vocab.translate_ende_wmt32k.32768.subwords"
vocab_file = os.path.join(gs_data_dir, vocab_name)
!gsutil cp {vocab_file} {data_dir}

# Get the encoders from the problem
encoders = ende_problem.feature_encoders(data_dir)

# Setup helper functions for encoding and decoding
def encode(input_str, output_str=None):
  """Input str to features dict, ready for inference"""
  inputs = encoders["inputs"].encode(input_str) + [1]  # add EOS id
  batch_inputs = tf.reshape(inputs, [1, -1, 1])  # Make it 3D.
  return {"inputs": batch_inputs}

def decode(integers):
  """List of ints to str"""
  integers = list(np.squeeze(integers))
  if 1 in integers:
    integers = integers[:integers.index(1)]
  return encoders["inputs"].decode(np.squeeze(integers))

#Create hparams and the model
model_name = "transformer"
hparams_set = "transformer_base"

hparams = trainer_lib.create_hparams(hparams_set, data_dir=data_dir, problem_name="translate_ende_wmt32k_rev") # <-------------

# NOTE: Only create the model once when restoring from a checkpoint; it's a
# Layer and so subsequent instantiations will have different variable scopes
# that will not match the checkpoint.
translate_model = registry.model(model_name)(hparams, Modes.EVAL)

# Copy the pretrained checkpoint locally
ckpt_name = "transformer_ende_test"
gs_ckpt = os.path.join(gs_ckpt_dir, ckpt_name)
!gsutil -q cp -R {gs_ckpt} {checkpoint_dir}
ckpt_path = tf.train.latest_checkpoint(os.path.join(checkpoint_dir, ckpt_name))
ckpt_path

# Restore and translate!
def translate(inputs):
  encoded_inputs = encode(inputs)
  with tfe.restore_variables_on_create(ckpt_path):
    model_output = translate_model.infer(encoded_inputs)["outputs"]
  return decode(model_output)

inputs = "Sie ist zurückgetreten."
outputs = translate(inputs)

print("Inputs: %s" % inputs)
print("Outputs: %s" % outputs)
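As an aside, the decode() helper above truncates the model output at the EOS token (id 1) before detokenizing. In isolation, that trimming step is just the following (a minimal sketch; `trim_at_eos` is a name introduced here for illustration):

```python
# EOS_ID 1 marks end-of-sequence in T2T subword vocabularies;
# decode() drops it and everything after it before detokenizing.
EOS_ID = 1

def trim_at_eos(ids):
    """Return the token ids up to (but not including) the first EOS."""
    return ids[:ids.index(EOS_ID)] if EOS_ID in ids else ids

print(trim_at_eos([42, 7, 1, 99]))  # [42, 7]
print(trim_at_eos([5, 6]))          # [5, 6] (no EOS present)
```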

The output is as follows:

    Inputs: Sie ist zurückgetreten.
    Outputs: Sie sind zurückgetreten.

The translation still seems to be from English to German instead of vice versa.

What am I missing?

There is 1 answer

Answer by Martin Popel:

The model you are loading from a checkpoint (ckpt_name = "transformer_ende_test", downloaded from gs_ckpt_dir) was trained only for English-to-German. Appending '_rev' to the problem name reverses the source and target sides of the dataset, but it does not change the weights stored in the checkpoint. You would need to find a checkpoint of a model trained in the opposite direction, or train one yourself. I am not aware of any publicly available German-to-English T2T model checkpoint.
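Training such a reverse-direction model yourself would follow the standard T2T command-line workflow described in the tensor2tensor README, just with the '_rev' problem name. A rough sketch (the directory paths are placeholders, and flag spellings may differ slightly between T2T versions):

```shell
# Sketch only: generate data for and train a German-to-English model.
# The "_rev" suffix makes the problem swap source and target sides.
PROBLEM=translate_ende_wmt32k_rev
DATA_DIR=$HOME/t2t_data           # placeholder path
OUT_DIR=$HOME/t2t_train/$PROBLEM  # placeholder path

t2t-datagen --data_dir=$DATA_DIR --tmp_dir=/tmp/t2t_tmp --problem=$PROBLEM

t2t-trainer \
  --data_dir=$DATA_DIR \
  --problem=$PROBLEM \
  --model=transformer \
  --hparams_set=transformer_base \
  --output_dir=$OUT_DIR
```

The resulting checkpoint in OUT_DIR could then be used in place of "transformer_ende_test" in the notebook above.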