I have the following code in TensorFlow 1.0. I tried to migrate it to TensorFlow 2.0 using the tf_upgrade_v2 script. However, the script didn't find an equivalent function in the tf.compat.v1 module.
I was recommended to use tensorflow_addons. However, I don't see an equivalent of attention_decoder in the tfa module. Please guide me.
decoder_outputs, decoder_state = tf.contrib.legacy_seq2seq.attention_decoder(
    decoder_inputs=decoder_inputs,
    initial_state=encoder_state,
    attention_states=encoder_outputs,
    cell=cell,
    output_size=word_embedding_dim,
    loop_function=None if mode == 'pretrain' else feed_prev_loop,
    scope=scope,
)
The TF 1.0 code is here: https://github.com/yaushian/CycleGAN-sentiment-transfer/blob/master/lib/seq2seq.py
While there is no direct equivalent in the TensorFlow 2.x API, the original implementation can be revised to be compatible with the new API. I have made the conversion below, along with a simple test case to verify that it runs successfully.
As mentioned in the GitHub issue for the same question:
The optimal approach is probably to use the new RNN and attention modules introduced in the 2.x API, but for the sake of experimenting with scripts written against the 1.x API, similar to the one referenced in the question, this approach may be enough to bridge the gap.
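To make the porting work easier to check, here is a minimal NumPy sketch of the single-step Bahdanau-style attention that legacy attention_decoder computes internally (score the encoder states against the decoder query, softmax, then take a weighted sum as the context vector). All names and shapes here are illustrative, not taken from the TensorFlow source:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_step(query, attention_states, W_q, W_k, v):
    """One attention read, as done per decoder step in attention_decoder.

    query            : (cell_size,)  current decoder state
    attention_states : (attn_len, attn_size)  encoder outputs
    W_q, W_k, v      : learned projection parameters (illustrative)
    """
    # Additive (Bahdanau) score: v^T tanh(W_k @ keys + W_q @ query)
    scores = np.tanh(attention_states @ W_k + query @ W_q) @ v  # (attn_len,)
    weights = softmax(scores)            # attention distribution over encoder steps
    context = weights @ attention_states # (attn_size,) weighted sum of encoder outputs
    return context, weights

# Toy shapes to exercise the step
rng = np.random.default_rng(0)
attn_len, attn_size, cell_size, attn_vec = 5, 8, 6, 4
enc_outputs = rng.standard_normal((attn_len, attn_size))
dec_state = rng.standard_normal(cell_size)
W_q = rng.standard_normal((cell_size, attn_vec))
W_k = rng.standard_normal((attn_size, attn_vec))
v = rng.standard_normal(attn_vec)

context, weights = attention_step(dec_state, enc_outputs, W_q, W_k, v)
```

In the TF 2.x port, this per-step read is what gets replaced by the new attention layers (or by tfa.seq2seq's attention wrappers); verifying that your ported decoder produces a proper distribution over encoder steps and a context of the encoder's hidden size is a quick sanity check.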