To use Word Mover's Distance (WMD) we need word embeddings. In this example the pre-trained embedding 'word2vec-google-news-300' provided by Gensim is used.
Below is the code snippet:
import gensim.downloader as api

model = api.load('word2vec-google-news-300')  # pre-trained Google News vectors
sent1 = ['obama', 'speaks', 'media', 'illinois']      # WMD expects each document as a list of tokens
sent2 = ['president', 'greets', 'press', 'chicago']
distance = model.wmdistance(sent1, sent2)
How can I use my own customised embedding in place of that one, and how can I load it into the model? For example, the embedding looks like this: {text: 1-D NumPy array}
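A minimal sketch of one possible approach, assuming Gensim 4.x (where KeyedVectors.add_vectors is available; in Gensim 3.x the equivalent method is add) and assuming custom_embedding is a hypothetical dict mapping each word to a 1-D NumPy array of the same dimensionality:

import numpy as np
from gensim.models import KeyedVectors

# custom_embedding is a stand-in for your own {word: 1-D NumPy array} mapping
custom_embedding = {
    'obama': np.random.rand(300),
    'president': np.random.rand(300),
    'speaks': np.random.rand(300),
    'greets': np.random.rand(300),
    'media': np.random.rand(300),
    'press': np.random.rand(300),
}

words = list(custom_embedding.keys())
vectors = np.array([custom_embedding[w] for w in words])

# Build an empty KeyedVectors of the right dimensionality and add the custom vectors
kv = KeyedVectors(vector_size=vectors.shape[1])
kv.add_vectors(words, vectors)

# wmdistance is defined on KeyedVectors, so it works with the custom vectors as well
sent1 = ['obama', 'speaks', 'media']
sent2 = ['president', 'greets', 'press']
distance = kv.wmdistance(sent1, sent2)
print(distance)

Note that wmdistance needs an optimal-transport backend installed (pyemd in older Gensim releases, POT in newer ones). Alternatively, the dictionary could be written out in word2vec text format and loaded with KeyedVectors.load_word2vec_format.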