How do I calculate sentence perplexity using torch-rb?

I have translated this example of how to calculate perplexity into Ruby, and I am now trying to adapt it to calculate the perplexity of a given sentence.

However, Torch::NN::Functional.binary_cross_entropy requires that all elements of the input be between 0 and 1, so as it stands the call fails.

Is #binary_cross_entropy the correct method to use here? How should I use it to calculate perplexity?

require "blingfire"
require "torch"

model = BlingFire.load_model("gpt2.bin")
BlingFire.change_settings_dummy_prefix(model, false)

sentence = "How now brown cow."
ids = model.text_to_ids(sentence).map(&:to_f)

output = Torch.tensor(ids) #=> tensor([2437.,  783., 7586., 9875.,   13.])
# raises: "all elements of input should be between 0 and 1"
loss = Torch::NN::Functional.binary_cross_entropy(output, Torch.tensor(0.9)) # placeholder target

# calculating perplexity
perplexity = Torch.exp(loss)
puts("Loss: #{loss} PP: #{perplexity}")

Edit

I see from the documentation that the input should be probabilities. I still don't know how to convert the values in output into probabilities.

Flailing, I've stumbled on the following, but it doesn't seem right: the resulting probabilities all come out as either 0 or 1.

Torch.softmax(output, dim: 0)
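
Presumably that's because these are token IDs rather than logits: the values differ by thousands, so the softmax saturates and everything underflows except the largest entry. For example:

Torch.softmax(Torch.tensor([2437.0, 783.0, 7586.0, 9875.0, 13.0]), dim: 0)
#=> tensor([0., 0., 0., 1., 0.])

Softmax only produces a meaningful distribution when applied to a model's logits, not to the token IDs themselves.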
