pyro.ai: AutoNormal and constraints


I followed the code example on Bayesian Linear Regression given here: http://pyro.ai/examples/intro_long.html

The example shows how to use the autoguide "AutoNormal" to approximate the posterior distribution with independent normals. However, when I examine the learned parameters, I stumble upon the following:

b = auto_guide.named_parameters()
next(b)  # call repeatedly until the sigma parameter appears in the output

('locs.sigma_unconstrained', Parameter containing:
tensor(-2.2059, requires_grad=True))

Now in the underlying model (see the link above for the full description), sigma is defined as follows:

sigma = pyro.sample("sigma", dist.Uniform(0., 10.))

My question: the name "locs.sigma_unconstrained" seems to indicate that sigma is treated as unconstrained, which it should not be, because it has to lie in (0, 10). If that were literally true, I can hardly make sense of the negative value in the tensor (-2.2059). If, on the other hand, sigma is transformed under the hood, e.g. optimized on a log or logit scale, then "locs.sigma_unconstrained" would refer to the location parameter of the transformed sigma, and a negative value would be fine. In that case, however, I wonder how AutoNormal() knows which constraint to apply: apart from the Uniform prior, the model was not set up with any explicit constraints. Any clarification on the meaning of this would be great!
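To check the second interpretation, I tried mapping the stored value back through a constraint transform myself. This is only my own sketch using torch.distributions.transform_to; I am assuming (not certain) that AutoNormal uses the same kind of bijection machinery internally:

```python
import torch
from torch.distributions import constraints, transform_to

# sigma's prior is Uniform(0., 10.), so its support is the interval (0, 10).
# transform_to gives a bijection from the unconstrained real line onto that
# interval (a sigmoid followed by an affine rescaling).
t = transform_to(constraints.interval(0.0, 10.0))

# The value stored in the guide under "locs.sigma_unconstrained":
loc_unconstrained = torch.tensor(-2.2059)

# Mapping it through the bijection lands strictly inside (0, 10), which
# would explain how a negative stored value is consistent with the prior.
sigma = t(loc_unconstrained)
print(sigma)  # a value strictly between 0 and 10
```

If this is indeed what happens under the hood, the negative number would simply be the mean of the approximating normal in the transformed (unconstrained) space, not a value of sigma itself.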


There are 0 answers