Ran into this while working on an upstream PR: TuringLang/DynamicPPL.jl#967
```julia
julia> using Distributions: LogNormal, Normal

julia> using Bijectors: bijector, inverse, output_size

julia> for dist in (LogNormal(), Normal())
           b = bijector(dist)
           sz = output_size(b, size(dist))
           y = randn(sz)
           b_inv = inverse(b)
           x = b_inv(y)
           @show typeof(x)
       end
typeof(x) = Float64
typeof(x) = Array{Float64, 0}
```
The workaround for this is straightforward (a sketch of what I mean is below), but I feel like it shouldn't be DynamicPPL's responsibility to work around it?
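For concreteness, this is roughly the kind of workaround I have in mind on the DynamicPPL side (a hypothetical helper, not actual DynamicPPL code; it reuses `b_inv`, `y`, and `dist` from the loop above):

```julia
# Hypothetical helper, not actual DynamicPPL code: normalise the result of
# the inverse transform so 0-dim distributions always yield a 0-dim array.
fixup(x::Real, ::Tuple{}) = fill(x)   # scalar -> 0-dim Array
fixup(x::AbstractArray, ::Tuple) = x  # arrays pass through unchanged

x = fixup(b_inv(y), size(dist))
```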
I guess there are a couple of possible ways out, though neither strikes me as obviously correct:
- Make `output_size` return `(1,)` if `size(dist) == ()`. Would make life easier on the DynamicPPL side as I wouldn't have to deal with any 0-dim arrays. However, I don't think this is technically correct.
- Keep the behaviour of `output_size` but change the behaviour for `LogNormal` to match that for `Normal`, i.e. both cases should return 0-dim arrays. This would be more principled, but I'm frankly unsure if this is possible because Julia is weird with 0-dim arrays anyway (see the sketch after this list for what I mean).
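To make the second option concrete, the elementwise inverse would need to special-case 0-dim inputs instead of relying on plain broadcasting. A minimal sketch (hypothetical function name, not actual Bijectors code):

```julia
# Hypothetical sketch: apply an elementwise transform while preserving
# 0-dim arrays, instead of letting broadcasting collapse them to scalars.
apply_elementwise(f, y::AbstractArray{<:Real,0}) = fill(f(only(y)))  # re-wrap into 0-dim
apply_elementwise(f, y::AbstractArray{<:Real}) = f.(y)               # usual broadcast
apply_elementwise(f, y::Real) = f(y)                                 # plain scalars
```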
```julia
julia> x = randn(())
0-dimensional Array{Float64, 0}:
 1.427397389133524

julia> exp.(x) # this is the source of the problem isn't it
4.167837804112459
```
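Yes: broadcasting over exclusively 0-dim arguments unwraps the result to a plain scalar, whereas `Normal`'s bijector is (as far as I can tell) just `identity`, which returns the 0-dim array untouched. Re-wrapping with `fill` restores the 0-dim array:

```julia
julia> identity(x)   # what happens for Normal: the 0-dim array passes through
0-dimensional Array{Float64, 0}:
 1.427397389133524

julia> fill(exp.(x)) # one way to re-wrap the scalar into a 0-dim array
0-dimensional Array{Float64, 0}:
 4.167837804112459
```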