AbstractPPL@0.12 #970
Conversation
Benchmark Report for Commit aaa1e8d
Computer Information
Benchmark Results
Codecov Report
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #970      +/-   ##
==========================================
+ Coverage   82.98%   83.00%   +0.01%
==========================================
  Files          36       36
  Lines        3961     3965       +4
==========================================
+ Hits         3287     3291       +4
  Misses        674      674

Pull Request Test Coverage Report for Build 16114056967
💛 - Coveralls
DynamicPPL.jl documentation for PR #970 is available at:
src/abstract_varinfo.jl
-    return !isempty(vns) && all(Base.Fix1(istrans, vi), vns)
+    # This used to be: `!isempty(vns) && all(Base.Fix1(istrans, vi), vns)`.
+    # In theory that should work perfectly fine. For unbeknownst reasons,
+    # Julia 1.10 fails to infer its return type correctly. Thus we use this
+    # slightly longer definition.
+    isempty(vns) && return false
+    for vn in vns
+        istrans(vi, vn) || return false
+    end
+    return true
 end
See main PR comment for rationale (if interested). This is the only meaningful change in this PR; the rest is quite mundane.
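For anyone who wants to poke at this kind of inference question outside DynamicPPL, @inferred and Base.return_types are handy. A small self-contained sketch, with invented helper names standing in for the two definitions above (these toy types do not trigger the 1.10 quirk; the point is just how to check):

using Test

# Toy stand-ins for the one-liner and the expanded loop (not DynamicPPL's code).
all_via_all(f, xs) = !isempty(xs) && all(f, xs)

function all_via_loop(f, xs)
    isempty(xs) && return false
    for x in xs
        f(x) || return false
    end
    return true
end

# @inferred errors if the return type is not concretely inferred;
# Base.return_types reports what the compiler deduces for given argument types.
@inferred all_via_loop(iseven, [2, 4, 6])
Base.return_types(all_via_all, (typeof(iseven), Vector{Int}))   # [Bool] on these toy types
Base.return_types(all_via_loop, (typeof(iseven), Vector{Int}))  # [Bool]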
Thanks for the AbstractPPL changes, very happy to be rid of the VN comparison bug. By extension, of course, happy with these changes too. Just one performance-related question.
src/simple_varinfo.jl (outdated)
@@ -356,13 +356,15 @@ function BangBang.setindex!!(vi::SimpleVarInfo, vals, vns::AbstractVector{<:VarN
     return vi
 end

-function BangBang.setindex!!(vi::SimpleVarInfo{<:AbstractDict}, val, vn::VarName)
+function BangBang.setindex!!(
+    vi::SimpleVarInfo{<:AbstractDict}, val, vn::VarName{sym}
Note that VarName{sym} where {sym} function signatures might change performance, because they force specialisation on that argument. If they do, and if they make it worse, this could always be circumvented with a getsymbol(vn::VarName{sym}) = sym function (probably that exists already).

I tried to compare these two benchmark results to see if there is an effect: #970 (comment) and #966 (comment). It does look like everything has gotten a bit slower, but this could easily be a GHA benchmarking fluctuation. If it's not too much trouble, would you be happy to check the benchmarks locally in a more controlled environment?
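To make the two styles concrete, here is a minimal sketch; the symbol_of names are invented for illustration, and getsym is the existing AbstractPPL accessor mentioned in the reply below:

using AbstractPPL: VarName, getsym, @varname

# Carries the symbol as a static parameter: the pattern that forces
# specialisation on that argument (the concern raised above).
symbol_of(vn::VarName{sym}) where {sym} = sym

# Keeps the signature generic and pulls the symbol out via the accessor instead.
symbol_of_generic(vn::VarName) = getsym(vn)

symbol_of(@varname(x)) == symbol_of_generic(@varname(x)) == :x  # true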
Oh, that's a really good catch. I think this is the only function which might be on a performance-sensitive path, but I'll change them all back anyway. The old definition of VarName(vn, optic) was VarName{getsym(vn)}(optic), so sticking to that should preserve exactly the same behaviour as before.
I had looked at the benchmarks as I was worried about the type stability thing -- isn't this PR faster on most of them (though I imagine very much within error)?
Reverting this fixed the original istrans type instability, but now there's another one. This might take a while.
In the links above, this PR is a bit slower on most of them. I wouldn't put any serious stock into those benchmarks without trying them in a more stable environment first, though.
I'd also be happy with a conclusion that the VarName{sym} where {sym} signatures have no (noticeable) impact on benchmarks and this can be merged as-is.
OK, to my relief, it wasn't a different type stability problem, it was the same one. So the same fix (manually expanding all) still works.

Benchmarked smorgasbord:
using DynamicPPL, Distributions, BenchmarkTools
@model function smorgasbord(x, y, ::Type{TV}=Vector{Float64}) where {TV}
@assert length(x) == length(y)
m ~ truncated(Normal(); lower=0)
means ~ product_distribution(fill(Exponential(m), length(x)))
stds = TV(undef, length(x))
stds .~ Gamma(1, 1)
for i in 1:length(x)
x[i] ~ Normal(means[i], stds[i])
end
y ~ product_distribution(map((mean, std) -> Normal(mean, std), means, stds))
0.0 ~ Normal(sum(y), 1)
return (; m=m, means=means, stds=stds)
end
m = smorgasbord(randn(100), randn(100))
vi = VarInfo(m)
ctx_def = DefaultContext()
ctx_spl = SamplingContext()
Main:
julia> @benchmark DynamicPPL.evaluate!!($m, $vi, $ctx_def)
BenchmarkTools.Trial: 10000 samples with 1 evaluation per sample.
Range (min … max): 11.250 μs … 52.083 μs ┊ GC (min … max): 0.00% … 0.00%
Time (median): 12.292 μs ┊ GC (median): 0.00%
Time (mean ± σ): 12.525 μs ± 1.371 μs ┊ GC (mean ± σ): 0.00% ± 0.00%
▂ ▃▃▃▃▄▄█▂▃▁ ▁
▃▇█▆▄███████████▇█▆▅▄▃▂▄▄▃▃▃▂▂▂▂▂▁▁▂▂▁▂▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁ ▃
11.2 μs Histogram: frequency by time 16.6 μs <
Memory estimate: 16.94 KiB, allocs estimate: 321.
julia> @benchmark DynamicPPL.evaluate!!($m, $vi, $ctx_spl)
BenchmarkTools.Trial: 10000 samples with 1 evaluation per sample.
Range (min … max): 20.416 μs … 129.583 μs ┊ GC (min … max): 0.00% … 0.00%
Time (median): 23.208 μs ┊ GC (median): 0.00%
Time (mean ± σ): 23.595 μs ± 2.289 μs ┊ GC (mean ± σ): 0.00% ± 0.00%
▂▃▆▆█▅▃▃▂
▁▁▂▂▃▂▃▄▅▇████████████▆▆▅▅▄▃▃▃▃▂▃▂▂▂▂▂▂▂▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁ ▃
20.4 μs Histogram: frequency by time 30.6 μs <
Memory estimate: 17.75 KiB, allocs estimate: 333.
This PR:
julia> @benchmark DynamicPPL.evaluate!!($m, $vi, $ctx_def)
BenchmarkTools.Trial: 10000 samples with 1 evaluation per sample.
Range (min … max): 11.375 μs … 3.182 ms ┊ GC (min … max): 0.00% … 98.77%
Time (median): 12.083 μs ┊ GC (median): 0.00%
Time (mean ± σ): 12.548 μs ± 31.706 μs ┊ GC (mean ± σ): 2.50% ± 0.99%
▂ ▇█ █▇▇ ▆▄ ▃▁
▂▅███████████▆███▄▆▅▃▄▄▄▃▃▃▃▃▂▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▁▂▂▁▂▂▂▂▂▂▂▂▂▂ ▄
11.4 μs Histogram: frequency by time 15.6 μs <
Memory estimate: 16.94 KiB, allocs estimate: 321.
julia> @benchmark DynamicPPL.evaluate!!($m, $vi, $ctx_spl)
BenchmarkTools.Trial: 10000 samples with 1 evaluation per sample.
Range (min … max): 21.333 μs … 7.169 ms ┊ GC (min … max): 0.00% … 99.12%
Time (median): 24.083 μs ┊ GC (median): 0.00%
Time (mean ± σ): 25.260 μs ± 71.474 μs ┊ GC (mean ± σ): 2.81% ± 0.99%
▁▇▅█▄▆▃▆▁▃▁▂
▂▂▁▂▁▂▂▂▃▃▇████████████▇█▅▇▆▇▅▆▄▄▄▄▄▄▄▄▃▃▃▃▃▃▂▃▂▃▂▂▂▂▂▂▂▂▂▂ ▄
21.3 μs Histogram: frequency by time 29.9 μs <
Memory estimate: 17.75 KiB, allocs estimate: 333.
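For comparing two runs like the above without eyeballing histograms, BenchmarkTools also provides judge; a hedged sketch (the file names are hypothetical):

using BenchmarkTools
# On main:    t = @benchmark DynamicPPL.evaluate!!($m, $vi, $ctx_def); BenchmarkTools.save("main.json", t)
# On this PR: t = @benchmark DynamicPPL.evaluate!!($m, $vi, $ctx_def); BenchmarkTools.save("pr.json", t)
t_main = BenchmarkTools.load("main.json")[1]
t_pr   = BenchmarkTools.load("pr.json")[1]
judge(median(t_pr), median(t_main))  # reports :improvement, :invariant, or :regression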
This reverts commit 67bb055.
Thanks @penelopeysm!
Bump compat for AbstractPPL following TuringLang/AbstractPPL.jl#123.
The test failure on 1.10 (CI log here) was, bizarrely, due to Julia inferring Union{Missing,Bool} for all(Base.Fix1(istrans, vi), vns) (inside the definition of istrans):

DynamicPPL.jl/src/abstract_varinfo.jl, line 483 in 92f6eea

This is technically not wrong, because all(f, xs) returns missing if f(x) === missing for any x in xs. The compiler can only infer Bool if it knows that none of the f(x)'s are missing.

So, this suggests that I should check for a loss of type stability in @code_warntype istrans(vi, vns[1]), right? Well, turns out that that's perfectly type stable.

Anyway, I fixed this by manually expanding the definition of all. I spent a while looking into this with Cthulhu, but I eventually concluded that it wasn't really worth my time, because it works perfectly fine on 1.11, indicating that this is a Julia issue rather than our code being fundamentally wrong. I had hoped that we could solve this by bumping the minimum patch version of Julia, but even the latest 1.10.10 release has this 'problem'.
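For illustration, this is the missing-propagating behaviour of all described above:

julia> all(x -> x > 0, [1.0, missing])
missing

julia> all(x -> x > 0, [1.0, -1.0, missing])
false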