ntuple for inference #168
Conversation
On this example:

```julia
using DifferentialEquations, SnoopCompile

function lorenz(du,u,p,t)
    du[1] = 10.0(u[2]-u[1])
    du[2] = u[1]*(28.0-u[3]) - u[2]
    du[3] = u[1]*u[2] - (8/3)*u[3]
end

u0 = [1.0;0.0;0.0]
tspan = (0.0,100.0)
prob = ODEProblem(lorenz,u0,tspan)
alg = Rodas5(chunk_size = Val{3}())
tinf = @snoopi_deep solve(prob,alg)
solve(prob,alg)
```

we saw just two inference triggers:

```julia
julia> itrigs = inference_triggers(tinf)
2-element Vector{InferenceTrigger}:
 Inference triggered to call setindex!(::Vector{Tuple{Float64, Float64, Float64}}, ::Tuple{Bool, Bool, Bool}, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:75) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call OrdinaryDiffEq.jacobian2W!(::Matrix{Float64}, ::LinearAlgebra.UniformScaling{Bool}, ::Float64, ::Matrix{Float64}, ::Bool) called from toplevel
```

The second one is clear. The first one: okay, just use `ntuple` on line 75, right? So I did that, and it made the function *more* type-unstable?
Look at the itrigs after that:

```julia
julia> itrigs = inference_triggers(tinf)
6-element Vector{InferenceTrigger}:
 Inference triggered to call hcat(::BitMatrix, ::BitMatrix) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:67) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call Vector{Tuple{Float64, Float64, Float64}}(::UndefInitializer, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:73) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call ntuple(::SparseDiffTools.var"#15#16"{3, Int64, Int64}, ::Val{3}) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:75) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call setindex!(::Vector{Tuple{Float64, Float64, Float64}}, ::Tuple{Bool, Bool, Bool}, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:75) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call setindex!(::Vector{Vector{Tuple{Float64, Float64, Float64}}}, ::Vector{Tuple{Float64, Float64, Float64}}, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:77) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call OrdinaryDiffEq.jacobian2W!(::Matrix{Float64}, ::LinearAlgebra.UniformScaling{Bool}, ::Float64, ::Matrix{Float64}, ::Bool) called from toplevel
```

@chriselrod or @YingboMa, could I get some Cthulhu magic to look at what's going on there?
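For context, the kind of instability at play here can be reproduced with a stdlib-only sketch (the `Box`/`stable`/`unstable` names are hypothetical illustrations, not the actual SparseDiffTools code): `ntuple(f, Val(N))` only produces a fully inferred `NTuple` when the generator closure itself infers concretely; if the closure captures an abstractly typed value, inference can only conclude `NTuple{N,Any}`.

```julia
using Test

# Val-based ntuple with a well-typed closure: the captured `k::Int` is
# concrete, so the result infers as NTuple{3,Int}.
stable(k::Int) = ntuple(i -> i * k, Val(3))

# A struct with an untyped field (field type defaults to Any).
struct Box
    x
end

# Same ntuple call, but the closure captures `b.x::Any`, so the best
# inference can do for the result is Tuple{Any, Any, Any}.
unstable(b::Box) = ntuple(i -> i * b.x, Val(3))

@inferred stable(2)                      # passes: return type is NTuple{3,Int}
Base.return_types(unstable, (Box,))[1]   # Tuple{Any, Any, Any}
```

So an `ntuple(f, Val(N))` rewrite is only a win if the closure it builds is itself inference-friendly; otherwise it can trade one trigger for several, which matches the pattern in the itrigs above.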
Co-authored-by: Yingbo Ma <[email protected]>
Codecov Report
```diff
@@            Coverage Diff             @@
##           master     #168      +/-   ##
==========================================
+ Coverage   78.86%   78.89%   +0.02%
==========================================
  Files          14       14
  Lines         743      744       +1
==========================================
+ Hits          586      587       +1
  Misses        157      157
```
Continue to review full report at Codecov.
Nope, @YingboMa, that also causes:

```julia
julia> itrigs = inference_triggers(tinf)
6-element Vector{InferenceTrigger}:
 Inference triggered to call hcat(::BitMatrix, ::BitMatrix) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:67) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call Vector{Tuple{Float64, Float64, Float64}}(::UndefInitializer, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:73) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call ntuple(::SparseDiffTools.var"#15#16"{3, Int64, Int64}, ::Val{3}) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:75) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call setindex!(::Vector{Tuple{Float64, Float64, Float64}}, ::Tuple{Bool, Bool, Bool}, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:75) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call setindex!(::Vector{Vector{Tuple{Float64, Float64, Float64}}}, ::Vector{Tuple{Float64, Float64, Float64}}, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:77) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call OrdinaryDiffEq.jacobian2W!(::Matrix{Float64}, ::UniformScaling{Bool}, ::Float64, ::Matrix{Float64}, ::Bool) called from toplevel
```
I'll try to play with this locally and submit a PR.
Actually, this seems better: https://github.com/JuliaDiff/SparseDiffTools.jl/pull/169/files