
ntuple for inference #168

Closed
wants to merge 2 commits into from

Conversation

ChrisRackauckas (Member)

On this example:

```julia
using DifferentialEquations, SnoopCompile

function lorenz(du, u, p, t)
    du[1] = 10.0(u[2] - u[1])
    du[2] = u[1] * (28.0 - u[3]) - u[2]
    du[3] = u[1] * u[2] - (8 / 3) * u[3]
end

u0 = [1.0; 0.0; 0.0]
tspan = (0.0, 100.0)
prob = ODEProblem(lorenz, u0, tspan)
alg = Rodas5(chunk_size = Val{3}())
tinf = @snoopi_deep solve(prob, alg)
solve(prob, alg)
```

we saw just two inference triggers:

```julia
julia> itrigs = inference_triggers(tinf)
2-element Vector{InferenceTrigger}:
 Inference triggered to call setindex!(::Vector{Tuple{Float64, Float64, Float64}}, ::Tuple{Bool, Bool, Bool}, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:75) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call OrdinaryDiffEq.jacobian2W!(::Matrix{Float64}, ::LinearAlgebra.UniformScaling{Bool}, ::Float64, ::Matrix{Float64}, ::Bool) called from toplevel
```
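
For context on the first trigger: the stored value is a `Tuple{Bool, Bool, Bool}` going into a `Vector{Tuple{Float64, Float64, Float64}}`, so `setindex!` has to run an implicit `convert` that inference only discovers at the call site. A minimal standalone illustration of that mismatch (hypothetical repro, not code from this PR):

```julia
# Storing a Tuple of Bools into a Vector of Float64 tuples goes through
# convert(Tuple{Float64, Float64, Float64}, ...) inside setindex!.
v = Vector{Tuple{Float64, Float64, Float64}}(undef, 1)
v[1] = (true, false, false)  # elementwise Bool -> Float64 conversion
```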

The second one is clear (it's called from toplevel), so for the first one: okay, just use ntuple on line 75, right? So I did that, and it made the function more type-unstable. Look at the itrigs after that:

```julia
julia> itrigs = inference_triggers(tinf)
6-element Vector{InferenceTrigger}:
 Inference triggered to call hcat(::BitMatrix, ::BitMatrix) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:67) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call Vector{Tuple{Float64, Float64, Float64}}(::UndefInitializer, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:73) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call ntuple(::SparseDiffTools.var"#15#16"{3, Int64, Int64}, ::Val{3}) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:75) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call setindex!(::Vector{Tuple{Float64, Float64, Float64}}, ::Tuple{Bool, Bool, Bool}, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:75) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call setindex!(::Vector{Vector{Tuple{Float64, Float64, Float64}}}, ::Vector{Tuple{Float64, Float64, Float64}}, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:77) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call OrdinaryDiffEq.jacobian2W!(::Matrix{Float64}, ::LinearAlgebra.UniformScaling{Bool}, ::Float64, ::Matrix{Float64}, ::Bool) called from toplevel
```
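
For reference, the change I tried was along these lines; this is a sketch with illustrative names (`idx`, `chunksize`), not the actual SparseDiffTools diff:

```julia
chunksize = Val(3)  # matches the chunk_size handed to Rodas5 above
idx = 1

# Before (illustrative): a Tuple built from a generator of Bools comes out
# as Tuple{Bool, Bool, Bool}, so storing it in a Vector{NTuple{3, Float64}}
# needs the convert/setindex! specialization from the first trigger list.
seed_before = Tuple(j == idx for j in 1:3)

# After (illustrative): ntuple with a Val fixes the length at compile time,
# but as the trigger list above shows, the closure it creates (the
# var"#15#16" specialization) can itself land behind a runtime dispatch.
seed_after = ntuple(j -> j == idx, chunksize)
```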

@chriselrod or @YingboMa could I get some Cthulhu magic to look at what's going on there?
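
(Side note for anyone digging in: a SnoopCompile trigger can be handed to Cthulhu directly, assuming Cthulhu is loaded in the session; `itrigs[3]` below is just the ntuple entry from the list above.)

```julia
using Cthulhu  # enables SnoopCompile's ascend integration for triggers

# Open Cthulhu's interactive viewer on the call stack that produced the
# ntuple trigger, to see where inference loses track of the types.
ascend(itrigs[3])
```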
@codecov-commenter

Codecov Report

Merging #168 (8691112) into master (f10cda4) will increase coverage by 0.02%.
The diff coverage is 100.00%.

```
@@            Coverage Diff             @@
##           master     #168      +/-   ##
==========================================
+ Coverage   78.86%   78.89%   +0.02%
==========================================
  Files          14       14
  Lines         743      744       +1
==========================================
+ Hits          586      587       +1
  Misses        157      157
```

| Impacted Files | Coverage Δ |
| --- | --- |
| src/differentiation/compute_jacobian_ad.jl | 94.17% <100.00%> (+0.02%) ⬆️ |


@ChrisRackauckas (Member, Author)

Nope, @YingboMa, that also causes:

```julia
julia> itrigs = inference_triggers(tinf)
6-element Vector{InferenceTrigger}:
 Inference triggered to call hcat(::BitMatrix, ::BitMatrix) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:67) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call Vector{Tuple{Float64, Float64, Float64}}(::UndefInitializer, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:73) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call ntuple(::SparseDiffTools.var"#15#16"{3, Int64, Int64}, ::Val{3}) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:75) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call setindex!(::Vector{Tuple{Float64, Float64, Float64}}, ::Tuple{Bool, Bool, Bool}, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:75) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call setindex!(::Vector{Vector{Tuple{Float64, Float64, Float64}}}, ::Vector{Tuple{Float64, Float64, Float64}}, ::Int64) from generate_chunked_partials (C:\Users\accou\.julia\dev\SparseDiffTools\src\differentiation\compute_jacobian_ad.jl:77) with specialization SparseDiffTools.generate_chunked_partials(::Vector{Float64}, ::UnitRange{Int64}, ::Val{3})
 Inference triggered to call OrdinaryDiffEq.jacobian2W!(::Matrix{Float64}, ::UniformScaling{Bool}, ::Float64, ::Matrix{Float64}, ::Bool) called from toplevel
```

@YingboMa (Member) commented Dec 8, 2021

I'll try to play with this locally and submit a PR.

@chriselrod (Contributor)

@ChrisRackauckas deleted the ntuple branch December 8, 2021 19:37