Closed: timholy closed this 3 years ago
Which branch should one use to test it out? I removed SnoopCompileCore as a separate package (it seems to be part of this one now), but from this branch the _deep tools do not load:
(@v1.6) pkg> st SnoopCompile
Status `~/.julia/environments/v1.6/Project.toml`
[aa65fe97] SnoopCompile v2.1.2 `https://github.com/timholy/SnoopCompile.jl.git#teh/docs`
julia> using SnoopCompile
julia> names(SnoopCompile)
10-element Vector{Symbol}:
Symbol("@snoopc")
Symbol("@snoopi")
Symbol("@snoopr")
:SnoopCompile
:ascend
:filtermod
:findcaller
:invalidation_trees
:read_snoopl
:uinvalidated
julia> VERSION
v"1.6.0-DEV.1780"
It's both separate and together: the usual dependency rules mean you get SnoopCompileCore if you install SnoopCompile. But the oddity is that if you dev SnoopCompile but not SnoopCompileCore, you can get a mismatch, because SnoopCompileCore lives in the same repo as SnoopCompile. So just make sure you dev both and it should be good---let me know if not!
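For concreteness, a sketch of what dev-ing both might look like (the local clone path, and SnoopCompileCore living in a subdirectory of that clone, are assumptions here):

```julia
# From the Pkg REPL (press `]` at the julia> prompt).
# Dev the outer package and the subdirectory package from the same clone,
# so the two stay in sync:
(@v1.6) pkg> dev ~/.julia/dev/SnoopCompile ~/.julia/dev/SnoopCompile/SnoopCompileCore
```

With both dev'ed from one clone, checking out a branch in that clone updates the two packages together, avoiding the mismatch described above.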
For anyone wondering about further changes: I do expect to slightly rework the internals of InferenceTrigger and to support creating flamegraphs from child nodes. I just pushed the teh/subflames branch, which encapsulates this PR and the remaining changes I envision.
Thanks, it is working now.
Note to all: barring feedback, merge day is Tuesday, Jan 5, in the very early hours of the morning (UTC-6).
What's still not very clear to me is how to get a MethodInstance from the output of @snoopi_deep (or its flattened version). Sorry if I missed it.
No worries, and thanks for the feedback!
What's still not very clear to me is how to get a MethodInstance from the output of @snoopi_deep (or its flattened version)
Hmm, that could indeed be clarified. So you don't have to wait, here's a demo:
julia> using SnoopCompile
julia> tinf = SnoopCompile.flatten_demo()
InferenceTimingNode: 0.002332952/0.0030147330000000003 on InferenceFrameInfo for Core.Compiler.Timings.ROOT() with 1 direct children
julia> using AbstractTrees
julia> print_tree(tinf)
InferenceTimingNode: 0.00233295/0.00301473 on InferenceFrameInfo for Core.Compiler.Timings.ROOT() with 1 direct children
└─ InferenceTimingNode: 0.000179361/0.000681781 on InferenceFrameInfo for SnoopCompile.FlattenDemo.packintype(::Int64) with 2 direct children
├─ InferenceTimingNode: 0.000119137/0.000119137 on InferenceFrameInfo for MyType{Int64}(::Int64) with 0 direct children
└─ InferenceTimingNode: 0.000105458/0.000383283 on InferenceFrameInfo for SnoopCompile.FlattenDemo.dostuff(::MyType{Int64}) with 2 direct children
├─ InferenceTimingNode: 7.3179e-5/0.0001366 on InferenceFrameInfo for SnoopCompile.FlattenDemo.extract(::MyType{Int64}) with 2 direct children
│ ├─ InferenceTimingNode: 3.7388e-5/3.7388e-5 on InferenceFrameInfo for getproperty(::MyType{Int64}, ::Symbol) with 0 direct children
│ └─ InferenceTimingNode: 2.6033e-5/2.6033e-5 on InferenceFrameInfo for getproperty(::MyType{Int64}, x::Symbol) with 0 direct children
└─ InferenceTimingNode: 0.000141225/0.000141225 on InferenceFrameInfo for SnoopCompile.FlattenDemo.domath(::Int64) with 0 direct children
julia> Core.MethodInstance(tinf)
MethodInstance for ROOT()
julia> Core.MethodInstance(tinf.children[1])
MethodInstance for packintype(::Int64)
julia> Core.MethodInstance(tinf.children[2])
ERROR: BoundsError: attempt to access 1-element Vector{SnoopCompileCore.InferenceTimingNode} at index [2]
Stacktrace:
[1] getindex(A::Vector{SnoopCompileCore.InferenceTimingNode}, i1::Int64)
@ Base ./array.jl:787
[2] top-level scope
@ REPL[7]:1
julia> Core.MethodInstance(tinf.children[1].children[2])
MethodInstance for dostuff(::SnoopCompile.FlattenDemo.MyType{Int64})
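If you want every MethodInstance rather than indexing into children by hand, here is a hedged sketch that walks the whole tree. It assumes InferenceTimingNode supports the AbstractTrees traversal interface (the working print_tree call above suggests it does) and that Core.MethodInstance works on every node, as in the demo:

```julia
using SnoopCompile, AbstractTrees

tinf = SnoopCompile.flatten_demo()

# Depth-first walk over the timing tree, collecting each node's MethodInstance.
mis = [Core.MethodInstance(node) for node in AbstractTrees.PreOrderDFS(tinf)]
```

The same idea applies to a real @snoopi_deep result: traverse the tree once and post-process the resulting vector with your favorite filters.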
Yay, the long wait is over! This isn't even comprehensive (it barely touches on flatten, accumulate_by_source, etc.) but it should get everyone going. I've also found that many of those earlier utilities are not as useful as more focused tools like inference_triggers and parcel. This is currently based on top of #191.