PieterjanRobbe / MultilevelEstimators.jl

The Julia module for Multilevel Monte Carlo methods

cannot run MIMC #2

Closed · anhvt2 closed this issue 3 years ago

anhvt2 commented 3 years ago

Hi, thanks for the great package!

I'm new to Julia. I was testing the multi-index example given here, but I hit the error below when following it. Is there a simple solution for this?

Many thanks, Anh

julia> estimator = Estimator(TD(2), MC(), sample_lognormal, distributions)
julia> h = run(estimator, 5e-3)
+-------------------------------------------------------------------------------+
| *** MultilevelEstimators.jl @2021-02-11T21:08:19.744                          |
| *** This is a Estimator{TD{2}, MC}                                            |
| *** Simulating UntitledEstimator3                                             |
| *** Tolerance on RMSE ϵ = 5.302e-02                                           |
+-------------------------------------------------------------------------------+
Currently running with L = 0.
Shape of the index set:
  ◼ 
Taking 20 warm-up samples at index (0, 0)...ERROR: MethodError: no method matching one(::Type{DataType})
Closest candidates are:
  one(::Type{Missing}) at missing.jl:103
  one(::BitArray{2}) at bitarray.jl:426
  one(::Missing) at missing.jl:100
  ...
Stacktrace:
 [1] (::Base.var"#770#772")(::Task) at ./asyncmap.jl:178
 [2] foreach(::Base.var"#770#772", ::Array{Any,1}) at ./abstractarray.jl:2009
 [3] maptwice(::Function, ::Channel{Any}, ::Array{Any,1}, ::UnitRange{Int64}) at ./asyncmap.jl:178
 [4] wrap_n_exec_twice(::Channel{Any}, ::Array{Any,1}, ::Distributed.var"#206#209"{Distributed.CachingPool}, ::Function, ::UnitRange{Int64}) at ./asyncmap.jl:154
 [5] async_usemap(::Base.var"#56#58"{Base.var"#56#57#59"{ExponentialBackOff,Nothing,Distributed.var"#222#223"{MultilevelEstimators.var"#f#69"{Estimator{TD{2,Tuple{Int64,Int64}},MC,Array{Normal{Int64},1},Dict{Symbol,Any},MultilevelEstimators.EstimatorInternals{MultilevelEstimators.DefaultInternals{Array{Array{Array{Float64,1},2},2},Array{Int64,2},Array{Float64,2},Set{CartesianIndex{2}},MultilevelEstimators.IndexSetSize{Int64}},MultilevelEstimators.GenericIndexSetInternals,MultilevelEstimators.MCInternals}},CartesianIndex{2},Int64}}}}, ::UnitRange{Int64}; ntasks::Function, batch_size::Nothing) at ./asyncmap.jl:103
 [6] #asyncmap#754 at ./asyncmap.jl:81 [inlined]
 [7] pmap(::Function, ::Distributed.CachingPool, ::UnitRange{Int64}; distributed::Bool, batch_size::Int64, on_error::Nothing, retry_delays::ExponentialBackOff, retry_check::Nothing) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Distributed/src/pmap.jl:126
 [8] parallel_sample!(::Estimator{TD{2,Tuple{Int64,Int64}},MC,Array{Normal{Int64},1},Dict{Symbol,Any},MultilevelEstimators.EstimatorInternals{MultilevelEstimators.DefaultInternals{Array{Array{Array{Float64,1},2},2},Array{Int64,2},Array{Float64,2},Set{CartesianIndex{2}},MultilevelEstimators.IndexSetSize{Int64}},MultilevelEstimators.GenericIndexSetInternals,MultilevelEstimators.MCInternals}}, ::CartesianIndex{2}, ::Int64, ::Int64) at /home/anhvt89/.julia/packages/MultilevelEstimators/6IH1f/src/sample.jl:49
 [9] macro expansion at ./timing.jl:233 [inlined]
 [10] sample!(::Estimator{TD{2,Tuple{Int64,Int64}},MC,Array{Normal{Int64},1},Dict{Symbol,Any},MultilevelEstimators.EstimatorInternals{MultilevelEstimators.DefaultInternals{Array{Array{Array{Float64,1},2},2},Array{Int64,2},Array{Float64,2},Set{CartesianIndex{2}},MultilevelEstimators.IndexSetSize{Int64}},MultilevelEstimators.GenericIndexSetInternals,MultilevelEstimators.MCInternals}}, ::CartesianIndex{2}, ::Int64) at /home/anhvt89/.julia/packages/MultilevelEstimators/6IH1f/src/sample.jl:27
 [11] _run(::Estimator{TD{2,Tuple{Int64,Int64}},MC,Array{Normal{Int64},1},Dict{Symbol,Any},MultilevelEstimators.EstimatorInternals{MultilevelEstimators.DefaultInternals{Array{Array{Array{Float64,1},2},2},Array{Int64,2},Array{Float64,2},Set{CartesianIndex{2}},MultilevelEstimators.IndexSetSize{Int64}},MultilevelEstimators.GenericIndexSetInternals,MultilevelEstimators.MCInternals}}, ::Float64) at /home/anhvt89/.julia/packages/MultilevelEstimators/6IH1f/src/run.jl:67
 [12] macro expansion at ./timing.jl:233 [inlined]
 [13] run(::Estimator{TD{2,Tuple{Int64,Int64}},MC,Array{Normal{Int64},1},Dict{Symbol,Any},MultilevelEstimators.EstimatorInternals{MultilevelEstimators.DefaultInternals{Array{Array{Array{Float64,1},2},2},Array{Int64,2},Array{Float64,2},Set{CartesianIndex{2}},MultilevelEstimators.IndexSetSize{Int64}},MultilevelEstimators.GenericIndexSetInternals,MultilevelEstimators.MCInternals}}, ::Array{Float64,1}) at /home/anhvt89/.julia/packages/MultilevelEstimators/6IH1f/src/run.jl:31
 [14] run(::Estimator{TD{2,Tuple{Int64,Int64}},MC,Array{Normal{Int64},1},Dict{Symbol,Any},MultilevelEstimators.EstimatorInternals{MultilevelEstimators.DefaultInternals{Array{Array{Array{Float64,1},2},2},Array{Int64,2},Array{Float64,2},Set{CartesianIndex{2}},MultilevelEstimators.IndexSetSize{Int64}},MultilevelEstimators.GenericIndexSetInternals,MultilevelEstimators.MCInternals}}, ::Float64) at /home/anhvt89/.julia/packages/MultilevelEstimators/6IH1f/src/run.jl:38
 [15] top-level scope at REPL[59]:1
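
For reference, the MethodError itself is plain Base Julia behaviour: one is defined for numeric values and concrete numeric types, but there is no method for the type object DataType, which is what ends up being passed here. A minimal reproduction outside the package (assuming nothing about the MultilevelEstimators.jl internals):

julia> one(1.0)        # defined: multiplicative identity of a numeric value
1.0

julia> one(Float64)    # defined: multiplicative identity of a concrete numeric type
1.0

julia> one(DataType)   # not defined: reproduces the error above
ERROR: MethodError: no method matching one(::Type{DataType})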
PieterjanRobbe commented 3 years ago

Hi @anhvt2, thanks for your interest! There was a missing one(.) call in the sample_lognormal function for the multi-index example. I've updated the tutorial (see https://pieterjanrobbe.github.io/MultilevelEstimators.jl/dev/example.html). Can you verify that the example now works as expected? Thanks!
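
For context on what the multi-index sample function has to compute: in MIMC, the sample at an index is the backward tensor difference of the quantity of interest over all existing coarser neighbours of that index. The sketch below shows that pattern in plain Julia. Here solve_at is a hypothetical placeholder for the tutorial's lognormal diffusion solve, and the (difference, fine value) return convention is assumed from the tutorial, so treat this as an illustration of the pattern rather than the updated tutorial code.

# Hypothetical sketch of the MIMC backward-difference pattern; solve_at is a
# placeholder for the actual PDE solve and is not part of MultilevelEstimators.jl.
function sample_lognormal(index, ξ)
    d = length(index)
    Qf = solve_at(index, ξ)                      # quantity of interest at the current index
    dQ = Qf
    # sum over all coarser neighbours index - s, s in {0,1}^d \ {0}, with alternating sign
    for s in Iterators.product(ntuple(_ -> 0:1, d)...)
        all(iszero, s) && continue               # skip the current index itself
        coarse = Tuple(index) .- s
        any(c -> c < 0, coarse) && continue      # neighbour below level 0 does not exist
        dQ += (-1)^sum(s) * solve_at(CartesianIndex(coarse), ξ)
    end
    return dQ, Qf                                # difference and fine value
end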

anhvt2 commented 3 years ago

Hi @PieterjanRobbe, thanks for looking into this and the quick fix. Now it's working great. I was able to finish the lognormal diffusion example. Thanks again for your time and kind help!