alanedelman opened this issue 4 years ago
I can't reproduce this. Is it possible that you've called MPI.Init() in your session?
Nope, the entire session is exactly as typed, on 1.4.2.
Which versions of Elemental, MPI, and MPIClusterManagers are you using?
Tagged versions (currently). BTW, I had the same issue.
[902c3f28] Elemental v0.6.0
[da04e1cc] MPI v0.14.2
[e7922434] MPIClusterManagers v0.2.0
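(For reference, a listing like the one above comes from Pkg's status command:)

using Pkg
Pkg.status()   # or, in the Pkg REPL: pkg> st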
➜ ~ exec '/Users/viral/Desktop/Julia_Releases/Julia-1.4.app/Contents/Resources/julia/bin/julia'
[Julia startup banner: Version 1.4.2 (2020-05-23), official https://julialang.org/ release]
julia> using LinearAlgebra, Elemental
julia> A = Elemental.Matrix(Float64)
0×0 Elemental.Matrix{Float64}
julia> Elemental.gaussian!(A, 100, 80);
julia> U, s, V = svd(A);
julia> convert(Matrix{Float64}, s)[1:10]
10-element Array{Float64,1}:
18.20429433719855
17.728335105360213
17.17920145437086
16.733135377026617
16.502371100450368
16.316310946797437
16.051488300107216
15.76097431919749
15.121926373180647
14.704011842059382
julia> using MPI, MPIClusterManagers, Distributed
julia> man = MPIManager(np = 4);
ERROR: AssertionError: MPI.Initialized() == mgr.launched
Stacktrace:
[1] MPIManager(; np::Int64, launch_timeout::Float64, mode::TransportMode, master_tcp_interface::String) at /Users/viral/.julia/packages/MPIClusterManagers/0ZYYQ/src/mpimanager.jl:64
[2] top-level scope at REPL[7]:1
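For context, the assertion at mpimanager.jl:64 in the stacktrace is a constructor precondition. A minimal sketch of the check (the wrapper function is illustrative; only the assertion itself comes from the stacktrace):

using MPI

# Sketch of the MPIManager constructor precondition: at construction no
# workers have been launched yet (launched == false), so MPI must not be
# initialized. A package that calls MPI.Init() at load time flips
# MPI.Initialized() to true and trips this assertion.
function check_mpimanager_precondition(launched::Bool = false)
    @assert MPI.Initialized() == launched
end

check_mpimanager_precondition()   # errors if MPI.Init() was already called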
I do believe this used to work before the recent round of tagging new versions of packages. Could it be a Julia 1.4.2 issue?
Yeah. I was on a dev branch that didn't have the _jll stuff. After freeing the packages, I'm getting
julia> using MPI
julia> MPI.Initialized()
false
julia> using Elemental
julia> MPI.Initialized()
true
which used to be false, so for some reason we are now automatically initializing MPI when loading Elemental. I'm wondering if this might be because Elemental and MPI now use the same MPI "instance" where they didn't use to.
Oh, the joy of MPI. Maybe we can add a comment that each of the examples assumes a separate session.
Maybe just do one initialization? I thought they would have both used the same system MPI in the past.
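A guard along these lines would be one way to get a single shared initialization (a sketch, not Elemental.jl's actual code):

using MPI

# Illustrative sketch: initialize MPI exactly once, regardless of which
# package gets loaded first. Not the actual Elemental.jl __init__.
function ensure_mpi_initialized()
    MPI.Initialized() || MPI.Init()
    return nothing
end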
Part of the idea with the first example is that you don't have to set up an MPIClusterManager if you aren't running distributed. However, it is needed for the second example.
Is the first example single core?
Yes. Elemental.Matrix(Float64) is a non-distributed array type.
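For contrast, the distributed path goes through Elemental.DistMatrix, roughly as in the README's second example (a sketch to be run in a fresh session; exact calls may differ by version):

using MPI, MPIClusterManagers, Distributed

# Run in a fresh session, before anything has initialized MPI.
man = MPIManager(np = 4)
addprocs(man)

@everywhere using LinearAlgebra, Elemental

# DistMatrix is Elemental's MPI-distributed counterpart of Matrix.
@mpi_do man begin
    A = Elemental.DistMatrix(Float64)
    Elemental.gaussian!(A, 1000, 800)
    U, s, V = svd(A)
    println(s[1])
end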
It would be nice to lead with a parallel example. I might even suggest just dropping that one; I already have LAPACK and multithreaded BLAS...
I would drop the single-processor example.
On the README, the first example and the second example do not compose; a new user would naturally try both and get frustrated.
Perhaps the example that nominally states "without MPI" has a uniprocessor MPI under the hood, causing interference? But still, this is frustrating as a user experience.
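One untested ordering that might let the two examples compose in a single session: construct the MPIManager before anything loads Elemental, so the constructor still sees MPI.Initialized() == false:

using MPI, MPIClusterManagers, Distributed

man = MPIManager(np = 4)   # must happen while MPI.Initialized() == false
addprocs(man)

# Loading Elemental afterwards initializes MPI as a side effect, which
# the already-constructed manager may tolerate; untested sketch.
using LinearAlgebra, Elemental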