I ran this code, which produced tapes for, say, 10 MPI ranks.
```julia
using MPI
using MPITape

function main()
    MPI.Init()
    comm = MPI.COMM_WORLD
    rank = MPI.Comm_rank(comm)
    nranks = MPI.Comm_size(comm)

    # chain topology, i.e. "open" boundary conditions
    left = rank - 1
    right = rank + 1
    if left < 0
        left = MPI.PROC_NULL
    end
    if right >= nranks
        right = MPI.PROC_NULL
    end

    N = 2^22
    buf = rand(N)

    for r in 0:nranks-1
        MPI.Barrier(comm)
        if r == rank
            rightstr = right == MPI.PROC_NULL ? "no one" : right
            leftstr = left == MPI.PROC_NULL ? "no one" : left
            println("Rank $r will send to $rightstr and receive from $leftstr soon.")
        end
    end

    # this communication will happen in serial
    MPI.Send(buf, comm; dest=right)
    MPI.Recv!(buf, comm; source=left)

    # Due to serialization, ranks reach this point in a deterministic order:
    # the last rank first, then the second-to-last rank, and so on.
    println("Rank $rank has finished Send/Recv!.")

    MPI.Finalize()
end

@record main()
MPITape.save()
```
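In sketch form, the launch with 10 ranks can look like the following, assuming MPI.jl's `mpiexec` helper and a script file named `mpitape_example.jl` (both assumptions, since the actual launch command isn't shown here):

```julia
using MPI

# Launch the script above on 10 ranks via the mpiexec binary that
# MPI.jl is configured with. Each rank writes its own tape file
# when MPITape.save() runs.
mpiexec() do exe
    run(`$exe -n 10 $(Base.julia_cmd()) mpitape_example.jl`)
end
```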
Afterwards I ran the post-processing to get the merged tape.
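In sketch form, assuming MPITape's `readall`/`merge`/`print_merged` API (the exact calls and their output are not reproduced here), the post-processing looks like this:

```julia
using MPITape

# Read the per-rank tapes written by MPITape.save() and merge them
# into a single, globally ordered tape. (Function names are assumed.)
tapes = MPITape.readall()
tape_merged = MPITape.merge(tapes)
MPITape.print_merged(tape_merged)
```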
Note the `-1`, which corresponds to `MPI.PROC_NULL` (i.e. a dummy send/receive).

Trying to run `MPITape.plot_merged(tape_merged)`, I got an error. We should try to support `-1` here, in the sense of not showing an arrow for such a send/recv.

cc @Mellich
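A minimal sketch of the idea, using a made-up event type since MPITape's internal representation of recorded calls may differ:

```julia
using MPI

# Illustration only: a stand-in for MPITape's recorded point-to-point
# events; the field names here are invented.
struct P2PEvent
    kind::Symbol   # e.g. :Send or :Recv!
    rank::Int      # rank that issued the call
    partner::Int   # dest/source; MPI.PROC_NULL for dummy sends/receives
end

# Keep only events with a real partner, so the plotting code never
# tries to draw an arrow to or from MPI.PROC_NULL.
drawable(events) = filter(ev -> ev.partner != MPI.PROC_NULL, events)
```

`plot_merged` could then apply such a filter before drawing arrows, leaving the dummy sends/receives out of the picture entirely.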