erdc / proteus

A computational methods and simulation toolkit
http://proteustoolkit.org
MIT License

how to run proteus from a python script (in parallel) #349

Closed: cekees closed this issue 4 years ago

cekees commented 8 years ago

Sometimes running your _p and _n (or _so) modules via the parun CLI is not convenient, and you can instead run them directly from a Python script. The proteus.iproteus module helps set up the proteus system inside a script or IPython notebook, but you also need to set up the logging yourself and will probably need to change some options if you want to run in parallel. The Poisson test script is a reasonably good example:

https://github.com/erdc-cm/proteus/blob/master/proteus/tests/ci/test_poisson.py

It can be run with mpiexec -np 4 python test_poisson.py.

Notice in particular how some of the numerics module options are changed, such as the linear solver to KSP_petsc4py. You may also want to adjust the partitioning type and the amount of overlap:

https://github.com/erdc-cm/proteus/blob/master/proteus/default_n.py#L217
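
To make that concrete, here is a rough sketch of a driver script along the lines of test_poisson.py. The _p/_n module names and some of the attribute settings below are assumptions, so check the test itself and default_n.py for the names your version actually uses:

from proteus.iproteus import *            # sets up opts, Comm, and the Profiling machinery
from proteus import (Profiling, default_so, default_s,
                     NumericalSolution, LinearSolvers, MeshTools)

# Logging has to be configured by the script itself.
Profiling.logLevel = 7
Profiling.verbose = True

# Hypothetical problem/numerics modules; substitute your own _p and _n files.
import poisson_3d_p
import poisson_3d_c0p1_n as n

# Parallel-friendly numerics: a PETSc-based linear solver plus partitioning and
# overlap settings (attribute names as in default_n.py; double-check them for
# your version).
n.multilevelLinearSolver = LinearSolvers.KSP_petsc4py
n.levelLinearSolver = LinearSolvers.KSP_petsc4py
n.parallelPartitioningType = MeshTools.MeshParallelPartitioningTypes.node
n.nLayersOfOverlapForParallel = 1
# Parallel runs also require a numericalFluxType that applies Dirichlet
# conditions weakly; Transport.py asserts this.

pList = [poisson_3d_p]
nList = [n]
so = default_so
so.name = pList[0].name = "poisson_3d"
so.sList = [default_s]

ns = NumericalSolution.NS_base(so, pList, nList, so.sList, opts)
ns.calculateSolution("poisson_3d")

Run it the same way as the test, e.g. mpiexec -np 4 python your_script.py; each rank executes the script and the mesh is partitioned across the ranks.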

/cc @robertsawko

robertsawko commented 8 years ago

Right, thanks. Unfortunately, I cannot get even the example to work. It starts to do something but then fails with:

Traceback (most recent call last):
  File "test_poisson.py", line 113, in <module>
    test_c0q1()
  File "test_poisson.py", line 82, in test_c0q1
    check_c0q1(test_hexMesh_3x3=False,use_petsc=False)
  File "test_poisson.py", line 74, in check_c0q1
    ns = NumericalSolution.NS_base(so,pList,nList,so.sList,opts)
  File "/home/robert/projects/proteus/linux2/lib/python2.7/site-packages/proteus-1.0.0-py2.7-linux-x86_64.egg/proteus/NumericalSolution.py", line 381, in __init__
    model = Transport.MultilevelTransport(p,n,mlMesh,OneLevelTransportType=p.LevelModelType)
  File "/home/robert/projects/proteus/linux2/lib/python2.7/site-packages/proteus-1.0.0-py2.7-linux-x86_64.egg/proteus/Transport.py", line 5996, in __init__
    PhiSpaceTypeDict=phiSpaces)
  File "/home/robert/projects/proteus/linux2/lib/python2.7/site-packages/proteus-1.0.0-py2.7-linux-x86_64.egg/proteus/Transport.py", line 6214, in initialize
    movingDomain=movingDomain)
  File "/home/robert/projects/proteus/linux2/lib/python2.7/site-packages/proteus-1.0.0-py2.7-linux-x86_64.egg/proteus/Transport.py", line 1502, in __init__
    assert numericalFluxType != None and numericalFluxType.useWeakDirichletConditions,"You must use a numerical flux to apply weak boundary conditions for parallel runs"
AssertionError: You must use a numerical flux to apply weak boundary conditions for parallel runs

(The same traceback is printed, interleaved, by each of the four MPI ranks.)

*** The MPI_Barrier() function was called after MPI_FINALIZE was invoked.
*** This is disallowed by the MPI standard.
*** Your MPI job will now abort.
[(null):11550] Local abort after MPI_FINALIZE completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!

(The same abort message is repeated by processes 11551, 11552 and 11553.)

I get a different error if I make some changes in my own cases. In particular, I am using nLayerOfOverlapForParallel=0 and parallelPeriodic=True, following the comment in the default code (I do have periodic BCs). The code then responds with:

Traceback (most recent call last):
  File "mesh_and_discretisation_test.py", line 28, in <module>
    ns = NumericalSolution.NS_base(so, pList, nList, so.sList, opts)
  File "/home/robert/projects/proteus/linux2/lib/python2.7/site-packages/proteus-1.0.0-py2.7-linux-x86_64.egg/proteus/NumericalSolution.py", line 381, in __init__
    model = Transport.MultilevelTransport(p,n,mlMesh,OneLevelTransportType=p.LevelModelType)
  File "/home/robert/projects/proteus/linux2/lib/python2.7/site-packages/proteus-1.0.0-py2.7-linux-x86_64.egg/proteus/Transport.py", line 5996, in __init__
    PhiSpaceTypeDict=phiSpaces)
  File "/home/robert/projects/proteus/linux2/lib/python2.7/site-packages/proteus-1.0.0-py2.7-linux-x86_64.egg/proteus/Transport.py", line 6118, in initialize
    trialSpace_global = TrialSpaceType(mesh,nd)
  File "/home/robert/projects/proteus/linux2/lib/python2.7/site-packages/proteus-1.0.0-py2.7-linux-x86_64.egg/proteus/FemTools.py", line 4144, in __init__
    DiscontinuousGalerkinDOFMap(mesh,localFunctionSpace))
  File "/home/robert/projects/proteus/linux2/lib/python2.7/site-packages/proteus-1.0.0-py2.7-linux-x86_64.egg/proteus/FemTools.py", line 1892, in __init__
    self.updateAfterParallelPartitioning(mesh.globalMesh)
  File "/home/robert/projects/proteus/linux2/lib/python2.7/site-packages/proteus-1.0.0-py2.7-linux-x86_64.egg/proteus/FemTools.py", line 1903, in updateAfterParallelPartitioning
    self.dof_offsets_subdomain_owned = numpy.zeros(globalMesh.nodeOffsets_subdomain_owned.shape,'i')
AttributeError: 'NoneType' object has no attribute 'nodeOffsets_subdomain_owned'

(Again, the same traceback is printed, interleaved, by each of the four ranks.)

*** The MPI_Barrier() function was called after MPI_FINALIZE was invoked.
*** This is disallowed by the MPI standard.
*** Your MPI job will now abort.
[(null):13305] Local abort after MPI_FINALIZE completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!

(The same abort message is repeated by processes 13306, 13307 and 13308.)

These two look like completely different problems to me. Clearly, I am still missing something in my setup, but the original example breaks too!

cekees commented 8 years ago

Just as a test, try running in parallel with parun. From that test directory, run:

which mpiexec; which python; mpiexec -np 2 parun poisson_3d_p.py poisson_3d_c0p1_n.py -l 5 -v

robertsawko commented 8 years ago

Basic parun seems to work. Here is the output of the two lookup commands:

/usr/bin/mpiexec
/home/robert/projects/proteus/linux2/bin/python

robertsawko commented 8 years ago

On my Ubuntu installation, mpiexec is taken from the proteus directory, but that still doesn't help and a similar error appears:

%type mpiexec                                                                                          
mpiexec is /home/robert/projects/proteus/linux2/bin/mpiexec

The error after running mpiexec -np 2 python test_poisson.py is:

Traceback (most recent call last):
  File "test_poisson.py", line 113, in <module>
    test_c0q1()
  File "test_poisson.py", line 82, in test_c0q1
    check_c0q1(test_hexMesh_3x3=False,use_petsc=False)
  File "test_poisson.py", line 74, in check_c0q1
    ns = NumericalSolution.NS_base(so,pList,nList,so.sList,opts)
  File "/home/robert/projects/proteus/linux2/lib/python2.7/site-packages/proteus-1.0.0-py2.7-linux-x86_64.egg/proteus/NumericalSolution.py", line 392, in __init__
    model = Transport.MultilevelTransport(p,n,mlMesh,OneLevelTransportType=p.LevelModelType)
  File "/home/robert/projects/proteus/linux2/lib/python2.7/site-packages/proteus-1.0.0-py2.7-linux-x86_64.egg/proteus/Transport.py", line 5999, in __init__
    PhiSpaceTypeDict=phiSpaces)
  File "/home/robert/projects/proteus/linux2/lib/python2.7/site-packages/proteus-1.0.0-py2.7-linux-x86_64.egg/proteus/Transport.py", line 6217, in initialize
    movingDomain=movingDomain)
  File "/home/robert/projects/proteus/linux2/lib/python2.7/site-packages/proteus-1.0.0-py2.7-linux-x86_64.egg/proteus/Transport.py", line 1502, in __init__
    assert numericalFluxType != None and numericalFluxType.useWeakDirichletConditions,"You must use a numerical flux to apply weak boundary conditions for parallel runs"
AssertionError: You must use a numerical flux to apply weak boundary conditions for parallel runs

(Both ranks print the same traceback.)
Attempting to use an MPI routine after finalizing MPICH
Attempting to use an MPI routine after finalizing MPICH

parun works fine.

For now it's OK; I will just move to parun. I really only wanted to run 4 simulations in sequence, but being able to drive them from a script would be a useful feature.

robertsawko commented 8 years ago

Yes, I cannot even get the simpler Riemann problem for the Burgers equation scripts to work in parallel. Maybe this has something to do with my tedious (and unrealistic!) periodic setup? For instance, this script runs under parun without MPI, but breaks with mpiexec -np 2. I've tried setting parallelPeriodic to True, but to no avail.

cekees commented 8 years ago

Let's get the test_poisson script working first. It looks to me like the input file for the commit you're working from has the parallel input parameters turned off. Is there a parallel convenience flag set to False in one of the poisson*_n.py files? Parallel periodic conditions do require some care, but we can get them working. I just want to make sure the test_poisson script works for you first.
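
The kind of flag I mean looks roughly like this inside a *_n.py file (a hypothetical sketch, not the exact contents of the poisson files; the flux class named here is only an illustration):

from proteus.default_n import *   # a *_n.py file typically starts from these defaults
from proteus import LinearSolvers, NumericalFlux

parallel = True  # the convenience flag; with False the parallel settings below are skipped

if parallel:
    # PETSc-based solvers plus a numerical flux, so Dirichlet conditions are
    # applied weakly, which is exactly what the assertion above complains about.
    multilevelLinearSolver = LinearSolvers.KSP_petsc4py
    levelLinearSolver = LinearSolvers.KSP_petsc4py
    # The exact flux class depends on the model; it must set useWeakDirichletConditions.
    numericalFluxType = NumericalFlux.Advection_DiagonalUpwind_Diffusion_IIPG_exterior
else:
    multilevelLinearSolver = LinearSolvers.LU
    levelLinearSolver = LinearSolvers.LU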

robertsawko commented 8 years ago

Right, I started looking at too many things at once, so let's focus on test_poisson. Just to summarise: it does run from parun but not through mpiexec -np <number> python. The convenience flag is definitely set to True, and I verified that the correct if-statement branch is taken.

cekees commented 8 years ago

OK, there are multiple input files for that script. The above error shows it failing when it gets to the c0q1 test, so try commenting that out on line 113 of test_poisson.py.

robertsawko commented 8 years ago

I see, I must correct myself then. The c0q1 numerics file did have parallel set to False. I have now changed it to True and the run gets further:

[       6] Initializing NumericalSolution for poisson_3d_c0q1pe2
 System includes: 
poisson_3d_c0q1pe2

[       6] Setting Archiver(s)
[       6] Setting up MultilevelMesh
[       6] Building one multilevel mesh for all models
[       6] Generating mesh for poisson_3d_c0q1pe2
[       6] Building 5 x 5 x 5 rectangular mesh for poisson_3d_c0q1pe2
[       6] Generating hexahedral mesh
[       6] Partitioning mesh among 2 processors using partitioningType = 1

===================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   EXIT CODE: 134
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
YOUR APPLICATION TERMINATED WITH THE EXIT STRING: Aborted (signal 6)
This typically refers to a problem with your application.
Please see the FAQ page for debugging suggestions
python2.7: proteus/mesh.cpp:2660: int constructElementBoundaryElementsArrayWithGivenElementBoundaryAndEdgeNumbers_hexahedron(Mesh&): Assertion `elementBoundaryIds[ebt] == ebN_global' failed.

cekees commented 8 years ago

Let's comment out the test_c0q1 line for now and run the script with just the c0p1. I just finished refactoring some of the Q1 and Q2 support and want to keep those issues separate. If you can run the script in parallel with just test_c0p1 (or test_c0p1 and test_c0p2), then we can move to the scripts for your application.

robertsawko commented 8 years ago

After playing with this script a little, I figured out that it is the test_hexMesh_3x3=False argument which is causing one of the c0q1 test cases to fail. Setting it to True makes the whole test suite run okay.
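
That is, changing the call that appears in the tracebacks above to something like the following (leaving use_petsc as it was):

check_c0q1(test_hexMesh_3x3=True, use_petsc=False)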

adimako commented 5 years ago

@cekees @tridelat we can definitely run in serial in the air-water-vv tests; I do not know about parallel, though. Leave this open?

cekees commented 4 years ago

I think this was already addressed.