AdrienWehrle opened 4 months ago
As soon as I turn gravity on in iceslab_fv_2d_SI.i, adding gravity terms on u and v with
https://github.com/AdrienWehrle/diuca/blob/076e10e27d1fc7b1e39723af81d990ee3a2b5c48/problems/iceslab_fv_2d_SI.i#L116-L122
https://github.com/AdrienWehrle/diuca/blob/076e10e27d1fc7b1e39723af81d990ee3a2b5c48/problems/iceslab_fv_2d_SI.i#L151-L157
the outputting step hangs, while the run converges in a fraction of a second without gravity, and I don't really understand where the problem could come from. Have you seen similar patterns in the past @GiudGiud? I added the gravity terms based on e.g. this input file.
The following total 2 aux variables:
mu_out
rho_out
are added for automatic output by MaterialOutputAction.
Framework Information:
MOOSE Version: git commit 0b297bbb93 on 2024-02-15
LibMesh Version:
PETSc Version: 3.20.3
SLEPc Version: 3.20.1
Current Time: Tue Feb 27 11:29:33 2024
Executable Timestamp: Mon Feb 26 17:55:37 2024
Parallelism:
Num Processors: 1
Num Threads: 1
Mesh:
Parallel Type: replicated
Mesh Dimension: 2
Spatial Dimension: 2
Nodes: 661
Elems: 200
Num Subdomains: 1
Nonlinear System:
Num DOFs: 600
Num Local DOFs: 600
Variables: { "vel_x" "vel_y" "pressure" }
Finite Element Types: "MONOMIAL"
Approximation Orders: "CONSTANT"
Auxiliary System:
Num DOFs: 600
Num Local DOFs: 600
Variables: { "vel_z" "mu_out" "rho_out" }
Finite Element Types: "MONOMIAL"
Approximation Orders: "CONSTANT"
Execution Information:
Executioner: Transient
TimeStepper: ConstantDT
TimeIntegrator: ImplicitEuler
Solver Mode: Preconditioned JFNK
PETSc Preconditioner: lu
Time Step 0, time = 0
Currently Executing
Outputting
Outputting Step..................^Z
[23]+ Stopped ./diuca-opt -i problems/iceslab_fv_2d_SI.i
No, this isn't normal. Can you turn off output temporarily?
I get a zero pivot when I comment out the output... Maybe the outputting step was hanging because of this?
Framework Information:
MOOSE Version: git commit 0b297bbb93 on 2024-02-15
LibMesh Version:
PETSc Version: 3.20.3
SLEPc Version: 3.20.1
Current Time: Tue Feb 27 16:50:28 2024
Executable Timestamp: Tue Feb 27 12:01:14 2024
Parallelism:
Num Processors: 1
Num Threads: 1
Mesh:
Parallel Type: replicated
Mesh Dimension: 2
Spatial Dimension: 2
Nodes: 661
Elems: 200
Num Subdomains: 1
Nonlinear System:
Num DOFs: 600
Num Local DOFs: 600
Variables: { "vel_x" "vel_y" "pressure" }
Finite Element Types: "MONOMIAL"
Approximation Orders: "CONSTANT"
Auxiliary System:
Num DOFs: 200
Num Local DOFs: 200
Variables: "vel_z"
Finite Element Types: "MONOMIAL"
Approximation Orders: "CONSTANT"
Execution Information:
Executioner: Transient
TimeStepper: ConstantDT
TimeIntegrator: ImplicitEuler
Solver Mode: Preconditioned JFNK
PETSc Preconditioner: lu
Time Step 0, time = 0
Time Step 1, time = 3.1536e+06, dt = 3.1536e+06
|residual|_2 of individual variables:
vel_x: 0.46898
vel_y: 0.633676
pressure: 0.0810533
0 Nonlinear |R| = 7.925010e-01
Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
PC failed due to FACTOR_NUMERIC_ZEROPIVOT
Nonlinear solve did not converge due to DIVERGED_FNORM_NAN iterations 0
Solve Did NOT Converge!
Aborting as solve did not converge
Time Step 1, time = 1.5768e+06, dt = 1.5768e+06
|residual|_2 of individual variables:
vel_x: 0.46898
vel_y: 0.633676
pressure: 0.0810533
0 Nonlinear |R| = 7.925010e-01
Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
PC failed due to FACTOR_NUMERIC_ZEROPIVOT
Nonlinear solve did not converge due to DIVERGED_FNORM_NAN iterations 0
Solve Did NOT Converge!
Aborting as solve did not converge
Time Step 1, time = 788400, dt = 788400
|residual|_2 of individual variables:
vel_x: 0.46898
vel_y: 0.633676
pressure: 0.0810533
0 Nonlinear |R| = 7.925010e-01
Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
PC failed due to FACTOR_NUMERIC_ZEROPIVOT
Nonlinear solve did not converge due to DIVERGED_FNORM_NAN iterations 0
Solve Did NOT Converge!
Aborting as solve did not converge
Time Step 1, time = 394200, dt = 394200
|residual|_2 of individual variables:
vel_x: 0.46898
vel_y: 0.633676
pressure: 0.0810533
0 Nonlinear |R| = 7.925010e-01
Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
PC failed due to FACTOR_NUMERIC_ZEROPIVOT
Nonlinear solve did not converge due to DIVERGED_FNORM_NAN iterations 0
Solve Did NOT Converge!
Aborting as solve did not converge
Time Step 1, time = 197100, dt = 197100
|residual|_2 of individual variables:
vel_x: 0.46898
vel_y: 0.633676
pressure: 0.0810533
0 Nonlinear |R| = 7.925010e-01
Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
PC failed due to FACTOR_NUMERIC_ZEROPIVOT
Nonlinear solve did not converge due to DIVERGED_FNORM_NAN iterations 0
Solve Did NOT Converge!
Aborting as solve did not converge
Time Step 1, time = 98550, dt = 98550
|residual|_2 of individual variables:
vel_x: 0.46898
vel_y: 0.633676
pressure: 0.0810533
0 Nonlinear |R| = 7.925010e-01
^Z
[5]+ Stopped ./diuca-opt -i problems/iceslab_fv_2d_SI.i
You cannot use a FunctionDirichletBC for pressure; you have to use this:

```
[outlet_p]
  type = INSFVOutletPressureBC
  variable = pressure
  boundary = 'right'
  functor = 0
[]
```

For `functor` you can use `ocean_pressure`.
Thanks for this!
Working more on the FV iceslab input, I came to realize it was converging only because it was hitting the II_eps_min limit, which gives a constant viscosity to the whole system! Even for the case with top and bottom walls and zero pressure at the front. And that's because II_eps_min was still there as a finite strain rate parameter with a rather high value (1e-15). I've now lowered it so that it only guards against floating point errors (1e-50), and I get the following (which keeps going if not stopped):
The following total 2 aux variables:
mu_out
rho_out
are added for automatic output by MaterialOutputAction.
Framework Information:
MOOSE Version: git commit 0b297bbb93 on 2024-02-15
LibMesh Version:
PETSc Version: 3.20.3
SLEPc Version: 3.20.1
Current Time: Tue Feb 27 18:14:11 2024
Executable Timestamp: Tue Feb 27 18:14:09 2024
Parallelism:
Num Processors: 1
Num Threads: 1
Mesh:
Parallel Type: replicated
Mesh Dimension: 2
Spatial Dimension: 2
Nodes: 181
Elems: 50
Num Subdomains: 1
Nonlinear System:
Num DOFs: 150
Num Local DOFs: 150
Variables: { "vel_x" "vel_y" "pressure" }
Finite Element Types: "MONOMIAL"
Approximation Orders: "CONSTANT"
Auxiliary System:
Num DOFs: 150
Num Local DOFs: 150
Variables: { "vel_z" "mu_out" "rho_out" }
Finite Element Types: "MONOMIAL"
Approximation Orders: "CONSTANT"
Execution Information:
Executioner: Transient
TimeStepper: ConstantDT
TimeIntegrator: ImplicitEuler
Solver Mode: Preconditioned JFNK
PETSc Preconditioner: lu
Time Step 0, time = 0
Time Step 1, time = 3.1536e+06, dt = 3.1536e+06
|residual|_2 of individual variables:
vel_x: 0.0805352
vel_y: 0
pressure: 0.113915
0 Nonlinear |R| = 1.395084e-01
Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
PC failed due to FACTOR_NUMERIC_ZEROPIVOT
Nonlinear solve did not converge due to DIVERGED_FNORM_NAN iterations 0
Solve Did NOT Converge!
Aborting as solve did not converge
Time Step 1, time = 1.5768e+06, dt = 1.5768e+06
|residual|_2 of individual variables:
vel_x: 0.0805352
vel_y: 0
pressure: 0.113915
0 Nonlinear |R| = 1.395084e-01
Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
PC failed due to FACTOR_NUMERIC_ZEROPIVOT
Nonlinear solve did not converge due to DIVERGED_FNORM_NAN iterations 0
Solve Did NOT Converge!
Aborting as solve did not converge
Time Step 1, time = 788400, dt = 788400
|residual|_2 of individual variables:
vel_x: 0.0805352
vel_y: 0
pressure: 0.113915
0 Nonlinear |R| = 1.395084e-01
Linear solve did not converge due to DIVERGED_PC_FAILED iterations 0
PC failed due to FACTOR_NUMERIC_ZEROPIVOT
Nonlinear solve did not converge due to DIVERGED_FNORM_NAN iterations 0
Solve Did NOT Converge!
Aborting as solve did not converge
Issue can be reproduced on the PR_match_iceslab_AD_FV branch (note that e.g. the object FVICeMaterialSI has been modified compared to main).
Which means that I thought we were reaching convergence on our simple case of iceslab_fv_2d_SI.i (walls bottom and top, zero pressure at the outlet, and inlet on velocity), but that was only because we were bound by II_eps_min. I'll investigate!
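The effect of II_eps_min described above can be sketched numerically. Below is a minimal, hypothetical Python sketch assuming a Glen-type flow law with exponent n = 3; the function name, the rate factor A, and all numerical values are illustrative only, not the actual FVICeMaterialSI implementation:

```python
def glen_viscosity(II_eps, A=2.4e-24, n=3.0, II_eps_min=1e-50):
    """Effective viscosity mu = 0.5 * A^(-1/n) * II_eps^((1-n)/(2n)).

    II_eps stands in for the second invariant of the strain-rate tensor.
    Without the II_eps_min floor, mu diverges as the strain rate goes to 0;
    a large floor (e.g. 1e-15) silently caps mu over the whole domain.
    """
    II = max(II_eps, II_eps_min)
    return 0.5 * A ** (-1.0 / n) * II ** ((1.0 - n) / (2.0 * n))

# With a high floor (1e-15), very different strain rates yield the same mu,
# i.e. the system effectively sees a constant viscosity:
mu_capped = glen_viscosity(1e-20, II_eps_min=1e-15)
mu_capped2 = glen_viscosity(1e-18, II_eps_min=1e-15)

# With a floor used only against floating point errors, they differ:
mu_a = glen_viscosity(1e-20, II_eps_min=1e-50)
mu_b = glen_viscosity(1e-18, II_eps_min=1e-50)
```

This is why the old value (1e-15) masked the convergence problem: once the local strain-rate invariant drops below the floor, the nonlinearity disappears.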
How small does it need to be? 1e-25 still good on my side
1e-25 should be ok. I only converge with SVD; do you converge with LU?
No, not currently. Field split with LU on each solve could be something to try.
So what is the 'molecular' viscosity of ice? That could be a definite lower bound: a viscosity that does not come from this model you have based on the velocity gradient, something equivalent to the molecular viscosity of water for example.
We can't solve for 0 viscosity (inviscid Euler flow) with these methods. We can do those flows in MOOSE at high Mach number, that's it.
> So what is the 'molecular' viscosity of ice? That can be a definite low-bound. So viscosity that does not come from the stress tensor. Just like the molecular viscosity of water
That's a good question, I'm looking into this!
On hold for now. Only converges when viscosity reaches a low or high cap, otherwise can't get it to converge. Let's focus on iceslab and icestream AD.
Found at least part of the problem. The pressure boundary condition must be consistent with the pressure profile created by gravity. So use this:

```
  [outlet_p]
    type = INSFVOutletPressureBC
    variable = pressure
    boundary = 'right'
    functor = ocean_pressure
  []
[]

# ------------------------

[Functions]
  [ocean_pressure]
    type = ParsedFunction
    expression = '-917 * 9.81 * ( (y * cos(bed_slope / 180 * pi)) - (x * sin(bed_slope / 180 * pi)))'
    symbol_names = 'bed_slope'
    symbol_values = '${bed_slope}'
  []
[]
```
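For reference, the parsed expression can be evaluated outside MOOSE to see the profile it imposes along the outlet. A minimal Python sketch (the function name is illustrative, and bed_slope in degrees plus a 100 m height are assumed values):

```python
import math

def ocean_pressure(x, y, bed_slope_deg, rho=917.0, g=9.81):
    # Same formula as the ParsedFunction above:
    # -rho * g * (y * cos(theta) - x * sin(theta))
    theta = math.radians(bed_slope_deg)
    return -rho * g * (y * math.cos(theta) - x * math.sin(theta))

# At a fixed x, the value varies linearly with height y:
p_bottom = ocean_pressure(0.0, 0.0, bed_slope_deg=1.0)    # y = 0
p_top = ocean_pressure(0.0, 100.0, bed_slope_deg=1.0)     # y = 100 m (assumed)
```

The expression is zero at y = 0 and becomes increasingly negative with y, which is the sign convention discussed below.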
Note 917 for density instead of 1020, and the minus on the x term. This converged down to II_eps_min = 5e-18 with LU for me now, instead of 1e-16, which corresponds to a viscosity of
Maximum allowed viscosity : 2.19059e+13
> note 917 for density instead of 1020
I'm not sure I understand this, since we're trying to apply the hydrostatic pressure of sea water (density ~ 1028)? Switching to 917 would be applying ice pressure.
ah ok this makes sense. Lemme try 1020
Seems to still work, down to 9e-19 for II_eps_min, or mu = 3.87975e+13.
Do you think this could be a numerical threshold, or is it still linked to the physics?
This made me look at the physics again, and I don't like that in the FE solution you seemingly have ice coming in from the top boundary. Can you use this in FV? It seems to improve convergence again, and it will prevent the same thing from happening:
```
[freeslip_x]
  type = INSFVNaturalFreeSlipBC
  variable = vel_x
  boundary = 'top'
  momentum_component = 'x'
[]
[freeslip_y]
  type = INSFVNaturalFreeSlipBC
  variable = vel_y
  boundary = 'top'
  momentum_component = 'y'
[]
```
Do you mean https://github.com/AdrienWehrle/diuca/blob/main/problems/iceslab_ad_2d_SI.i when you mention the FE solution?
yes
I do have the same condition at the top in FE and FV, i.e. an INSFVInletVelocityBC of value inlet_mps. Do you mean that a free slip at the top would help convergence in FV compared to an inlet?
No, the top is the boundary with the air, not the inlet. Left is the inlet. Free slip at the top ensures no incoming ice from there.
Oh yes ok I see! I'll add this!
Could you push/share/paste the version of the FV input file you're working on so I make sure we're in sync?
see #19
Thanks a lot!
It converges for me too now!
In the FE solution, should I use INSFEMomentumFreeSlipBC set to 0 at the top boundary as an equivalent to INSFVNaturalFreeSlipBC in FV?
Try an INSAD version if possible. For boundary conditions, we have to be wary of the settings you set on the kernels. Look at the examples in the Navier-Stokes module to get it right.
In INSAD it looks like a similar BC would be INSADMomentumNoBCBC, like e.g. here.
Looks like it. Do the results look better then?
So I first had to fix the ocean pressure, as it was reversed (it needs to be at its maximum at z = 0, since that's where the water column above is thickest):
```
[Functions]
  [ocean_pressure]
    type = ParsedFunction
    expression = '-1028 * 9.81 * ( ((thickness - y) * cos(bed_slope / 180 * pi)) )'
    symbol_names = 'bed_slope thickness'
    symbol_values = '${bed_slope} ${thickness}'
  []
[]
```
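A quick standalone check (illustrative values, not part of the input file) that the corrected expression behaves as described, i.e. its magnitude is largest at y = 0 where the water column is thickest, and it vanishes at y = thickness:

```python
import math

def ocean_pressure_corrected(y, thickness, bed_slope_deg, rho=1028.0, g=9.81):
    # Same formula as the corrected ParsedFunction above:
    # -rho * g * (thickness - y) * cos(theta)
    theta = math.radians(bed_slope_deg)
    return -rho * g * (thickness - y) * math.cos(theta)

thickness = 100.0  # illustrative slab thickness in meters
p_bed = ocean_pressure_corrected(0.0, thickness, bed_slope_deg=1.0)
p_waterline = ocean_pressure_corrected(thickness, thickness, bed_slope_deg=1.0)
```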
Then when I compare iceslab AD (first screenshot) and FV (second one) on a similar setup, I get very different results. I'm investigating.
@GiudGiud, I think you worked on a version of icestream_ad_3d_SI.i during our meeting yesterday, could you push it somewhere here? I remember we e.g. added the pressure pinning point.
Input is here https://github.com/GiudGiud/diuca/blob/PR_ad_pin/problems/icestream_ad_3d_SI.i
There's also the commenting out of the advection kernels there.
Trying to understand what's wrong with the downstream pressure: I'm using a Control to enable ocean_pressure only after the second timestep, while no BC is set on this side before. Velocity increases when the ocean pressure is applied, which really shows something is wrong here.
What's wrong with the downstream pressure is the FE setup. I don't think it's working as expected. I would look at other pipe cases and see if we are doing something wrong. If not, then use finite volume.
I feel like those very high speeds at the front were already visible while we were working on the iceslab problem (see screenshot above), so indeed it might simply be the FE setup. I'll still try the few PETSc and preconditioner options we talked about via mail, and see if that could fix the problem.
It won't fix the problem with FE: the solutions we have with FE (and FV) are converged with respect to the nonlinear solve, so a better, faster, more scalable solve should not change the solution. I would say focus on these solver efforts still, but for FV.
Understood, let's focus on FV!
The iceslab problem is currently being solved in AD and FV. The AD implementation took some more effort due to OS-specific issues (e.g. floating point errors and pivots). The results from the two methods should be similar, let's make sure they are!