Closed by jwallwork23 4 months ago
Any thoughts @acse-ej321 @ddundo? Moving this functionality over from Goalie as it is better suited in Animate.
Hi @jwallwork23, I'm looking into this now and I'm probably defining these constraint fields wrong... could you please tell me how to properly define them?
I tried something like this:
```python
Q = FunctionSpace(mesh, "CG", 1)
x, y = SpatialCoordinate(mesh)
h_min = interpolate(x + 1.0, Q)
mp["dm_plex_metric_h_min"] = h_min
```

which then returned this error in `metric.set_parameters(mp)`:
```
Error Traceback (most recent call last)
Line 127, in adaptor(mesh_seq, solutions, adapt_field, adapt_timedep, target_complexity)
    125 var = solutions["h"][i][0]
    126 metric = RiemannianMetric(P1_ten)
--> 127 metric.set_parameters(mp)
    128 metric.compute_hessian(var)
    129 metric.normalise()
File ~/software/firedrake-sep23/src/animate/animate/metric.py:107, in RiemannianMetric.set_parameters(self, metric_parameters)
    105 self.metric_parameters.update(self._process_parameters(metric_parameters))
    106 with OptionsManager(self.metric_parameters, "").inserted_options():
--> 107     self._plex.metricSetFromOptions()
    108 if self._plex.metricIsUniform():
    109     raise NotImplementedError(
    110         "Uniform metric optimisations are not supported in Firedrake."
    111     )
File petsc4py/PETSc/DMPlex.pyx:765, in petsc4py.PETSc.DMPlex.metricSetFromOptions()
Error: error code 63
[0] DMPlexMetricSetFromOptions() at /home/ubuntu/software/firedrake-sep23/src/petsc/src/dm/impls/plex/plexmetric.c:37
[0] PetscOptionsReal_Private() at /home/ubuntu/software/firedrake-sep23/src/petsc/src/sys/objects/aoptions.c:1027
[0] PetscOptionsGetReal() at /home/ubuntu/software/firedrake-sep23/src/petsc/src/sys/objects/options.c:2629
[0] PetscOptionsStringToReal() at /home/ubuntu/software/firedrake-sep23/src/petsc/src/sys/objects/options.c:2338
[0] Argument out of range
[0] Input string w_857 has no numeric value
```
The same code works as expected with a constant h_min.
So this PR adds a new method `enforce_variable_constraints` (see https://github.com/pyroteus/animate/blob/48770bd06830de9c40cc95ed04e97d40aa099156/animate/metric.py#L273), which you should use rather than setting these fields in the metric parameters, so something like

```python
metric.enforce_variable_constraints(h_min=h_min, h_max=h_max)
```

I guess in response to this we could do two things to avoid users making the same mistake:

1. If `enforce_variable_constraints` is called with `h_min`/`h_max`/`a_max` when the equivalent `dm_plex_metric_<h_min/h_max/a_max>` has already been set, then raise an error, or at least a warning.
2. Add a check in `set_parameters` which ensures that `dm_plex_metric_<h_min/h_max/a_max>` may be converted to floats.
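For what it's worth, the two checks could be sketched roughly as follows. This is plain Python; `check_parameters`, its signature, and the exact messages are hypothetical, not Animate's actual code:

```python
import warnings

# Option keys that the two suggested checks apply to (per the PETSc naming).
CONSTRAINT_KEYS = (
    "dm_plex_metric_h_min",
    "dm_plex_metric_h_max",
    "dm_plex_metric_a_max",
)


def check_parameters(metric_parameters, variable_constraints=()):
    """Hypothetical sketch of the two checks suggested above.

    1. Warn if a constraint passed to enforce_variable_constraints has an
       equivalent dm_plex_metric_* entry already set.
    2. Raise if a dm_plex_metric_* constraint cannot be converted to float,
       since PETSc's option parser will reject it anyway.
    """
    for key in variable_constraints:
        if f"dm_plex_metric_{key}" in metric_parameters:
            warnings.warn(
                f"'{key}' passed to enforce_variable_constraints but "
                f"'dm_plex_metric_{key}' is already set in the metric parameters."
            )
    for key in CONSTRAINT_KEYS:
        if key in metric_parameters:
            try:
                float(metric_parameters[key])
            except (TypeError, ValueError):
                raise ValueError(
                    f"'{key}' must be convertible to float; use "
                    "enforce_variable_constraints for spatially varying values."
                )
```

This way the Function-valued `h_min` above would fail fast with a pointer to the right method, instead of surfacing as a PETSc "error code 63".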
Sound good?

Sorry, I wasn't focused apparently! I saw this new function but my brain immediately thought it was the existing `enforce_spd` function! But yes, I think this is a bit counterintuitive - and might make current user code outdated once this function disappears from Animate and the functionality gets ported to PETSc.
In my testing I am also not 100% convinced it's constraining lengths right... and I found it a bit hard to make sure. So I was thinking that maybe we could write a function that would check this automatically? There is probably a smarter way to do this, but I was doing e.g.
```python
from firedrake.projection import project as fproject

P1 = FunctionSpace(msh, "CG", 1)
h_maxes = fproject(ufl.MaxCellEdgeLength(msh), P1)
```
and then trying to compare throughout the domain. I think this wouldn't be hard to code up?
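The comparison itself might be sketched like this. It's a numpy illustration with synthetic data standing in for the projected edge lengths; the function name and the 10% tolerance are assumptions, since the mesher only approximately honours the metric:

```python
import numpy as np


def check_size_constraints(measured_h, h_min, h_max, rtol=0.1):
    """Check, pointwise, that measured edge lengths roughly respect the
    h_min/h_max constraints (scalars or per-vertex arrays).

    A relative tolerance is needed because Mmg only approximately honours
    the metric, so exact bounds would trigger false failures.
    """
    measured_h = np.asarray(measured_h)
    ok_min = measured_h >= (1 - rtol) * np.asarray(h_min)
    ok_max = measured_h <= (1 + rtol) * np.asarray(h_max)
    return bool(np.all(ok_min & ok_max))


# Synthetic values standing in for fproject(ufl.MaxCellEdgeLength(msh), P1)
h = np.array([0.1, 0.12, 0.2])
print(check_size_constraints(h, h_min=0.05, h_max=0.25))  # True
print(check_size_constraints(h, h_min=0.15, h_max=0.25))  # False
```

Because `h_min`/`h_max` can be arrays, the same check would work for the variable constraints discussed in this PR.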
But re my first point, maybe we could have something like this. Keep user code the same:

```python
mp = {"h_min": variable_hmin, ...}
metric.set_parameters(mp)
# compute metric
metric.normalise()
```

Inside `metric.set_parameters`:

```python
if h_min/h_max/a_max is variable:
    self.variable_constraints = True
```

Inside `metric.normalise()`:

```python
if self.variable_constraints:
    enforce_variable_constraints()
```
I.e. so that we don't introduce any temporary functions into user code.
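The constant/variable split implied here could look something like the sketch below. The helper name and the use of float-convertibility as the test for "variable" are assumptions:

```python
def split_metric_parameters(mp):
    """Partition metric parameters into constant (float-convertible) ones
    that PETSc's option parser can handle, and spatially varying ones to
    be deferred, e.g. to normalise().
    """
    constant, variable = {}, {}
    for key, value in mp.items():
        try:
            float(value)
        except (TypeError, ValueError):
            variable[key] = value  # e.g. a Function: defer enforcement
        else:
            constant[key] = value  # safe to pass straight to PETSc
    return constant, variable


# A list stands in here for a Function-valued constraint.
const, var = split_metric_parameters(
    {"dm_plex_metric_h_max": 1.0, "dm_plex_metric_h_min": [0.1, 0.2]}
)
```

A split like this would avoid the "Input string ... has no numeric value" failure entirely, since only the `constant` dictionary would ever reach `metricSetFromOptions`.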
As it runs now, the metric_parameters are initially set to an empty dictionary and the only parameter required by any of the other functions is 'dm_plex_metric_target_complexity' in `normalise`. The option of `enforce_variable_constraints` is useful for the user, as one could check the constraints have been applied before sending the metric to PETSc, but it also leaves room for potential conflict with any duplicate metric_parameters which have been set either independently or using the `MetricParameters` class from Goalie. The metric_parameters are passed and enforced by PETSc with each adaptation, so those would potentially supersede any edits made by `enforce_variable_constraints` if they were set more restrictively.

It seems like maybe an additional note in `enforce_variable_constraints` letting the user know that any similar constraints set via the metric_parameters will also be applied would be enough? @jwallwork23, I like your idea of a warning if a user applies this function and has the equivalent variables set in the metric_parameters already.
Okay, how about this? I've modified the way that parameters are processed so that any variable ones get put in a separate dictionary. Also, `enforce_variable_constraints` is now a private method that gets called within `enforce_spd` if variable parameters have been specified. One limitation is that it currently only supports restricting sizes and anisotropy at the same time. This is probably okay most of the time because an unset `h_max`/`h_min`/`a_max` would just use the default.
> In my testing I am also not 100% convinced it's constraining lengths right... and I found it a bit hard to make sure.
So there are two complications here. First, perhaps it's a bit misleading, but these functions don't directly constrain the edge lengths or anisotropy of the adapted mesh: what they do is constrain the metric (magnitude and/or anisotropy), which in turn influences the edge lengths and anisotropy of the adapted mesh. Second, there is no guarantee that the constraints actually get enforced, because that depends on the mesher (Mmg). However, they tend to be approximately met.
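To make the metric-versus-mesh distinction concrete: "restricting sizes" acts on the metric's eigenvalues, since an eigenvalue λ prescribes a local edge length h = 1/√λ. Below is a numpy sketch of that pointwise operation; it is an illustration only, as Animate delegates the real work to PETSc's `metricEnforceSPD`:

```python
import numpy as np


def restrict_sizes(M, h_min, h_max):
    """Clip the eigenvalues of a symmetric metric M so that the edge
    lengths it prescribes, h = 1/sqrt(lambda), lie in [h_min, h_max]."""
    evals, evecs = np.linalg.eigh(M)
    evals = np.clip(evals, 1.0 / h_max**2, 1.0 / h_min**2)
    return evecs @ np.diag(evals) @ evecs.T


M = np.diag([1.0, 10000.0])  # prescribes h = 1.0 and h = 0.01
M2 = restrict_sizes(M, h_min=0.1, h_max=0.5)
# Eigenvalues are clipped to [1/0.5**2, 1/0.1**2] = [4, 100],
# i.e. prescribed edge lengths now lie in [0.1, 0.5].
```

Even after this clipping, the mesher is free to deviate from the prescribed lengths, which is why extreme `h_min` values may appear to be "ignored" in the adapted mesh.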
Thanks Joe, that looks neat to me! And thanks for clarifying - during some tests I was testing some very high h_min values which I guess should lead to 1-3 edges on the boundary, but this was ignored. I guess it could have been due to what you said.
When you say it was 'ignored', do you mean that the resolution was only slightly off, or completely different? Are you sure you used the right boundary tags? Please could you share the example you're referring to?
I got back to testing this now and running the burgers-hessian.py demo from goalie gives the following error:
```
Line 197
    192     return continue_unconditionally
    195 # With the adaptor function defined, we can call the fixed point iteration method. ::
--> 197 solutions = mesh_seq.fixed_point_iteration(adaptor)
    199 # Here the output should look something like
    200 #
    201 # .. code-block:: console
    (...)
    223 #
    224 # Finally, let's plot the adapted meshes. ::
    226 fig, axes = mesh_seq.plot()
File petsc4py/PETSc/Log.pyx:115, in petsc4py.PETSc.Log.EventDecorator.decorator.wrapped_func()
File petsc4py/PETSc/Log.pyx:116, in petsc4py.PETSc.Log.EventDecorator.decorator.wrapped_func()
File ~/software/firedrake-sep23/src/goalie/goalie/mesh_seq.py:685, in MeshSeq.fixed_point_iteration(self, adaptor, solver_kwargs, **kwargs)
    682 self.solve_forward(solver_kwargs=solver_kwargs)
    684 # Adapt meshes, logging element and vertex counts
--> 685 continue_unconditionally = adaptor(self, self.solutions)
    686 if self.params.drop_out_converged:
    687     self.check_convergence[:] = np.logical_not(
    688         np.logical_or(continue_unconditionally, self.converged)
    689     )
Line 159, in adaptor(mesh_seq, solutions)
    157 hessian.set_parameters(mp)
    158 hessian.compute_hessian(sol[j])
--> 159 hessian.enforce_spd(restrict_sizes=True)
    160 metric += dt * hessians[0].intersect(hessians[1])
    161 metrics.append(metric)
File petsc4py/PETSc/Log.pyx:115, in petsc4py.PETSc.Log.EventDecorator.decorator.wrapped_func()
File petsc4py/PETSc/Log.pyx:116, in petsc4py.PETSc.Log.EventDecorator.decorator.wrapped_func()
File ~/software/firedrake-sep23/src/animate/animate/metric.py:298, in RiemannianMetric.enforce_spd(self, restrict_sizes, restrict_anisotropy)
    296 kw["restrictAnisotropy"] = False
    297 v = self._create_from_array(self.dat.data_with_halos)
--> 298 det, _ = self._plex.metricDeterminantCreate()
    299 self._plex.metricEnforceSPD(v, v, det, **kw)
    300 size = np.shape(self.dat.data_with_halos)

ValueError: too many values to unpack (expected 2)
```
@ddundo It seems to me that your Firedrake/PETSc installation might be out of date. I just merged `main` into this branch to trigger the test suite and it passed fine. Indeed, I do recall that the signatures of some of the plex functions were changed at some point (not by me).
Thanks @jwallwork23, I had to reinstall :) It must have happened in a very short timespan when I changed branches, so I immediately assumed it was one of your PETSc changes!
I retested everything, including the Burgers demo example so it's easy to share - everything looks good. The resolution was only slightly off when I tested these extreme cases.
I have just a small quality-of-life suggestion, which I will comment on in the file directly.
Thanks @ddundo - ready for re-review.
No worries. Thanks again @ddundo!
Closes #35.
The `enforce_spd` method currently only supports constant values for `h_min`, `h_max`, and `a_max`. This method provides a temporary way to enforce variable values, until the functionality is ported over to the PETSc side.