jcotela opened this issue 7 years ago
I agree with this. In any case, in the short term I think it would be nice to have #651 ready for the release, because then it will be possible to use beams in the release (finally!!)
@loumalouomega This is not for immediate implementation, and I'd definitely not block #651 for this.
I have been in favor of our "enriched" JSON syntax from the beginning. Nevertheless, I need some time to evaluate your proposal and give my comments.
I think it's a good idea. I don't know if this would be more difficult, but one idea would be to have a generic type like @data_set that takes care of checking the type and the dimensions (fixed or variable), and to use aliases for special types like @table, @vector, etc.
In the default parameters one could use either @data_set or an alias, something like:

Using @table:

```json
"load_curve_as_data_set" : {
    "@data_set" : {
        "@dtype" : "float",
        "@dims" : ["*", "2"],
        "@default_values" : [[1.0, 0.0], [2.0, 0.0]]
    }
},
"load_curve_as_table" : {
    "@table" : [[1.0, 0.0], [2.0, 0.0]]
}
```
Using @vector3d:

```json
"direction_as_data_set" : {
    "@data_set" : {
        "@dtype" : "float",
        "@dims" : ["3"],
        "@default_values" : [0.0, 0.0, 0.0]
    }
},
"direction_as_vector3d" : {
    "@vector3d" : [0.0, 0.0, 0.0]
}
```
If multiple types are possible in the defaults, one could indicate this as:

```json
"direction" : {
    "@vector3d" : [1.0, 0.0, 0.0],
    "@string" : "wall_normal"
}
```
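To make the checking side of this concrete, here is a minimal sketch of what a validator for the generic @data_set keyword could look like. Plain Python dicts and lists stand in for Parameters, and the function name is an assumption for illustration, not existing Kratos code:

```python
# Hypothetical validator for the proposed "@data_set" keyword.
# Plain dicts/lists stand in for Kratos Parameters objects.

def validate_data_set(spec, value):
    """Check 'value' against a {"@dtype": ..., "@dims": ...} spec.
    A "*" entry in "@dims" means that dimension may have any size."""
    dtype = {"float": float, "int": int, "string": str}[spec["@dtype"]]

    def check(node, dims):
        if not dims:  # leaf: check the scalar type
            return isinstance(node, dtype)
        if not isinstance(node, list):
            return False
        if dims[0] != "*" and len(node) != int(dims[0]):
            return False
        return all(check(item, dims[1:]) for item in node)

    return check(value, spec["@dims"])

# Example: the "load_curve_as_data_set" spec from above
spec = {"@dtype": "float", "@dims": ["*", "2"]}
print(validate_data_set(spec, [[1.0, 0.0], [2.0, 0.0]]))  # True
print(validate_data_set(spec, [[1.0, 0.0, 3.0]]))         # False: row size != 2
```

Validating the multi-type defaults above would then amount to trying each listed keyword's checker in turn and accepting the input if any of them matches.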
While on the topic of JSON settings validation, I think there is also a strong need for functionality that allows extraction and validation of sub-parameters by a derived class, allowing the remaining parameters to be passed to the base class for validation. This enables a staged form of validation. The problem right now is that default parameters from a base class are copied into the derived classes. These copied parameters diverge as keys and default values are updated in one file but not in the others. With the added functionality, a default parameter would ideally be defined in only one place in the code. There is a workaround function for this called validate_and_transfer_matching_settings() in applications/StructuralMechanicsApplication/python_scripts/structural_mechanics_solver.py.
Two comments on my side:

1 - I agree about creating a few types (I would call it @Array3 rather than @vector, but this is a minor thing). I have the feeling that it will not be easy to check in the validation, but I will give it some thought. @msandre I would prefer something a little less verbose than what you propose (along the lines of Jordi's proposal).

2 - @msandre I did not get your point about the second issue. Could you describe it with a small example?
@RiccardoRossi here is an example of the problem:
```python
from KratosMultiphysics import Parameters

class BaseSolver(object):
    def __init__(self, custom_settings):
        default_settings = Parameters("""{
            "model_part_name" : "unknown_name"
        }""")  # here we specify "model_part_name" the first time
        custom_settings.ValidateAndAssignDefaults(default_settings)
        # the one-shot ValidateAndAssignDefaults means that we need to
        # copy the base settings into each derived class

class DerivedSolver(BaseSolver):
    def __init__(self, custom_settings):
        default_settings = Parameters("""{
            "model_part_name" : "unknown_name",
            "derived_setting" : 1.0
        }""")  # here we specify "model_part_name" a second time,
               # along with the new "derived_setting"
        custom_settings.ValidateAndAssignDefaults(default_settings)
```
Concerning the topic of the current limitations for validating JSON settings in Kratos, I see this as an important issue. Adding functionality to perform staged validation is the way to go in my opinion (see the constructor in structural_mechanics_static_solver.py as an example of what I mean). I was actually waiting to post about this until things slowed down after the release. Let's wait to discuss this until the issue of keywords is settled, though. I just wanted to bring it to attention in case we are thinking of adding functionality to Parameters.
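A rough sketch of what such staged validation could look like, with plain dicts standing in for Parameters (extract_and_validate and the solver classes here are invented for illustration; this is not the actual Kratos API):

```python
# Sketch of staged validation: each class validates only the keys it
# introduces and forwards whatever it does not recognise to its base.
# Plain dicts stand in for Kratos Parameters; names are illustrative only.

def extract_and_validate(custom, defaults):
    """Pop the keys listed in 'defaults' out of 'custom', filling in
    default values for missing ones, and return them. Keys not listed
    in 'defaults' stay in 'custom' for the next stage."""
    validated = {}
    for key, default in defaults.items():
        validated[key] = custom.pop(key, default)
    return validated

class BaseSolver:
    def __init__(self, custom_settings):
        defaults = {"model_part_name": "unknown_name"}
        self.settings = extract_and_validate(custom_settings, defaults)
        if custom_settings:  # anything left over is an unknown key
            raise ValueError("unrecognised settings: %s" % list(custom_settings))

class DerivedSolver(BaseSolver):
    def __init__(self, custom_settings):
        defaults = {"derived_setting": 1.0}  # only the *new* keys
        self.derived_settings = extract_and_validate(custom_settings, defaults)
        super().__init__(custom_settings)  # the rest goes to the base class

s = DerivedSolver({"derived_setting": 2.0, "model_part_name": "fluid"})
print(s.settings, s.derived_settings)
# {'model_part_name': 'fluid'} {'derived_setting': 2.0}
```

With this scheme, "model_part_name" is declared only once, in BaseSolver, and an unknown key is still caught at the end of the chain.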
@msandre, ok, I see your point.

My idea to address such issues was to define a "merge" function in which, for example, the derived settings are written on top of the base ones (overwriting the repeated parameters and adding the ones that are missing).
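For example, something along these lines, sketched with plain dicts standing in for Parameters (merge is a hypothetical name; the recursion over sub-settings is the relevant part):

```python
# Sketch of the proposed "merge": derived defaults are written on top of
# the base defaults, overwriting repeated keys and adding new ones.
# Plain dicts stand in for Kratos Parameters.

def merge(base_defaults, derived_defaults):
    merged = dict(base_defaults)
    for key, value in derived_defaults.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge(merged[key], value)  # recurse into sub-settings
        else:
            merged[key] = value  # overwrite or add
    return merged

base = {"model_part_name": "unknown_name", "linear_solver": {"tolerance": 1e-6}}
derived = {"derived_setting": 1.0, "linear_solver": {"max_iterations": 200}}
print(merge(base, derived))
# {'model_part_name': 'unknown_name',
#  'linear_solver': {'tolerance': 1e-06, 'max_iterations': 200},
#  'derived_setting': 1.0}
```

A derived class would then only declare the keys it adds or overrides, and validation would run once against the merged defaults.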
In any case, I agree about discussing this after the release.
Is there any update on this?
I took my time to think about it and I would go for something less verbose and very gradual. Let's say, start with a default keyword and see how it works; then I would go for vector, matrix, and so on.

I have also checked different libraries that do this kind of thing, but I could not find any global trend. Has anybody found any similar usage?
Maybe we can reopen this discussion.
I agree with this proposal. Maybe keywords like @automatic or @auto should also be employed to indicate that those variables will exist, but their values will be calculated automatically by the CL (or element) in terms of the other variables (or the subproperties' variables).

Since the CL cannot change the property values, the @automatic variables should be computed at the moment of reading them (in terms of the subproperties' variables, for example)...
Interesting... @frastellini, would you please give us a concrete example?
@RiccardoRossi and I were discussing this option and decided to open a new issue (possibly complementary to this one): #3354. I quote here one example reported in that issue:
Essentially the idea is that there are situations in which one has a very complicated material behaviour (for example an RVE or a complex composite CL) which needs to do some preprocessing (common to all the CLs of the same type).
For example, one may want to compute the elasticity tensor for an RVE and then use it many times (it will be the same for all CLs sharing the same RVE prototype), or the equivalent Young's modulus for a composite material. It MAY make sense to store this information by computing it once for a given property (it is something that should never change, and it would be shared by all the CL instances which belong to a given property).
In this case the format to indicate this would be something like the following:

```json
"elasticity_tensor" : "@automatic"
```

or

```json
"equivalent_young_modulus" : "@automatic"
```
We have seen in some recent PRs (#626, and I'd say that there is a related issue in #651) that the current JSON settings validation has some limitations.
We were discussing this with @RiccardoRossi yesterday and we are considering the addition of keyword-type arguments in the JSON defaults definition to extend its current capabilities. This is our first, very rough proposal:
Note that some default values have been replaced by a JSON object, containing a name and a default value. The idea is that each of these names (@evaluable, @table, @interval, ...) corresponds to a standard type, which will be validated using a custom function, and the corresponding value is the default. For example, the keywords defined here would check that the input verifies:

- @vector: a 3-component array of floats (maybe we also want to accept a 2D one?)
- @table: an array of arrays of double values, given either as [x, y] pairs or as [[x values...], [y values...]]
- @interval: a two-component array of floats (closed interval). If only one component is given, it is understood as an open interval starting at the given value.
- @evaluable: something that can be used to obtain a double value: a float (constant input), a table, or a string (to be understood as an interpreted function).

This would also serve as a kind of format guideline for JSON arguments, since it gives a formal definition for (for example) tables, preventing different developers from accepting them in different and mutually incompatible formats.
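As a sanity check of the definitions above, here is one possible reading of them as plain-Python predicates (the real validation would of course live inside Parameters; these checks only pin down the accepted formats):

```python
# One possible reading of the keyword definitions above, as plain checks.

def is_vector(v):
    # 3-component array of floats (a 2-component variant could be allowed too)
    return isinstance(v, list) and len(v) == 3 and all(isinstance(x, float) for x in v)

def is_table(v):
    # array of arrays of doubles: either [x, y] pairs or [[x...], [y...]]
    if not (isinstance(v, list) and v and all(isinstance(r, list) for r in v)):
        return False
    values_ok = all(isinstance(x, float) for r in v for x in r)
    as_pairs = all(len(r) == 2 for r in v)
    as_columns = len(v) == 2 and len(v[0]) == len(v[1])
    return values_ok and (as_pairs or as_columns)

def is_interval(v):
    # closed interval [a, b]; a single value [a] means open-ended from a
    return isinstance(v, list) and len(v) in (1, 2) and all(isinstance(x, float) for x in v)

def is_evaluable(v):
    # constant, table, or string to be parsed as an interpreted function
    return isinstance(v, float) or isinstance(v, str) or is_table(v)

print(is_table([[1.0, 0.0], [2.0, 0.0]]))             # True: [x, y] pairs
print(is_table([[1.0, 2.0, 3.0], [0.0, 0.0, 0.0]]))   # True: column format
print(is_interval([0.0, 10.0]), is_evaluable("3*t"))  # True True
```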
Obviously, everything is open for discussion at this point.
Pinging @KratosMultiphysics/team-maintainers to encourage discussion.