joey-coleman closed this issue 10 years ago
See also the discussion in 3220182 (expressions in periodic threads): there too, the arguments are always evaluated in an implicit duration (0) context rather than letting time progress, but still with left-to-right evaluation. If an operation that blocks is called while evaluating the arguments, then the interpreter should just yield a run-time error (deadlock).
Nick Battle wrote:
Nick Battle wrote:
I personally would prefer some sort of policy indicator, like ... or ... (like the CPU scheduling policy), rather than an arbitrary function.
I disagree (strongly); this should be left entirely to the specifier. If the specifier wants to have this kind of common functionality, provide it as a standard library but not as a built-in language primitive.
Nick Battle wrote:
We should also make sure that the current (fixed duration) case can be implemented in the extended form (i.e. internally we only have one form).
This seems like a no-brainer to me: duration (x) is equivalent to duration(x,x,f), where f is the identity function (returning x).
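If both extensions were adopted, this desugaring could be sketched as follows (the selector name pick_lower is illustrative; with lb = ub = x, any selector must return x):

```vdm
functions
  -- illustrative selector: with equal bounds it necessarily returns x,
  -- so 'duration (x)' behaves exactly like 'duration (x, x, pick_lower)'
  pick_lower: real * real -> real
  pick_lower (lb, -) == lb
```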
Comment by ardbeg1958, 2011-07-03 08:37:08:
I think the current syntax 'duration' '(' expression ')' statement suffices for Marcel-san's purpose if we allow arbitrary expressions.
Suppose we have a number generator class called "Gaussian" whose constructor takes two parameters (lower, upper). You can get an instance of this class like:
ggen := new Gaussian(lower, upper)
We can define a "next_value" method on the Gaussian class. The method returns a random value chosen between lower and upper (statistically following a Gaussian distribution).
With this object, we can use a duration call like:
duration (ggen.next_value()) statements
This is just an example. You can define any number generators that fit your needs.
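A minimal sketch of such a class, assuming the standard MATH library's rand; the class name, the next_value operation and the crude normal approximation (a sum of uniform samples, scaled and clamped to the interval) are all illustrative, not a prescribed implementation:

```vdm
class Gaussian

instance variables
  lower : real;
  upper : real

operations
  -- constructor: record the interval bounds
  public Gaussian: real * real ==> Gaussian
  Gaussian (lb, ub) == (
    lower := lb;
    upper := ub
  )
  pre lb < ub;

  -- return a value in [lower, upper]; the sum of three uniform
  -- samples roughly approximates a normal shape, which is then
  -- centred on the interval midpoint and clamped to the bounds
  public next_value: () ==> real
  next_value () ==
    let s = (MATH`rand(1000) + MATH`rand(1000) + MATH`rand(1000)) / 1500,
        mid = (lower + upper) / 2,
        half = (upper - lower) / 2 in
      let v = mid + (s - 1) * half in
        if v < lower then return lower
        elseif v > upper then return upper
        else return v

end Gaussian
```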
There are two side effects of this alternative proposal from Sako-san that have a significant (imho negative) impact on usability. First, the lower and upper bound are now hidden inside the object managing the probability, and second, you would need to create these instances everywhere, since the upper and lower bounds are likely to be different in each place. This makes the administrative burden on the specifier very high and error prone (you have to encode it everywhere explicitly).
Moreover, losing the ability to express the upper and lower bound in the syntax makes it impossible to do downstream analysis with other tools (for example WCET analysis) without first performing a simulation run to obtain those values from the object passed, or by intricate static analysis on the model.
Imho explicit lower and upper bounds in the syntax are very advantageous (they even increase readability), and passing these parameters to a named operation that takes them as input provides all the flexibility we need.
Comment by nick_battle, 2011-07-05 07:14:02:
This is true, though an alternative implementation that used a static function with arguments, rather than an object with constructed arguments, would be almost identical to your proposal: i.e. instead of duration(1,10,gaussian) you would call duration(gaussian(1,10)) - it's just that the latter doesn't require this new RM.
A more interesting point is about downstream analysis - presumably you're talking about the runtime logging and analysis here? I would say that we ought to be carefully defining the syntax and semantics of the log file (which we're not, currently), and if this (or any other) RM requires logging changes, then that ought to be carefully specified in the proposal.
Yes, I understand all that, but the essence of the RM is the change to the concrete syntax, because it has significant added value IMHO, just like the parameters passed to periodic and sporadic.
Furthermore, the probabilistic object approach makes it hard to understand (a) when the probabilistic entity is actually created and (b) how it behaves when the duration statement is executed more than once, for example inside a loop. Moreover, you may want to refer to state variables of the caller object (the object from which duration or cycles is called), for example to count how often a certain operation is executed, which may affect the selection criterion from the interval (i.e. to encode caching effects). This would be very cumbersome in the alternative proposal.
With downstream tools I do not necessarily mean tools that deal with trace files; it can be a tool that works on the AST directly, without executing the specification.
Run-time logging is a tool issue that has nothing to do with the language change proposal, and I disagree strongly that it should be part of the RM. It is something to do after the RM is accepted (with that I do agree!).
Comment by ardbeg1958, 2011-08-21 09:50:33:
There are two side effects of this alternative proposal from Sako-san that have a significant (imho negative) impact on usability. First, the lower and upper bound are now hidden inside the object managing the probability, and second, you would need to create these instances everywhere, since the upper and lower bounds are likely to be different in each place. This makes the administrative burden on the specifier very high and error prone (you have to encode it everywhere explicitly).
Even with Marcel-san's proposal there would be some administrative burden on the specifier, since you have to specify the lower/upper bound parameters in several places in your specification.
From an administrative point of view, how about the following idea?
Suppose we have a map which holds probability objects:
aMap = { policy1 |-> new Gen(l1, h1),
         policy2 |-> new Gen(l2, h2),
         ...
         policyN |-> new Gen(lN, hN) }
Then you can use these like:
....
duration (aMap(policy1).next_value()) statement1
....
duration (aMap(policy2).next_value()) statement2
....
and in another place
....
duration (aMap(policyN).next_value()) statement3
....
You can define your various policies in one place (the map), and I think this can make a specifier's work easier.
Moreover, losing the ability to express the upper and lower bound in the syntax makes it impossible to do downstream analysis with other tools (for example WCET analysis) without first performing a simulation run to obtain those values from the object passed, or by intricate static analysis on the model.
If you use a consistent style in your specification, like my example, I think it is not so hard to find which policy is applied where.
If the map is a static one, you can query properties like the upper/lower bounds. And after a simulation, you can also ask each object in the map about its generation history.
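Putting the pieces together, the map idea might look like this inside a class (the class name Gen, the policy keys and the bounds are illustrative, and next_value is assumed to pick a value from the stored interval):

```vdm
instance variables
  -- one generator per policy, created once at initialisation time
  aMap : map seq of char to Gen :=
    { "policy1" |-> new Gen(100, 200),
      "policy2" |-> new Gen(500, 900) }

operations
  public Step: () ==> ()
  Step () == (
    -- each call draws a fresh value from the policy's interval
    duration (aMap("policy1").next_value()) skip;
    duration (aMap("policy2").next_value()) skip
  )
```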
Comment by ardbeg1958, 2011-08-21 10:04:09:
Furthermore, the probabilistic object approach makes it hard to understand (a) when the probabilistic entity is actually created and
I think "probabilistic entities" should be prepared at initialisation time (see below). But that is only my personal observation.
(b) how it behaves when the duration statement is executed more than once, for example inside a loop. Moreover, you may want to refer to state variables of the caller object (the object from which duration or cycles is called), for example to count how often a certain operation is executed, which may affect the selection criterion from the interval (i.e. to encode caching effects). This would be very cumbersome in the alternative proposal.
Even in this case you can pass a state-variable-derived value to the expression, like:
duration (Gaussian(lower, upper, ... some state-variable derived value ...))
or, if you use a generator object:
in the instance variable section:
ggen : Gaussian := new Gaussian(lower, upper)
.....
in an operation body:
duration (ggen.next_value( ... some state-variable derived value ...))
Comment by ardbeg1958, 2011-10-08 09:32:53:
Hi all.
I thought about this issue again and found that my proposal would not be appropriate for Marcel-san's requirement. Actually, I had missed the following phrase:
"in other words they are evaluated with duration (0)"
To achieve this, we can't pass an expression to duration/cycles, since evaluating it would consume unwanted cycles.
Anyway, I agree with Marcel-san's original requirement.
Please find enclosed a VDM-RT project that you can investigate with Overture 2.0.4; it is input for the discussion on 10/11.3, and RM #19.
The specification revisits the Car Radio Navigation case study from my PhD thesis, see http://www.marcelverhoef.nl/uploads/Main/thesis.pdf, section 2.2. The aim of the specification is to assess the WCET of two competing use case scenarios running simultaneously on the same embedded architecture, modelled in VDM-RT. Although the specification isn't quite finished yet, the basics already work and it can be used to demonstrate the use of probabilistic timings in the context of cycles and duration statements. Have a look at MMI.vdmrt (I copied the relevant parts here for convenience):
values
  -- estimated WCET times
  MMI_VKP : nat = 1E5;
  MMI_UV  : nat = 5E5;
  MMI_AKP : nat = 1E5;
  MMI_UA  : nat = 5E5;
  MMI_UT  : nat = 5E5;

  -- interval values
  MMI_UVL : nat = MMI_UV - 1E5;
  MMI_UVH : nat = MMI_UV + 1E5

functions
  private interval: nat * nat -> nat
  interval (pl, ph) ==
    let diff = ph - pl in
      pl + MATH`rand(diff+1)
  pre pl < ph

operations
  public async UpdateVolume: nat ==> ()
  UpdateVolume (pid) == (
    -- perform local processing
    cycles (interval(MMI_UVL, MMI_UVH)) skip;
    -- inform the environment
    duration (0) CarRadNav`user.NoticeVisibleChangeVolume(pid)
  )
In the operation UpdateVolume, it is now possible to state that the execution time of the body (abstracted away as a skip statement) will be in the interval [MMI_UVL, MMI_UVH], and using the interval function we choose a random value from this interval every time UpdateVolume is executed. When the specification is executed with "cycles (MMI_UV) skip;" the WCET value of the entire scenario is 0.037 seconds. Using the randomized version shown above, I get the same average value (after 500 runs), but the response time now varies between 0.033 and 0.041 seconds.
This example demonstrates that RM #19 can IMHO be rejected, as the requested functionality can now be obtained through the use of the standard MATH library and the results of RM #17 (as was suggested by Hiroshi-san).
Regards, Marcel
Files were attached:
- CarRadNav_29032014.zip (8300 bytes)
duration statement =
    'duration' '(' expression ')' statement                            -- the original definition
  | 'duration' '(' expression ',' expression ')' statement            -- extension one
  | 'duration' '(' expression ',' expression ',' name ')' statement   -- extension two
Extension one specifies the lower and upper bound (both real >= 0, with lower < upper) of the applicable time interval, from which the value is selected at random.
Extension two specifies the lower and upper bound followed by the name of a function or operation taking two parameters, which gives the user control over how the value is selected from the [lb, ub] interval. Standard functions could be provided for normal and exponential distributions. Operations should be allowed so that model state can be inspected in order to influence the selection process (example: expensive the first time (close to the upper bound), cheaper the next time (closer to the lower bound), etc.).
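For example, a selection function for extension two could look like this (names are illustrative; the operation variant sketches the caching effect mentioned above, using a hypothetical boolean instance variable cached):

```vdm
functions
  -- uniform selection from [lb, ub], usable as the 'name' argument
  uniform: nat * nat -> nat
  uniform (lb, ub) == lb + MATH`rand(ub - lb + 1)
  pre lb <= ub

operations
  -- operation variant inspecting model state: expensive on the first
  -- call (upper bound), cheap on subsequent calls (lower bound)
  cached_cost: nat * nat ==> nat
  cached_cost (lb, ub) ==
    if cached
    then return lb
    else (cached := true; return ub)
```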
Note that evaluation of these expressions and operations shall NOT affect the notion of time in the model (in other words, they are evaluated with duration (0)).
For extension one, the same interpreter elaboration mechanism can be used as for rand in the MATH standard library: a single seed value is set once at the beginning of the simulation run, and pseudo-random numbers are drawn for all subsequent calls, so that reruns of the same specification with the same seed deterministically produce the same output.
The main added value of this approach is that it allows easy modelling of caching type behaviours.
This extension was explored in my PhD work, but was never properly published (see the future work section of my thesis).