adamgreig closed this issue 10 months ago.
Would it be useful to switch to `u64` as the argument type, since the scale has changed three orders of magnitude? Users of the trait now have significantly less range to delay the core: the max delay is now ~4.29 seconds, whereas previously the maximum was ~4,294 seconds.
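For concreteness, the arithmetic behind those limits (a quick sketch; the printed figures are just `u32::MAX`/`u64::MAX` converted to seconds):

```rust
fn main() {
    // u32 nanoseconds: 2^32 - 1 ns ≈ 4.29 s per call
    println!("max u32 ns delay: {:.3} s", u32::MAX as f64 / 1e9);

    // u32 microseconds (the old scale): ≈ 4295 s (~71.6 min) per call
    println!("max u32 us delay: {:.3} s", u32::MAX as f64 / 1e6);

    // u64 nanoseconds would allow ~584 years per call
    println!("max u64 ns delay: {:.1} years", u64::MAX as f64 / 1e9 / 31_557_600.0);
}
```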
The trait still has `delay_us` and `delay_ms` with u32, so I don't think that is going to be an issue?
We discussed this again today and didn't reach a firm conclusion, so we'll probably default to not renaming this, but I'll leave the issue open another week in case anyone else wants to chime in.
> The trait still has `delay_us` and `delay_ms` with u32, so I don't think that is going to be an issue?
If you look at the inner implementation of `delay_us` and `delay_ms`, they make a call to `delay_ns`, which limits the max delay to ~4.29 seconds (4.29 billion nanoseconds). If this isn't a concern, since users could call `delay_*` functions multiple times to get longer delays, then it's a non-issue. It just seems like a good time to widen the argument bit size before the first stable release.
> If you look at the inner implementation of `delay_us` and `delay_ms`, they make a call to `delay_ns`, which limits the max delay to ~4.29 seconds (4.29 billion nanoseconds).
They don't? The provided default implementation (which the final implementer can override) makes multiple calls to `delay_ns` to ensure the total delay is as many micro/milliseconds as required, even if that's many more than 4.29 billion nanoseconds.
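A sketch of how such a chunked default can look; this is illustrative rather than quoted from the embedded-hal source, and the chunk constant is just the largest whole microsecond count that fits in a u32 of nanoseconds:

```rust
pub trait DelayNs {
    /// Required method: pause execution for `ns` nanoseconds.
    fn delay_ns(&mut self, ns: u32);

    /// Provided default: u32::MAX microseconds is far more than
    /// u32::MAX nanoseconds, so split the delay into chunks that
    /// each fit in a single delay_ns call.
    fn delay_us(&mut self, mut us: u32) {
        const CHUNK_US: u32 = u32::MAX / 1000; // 4_294_967 us per chunk
        while us > CHUNK_US {
            self.delay_ns(CHUNK_US * 1000);
            us -= CHUNK_US;
        }
        self.delay_ns(us * 1000);
    }
}
```

A `delay_ms` default can chunk the same way, so neither method is limited by the u32 range of `delay_ns`.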
> The provided default implementation (which the final implementer can override) makes multiple calls to `delay_ns` to ensure the total delay is as many micro/milliseconds as required, even if that's many more than 4.29 billion nanoseconds.
You're right, I misread the implementation somehow late last night, and thought there was only a single call.
In the end we decided not to do this for 1.0; I'm closing this to keep the roadmap clear.
The obvious name for the `DelayNs` trait is `Delay`, but we've reserved that name for a future, post-1.0 trait once we have a good `Duration`. Right now it's not clear what that `Duration` will look like or do, so we've been leaving thinking about it until later.

I suggest we rename the current `DelayNs` trait to `Delay` now anyway:

- `DelayNs` already does everything we want: pick a number of ns/us/ms to delay for, up to u32 of them, and you can always do it multiple times if that's not enough range
- if we never find a good `Duration` type, it would be nicer to be stuck with `Delay` than `DelayNs`
- if we do find one, we could add `delay(Duration)` to `Delay` and provide a default implementation in terms of `delay_ns()` (see the sketch after this list); if we keep `DelayNs`, then once we find `Duration` we'll end up with both the new, good `Delay` and the old-but-kept-forever `DelayNs`
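One hypothetical shape for that future addition, assuming only that the eventual `Duration` can be converted to nanoseconds (here `core::time::Duration` stands in for whatever type we eventually pick):

```rust
use core::time::Duration;

pub trait Delay {
    /// The existing required method.
    fn delay_ns(&mut self, ns: u32);

    /// Hypothetical future method, defaulted in terms of delay_ns.
    /// Chunking keeps each inner call within the u32 nanosecond range.
    fn delay(&mut self, duration: Duration) {
        let mut ns = duration.as_nanos(); // u128, so no overflow
        while ns > u32::MAX as u128 {
            self.delay_ns(u32::MAX);
            ns -= u32::MAX as u128;
        }
        self.delay_ns(ns as u32);
    }
}
```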
My main point is that perhaps what we have in `DelayNs` is already good enough to warrant using the name `Delay`, and it's nice that we could conceivably add a `delay(Duration)` method later (with some caveats) if we wanted to, while if we stick with `DelayNs` now, we're sure to always have `DelayNs` and maybe also have `Delay`, with all the confusion that multiple overlapping traits has brought us in the past.

There are some downsides to "just add `delay(Duration)` later", though:

- `delay(Duration)` would have to be defaulted in terms of `delay_ns`, whereas the obvious de-novo construction would be to default `delay_ms/us/ns` in terms of `delay(Duration)`. This is a pain, but I don't think it constrains what `Duration` looks like (presumably we'll need to be able to convert it to ns somehow anyway), and we can document that implementors should prefer to implement `delay()` themselves, and then implement `delay_ns` in terms of `delay`.
- adding `delay(Duration)` later would preclude generic parameters or associated types on `Delay`, which could be used to require a certain timebase for example. I don't fully understand the negative consequences here, but maybe @Dirbaio can expand on them later.
- we might want the `Delay` trait to only ever have `delay(Duration)`, without the `delay_ms()` methods, as is done in the stdlib (see the example below). We couldn't do this rename and then later delete those methods, so we would be stuck with them in `Delay`.

It's not a slam-dunk either way, but it seemed worth discussing before we hit 1.0.
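For comparison, the stdlib pattern the last bullet refers to: `std::thread::sleep` takes a single `Duration` argument, and that one entry point covers every scale:

```rust
use std::thread;
use std::time::Duration;

fn main() {
    // One Duration-based method instead of per-unit variants.
    thread::sleep(Duration::from_millis(1500));
    thread::sleep(Duration::from_nanos(250));
}
```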