When run during the last 100ms of any given second, the following scenario fails:
Scenario: DateTime Accuracy to One Second
    Given I have a DateTime value
    And I have a second DateTime value that varies from the first by 100 milliseconds
    When I adjust the accuracy of each DateTime value to one second
    And I compare the first and second DateTime values
    Then the resulting difference should be zero
This fails because of the following:
// TimeExtensionSteps.cs:47
[Given(@"I have a DateTime value")]
public void CreateFirstDateTime()
{
    _context.FirstValue = DateTime.Now;
}

[Given(@"I have a second DateTime value that varies from the first by (.*) milliseconds")]
public void CreateSecondDateTime(int millisecondAdjustment)
{
    _context.SecondValue = _context.FirstValue.AddMilliseconds(millisecondAdjustment);
}
If DateTime.Now's Millisecond property is in the range 900–999 and millisecondAdjustment is 100 or more, the second value lands in the following second. Truncating both values to whole-second accuracy then yields a difference of one second rather than zero, so the test fails.
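The failure can be reproduced deterministically by replacing DateTime.Now with a fixed base time in the last 100 ms of a second. A minimal sketch (TruncateToSecond is a hypothetical stand-in for the post's "adjust accuracy" operation, which is not shown):

```csharp
using System;

class TruncationBoundaryDemo
{
    // Truncate a DateTime to whole-second accuracy by dropping sub-second ticks.
    // (Hypothetical helper; the actual accuracy-adjustment extension is not shown above.)
    public static DateTime TruncateToSecond(DateTime value) =>
        new DateTime(value.Ticks - (value.Ticks % TimeSpan.TicksPerSecond), value.Kind);

    static void Main()
    {
        // Fixed base time at .950, i.e. within the last 100 ms of a second.
        var first  = new DateTime(2024, 1, 1, 12, 0, 0, 950);
        var second = first.AddMilliseconds(100); // crosses into the next second

        var diff = TruncateToSecond(second) - TruncateToSecond(first);
        Console.WriteLine(diff.TotalSeconds); // 1, not 0: the second boundary was crossed
    }
}
```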
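One way to make the scenario deterministic (a sketch, not necessarily the fix the steps ultimately adopted) is to pin the sampled time to the start of its second, so that adding any offset under 1000 ms can never cross a second boundary:

```csharp
using System;

class DeterministicBaseDemo
{
    static void Main()
    {
        // Pin the sampled time to the start of the current second by
        // subtracting the sub-second ticks.
        var now = DateTime.Now;
        var pinned = now.AddTicks(-(now.Ticks % TimeSpan.TicksPerSecond));

        var second = pinned.AddMilliseconds(100);
        // Both values now fall within the same whole second,
        // so truncating each to one-second accuracy gives a zero difference.
        Console.WriteLine(second.Second == pinned.Second); // True
    }
}
```

Alternatively, the steps could take the clock as an injected dependency and substitute a fake clock under test, which removes the dependence on wall-clock time entirely.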