The device currently starts "time" at zero and waits a full sensorSampleInterval before doing the first read, which is a long wait for the user. Either take a sample during setup(), or advance the initial time by sensorSampleInterval during setup(), so the first read happens on the first iteration of loop(). loop() would then immediately reset the time to zero and the device would settle into its normal pattern?
Fixed by initializing timeLastSample to -(sensorSampleInterval * 1000), which triggers the sample read on the first iteration of loop(); loop() then resets timeLastSample to a normal value after that first read.
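A minimal sketch of how this fix behaves, assuming an Arduino-style sketch where sensorSampleInterval is in seconds and the elapsed time is checked against millis(); the interval value and readSensor() are hypothetical placeholders, not the actual project code:

```cpp
const unsigned long sensorSampleInterval = 60;  // seconds (assumed value)

// Initializing to -(interval in ms) makes the very first loop() pass the
// elapsed-time check, so the first sample happens immediately. With an
// unsigned long this relies on wrap-around arithmetic, which the usual
// millis() - timeLastSample comparison handles correctly.
unsigned long timeLastSample = -(sensorSampleInterval * 1000UL);

void readSensor() {
  // placeholder for the actual sensor read
  Serial.println("sample");
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (millis() - timeLastSample >= sensorSampleInterval * 1000UL) {
    readSensor();
    timeLastSample = millis();  // back to the normal value after the first read
  }
}
```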