I tried reducing the total measurement period below 100 ms, but it had no effect.
I set ranging mode = 1 and a timing budget of 15 ms, but the intermeasurement period stayed at 100 ms. (I verified this by measuring processor ticks after each measurement.)
In the library I manually changed the init_seq to shrink the intermeasurement period to a low value, as shown below. That reduced my measurement period to the expected 15 ms.
I suspect one of the following:
- something is wrong in the table of values written during the timing_budget change
- incorrect registers are written for _RANGE_CONFIG__TIMEOUT_MACROP_A_HI and _RANGE_CONFIG__TIMEOUT_MACROP_B_HI when the timing_budget is changed
- an additional function is needed to adjust the intermeasurement period separately
Here's the hack I made to the library init_seq to speed up the measurements:
0x00, # 0x6c : Intermeasurement period MSB, 32-bit register
0x00, # 0x6d : Intermeasurement period
0x00, # 0x0F, # 0x6e : Intermeasurement period ##### I hacked this line from 0x0F to 0x00
0x89, # 0x6f : Intermeasurement period LSB
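For reference, the four registers 0x6c..0x6f above hold one big-endian 32-bit value (the defaults form 0x00000F89; my hack makes it 0x00000089), so a small helper could compute the four init_seq bytes from a raw value instead of hand-editing them. This is only a sketch: the intermeasurement_bytes name is mine, and converting a desired period in milliseconds into this raw count still needs the scaling from the sensor datasheet.

```python
# Hypothetical helper (not part of the library): split a raw 32-bit
# intermeasurement value into the four bytes stored at registers
# 0x6c..0x6f, MSB first, matching the init_seq layout above.
def intermeasurement_bytes(raw):
    """Return the (0x6c, 0x6d, 0x6e, 0x6f) byte values for a 32-bit register."""
    return ((raw >> 24) & 0xFF, (raw >> 16) & 0xFF, (raw >> 8) & 0xFF, raw & 0xFF)

# Library default vs. the hacked value:
print(intermeasurement_bytes(0x00000F89))  # -> (0x00, 0x00, 0x0F, 0x89)
print(intermeasurement_bytes(0x00000089))  # -> (0x00, 0x00, 0x00, 0x89)
```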