Kalhama / Gluwave

Web based open loop application for diabetes management
https://iob.kalhama.fi

Observed carbs #21

Closed. Kalhama closed this issue 5 days ago.

Kalhama commented 4 weeks ago

Current progress looks like this, but I don't think it fits on the main screen.

[Screenshot: 2024-08-22 at 20:49:21]

Other remarks

Kalhama commented 3 weeks ago

I also played with a cumulative view, but it is not very useful once the integration error grows too large. In the example below, ISF was high during the night and low after waking up. If these errors had the same directionality they would accumulate quite badly. Also, the y-axis resolution deteriorates on high carb loads.

[Screenshot: 2024-08-28 at 12:59:38]

I think the best solution is to implement something similar to Loop, so observed carbs are always split across the corresponding meal entries. This would also yield a more meaningful estimate of carbs on board.
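To spell out the split I have in mind with made-up numbers (two active meal entries, 10 g of carbs observed over the interval; the rates follow the value / decay / 1.5 idea from the query below):

-- Illustrative only: split 10 g of observed carbs between two active meals
-- in proportion to their minimum absorption rates.
-- Meal A: 60 g over 180 min -> 60 / 180 / 1.5 ≈ 0.22 g/min
-- Meal B: 20 g over  90 min -> 20 /  90 / 1.5 ≈ 0.15 g/min
SELECT
  meal,
  10.0 * min_rate / SUM(min_rate) OVER () AS observed_carbs  -- A gets 6 g, B gets 4 g
FROM (VALUES
  ('A', 60.0 / 180 / 1.5),
  ('B', 20.0 /  90 / 1.5)
) AS meals(meal, min_rate);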

I played with an SQL query. However, I'm not a fan of all the fussing with timestamps. Unfortunately the carb entries arrive whenever the user inputs data, and the BG data is also unreliable and not necessarily spaced exactly one minute apart, so the least common denominator is something very low, like one minute ;).

Perhaps sufficient timestamps would be: every time a) carbs start, b) carbs end, and c) NOW(). The only edge case (which is also present here) is when absorbed_carbs.timestamp (= glucose.timestamp) does not line up with this series very well. I guess we could counteract this by linearly interpolating the missing glucose readings.
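A minimal sketch of that interpolation, assuming a glucose(timestamp, value) table; :t stands for the timestamp we are missing a reading for:

-- Illustrative only: linearly interpolate glucose at :t from the nearest
-- readings before and after it.
WITH prev AS (
  SELECT timestamp, value FROM glucose WHERE timestamp <= :t ORDER BY timestamp DESC LIMIT 1
), next AS (
  SELECT timestamp, value FROM glucose WHERE timestamp >  :t ORDER BY timestamp ASC  LIMIT 1
)
SELECT
  prev.value
  + (next.value - prev.value)
    * EXTRACT(EPOCH FROM (:t - prev.timestamp))
    / NULLIF(EXTRACT(EPOCH FROM (next.timestamp - prev.timestamp)), 0) AS interpolated_glucose
FROM prev, next;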

WITH minutes AS (
  SELECT generate_series(:from, NOW(), interval '1 minute') AS timestamp  -- :from = start of the window
), carb_minimum_absorption_rate AS (
  SELECT
    minutes.timestamp AS timestamp,
    decay,
    value,
    id,
    value / decay / 1.5 AS minimum_absorption_rate,
    SUM(value / decay / 1.5) OVER (PARTITION BY minutes.timestamp) AS total_minimum_absorption_rate /* if there are multiple meal entries for one timestamp, sum all of their minimum_absorption_rates */
  FROM minutes
  LEFT JOIN carbs ON minutes.timestamp < carbs.timestamp AND minutes.timestamp + interval '1 minute' >= carbs.timestamp
), carb_minimum_absorption_rate_observed_carbs AS (
  SELECT
     carb_minimum_absorption_rate.timestamp AS timestamp,
     decay,
     value,
     id,
     minimum_absorption_rate,
     total_minimum_absorption_rate,
     minimum_absorption_rate / total_minimum_absorption_rate
       * (EXTRACT(EPOCH FROM interval '1 minute') / EXTRACT(EPOCH FROM observed_carbs.interval) * observed_carbs.rate) AS observed_carbs
  FROM carb_minimum_absorption_rate
  LEFT JOIN observed_carbs
    ON observed_carbs.timestamp <= carb_minimum_absorption_rate.timestamp
    AND (observed_carbs.timestamp + observed_carbs.interval) > carb_minimum_absorption_rate.timestamp /* TODO: this should extend by carb.decay * 150% */
)
SELECT * FROM carb_minimum_absorption_rate_observed_carbs;

/* TODO: GROUP BY id to get observed carbs per meal entry */
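For that last TODO, the final SELECT could be swapped for a grouped one so each meal entry gets its cumulative observed carbs; a rough sketch on top of the CTEs above:

-- Sum the per-minute observed carbs for each meal entry (id).
SELECT
  id,
  SUM(observed_carbs) AS observed_carbs_total
FROM carb_minimum_absorption_rate_observed_carbs
WHERE id IS NOT NULL  -- skip minutes with no active meal entry
GROUP BY id
ORDER BY id;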
Kalhama commented 3 weeks ago

Another option from the Loop algorithm would be stacking carbs temporally: if I have just eaten a large meal and then eat more, digestion is just going to run longer, not higher.

An exception to this would be fast carbs, which follow a different pathway and thus do stack.

Edit: After thinking about this overnight, I think this can be achieved by adding more carbs and decay time to the already existing meal entry.
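A rough sketch of what that could look like, assuming the carbs(id, timestamp, value, decay) table from the query above with decay in minutes; the 30 g and the "latest active entry" selection are just for illustration:

-- Fold a newly eaten 30 g into the meal entry that is still being absorbed:
-- add the carbs and stretch the decay so the absorption rate stays the same.
UPDATE carbs
SET
  value = value + 30,
  decay = decay + round(30.0 * decay / value)  -- extra minutes needed at the old rate (value / decay)
WHERE id = (
  SELECT id FROM carbs
  WHERE timestamp + decay * interval '1 minute' > NOW()  -- entry still active
  ORDER BY timestamp DESC
  LIMIT 1
);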

Kalhama commented 3 weeks ago

> Perhaps sufficient timestamps would be: every time a) carbs start, b) carbs end, and c) NOW(). The only edge case (which is also present here) is when absorbed_carbs.timestamp (= glucose.timestamp) does not line up with this series very well. I guess we could counteract this by linearly interpolating the missing glucose readings.

I continued playing with these ideas here: #60554d, query_all_metrics_at_any_timeframe.sql.

This demonstrates a neat way to calculate glucose, total_carbs_absorbed (predicted) and total_insulin_absorbed at arbitrary timesteps. With these we can calculate total_carbs_absorbed (observed).
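To make that last step concrete, this is roughly the calculation I have in mind on top of that query's output (the metrics relation name and the constant ISF / ICR values here are placeholders, nothing is fixed in the query yet):

-- Illustrative only: per-timestep observed carbs from glucose and insulin absorption.
-- Assumes one row per timestep with glucose and total_insulin_absorbed columns,
-- and made-up constants ISF = 2.0 mmol/l per U, ICR = 10 g per U.
SELECT
  timestamp,
  (
    (glucose - LAG(glucose) OVER w)                                        -- actual glucose change
    + (total_insulin_absorbed - LAG(total_insulin_absorbed) OVER w) * 2.0  -- add back the insulin effect (ISF)
  ) / (2.0 / 10.0) AS observed_carbs                                       -- convert to grams via ISF / ICR
FROM metrics
WINDOW w AS (ORDER BY timestamp);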

Now with these four metrics we can continue with splitting the observed carbs across the individual meals.

TODO for query_all_metrics_at_any_timeframe.sql

Spin off a couple of new enhancements.