Closed · danamlewis closed this issue 7 years ago
Dana, I think we are having a similar idea, maybe we can coordinate. My observation was similar-- I find that loop low temps me overnight and I suspect that some of that basal, pre-loop, was corrective and not really basal. Now that I have the data, though, I should be able to use the data to say what my true basal is instead of guessing at it.
I started a report in the same format as the Hourly Stats report, a Vertical Floating Bar Chart, thinking that seeing an average over time would be helpful. Do you think that's the best format? Suggestions? :)
One difficulty I'm having is the terminology of netIOB. netIOB assumes that your basal is already correctly set, so your basal causes no +/- in your BG, which we know is not a totally true assumption. I would really like to get to a place where we don't say basal/bolus, and instead think of one stream of insulin, as in a real pancreas model! Simpler, and more true.
netIOB doesn't necessarily assume the basal is correctly set; it just treats your scheduled basal as the zero point, so anything above or below that level counts as positive or negative compared to what you normally get at that time.
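To make that zero-point convention concrete, here's a minimal sketch (the function name, rates, and treatment shape are made up for illustration, not taken from the Nightscout code):

```javascript
// "Scheduled basal is the zero point": only the difference between the temp
// rate and the scheduled rate counts toward net basal.
function netBasalDelta(tempRate, scheduledRate, durationMinutes) {
  // Units of insulin above (+) or below (-) what the schedule alone would give
  return (tempRate - scheduledRate) * (durationMinutes / 60);
}

// With a scheduled basal of 1 U/hr:
netBasalDelta(2.0, 1.0, 30); // a 30-min temp at 2 U/hr → +0.5 U net
netBasalDelta(0,   1.0, 30); // a 30-min zero temp     → -0.5 U net
```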
I see your point, although I think it'll be a really hard model for people to understand, since none of the modern pumps calculate netIOB of any kind, let alone net from zero, where even baseline basal activity counts as positive. So maybe stage 1 is netIOB with basals as the baseline, and a later stage 2 option socializes the idea that all insulin is positive netIOB?
Can you share some screenshots of what you're talking about having started, in either case? :) Thanks!
No screenshots yet, but I have a commit with a new report, rendering in dev, pulling the treatment objects and filtering them into 24 slots for the 24 hours.
Now I'm trying to wrangle the data!
In a data set like this:

```
{"timestamp":"2017-02-02T08:03:53Z","rate":0,"duration":30},
{"timestamp":"2017-02-02T08:54:02Z","rate":4.025,"duration":4.983333333333333},
{"timestamp":"2017-02-02T08:59:01Z","rate":0,"duration":0},
```
Should I assume the 0 temp lasted 30 minutes, then back to the normal basal rate from 8:33 to 8:54, a temp basal from 8:54 to 8:59, then back to 0 for the last minute, 8:59 to 9:00?
In the case of a 30-minute duration, we'll need a check that another temp basal doesn't start before the 30 minutes have passed. That last 1 minute is trickier: do I add a check that all 60 minutes in the hour have been slotted with some data? Not sure how to attack that one.
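One way to sketch that first check, assuming the treatment shape shown above (`timestamp`, `rate`, `duration` in minutes, sorted by time); the function name is hypothetical:

```javascript
// Clip each temp basal's stated duration if another temp starts sooner,
// since a new temp supersedes the running one.
function effectiveDurations(temps) {
  return temps.map((t, i) => {
    const start = Date.parse(t.timestamp);
    const next = temps[i + 1];
    let minutes = t.duration;
    if (next) {
      const untilNext = (Date.parse(next.timestamp) - start) / 60000;
      minutes = Math.min(minutes, untilNext); // superseded early
    }
    return { ...t, effectiveDuration: minutes };
  });
}
```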
A more typical set is more like this:

```
{"timestamp":"2017-02-03T08:09:18Z","rate":2.55,"duration":5.033333333333333},
{"timestamp":"2017-02-03T08:14:20Z","rate":3.675,"duration":4.883333333333334},
{"timestamp":"2017-02-03T08:19:13Z","rate":4.95,"duration":5.083333333333333},
{"timestamp":"2017-02-03T08:24:18Z","rate":4.525,"duration":4.683333333333334},
{"timestamp":"2017-02-03T08:28:59Z","rate":3.575,"duration":4.916666666666667},
{"timestamp":"2017-02-03T08:33:54Z","rate":3.125,"duration":5.083333333333333},
{"timestamp":"2017-02-03T08:38:59Z","rate":1.925,"duration":5.266666666666667},
{"timestamp":"2017-02-03T08:44:15Z","rate":0,"duration":0},
{"timestamp":"2017-02-03T08:59:13Z","rate":2,"duration":5.016666666666667},
```
In this set, the slots are not exactly 5 minutes apart, which I find odd. Also, how can I account for the time between 8:00 and 8:09? Should I assume normal basal, or delete those 9 minutes from the average? The latter seems more accurate. On the 0 temp here, too, there's a case where you can't use the 0 duration; you'd need to calculate the duration by subtracting the next timestamp somehow.
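A rough sketch of how the whole hour could be walked, under the "gaps run at normal basal" assumption. Everything here is hypothetical: the function name, the flat `scheduledRate` (a real report would look the rate up from the profile for that time of day), and the choice to let a 0-duration record last until the next record or the end of the hour:

```javascript
// Sum insulin delivered in one hour: temps cover their effective windows,
// and any uncovered time runs at the scheduled basal rate.
function hourlyInsulin(temps, hourStartISO, scheduledRate) {
  const hourStart = Date.parse(hourStartISO);
  const hourEnd = hourStart + 3600000;
  let total = 0;
  let cursor = hourStart;
  temps.forEach((t, i) => {
    const start = Date.parse(t.timestamp);
    // Gap before this temp runs at the scheduled rate
    if (start > cursor) total += scheduledRate * (start - cursor) / 3600000;
    const next = temps[i + 1];
    // A 0-duration record lasts until the next record (or end of hour)
    const stop = Math.min(
      t.duration > 0 ? start + t.duration * 60000 : Infinity,
      next ? Date.parse(next.timestamp) : hourEnd,
      hourEnd
    );
    total += t.rate * (stop - start) / 3600000;
    cursor = stop;
  });
  // Remainder of the hour at scheduled basal
  if (cursor < hourEnd) total += scheduledRate * (hourEnd - cursor) / 3600000;
  return total; // units delivered in that hour
}
```

Applied to the first data set above, this reproduces the reading in question: 0 temp for 30 minutes, scheduled basal from 8:33 to 8:54, the 4.025 temp from 8:54 to 8:59, then the trailing 0 temp through 9:00.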
Basically, a lot to consider, but doable. Right now any treatment with a duration that rounds to 5 or 10 minutes gets added to a sum for that hour. It's a start!
The hourly stats report is the template I started from, looks like this: https://www.dropbox.com/s/p34dgqravuowrff/Screenshot%202017-02-08%2016.13.11.png?dl=0
I think the table looks really similar to the table you posted, so that's a good match.
I'm questioning whether I can calculate a standard deviation the way Hourly Stats does, and whether that's the best way to display the insulin info on the graph. Instead of just adding and subtracting the deviation from the average, it would be better to show actual dosage amounts in a cluster, creating a kind of heat map (for each hour) of what your actual dosages are.
In order to evaluate basal rates, at this point the easiest way for DIY closed loopers to do that is to see how much net basal IOB was given in each hour. However, other than eyeballing or manually adding up amounts, there's no existing tool. I think the easiest way to achieve this (before the autotuning algorithm detailed in https://github.com/openaps/oref0/issues/261#issuecomment-269412654 is completed) will be to render an hourly sum of net basal IOB in Nightscout reports.
For each hour, it should display the positive basal IOB, negative basal IOB, and net basal IOB. It would be nice to run it for a single day, or to get aggregate amounts for a time frame (like a 30-day average of each hour for each column) to help people decide whether to change underlying basals. (Note, of course, that during common meal hours you probably wouldn't use this for basal adjustment.)
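The three columns could be accumulated along these lines; this is a sketch with made-up names, assuming records have already been normalized (as discussed above) to carry an hour slot, the temp rate, the scheduled rate, and an effective duration:

```javascript
// Accumulate positive, negative, and net basal (relative to the scheduled
// rate) into per-hour buckets.
function hourlyNetBasal(records) {
  const hours = {};
  records.forEach(r => {
    const h = hours[r.hour] || (hours[r.hour] = { pos: 0, neg: 0, net: 0 });
    const delta = (r.rate - r.scheduledRate) * r.durationMinutes / 60;
    if (delta > 0) h.pos += delta; else h.neg += delta;
    h.net += delta;
  });
  return hours;
}
```

A consistently negative net for the same hour across 30 days would be the signal that the underlying basal for that hour is probably set too high.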
Example visual of the columns and hourly breakdown: https://twitter.com/danamlewis/status/645679089433481216?ref_src=twsrc%5Etfw