I've used the plugin for a while and generally have my bed dialed in where I want it, but I do like to check occasionally to make sure it's relatively level. I have G-code added to run a bed level on every print. What throws me off now is the autoscaling of the color scheme for the heatmap.
What I'd like is to be able to set my own scale, and have the min and max colors cover everything outside the user-set range. Say I'm pretty sure I'm within 0.10, so I set that as the scale in the plugin. When the plugin gets hold of the real values, it would do something like this:
1. I'll assume the color scale has 7 values.
2. The user-set scale is 0.10; call that uVar, for user variance.
3. Get the values from the printer, find the min and max, and calculate the midpoint; call that mid.
4. Calculate the min and max for the color scale (cMin and cMax): cMin = mid - (uVar/2) and cMax = mid + (uVar/2).
5. All values lower than cMin get the lowest color in the scale; all values higher than cMax get the highest color.
6. The remaining five colors are allocated similarly to how I think they are now, but using only five values to cover the range between cMin and cMax (sketched in code below).
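Here's a minimal sketch of what I mean, assuming a 7-color scale; the names (`bin_colors`, `u_var`, a flat list of mesh values) are just placeholders I made up, not anything from the plugin:

```python
def bin_colors(values, u_var=0.10, n_colors=7):
    """Map each mesh value to a color index 0..n_colors-1.

    The two outermost colors catch everything outside the user-set
    range; the middle n_colors - 2 colors split [cMin, cMax] evenly.
    Assumes u_var > 0 and n_colors >= 3.
    """
    mid = (min(values) + max(values)) / 2
    c_min = mid - u_var / 2
    c_max = mid + u_var / 2
    inner = n_colors - 2                  # 5 colors between cMin and cMax
    step = (c_max - c_min) / inner
    indices = []
    for v in values:
        if v < c_min:
            indices.append(0)             # below the user range: lowest color
        elif v > c_max:
            indices.append(n_colors - 1)  # above the user range: highest color
        else:
            # bucket within [cMin, cMax], shifted past the "below" color
            k = min(int((v - c_min) / step), inner - 1)
            indices.append(1 + k)
    return indices
```

The nice property is that the bucket width depends only on uVar, so the color-to-height mapping keeps the same width from run to run; the range is just re-centered on each mesh's midpoint, and anything drifting outside it saturates to the end colors.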
If I set my scale value wrong, this could lead to a worthless heatmap, but that's my fault. The heatmap would still show me the max, min, and total variance, so I'd have enough information to reset my scale to an appropriate value. If I do this right, I think it makes successive runs while debugging much more intuitive, because the scale stays constant until I change it.