Closed: se5a closed this issue 6 years ago
Even just something really simple like this:
float[] _testArray = new float[11] { 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 };
ImGui.PlotHistogram("##testgram", _testArray, 0, "", 0f, 11f, new ImVec2(), 1);
gives a result like this:
Stride is the size in bytes, so the correct code for the above is:
ImGui.PlotHistogram("##testgram", _testArray, 0, "", 0f, 11f, new ImVec2(), 1 * sizeof(float));
I've been struggling to figure this out while trying to display a game physics frame rate as a histogram. I'm storing the length of each frame in a float[] array. I had the idea that the valueOffset could be used to position the 'start' of the array when drawing the histogram, so I sized the array to 120 and stored a currentIndex for the position of the last write. When currentIndex reached array.Length (120) I reset it to 0 and overwrote the existing array positions, and I used currentIndex as the valueOffset. Is this correct? No matter what I do, it seems to draw the most recent value somewhere in the middle of the plot. I've looked at some of the C examples, but most of them seem to pass a fixed number for the valueOffset, so I'm not sure what I'm supposed to be doing or how this is supposed to work.
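Regarding the circular buffer: that is essentially the pattern the Dear ImGui demo uses for scrolling plots. The valueOffset is the index of the first sample to draw, and the plot wraps around the end of the array, so passing the index of the next slot you are about to overwrite (i.e. the oldest sample) puts the newest sample at the right-hand edge. A minimal sketch, assuming the same binding overload as the calls above, with frameSeconds standing in for however you measure the physics frame time:

float[] _frameTimes = new float[120];
int _writeIndex = 0;

void RecordFrame(float frameSeconds)
{
    _frameTimes[_writeIndex] = frameSeconds;
    _writeIndex = (_writeIndex + 1) % _frameTimes.Length; // wraps back to 0 after 119
}

void DrawPlot()
{
    // _writeIndex now points at the oldest sample, so using it as the valueOffset
    // makes the plot start there, wrap around the array, and end on the newest
    // sample at the right edge. 0.05f (50 ms) is just an arbitrary ceiling for
    // this example.
    ImGui.PlotHistogram("##frametimes", _frameTimes, _writeIndex, "",
                        0f, 0.05f, new ImVec2(), 1 * sizeof(float));
}

If you were passing the index of the last write rather than the next one, the plot would only be off by one sample; the newest value landing "somewhere in the middle" is much more likely a symptom of the 1-byte stride.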
The other problem I was having was with minScale and maxScale: how do these work? If I check the array that I'm trying to display, all of its values are < 1.0, yet the histogram repeatedly shows bars of > 1.0, and sometimes even negative numbers, even though no such values exist in the array. I've been messing with this for a day now and I must be missing something; I've tried all sorts of things and feel more confused about how this works than before I started.
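On minScale/maxScale (scale_min/scale_max in the C API): they only fix the vertical axis of the plot. Values outside the range are clamped when drawn, not filtered out, and passing FLT_MAX (float.MaxValue from C#) for either end asks ImGui to auto-fit that end from the data. The impossible bars (> 1.0 and negative) are exactly what the misaligned 1-byte-stride reads would look like, not values from your array. A sketch of both scaling options, reusing the hypothetical buffer from the previous example:

// Fixed axis: 0 at the bottom, 1.0 at the top; out-of-range values are clamped.
ImGui.PlotHistogram("##fixedScale", _frameTimes, _writeIndex, "",
                    0f, 1f, new ImVec2(), 1 * sizeof(float));

// Auto-fit: float.MaxValue (FLT_MAX on the C side) tells ImGui to derive the
// min and/or max from the data every frame.
ImGui.PlotHistogram("##autoScale", _frameTimes, _writeIndex, "",
                    float.MaxValue, float.MaxValue, new ImVec2(), 1 * sizeof(float));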