On my (non-HD) Kindle Fire, in this line of AndroidGame.java:
float scaleY = (float) frameBufferHeight
/ getWindowManager().getDefaultDisplay().getHeight();
getHeight() returns 1024 (in portrait, say), and scaleY is computed from that number.
However, in AndroidFastRenderView.java, this line:
canvas.getClipBounds(dstRect);
fills dstRect with top = 0 and bottom = 1004 (not 1024).
The missing 20 pixels are eaten up by the Kindle's "bottom bar".
As a result, the 320x480 framebuffer is drawn onto a smaller surface than
the scaling math assumed.
If you plot a touch event onto the framebuffer, you will see it drift off by a
small amount as you move toward the bottom of the display, due to this
discrepancy (whereas at the top it is exactly dead-on).
I honestly do not know how to detect, in a device-independent manner, whether
a "bottom bar" will be present. If you could, and could query how many pixels
it occupies, you could account for that in the scaling math.
Alternatively, perhaps the initial scaling math could be used as a starting
point, with a mechanism exposed to update the scale factor on the fly by
recomputing it from dstRect.height() in the run() loop (in
AndroidFastRenderView.java).
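A sketch of that idea follows. The helper class, its name, and the setScaleY() hook back into AndroidGame are all hypothetical (not the book's actual code); only the dstRect/getClipBounds usage comes from the report above. The scale computation is factored into a pure method so it can be checked in isolation:

```java
// Sketch only: recompute the touch scale each frame from the surface height
// actually drawn to, instead of the display height queried up front.
public class ScaleHelper {
    // frameBufferDim: fixed framebuffer dimension (e.g. 480).
    // surfaceDim: the drawn surface dimension, e.g. dstRect.height() = 1004.
    public static float computeScale(int frameBufferDim, int surfaceDim) {
        return (float) frameBufferDim / surfaceDim;
    }
}

// Inside run() in AndroidFastRenderView it might look like this
// (comments only; setScaleY() is a hypothetical hook on AndroidGame):
//
//   canvas.getClipBounds(dstRect);
//   game.setScaleY(ScaleHelper.computeScale(framebuffer.getHeight(),
//                                           dstRect.height()));
//   canvas.drawBitmap(framebuffer, null, dstRect, null);
```

Because getClipBounds() reflects what the compositor actually gives the view each frame, this would absorb the bottom bar (and any future inset) without needing to detect it explicitly.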
Thanks for an awesome book!
Original issue reported on code.google.com by samb...@gmail.com on 1 Jan 2013 at 2:36