Could you test compiling it on the Raspberry Pi for PLATFORM_DESKTOP_GLFW?
Yes I can, I'll have a look and get back.
Surprising results on Raspbian desktop. I took the controls_test_suite (https://github.com/gen2brain/raylib-go/tree/master/examples/gui/controls_test_suite) and added a camera2D rotation of -90 degrees. Both mouse and touch are ignoring the camera rotation. And if the mouse is attached at the same time I touch the screen, the cursor goes nuts as the absolute and relative inputs are competing with each other (which is expected, sort of).
I remembered the display I had as SPI, but it's actually an HDMI screen whose touch connects to the Pi via an additional USB cable, which then supplies touch to the system via mouse emulation (I'm guessing, or multi-touch isn't supported on the display), as the gestures demo doesn't register any gestures.
I've tried two methods, and both have the same end result (mouse coordinates are correct, but touch reacts only to the original, non-rotated locations).
The first method is via the camera, enclosing the drawing in Mode2D >>
rl.BeginMode2D(rl.Camera2D{
    Target:   rl.Vector2{X: float32(screenHeight) / 2, Y: float32(screenWidth) / 2},
    Offset:   rl.Vector2{X: float32(screenWidth) / 2, Y: float32(screenHeight) / 2},
    Rotation: -90, // Rotate 90 degrees counterclockwise
    Zoom:     1.0,
})
// ... draw the GUI here ...
rl.EndMode2D()
The second method is with a matrix transform >>
rl.PushMatrix()
rl.Translatef(float32(screenWidth)/2, float32(screenHeight)/2, 0)
rl.Rotatef(-90, 0, 0, 1)
rl.Translatef(-float32(screenHeight)/2, -float32(screenWidth)/2, 0)
// ... draw the GUI here ...
rl.PopMatrix()
On the Pi, the mouse coordinates are unrotated, just like the touch, but on the MPC Live the mouse is hitting the rotated GUI elements.
Would love to have a raylib example where the view is rotated along with the input (or the colliders) the proper way, as my hack is extra dirty (but works!).
I'm actually puzzled about the mouse coords getting rotated
So the touch is actually ok, it's just mouse (relative) coordinates that are bugged on the SPI interface? welp.
IMHO, both mouse and touch input should hit the rotated elements, especially with GUI.
[EDIT] I should word this differently as I don't understand the internals of the library properly and I'm just going for what I've seen in the source and on screen.
The mouse coordinates are not 'rotated'; the mouse is simply hitting the rotated elements correctly. Touch, by comparison, acts like the GUI is not rotated: you have to hit the original location (now empty space) to trigger the elements, which is 100% wrong.
This may take some time tho.
For me, this is not a priority as I got it working by setting up a dedicated build chain for the MPC Live with the hacked library and it works perfectly now, but the hack is ugly and will break in the future.
From my developer pov, the expectation for a view transform is that items are actually where they visually appear to be. Just like the main example on the Gio GUI website (https://gioui.org/): when you press the free transform checkbox, you can still hit the elements even when the whole UI is floating.
So for me, whatever is happening to the mouse coordinates on the MPC Live, it is correct and what happens on the Pi is an unwanted effect. :D
you're not using any OS/System-level call to rotate the screen
Correct. I did find a command for the DRM/KMS driver that could rotate the view on driver level but I don't know how to call it or use it. I'm just using Raylib's built-in rotation.
This is on DRM with an SPI touch display?
Because on desktop, everything works a-ok. But on my MPC Live screen, mouse coordinates are indeed correct and hit their targets, just the touch is wonky. But I do have the hack running so I guess I can live with that.
[EDIT] Also, you're not rotating the view using either of the methods above, or is that some method I have not seen anywhere(?)
That would definitely indicate that either my display is a special case or the Go bindings are doing something on their own in between. If you can, test also with a UI component - in the meantime, I'll figure out a way to try and get your code sample to build and run on the device. Also, huge thank you for taking the time and testing this!
Oh crappiola maximus. I was so hoping that the problem would be either in the Go bindings or some skipped setup step which would be easy to fix. I'm running the C example right now and it's exhibiting the same issue, where the red circle is following my finger but the blue circle is going in the original unrotated axis. My fear is that the display is somehow forced on driver level (or something else the end user can't really change) to behave this way as this is a proprietary platform and I'm really not allowed to poke around.
I have my hack in the library working, so I'm sort of covered on this front. Huge kudos @asdqwe for taking the time and looking into this as I've been pulling my hair for the past five days trying to debug this and not knowing what to do, while continuously muttering "there is no way the library was designed this way". I feel like Raylib is working 100% and it's something else behind the scenes forcing the touch events to stay that way.
If needed for posterity, I can record videos of what is happening on the screen.
I built the C example by first building and installing raylib per the instructions, then putting the example into the examples/core folder and running >>
make core/core_drm_rotation PLATFORM=PLATFORM_DRM
[EDIT] Crappiola maximus part II: Running it on the Raspberry Pi gives the same result; the blue circle is going on the original axis, while the red follows the mouse. This on the DRM build running from the console.
Some extra details >>
Specs: Raspberry Pi display is 1024 x 600; MPC Live display is 800 x 1280.
Executables are built on the Raspberry Pi.
No errors or warnings during build, but this is spammed on both devices constantly >>
WARNING: GetCurrentMonitor() not implemented on target platform
WARNING: GetCurrentMonitor() not implemented on target platform
WARNING: GetCurrentMonitor() not implemented on target platform
(...the same line repeats constantly)
Here's a quick video of what's happening >> https://www.youtube.com/watch?v=Ev0X2r9DHC8
Here's the code that's running on that example >>
#include "raylib.h"
int main(void) {
InitWindow(800, 1280, "test");
SetTargetFPS(60);
// Create a surface that will get rotated:
RenderTexture2D surface = LoadRenderTexture(600.0f, 400.0f);
Rectangle sourceRec = { 0.0f, 0.0f, 600.0f, -400.0f };
Rectangle destRec = { 0.0f, 0.0f, 600.0f, 400.0f };
Vector2 origin = { 0.0f, 0.0f };
float rotation = 0.0f;
int toggleRotation = 0;
// Define the position for two buttons:
Rectangle yellowButtonPos = { 0.0f, 0.0f, 50.0f, 50.0f };
Rectangle pinkButtonPos = { 100.0f, 0.0f, 50.0f, 50.0f };
ShowCursor();
while (!WindowShouldClose()) {
// Press [SPACE] to toggle the surface rotation to the left:
if (IsKeyPressed(KEY_SPACE)) {
if (!toggleRotation) {
origin = (Vector2){ 400.0f, 0.0f }; rotation = -90.0f; toggleRotation = 1;
} else {
origin = (Vector2){ 0.0f, 0.0f }; rotation = 0.0f; toggleRotation = 0;
}
}
// Press [E] to enable cursor and [D] to disable cursor:
if (IsKeyPressed(KEY_E)) EnableCursor();
if (IsKeyPressed(KEY_D)) DisableCursor();
// Check if mouse is over yellowButton:
Color yellowButtonColor = YELLOW;
if (CheckCollisionPointRec((Vector2){ GetMouseX(), GetMouseY() }, yellowButtonPos)) yellowButtonColor = GOLD;
// Check if touch is over pink button:
Color pinkButtonColor = PINK;
if (CheckCollisionPointRec((Vector2){ GetTouchX(), GetTouchY() }, pinkButtonPos)) pinkButtonColor = PURPLE;
// Fill the surface:
BeginTextureMode(surface);
ClearBackground(WHITE);
DrawRectangleRec(yellowButtonPos, yellowButtonColor); // Draw yellow button
DrawRectangleRec(pinkButtonPos, pinkButtonColor); // Draw pink button
DrawText(TextFormat("monitor size %i x %i", GetMonitorWidth(GetCurrentMonitor()), GetMonitorHeight(GetCurrentMonitor())), 0, 50, 10, BLACK); // Draw monitor stats
DrawText(TextFormat("window size %i x %i", GetScreenWidth(), GetScreenHeight()), 0, 65, 10, BLACK); // Draw window stats
DrawText(TextFormat("mouse pos %i x %i", GetMouseX(), GetMouseY()), 0, 80, 10, BLACK); // Draw mouse stats
DrawText(TextFormat("touch pos %i x %i", GetTouchX(), GetTouchY()), 0, 95, 10, BLACK); // Draw tuoch stats
DrawCircleLines(GetMouseX(), GetMouseY(), 5, BLUE); // BLUE = Mouse "virtual" visual representation inside the surface
DrawCircleLines(GetTouchX(), GetTouchY(), 10, GREEN); // GREEN = Touch "virtual" visual representation inside the surface
EndTextureMode();
// Draw to the screen:
BeginDrawing();
ClearBackground(RAYWHITE);
DrawTexturePro(surface.texture, sourceRec, destRec, origin, rotation, WHITE); // Render the surface
DrawCircleLines(GetMouseX(), GetMouseY(), 15, ORANGE); // ORANGE = Mouse "real" visual representation inside the window
DrawCircleLines(GetTouchX(), GetTouchY(), 20, RED); // RED = Touch "real" visual representation inside the window
EndDrawing();
}
UnloadRenderTexture(surface);
CloseWindow();
return 0;
}
Yes, the explanation is exactly what I'm experiencing.
The root question that I've been asking around is >>
How do I make the user hit the GUI buttons when I need to rotate the whole view 90 degrees counterclockwise and the input is hitting the original locations and not the visible GUI elements? That is my core issue and no one seems to have an answer to this.
The application would preferably support touch and mouse at the same time, but touch is the primary method of interaction. With the hack that I mentioned it works, but it's not pretty and certainly not safe (or deployable), as I'm just brute-forcing the numbers inside rcore_drm.c.
The display is mounted in landscape (possibly upside down in other models the app will run on); on the desktop it runs like a normal application and I've had no problems there.
Any method will do. I'd rather not draw onto a render texture since it's an embedded platform, but it should not be a biggie in practice.
Give me a few hours
@asdqwe You really don't need to spend any of your free time to fix my app. :D :D
But it is hugely appreciated if you can come up with a way. \o/
Nice nice. This will give me something to go on; CheckCollisionPointRec() is probably what I'm after. I still don't know how to implement something like that in an example like the controls_test_suite, but I'll dive into the source and figure it out. Thanks so much for these!
The thing that works for me so far is setting up a separate build chain for the ARM device and just brute force rotating the mouse coordinates in rcore_drm.c, which I'll leave here for the next Google adventurer to find >>
// Absolute movement parsing
if (event.type == EV_ABS)
{
    // Basic movement
    // NOTE: the hack swaps the X/Y axes and flips one of them so the
    // absolute (touch) input lines up with the rotated view
    if (event.code == ABS_X)
    {
        CORE.Input.Mouse.currentPosition.y = (event.value - platform.absRange.x)*CORE.Window.screen.width/platform.absRange.width;    // Scale according to absRange
        CORE.Input.Touch.position[0].y = (event.value - platform.absRange.x)*CORE.Window.screen.width/platform.absRange.width;        // Scale according to absRange
        touchAction = 2;    // TOUCH_ACTION_MOVE
    }

    if (event.code == ABS_Y)
    {
        CORE.Input.Mouse.currentPosition.x = CORE.Window.screen.height - ((event.value - platform.absRange.y)*CORE.Window.screen.height/platform.absRange.height);    // Scale according to absRange
        CORE.Input.Touch.position[0].x = CORE.Window.screen.height - ((event.value - platform.absRange.y)*CORE.Window.screen.height/platform.absRange.height);        // Scale according to absRange
        touchAction = 2;    // TOUCH_ACTION_MOVE
    }
.. it's stupid, but it works.
@asdqwe How common is this case, requiring manual mouse/touch rotation to match the display? Maybe a compilation flag could be added to cover those cases?
but did the last test case work?
It gives me a visual of where the cursor should be but no, it does not solve the problem of pressing GUI buttons, sadly.
I'll add my two cents coming from the game development side. Normally, a view (aka presentation) should be separate from the screen, as we can never know how the physical screen is mounted (the OS itself can't know this) or what it supports. But this adds a circus, exactly as @asdqwe mentioned.
How I would solve this >> allow stuff like GUI elements to be attached to a transform, and then allow rotating the transform so that the elements are actually in the places where they appear to be. On the development side, this would just be a 2D Rect Transform, similar to how Unity3D does it (scale/rotation/position offset), and the GUI rect just takes that transform into account when it does placement and collision checks. A sketch of the idea is below.
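A minimal sketch of that idea with plain raylib (not raygui), using GetScreenToWorld2D() to run the raw pointer through the inverse of the view transform before the collision check; sizes and names here are just for illustration >>
#include "raylib.h"

int main(void)
{
    InitWindow(800, 480, "rotated hit-test sketch");
    SetTargetFPS(60);

    // View transform: rotate the whole UI 90 degrees counterclockwise
    // around the screen center:
    Camera2D cam = { 0 };
    cam.target = (Vector2){ 400.0f, 240.0f };
    cam.offset = (Vector2){ 400.0f, 240.0f };
    cam.rotation = -90.0f;
    cam.zoom = 1.0f;

    Rectangle button = { 100.0f, 100.0f, 120.0f, 40.0f };

    while (!WindowShouldClose())
    {
        // Map the raw screen-space pointer back into the rotated view's
        // space, so the hit test matches what the user actually sees:
        Vector2 hit = GetScreenToWorld2D(GetMousePosition(), cam);
        bool over = CheckCollisionPointRec(hit, button);

        BeginDrawing();
            ClearBackground(RAYWHITE);
            BeginMode2D(cam);
                DrawRectangleRec(button, over ? GOLD : YELLOW);
            EndMode2D();
        EndDrawing();
    }

    CloseWindow();
    return 0;
}
On a touch-first build, the same mapping would have to wrap GetTouchPosition(0) wherever touch arrives in raw screen coordinates.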
EDIT: The most correct solution would be a DRM/KMS call which properly rotates the display.
In the kernel, the relevant helpers are drm_plane_create_rotation_property() and drm_rotation_simplify() (I have zero idea how to use them), but it is not guaranteed that the driver on more exotic systems will support this, even though they support KMS.
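For reference, drm_plane_create_rotation_property() is a kernel-internal helper; from user space the closest equivalent I know of is setting the plane's "rotation" property through libdrm, assuming the driver actually exposes it (many don't). A hedged, untested sketch >>
// Build with: cc rotate_plane.c -o rotate_plane $(pkg-config --cflags --libs libdrm)
#include <stdio.h>
#include <string.h>
#include <stdint.h>
#include <fcntl.h>
#include <unistd.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

#ifndef DRM_MODE_ROTATE_270
#define DRM_MODE_ROTATE_270 (1 << 3)    // from drm_mode.h on newer kernels
#endif

// Look for a property literally named "rotation" on the plane and set it:
static int set_plane_rotation(int fd, uint32_t plane_id, uint64_t rotation)
{
    drmModeObjectProperties *props = drmModeObjectGetProperties(fd, plane_id, DRM_MODE_OBJECT_PLANE);
    if (!props) return -1;

    int ret = -1;
    for (uint32_t i = 0; i < props->count_props; i++)
    {
        drmModePropertyRes *p = drmModeGetProperty(fd, props->props[i]);
        if (p && (strcmp(p->name, "rotation") == 0))
            ret = drmModeObjectSetProperty(fd, plane_id, DRM_MODE_OBJECT_PLANE, p->prop_id, rotation);
        drmModeFreeProperty(p);
        if (ret == 0) break;
    }
    drmModeFreeObjectProperties(props);
    return ret;
}

int main(void)
{
    int fd = open("/dev/dri/card0", O_RDWR);    // needs DRM master rights
    if (fd < 0) { perror("open"); return 1; }

    drmSetClientCap(fd, DRM_CLIENT_CAP_UNIVERSAL_PLANES, 1);    // expose all planes

    drmModePlaneRes *planes = drmModeGetPlaneResources(fd);
    if (planes)
    {
        for (uint32_t i = 0; i < planes->count_planes; i++)
            if (set_plane_rotation(fd, planes->planes[i], DRM_MODE_ROTATE_270) == 0)
                printf("rotated plane %u\n", planes->planes[i]);
        drmModeFreePlaneResources(planes);
    }

    close(fd);
    return 0;
}
Note that this only rotates the scanout, not the evdev touch coordinates, so input would still need the same kind of mapping.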
Do you mean that on the test case, hovering the yellow/pink buttons doesn't change them to orange/purple? Or do you mean using raygui? Because raygui calls GetMousePosition() directly, so it completely bypasses the client implementation.
This exactly. It works if I construct the buttons manually, but it won't help with RayGUI.
Found a solution that uses raygui and calculates rotation exclusively on the client-side.
I think you cracked it!! asdfsafdaasf \o/
Works 100% on the screen and the button is pressed. This is absolute perfection: there's no need to hack the library, and since this is such a special case, I was really worried it would just add unnecessary work to somehow implement this in the actual library and possibly break other things in the process. This solves it (at least for me).
Has this been closed actually?
@asdqwe thank you very much for all the time put into investigating this issue and finding the culprit! About your points:
- Rotations are not uncommon, however they should be handled exclusively on the client-side...
Agree.
- We shouldn't (ever) change the raw coords at all...
Agree. Still, note that raylib provides some functions to scale and offset the raw mouse data; I found them very convenient while dealing with raygui windows. Also, the mouse-touch mapping was added for convenience and simplicity when compiling the same applications for Desktop and Android. A hedged sketch of that adjustment follows below.
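Presumably the functions meant are SetMouseOffset() and SetMouseScale(); a minimal sketch, with the window and logical sizes made up for illustration >>
#include "raylib.h"

int main(void)
{
    // A 640x480 window whose game logic works in 320x240 coordinates:
    InitWindow(640, 480, "mouse scale+offset sketch");
    SetTargetFPS(60);

    // Raw mouse data is reported as (raw + offset)*scale, so after this
    // GetMousePosition() returns values in the 320x240 logical space:
    SetMouseOffset(0, 0);
    SetMouseScale(0.5f, 0.5f);

    while (!WindowShouldClose())
    {
        Vector2 m = GetMousePosition(); // already offset+scaled

        BeginDrawing();
            ClearBackground(RAYWHITE);
            DrawText(TextFormat("logical mouse: %.0f x %.0f", m.x, m.y), 10, 10, 20, BLACK);
        EndDrawing();
    }

    CloseWindow();
    return 0;
}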
- Now, after taking a look at raygui, I think this specific case is more related to raygui...
raygui's aim was to be backend agnostic, so the raylib functions can be re-implemented by users for other backends, but probably the current mechanism to do so (editing the raygui code) is not the most appropriate/intuitive... it needs to be rethought...
The most correct solution would be a DRM/KMS call which will properly rotate the display.
Agree... and it can be added. The new platform-modules design allows adding platform-specific functionality, for example adding a rcore_drm.h header with additional platform functions, so users can do:
#include "raylib.h"
#include "platforms/rcore_drm.h"
// Use additional platform functionality
Nobody has used that possibility yet, despite some platform-specific functions having been added in some modules.
Calling the DRM/KMS rotation works for some devices but not all, as it is a kernel call that has to be supported by the driver (not all drivers do; mine does).
Client-side adjustments are preferred, but convenience methods inside the library would still be super handy; for example, I can't use the posted solution in my app as Go won't bend to that, but that's a Go problem.
Also, adjusting only the GUI elements might not really work if gestures don't understand what is going on (if they are still following the original coordinates; not sure, haven't tried).
But I have a ton of tips to go on, so I'm covered. Huge thanks to everyone. \o/
Has this been closed actually?
Raylib needs a support forum. Badly. And yes, this can be closed.
There's also the Discussions tab (Q&A).
Excellent, this was what I was after when asking the question, felt weird to post it as an issue. Live and learn :D
Discord is a great place for quick questions, but bad for conversations as it's not searchable by Google, so folks like me come in and ask the same questions over and over again.
I'm working on a screen that's mounted horizontally and I need to rotate the view, which is really easy in Raylib. After the rotation, all GUI elements rotate correctly to their respective places and I can click on them with the mouse.
But the screen touch input still only works on the original, unrotated locations, so it does not take into consideration the transform applied to the view. I was able to create a hacked copy of the library where I just brute forced the coordinates by injecting them in the event list, but this of course is not the way to do this. :D
This also might need more testing as I'm not sure if it's only my setup, but it seems any input that goes through the relative input path in rcore_drm.c is fine, while the absolute input does not take the rotation into account.
Running Raylib via Raylib-Go on armhf using a display connected via SPI (MPC Live). I also have a Raspberry Pi with an SPI display I could test this on.