Closed michaelfiber closed 1 year ago
@ubkp thanks! just reviewed it!
@ubkp @michaelfiber Hi! Following with the redesign, I moved all the platform-specific data to separate PlatformData structures. I did a quick test for PLATFORM_DESKTOP and PLATFORM_WEB and everything seems to work. Please, could you try PLATFORM_DRM and PLATFORM_ANDROID?
@raysan5 Ok, I'll check PLATFORM_ANDROID.
I've done some more review and code organization, simplified TakeScreenshot() and moved it to rcore.c.
I created the rcore_custom.c module as a template for new platform implementations. Considering a platform_sdl.c at the moment...
Now I'm checking the functions moved to the separate platforms; I think some of them could probably be moved back to rcore.c. For example, some of the input-based functions that just return a state from CORE.Input could probably be moved, with the required state for every platform forced by PollInputEvents().
I did quick tests of PLATFORM_DRM and couldn't find any issues.
Currently, building for PLATFORM_ANDROID is very manual, which is not viable for recurrent testing. While the main setup is not that much of a problem, building each example is. So I'm trying to work out a setup here that would make that testing a bit more practical. This may delay me a day or two, though.
@ubkp GitHub Actions for PLATFORM_ANDROID and PLATFORM_DRM should be added.
WAAAA, incredible, one of the things I wanted to do the most ❤️
@ubkp @michaelfiber I continue with the platforms review and code cleaning and organization, to simplify/document the addition of new platforms as much as possible. I already moved some functions from platform-specific code to rcore.c and I think some more can be moved with some review. Here is a list of functions that I think could be moved:

- `int GetGamepadAxisCount(int gamepad)`
  For most platforms it just returns CORE.Input.Gamepad.axisCount, but for PLATFORM_DRM it queries axisCount with an ioctl(). Could this call be moved to InitGamepad() or GamepadThread(), just filling CORE.Input.Gamepad.axisCount?
- `const char *GetGamepadName(int gamepad)`
  Just reviewed the implementation for PLATFORM_DESKTOP to use CORE.Input.Gamepad.name[id]; similar to the previous function, a similar implementation can probably be used for PLATFORM_DRM.
- `int GetMouseX(void)`
- `int GetMouseY(void)`
- `Vector2 GetMousePosition(void)`
- `float GetMouseWheelMove(void)`
  All those functions have the same implementation for most platforms; only PLATFORM_ANDROID returns CORE.Input.Touch.position[0]. I'm sure that value could be fed into the mouse equivalent somewhere else.
- `int GetTouchX(void)`
- `int GetTouchY(void)`
- `Vector2 GetTouchPosition(int index)`
  Similar to the previous case, those functions should always return CORE.Input.Touch.position[0] and be mapped to CORE.Input.Mouse.currentPosition as required.

Please, could you help with some of those?
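A minimal sketch of the idea for the first function, assuming the DRM-side ioctl() query is done once when the joystick device is opened (the helper name and the gamepadFd parameter are illustrative, not actual raylib code):

```c
// rcore.c (sketch): generic for all platforms, only reads state
// already filled in by the platform module
int GetGamepadAxisCount(int gamepad)
{
    return CORE.Input.Gamepad.axisCount;
}

// DRM platform module (sketch): query the axis count once at device-open time,
// e.g. from InitGamepad() or GamepadThread(), instead of on every call.
// Requires <sys/ioctl.h> and <linux/joystick.h>.
static void LoadGamepadAxisCount(int gamepadFd)
{
    unsigned char axes = 0;     // JSIOCGAXES reports the count as an unsigned char
    if (ioctl(gamepadFd, JSIOCGAXES, &axes) != -1) CORE.Input.Gamepad.axisCount = axes;
}
```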
For most platforms GetGamepadAxisCount() just returns CORE.Input.Gamepad.axisCount, but for PLATFORM_DRM it queries axisCount with an ioctl(). Could this call be moved to InitGamepad() or GamepadThread() and just fill CORE.Input.Gamepad.axisCount?
That makes sense, I can work on that tonight (in about 12 hours).
Just reviewed the implementation of GetGamepadName() for PLATFORM_DESKTOP to use CORE.Input.Gamepad.name[id]; a similar implementation can probably be used for PLATFORM_DRM.
This seems like a straight forward thing to fix as well. How should raylib react if the plugged in gamepad changes while the program is running? Should it just automatically pick up the new pad and start using it? If so then the name and the axis count should be polled regularly. If not, just the axis count should be polled regularly (I have a couple gamepads that let you alter the axis count on the fly with on-gamepad buttons).
All those mouse functions have the same implementation for most platforms, only PLATFORM_ANDROID returns CORE.Input.Touch.position[0]... and the touch functions should always return CORE.Input.Touch.position[0] and be mapped to CORE.Input.Mouse.currentPosition as required. Please, could you help with some of those?
If this is still up for grabs after I'm done with the DRM changes I can take a look.
@raysan5 @michaelfiber I apologize for being late on the PLATFORM_ANDROID testing. Although compiling is no problem (reference), testing is a challenge since my device is too old and Android is not easy to emulate. I'm still working on it (trying Bliss OS right now).
If this is still up for grabs after I'm done with the DRM changes I can take a look.
@michaelfiber Please do. Mouse/touch is the one thing waydroid has trouble with, so I can't reliably test it yet.
The rcore_custom is great, I have less work to update, although I think that some functions like LoadFontDefault should not be dropped, and I also think it would be good to implement some funcs like:
RLAPI void MouseButtonCallback(int button, int action, int mods);
RLAPI void MouseCursorPosCallback(double x, double y);
RLAPI void MouseScrollCallback(double xoffset, double yoffset);
RLAPI void WindowSizeCallback(int width, int height);
RLAPI void KeyCallback(int key, int scancode, int action, int mods);
The idea is that an event system could fill these in without a problem. Thanks, thanks, thanks, thanks.
How should raylib react if the plugged in gamepad changes while the program is running? Should it just automatically pick up the new pad and start using it? If so then the name and the axis count should be polled regularly. If not, just the axis count should be polled regularly (I have a couple gamepads that let you alter the axis count on the fly with on-gamepad buttons).
@michaelfiber This is a good question, I think any solution you choose would be a good solution, with its pros and cons. Polling regularly is more costly but probably more convenient for users while polling only on initialization is way less costly and probably most user will never change gamepads. In any case, the solution you choose is good.
I apologize for being late on the PLATFORM_ANDROID testing. Although compiling is no problem (https://github.com/raysan5/raylib/issues/3371#issuecomment-1745365670), testing is a challenge since my device is too old and Android is not easy to emulate. I'm still working on it (trying Bless OS right now).
@ubkp No worries at all! I'm aware PLATFORM_ANDROID is quite a pain, it's the most different platform from the list and testing/debugging is pretty complex.
I think that some functions like LoadFontDefault should not be stopped and I also think that it would be good to implement some funcs like...
@hbiblia I don't understand what you mean about LoadFontDefault(). About the proposed functions: raylib's approach to input polling is different from other libraries', so there are no plans to implement callbacks.
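For context, raylib's existing polling-style input API looks like this (standard public functions, shown only to contrast with the callback proposal above):

```c
#include "raylib.h"

int main(void)
{
    InitWindow(800, 450, "input polling example");

    while (!WindowShouldClose())        // input state is polled every frame...
    {
        Vector2 mouse = GetMousePosition();
        if (IsMouseButtonPressed(MOUSE_BUTTON_LEFT))
            TraceLog(LOG_INFO, "Click at %.0f, %.0f", mouse.x, mouse.y);    // ...instead of arriving via callbacks

        BeginDrawing();
            ClearBackground(RAYWHITE);
        EndDrawing();
    }

    CloseWindow();
    return 0;
}
```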
I mean that rcore_custom.c does not call LoadFontDefault(), but it does call UnloadFontDefault().
Update: Just updated now, and it's already added, thank you. ❤️
The build was with the master branch as of the date of publishing this comment.
core_custom_frame_control.c has a flicker ☣️
core_loading_thread.c Build [ok] Run [Error]
https://github.com/raysan5/raylib/assets/1939353/68b56f49-b603-44cb-b84b-ae78096732b6
@michaelfiber This is a good question, I think any solution you choose would be a good solution, with its pros and cons. Polling regularly is more costly but probably more convenient for users while polling only on initialization is way less costly and probably most user will never change gamepads. In any case, the solution you choose is good.
@raysan5 I realized that for now it makes sense to just get the axis count once, because doing it in the gamepad update thread creates other problems due to race conditions. I really want to spend some time updating the input system for PLATFORM_DRM anyway. I think the platform has come along enough to start targeting /dev/input/by-id instead of just /dev/input, since then we get a nice easy list of devices with mouse, kbd, or joystick at the end of the link to identify how to handle each one.
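A hedged sketch of that idea (not raylib code; the exact suffixes come from how udev names the /dev/input/by-id symlinks):

```c
#include <dirent.h>
#include <stdio.h>
#include <string.h>

// Classify input devices by the suffix of their /dev/input/by-id symlink name,
// e.g. "...-event-mouse", "...-event-kbd", "...-event-joystick"
static void ScanInputDevices(void)
{
    DIR *dir = opendir("/dev/input/by-id");
    if (dir == NULL) return;

    struct dirent *entry = NULL;
    while ((entry = readdir(dir)) != NULL)
    {
        if (strstr(entry->d_name, "-event-mouse") != NULL) printf("Mouse: %s\n", entry->d_name);
        else if (strstr(entry->d_name, "-event-kbd") != NULL) printf("Keyboard: %s\n", entry->d_name);
        else if (strstr(entry->d_name, "-event-joystick") != NULL) printf("Gamepad: %s\n", entry->d_name);
    }

    closedir(dir);
}
```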
I marked that first PR ready for review, it just moves over the two functions listed. It also adds 1 example that just outputs the gamepad name and axis count because I was using that for quick testing. I can remove that if you want. My goal is to keep adding to it so that by the time the input system is overhauled it will display all the pertinent info about all connected devices when it runs.
EDIT: Once the split is in a good place I'll work on the input system for platform_drm more, and I'll also look into why I can't run PLATFORM_DRM programs on my main workstation. It seems like it might be related to the resolution of the monitor somehow, but I'm not sure how yet. It's 3440x1440 and it's the only system I have where no suitable connector can be found.
@raysan5 @michaelfiber Some updates:
I managed to set up another Android testing environment using qemu/kvm/virgl and Bliss OS 14. Unfortunately, it had exactly the same issues Waydroid had with touch gestures and mouse presses.
I started digging and found out that those issues are very common with games, mostly happening to users trying to run Android games emulated on PC with a mouse instead of a touchscreen.
My best guess is that it's related to how Android handles mouse input and real touch input internally depending on the context (Native, OpenGL, Vulkan, etc.). Touch in games usually doesn't work emulated, but touch in regular apps (which mostly use Android's view logic) usually does.
My last attempt was trying Android Studio again, but it's borderline on the limits of my hardware and is something I have almost no experience with. I couldn't get anywhere with the Virtual Device Manager; it will take me some time to learn how to do something there.
So I'm somewhat limited on what I can test on Android. If it's related to mouse/touch/gestures, there's not much I can do right now. Anything else I can probably make work. Really sorry, guys.
@ubkp thanks for the update and thanks for all the hard work looking for ways to test Android platform! Maybe @Bigfoot71 could help with Android platform testing?
@michaelfiber Just moved the two gamepad functions to rcore.c, now they are generic for all platforms! I'm going to review the Mouse and Touch functions, but that will require some testing on PLATFORM_ANDROID.
Just reviewed the Mouse and Touch functions, easier than expected! Having the platforms split and having a generic PollInputEvents() really benefits those improvements!
Here is the current list of platform-specific functions. As expected, most of them are window/display/monitor related; that's probably the area with the most differences between platforms:
```c
void InitWindow(int width, int height, const char *title)
void CloseWindow(void)
bool WindowShouldClose(void)

bool IsWindowHidden(void)
bool IsWindowMinimized(void)
bool IsWindowMaximized(void)
bool IsWindowFocused(void)
bool IsWindowResized(void)

void ToggleFullscreen(void)
void ToggleBorderlessWindowed(void)
void MaximizeWindow(void)
void MinimizeWindow(void)
void RestoreWindow(void)
void SetWindowState(unsigned int flags)
void ClearWindowState(unsigned int flags)

void SetWindowIcon(Image image)
void SetWindowIcons(Image *images, int count)
void SetWindowTitle(const char *title)
void SetWindowPosition(int x, int y)
void SetWindowMonitor(int monitor)
void SetWindowMinSize(int width, int height)
void SetWindowMaxSize(int width, int height)
void SetWindowSize(int width, int height)
void SetWindowOpacity(float opacity)
void SetWindowFocused(void)
void *GetWindowHandle(void)
Vector2 GetWindowPosition(void)
Vector2 GetWindowScaleDPI(void)

int GetMonitorCount(void)
int GetCurrentMonitor(void)
int GetMonitorWidth(int monitor)
int GetMonitorHeight(int monitor)
int GetMonitorPhysicalWidth(int monitor)
int GetMonitorPhysicalHeight(int monitor)
int GetMonitorRefreshRate(int monitor)
Vector2 GetMonitorPosition(int monitor)
const char *GetMonitorName(int monitor)

void SetClipboardText(const char *text)
const char *GetClipboardText(void)

void ShowCursor(void)
void HideCursor(void)
void EnableCursor(void)
void DisableCursor(void)

// Custom frame control:
void SwapScreenBuffer(void)
void PollInputEvents(void)

double GetTime(void)
void OpenURL(const char *url)
int SetGamepadMappings(const char *mappings)
void SetMousePosition(int x, int y)
void SetMouseCursor(int cursor)

static bool InitGraphicsDevice(int width, int height)
```
There is a total of 52 platform-specific functions, but probably some of the window/monitor-related ones, the ones that just check states, can also be made generic for all platforms by setting those states properly in the platform initialization/update code.
EDIT: Some tweaks for better organization
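For reference, a minimal sketch of how the state-check functions could become generic, assuming the platform code mirrors the window state into CORE.Window.flags (the flag names are the existing raylib config flags):

```c
// rcore.c (sketch): shared by every platform as long as the platform module
// keeps CORE.Window.flags up to date (e.g. from its window callbacks or PollInputEvents())
bool IsWindowMinimized(void)
{
    return ((CORE.Window.flags & FLAG_WINDOW_MINIMIZED) > 0);
}

bool IsWindowMaximized(void)
{
    return ((CORE.Window.flags & FLAG_WINDOW_MAXIMIZED) > 0);
}
```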
On my end, touch input on Android seems to be working as before, no issues to report. Regarding mouse input for Android, I'm unable to test it with my hardware, except possibly with an emulator, but that may not be representative of real device behavior. _(Uh wait, if I'm not mistaken, the mouse vector on Android is equal to that of the first touch, right? Maybe I misunderstood what was said... rcore_android.c#L1203)_
https://github.com/raysan5/raylib/assets/90587919/6a1bde49-b35c-4039-9010-3b18a1a7dbc5
(the fact that the view is cut off on the sides is normal; the window isn't initialized with a resolution that matches my device)
For simplicity I think the idea is to use the first touch as the value of the mouse for the sake of raylib. Android has mouse input built in and it is well supported throughout the OS but I haven't the slightest idea what supporting it looks like nor do I know how many people actually use it. I've used it a bunch but I do weird things with my phones. I also regularly use USB gamepads with android. So, until someone really needs it (and is willing to implement it hopefully) using the first touch seems like a straight forward approach.
I also think it's better this way, and no one seems to be requesting it either. Otherwise, it's simply handled in android/input.h like other inputs.
So, until someone really needs it (and is willing to implement it hopefully) using the first touch seems like a straight forward approach.
That's the reason for the current approach, I don't know anyone using a mouse connected to an Android phone...
I'm currently conducting tests with as many examples as possible, and I just noticed that the Vector2 GetMouseDelta() function (rcore.c) now returns a value on Android, unlike before. The problem is that it seems to consistently return the delta considering the previous touch position as always being at {0,0}.
So, I'm currently writing a small fix just for my tests, and it will only work with the first touch. Would it be a good idea to add the ability to get the delta of touches? Something like Vector2 GetTouchDelta(int id)?
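Purely as an illustration of the proposal, a hypothetical sketch (GetTouchDelta() does not exist in raylib, and the previous-position array used here is assumed, not an actual CORE field):

```c
// Hypothetical helper: would require the core to keep the previous frame's touch
// positions (here called previousTouchPosition[]), updated from PollInputEvents()
Vector2 GetTouchDelta(int index)
{
    Vector2 delta = {
        CORE.Input.Touch.position[index].x - previousTouchPosition[index].x,
        CORE.Input.Touch.position[index].y - previousTouchPosition[index].y
    };

    return delta;
}
```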
Other than that, there's still an issue with detecting the double-tap gesture. I say "still" because I've seen another user complain about it on the Android channel on Discord:
https://github.com/raysan5/raylib/assets/90587919/698f47fa-ec0d-4d24-aa57-9a7ad17c74fa
Here are some other tests I've conducted in a video, and everything looks good to me:
https://github.com/raysan5/raylib/assets/90587919/fcd723dc-a3bf-4246-8e4c-f93c01f1bd53
@raysan5 Should axisCount inside CoreData be an array of ints so that each gamepad can have a different number of axes? GetGamepadAxisCount takes a gamepad number but then doesn't use it.
Should axisCount inside CoreData be an array of Ints so that each gamepad can have a different number of axes? GetGamepadAxisCount takes a gamepad number but then doesn't use it.
@michaelfiber Oh! Sure, it sounds like a bug! I don't know why it was not added before... afaik, all gamepad data is indexed by gamepad.
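A small sketch of the change being discussed, assuming axisCount simply becomes a per-gamepad array (MAX_GAMEPADS is the existing limit from config.h):

```c
// In CoreData (sketch):
//     int axisCount;                   // before: a single shared value
//     int axisCount[MAX_GAMEPADS];     // after: one entry per gamepad
int GetGamepadAxisCount(int gamepad)
{
    return CORE.Input.Gamepad.axisCount[gamepad];   // the index is now actually used
}
```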
The problem is that it seems to consistently return the delta considering the previous touch position as always being at {0,0}. So, I'm currently writing a small fix just for my tests, and it will only work with the first touch. Would it be a good idea to add the ability to get the delta of touches? Something like Vector2 GetTouchDelta(int id)?
@Bigfoot71 Probably not required, I try to avoid adding new features if they are not required in a real use-case scenario. But the mouse delta issue should be fixed in rcore_android.c!
I performed some tests on core_custom_frame_control.c in all versions where it was added, starting from version 4.0, and the same issue occurred in all of them. Windows 11, Nvidia GTX 1660 Ti Laptop, OpenGL 3.3.
I conducted the test on another PC, and it doesn't work: Windows 11 Intel(R) UHD Graphics 620. OpenGL 3.3
I also performed the tests on all available versions of OpenGL.
Update test: the problem is in SwapScreenBuffer(). (core_custom_frame_control.c)
Other than that, there's still an issue with detecting the double-tap gesture. I say "still" because I've seen another user complain about it on the Android channel on Discord:
@Bigfoot71 I really wish I could help to debug that issue. My best guess is that there's a problem with some value of the GESTURE_DOUBLETAP detection criteria (L277). Possibly one of these:
- GESTURES.current being different from GESTURE_NONE.
- GESTURES.Touch.tapCounter being smaller than 2.
- rgGetCurrentTime() - GESTURES.Touch.eventTime being larger than TAP_TIMEOUT.
- rgVector2Distance(GESTURES.Touch.downPositionA, event.position[0]) being larger than DOUBLETAP_RANGE.
Do you know if it is possible to get the TraceLog messages on Android? If yes, adding the following at LINE 275 of rgestures.h would probably show the problem:
```c
TraceLog(LOG_INFO, "--------------------");
TraceLog(LOG_INFO, "GESTURES.current: %i", GESTURES.current);
TraceLog(LOG_INFO, "GESTURES.Touch.tapCounter: %i", GESTURES.Touch.tapCounter);
TraceLog(LOG_INFO, "GESTURES.Touch.eventTime: %f", GESTURES.Touch.eventTime);
TraceLog(LOG_INFO, "rgGetCurrentTime(): %f", rgGetCurrentTime());
TraceLog(LOG_INFO, "TAP_TIMEOUT: %f", TAP_TIMEOUT);
TraceLog(LOG_INFO, "rgGetCurrentTime() - GESTURES.Touch.eventTime: %f", (rgGetCurrentTime() - GESTURES.Touch.eventTime));
TraceLog(LOG_INFO, "DOUBLETAP_RANGE: %f", DOUBLETAP_RANGE);
TraceLog(LOG_INFO, "rgVector2Distance(GESTURES.Touch.downPositionA, event.position[0]): %f", (rgVector2Distance(GESTURES.Touch.downPositionA, event.position[0])));
TraceLog(LOG_INFO, "--------------------");
```
@hbiblia Please double check if SUPPORT_CUSTOM_FRAME_CONTROL was enabled during the library compilation. That can be done by uncommenting LINE 70 of config.h and then recompiling raylib. Otherwise, the core_custom_frame_control.c example won't work correctly.
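For reference, enabling it is just a matter of uncommenting the corresponding define in config.h before rebuilding (the exact line number may vary between raylib versions):

```c
// config.h: enable manual control of the frame processes (SwapScreenBuffer()/PollInputEvents())
#define SUPPORT_CUSTOM_FRAME_CONTROL    1
```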
@ubkp That makes it work correctly, thank you!! My mistake was not checking core_custom_frame_control.c to read the comments.
Edit: But I think this should be something that works the first time for someone who just compiles and reviews the compiled examples without looking at the code on the spot.
Do you know if it is possible to get the TraceLog messages on Android? If yes, adding the following at LINE 275 of rgestures.h would probably show the problem:
Of course, Android logs can be obtained using logcat. Here are the results of the double-tap attempt:
V New input event: type=2
I --------------------
I GESTURES.current: 0
I GESTURES.Touch.tapCounter: 1
I GESTURES.Touch.eventTime: 0.000000
I rgGetCurrentTime(): 9.464040
I TAP_TIMEOUT: 0.300000
I rgGetCurrentTime() - GESTURES.Touch.eventTime: 9.464066
I DOUBLETAP_RANGE: 0.030000
I rgVector2Distance(GESTURES.Touch.downPositionA, event.position[0]): 588.914001
I --------------------
V New input event: type=2
V New input event: type=2
V New input event: type=2
I --------------------
I GESTURES.current: 0
I GESTURES.Touch.tapCounter: 2
I GESTURES.Touch.eventTime: 9.464104
I rgGetCurrentTime(): 9.580643
I TAP_TIMEOUT: 0.300000
I rgGetCurrentTime() - GESTURES.Touch.eventTime: 0.116562
I DOUBLETAP_RANGE: 0.030000
I rgVector2Distance(GESTURES.Touch.downPositionA, event.position[0]): 5.921487
I --------------------
V New input event: type=2
V New input event: type=2
Here's a video of what happened:
https://github.com/raysan5/raylib/assets/90587919/e0cb8bf0-bfb1-4cfb-84ac-cb6a9ee0ff17
And here's a comparison of how it works on PC:
https://github.com/raysan5/raylib/assets/90587919/2e90faa5-e00b-4229-b5fd-04f142354dfa
Now, I'm going to tackle GetMouseDelta() for Android, and I'll take care of the gesture issue afterward. If you have any ideas in the meantime, feel free to share them, and I'll work on implementing them.
Otherwise, you have raymob to easily compile on Android, it's even simpler with Android Studio.
By the way, I unfortunately don't have a controller available. If someone could try this on Android, that would be fantastic!
Edit: You can simply clone the 4.6-dev branch of raymob and replace the project's main.c with core_input_gamepad.c, connect your device with debug mode enabled, and click the "run" button in Android Studio, or use the gradlew script to build and adb to install (the second method requires the APK to be signed manually).
Regarding the double tap problem on Android, it seems that it is simply a problem of the distance detected between the two inputs. event.position doesn't seem to be normalized in rcore_android.c, which would explain the logs.
Edit: BINGO! The issue was indeed that the event.position values were not normalized on Android, and now the double tap works for me.
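A minimal sketch of the kind of fix described, assuming the gesture thresholds in rgestures.h (e.g. DOUBLETAP_RANGE = 0.03f) are defined in normalized 0..1 coordinates, so the Android touch callback must divide raw pixel positions by the screen size before passing them on (the helper name is illustrative):

```c
// Illustrative helper: convert a raw pixel touch position into the normalized
// range expected by the gesture detection before calling ProcessGestureEvent()
static Vector2 NormalizeTouchPosition(Vector2 raw, int screenWidth, int screenHeight)
{
    Vector2 normalized = { raw.x/(float)screenWidth, raw.y/(float)screenHeight };
    return normalized;
}
```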
What should we do? Should I open another PR, even though the previous one is still pending?
@Bigfoot71 That's excellent, great job!
@Bigfoot71 good catch! thanks for the review! feel free to send a PR!
@michaelfiber @ubkp @Bigfoot71 WARNING! Just redesigned InitGraphicsDevice() into InitPlatform() and ClosePlatform() for organization, consistency and coherence reasons.
Main reason for this change is that InitGraphicsDevice() was currently doing many other things:
- Initialize support library
- Setup window/framebuffer flags
- Initialize window/display
- Initialize graphic device
- Setup platform callbacks (window, inputs...)
- Initialize input systems
- Initialize OpenGL extensions
- Initialize rlgl (buffers and shaders)
I thought it could be a good idea to unify all that into InitPlatform(); this way, not only is all platform initialization unified into a single function (using/updating the required CORE data), but it also allows moving InitWindow()/CloseWindow() back to rcore.c (still under consideration).
I tested PLATFORM_DESKTOP and it works ok, if you could give a try to the other platforms at some moment it would be great! Thanks!
PD. PLATFORM_ANDROID is a bit special, so I kept InitGraphicsDevice() there, but I reviewed it to do just that: init the graphic device.
EDIT: Still considering dividing InitPlatform() into smaller functions or at least organizing all the internal processes into sections with better comments.
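A minimal sketch of the flow being described, with the InitWindow()/CloseWindow() move still under consideration as noted above (the exact call sites are assumptions, not the final code):

```c
// rcore.c (sketch): generic window init/close delegating platform work to the platform module
void InitWindow(int width, int height, const char *title)
{
    // ...generic setup shared by all platforms (CORE data, logging, default font)...
    int result = InitPlatform();        // window/display, graphic device, callbacks, inputs, extensions, rlgl
    if (result != 0) TRACELOG(LOG_WARNING, "SYSTEM: Failed to initialize platform");
    // ...remaining generic setup (viewport, timing)...
}

void CloseWindow(void)
{
    // ...generic unload...
    ClosePlatform();                    // release everything InitPlatform() created
}
```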
@raysan5 Compiled PLATFORM_WEB and PLATFORM_DRM, tested a few examples, everything looks ok there.
Off-topic: for the future, any chance of moving the automation/event-recording system outside rcore?
@raysan5 There was a small error in InitPlatform in rcore_android.c, and I've just submitted a correction here: https://github.com/raysan5/raylib/pull/3415.
I'm also getting another compilation error due to the new header stb_image_resize2.h, still on Android:
raylib/external/stb_image_resize2.h:2422:17: error: initializing 'float16x8_t' (vector of 8 'float16_t' values) with an expression of incompatible type 'int'
float16x8_t in = vld1q_f16(input);
^ ~~~~~~~~~~~~~~~~
What is this new header supposed to be? (I mean, compared to the old one)
Edit: The error occurs when I compile for x86_64, and I see that this part is supposed to be for ARM64. There may be some project definition issues on my end. I'll investigate and try compiling raylib with its Makefile to see if everything works.
for the future, any chance of moving the automation/event-recording system outside rcore?
@ubkp Sure! It's on my TODO list to review all that system, I'll try to allocate some time as soon as possible.
What is this new header supposed to be? (I mean, compared to the old one)
@Bigfoot71 This header is only used for efficient image scaling, using SIMD and other accelerated instruction sets provided by the CPU, if available on the target platform. In this case it is trying to call an Arm Neon intrinsic: vld1q_f16().
It seems it can be turned off just by defining STBIR_NO_AVX and STBIR_NO_AVX2 before including the library.
Thank you for the explanation. I just tested building raylib with its Makefile for Android (all architectures), and there were no errors. So the issue seems to be on my end with Android Studio. Good news!
I'll edit this message once I've successfully compiled and tested the project.
@Bigfoot71 It could be an Android Studio issue with its libraries; afaik, Arm Neon functions should be available on most target Android devices nowadays. The benefit of using them is that image resizing could be accelerated up to 4x.
It's all good, the issue is resolved. It turned out to be missing rules in the CMakeLists.txt of raymob and Android Studio was indicating 'x86_64,' but when I checked the logs myself, the problem was indeed with 'armeabi-v7a'...
Everything is working now! The application launches correctly, and there's nothing to report!
@raysan5 I haven't tried this on other platforms, but on Android, I get these warnings for SetupFramebuffer(), SetupViewport(), InitTimer():
Call to undeclared function 'SetupFramebuffer'; ISO C99 and later do not support implicit function declarations
Call to undeclared function 'SetupViewport'; ISO C99 and later do not support implicit function declarations
Call to undeclared function 'InitTimer'; ISO C99 and later do not support implicit function declarations
Since they are declared and defined in rcore.c and called in rcore_android.c, this doesn't pose any problem except for the warnings.
Edit: Should steps be taken to remove these warnings?
Also, I noticed a small issue with InitGraphicsDevice() in rcore_android.c, which is supposed to return a bool but returns -1 or 0. Additionally, we don't do anything with its return value when calling it in AndroidCommandCallback() (LINE 869).
Edit: In your opinion, should we take a particular action, such as closing the application, if InitGraphicsDevice() fails? Or let the activity run if it can with a black screen?
And there's one more warning, still for rcore_android.c, for int InitPlatform(void), which doesn't return anything at the end of its definition.
Edit: As asked previously, should any action be taken here if InitPlatform() fails?
I haven't tried this on other platforms, but on Android, I get these warnings for SetupFramebuffer(), SetupViewport(), InitTimer():
@Bigfoot71 Compiled current master branch (2498170) on PLATFORM_DESKTOP, PLATFORM_WEB and PLATFORM_DRM and didn't get those warnings. I guess it's something with rcore_android.c in particular.
WARNING! Just redesigned InitGraphicsDevice() into InitPlatform() and ClosePlatform()... if you could give a try to the other platforms at some moment it would be great!
I merged these changes into my fork with DRM input updates and it all is working.
@Bigfoot71 Compiled current master branch (2498170) on PLATFORM_DESKTOP, PLATFORM_WEB and PLATFORM_DRM and didn't get those warnings. I guess it's something with rcore_android.c in particular.
I just compiled using the Makefile for Android, and indeed, the warning did not appear. Only Android Studio (and clang during compilation) warns me about it, but it still builds and runs without any issues afterward.
I'm not sure why this happens, rcore_android.c is properly included after the function declarations. Personally, I'm stumped on this matter.
@Bigfoot71 Does Android Studio keep some sort of cache that could be using some previous artifacts?
@ubkp Yes, Android Studio does it, but no matter how many times I invalidate the cache, clean the project, rebuild it entirely, the problem persists, and it's not just the IDE indicating it, but the compiler as well. It's strange but not very serious in my opinion.
Issue description
As part of my experimenting in pr #3311 to divide rcore into submodules, one of the things I found was that the CoreData structure adds a lot of complexity to that endeavor and MAY benefit from being split up as well. I believe it may benefit because the current CoreData structure can be fairly difficult to understand with the large number of preprocessor directives within the struct itself. This is largely done to accommodate the many different target platforms. But if rcore is divided into different submodules based on platforms there is an opportunity to increase clarity.
Here are a couple of initial ideas for approaches:
Maintain a single instance of CoreData but have CoreData only contain data that all submodules use so that it is a simpler struct with no preprocessor directives. A single CORE instance would be defined extern in rcore.h, defined in rcore.c, and accessed from rcore.c and all the submodules. In addition to this each submodule would have a small CoreData like struct of its own, i.e. DesktopData, WebData, etc. The data that is currently in CoreData that is specific to an submodule would be moved to the data struct in that submodule. Each struct definition would become much more legible because it wouldn't be full of preprocessor directives but there would be more structs overall and they'd be more spread out instead of centralized in rcore.
Another option could be to have the CoreData struct defined entirely within a submodule specific header file so that each submodule has a complete CoreData struct and rcore.c can include a specific one based on the PLATFORM variable. There'd be more duplication but fewer structs and data would not be divided across multiple structs. Any changes to the data in CoreData that is used by all submodules would require applying the change to each submodule.
Personally I'm leaning towards the first option but am interested in what others think.
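For illustration, a rough sketch of what the first option could look like (field names are purely illustrative, not raylib's actual layout):

```c
// rcore.h: only data shared by every platform, no preprocessor directives inside the struct
typedef struct CoreData {
    struct {
        int width;
        int height;
    } Window;
    struct {
        double current;
        double previous;
    } Time;
} CoreData;

extern CoreData CORE;               // single shared instance, defined once in rcore.c

// rcore_desktop.c: platform-only state lives next to the platform code that uses it
typedef struct PlatformData {
    void *windowHandle;             // e.g. the GLFW window for the desktop backend
} PlatformData;

static PlatformData platform = { 0 };
```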
Environment
Experiments are being coded in Ubuntu 23.04 and leverage github actions to attempt to run build steps for all platforms.
Code Example
3311