urho3d / urho3d

Game engine
https://urho3d.github.io
MIT License

provide simple MIT licensed way to detect compiled 3D API #1066

Closed bvanevery closed 8 years ago

bvanevery commented 8 years ago

Urho3D only supports one 3D API at a time, and that decision is made at compile time. The documentation section "Shaders... API Differences" talks about the "D3D11" and "GL3" macros being defined according to the 3D API in use, but this is not actually true for consumers of Urho3D as a library. Rather, URHO3D_D3D11 or URHO3D_OPENGL is passed as a command line /D option to Urho3D's own build; neither is made readily available to any third-party code. One could incorporate pkg-config into one's project, but that is GPLv2 code and also a rather heavyweight requirement for someone just trying to get some Urho3D sample code going. Culturally it may seem common on Linux to assume pkg-config is available and that people know how to use it, but that isn't true on Windows, and I have no idea about OSX.

It would be quite acceptable to have macros such as URHO3D_D3D11 or URHO3D_OPENGL actually defined in an Urho3D header file somewhere, and therefore available to consumers as the documentation already claims. The advisability of this method is related to the request for a general "configuration header" in #991. Having only a single header for every macro does tend to trigger massive rebuilds in real-world projects when any option is changed. It may be better to distribute such macros across the Urho3D header files that are actually relevant to them.
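
As a concrete illustration, here is a minimal sketch of what client code could look like if such defines were shipped in a header. The include path and the function name are assumptions made up for this example, not the current layout:

    // Hypothetical client-side use of baked-in graphics API defines.
    // <Urho3D/Urho3D.h> is assumed here to be the header that would carry them.
    #include <Urho3D/Urho3D.h>

    const char* CompiledGraphicsApiName()
    {
    #if defined(URHO3D_D3D11)
        return "Direct3D 11";   // the project should ship HLSL shaders
    #elif defined(URHO3D_OPENGL)
        return "OpenGL";        // the project should ship GLSL shaders
    #else
        return "unknown";       // defines not available; fall back to runtime checks
    #endif
    }

Something along these lines would have let the sample project warn up front about the GLSL/HLSL mismatch described below.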

An example of real-world need for this mechanism came up in https://github.com/gawag/Urho-Sample-Platformer/issues/14 . I spent a few solid days tearing my hair out trying to figure out why the USP code didn't work on my system but did work on the developer's. Lots and lots of debug tracing, looking at lots of resource paths that seemed to be loading ok but were difficult to cover in their entirety. Finally we learned about the debug log file and I noticed that lots of shaders had failed to load. The dev was only providing GLSL shaders, and I had built my Urho3D for DX11. Nothing in the build had warned me that this wouldn't work, and upon investigation, I see pkg-config is the only method by which such things would be known.

JSandusky commented 8 years ago

I don't think you're making any sense.

Finally we learned about the debug log file and I noticed that lots of shaders had failed to load.

You didn't do a very good job of reading the documentation on urho3d.github.io.

First, you should've known about that issue in minutes - not days. I can only assume you didn't implement the console in your project. If it genuinely took days then you might be overreaching.

The dev was only providing GLSL shaders, and I had built my Urho3D for DX11

Shaders are content. Even if you knew about the render pipeline through preprocessor definitions, what you propose would not solve your "real world" case.

How exactly is a preprocessor def going to help you resolve a content problem? It won't. The log is already telling you what's going wrong.


As to the general graphics code, the interfaces for the types are all consistent, and where inconsistency is likely (exclusively in data, as far as I've seen) helpers such as returning the appropriate texture format exist and are meant to be used to cover all platforms.

From a Windows/VS standpoint, including foreign defs is easily resolvable via imports. In C++ that does require that you actually know how MSBuild works, as I'm unaware of a UI for it (not to say one doesn't exist, I've just never seen it). Via a complete CMake script it's even easier.

But that won't help what you're describing in the slightest.


There are valid cases for such a thing, though personally I think they're so easily resolvable through tools that it's practically irrelevant. Not my call. But what you describe isn't the right reason for such a thing.

bvanevery commented 8 years ago

JSandusky, you're making a lot of assumptions about how a person becomes engaged with Urho3D or a downstream project that are not valid. I've filed a lot of issues against Urho3D and Urho Sample Platformer over the past several months, because I have more expertise in CMake build systems than I presently have in Urho3D specifically. Wrapping one's head around the minutiae of other people's complex code projects is not actually the best way to test and get things done. Seeing as how I found and filed stuff that other people previously didn't, I'm not going to apologize for my methods of learning or not learning about Urho3D. It's best for you to understand that when presented with overwhelming external complexity, people adopt many different strategies for dealing with it.

If I have simple URHO3D_D3D11 or URHO3D_OPENGL macros defined in a .h file, then I can change Urho Sample Platformer code to correctly do what it's supposed to do, in an easy and straightforward way that many open source projects have used for eons. Heck, it tends to be the primary modality of Autoconf-based builds, for instance; do you have any experience writing those?

Why are you even talking about MSBuild? Urho3D is CMake driven and makes no explicit reference to it. Urho Sample Platformer also happens to use a CMake build. Not that this is even relevant to what I'm asking for.

JSandusky commented 8 years ago

I can make these assumptions very safely, and even more so based on your continual reply.

Sigh, again ... content problem. No amount of macros will change the fact that you're missing shaders.

I've filed a lot of issues against Urho3D and Urho Sample Platformer over the past several months, because I have more expertise in CMake build systems than I presently have in Urho3D specifically

No, you clearly don't, and are definitely not an expert in CMake, or you'd never have filed this issue as it would have been a 2 line solution. So don't give me that CMake expert nonsense.

Why are you even talking about MSBuild?

Because you were talking about Windows. You can't compile for Windows (and distribute without serious financial consequence) without going through MSBuild. You're proving yourself uninformed.

If I have simple URHO3D_D3D11 or URHO3D_OPENGL macros defined in a .h file, then I can change Urho Sample Platformer code to correctly do what it's supposed to do

That isn't the responsibility of Urho3D; that's your responsibility for using 3rd party code. If you use unofficial code then it is entirely on YOU.

If I have simple URHO3D_D3D11 or URHO3D_OPENGL macros defined in a .h file, then I can change Urho Sample Platformer code to correctly do what it's supposed to do

Oh, you mean by porting it over which is exactly what you have to do to resolve the content problem described above.

you're making a lot of assumptions about how a person becomes engaged with Urho3D or a downstream project that are not valid

They're quite valid, especially in this case. What do I need to do, hold PRs for geometry shaders and tessellation hostage in order for you to realize how inept you are?

Done. You have single-handedly ended PRs for texture arrays, geometry shaders, tessellation, and compute. Congrats, you hammered the final nail in the coffin that will make all future PBR updates entirely under the GPL.

bvanevery commented 8 years ago

If you're quite through biting my head off for not particularly good reasons, I suppose I'll wait to see if anyone else has sane opinions on the many virtues of passing macros in .h files that indicate whether various Urho3D compile time features are present or absent. Meanwhile, why don't you provide your great insights to #991 while you're at it? Same idea, just broader in scope than what I propose here.

Regarding MSBuild... nobody needs to know about it to build Urho3D on Visual Studio.

Regarding your pull requests... I know this is all about you and not me, as I've had no interactions with you before. Sounds like you've reached some kind of point of crisis in your involvement with Urho3D.

cadaver commented 8 years ago

Please keep the discussion civil; there's no need to go into the whole perceived skill levels (or lack thereof) thing.

This and #1065 both seem to center on the wish to burn the build configuration variables into a header, in addition to the other methods we provide (like pkg-config). I remember it has also been discussed before. @weitjong can you comment (again)?

weitjong commented 8 years ago

I still remember that we had that discussion. If I recall correctly, at the time I didn't think it was a good idea to "bake" those defines into the auto-generated export header file, which was unfortunately named "Urho3D.h". The name gives the impression that this header file should contain ALL things required by Urho3D. I did agree that the information could be valuable, and that's why we started the work on pkg-config .pc file generation. The idea was that when CMake cannot be used for some reason in an external project, pkg-config can be used to help set up the build system on platforms where pkg-config is available. Otherwise, the information baked into the .pc file can still be referenced manually to guide users on how to set things up. Granted, this is error prone, but that is exactly why we have gone through all the trouble to make our CMake scripts work out of the box for most use cases.

That said, I agree our current build system can still be improved. When an Urho3D library has been built (whether it is still in a build tree or already installed as an SDK), a few of the build options are already exercised or expired; e.g. you cannot build a 32-bit Urho3D lib and later use it for a 64-bit build by passing URHO3D_64BIT=1 in the external project. So, some of the compiler definitions tied to those exercised or expired build options should actually be baked or stored in a header file. The library users could then simply reuse that header file without having to repeat the same -D build options in the external project. Conflicting build options won't work anyway. However, currently our build system is designed to take user inputs only from build options and environment variables. We need to examine carefully the implications of having the compiler defines stored in a header file rather than set by the Urho3D-CMake-common module file based on a given set of build options, if we really want to go this way.
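
As a minimal sketch of what baking an exercised option could buy a library user, assuming URHO3D_64BIT ended up in the generated header (the include path and the check itself are just an illustration): the mismatch would be caught at compile time rather than at link or run time.

    // Hypothetical compile-time sanity check in a client project.
    #include <Urho3D/Urho3D.h>   // assumed location of the generated header

    #ifdef URHO3D_64BIT
    static_assert(sizeof(void*) == 8, "Urho3D library is 64-bit but this build is not");
    #else
    static_assert(sizeof(void*) == 4, "Urho3D library is 32-bit but this build is not");
    #endif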

cadaver commented 8 years ago

For me it'd be fine to change the system to output the most important config defines (at least the graphics API type and SSE option, possibly others) into Urho3D.h.

I understand the inputs to the build (CMake options, env. vars) would stay authoritative as before, and the header would just be the output. Potentially Urho3D-CMake-common would then have to do less when preparing a client application build (as opposed to building Urho3D itself).

1vanK commented 8 years ago

The name gives the impression that this header file should contain ALL things required by Urho3D

We could use another name, e.g. "Config.h" :)

bvanevery commented 8 years ago

I filed this as a separate issue, rather than contributing to the discussion in #991, because centralized Config.h files do trigger a lot of rebuilding in practice. It is better if the graphics files that actually change based on URHO3D_D3D11 and URHO3D_OPENGL contain these macros. Graphics.h looks like it might be the correct point of localized control. I'd advocate similarly for other subsystems, whatever their .h files are.

damu commented 8 years ago

Whoa, this thread got quite wild at the beginning. I'm also kind of involved in this and have some things to contribute (without wanting to escalate further or reheat things):

The basic issue was that the project currently has only GLSL shaders (I'm working on HLSL ones), and that led to quite some confusion about why bvanevery couldn't see anything. I didn't think of him using DirectX and assumed some missing files or path issues, and we even fixed several such issues. That confusion and the time-shifted communication over GitHub slowed us down considerably; so much for the days he mentioned.

Sigh, again ... content problem. No amount of macros will change the fact that you're missing shaders.

Yes. That wasn't the point though. One could act differently in code if some materials are for example only available for OpenGL, like fallback materials. Some Urho users may want or need such a thing. The idea of bvanevery was to have defines for that (like URHO3D_OPENGL or URHO3D_DIRECTX9, ...). As I understood it, there are currently none (for a user of Urho). Also, I agree that Urho.h would be a bad name, and something like BuildConfig.h or BuildMacros.h is more suited. There may also be a need for other defines like URHO3D_LUA, URHO3D_LUAJIT, URHO3D_LUASAFE, ... bvanevery's point about the disadvantages of a centralized Config.h is also quite valid. This could be resolved by having multiple files for different areas, like a BuildConfigGraphics.h that contains such defines as URHO3D_OPENGL and is included by the Urho internals that need to know it. These separate files could be included by a centralized BuildConfig.h that can be used by Urho users to get all defines at once (ease of use).
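
A rough sketch of the fallback idea, assuming the proposed defines existed in such a header; the header name and the material paths are invented purely for illustration:

    // Hypothetical: pick a material depending on the compiled graphics API.
    #include <Urho3D/BuildConfigGraphics.h>   // assumed name, per the proposal above

    const char* DefaultMaterialPath()
    {
    #ifdef URHO3D_OPENGL
        return "Materials/FancyGL.xml";   // technique that only ships GLSL shaders
    #else
        return "Materials/Fallback.xml";  // API-agnostic fallback material
    #endif
    }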

Why are you even talking about MSBuild?

Because you were talking about Windows. You can't compile for Windows (and distribute without serious financial consequence) without going through MSBuild. You're proving yourself uninformed.

MSBuild seems to be a Microsoft build tool, I assume only for Visual Studio. I'm happily using GCC and no Visual Studio on Windows for my Urho stuff, and so is the company where I work: we ship and sell our software like that (C++ with Qt via GCC+QtCreator, no Visual Studio or MSBuild). This has the advantage of having more modern compilers (GCC for Windows and Linux, Clang for MacOS X) and no Microsoft stuff. It may be unusual, but Visual Studio and its compiler are not required to produce commercial or other software for Windows. You're the one proving yourself uninformed. ;-)

Also, another common misconception: one does not need to use Visual Studio and the Microsoft API on Windows, XCode with the Mac API on MacOS, and so on to produce and deploy software on multiple systems. We are using C++ with Qt and QtCreator to deploy on Windows and MacOS, and can potentially also deploy to GNU/Linux, Android, iOS and some others with >99% the exact same code. There are only some minor differences that need to be taken into account. Some companies/developers seem to completely recreate their software if they want to have it on other platforms as well; they tend to even use different programming languages and environments, like ObjectiveC/Swift for MacOS and C# for Windows. Ha. They're proving themselves uninformed. ;-)

@JSandusky: Even if an issue is opened and is not perfectly clear or perfectly written in your view, starting with such a rage is a really bad start and a bad attitude. You continue by making big claims and acting from a really high horse (or whatever is said in english) like you know everything. It's way better to be more modest and not like "You're such a moron, it's like that and that" (I'm exaggerating). One can fall quite deep if being incorrect, it's also quite escalating, not nice, bossy and also unprofessional. I haven't looked at your work or profile, you may be doing great stuff but such an attitude is not great at all. In a commercial software setting you will lose customers if you act like that to them ("serious financial consequence"). #SoftSkills

JSandusky commented 8 years ago

@gawag I do agree with your sentiment to a degree, However when I see anything that resembles the nonsense that someone would say about writing a shader (which is no different than writing pure C with some C++ boons) the only rational response is to fly into a rage. Compilers aren't magic, IDEs aren't magic, and shaders aren't magic ... incompetency with an IDE is not a good reason for something.

I find the request to be trite at best, and I am content with manually merging foreign and local sources rather than accepting such a thing, because it is genuinely a case of "you don't know your tools," and therefore easily worked around.

You continue by making big claims and acting from a really high horse (or whatever is said in english) like you know everything.

Pick something, I'll race you to both success and the most generic implementation.

Just to remind you, I'm running real-time hybrid GI at the present using precalculated form factors (reevaluated against a procedural sector based environment) with transform feedback to evaluate the lighting contributions and reflection. In a codebase built around the OGL and able to dump the OGL rules at any point (logistical insanity).

Orders of magnitude difference. Pretty difficult to ascertain though, as I've removed my Angelscript IDE, geometry shader, tessellation shader, CPU skinning, IK, RakNet, asPEEK angelscript debugging, and compute branches. Basically everything pertaining to Urho3D I've removed from public access.

Just added vector fields to particle effects earlier today, totally awesome ... in about 30 minutes. If you can't implement vector fields faster than that then you're not worth my time.


I'm opting out of further involvement. The sandbox isn't deep enough and I have no tolerance for anyone that can't bother to read RFCs, thus that's the right decision.

damu commented 8 years ago

@gawag I do agree with your sentiment to a degree, However when I see anything that resembles the nonsense that someone would say about writing a shader (which is no different than writing pure C with some C++ boons) the only rational response is to fly into a rage. Compilers aren't magic, IDEs aren't magic, and shaders aren't magic ... incompetency with an IDE is not a good reason for something.

I'm not sure if I know what you mean. The idea is not really about shaders (that could have been made more clear), it's more about giving (some of) the build options Urho has been built with to the using application. Some users may want to act differently if Urho was built, for example, with URHO3D_LUAJIT, such as using their own Lua JIT compiler if Urho's is not JIT (for performance reasons). Same with DirectX 11 vs. 9. Some applications may act quite differently depending on the available graphics API, beyond just (automatically) using different shaders.
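
A short sketch of the same idea for a non-graphics option, assuming URHO3D_LUAJIT / URHO3D_LUA were likewise exposed to the application (the names mirror the CMake options; whether they get baked into a header is exactly what this issue asks for):

    // Hypothetical: report which scripting backend the engine was built with,
    // so the application can decide whether to bundle its own JIT.
    bool EngineLuaIsJitCompiled()
    {
    #if defined(URHO3D_LUAJIT)
        return true;    // Urho3D was built with LuaJIT
    #else
        return false;   // plain Lua (or none); the app could supply its own JIT
    #endif
    }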

I find the request to be trite at best, and I am content with manually merging foreign and local sources rather than accepting such a thing, because it is genuinely a case of "you don't know your tools," and therefore easily worked around.

So you mean such a change would be better made in the user's application / build process? Can Urho be built with different options than the using application and still be used normally? There seem to be fewer options for the using application. Could there be a case of really needing Urho's build options and not those of the user application? Also, there may be 3rd party code for Urho in use by the user which does different stuff depending on the build options of Urho, so integrating such code would be more work if the build process has to be changed.

There also seems to be the idea of always building the user's application with the same options that Urho has been built with. That would be really neat, as it would save time and be less error prone compared to having to manually compare and activate CMake options. There are rather abstract linking errors if Urho was built, for example, with DirectX 9 and the user application is built with OpenGL. Could that be done? I want Urho to be way easier to use, and such a build process simplification would be one big step.

[...] Just added vector fields to particle effects earlier today, totally awesome ... in about 30 minutes. If you can't implement vector fields faster than that then you're not worth my time.

These parts seem to be about two things:

- You bragging about stuff you are doing and how fast, which is totally unrelated to this issue/thread.
- You have removed public access to stuff you have done; I also don't get the relation to this issue/thread.

This was never about your work or your ability to do stuff, it is about your way too aggressive and hostile reaction to an idea about an Urho change.

JSandusky commented 8 years ago

Partial quote:

I'm not sure if I know what you mean. The idea is not really about shaders (that could have been made more clear), it's more about giving (some of) the build options Urho has been built...

I gave shaders as an example of something that is less difficult than eating a bagel that somehow manages to be classified as hard. Somehow switching vec3 to float3 and "mix" to "clamp" is difficult. The example was for comparison purposes. Unfortunately that was clearly lost.

So you mean such a change would be better made in the users application / build process?

No, I'm stating that I don't care, because it's trite for ME to work around. As in, "meh, whatever ... I can deal with it."

Yes. That wasn't the point though. One could act differently in code if some materials are for example only available for OpenGL, like fallback materials.

No, you already have your platform defines. That's not a good reason. If you're compiling for RPI you already have the platform defines for it, so use them.

These parts seem to be about two things:

You bragging about stuff you are doing and how fast, which is totally unrelated to this issue/thread. You have removed public access to stuff you have done; I also don't get the relation to this issue/thread. This was never about your work or your ability to do stuff, it is about your way too aggressive and hostile reaction to an idea about an Urho change.

Seriously. You said:

@JSandusky: Even if an issue is opened and is not perfectly clear or perfectly written in your view, starting with such a rage is a really bad start and a bad attitude. You continue by making big claims and acting from a really high horse (or whatever is said in english) like you know everything. It's way better to be more modest and not like "You're such a moron, it's like that and that" (I'm exaggerating). One can fall quite deep if being incorrect, it's also quite escalating, not nice, bossy and also unprofessional. I haven't looked at your work or profile, you may be doing great stuff but such an attitude is not great at all. In a commercial software setting you will lose customers if you act like that to them ("serious financial consequence"). #SoftSkills

Tell me how that isn't an explicit call out and therefore about me?

You hash-tagged it. It's a very explicit challenge.

Reminder, in case you forgot, you hash-tagged it. Tell me again how you didn't drag me into it? Because you did in that paragraph and by hash-tagging it.

So what? Should we each pick two and let a forum vote decide the subject matter? You picked the fight; either fight it out or rebuke the grounds on which you declared it.


Also, I never called the poster a moron. I repeatedly hammered it down that the request would not fix his problem. No one has addressed that, possibly because the aggressiveness I've used has made it taboo now to say "no, #defines will not fix a problem with your content."

Never did the original poster present anything resembling a "here's my content" for anyone to check and verify how reasonable it is. The poster never stated anything regarding path handling or other such issues that could very well lead to these scenarios. Which is why I have a "meh, don't give a shit" attitude toward methods that would address the inclusion of #defines; I'll just keep asking "Graphics" who he is.


The reason for something is more important than the end result. As posted, the reason is bad but livable, and I look forward to the next post by the requester about how it didn't fix his problem.

friesencr commented 8 years ago

I am tired of reading this thread. I am going to lock it and if a maintainer fixes the issue they can close it.

weitjong commented 8 years ago

The following build options should be auto-discovered based on the found Urho3D library:

More can be added later if needed, as the new mechanism can easily bake and later auto-discover any compiler define tied to a build option. PRs are welcome.

weitjong commented 8 years ago

I have unlocked the issue for open discussion. Don't make me regret it. It can be re-locked in no time.

bvanevery commented 8 years ago

Ok, running a Windows CMake build for the VS2015 64-bit generator, and selecting a DX11 build, I get the following in Urho3D.h:

#define URHO3D_STATIC_DEFINE
/* #undef URHO3D_OPENGL */
#define URHO3D_D3D11
#define URHO3D_SSE

That's cool. Thanks!

I'm a little puzzled by the following comment in Urho3D-CMake-common.cmake though:

# The URHO3D_OPENGL option is not defined on non-Windows platforms as they should always use OpenGL

That would make cross-platform C++ code more difficult to write. I would expect that one would want URHO3D_OPENGL defined wherever OpenGL is in use. I realize that this mirrors the actual build options at present, and that it can be worked around by guarding with #ifdef _WIN32. But I think a define that simply states OpenGL is in use would be more straightforward. Not realizing this quirk could trick a Windows-centric author into writing an OpenGL code path that is not in fact obeyed on most platforms.
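
A sketch of the workaround I mean, taking the CMake comment at face value (USING_OPENGL is a local helper macro invented for this example):

    // Since URHO3D_OPENGL is only emitted on Windows, treat "not Windows" or
    // "Windows with URHO3D_OPENGL" as the OpenGL path.
    #if !defined(_WIN32) || defined(URHO3D_OPENGL)
        #define USING_OPENGL 1
    #else
        #define USING_OPENGL 0
    #endif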

I'm wondering if there should be an explicit DX9 define for similar reasons, but at least there's no cross-platform confusion.

I do wonder at why URHO3D_64BIT wouldn't be applicable to VS users. It is possible to do 32-bit or 64-bit builds in VS.

I'm afraid that I don't have a test case in actual code for this mechanism at present. The original precipitating issue that I mentioned, gawag just up and wrote HLSL shaders for. I think in coming months though, I'll pull ahead in the opposite direction. I'll have HLSL, there won't be any GLSL. So I'll kick its tires then.

Ergo I wouldn't worry about "further improvement" until someone comes along with a real need for this in the wild. We had one, but it passed. Thanks again.

weitjong commented 8 years ago

The URHO3D_OPENGL option is not defined on non-Windows platforms as they should always use OpenGL

Yes, the wording was a little bit confusing. It was actually just trying to say that the option is not available. The build system will set URHO3D_OPENGL to 1 internally and always use OpenGL regardless, even when a naive Linux user passes in URHO3D_OPENGL=0 at the command line. At the moment there are no other options; well, not until Vulkan comes.

I do wonder at why URHO3D_64BIT wouldn't be applicable to VS users. It is possible to do 32-bit or 64-bit builds in VS.

You are quoting the comment out of context again. For VS users, URHO3D_64BIT is an input variable, i.e. VS users have to pass it in when calling CMake. This is because MSVC is not a multilib-capable compiler, unlike GCC or Clang. MSVC has one 32-bit compiler version and one 64-bit compiler version to do 32-bit and 64-bit builds, respectively, while multilib GCC/Clang can do all that with a single binary, simply by altering one compiler flag (the -m flag). So, for GCC/Clang it is possible to use the same compiler to do both builds, and hence it is possible for our FindUrho3D module to return URHO3D_64BIT as an output variable based on the value of the -m compile flag. Anyway, this limitation is on the CMake/VS generators side, so we cannot do anything about it even though it is not so nice for our VS users.

bvanevery commented 8 years ago

But when CMake is initialized, at least in the CMake GUI, the first thing the user does is select whether they want a 32-bit or 64-bit generator for VS. It's confusing, and that detail can be missed. I actually filed a feature request with CMake about it, suggesting that CMake could present a better initial guess to users: https://public.kitware.com/Bug/view.php?id=15698 Like, if I'm on a 64-bit system, why should CMake present me with a 32-bit generator as the default? Or why offer a default compiler that isn't even on my system? I'm not really expecting action on it though. Sometimes I've seen communication on things I filed 8 years ago! But who knows, now that I've filed the bug maybe someone else will run into it and agree with me.

Anyway, if one goes through the CMake GUI, then URHO3D_64BIT should be deduced, and I believe it is. Although maybe it is deduced because of my 64-bit OS, and not because of my choice of CMake generator? If so, that would be a subtle bug. I haven't checked.

If one's going through the command line via one of the cmake_vs*.bat files... I'm concerned that the generator names may not be canonicalized properly? The important line in cmake_generic.bat is:

    if "%~1" == "-VS" set "OPTS=-G "Visual Studio %~2%arch%""

but (say) "Visual Studio 14 Win64" is not the name that appears in CMake GUI. It is "Visual Studio 14 2015 Win64". I never use the .bat files, so I haven't tested this. I don't know if CMake has alternate generator names than what is displayed.

weitjong commented 8 years ago

I think you have grasped the nature of the problem with the VS generator(s). The fact that you have to choose between the 32-bit and 64-bit generators explicitly up front when using the GUI is exactly why a CLI user has to pass URHO3D_64BIT=0 or 1 initially. We can name the variable anything we like, as long as we can tell CMake which generator variant we want to use. As for why the CMake developers have two variants of the VS generators, you have to ask them. IMHO, though, this is already the best they can do given how VS works.

You have over-simplified things to assume that a user on a 64-bit Windows host system would only want to target the 64-bit Wintel/AMD platform. I am not really familiar with VS, but due to my recent work setting up our CI jobs on AppVeyor, I quickly learned that VS, in its own weird way (from the point of view of a Linux guy), is fully capable of cross-compiling. So 64-bit VS can target not only 64-bit but also 32-bit, or even ARM for that matter. As a side note, it would be interesting to see VS experts in our limited community integrate that VS capability into our build system. Anyway, my point is that the CMake developers cannot assume for sure which platform you want to target simply because you have a 64-bit host.

About cmake_generic.bat: if it were wrong, we would have known about it from our users by now. I think we have more VS users in our community, no? And now we can even rely on AppVeyor for that. You don't have to worry about this yourself :)

bvanevery commented 8 years ago

Except that it's not reasonable to assume 32-bit builds by default, this idea that URHO3D_64BIT must by rights be passed. We live in a 64-bit era; URHO3D_64BIT=1 should be the default behavior.

To be honest, I'm not sure why most Windows users would stick with a .bat interface to CMake. It is far easier, in terms of understanding options and behavior, to use the CMake GUI. Some people might be strongly oriented toward a command line, or worried about scriptable tools, but I doubt those are the usual use cases. So my concern was about how much testing the .bat files actually get from users. But I did try them now and they do work. Maybe CMake is pattern-matching the input.