techtonik closed this issue 7 years ago.
The CPU can do 3D too, but it is much slower at real-time rendering, whether 2D or 3D. Also, Godot's rendering is mostly sprite based, so the GPU is the natural choice.
It is not really viable to do any kind of software rendering for games in 2017. Most polished games, even 2D ones, will end up using fancy particle effects (which will be GPU-based in 3.0) and shaders – and these are not doable with reasonable performance using only software rendering.
@honix why does sprite-based rendering require a GPU? As far as I remember, there are no sprite objects in the OpenGL specification, only textures, with no memory control and no room for pixel-level optimizations like in Quake or Second Reality. The CPU may be slower in relative terms, but if it still delivers 60 FPS against 400 FPS on the GPU, it doesn't matter. For sprite-based rendering the primary concern may be system memory speed, but watching HD video on my PC tells me that is not a problem.
@Calinou don't you think that requiring a modern GPU for all 2D games, just because some 2D games want to use fancy 3D effects, is somewhat wrong? I would like those features to be optional and to have the choice to raise my system requirements only if I really need them. The same should hold if I want my 2D game to use fancy effects that require OpenGL 4.0, the new Vulkan specification, or unique capabilities of a Google Pixel or some hardware AI rendering module.
The same should hold if I want my 2D game to use fancy effects that require OpenGL 4.0, the new Vulkan specification, or unique capabilities of a Google Pixel or some hardware AI rendering module.
That would require writing three or four different rasterizers with different capabilities. That's a lot of work and it would be hard to maintain. This is the reason OpenGL (and graphics libraries in general) were created in the first place: so you only need to implement the rasterizer once.
why does sprite-based rendering require a GPU? As far as I remember, there are no sprite objects in the OpenGL specification.
Not an expert here, but alpha channels, object culling, z-buffer handling, shading, color blending, filters, and so on are all pretty hard on the CPU (both computationally and to implement).
This is the reason OpenGL (and graphics libraries in general) were created in the first place: so you only need to implement the rasterizer once.
That was 25 years ago. And the word "rasterizer" is not from the 2D space; for a 2D rendering backend I'd expect something like a "blit pipeline" instead. Choosing that backend and checking that its features match those the game requires could be the job of the 2D engine. With the right abstraction around features at the game designer's level, it could let people choose between compatibility and fanciness, and would also let developers port more features or implement compatible workarounds and alternatives.
Not an expert here, but alpha channels, object culling, z-buffer handling, shading, color blending, filters, and so on are all pretty hard on the CPU (both computationally and to implement).
I agree that this is a common assumption about CPU capabilities, but on the other hand we don't have data on what CPU optimizations are capable of. I wouldn't be surprised if alpha channels, sprite culling, and z-buffers are already implemented in even primitive graphics card drivers, because modern OSes use them for the ordinary UI.
There is zero benefit in using the CPU to render 2D, and zero hardware without GPU support. This is never going to happen.
I wouldn't be surprised if alpha channels, sprite culling, and z-buffers are already implemented in even primitive graphics card drivers, because modern OSes use them for the ordinary UI.
That might be the case, but I have no proof of it, and running any modern UI on non-accelerated hardware actually leads to awful performance, mouse lag, and so on. Do you have any knowledge that this kind of optimization exists, or is it just an assumption? Because I can see no modern UI (i.e. with transparency, effects, and the like) that does not lag on non-accelerated hardware.
Additionally, that would require writing custom renderer code for every platform, since the Windows default driver probably exposes different functions than Linux, Mac, Android, etc. (and, as you said, OpenGL was born 25 years ago to solve this problem, and it still does!).
Of course you can write very optimized code in assembly, like MenuetOS does, and get wonderful performance, but it's going to take a very long time and will result in a very rigid, hard-to-port, very limited system that only runs on very specific hardware.
Fun fact: you can already use your CPU as a GPU on modern OSes (see LLVMpipe, https://www.mesa3d.org/llvmpipe.html). The performance is not good, but if the game is very lightweight, as you say, it might work.
Also, GPUs have provided acceleration for 2D drawing for as long as I can remember. Even my old Trident graphics card on a 16-bit ISA bus in 1994 supported rectangle blitting, and early S3 GPUs in 1996 had 3D acceleration too...
Going back to the NES and C64 in 1983, they also did everything using the GPU. I think software blitting was only popular in the early 90s, when VGA was just introduced... The only other time it was widely used was Flash, which did vector graphics on the CPU, and most Flash stuff never even reached 60 fps...
There is zero benefit in using the CPU to render 2D, and zero hardware without GPU support. This is never going to happen.
This issue is deeper than its title, but I am not going to get over-defensive. Yes, maybe the title is misleading. I didn't mean that the game engine should run fine without a graphics card at all. I meant that if the game engine architecture gives developers this option, it is more flexible to adapt to any graphics card and platform peculiarities, instead of just rigid OpenGL 3.0 support.
I was just thinking that there are engines, like Klei Entertainment's, that give good performance in 2D where other engines are equally sluggish in 2D and 3D. Just a random link about an SDK that promises to go down to assembly optimizations as needed: http://getmoai.com/wiki/index.php?title=Moai_SDK_Basics_Part_One
Back in the day people tuned their games to specific graphics cards, there was no internet to sync the efforts, and games were fast on that hardware. The internet should have added something, like distributed competence to implement and test specific features/optimizations on specific graphics cards, but I don't see it happening. Game engines attract a lot of developers, many more than projects like https://people.freedesktop.org/~nh/piglit/ that try to address this for OpenGL. I just wish there could be a great framework / constructor for such efforts in the 2D space.
Other engines are moving from software to hardware because the software performance is terrible on mobile and html5, and even desktop is limited.
I don't see the point of going where everyone else is trying to escape from.
it is more flexible to adapt to any graphics card and platform peculiarities, instead of just rigid OpenGL 3.0 support. Just a random link about an SDK that promises to go down to assembly optimizations as needed
You can actually do that. The whole rendering part is just a few files; if you want to optimize for specific hardware, you can replace the renderer part (drivers/gles2, drivers/gl_context) for your non-GL card so that it uses its own driver calls to draw, and you can easily use assembly (or just C++) code to optimize other parts of the engine via custom modules or by editing the engine source code itself (which I do for my game, for example).
The main issue is that any modern card implements OpenGL, including most single-board computers (e.g. the Raspberry Pi, which runs fantastically for 2D thanks to the amazing work from efornara). And any modern processor includes an integrated graphics card supporting OpenGL (like my 4-year-old development machine, which does not have a dedicated video card). I don't see why time should be spent trying to support specific hardware from the 90s or early 2000s while the market is clearly going toward standard libraries like OpenGL. If a game developer wants to make a game targeting specific hardware he can, but I don't see the point of having core devs work toward supporting something like the S3 Savage 3D, just to mention one card I remember from back in the day.
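To make the "replace the renderer part" idea concrete, here is a minimal sketch of what a pluggable 2D backend could look like. The interface and class names below are invented purely for illustration; they are not Godot's actual rasterizer API.

```cpp
// Hypothetical sketch of a pluggable 2D backend. These names are invented
// for illustration only and are not Godot's actual rasterizer interface.
#include <cstdint>
#include <vector>

struct Color32 { uint8_t r, g, b, a; };

// The engine core would talk to this interface; each platform or driver
// ships its own implementation (GL, software blitter, console 2D hardware...).
class Renderer2D {
public:
    virtual ~Renderer2D() {}
    virtual void clear(Color32 color) = 0;
    virtual void blit(const Color32 *src, int src_w, int src_h, int dst_x, int dst_y) = 0;
    virtual void present() = 0;
};

// A CPU implementation just owns a framebuffer in system memory.
class SoftwareRenderer2D : public Renderer2D {
public:
    SoftwareRenderer2D(int w, int h) : width(w), height(h), pixels(w * h) {}

    void clear(Color32 color) override {
        for (Color32 &p : pixels)
            p = color;
    }

    void blit(const Color32 *src, int src_w, int src_h, int dst_x, int dst_y) override {
        // Plain copy with clipping; a real backend would also blend and batch.
        for (int y = 0; y < src_h; y++) {
            for (int x = 0; x < src_w; x++) {
                int dx = dst_x + x, dy = dst_y + y;
                if (dx >= 0 && dy >= 0 && dx < width && dy < height)
                    pixels[dy * width + dx] = src[y * src_w + x];
            }
        }
    }

    void present() override {
        // Hand `pixels` off to the window system (X11, GDI, SDL surface, ...).
    }

private:
    int width, height;
    std::vector<Color32> pixels;
};
```

The point of the sketch is only that the engine-facing surface of a 2D backend can be small; the hard part, as noted above, is keeping several such backends feature-compatible and maintained.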
I think this is not a question about Godot or any specific program, but about the driver stack you're using on your operating system.
Mesa includes three different software implementations of OpenGL: OpenSWR (http://openswr.org/), LLVMpipe (https://www.mesa3d.org/llvmpipe.html), and softpipe. All three implement OpenGL 3.3 (https://mesamatrix.net/), which is the requirement for Godot 3 on desktop.
OpenSWR apparently has Windows binaries as well (http://openswr.org/build-windows.html).
There is a 2D abstraction, and you can make a VisualServer that does 2D on the CPU, but it'll probably be very difficult to mix that with accelerated 3D.
Other engines are moving from software to hardware because the software performance is terrible on mobile and html5, and even desktop is limited.
I don't see the point of going where everyone else is trying to escape from.
I don't see where it conflicts with my point. The point is that OpenGL may be a terrible standard for mobile and resource-critical platforms (battery life, memory usage, etc.), especially if we require it for 2D. That's why we've got Vulkan. And even Vulkan may not be as good for 2D games as some Matrox graphics card from the past.
So if the Godot engine could target hardware platforms feature by feature and tell developers "hey, that component/feature/effect bumps the requirements for the user's computer to OpenGL 3.x, and here are the stats of the users who own such computers, and your grandma/girlfriend/wife/father is not on the list", that could really help. Filtering components by their requirements and making them expose features tuned to specific platforms could be awesome. Tying them to tests that users can run on their own hardware could be great for the ecosystem as well. Telling developers "hey, your game could gain 5+ FPS on a Raspberry Pi if you swap this effect library for this one" is a good incentive to contribute.
If people are mining coins, they could just as well mine performance tests.
If a game developer wants to make a game targeting specific hardware he can, but I don't see the point of having core devs work toward supporting something like the S3 Savage 3D, just to mention one card I remember from back in the day.
Right. There is no point in having core devs implement S3 or other specific hardware. The point is that core devs could provide a first-class interface for alternative platforms in their engine (one side) that feature-matches the 2D abstractions in the API (the other side), to enable easy exchange and integration of such components.
@techtonik, I do not want to convince you that your idea is bad. Instead, I'll give you a couple of ways to get what you want: 1. Simple: build Mesa (https://en.wikipedia.org/wiki/Mesa_(computer_graphics)) and put the DLL files in the same folder as the executable. Your project will then draw everything using the CPU (this works on Windows; on Linux and other platforms it may take some extra fiddling). 2. Complicated: implement your own renderer. To do this, you would have to rewrite a couple (maybe a few more) of the classes in the engine. If you do not know how to program in C++, or do not want to, you can find like-minded people, pool some money, and commission the work from someone (and if you cannot find anyone else who needs it, that should tell you how bad the idea is).
I'm not sure if you've ever programmed in raw OpenGL, but writing cross-platform OpenGL code is anything but a sane exercise. Even after 20 years, the situation hasn't improved; it's actually gotten worse with shaders. The benefit of CPU rendering is utter reliability: your code will render identically on every single system that supports your compiler. As a developer you can properly step through your code, instruction by instruction, and debug, whereas with shaders you can only approximate these things.
Unreal Tournament ran on a 90 MHz Pentium in 1998 purely in software. Surely a 4.2 GHz quad-core i7 with hyperthreading and SIMD instructions can run a 2D game. Beautifully, even.
And your users will never run into driver issues or card issues. Having a software renderer for 2D games would be pretty cool.
What's the issue here, though? The "2D abstraction" is there already: it's the "canvas_*" stuff in VisualServer. You can make one that implements just that, plus texture management.
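For a sense of the per-pixel work such a software canvas backend would end up doing, the core of drawing one textured rect is an alpha-blended blit like the sketch below. This is plain standalone C++ for illustration only, not Godot's VisualServer API.

```cpp
#include <cstdint>

// One pixel in 8-bit RGBA. A software 2D canvas backend ends up running a
// loop like this for every textured rect it draws ("source over" blending).
struct Pixel { uint8_t r, g, b, a; };

inline Pixel blend(Pixel dst, Pixel src) {
    // Integer approximation of: out = src * a + dst * (1 - a)
    unsigned a = src.a, ia = 255 - a;
    Pixel out;
    out.r = (uint8_t)((src.r * a + dst.r * ia) / 255);
    out.g = (uint8_t)((src.g * a + dst.g * ia) / 255);
    out.b = (uint8_t)((src.b * a + dst.b * ia) / 255);
    out.a = (uint8_t)(a + (dst.a * ia) / 255);
    return out;
}

// Blend a sprite into a framebuffer at (x0, y0); clipping omitted for brevity.
void draw_sprite(Pixel *fb, int fb_w, const Pixel *spr, int spr_w, int spr_h,
                 int x0, int y0) {
    for (int y = 0; y < spr_h; y++)
        for (int x = 0; x < spr_w; x++) {
            Pixel &dst = fb[(y0 + y) * fb_w + (x0 + x)];
            dst = blend(dst, spr[y * spr_w + x]);
        }
}
```

Whether a loop like this, even vectorized with SIMD, keeps up with several layers of full-screen overdraw at 60 fps is exactly the disagreement in this thread.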
@techtonik We've discussed this many times over the years in the various issues you opened: give up, it won't happen. Godot Engine is a modern 2D and 3D game engine; it can't cater to very low-end setups, as much as we'd like to support both the most advanced 2017 research in 3D rendering and toasters from the 90s.
There are other frameworks and maybe lightweight engines that are much better suited to your use case than Godot.
I can't attach the label "modern" to a game engine that is hardcoded to the 10-year-old OpenGL 3 standard. For me, "modern" means that it includes a toolset that helps game developers deliver an excellent game experience across the capabilities of phones, HTML5 web apps, desktops, and IoT devices (VR, webcams, EEG, etc.).
I realize that it may not be technically feasible to build such an architecture, but even writing about this with examples may be as useful for the community as describing the Godot 3 renderer design.
Oh, cut the crap already. OpenGL 3 is too recent for you (2.1 was already too recent too), but it's not modern enough? You're the only user ever to have an issue with how we handle our graphics API requirements, so just face the facts: Godot is not made for you. There are plenty of other fish in the sea, go catch them.
Otherwise, if you really really really want this to happen, fork Godot and implement it. If it ends up being awesome and a game changer, we might even consider a PR, who knows. But let me be clear once again: no existing Godot dev has any interest (or even sees any advantage) in what you're describing here. So until code and performance tests back up a real use case, it will just be wishful thinking and something we don't waste our limited time on.
My old laptop, which was $200 off Craigslist, renders Godot apps at 60+ fps.
Also, this suggestion would undermine and disrespect the thousands of devs that worked HARD for their money to buy a GPU to achieve higher fps. Just going to throw all that away and make everyone go buy $400 Coffee Lake 8700Ks? Not fair imo.
@reduz
There is zero benefit in using the CPU to render 2D, and zero hardware without GPU support.
But there is a lot of hardware with GPUs that Godot doesn't support. It would be nice if one could at least render the editor on the CPU.
It would be nice if one could at least render the editor on the CPU.
Mesa's software OpenGL implementation is sufficient to run Godot 3.0. You can download Windows binaries on mesa-dist-win.
But there is a lot of hardware with GPUs that Godot doesn't support.
With the GLES2/GL 2.1 setup? I find it hard to believe that there is "a lot of hardware" that doesn't support it. And as said, you can find CPU support for the OpenGL API.
With the GLES2/GL 2.1 setup? I find it hard to believe that there is "a lot of hardware" that doesn't support it.
On my computers that only support OpenGL 2.1, the editor refuses to start with the message:
Your system's graphic drivers seem not to support OpenGL 3.3 / OpenGL ES 3.0, sorry :( Godot Engine will self-destruct as soon as you acknowledge this error message
If there is a way to start it another way, then this should probably be noted in the message itself.
And as said, you can find CPU support for the OpenGL API.
There seems to be no documentation on starting Godot with CPU rendering.
@Tsutsukakushi only on master: run the executable with --video-driver GLES2. If you return from the editor to the project manager, it will run in GLES3 mode for now.
I suppose that the project manager will be able to detect if GLES3 is not supported and restart in the other mode automatically on stable.
You can try this fork of 3.0, which uses the GLES2 renderer by default (it's a WIP, with the current limitations, and particles are not usable): https://github.com/efornara/godot/releases
Some users are already using this as a workaround for mobile on simple games.
OpenGL ES 2.0 Renderer: Mesa DRI Intel(R) Ironlake Mobile
ERROR: _gl_debug_print: GL ERROR: Source: OpenGL Type: Error ID: 9 Severity: High Message: GL_INVALID_OPERATION in glTexImage2D(bad target for texture) At: drivers/gles2/rasterizer_gles2.cpp:111.
(the same error is repeated many times)
handle_crash: Program crashed with signal 11
Dumping the backtrace. Please include this when reporting the bug on https://github.com/godotengine/godot/issues
[1] /lib/x86_64-linux-gnu/libc.so.6(+0x35fc0) [0x7f2d51e78fc0] (??:0)
-- END OF BACKTRACE --
Aborted
@eon-s doesn't seem to work.
Guys, open a new issue please.
@reduz: Can you lock this issue so that it doesn't get bumped again?
According to slant.co, Godot is now the top 2D game engine https://www.slant.co/topics/341/viewpoints/22/~best-2d-game-engines~godot
It would be even more awesome if the 2D engine could run without a 3D accelerator.
These things mean more funding, better performance for specific cases (window management, real-time raster video rendering, etc.), and more attention from the low-powered device crowd and from highly skilled math hackers.