paroj / gltut

Learning Modern 3D Graphics Programming
https://paroj.github.io/gltut/
MIT License

Suggest OpenGL 3.2 as a basis #50

Closed: paroj closed this issue 8 years ago

paroj commented 12 years ago

Originally reported by: Shawn Walker (Bitbucket: binarycrusader, GitHub: binarycrusader)


Instead of basing this work on OpenGL 3.3, I'd like to suggest you base it on OpenGL 3.2.

While I realize that many platforms have OpenGL 3.3 available, OpenGL 3.2 is currently the maximum supported on OS X (and OS X has an excellent implementation of it).

The gltut materials are excellent, and generally work on OpenGL 3.2 with very little modification. (Mainly just removing use of the layout keyword, and using a different shader version.)
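For concreteness, a minimal sketch of the kind of change described above (the shader contents are illustrative, not taken from the gltut sources):

```cpp
// GLSL 3.30 (OpenGL 3.3): the shader pins its own attribute index.
const char *vertexShader330 =
    "#version 330\n"
    "layout(location = 0) in vec4 position;\n"
    "void main() { gl_Position = position; }\n";

// GLSL 1.50 (OpenGL 3.2): drop the layout qualifier and lower the version...
const char *vertexShader150 =
    "#version 150\n"
    "in vec4 position;\n"
    "void main() { gl_Position = position; }\n";

// ...then assign the attribute index from the application side before linking:
// glBindAttribLocation(program, 0, "position");
// glLinkProgram(program);
```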

I don't see a particularly compelling reason to limit yourself to OpenGL 3.3 support, when OpenGL 3.2 allows you to fulfill all of the goals of writing about a modern graphics rendering pipeline.

Alternatively, a short section providing a summary of differences between OpenGL 3.2 and 3.3 would suffice for users following along.


paroj commented 12 years ago

Original comment by Jason McKesson (Bitbucket: alfonse, GitHub: alfonse):


Thank you for your understanding. However, one minor point:

> I would note that OpenGL 3.2 was finalized in December 2009, so you're talking about 2.5 years, not 3 years.

GL 3.2 was released August 3, 2009. The spec received some editorial modifications since then (which is where your December date comes from), but those were spec bugs, not real changes to the meaning of anything.

paroj commented 12 years ago

Original comment by Shawn Walker (Bitbucket: binarycrusader, GitHub: binarycrusader):


I disagree with your conclusions, but I completely respect your decision.

I would note that OpenGL 3.2 was finalized in December 2009, so you're talking about 2.5 years, not 3 years. And vendors do need time to actually deliver a full implementation, so Apple's really not that far behind. I remain hopeful that they will update to OpenGL 4.x when they end support for the last bits of desktop hardware that are incapable of anything beyond OpenGL 3.3.

I had only filed this request to see if this was an option that had been considered and the possible reasons behind it.

The materials you've written are by far the best modern introduction I've read to the OpenGL pipeline, and I was just hopeful that perhaps they might be accessible to a wider audience.

You have my sincere thanks for taking the time to reason out why you chose a different approach, and I look forward to the published result. (This definitely deserves to be published as a book when complete!)

paroj commented 12 years ago

Original comment by Jason McKesson (Bitbucket: alfonse, GitHub: alfonse):


> I don't see a particularly compelling reason to limit yourself to OpenGL 3.3 support, when OpenGL 3.2 allows you to fulfill all of the goals of writing about a modern graphics rendering pipeline.

The principal compelling reason is exactly what you removed: layout(location = #). It makes it very clear in the shader itself where the attribute indices come from. I don't have to explain a bunch of pre-linking functions (glBindAttribLocation). It's easier for the user to both use and understand.
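For illustration, a rough sketch of the pre-linking step in question, assuming hypothetical vertexShader/fragmentShader handles that have already been compiled:

```cpp
// Without layout(location) in the shader (GL 3.2 / GLSL 1.50), the attribute
// index has to be fixed on the application side before linking, or queried
// afterwards with glGetAttribLocation.
GLuint program = glCreateProgram();
glAttachShader(program, vertexShader);
glAttachShader(program, fragmentShader);

glBindAttribLocation(program, 0, "position");  // must happen before glLinkProgram

glLinkProgram(program);

// With GL 3.3's layout(location = 0) in the GLSL source itself, the
// glBindAttribLocation call above disappears; the index is visible in the shader.
```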

Oh, and don't forget about sampler objects, which are also 3.3 (and I use them all the time). I'm probably going to employ texture swizzling at some point too, which is 3.3.
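As a rough illustration of those two 3.3 features (the texture creation around them is assumed, not shown):

```cpp
// Sampler objects (GL 3.3) separate sampling state from the texture object:
GLuint sampler;
glGenSamplers(1, &sampler);
glSamplerParameteri(sampler, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glSamplerParameteri(sampler, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glBindSampler(0, sampler);  // applies to whatever texture is bound to unit 0

// Texture swizzling (also GL 3.3) remaps components at sampling time, e.g. to
// read a single-channel texture as grayscale RGB with full alpha:
GLint swizzle[] = { GL_RED, GL_RED, GL_RED, GL_ONE };
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzle);
```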

Ultimately, it comes down to this: there is no hardware that exists which supports 3.2 that *cannot* support 3.3. MacOSX *could* support it, but it doesn't (and probably won't anytime soon). I don't support MacOSX (if for no other reason than the fact that I can't test anything on MacOSX), so I don't see a reason why I should give up layout(location) syntax for a platform that I don't support.

I personally refuse to let Apple's unwillingness to update their GL implementation interfere with the progression of OpenGL. If they don't want to support GL versions released almost 3 years ago, and more modern extensions and so forth, so be it.

I might make a notation somewhere in an appendix about Apple's 3.2 implementation. But ultimately no, I'm not going to roll back the version on my tutorials. Honestly, the biggest reason I don't require 4.2 (and therefore separate_shader_objects, 420_pack, and all manner of other goodness) is because there's a lot less hardware out there that can run them. And even *that* excuse is becoming less credible by the day.
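For illustration only, one small example of the 4.2-level convenience being referred to: GLSL 4.20 (or the ARB_shading_language_420pack extension) lets a shader pick its own texture unit with layout(binding = #), so no glGetUniformLocation/glUniform1i call is needed on the application side. The shader below is a hypothetical sketch, not from the tutorials:

```cpp
const char *fragmentShader420 =
    "#version 420\n"
    "layout(binding = 0) uniform sampler2D diffuseTex;  // texture unit 0\n"
    "in vec2 texCoord;\n"
    "out vec4 outputColor;\n"
    "void main() { outputColor = texture(diffuseTex, texCoord); }\n";
```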

> Only AMD and nVidia support OpenGL > 3.2 currently on Linux (out of the major vendors).

There is also a notation on the "What You Need" page that explains that you need NVIDIA or AMD hardware only. I don't support Intel hardware of any kind, whether on Linux or Windows. Intel's drivers are terrible, and the open source drivers aren't getting the job done as far as modern GL support is concerned.

Again, I'm not going to hold up the progress of OpenGL just because Intel and the open-source community can't get their implementations working. The most effective way to push people to update their implementations is to make users *want* to update them. And that means users need to know what's out there.

paroj commented 12 years ago

Original comment by Shawn Walker (Bitbucket: binarycrusader, GitHub: binarycrusader):


I'd also add that on Linux, open-source drivers such as Intel's aren't expected to reach OpenGL 3.0 until the end of this year, and early 2013 is the best we can expect for OpenGL 3.1/3.2 support.

http://www.phoronix.com/scan.php?page=news_item&px=MTA5Njc

Only AMD and nVidia support OpenGL > 3.2 currently on Linux (out of the major vendors).
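If you want to see what your installed driver actually exposes, a minimal sketch (assuming an OpenGL context has already been created and made current, and that GL headers/loader are set up for your platform) is:

```cpp
#include <cstdio>
// GL headers / function loader as appropriate for your setup (e.g. GLEW or glload)

void printGLVersion()
{
    GLint major = 0, minor = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);  // these queries exist since GL 3.0
    glGetIntegerv(GL_MINOR_VERSION, &minor);
    std::printf("OpenGL %d.%d on %s / %s\n", major, minor,
                reinterpret_cast<const char *>(glGetString(GL_VENDOR)),
                reinterpret_cast<const char *>(glGetString(GL_RENDERER)));
}
```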