I know this question may sound a little silly, but I'd like to know if there is a way to do what I intend without brute forcing.
I am developing a quite big application that uses OpenGL for GUI purposes, and I've been using a lot of OpenGL extensions. Everything was developed on a machine with a good graphics card, so everything always worked fine.
Now I have tested my application on a machine with a weaker graphics card - the first one supports many OpenGL 4 features, while the second one only supports some features up to version 3 - and, unfortunately, a couple of things are not being displayed.
I'd say the reason is that I am somehow using an extension which is not supported on the weaker machine; however, I have no clue which extension or feature might be unsupported.
So, the question is: is there any way I can find out which extension it is, for example by disabling one extension at a time on the good machine and checking whether the problem appears there too? The code is too big to start commenting it out line by line on the good machine until I reproduce the problem.
You forgot to mention the OS :) There are system-dependent tools for doing that. – count0 – 2012-10-02T13:18:05.113
I'd use the OpenGL Extensions Viewer. If memory serves, it can produce lists in CSV files (or something similar), which should make comparison fairly simple (e.g., with `diff`). – Jerry Coffin – 2012-10-02T13:20:19.297