Created attachment 126049
test case that stops when glRenderMode exhibits high latency
For unknown reasons, my GL application became very slow after migrating to recent platforms.
After dissecting the application, I found that the slowdown happens when using the GL_FEEDBACK render mode.
glRenderMode() sometimes takes a very long time to return. I made a single test-case app that monitors the time spent in that call to reproduce the problem.
The problem also occurs with LIBGL_ALWAYS_SOFTWARE=1.
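For reference, here is a minimal sketch of the kind of timing loop the test case uses (the attached file may differ in details; the buffer size and geometry here are illustrative):

/* Minimal sketch of the timing test; assumes an existing legacy GL
 * context (e.g. created with GLUT). Buffer size and geometry are
 * illustrative; the attached test case may differ. */
#include <GL/gl.h>
#include <stdio.h>
#include <time.h>

static double now_ms(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec * 1000.0 + ts.tv_nsec / 1e6;
}

static void time_feedback_pass(void)
{
    GLfloat feedback[4096];
    double t0, t1;
    GLint n;

    glFeedbackBuffer(4096, GL_3D, feedback);

    t0 = now_ms();
    glRenderMode(GL_FEEDBACK);            /* enter feedback mode */

    glBegin(GL_TRIANGLES);                /* draw some trivial geometry */
    glVertex3f(0.0f, 0.0f, 0.0f);
    glVertex3f(1.0f, 0.0f, 0.0f);
    glVertex3f(0.0f, 1.0f, 0.0f);
    glEnd();

    /* leaving feedback mode flushes and returns the value count */
    n = glRenderMode(GL_RENDER);
    t1 = now_ms();

    printf("feedback pass: %d values, %.2f ms\n", n, t1 - t0);
}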
GL_FEEDBACK and GL_SELECT are implemented entirely in software. This is unlikely to be solved any time soon; improving the speed of extremely legacy features isn't high on anyone's priority list. Sorry :(
I may not have understood everything... Do you mean glRenderMode(GL_FEEDBACK) is obsolete and that I should use newer OpenGL features?
Yes, glRenderMode has been deprecated for a long long time.
If you want to disable rendering, you can use glEnable(GL_RASTERIZER_DISCARD).
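For example, a minimal sketch (GL_RASTERIZER_DISCARD needs a GL 3.0+ context; vertex_count and the bound vertex data are assumed to exist already):

/* Run the vertex stage without producing any fragments.
 * Assumes a GL 3.0+ context with a program and vertex array bound;
 * vertex_count is an illustrative placeholder. */
glEnable(GL_RASTERIZER_DISCARD);   /* discard primitives before rasterization */
glDrawArrays(GL_TRIANGLES, 0, vertex_count);
glDisable(GL_RASTERIZER_DISCARD);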
Transform feedback (GL 3.0) can record a subset of your vertex shader's outputs into a buffer. Alternatively, your vertex shader could write data into a shader storage buffer object (SSBO) (GL 4.3).
Unfortunately, both of those options give you data pre-clipping, while glRenderMode(GL_FEEDBACK) gives you data post-clipping. It's a bit awkward because there isn't an exact replacement for the deprecated functionality.
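To illustrate the transform feedback route, here is a rough sketch (the varying name "out_position" and the variables program, tfbo, buf_size, vertex_count, and cpu_buffer are illustrative placeholders, not anything from your test case):

/* Capture a vertex shader output into a buffer via transform feedback.
 * Assumes a GL 3.0+ context and a compiled, attached vertex shader that
 * declares a varying named "out_position". */
const char *varyings[] = { "out_position" };
glTransformFeedbackVaryings(program, 1, varyings, GL_INTERLEAVED_ATTRIBS);
glLinkProgram(program);                  /* must (re)link after setting varyings */

GLuint tfbo;
glGenBuffers(1, &tfbo);
glBindBuffer(GL_TRANSFORM_FEEDBACK_BUFFER, tfbo);
glBufferData(GL_TRANSFORM_FEEDBACK_BUFFER, buf_size, NULL, GL_STATIC_READ);
glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, tfbo);

glEnable(GL_RASTERIZER_DISCARD);         /* skip rasterization entirely */
glBeginTransformFeedback(GL_TRIANGLES);
glDrawArrays(GL_TRIANGLES, 0, vertex_count);
glEndTransformFeedback();
glDisable(GL_RASTERIZER_DISCARD);

/* Read the captured (pre-clipping) vertex data back on the CPU. */
glGetBufferSubData(GL_TRANSFORM_FEEDBACK_BUFFER, 0, buf_size, cpu_buffer);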
The thing is, GL_FEEDBACK and GL_SELECT don't really exist in hardware. We have to emulate them somehow in the driver.
Thanks for your reply.
It is very good to know that GL_FEEDBACK is not done in hardware... I was sure it used the hardware and thus spared CPU time on small processor architectures, and I suspect I am not alone in that assumption.
I will migrate to the newer OpenGL features.