There is a small bug in glxextensions.c at line 321:

----------------------------
static void
__glXProcessServerString( const struct extension_info * ext,
                          const char * server_string,
                          unsigned char * server_support )
{
   unsigned base;
   unsigned len;

   (void) memset( server_support, 0, sizeof( server_support ) );
----------------------------

Here server_support is a pointer parameter, so sizeof( server_support ) is
sizeof( unsigned char * ) == 4 (on x86) instead of the intended array size
of 8, and the memset clears only part of the buffer.
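A minimal standalone sketch (not the Mesa source; the helper names and the
array size of 8 are illustrative assumptions) of why sizeof() misbehaves
here: once an array is passed as a pointer parameter, sizeof() reports the
pointer size, not the array size.

----------------------------
#include <stdio.h>
#include <string.h>

static void clear_wrong( unsigned char * server_support )
{
   /* server_support has decayed to a pointer, so this clears only
    * sizeof( unsigned char * ) bytes, e.g. 4 on 32-bit x86. */
   (void) memset( server_support, 0, sizeof( server_support ) );
}

static void clear_right( unsigned char * server_support, size_t len )
{
   /* The caller passes the real buffer length explicitly. */
   (void) memset( server_support, 0, len );
}

int main( void )
{
   unsigned char server_support[8];

   printf( "sizeof(array) = %zu\n", sizeof( server_support ) ); /* prints 8 */
   clear_wrong( server_support );                    /* clears only 4 bytes */
   clear_right( server_support, sizeof( server_support ) ); /* clears all 8 */
   return 0;
}
----------------------------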
Created attachment 6069 [details] [review]
solution, clean up the code

This patch also does some additional cleanup. It is not entirely trivial, so please recheck it.
Created attachment 6190 [details] [review]
Slightly cleaned up patch

Your patch looks valid. I cleaned it up a bit and removed the assertion that you described in #7354. Can someone who knows the code better than me commit it?
The problem is that __glXProcessServerString is used to process both the GLX extension string from the server *and* the GL extension string from the server. The bit-fields used to track GL and GLX extensions have different lengths, so __glXProcessServerString cannot know in advance how much to clear. The correct fix, which I have just committed, is to move the memset from __glXProcessServerString to __glXCalculateUsableExtensions. In that function, server_support is an array (instead of a pointer), so sizeof(server_support) yields the correct size.
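For reference, a hedged sketch of what that fix looks like; the function names match those discussed above, but the types, the array size, and the surrounding code are simplified assumptions rather than the exact Mesa sources.

----------------------------
#include <string.h>

struct extension_info;   /* opaque here; defined in glxextensions.c */

static void
__glXProcessServerString( const struct extension_info * ext,
                          const char * server_string,
                          unsigned char * server_support )
{
   /* No memset here: this helper is shared by the GLX and GL extension
    * paths, whose bit-fields have different lengths, so it cannot know
    * how many bytes to clear. */
   (void) ext;
   (void) server_string;
   (void) server_support;
   /* ... parse server_string and set bits in server_support ... */
}

void
__glXCalculateUsableExtensions( const char * server_glx_string,
                                const struct extension_info * known_glx_ext )
{
   unsigned char server_support[8];  /* size chosen for illustration only */

   /* server_support is an array in this scope, so sizeof() yields the
    * full buffer size and the whole bit-field is cleared. */
   (void) memset( server_support, 0, sizeof( server_support ) );

   __glXProcessServerString( known_glx_ext, server_glx_string,
                             server_support );
}
----------------------------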