The buffer server_support is used as a bit field. Unfortunately, the length of the bit field is not correctly considered when accessing it.
In the following code fragment, the length of the zeroed memory is the size of the pointer (4 bytes on a 32-bit system), not the length of the data the pointer points to (__GL_EXT_BYTES bytes):
    __glXProcessServerString( const struct extension_info * ext,
                              const char * server_string,
                              unsigned char * server_support )
    {
        ...
        (void) memset( server_support, 0, sizeof( server_support ) );
Furthermore, the length of the memory area pointed to by server_support
is declared in varying ways in the code:
#define __GL_EXT_BYTES ((__NUM_GL_EXTS + 7) / 8)
unsigned char server_support[ __GL_EXT_BYTES ];
unsigned char server_support;
Currently __NUM_GL_EXTS = 132, so __GL_EXT_BYTES = (132 + 7) / 8 = 17.
So wherever server_support is used, setting a bit may cause a buffer overflow.
__GL_EXT_BYTES should always be used as the length of the bit field.
This problem was identified with cppcheck.
Would be nice to explain why you decided the bug was invalid.
It's invalid because the code referenced in the bug report was removed by commit 7ef4a07 in July 2006. I think it's time for an upgrade. :)
See also bug #7353.