XQuartz 2.7.8 works fine; every release since then fails with the following error when running glxinfo:
X Error of failed request: BadValue (integer parameter out of range for operation)
Major opcode of failed request: 150 (GLX)
Minor opcode of failed request: 3 (X_GLXCreateContext)
Value in failed request: 0x0
This appears to be independent of the macOS version; it is rather a bug introduced in XQuartz.
This issue has been reported on several blogs/forums.
Workaround: roll back to XQuartz 2.7.8
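If rolling back is not an option: newer XQuartz releases ship with indirect GLX (IGLX) disabled by default, and re-enabling it is a commonly reported alternative workaround. A sketch, assuming the preference domain and key name (`org.macosforge.xquartz.X11`, `enable_iglx`) have not changed in your release:

```shell
# Re-enable indirect GLX in XQuartz 2.7.9 and later, where it is
# disabled by default. Restart XQuartz afterwards for this to take effect.
defaults write org.macosforge.xquartz.X11 enable_iglx -bool true
```

Note that IGLX was disabled upstream for security reasons, so this trades the BadValue error for a wider attack surface on the X server.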
Server environment (Ubuntu/Trusty):
GPU 0: GeForce 9800 GTX/9800 GTX+ (UUID: GPU-824df94c-eb1a-7d03-7a3b-d8b7f9f74025)
nvidia-340, 340.96, 3.13.0-87-generic, i686: installed
01:00.0 VGA compatible controller : NVIDIA Corporation G92 [GeForce 9800 GTX / 9800 GTX+] [10de:0612] (rev a2)
[ 25.314] (II) NVIDIA dlloader X Driver 340.96 Sun Nov 8 21:48:15 PST 2015
[ 25.128] (II) NVIDIA GLX Module 340.96 Sun Nov 8 22:11:26 PST 2015
client glx vendor string: NVIDIA Corporation
client glx version string: 1.4
direct rendering: No
server glx vendor string: SGI
server glx version string: 1.4
Client environment (Mac OS X 10.11.4 El Capitan):
GPU: Intel Iris Pro
I'm confused here. You say the server is on Ubuntu, but our libGL hasn't supported IGLX for quite some time. Did you mean that the client is Ubuntu? I assume so; in that case this is bug #96260.
*** This bug has been marked as a duplicate of bug 96260 ***