Hi. I am having problems starting X with the Nvidia binary drivers. I am running Fedora Core 2 test3 with Nvidia's 53.36 drivers. The problem is annoying: I cannot boot the system straight into X with the Nvidia driver, but after booting with the standard open-source "nv" driver, X comes up fine. I can then comment out the line

x:5:respawn:/etc/X11/prefdm -nodaemon

in /etc/inittab, run "init q", change xorg.conf to use the nvidia driver, then uncomment the line in /etc/inittab and re-run "init q". Nvidia starts up perfectly and runs well, including TV overscan and GLX applications (Tux Racer, Blender, gears runs at 3200 fps, etc.).

Restarting the system (now with the Nvidia binary driver) results in a locked login screen, followed by a reboot to the rescue disk, where I can change xorg.conf back to the "nv" driver, after which it once again boots OK.

/var/log/Xorg.0.log.old shows the following problems:

Symbol __glXActiveScreens from module /usr/X11R6/lib/modules/extensions/libdri.a is unresolved!
Symbol __glXActiveScreens from module /usr/X11R6/lib/modules/extensions/libdri.a is unresolved!
(EE) Failed to initialize GLX extension (NVIDIA XFree86 driver not found)
(**) Option "Protocol" "IMPS/2"
(**) Mouse0: Device: "/dev/input/mice"
(**) Mouse0: Protocol: "IMPS/2"
(**) Option "CorePointer"
(**) Mouse0: Core Pointer
(**) Option "Device" "/dev/input/mice"
(**) Option "Emulate3Buttons" "yes"
(**) Mouse0: Emulate3Buttons, Emulate3Timeout: 50
(**) Option "ZAxisMapping" "4 5"
(**) Mouse0: ZAxisMapping: buttons 4 and 5
(**) Mouse0: Buttons: 5
(II) Keyboard "Keyboard0" handled by legacy driver
(II) XINPUT: Adding extended input device "Mouse0" (type: MOUSE)
(II) Mouse0: ps2EnableDataReporting: succeeded
AUDIT: Tue May 4 12:00:26 2004: 4271 X: client 5 rejected from local host

I ran

nm /usr/X11R6/lib/modules/extensions/libdri.a | grep glXActiveScreens

and it shows:

U __glXActiveScreens

where the capital U means the symbol is undefined. So the question is: is this an X.org problem or an Nvidia binaries problem (I don't know who owns libdri.a), and is it related to the problems I'm having?
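For anyone following along, the whole manual switch comes down to one Driver line in xorg.conf plus making init re-read /etc/inittab. A rough sketch of the cycle described above, assuming the stock Fedora Core 2 paths (/etc/inittab and /etc/X11/xorg.conf):

# 1. Comment out the display-manager line in /etc/inittab:
#        x:5:respawn:/etc/X11/prefdm -nodaemon
#    then make init re-read the file:
init q

# 2. In /etc/X11/xorg.conf, Section "Device", switch the driver line:
#        Driver "nv"        # open-source driver that boots cleanly
#        Driver "nvidia"    # proprietary driver that hangs on a normal boot here

# 3. Uncomment the prefdm line in /etc/inittab and re-read it again,
#    which respawns the display manager with the new driver:
init q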
The kernel I'm running is 2.6.6-rc3-bk5
Nvidia binaries: 53.36
Video card: Ti4200/128 (agp4 model)

glxinfo with the nvidia drivers shows:

name of display: :0.0
display: :0  screen: 0
direct rendering: Yes
server glx vendor string: NVIDIA Corporation
server glx version string: 1.3
server glx extensions:
    GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer,
    GLX_SGI_video_sync, GLX_SGI_swap_control, GLX_ARB_multisample
client glx vendor string: NVIDIA Corporation
client glx version string: 1.3
client glx extensions:
    GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_visual_info,
    GLX_EXT_visual_rating, GLX_EXT_import_context, GLX_SGI_video_sync,
    GLX_NV_swap_group, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer, GLX_SGI_swap_control,
    GLX_NV_float_buffer
GLX extensions:
    GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer,
    GLX_SGI_video_sync, GLX_SGI_swap_control, GLX_ARB_multisample,
    GLX_ARB_get_proc_address
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce4 Ti 4200/AGP/SSE2
OpenGL version string: 1.4.1 NVIDIA 53.36
OpenGL extensions:
    GL_ARB_depth_texture, GL_ARB_imaging, GL_ARB_multisample, GL_ARB_multitexture,
    GL_ARB_occlusion_query, GL_ARB_point_parameters, GL_ARB_point_sprite, GL_ARB_shadow,
    GL_ARB_texture_border_clamp, GL_ARB_texture_compression, GL_ARB_texture_cube_map,
    GL_ARB_texture_env_add, GL_ARB_texture_env_combine, GL_ARB_texture_env_dot3,
    GL_ARB_texture_mirrored_repeat, GL_ARB_transpose_matrix, GL_ARB_vertex_buffer_object,
    GL_ARB_vertex_program, GL_ARB_window_pos, GL_S3_s3tc, GL_EXT_texture_env_add,
    GL_EXT_abgr, GL_EXT_bgra, GL_EXT_blend_color, GL_EXT_blend_minmax,
    GL_EXT_blend_subtract, GL_EXT_compiled_vertex_array, GL_EXT_draw_range_elements,
    GL_EXT_fog_coord, GL_EXT_multi_draw_arrays, GL_EXT_packed_pixels,
    GL_EXT_paletted_texture, GL_EXT_point_parameters, GL_EXT_rescale_normal,
    GL_EXT_secondary_color, GL_EXT_separate_specular_color, GL_EXT_shadow_funcs,
    GL_EXT_shared_texture_palette, GL_EXT_stencil_wrap, GL_EXT_texture3D,
    GL_EXT_texture_compression_s3tc, GL_EXT_texture_cube_map, GL_EXT_texture_edge_clamp,
    GL_EXT_texture_env_combine, GL_EXT_texture_env_dot3, GL_EXT_texture_filter_anisotropic,
    GL_EXT_texture_lod, GL_EXT_texture_lod_bias, GL_EXT_texture_object, GL_EXT_vertex_array,
    GL_HP_occlusion_test, GL_IBM_rasterpos_clip, GL_IBM_texture_mirrored_repeat,
    GL_KTX_buffer_region, GL_NV_blend_square, GL_NV_copy_depth_to_color, GL_NV_depth_clamp,
    GL_NV_fence, GL_NV_fog_distance, GL_NV_light_max_exponent, GL_NV_multisample_filter_hint,
    GL_NV_occlusion_query, GL_NV_packed_depth_stencil, GL_NV_pixel_data_range,
    GL_NV_point_sprite, GL_NV_register_combiners, GL_NV_register_combiners2,
    GL_NV_texgen_reflection, GL_NV_texture_compression_vtc, GL_NV_texture_env_combine4,
    GL_NV_texture_rectangle, GL_NV_texture_shader, GL_NV_texture_shader2,
    GL_NV_texture_shader3, GL_NV_vertex_array_range, GL_NV_vertex_array_range2,
    GL_NV_vertex_program, GL_NV_vertex_program1_1, GL_NVX_ycrcb, GL_SGIS_generate_mipmap,
    GL_SGIS_multitexture, GL_SGIS_texture_lod, GL_SGIX_depth_texture, GL_SGIX_shadow,
    GL_SUN_slice_accum
glu version: 1.3
glu extensions: GLU_EXT_nurbs_tessellator, GLU_EXT_object_space_tess

   visual  x  bf lv rg d st colorbuffer ax dp st accumbuffer  ms  cav
 id dep cl sp sz l  ci b ro  r  g  b  a bf th cl  r  g  b  a ns b eat
----------------------------------------------------------------------
0x21 24 tc 0 32 0 r y . 8 8 8 0 0 24 8 16 16 16 16 0 0 None
0x22 24 dc 0 32 0 r y . 8 8 8 0 0 24 8 16 16 16 16 0 0 None
0x23 24 tc 0 32 0 r y . 8 8 8 8 0 24 8 16 16 16 16 0 0 None
0x24 24 tc 0 32 0 r . . 8 8 8 0 0 24 8 16 16 16 16 0 0 None
0x25 24 tc 0 32 0 r . . 8 8 8 8 0 24 8 16 16 16 16 0 0 None
0x26 24 tc 0 32 0 r y . 8 8 8 0 0 16 0 16 16 16 16 0 0 None
0x27 24 tc 0 32 0 r y . 8 8 8 8 0 16 0 16 16 16 16 0 0 None
0x28 24 tc 0 32 0 r . . 8 8 8 0 0 16 0 16 16 16 16 0 0 None
0x29 24 tc 0 32 0 r . . 8 8 8 8 0 16 0 16 16 16 16 0 0 None
0x2a 24 tc 0 32 0 r y . 8 8 8 0 0 0 0 16 16 16 16 0 0 None
0x2b 24 tc 0 32 0 r y . 8 8 8 8 0 0 0 16 16 16 16 0 0 None
0x2c 24 tc 0 32 0 r . . 8 8 8 0 0 0 0 16 16 16 16 0 0 None
0x2d 24 tc 0 32 0 r . . 8 8 8 8 0 0 0 16 16 16 16 0 0 None
0x2e 24 tc 0 32 0 r y . 8 8 8 0 0 24 8 16 16 16 16 2 1 Ncon
0x2f 24 tc 0 32 0 r y . 8 8 8 8 0 24 8 16 16 16 16 2 1 Ncon
0x30 24 tc 0 32 0 r y . 8 8 8 0 0 24 8 16 16 16 16 4 1 Ncon
0x31 24 tc 0 32 0 r y . 8 8 8 8 0 24 8 16 16 16 16 4 1 Ncon
0x32 24 tc 0 32 0 r y . 8 8 8 0 0 16 0 16 16 16 16 2 1 Ncon
0x33 24 tc 0 32 0 r y . 8 8 8 8 0 16 0 16 16 16 16 2 1 Ncon
0x34 24 tc 0 32 0 r y . 8 8 8 0 0 16 0 16 16 16 16 4 1 Ncon
0x35 24 tc 0 32 0 r y . 8 8 8 8 0 16 0 16 16 16 16 4 1 Ncon
0x36 24 dc 0 32 0 r y . 8 8 8 8 0 24 8 16 16 16 16 0 0 None
0x37 24 dc 0 32 0 r . . 8 8 8 0 0 24 8 16 16 16 16 0 0 None
0x38 24 dc 0 32 0 r . . 8 8 8 8 0 24 8 16 16 16 16 0 0 None
0x39 24 dc 0 32 0 r y . 8 8 8 0 0 16 0 16 16 16 16 0 0 None
0x3a 24 dc 0 32 0 r y . 8 8 8 8 0 16 0 16 16 16 16 0 0 None
0x3b 24 dc 0 32 0 r . . 8 8 8 0 0 16 0 16 16 16 16 0 0 None
0x3c 24 dc 0 32 0 r . . 8 8 8 8 0 16 0 16 16 16 16 0 0 None
0x3d 24 dc 0 32 0 r y . 8 8 8 0 0 0 0 16 16 16 16 0 0 None
0x3e 24 dc 0 32 0 r y . 8 8 8 8 0 0 0 16 16 16 16 0 0 None
0x3f 24 dc 0 32 0 r . . 8 8 8 0 0 0 0 16 16 16 16 0 0 None
0x40 24 dc 0 32 0 r . . 8 8 8 8 0 0 0 16 16 16 16 0 0 None
0x41 24 dc 0 32 0 r y . 8 8 8 0 0 24 8 16 16 16 16 2 1 Ncon
0x42 24 dc 0 32 0 r y . 8 8 8 8 0 24 8 16 16 16 16 2 1 Ncon
0x43 24 dc 0 32 0 r y . 8 8 8 0 0 24 8 16 16 16 16 4 1 Ncon
0x44 24 dc 0 32 0 r y . 8 8 8 8 0 24 8 16 16 16 16 4 1 Ncon
0x45 24 dc 0 32 0 r y . 8 8 8 0 0 16 0 16 16 16 16 2 1 Ncon
0x46 24 dc 0 32 0 r y . 8 8 8 8 0 16 0 16 16 16 16 2 1 Ncon
0x47 24 dc 0 32 0 r y . 8 8 8 0 0 16 0 16 16 16 16 4 1 Ncon
0x48 24 dc 0 32 0 r y . 8 8 8 8 0 16 0 16 16 16 16 4 1 Ncon

Prior to adding the nvidia drivers I do the following:

cd /usr/X11R6/lib
rm -f libGL.*
cd modules/extensions
rm -f libGL*
rm -f libglx*
cd /usr/lib
rm -f libGL.*

I then build the latest version of Mesa (6.0), and from the top of the Mesa source tree I run:

find ./ -name libGL* -exec cp {} /usr/lib \;
find ./ -name libglut.so.3 -exec cp {} /usr/lib \;
find ./ -name glut.h -exec cp {} /usr/include/GL \;

I then install freeglut (2.2.0) and make softlinks from /lib to the libglut.so and libglut.so.3 files produced by freeglut. After that I install the nvidia drivers.

Close this bug if it isn't relevant to X.org.

Thanks, Bob
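Since the symptom is the server picking up the wrong GLX/DRI pieces, one way to sanity-check what that whole procedure leaves installed is to look at the GL client library and the server-side extension modules directly. This is only a sketch, using the same stock /usr/lib and /usr/X11R6 locations mentioned above:

# Which libGL clients will link against (Mesa's or Nvidia's):
ls -l /usr/lib/libGL.so*

# Which server-side GLX/DRI modules the X server can load
# (the Nvidia installer supplies its own libglx):
ls -l /usr/X11R6/lib/modules/extensions/libglx* /usr/X11R6/lib/modules/extensions/libdri*

# After a failed start, see what the server actually loaded or rejected:
grep -iE 'glx|dri' /var/log/Xorg.0.log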
Mike, you may want to look at this one. I guess commenting out "x:5:respawn:/etc/X11/prefdm -nodaemon" makes the system boot to console mode. However, I don't understand why it works when the X server startup is delayed.
This is a driver installation problem. You're using Nvidia's proprietary driver, but the X server can't find it. The DRI extension is loading, and the libglx.a that is installed is Nvidia's, which does not work with DRI.

- Uninstall the Nvidia proprietary drivers (they do not work with Fedora Core 2, and neither Nvidia nor Red Hat supports them)
- Reinstall X.org from the RPM packages that come with Fedora Core 2
- Run system-config-display --reconfig

(One possible way to run through these steps is sketched at the end of this comment.)

Your system should then work with the "nv" driver, and the above errors should not occur.

This is not an X.org bug/issue, so I'm closing it as "INVALID". If you require further technical assistance in resolving the issue, you can try one of the X-related help mailing lists, Nvidia's web forums, or Nvidia technical support.

Hope this helps.
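For reference, here is roughly what that sequence might look like on Fedora Core 2. This is only a sketch: it assumes the proprietary driver was installed with Nvidia's .run installer (which provides nvidia-installer) and that the Fedora xorg-x11 RPMs are available locally, e.g. copied from the install media:

# Remove the proprietary driver's files; the installer restores any
# libraries it backed up when it was originally run:
nvidia-installer --uninstall

# Put the stock X.org packages back, overwriting any files the Nvidia
# installer or the manual Mesa copy replaced:
rpm -Uvh --replacepkgs --replacefiles xorg-x11-*.rpm

# Regenerate xorg.conf; this should end up selecting the "nv" driver:
system-config-display --reconfig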