Hello. Using vanilla kernel 3.2.9 (also tried 3.1.0-rc9 + nouveau patches), libdrm/ddx/mesa from git, and X server 1.10.4, I have a strange bug on my RS600: if I start Celestia (self-compiled, version 1.6.1, with the GTK2 interface) in its default rendering mode, it shows a mostly white window. Selecting another rendering path, like "Basic" or "Multitexture", makes it work OK. With the white window I get extremely low framerates, something like 1.x fps. I tried various RADEON_DEBUG options, without luck. I will attach logs and a screenshot.
Created attachment 59120 [details] dmesg
Created attachment 59121 [details] lspci -vvvn
Created attachment 59122 [details] X.org log
Created attachment 59123 [details] glxinfo
Created attachment 59124 [details] rs600's rom (from sysfs)
Created attachment 59125 [details] Screenshot
Mesa was compiled with:
./configure --prefix=/usr/X11R7 --disable-egl --enable-gallium-llvm --with-gallium-drivers=i915,nouveau,r600,r300,swrast --enable-texture-float --with-dri-drivers=i965,r200,radeon,nouveau --enable-shared-glapi --enable-gallium-g3dvl --enable-vdpau --enable-debug
The Gallium LLVM libs are version 2.9. (I tried DRAW_NO_LLVM=1 - bug was still there.) llvmpipe renders Celestia's window correctly in all modes.
If I set "Star style" to "Points" it works OK; the other two options (the default "Fuzzy Points" and "Scaled Disks") produce the white, slow screen.
As far as I can see in celestia-1.6.1/src/celengine/render.cpp (function Renderer::PointStarVertexBuffer::startSprites()), it uses a combination of GL_ARB_point_sprite and vertex shaders. Any ideas what can fail with this combo on the RS600?
----------------------------
void Renderer::PointStarVertexBuffer::startSprites(const GLContext& _context)
{
    context = &_context;
    assert(context->getVertexProcessor() != NULL || !useSprites); // vertex shaders required for new star rendering

    unsigned int stride = sizeof(StarVertex);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, stride, &vertices[0].position);
    glEnableClientState(GL_COLOR_ARRAY);
    glColorPointer(4, GL_UNSIGNED_BYTE, stride, &vertices[0].color);

    VertexProcessor* vproc = context->getVertexProcessor();
    vproc->enable();
    vproc->use(vp::starDisc);
    vproc->enableAttribArray(6);
    vproc->attribArray(6, 1, GL_FLOAT, stride, &vertices[0].size);

    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glDisableClientState(GL_NORMAL_ARRAY);

    glEnable(GL_POINT_SPRITE_ARB);
    glTexEnvi(GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB, GL_TRUE);

    useSprites = true;
}
--------------------------------------------
Is there a piglit test for this?
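In case it helps with a piglit-style reproducer, here is a rough standalone sketch of the same combination - a vertex shader that writes the point size plus GL_ARB_point_sprite. This is my own quick test, not code from Celestia or piglit (Celestia itself uses an ARB vertex program, vp::starDisc, while this uses GLSL), so treat the setup as an assumption:
----------------------------
/* Minimal sketch: point sprites + a vertex shader that writes gl_PointSize.
 * Build with e.g.: gcc pointsize.c -o pointsize -lglut -lGL */
#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>
#include <GL/glut.h>

static const char *vs_src =
    "void main() {\n"
    "    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n"
    "    gl_FrontColor = gl_Color;\n"
    "    gl_PointSize = 32.0;   /* point size comes from the shader */\n"
    "}\n";

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_POINTS);
    glColor3f(1.0f, 1.0f, 0.0f);
    glVertex2f(0.0f, 0.0f);      /* one big yellow point in the middle */
    glEnd();
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    glutCreateWindow("VS point size + point sprite");

    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &vs_src, NULL);
    glCompileShader(vs);
    GLuint prog = glCreateProgram();
    glAttachShader(prog, vs);
    glLinkProgram(prog);
    glUseProgram(prog);

    /* same point sprite state as Celestia's startSprites() */
    glEnable(GL_POINT_SPRITE_ARB);
    glTexEnvi(GL_POINT_SPRITE_ARB, GL_COORD_REPLACE_ARB, GL_TRUE);
    /* needed so the gl_PointSize written by the shader is actually used */
    glEnable(GL_VERTEX_PROGRAM_POINT_SIZE);

    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
----------------------------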
Wild guess: does r300g need something like this? http://cgit.freedesktop.org/mesa/mesa/commit/?id=022e270b1b972b6d04890f1ac1fc2a4a4ed03ff7
Or http://cgit.freedesktop.org/mesa/mesa/commit/?id=1a69b50b3b441ce8f7a00af3a7f02c37df50f6c3 - namely the calls to draw_create_fragment_shader() and draw_bind_fragment_shader(). Probably they should go into r300_state.c, into r300_{create|bind}_fs_state?
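For reference, the draw module entry points in question are declared in src/gallium/auxiliary/draw/draw_context.h roughly as below (quoted from memory, so double-check against the tree); the idea would be for the SWTCL path to call the first one when a fragment shader state object is created and the second one when it is bound:
----------------------------
struct draw_fragment_shader *
draw_create_fragment_shader(struct draw_context *draw,
                            const struct pipe_shader_state *shader);

void draw_bind_fragment_shader(struct draw_context *draw,
                               struct draw_fragment_shader *dfs);
----------------------------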
(In reply to comment #7)
> (I tried DRAW_NO_LLVM=1 - bug was still there).

FWIW, that's not a recognized environment variable. You probably meant DRAW_USE_LLVM=0.
(In reply to comment #12)
> (In reply to comment #7)
> > (I tried DRAW_NO_LLVM=1 - bug was still there).
>
> FWIW, that's not a recognized environment variable. You probably meant
> DRAW_USE_LLVM=0.

Yes, sorry. I tried twice with DRAW_USE_LLVM=0 - the bug is still here. I'll try to patch the driver with my idea above and see if it fixes this bug ...
(In reply to comment #13)
> (In reply to comment #12)
> > (In reply to comment #7)
> > > (I tried DRAW_NO_LLVM=1 - bug was still there).
> >
> > FWIW, that's not a recognized environment variable. You probably meant
> > DRAW_USE_LLVM=0.
>
> Yes, sorry. Tried twice with DRAW_USE_LLVM=0 - bug still here. I'll try to
> patch driver with my idea above and see if it will fix this bug ...

Unfortunately, it was not as simple as copy/pasting code ... no patch from me :(
.. and the spriteblast mesa demo definitely doesn't work correctly here (I see something remotely like _giant_ sprites flashing on the screen).
Is this bug present in Mesa 8.0 and if yes, is it present in 7.11 as well?
(In reply to comment #16)
> Is this bug present in Mesa 8.0 and if yes, is it present in 7.11 as well?

Yes, at minimum in the 8.0 and 7.11 git _branches_. With 7.11 it was a bit unstable, but the bug is still around, both for Celestia and for the pointblast/spriteblast mesa demos.
Bug still here with mesa commit 34e53adc51ade8d53d74b6ae35bec90f1a6b9b29 ("r600g: inline r600_upload_index_buffer")
I know. The vertex shader point size output is broken on SWTCL chipsets (piglit/glsl-vs-point-size fails randomly). I haven't been able to find the cause yet.
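For anyone who wants to reproduce this without Celestia: the feature being exercised is simply a GLSL vertex shader that writes gl_PointSize (with GL_VERTEX_PROGRAM_POINT_SIZE enabled on the GL side). A shader along these lines - my paraphrase, not the actual piglit source - is enough to show the problem:
----------------------------
uniform float psize;
void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    gl_FrontColor = gl_Color;
    gl_PointSize = psize;   /* the vertex shader point size output that misbehaves on SWTCL */
}
----------------------------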
-- GitLab Migration Automatic Message --

This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity.

You can subscribe and participate further through the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/mesa/mesa/issues/341.