Bug 61412

Summary: glCallLists/glBitmap calls slow on Intel implementation of Mesa drivers
Product: Mesa
Reporter: James Ogden <rexhunter99>
Component: Other
Assignee: mesa-dev
Status: RESOLVED WONTFIX
QA Contact:
Severity: critical
Priority: highest
Version: 9.0
Hardware: x86-64 (AMD64)
OS: Linux (All)
Whiteboard:
i915 platform:
i915 features:

Description James Ogden 2013-02-24 22:34:30 UTC
When running the Carnivores 2 Linux port on my Intel chipset (the low-performance GPU), the application performs astoundingly well, upwards of 120 fps, 20 more frames than the high-performance nvidia card.  As soon as I implemented text support using XFonts and call lists (yes, deprecated) I was getting ~8 fps.  I commented out glCallLists and discovered it was the culprit; the slowdown was constant from game start to finish.

Rebuilt the application for my Windows 7 machine, ran it and had no problems with the WGL version of things, went back to Linux and ran it using only the nvidia drivers and had no performance issues at all.

I realise that lists are outdated and deprecated, however I'm building this project for people who still have OpenGL 1.3 and 2.0 devices and have issues with large textures (the game is multilingual as well, so bitmap fonts in my own textures would get rather enormous to support everything from Latin to Cyrillic).

Not a huge problem since it is deprecated after all but it definitely kills backwards compatibility and totally kills off anyone with a netbook or older device driver.

I'm running the application using GLX/WGL context creation and am able to get a 3.0+ context, however I'm sticking to the legacy 2.1-and-below side of things.  The list is a standard 256-character font mapping, if that helps any.
Comment 1 Brian Paul 2013-02-25 15:22:55 UTC
glBitmap, not glCallLists, is probably the real issue.  It might be helpful to see the "OpenGL renderer string" from glxinfo to identify the GPU.  I suspect the i965 driver needs some sort of glBitmap caching mechanism (similar to what's in the gallium state tracker) to improve performance.
Comment 2 Kenneth Graunke 2013-02-25 18:43:30 UTC
Yeah, I think we're relying on the meta path for glBitmap in many cases.  We're probably hitting software rasterization.

Still, 120 fps -> 8 fps is clearly not reasonable.

What generation of Intel hardware are you running?  (lspci -nn would tell you.)  Can you post an apitrace which exhibits the problem?
Comment 3 Eric Anholt 2013-02-25 19:19:52 UTC
Please, please, please don't use display lists, and don't use bitmaps.  Display lists are basically nvidia-only for performance, so it's a bad route to go.  Use normal texturing and "discard" instructions to render your bitmaps, or normal texturing and alpha blending if you're doing fixed function.  Sticking your textures in one big atlas and vertex data in a vbo, you'll get way better performance than you'd ever get out of bitmap.
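
For the fixed-function case that ends up looking roughly like this (atlasTex, glyphVBO and num_glyphs are placeholders for whatever the app sets up elsewhere; glBindBuffer needs GL 1.5 or ARB_vertex_buffer_object):

    /* Rough sketch: draw pre-built glyph quads from an atlas with plain
     * alpha blending.  Each glyph is two triangles of interleaved
     * (s, t, x, y) floats already uploaded to the VBO. */
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, atlasTex);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    glBindBuffer(GL_ARRAY_BUFFER, glyphVBO);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glEnableClientState(GL_VERTEX_ARRAY);
    glTexCoordPointer(2, GL_FLOAT, 4 * sizeof(GLfloat), (const GLvoid *) 0);
    glVertexPointer(2, GL_FLOAT, 4 * sizeof(GLfloat),
                    (const GLvoid *) (2 * sizeof(GLfloat)));
    glDrawArrays(GL_TRIANGLES, 0, 6 * num_glyphs);

    glDisableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glDisable(GL_BLEND);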
Comment 4 James Ogden 2013-02-25 21:30:11 UTC
(In reply to comment #3)
> Please, please, please don't use display lists, and don't use bitmaps. 
> Display lists are basically nvidia-only for performance, so it's a bad route
> to go.  Use normal texturing and "discard" instructions to render your
> bitmaps, or normal texturing and alpha blending if you're doing fixed
> function.  Sticking your textures in one big atlas and vertex data in a vbo,
> you'll get way better performance than you'd ever get out of bitmap.

Really dude, did you not read anything I said?  It's a compatibility thing, plus the people who play this game are mostly either kids or computer illiterates so if a font texture goes bye bye because of something silly they did, I'll be the one who has to personally troubleshoot their issue. I *want* to use the proper method, but I've had that many issues in the past with relying on users being intelligent enough to keep track of their file system that I now know better.  I know they're slow, I know they're deprecated and I do care, but I don't have much choice if I want to stay sane.

Oh and my last machine was an AMD ATi setup, dual radeon graphics, display lists worked fine using the official drivers, or 3rd party ones, I just hate APUs which is why I upgraded as soon as the laptop died before the warranty ran out. So this isn't a case of nvidia drivers being better than the rest.

(In reply to comment #2)
> Yeah, I think we're relying on the meta path for glBitmap in many cases.  We're probably hitting software rasterization.
> 
> Still, 120 fps -> 8 fps is clearly not reasonable.
> 
> What generation of Intel hardware are you running?  (lspci -nn would tell you.)  Can you post an apitrace which exhibits the problem?
I'm not sure how much I trust lspci's information; it thinks I have a Xeon server processor :/  I'm running an Intel 3rd Gen Core i5, the 3210M (Ivy Bridge).
The VGA section for the low-performance side of things states an Intel 3rd Gen Graphics Controller.  Not a lot of help on that side.

(In reply to comment #1)
> glBitmap, not glCallLists, is probably the real issue.  It might be helpful
> to see the "OpenGL renderer string" from glxinfo to identify the GPU.  I
> suspect the i965 driver needs some sort of glBitmap caching mechanism
> (similar to what's in the gallium state tracker) to improve performance.
The renderer string says Mesa 9.0, I believe; like I said, it's using the Mesa drivers.
Comment 5 Eric Anholt 2013-02-25 22:11:40 UTC
I did read it.  But alpha blended texturing would also do the job that glBitmap does, and is just fine with fixed function GL.
Comment 6 James Ogden 2013-02-25 23:44:24 UTC
(In reply to comment #5)
> I did read it.  But alpha blended texturing would also do the job that
> glBitmap does, and is just fine with fixed function GL.
You clearly DID NOT read it:

> The people who play this game are mostly either kids or computer illiterates
> so if a font texture goes bye bye because of something silly they did, I'll
> be the one who has to personally troubleshoot their issue. I *want* to use
> the proper method, but I've had that many issues in the past with relying on
> users being intelligent enough to keep track of their file system that I now
> know better.  I know they're slow, I know they're deprecated and I do care,
> but I don't have much choice if I want to stay sane.

Besides, I have reported a clearly broken feature of the driver, which appears to be related to glBitmap, perhaps instead of solely glCallLists, and it should be fixed. So stop telling me what I should do instead of using it when I've given my perfectly reasonable reasons and am not interested in people such as yourself arrogantly parading around like you do.  Either help with fixing the issue that is the driver seemingly falling back to some slow software implementation or bugger off.
Comment 7 Ian Romanick 2013-02-26 00:22:29 UTC
(In reply to comment #4)
> (In reply to comment #3)
> > Please, please, please don't use display lists, and don't use bitmaps. 
> > Display lists are basically nvidia-only for performance, so it's a bad route
> > to go.  Use normal texturing and "discard" instructions to render your
> > bitmaps, or normal texturing and alpha blending if you're doing fixed
> > function.  Sticking your textures in one big atlas and vertex data in a vbo,
> > you'll get way better performance than you'd ever get out of bitmap.
> 
> Really dude, did you not read anything I said?  It's a compatibility thing,
> plus the people who play this game are mostly either kids or computer
> illiterates so if a font texture goes bye bye because of something silly
> they did, I'll be the one who has to personally troubleshoot their issue. I

Can you elaborate on what you mean by this?  I believe that Eric's proposal was for you to change the way the application is rendering.  I'm not sure how this has any effect on end-users troubleshooting things.

> *want* to use the proper method, but I've had that many issues in the past
> with relying on users being intelligent enough to keep track of their file
> system that I now know better.  I know they're slow, I know they're
> deprecated and I do care, but I don't have much choice if I want to stay
> sane.

I'm assuming that you have fonts stored on disk as bitmap (1-bit per pixel) images, and that's why you're mentioning filesystems.  You can use this data as a texture by using the GL_PIXEL_MAP_I_TO_A table.  The data is sent to glTexImage2D like:

    glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA4, width, height, 0,
                 GL_COLOR_INDEX, GL_BITMAP, pixels);

Configure the GL 1.2 texture combiners and blend mode to blend the font the way you want:

    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
    glBlendFunc(GL_SRC_ALPHA, GL_ZERO);
    glBlendEquation(GL_FUNC_ADD);

Then draw a single quad in the color you want the font to be:

    glColor3f(1.0f, 0.0f, 0.0f);
    glBegin(GL_QUADS);  // or your favorite way to draw
    glTexCoord2f(...);  // location of the character in the atlas
    glVertex2f(...);
    ...
    glEnd();

You could even build those calls into display lists (as you currently do with the glBitmap calls) to keep the changes isolated from the rest of your code.

Excluding the call to glBlendEquation, that will work on every OpenGL implementation since GL 1.1, and it will probably be faster than glBitmap on every GL implementation since the data doesn't need to be re-uploaded for each drawing operation.

Some implementations may cache the bitmap data in GPU memory with the display list, but there's no guarantee.
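
For reference, the texture upload described above works out to roughly this (the GL_PIXEL_MAP_I_TO_A and unpack settings are assumptions about 1-bpp glyph data; glyph_bits and font_tex are placeholders):

    /* Map color index 0 -> alpha 0.0 and index 1 -> alpha 1.0 so the
     * 1-bpp bitmap becomes an alpha texture on upload.  Depending on how
     * the glyph rows are packed, GL_UNPACK_LSB_FIRST may need flipping. */
    static const GLfloat index_to_alpha[2] = { 0.0f, 1.0f };
    glPixelMapfv(GL_PIXEL_MAP_I_TO_A, 2, index_to_alpha);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

    glBindTexture(GL_TEXTURE_2D, font_tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA4, width, height, 0,
                 GL_COLOR_INDEX, GL_BITMAP, glyph_bits);

Whether GL_ZERO or GL_ONE_MINUS_SRC_ALPHA is the right destination blend factor depends on whether the existing framebuffer contents should show through around the glyphs; adjust the blend to taste.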

> Oh and my last machine was an AMD ATi setup, dual radeon graphics, display
> lists worked fine using the official drivers, or 3rd party ones, I just hate
> APUs which is why I upgraded as soon as the laptop died before the warranty
> ran out. So this isn't a case of nvidia drivers being better than the rest.

(In reply to comment #6)
> (In reply to comment #5)
> > I did read it.  But alpha blended texturing would also do the job that
> > glBitmap does, and is just fine with fixed function GL.
> You clearly DID NOT read it:
> 
> > The people who play this game are mostly either kids or computer illiterates
> > so if a font texture goes bye bye because of something silly they did, I'll
> > be the one who has to personally troubleshoot their issue. I *want* to use
> > the proper method, but I've had that many issues in the past with relying on
> > users being intelligent enough to keep track of their file system that I now
> > know better.  I know they're slow, I know they're deprecated and I do care,
> > but I don't have much choice if I want to stay sane.
> 
> Besides, I have reported a clearly broken feature of the driver, which
> appears to be related to glBitmap, perhaps instead of solely glCallLists,
> and it should be fixed. So stop telling me what I should do instead of using
> it when I've given my perfectly reasonable reasons and am not interested in
> people such as yourself arrogantly parading around like you do.  Either help
> with fixing the issue that is the driver seemingly falling back to some slow
> software implementation or bugger off.

We have a small team, and a lot of other things to work on.  The probability that we will have time to optimize this case is very, very small.  We have never encountered any other application that depends on the performance of glBitmap.
Comment 8 James Ogden 2013-02-26 01:01:36 UTC
(In reply to comment #7)
> Can you elaborate on what you mean by this?  I believe that Eric's proposal
> was for you to change the way the application is rendering.  I'm not sure
> how this has any effect on end-users troubleshooting things.
Seriously I feel like I'm a canary, repeating the same tune over and over.
I can't rely on a texture based font stored in the game directory, or even the truetype/freetype fonts (except maybe Ubuntu/fixed on Linux and Arial on Windows) because I am catering for people who do weird stuff to their computers and files go missing mysteriously.  So instead I rely on something I know is there and if it's not, then their system is completely buggered.

> I'm assuming that you have fonts stored on disk as bitmap (1-bit per pixel)
> images, and that's why you're mentioning filesystems.  You can use this data
> as a texture by using the GL_PIXEL_MAP_I_TO_A table.  The data is sent to
> glTexImage2D like:
> 
>     glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA4, width, height, 0,
>                  GL_COLOR_INDEX, GL_BITMAP, pixels);
> 
> Configure the GL 1.2 texture combiners and blend mode to blend the font the
> way you want:
> 
>     glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
>     glBlendFunc(GL_SRC_ALPHA, GL_ZERO);
>     glBlendEquation(GL_FUNC_ADD);
> 
> Then draw a single quad in the color you want the font to be:
> 
>     glColor3f(1.0f, 0.0f, 0.0f);
>     glBegin(GL_QUADS);  // or your favorite way to draw
>     glTexCoord2f(...);  // location of the character in the atlas
>     glVertex2f(...);
>     ...
>     glEnd();
> 
> You could even build those calls into display lists (as you currently do
> with the glBitmap calls) to keep the changes isolated from the rest of your
> code.
> 
> Excluding the call to glBlendEquation, that will work on every OpenGL
> implementation since GL 1.1, and it will probably be faster than glBitmap on
> every GL implementation since the data doesn't need to be re-uploaded for
> each drawing operation.
I know how to do all of this, again I do NOT have a font texture file, I rely on system installed fonts because if they aren't there, then the system is buggered to hell.

> Some implementations may cache the bitmap data in GPU memory with the
> display list, but there's no guarantee.
Yes I am aware of that, but this isn't an implementation specific thing, it's Mesa specific on all intel renderers for Linux.

> We have a small team, and a lot of other things to work on.  The probability
> that we will have time to optimize this case is very, very small.  We have
> never encountered any other application that depends on the performance of
> glBitmap.
Hardly an excuse if you ask me, I have a two man team working on a big project but we never make excuses like that, we simply prioritise any issues we encounter and get to them when we can.

And this is probably why no one ports games over from Windows, because the Mesa implementations are horrible, fantastic fast when you use Core 3.0 stuff and 2.1 stuff, but otherwise you'd never be able to port golden titles from the 1990's over without gutting their rendering systems and rewriting them yourself by hand.
Comment 9 Ian Romanick 2013-02-26 01:24:38 UTC
(In reply to comment #8)
> (In reply to comment #7)
> > Can you elaborate on what you mean by this?  I believe that Eric's proposal
> > was for you to change the way the application is rendering.  I'm not sure
> > how this has any effect on end-users troubleshooting things.
> Seriously I feel like I'm a canary, repeating the same tune over and over.
> I can't rely on a texture based font stored in the game directory, or even
> the truetype/freetype fonts (except maybe Ubuntu/fixed on Linux and Arial on
> Windows) because I am catering for people who do weird stuff to their
> computers and files go missing mysteriously.  So instead I rely on something
> I know is there and if it's not, then their system is completely buggered.
> 
> > I'm assuming that you have fonts stored on disk as bitmap (1-bit per pixel)
> > images, and that's why you're mentioning filesystems.  You can use this data
> > as a texture by using the GL_PIXEL_MAP_I_TO_A table.  The data is sent to
> > glTexImage2D like:
> > 
> >     glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA4, width, height, 0,
> >                  GL_COLOR_INDEX, GL_BITMAP, pixels);
> > 
> > Configure the GL 1.2 texture combiners and blend mode to blend the font the
> > way you want:
> > 
> >     glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
> >     glBlendFunc(GL_SRC_ALPHA, GL_ZERO);
> >     glBlendEquation(GL_FUNC_ADD);
> > 
> > Then draw a single quad in the color you want the font to be:
> > 
> >     glColor3f(1.0f, 0.0f, 0.0f);
> >     glBegin(GL_QUADS);  // or your favorite way to draw
> >     glTexCoord2f(...);  // location of the character in the atlas
> >     glVertex2f(...);
> >     ...
> >     glEnd();
> > 
> > You could even build those calls into display lists (as you currently do
> > with the glBitmap calls) to keep the changes isolated from the rest of your
> > code.
> > 
> > Excluding the call to glBlendEquation, that will work on every OpenGL
> > implementation since GL 1.1, and it will probably be faster than glBitmap on
> > every GL implementation since the data doesn't need to be re-uploaded for
> > each drawing operation.
> I know how to do all of this, again I do NOT have a font texture file, I
> rely on system installed fonts because if they aren't there, then the system
> is buggered to hell.

You're repeating my point, and that's confusing.  It doesn't matter how your fonts are stored.  You can send *exactly the same bits* into OpenGL as a texture or a bitmap.
Comment 10 James Ogden 2013-02-26 01:35:03 UTC
(In reply to comment #9)
> You're repeating my point, and that's confusing.  It doesn't matter how your
> fonts are stored.  You can send *exactly the same bits* into OpenGL as a
> texture or a bitmap.
No, I'm not repeating *your* point.  I'm repeating my point, which you are clearly not understanding.

I am loading these in using an XFont via glXUseXFont, or whatever the X version of wglUseFontBitmaps is.  I have no control over how these get loaded in or what they do under the hood.  Like I said, I am not relying on a file that may or may not exist in the game directory; I'm relying on the system.
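
The whole setup is just the standard glXUseXFont path, roughly this (dpy, x, y and the font name stand in for whatever the engine actually uses):

    /* Standard glXUseXFont path: 256 display lists, one glBitmap each. */
    XFontStruct *fs = XLoadQueryFont(dpy, "fixed");
    GLuint font_base = glGenLists(256);
    glXUseXFont(fs->fid, 0, 256, font_base);

    /* Per string drawn: */
    glRasterPos2f(x, y);
    glListBase(font_base);
    glCallLists((GLsizei) strlen(text), GL_UNSIGNED_BYTE, text);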

Oh, and this issue isn't resolved; perhaps you should not be so immature as to disregard an issue as not your own doing when clearly it is.  Why don't you face up to the problem and fix it, or at least add it to your supposedly big "to do" list that your tiny team should be slogging through.
Comment 11 Ian Romanick 2013-02-26 01:40:38 UTC
We've tried really hard to help you, and all of your responses have been filled with rudeness and belligerence.  I think we're done trying to help you.  You can reopen the bug if you want, but we have no plans to do anything with it.

Have a nice day.
Comment 12 James Ogden 2013-02-26 01:49:14 UTC
(In reply to comment #11)
> We've tried really hard to help you, and all of your responses have been
> filled with rudeness and belligerence.  I think we're done trying to help
> you.  You can reopen the bug if you want, but we have no plans to do
> anything with it.
> 
> Have a nice day.
It's nice to see the world is still full of people who are happy to live with broken utilities.

It really is quite sad that a company like NVidia - or AMD even - who have to have their hands forced by a company like Valve can at least produce working drivers for their devices while you lot who supposedly work for the community and produce things by the community can't even acknowledge and attempt to fix a problem brought up by someone.  And then go so far as to be arrogant and ignore everything said person says and then use their frustration against them to further ignore the issue.

Good day, take your shitty drivers and wallow in your arrogance and self righteousness.  It's people like you that drive other people away from using bug reporting systems to try and find a proper solution to an issue and resolve it for future cases, and even deter people from using the operating system the issue originates on.

:)

Marking this as RESOLVED - WONTFIX, since that's the reality of the situation, NOTOURBUG is just pathetic really.
Comment 13 Brian Paul 2013-02-26 02:45:29 UTC
(In reply to comment #11)
> We've tried really hard to help you, and all of your responses have been
> filled with rudeness and belligerence.

Ian, I don't think James is being unreasonable.  He's simply trying to use a legacy OpenGL feature (that works fine with other drivers) and is frustrated that nobody seems interested in helping him.  And simply saying "just forget about glXUseXFonts and use textures" doesn't really help either, given the other constraints that he described (and were probably glossed over by others).

James, I think I have a possible path to a solution for you.  The texture map approach really is the best way to go, but you're probably wondering about how to get your font glyphs without pulling in some new font utility/library, font files, etc.

If you look at Mesa's src/mesa/drivers/x11/xfonts.c file you'll find an implementation of glXUseXFont().  It uses some Xlib code to convert X font glyphs into bitmap images.  I think you could adapt that code so that you'd save the bitmap images in some data structure, then implement a simple bitmap character renderer to render strings into a texture image (see also src/mesa/state_trackers/st_cb_bitmap.c and the _mesa_expand_bitmap() function).  Then render the texture/string with GL_ALPHA_TEST or blending.  I think you could do all this in a few hundred lines of code (most of it from xfonts.c).  Maybe there's even some code on the net that does this already.  Anyway, you could build new code this with your other app sources and avoid any new external dependencies.
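
To make that concrete, here's a rough, heavily condensed sketch of that xfonts.c-style glyph extraction (no error handling; dpy, win, font_tex and the 16x8 ASCII-only layout are assumptions):

    /* Rasterize ASCII 32..126 into an 8-bit alpha atlas via a 1-bit pixmap,
     * then upload it as a GL_ALPHA texture. */
    XFontStruct *fs = XLoadQueryFont(dpy, "fixed");
    int cw = fs->max_bounds.rbearing - fs->min_bounds.lbearing;
    int ch = fs->max_bounds.ascent + fs->max_bounds.descent;
    int atlas_w = cw * 16, atlas_h = ch * 8;          /* 16x8 grid of cells */
    unsigned char *alpha = calloc(atlas_w * atlas_h, 1);

    Pixmap pm = XCreatePixmap(dpy, win, cw, ch, 1);
    GC gc = XCreateGC(dpy, pm, 0, NULL);
    XSetFont(dpy, gc, fs->fid);

    for (int c = 32; c < 127; c++) {
        char s = (char) c;
        int gx = ((c - 32) % 16) * cw, gy = ((c - 32) / 16) * ch;

        XSetForeground(dpy, gc, 0);
        XFillRectangle(dpy, pm, gc, 0, 0, cw, ch);
        XSetForeground(dpy, gc, 1);
        XDrawString(dpy, pm, gc, -fs->min_bounds.lbearing,
                    fs->max_bounds.ascent, &s, 1);

        XImage *img = XGetImage(dpy, pm, 0, 0, cw, ch, 1, XYPixmap);
        for (int y = 0; y < ch; y++)
            for (int x = 0; x < cw; x++)
                if (XGetPixel(img, x, y))
                    alpha[(gy + y) * atlas_w + gx + x] = 255;
        XDestroyImage(img);
    }
    XFreeGC(dpy, gc);
    XFreePixmap(dpy, pm);

    glBindTexture(GL_TEXTURE_2D, font_tex);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, atlas_w, atlas_h, 0,
                 GL_ALPHA, GL_UNSIGNED_BYTE, alpha);

The per-character texture coordinates then fall out of the 16x8 grid, and drawing is just the alpha-tested or alpha-blended quads discussed earlier in the thread.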

Hope that helps.
Comment 14 Ian Romanick 2013-02-26 02:48:18 UTC
(In reply to comment #12)
> (In reply to comment #11)
> > We've tried really hard to help you, and all of your responses have been
> > filled with rudeness and belligerence.  I think we're done trying to help
> > you.  You can reopen the bug if you want, but we have no plans to do
> > anything with it.
> > 
> > Have a nice day.
> It's nice to see the world is still full of people who are happy to live
> with broken utilities.
> 
> It really is quite sad that a company like NVidia - or AMD even - who have
> to have their hands forced by a company like Valve can at least produce
> working drivers for their devices while you lot who supposedly work for the
> community and produce things by the community can't even acknowledge and
> attempt to fix a problem brought up by someone.  And then go so far as to be
> arrogant and ignore everything said person says and then use their
> frustration against them to further ignore the issue.
> 
> Good day, take your shitty drivers and wallow in your arrogance and self
> righteousness.  It's people like you that drive other people away from using
> bug reporting systems to try and find a proper solution to an issue and
> resolve it for future cases, and even deter people from using the operating
> system the issue originates on.

1. Nobody was arrogant or self righteous.  I went back and read every message in this thread several times.  Based on incomplete information, we tried to offer detailed suggestions of how to make your application work better with our drivers.

2. Key information like "I'm using glXUseXFont so I never interact with the font data at all in any way" was omitted until message 10 in the thread.  Up to that point, at least five different people that read the bug thought your application was calling glBitmap directly.  There was certainly no reason to think otherwise.  That changes things a lot.

3. Nobody wants to volunteer to be yelled at and called names in a bug tracker.  We didn't tell you to "take your shitty ..." and do something with it.  We didn't tell you to "bugger off."  I don't expect that you'd talk to the guy behind the counter at the coffee shop that way.  If you did, they'd probably kick you out permanently.  I'm surprised that you think it's okay to talk to us that way.  In an asynchronous communication mechanism, there's plenty of time to self-censor.  I'm somewhat infamous for... speaking with rage before thinking, so I understand better than most how difficult that can be.

4. You suggested that we should "simply prioritise any issues we encounter and get to them when we can."  In comment #7 I tried to describe exactly that.  Where do you prioritize an issue seen by one user versus issues seen by many users?


Let's start over.

In comment #2, Ken asked for an apitrace of your application.  Can you provide that?
Comment 15 Brian Paul 2013-02-26 02:51:00 UTC
(In reply to comment #13)
> already.  Anyway, you could build new code this with your other app sources

Err, "build this new code with" ...
Comment 16 James Ogden 2013-02-26 03:25:16 UTC
(In reply to comment #13)
> (In reply to comment #11)
> > We've tried really hard to help you, and all of your responses have been
> > filled with rudeness and belligerence.
> 
> Ian, I don't think James is being unreasonable.  He's simply trying to use a
> legacy OpenGL feature (that works fine with other drivers) and is frustrated
> that nobody seems interested in helping him.  And simply saying "just forget
> about glXUseXFonts and use textures" doesn't really help either, given the
> other constraints that he described (and were probably glossed over by
> others).
No I didn't think I was being unreasonable either, being a complete a**hole maybe, but not unreasonable.

> James, I think I have a possible path to a solution for you.  The texture
> map approach really is the best way to go, but you're probably wondering
> about how to get your font glyphs without pulling in some new font
> utility/library, font files, etc.
I know it's the best way to go, I'd prefer it since the game has a TTF dedicated to it that wasn't used originally back in the 90s due to the developers being too lazy to install it with the game.  But as we all know from my previous comments that I'd rather rely on the system :)

> If you look at Mesa's src/mesa/drivers/x11/xfonts.c file you'll find an
> implementation of glXUseXFont().  It uses some Xlib code to convert X font
> glyphs into bitmap images.  I think you could adapt that code so that you'd
> save the bitmap images in some data structure, then implement a simple
> bitmap character renderer to render strings into a texture image (see also
> src/mesa/state_trackers/st_cb_bitmap.c and the _mesa_expand_bitmap()
> function).  Then render the texture/string with GL_ALPHA_TEST or blending. 
> I think you could do all this in a few hundred lines of code (most of it
> from xfonts.c).  Maybe there's even some code on the net that does this
> already.  Anyway, you could build new code this with your other app sources
> and avoid any new external dependencies.
I've already begun the process of writing a setup like this, manually creating a texture at load time with the font glyph data stored in an appropriate structure.  Thanks for suggesting the source; I would have just written the whole thing myself from scratch, but now I may take a look at how it's done in Mesa's source.

> Hope that helps.
Kind of, definitely a much better response than I initially received, thank you Brian.

(In reply to comment #14)
> Let's start over.
> 
> In comment #2, Ken asked for an apitrace of your application.  Can you provide that?
Building the utility and will return with the dump as soon as I'm done.
Comment 17 James Ogden 2013-02-26 04:26:57 UTC
Here is a raw copy of the .trace from the app; it's 10 MB (the dump was 20 MB) because it takes 4 seconds to quit the game, so it continued tracing the renderer.

http://www.mediafire.com/?2fiejjbonnu9e42
Comment 18 Roland Scheidegger 2013-02-26 16:33:05 UTC
FWIW, if it's glBitmap causing this due to a software fallback, it will be much worse on old Radeons (r100/r200), because the software fallback is far more expensive there.
There are still a couple of other bugs filed because of that, I think.  The typical case I've seen is that the fallback is hit because fog was enabled when doing glBitmap (either deliberately or by accident), though there are certainly other reasons for fallbacks (those seem less likely to be hit).  If that's the case, it would certainly be possible to avoid the fallback by taking fog into account in meta's bitmap code.
Comment 19 James Ogden 2013-02-26 22:50:45 UTC
I didn't think to check whether GL_FOG was enabled at all during rendering!  I know that I enabled it someplace for volumetric fog using glFogCoordf(), which is not available on this hardware (dunno why... my AMD Radeon HD 6000 and 7000 both had it), but I can't remember if it's disabled afterwards... Thanks for bringing this up, I will definitely take a gander.
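
If it does turn out to still be on, the quick fix would presumably be to force it off around the text pass, something like this (font_base and text stand in for the real variables):

    /* Keep GL_FOG off while the font display lists run, then restore it,
     * to avoid the suspected meta glBitmap fallback. */
    GLboolean fog_was_on = glIsEnabled(GL_FOG);
    if (fog_was_on)
        glDisable(GL_FOG);

    glListBase(font_base);
    glCallLists((GLsizei) strlen(text), GL_UNSIGNED_BYTE, text);

    if (fog_was_on)
        glEnable(GL_FOG);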
Comment 20 Roland Scheidegger 2013-02-27 00:36:35 UTC
Well if it's really a fallback, you could set a breakpoint in _mesa_meta_Bitmap and check what state is set which causes it right at the beginning of the function. Ideally there wouldn't be any fallbacks there but I guess it would be quite some work.
Comment 21 James Ogden 2013-02-27 02:28:53 UTC
(In reply to comment #20)
> Well if it's really a fallback, you could set a breakpoint in
> _mesa_meta_Bitmap and check what state is set which causes it right at the
> beginning of the function. Ideally there wouldn't be any fallbacks there but
> I guess it would be quite some work.

I ran it with GL_FOG forced off, and it seems that did the trick; Intel is back to being faster than my discrete NVidia card.  Haha, the irony.  Well, thanks to software-side state checking I managed to get this fixed, so now we know it's a fallback of some kind related to fog with glBitmap.  Not sure if we should mark this as resolved or if it should be left open until the issue is looked at?

Anyway thanks for all the help guys!  (or gals if any)
Comment 22 Roland Scheidegger 2013-02-27 12:21:24 UTC
I think if you had fog active by accident, that would contribute to the feeling that there's probably no real use case for glBitmap with active fog, and hence probably no one really interested in making that fast (I think the past issues I've seen with this also had it active by mistake, though it's hard to tell; it is obviously easy to forget something like that, and you could also have a fragment shader active on top of glBitmap, which, unlike fog, just sounds crazy).
Still, I think it would be a nice-to-have; accidental usage or not, this is not the first time it has been hit.
Comment 23 Ian Romanick 2013-02-28 00:25:27 UTC
I'm closing this bug as WONTFIX, and I've just opened bug #61582 to replace it.

Meta should have provided this feedback to the app.  If James had gotten a message like "glBitmap fell back to software because fog was enabled", it would have saved him a lot of time tracking this issue down.  He really shouldn't have had to trace into _mesa_meta_Bitmap to figure this out.  We've added some support for messages like this; we just need to add more.
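
For reference, an application can already listen for those messages through GL_ARB_debug_output, along these lines (a rough sketch; the extension has to be exposed, and a debug context may be required for messages to be generated):

    /* Print any driver debug/performance messages to stderr. */
    static void APIENTRY debug_cb(GLenum source, GLenum type, GLuint id,
                                  GLenum severity, GLsizei length,
                                  const GLchar *message, const void *userParam)
    {
        (void) source; (void) type; (void) id;
        (void) severity; (void) length; (void) userParam;
        fprintf(stderr, "GL debug: %s\n", message);
    }

    /* After context creation: */
    PFNGLDEBUGMESSAGECALLBACKARBPROC debug_message_callback =
        (PFNGLDEBUGMESSAGECALLBACKARBPROC)
            glXGetProcAddress((const GLubyte *) "glDebugMessageCallbackARB");
    if (debug_message_callback) {
        /* Cast in case the local glext.h declares userParam without const. */
        debug_message_callback((GLDEBUGPROCARB) debug_cb, NULL);
        glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS_ARB);
    }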
