Bug 56090 - Excessive amounts of memory used per context
Summary: Excessive amounts of memory used per context
Status: RESOLVED MOVED
Alias: None
Product: Mesa
Classification: Unclassified
Component: Drivers/DRI/i965
Version: 7.11
Hardware: Other Linux (All)
Importance: low enhancement
Assignee: Ian Romanick
QA Contact: Intel 3D Bugs Mailing List
URL:
Whiteboard:
Keywords:
Duplicates: 65342
Depends on:
Blocks:
 
Reported: 2012-10-17 14:52 UTC by Steve
Modified: 2019-09-25 18:49 UTC
CC List: 1 user

See Also:
i915 platform:
i915 features:


Attachments
Modified isosurf sample from wxWidgets (5.63 KB, text/plain)
2012-10-17 14:52 UTC, Steve
Header file for modified sample code (1.51 KB, text/plain)
2012-10-17 14:53 UTC, Steve

Description Steve 2012-10-17 14:52:12 UTC
Created attachment 68715 [details]
Modified isosurf sample from wxWidgets

This issue has been seen in Mesa 7.x, 8.x, and 9.x. With the latest available drivers from xorg-edgers (2:2.20.10) the memory usage went down somewhat, but it is still unacceptably high.

Creating multiple OpenGL contexts results in an excessive amount of memory being used. I would expect some extra memory usage, but not enough to make an application use 4x or more memory (roughly 12MB virtual and 2MB physical).

This was noticed while attempting to work around bug #55675, where using a single canvas to render to multiple drawables resulted in performance degradation over time. Using a separate context per drawable resolves the performance degradation, but results in using around 4x more memory.

In the attached sample you can uncomment the SINGLE_CONTEXT define to compare memory usage between using a single context and using multiple contexts.
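
For reference, a minimal sketch of the kind of toggle described above (illustrative only; the helper name is made up and this is not the attached sample itself):

// Sketch of the SINGLE_CONTEXT toggle: share one wxGLContext across all
// canvases, or create one context per drawable (one per canvas).
#include <wx/glcanvas.h>

//#define SINGLE_CONTEXT

wxGLContext *CreateContextFor(wxGLCanvas *canvas)   // call once per canvas
{
#ifdef SINGLE_CONTEXT
    // One context shared by every canvas/drawable.
    static wxGLContext *shared = NULL;
    if (!shared)
        shared = new wxGLContext(canvas);
    return shared;
#else
    // One context per drawable; this is the case that uses ~4x more memory.
    return new wxGLContext(canvas);
#endif
}

// Before drawing on a canvas: canvas->SetCurrent(*ctx);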
Comment 1 Steve 2012-10-17 14:53:19 UTC
Created attachment 68716 [details]
Header file for modified sample code
Comment 2 Steve 2012-10-17 14:55:44 UTC
(In reply to comment #0)
> Creating multiple OpenGL contexts results in an excessive amount of memory
> being used. I would expect some extra memory usage, but not enough to make
> an application use 4x or more memory (roughly 12MB virtual and 2MB physical).

In my comment, "roughly 12MB virtual and 2MB physical" should read roughly 12MB virtual and 2MB physical per context, not total. The total memory went from 76MB virtual and 17MB physical to 471MB virtual and 84MB physical.
Comment 3 Eric Anholt 2012-10-31 20:07:09 UTC
Unfortunately, due to Mesa's architecture it's hard to remove the major offender in memory allocation on legacy GL contexts.  If you ask for a core GL context, you'll save most of the memory allocated by the driver.
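
For anyone following along, a minimal sketch of requesting a core context through GLX_ARB_create_context / GLX_ARB_create_context_profile. The requested version (3.2) and the lack of error handling are assumptions for illustration; adjust to whatever the driver actually advertises.

// Sketch: ask GLX for a core-profile context instead of a legacy one.
#include <GL/glx.h>
#include <GL/glxext.h>

GLXContext create_core_context(Display *dpy, GLXFBConfig fbc)
{
    // The extension entry point must be fetched at run time.
    PFNGLXCREATECONTEXTATTRIBSARBPROC glXCreateContextAttribsARB =
        (PFNGLXCREATECONTEXTATTRIBSARBPROC)glXGetProcAddressARB(
            (const GLubyte *)"glXCreateContextAttribsARB");

    const int attribs[] = {
        GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
        GLX_CONTEXT_MINOR_VERSION_ARB, 2,
        GLX_CONTEXT_PROFILE_MASK_ARB,  GLX_CONTEXT_CORE_PROFILE_BIT_ARB,
        None
    };

    // Returns NULL (and raises an X protocol error) if the request fails.
    return glXCreateContextAttribsARB(dpy, fbc, NULL, True, attribs);
}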
Comment 4 Steve 2012-10-31 20:10:27 UTC
The same code running on other Mesa drivers (e.g. an Intel G35 chipset) doesn't use this much memory. Maybe I just don't know enough about the drivers, but if one driver can do it I would think the other could as well.
Comment 5 Steve 2012-10-31 20:14:38 UTC
Never mind my last comment; I see what you're saying now. The older G35 driver uses an older OpenGL version for its context, so it's the compatibility-mode context required for the code that uses more memory. I will have to look into creating a core context instead and see whether that is possible for our code base. However, the only reason we tried having multiple contexts was to work around a different issue to begin with.
Comment 6 Steve 2012-11-02 15:39:38 UTC
I can't seem to create a core context either. I've modified wxWidgets directly and even tried the simple sample application found here: http://www.opengl.org/wiki/Tutorial:_OpenGL_3.0_Context_Creation_%28GLX%29

Note that the above sample had to be modified to use glXGetClientString( display, GLX_EXTENSIONS ) instead of glXQueryExtensionsString since the drivers seem to report GLX_ARB_create_context as a client extension only.

Both attempts resulted in the same XErrorEvent firing and no context being created. The XError is a BadRequest: Serial 26, Error Code 1, Request Code 153, Minor Code 34. The same error also happens when attempting to create an older context with glXCreateContextAttribsARB.
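
For reference, the linked tutorial survives that BadRequest by installing a temporary X error handler around the glXCreateContextAttribsARB call; roughly the following pattern (illustrative, not the exact tutorial code):

// Temporary X error handler so a failed context request doesn't kill the app.
#include <X11/Xlib.h>
#include <cstdio>

static bool g_ctxErrorOccurred = false;

static int ctxErrorHandler(Display *dpy, XErrorEvent *ev)
{
    g_ctxErrorOccurred = true;
    std::fprintf(stderr, "XError: Serial %lu, Error Code %d, Request Code %d, Minor Code %d\n",
                 ev->serial, ev->error_code, ev->request_code, ev->minor_code);
    return 0;
}

// Around context creation:
//   int (*oldHandler)(Display *, XErrorEvent *) = XSetErrorHandler(&ctxErrorHandler);
//   GLXContext ctx = glXCreateContextAttribsARB(dpy, fbc, NULL, True, attribs);
//   XSync(dpy, False);               // flush so any error is processed now
//   XSetErrorHandler(oldHandler);    // restore the previous handler
//   if (g_ctxErrorOccurred || !ctx) { /* fall back or report the failure */ }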

Here's the output from that sample application:
Getting matching framebuffer configs
Found 4 matching FB configs.
Getting XVisualInfos
  Matching fbconfig 0, visual ID 0x20: SAMPLE_BUFFERS = 0, SAMPLES = 0
  Matching fbconfig 1, visual ID 0x77: SAMPLE_BUFFERS = 0, SAMPLES = 0
  Matching fbconfig 2, visual ID 0xb5: SAMPLE_BUFFERS = 1, SAMPLES = 4
  Matching fbconfig 3, visual ID 0xb1: SAMPLE_BUFFERS = 0, SAMPLES = 0
Chosen visual ID = 0xb5
Creating colormap
Creating window
Mapping window
Creating context
XError: Serial 26, Error Code 1, Request Code 153, Minor Code 34
Failed to create GL 3.0 context ... using old-style GLX context
XError: Serial 29, Error Code 1, Request Code 153, Minor Code 34
Failed to create an OpenGL context
Comment 7 Steve 2012-11-02 16:15:02 UTC
Also worth noting that Mesa 7.x doesn't support OpenGL 3.0 and glXCreateContextAttribsARB(), so I wouldn't think 7.x would have the memory issue. I could see that being the case in 8.x/9.x, but as noted above, in 9.x I couldn't create a context at all using glXCreateContextAttribsARB(). So I'm not entirely sure how to get a core context if that doesn't work.
Comment 8 Ian Romanick 2012-11-02 17:17:27 UTC
(In reply to comment #6)
> Note that the above sample had to be modified to use glXGetClientString(
> display, GLX_EXTENSIONS ) instead of glXQueryExtensionsString since the
> drivers seem to report GLX_ARB_create_context as a client extension only.

Don't do that.  If it's not advertised in glXQueryExtensionsString, it's not (fully) supported.  As you've noticed, when it's not supported, it doesn't work.  In this case, it means your xserver is too old to support that extension.
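
For illustration, a sketch of the check Ian describes, along the lines of the isExtensionSupported helper in the linked tutorial (names and structure here are illustrative):

// Look for an extension name, with word boundaries, in the string returned by
// glXQueryExtensionsString (the server/client combined GLX extension list).
#include <GL/glx.h>
#include <cstring>

static bool has_glx_extension(const char *extList, const char *extension)
{
    if (!extList || !extension || *extension == '\0' || std::strchr(extension, ' '))
        return false;

    for (const char *start = extList;;) {
        const char *where = std::strstr(start, extension);
        if (!where)
            return false;
        const char *terminator = where + std::strlen(extension);
        if ((where == start || *(where - 1) == ' ') &&
            (*terminator == ' ' || *terminator == '\0'))
            return true;
        start = terminator;
    }
}

// Usage:
//   const char *exts = glXQueryExtensionsString(dpy, DefaultScreen(dpy));
//   bool ok = has_glx_extension(exts, "GLX_ARB_create_context");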
Comment 9 Steve 2012-11-02 21:05:47 UTC
Good to know. I have upgraded that system to the latest Ubuntu 12.10 release, which includes a newer Xorg that supports this. I can confirm that if I create an OpenGL 3.0 context using the GLX_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB flag, the memory issues go away. If I don't specify GLX_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB, it still uses the extra memory. However, the other bug (#55675) now crashes after enough make-current calls.
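
For reference, roughly the attribute list this describes: an OpenGL 3.0 request with the forward-compatible flag set (illustrative; dpy/fbc and error handling as in the earlier sketches):

// Attribute list for a 3.0 forward-compatible context request.
const int attribs[] = {
    GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
    GLX_CONTEXT_MINOR_VERSION_ARB, 0,
    GLX_CONTEXT_FLAGS_ARB,         GLX_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
    None
};
// GLXContext ctx = glXCreateContextAttribsARB(dpy, fbc, NULL, True, attribs);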
Comment 10 Eric Anholt 2013-09-19 15:53:45 UTC
*** Bug 65342 has been marked as a duplicate of this bug. ***
Comment 11 GitLab Migration User 2019-09-25 18:49:19 UTC
-- GitLab Migration Automatic Message --

This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity.

You can subscribe and participate further in the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/mesa/mesa/issues/1387.

