Bug 39010

Summary: better handling of large pixmaps
Product: xorg
Component: Driver/nouveau
Status: RESOLVED MOVED
Severity: normal
Priority: medium
Version: unspecified
Hardware: Other
OS: All
Reporter: Xavier <shiningxc>
Assignee: Nouveau Project <nouveau>
QA Contact: Xorg Project Team <xorg-team>
CC: ballogyor, tiagomatos
Attachments (description, flags):
- possible workaround for 64mb vram cards (none)
- exa: set max dimensions based on available VRAM (none)
- exa: set max dimensions based on available VRAM (none)

Description Xavier 2011-07-06 12:18:04 UTC
Created attachment 48831 [details] [review]
possible workaround for 64mb vram cards

Large images in Firefox only show up as black rectangles.

At the same time, the following errors appear in dmesg:
[  590.075369] [drm] nouveau 0000:01:00.0: fail ttm_validate
[  590.075371] [drm] nouveau 0000:01:00.0: validate vram_list
[  590.075374] [drm] nouveau 0000:01:00.0: validate: -12

Test cases:
- http://geoeyemediaportal.s3.amazonaws.com/assets/images/gallery/ge1/hires/tehran_iran_02_11_10.jpg
- http://xkcd.com/802_large/
- http://upload.wikimedia.org/wikipedia/commons/5/55/Futuregen_DOE_Concept_art.jpg

References:
- https://bugs.freedesktop.org/show_bug.cgi?id=28763#c15
- http://lists.freedesktop.org/archives/nouveau/2010-February/005125.html
Comment 1 Balló György 2011-10-26 09:08:56 UTC
Thanks for your patch; it solves the problem for me in Firefox with an NV17 card on Arch Linux.

However, I still get 'fail ttm_validate' messages with OpenGL apps (e.g. quadrapassel, emerillon). When I resize their windows to full screen, their contents disappear and I can see only black and gray rectangles. There is no problem if I use them in a small window. (My screen resolution is 1440 × 900.)

Could you help me solve this problem? On the mailing list you mentioned a libdrm patch to fall back on reloc failures. Could you send me that patch to test, please? Or do you know of any other workaround?
Comment 2 Balló György 2011-10-26 20:03:23 UTC
Unfortunately, your patch slows down XVideo performance dramatically.

I think we need to find another solution to avoid memory flush.
Comment 3 Balló György 2011-10-30 00:40:40 UTC
Created attachment 52905 [details] [review]
exa: set max dimensions based on available VRAM

I found a solution!

Since the VRAM size can vary within each series of cards (e.g. NV10 cards may have 32, 64 or 128 MB of VRAM), it would be better to set the max dimensions based on the available VRAM instead of the card series.

My proposed patch ensures that we always have enough space in VRAM to process images. It fixes the memory flushing on 64 MB cards and adds back EXA support for cards with 32 MB or less memory (down to a minimum of 8 MB of VRAM).

I tested this patch only on NV17 with 64 MB VRAM.
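
For reference, a minimal sketch of the idea (illustrative only, not the attached patch; the thresholds and the helper name are assumptions):

/* Illustrative sketch, not the attached patch: derive the EXA maximum
 * pixmap dimension from the reported VRAM size rather than from the
 * chipset family.  The thresholds and the helper name are assumptions. */
#include <stdint.h>

static int
exa_max_dim_for_vram(uint64_t vram_bytes)
{
    if (vram_bytes >= (uint64_t)128 << 20)
        return 4096;    /* a 4096x4096 RGBA pixmap needs 64 MiB */
    if (vram_bytes >= (uint64_t)32 << 20)
        return 2046;    /* a 2046x2046 RGBA pixmap needs just under 16 MiB */
    return 1024;        /* 4 MiB per pixmap on very small cards */
}

The returned value would then be assigned to the EXA driver's maxX/maxY fields during initialization.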
Comment 4 Andrew Randrianasulu 2011-10-30 01:56:11 UTC
(In reply to comment #3)
> Created attachment 52905 [details] [review] [review]
> exa: set max dimensions based on available VRAM
> 
> I found a solution!
> 
> Since the VRAM size can vary within each series of cards (e.g. NV10 cards may
> have 32, 64 or 128 MB of VRAM), it would be better to set the max dimensions
> based on the available VRAM instead of the card series.
> 
> My proposed patch ensures that we always have enough space in VRAM to process
> images. It fixes the memory flushing on 64 MB cards and adds back EXA support
> for cards with 32 MB or less memory (down to a minimum of 8 MB of VRAM).
> 
> I tested this patch only on NV17 with 64 MB VRAM.

Hm... but those limits may also reflect some hardware (family-specific) limits? What about a more complex check? Pseudocode:

if (card_family == NV10) {
    exa->maxX = (vram >= nv_10_min_ram) ? 4096 : 2046;
} else if (card_family == NV50) {
    exa->maxX = (vram >= nv_50_min_vram) ? 8190 : 4096;
} else {
    exa->maxX = 2046;
}
Comment 5 Andrew Randrianasulu 2011-10-30 02:02:30 UTC
(In reply to comment #3)

And anyway... shouldn't big pixmaps, in the case of a VRAM shortage, just be placed in GART and used from there? (Sure, XVideo will be slower... maybe a special hint for Xv pixmaps? I can't remember if it's already there.)

I have a hacked-up 2.6.38 kernel (framebuffer default bpp set to 8 instead of 32, which cuts the framebuffer's VRAM use from about 5 MB to 1 MB) on a machine with a TNT2 Vanta (8 MB, AGP), and it works... even as an OpenGL hardware renderer; I just need to lower the resolution back to 640x480x32 for Quake1 to run well.
Comment 6 Balló György 2012-09-09 17:43:40 UTC
Created attachment 66885 [details] [review]
exa: set max dimensions based on available VRAM

I updated the patch for the current git master.
Comment 7 Balló György 2013-08-20 01:43:50 UTC
This is still a problem with the following components:
- linux 3.10.7
- libdrm 2.4.46
- xf86-video-nouveau 1.0.9

Could someone from the nouveau team review the patch?
Comment 8 Ilia Mirkin 2013-08-20 02:04:52 UTC
What's with the weird non-power-of-2 sizes?

Wouldn't it be better to have a target number of pixmaps to fit in VRAM, or perhaps a percentage of VRAM that you're willing to take up with a single pixmap? There are lots of cards with all sorts of quantities of memory (e.g. there are 8800s with 320 MB of RAM).

So once you determine a percentage, just walk down a list of sizes, something like:

def max_pixmap_dim(memory):
    # memory: total VRAM in bytes; allow one RGBA pixmap to use up to 10% of it
    sizes = [8192, 4096, 2048]
    for size in sizes:
        if size * size * 4 < memory * 0.1:
            return size
    return 1024

(This is an example where I assume RGBA pixmaps and allow up to 10% of VRAM, but I picked that percentage at random... maybe other percentages would be more reasonable; play around with the numbers. At the very least, it seems we'd never want to allow a pixmap that's bigger than VRAM, which could happen with the current code if there were an NV10+ card with less than 64 MB of VRAM; I'm pretty sure I've seen that with at least an NV1A with 32 MB of stolen RAM.)

[Note: please don't take this as an endorsement of the approach. In fact, I thought TTM should be able to handle all that, but apparently not. This is just a comment on improving your patch, irrespective of the overall approach.]
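
For what it's worth, plugging a few common VRAM sizes into that 10%/RGBA heuristic gives the following (a throwaway sketch; the function name and sample sizes are made up for illustration):

#include <stdio.h>
#include <stdint.h>

/* What the 10%-of-VRAM heuristic above would pick for a few VRAM sizes. */
static int pick_size(uint64_t vram_bytes)
{
    static const int sizes[] = { 8192, 4096, 2048 };
    for (int i = 0; i < 3; i++)
        if ((uint64_t)sizes[i] * sizes[i] * 4 < vram_bytes / 10)
            return sizes[i];
    return 1024;
}

int main(void)
{
    const uint64_t mib = 1 << 20;
    const uint64_t cards[] = { 32 * mib, 64 * mib, 320 * mib, 1024 * mib };
    for (int i = 0; i < 4; i++)
        printf("%4llu MiB VRAM -> max pixmap %d\n",
               (unsigned long long)(cards[i] / mib), pick_size(cards[i]));
    return 0;    /* prints: 32 -> 1024, 64 -> 1024, 320 -> 2048, 1024 -> 4096 */
}

So with this particular percentage a 64 MiB card would be capped at 1024, more conservative than the 2046 the attached patch reportedly uses successfully on such a card; the percentage is the knob to tune.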
Comment 9 Balló György 2013-08-20 02:45:59 UTC
I'm not a developer, so I don't know what the best solution would be. With the current driver on NV17 with 64 MB of VRAM, if I open a picture in Firefox that is larger than 2047×2047 pixels, it causes 'fail ttm_validate' errors. If I reduce the maximum pixmap size from 4096×4096 to 2046×2046, that solves the problem.

I don't have any other NVIDIA hardware to test my patch.
Comment 10 Ilia Mirkin 2013-08-20 03:10:02 UTC
So a 2046x2046 image works but a 2047x2047 image doesn't? That's awfully specific... and weird. I'll try to remember to test this all out on my NV18 with 64 MB of VRAM and see what happens.
Comment 11 Tobias Klausmann 2015-01-25 19:25:12 UTC
Is this still a problem with the newest xf86-video-nouveau?
Comment 12 Martin Peres 2019-12-04 08:26:51 UTC
-- GitLab Migration Automatic Message --

This bug has been migrated to freedesktop.org's GitLab instance and has been closed from further activity.

You can subscribe and participate further in the new bug through this link to our GitLab instance: https://gitlab.freedesktop.org/xorg/driver/xf86-video-nouveau/issues/19.
