Bug 41461 - [i965, ilk] H264/AVC 10-bit playback through OpenGL is misrendered
Summary: [i965, ilk] H264/AVC 10-bit playback through OpenGL is misrendered
Status: RESOLVED FIXED
Alias: None
Product: Mesa
Classification: Unclassified
Component: Drivers/DRI/i965
Version: 7.11
Hardware: x86-64 (AMD64) Linux (All)
Importance: medium minor
Assignee: Eric Anholt
QA Contact:
URL:
Whiteboard:
Keywords:
Duplicates: 41595
Depends on:
Blocks:
 
Reported: 2011-10-04 11:41 UTC by Tobias Jakobi
Modified: 2019-03-08 06:55 UTC
CC: 5 users

See Also:
i915 platform:
i915 features:


Attachments
mplayer2 + vo_gl + h264 10-bit content (36.28 KB, image/png)
2011-10-04 11:41 UTC, Tobias Jakobi

Description Tobias Jakobi 2011-10-04 11:41:58 UTC
Created attachment 51969
mplayer2 + vo_gl + h264 10-bit content

Hello,

I noticed a problem with 10-bit playback in the i965 GL driver. I use mplayer2 (latest git master) for playback, and with the vo_gl backend the visuals are messed up (screenshot attached).

This doesn't seem to be a bug in mplayer2 itself, because the issue disappears when setting LIBGL_ALWAYS_SOFTWARE, which forces the software rasterizer.

Output also works correctly with vo_xv, but I assume mplayer2 converts the format in that case. At least it prints:
[swscaler @ 0x7ff65290b200]using unscaled yuv420p10le -> yuv420p special converter
VO: [xv] 848x480 => 848x480 Planar YV12

With vo_gl it displays:
VO: [gl] 848x480 => 848x480 Planar 420P 10-bit little-endian
...and this fails.

An H264 file that shows the problem can be found here, for example:
http://utw.me/2011/10/03/tamayurahitotose-01/

A large part of the anime fansub scene seems to be switching to 10-bit H264 encoding now, so this might become a problem for more people sooner or later.

Anyway, an unrelated question: I noticed that VAAPI, or more specifically the Intel VAAPI driver, doesn't handle 10-bit H264 very well (it shows heavy blocking artifacts at times), and I read that this is expected. Does VAAPI support this at all, and if so, is the problem the hardware or the current driver implementation?

Greets,
Tobias

P.S.: System specs:
Intel Corporation Arrandale Integrated Graphics Controller

vanilla-kernel 3.0.4
libdrm, mesa and xf86-video-intel git master
xorg-server-1.11.0

mesa git master = e4394fb19f735da3fad9340653637bbe54778069
libdrm git master = c82ef03e4c92017bf5644f294ea04e30500f8d4c
intel git master = d8fe941bc245e24c83c417ccff5c57e83baac3f7
Comment 1 Tobias Jakobi 2011-10-06 14:32:17 UTC
Some commits have landed in mplayer2 fixing some problems with vo_gl and PBOs, but they don't affect the issue reported here. In fact it makes no difference whether I force vo_gl to use PBOs or not.
Comment 2 Eric Appleman 2011-10-09 13:41:44 UTC
I can replicate the problem on Sandy Bridge (i965) and Nouveau. The Nvidia binary driver is unaffected.

A workaround is to use the flag "-vf format=yv12".
Comment 3 almos 2011-10-15 11:47:29 UTC
I can't reproduce this with r300g. Both '-vo gl' and '-vo xv' render the linked video correctly (though for xv mplayer converts the format: yuv420p10le -> yuv420p).
Comment 4 nfxjfg 2011-10-30 05:22:17 UTC
Mplayer's vo_gl output does colorspace conversion in shaders. The three YUV planes are uploaded as separate luminance textures, and when rendering the video, the shader samples all three textures and multiplies the resulting 3-component vector by a colorspace conversion matrix.

10-bit videos are uploaded as GL_LUMINANCE16, GL_LUMINANCE, GL_UNSIGNED_SHORT. Here is the catch: of these 16 bits, only 10 are significant. The colorspace conversion matrix therefore contains a multiplier of 64.0 (= 2^(16-10)) to scale the 10-bit range up to the full 16-bit range.
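
A minimal sketch of that upload and shader path (assuming GL 2.x; the function name, uniforms, and the omitted YUV offset handling are illustrative, not mplayer's actual code):

    #include <GL/gl.h>
    #include <stdint.h>

    /* Upload one 10-bit plane (e.g. the 848x480 luma plane of yuv420p10le)
     * as a 16-bit luminance texture.  GL_LUMINANCE16 is only a request:
     * the driver may legally allocate fewer bits per channel. */
    static void upload_plane_10bit(const uint16_t *plane, int w, int h)
    {
        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, w, h, 0,
                     GL_LUMINANCE, GL_UNSIGNED_SHORT, plane);
    }

    /* Fragment shader: sample the three planes, expand the 10-bit range
     * by 64.0 = 2^(16-10), then apply the conversion matrix. */
    static const char *frag_src =
        "uniform sampler2D tex_y, tex_u, tex_v;\n"
        "uniform mat3 yuv2rgb;\n"
        "void main() {\n"
        "    vec3 yuv = vec3(texture2D(tex_y, gl_TexCoord[0].xy).x,\n"
        "                    texture2D(tex_u, gl_TexCoord[0].xy).x,\n"
        "                    texture2D(tex_v, gl_TexCoord[0].xy).x);\n"
        "    gl_FragColor = vec4(yuv2rgb * (yuv * 64.0), 1.0);\n"
        "}\n";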

And here's the bug: Mesa seems to throw away the lower 8 bits (or at least the lower 7 bits). We only had 10 bits of precision in the first place, so the video is left with at most 2 or 3 bits of precision, which leads to unacceptable image degradation.
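
To make the loss concrete (my own illustration, not driver code): the 10-bit sample sits in bits 9..0 of the 16-bit texel, and truncating a 16-bit UNORM value to 8 bits keeps roughly the top 8 bits (15..8), so only bits 9..8 of the actual video data survive:

    uint16_t sample = 0x03ff;        /* maximum 10-bit value          */
    uint8_t  stored = sample >> 8;   /* 0x03: 2 significant bits left */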

This can't be fixed on the mplayer side directly: 10 bit is the output the decoder produces, and the point of vo_gl is to do as much as possible on the GPU. Shifting all video samples on the CPU (to scale them from the 2^10 to the 2^16 range) would defeat that purpose.

I'm not sure whether this should be considered an mplayer bug or a Mesa bug: it seems OpenGL actually allows reducing texture precision the way Mesa does. Apparently you are supposed to use 16-bit integer texture formats if you need guaranteed precision, but it looks like these formats are not available in Mesa.

In any case, this seems to work fine with the ATI and Nvidia drivers. I'm not sure what the right way to fix it is: at worst, mplayer would have to detect Mesa drivers and refuse 10- and 9-bit pixel formats, so that software conversion is forced.

As a temporary solution, users who want to play Hi10P profile videos can add "-vf format=yv12" (or vf=format=yv12 in ~/.mplayer/config) to force software conversion to 8 bit. Of course that might be slower.

> Some commits have landed in mplayer2 fixing some problems with vo_gl and PBOs. But it doesn't affect the issue reported here.

Yes, those fixes are unrelated. Those bugs didn't even show up on Intel/Mesa systems, because they were in the extra code paths mplayer takes when it detects ATI hardware to work around ATI bugs.
Comment 5 Eric Anholt 2011-11-08 10:37:57 UTC
Notes from IRC:

1) With released drivers, the only 16-bit channel formats supported are R16 and RG1616 (SNORM and UNORM).  Check out GL_ARB_texture_rg

2) With GL2, the sizes of channels for sized internalformats are not guaranteed, other than "if you asked for it, we didn't give you 0 bits of it" and vice versa.  You can check how many bits you actually got using glGetTexLevelParameteriv with GL_TEXTURE_RED_SIZE or the other channel-size queries (see the sketch after these notes).

3) With GL3, there are now requirements that sized internalformats give certain bit sizes (see the "Required Texture Formats" section of the GL 3.0 spec).  Expect patches for these to Mesa master this week.  You'll be seeing a lot more real 16-bit channel formats soon.

4) Additionally, shader math may have lower precision.  However, the minimum should be fine, due to the "roughly 1 part in 10^5" guidance early in the GL spec.  You can query the actual precision using GL_ARB_ES2_compatibility's glGetShaderPrecisionFormat(), also shown in the sketch below.
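
A sketch of both queries, assuming a bound GL_TEXTURE_2D and a context exposing GL_ARB_ES2_compatibility (in real code the glGetShaderPrecisionFormat entry point would come from your GL loader):

    #include <stdio.h>
    #define GL_GLEXT_PROTOTYPES
    #include <GL/gl.h>
    #include <GL/glext.h>

    static void report_precision(void)
    {
        /* Bits the driver actually allocated for the bound texture; for a
         * GL_LUMINANCE16 texture the relevant channel query is LUMINANCE. */
        GLint lum_bits = 0;
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                                 GL_TEXTURE_LUMINANCE_SIZE, &lum_bits);
        printf("luminance bits allocated: %d\n", lum_bits);

        /* Fragment shader float precision (GL_ARB_ES2_compatibility). */
        GLint range[2], precision;
        glGetShaderPrecisionFormat(GL_FRAGMENT_SHADER, GL_HIGH_FLOAT,
                                   range, &precision);
        printf("highp float: %d bits of precision, range 2^%d..2^%d\n",
               precision, range[0], range[1]);
    }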
Comment 6 Eric Anholt 2011-11-09 12:29:52 UTC
I've made the driver go ahead and do what you were hoping for:

commit e56aecf2492e3ca63ea70332a346f3f8414cba6c
Author: Eric Anholt <eric@anholt.net>
Date:   Tue Nov 8 11:05:17 2011 -0800

    i965: Add support for 16-bit unorm L, A, and I textures.
    
    While not required by any particular spec version, mplayer was asking
    for L16 and hoping for actual L16 without checking.  The 8 bits
    allocated led to 10-bit planar video data stored in the lower 10 bits
    giving only 2 bits of precision in video.  While it was an amusing
    effect, give them what they actually wanted instead.
    
    Bugzilla: https://bugs.freedesktop.org/show_bug.cgi?id=41461
    
    Reviewed-by: Kenneth Graunke <kenneth@whitecape.org>
Comment 7 Tobias Jakobi 2011-11-09 12:31:42 UTC
Thanks Eric!
Comment 8 Eric Appleman 2011-11-09 16:25:44 UTC
What about Nouveau and other Mesa drivers?

I was able to replicate the bug on both of my GPUs.
Comment 9 Eric Appleman 2011-11-09 16:42:07 UTC
I don't feel like re-opening this bug, so would anyone mind if I point to the more general bug against Mesa Core for the issue?

https://bugs.freedesktop.org/show_bug.cgi?id=41595
Comment 10 Eric Appleman 2012-06-29 20:57:03 UTC
Amazingly, this still affects both i915 and i915g with Mesa master.

The -vf format=yv12 workaround still fixes it.

Re-opening.
Comment 11 Daniel Vetter 2012-06-29 23:28:51 UTC
As Eric said, this is actually a problem with mplayer: it asks for 16-bit textures but only gets 8-bit textures (which is totally allowed by OpenGL), and then doesn't check whether it really got 16 bits. Eric's patch simply ensures that on i965 mplayer actually gets 16-bit textures.

But if a driver doesn't support these A/L/I16 formats, there's nothing we can do in Mesa to work around this mplayer bug. You'd have to file it with the mplayer devs.
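
For illustration, the client-side check would look something like this (a hedged sketch; fall_back_to_yv12() is a hypothetical hook, not an actual mplayer function):

    /* After creating the GL_LUMINANCE16 texture, verify how many bits the
     * driver actually allocated.  Anything less than 16 truncates the
     * low-order bits where the 10-bit samples live, so force software
     * conversion (the same effect as -vf format=yv12). */
    GLint bits = 0;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                             GL_TEXTURE_LUMINANCE_SIZE, &bits);
    if (bits < 16)
        fall_back_to_yv12();   /* hypothetical fallback hook */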

[You can obviously try to implement these for the drivers you care about, but they are legacy formats from the OpenGL 1 days, and people are not too enthusiastic about implementing funny corner cases to support them on modern GPUs.]
Comment 12 Tapani Pälli 2019-03-08 06:55:54 UTC
*** Bug 41595 has been marked as a duplicate of this bug. ***

