Bug 89156 - r300g: GL_COMPRESSED_RED_RGTC1 / ATI1N support broken
Summary: r300g: GL_COMPRESSED_RED_RGTC1 / ATI1N support broken
Status: RESOLVED FIXED
Alias: None
Product: Mesa
Classification: Unclassified
Component: Drivers/Gallium/r300
Version: git
Hardware: Other
OS: All
Importance: medium normal
Assignee: Default DRI bug account
QA Contact: Default DRI bug account
Reported: 2015-02-15 16:28 UTC by Stefan Dösinger
Modified: 2015-03-09 20:05 UTC


Attachments

Screenshot (152.53 KB, image/png), 2015-02-24 09:13 UTC, Stefan Dösinger
Shader used to read the texture (400 bytes, text/plain), 2015-02-26 09:36 UTC, Stefan Dösinger
Precision fix (798 bytes, patch), 2015-03-02 14:12 UTC, Stefan Dösinger

Description Stefan Dösinger 2015-02-15 16:28:56 UTC
r300g on r500 chips advertises GL_ARB_texture_compression_rgtc, but the red-only format GL_COMPRESSED_RED_RGTC1 (aka ATI1N in ATI / d3d9 speak) is broken. This breaks the d3d8 and d3d9 visual tests in Wine. It can also be seen by running the rgtc-teximage-01 test in piglit.
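
For reference, this is roughly how such a texture gets uploaded on the GL side (a minimal sketch with a made-up helper name; GL_COMPRESSED_RED_RGTC1 comes from ARB_texture_compression_rgtc / GL 3.0, so older headers may also need GL/glext.h):

#include <GL/gl.h>   /* plus GL/glext.h on older systems */

static GLuint upload_rgtc1_texture(void)
{
    /* One 4x4 RGTC1/BC4 block: red0 = red1 = 0x7f, all 3-bit
     * interpolation codes 0, i.e. a constant ~50% red texture. */
    static const char block[8] = { 0x7f, 0x7f, 0, 0, 0, 0, 0, 0 };
    GLuint tex;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RED_RGTC1,
                           4, 4, 0, sizeof(block), block);
    return tex;
}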
Comment 1 Marek Olšák 2015-02-21 10:49:58 UTC
The texture filtering seems to operate at lower precision. Is that what you're seeing?
Comment 2 Stefan Dösinger 2015-02-24 09:13:39 UTC
Created attachment 113786
Screenshot

ATI1N seems to return random garbage here. Attached is a screenshot from our 3DC tests. The right half is ATI2N, which works OK. The left half is ATI1N, which is filled with random garbage. The expected result is a solid color on the left half, with R = 0x7f (plus or minus 1). We're not too particular about the result of G and B on Windows; on Wine we set G = R and B = R.
Comment 3 Marek Olšák 2015-02-24 21:38:47 UTC
I've just tested R580 and RGTC1 works very well according to piglit. What happens if you use the default swizzle?
Comment 4 Stefan Dösinger 2015-02-26 09:36:56 UTC
Created attachment 113841
Shader used to read the texture

Indeed changing the swizzle fixes the random output. I have attached the shader we use to sample the texture. We don't have a swizzle on the texture2D statement, but we do swizzle the output variable, and apparently the optimizer merges that.
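
For illustration (this is not the exact attached shader, just the pattern it reduces to), the broken and working variants look like this as GLSL 1.10 source strings:

static const char *frag_broken =
    "uniform sampler2D tex;\n"
    "void main()\n"
    "{\n"
    "    /* output swizzle; the optimizer folds it into the TEX */\n"
    "    gl_FragColor = texture2D(tex, gl_TexCoord[0].xy).xxxx;\n"
    "}\n";

static const char *frag_working =
    "uniform sampler2D tex;\n"
    "void main()\n"
    "{\n"
    "    gl_FragColor = texture2D(tex, gl_TexCoord[0].xy);\n"
    "}\n";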

If I use the RGBA values returned by the texture sampling directly I get a solid color as expected. However, the value is off quite a bit: Instead of 0x7f I get 0x6c.

The texture data we use is this:

static const char ati1n_data[] =
{
    /* A 4x4 texture with the color component at 50%. */
    0x7f, 0x7f, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
};
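
(With red0 = red1 = 0x7f and every 3-bit code set to 0, each texel of that block should decode to exactly 0x7f, so the 0x6c above points at a decoding precision problem rather than a data problem.)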
Comment 5 Marek Olšák 2015-02-26 11:23:14 UTC
(In reply to Stefan Dösinger from comment #4)
> Created attachment 113841
> Shader used to read the texture
> 
> Indeed changing the swizzle fixes the random output. I have attached the
> shader we use to sample the texture. We don't have a swizzle on the
> texture2D statement, but we do swizzle the output variable, and apparently
> the optimizer merges that.

So it's a compiler bug.

> 
> If I use the RGBA values returned by the texture sampling directly I get a
> solid color as expected. However, the value is off quite a bit: Instead of
> 0x7f I get 0x6c.
> 
> The texture data we use is this:
> 
> static const char ati1n_data[] =
> {
>     /* A 4x4 texture with the color component at 50%. */
>     0x7f, 0x7f, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
> };

That's expected. ATI1N operates at lower precision. I'm afraid I can't change that.
Comment 6 Stefan Dösinger 2015-02-26 11:46:17 UTC
(In reply to Marek Olšák from comment #5)
> So it's a compiler bug.
In which sense? Is there something in the spec that tells me I should expect garbage when I use texture2D().xxxx? Or is this something the driver tells the compiler and the compiler is supposed to generate a swizzle-free texture2D statement?

> > static const char ati1n_data[] =
> > {
> >     /* A 4x4 texture with the color component at 50%. */
> >     0x7f, 0x7f, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
> > };
> 
> That's expected. ATI1N operates at lower precision. I'm afraid I can't
> change that.
Windows gives me a proper value (0x7f, which is pretty close to 0x80). Do you have any idea how it does that?

I can try to find a scheme behind the imprecision if it helps you. There may be something like an off-by-one error that the driver can account for.
Comment 7 Marek Olšák 2015-02-26 13:26:40 UTC
A compiler bug is a compiler bug. Sorry, I don't know how to make that statement clearer.

If you run the rgtc-teximage-01 piglit test, you'll see that it looks like the red channel only has 4 bits.
Comment 8 Stefan Dösinger 2015-03-02 13:34:32 UTC
I ran some more tests; the format seems to be operating at 3 bits of precision: I can produce 8 different output colors. Otherwise it seems to follow the spec, so I don't think we're accidentally feeding the data into an R3G3B2 texture.
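
For what it's worth, the values seen so far fit a k*36 grid for k = 0..7: the 0x6c from comment 4 is 3*36, and the 0xfc below is 7*36, which is what a 3-bit red value r expanded to 8 bits as (r << 5) | (r << 2) would produce.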

On Windows the format operates at the expected precision - I can get any output data from 0x00 to 0xff.

I skimmed the GPU docs for clues about what may cause this behavior but could not find anything. The things I checked were enabling/disabling filtering, making sure texture address handling follows the conditional NP2 texture rules, and disabling alpha blending. For the sake of testing I also tried disabling FBOs and all of our sRGB code.

I'm also quite sure that all 8 bits of the red0 and red1 inputs arrive on the GPU. I tested that by setting the code of each texel to 7 and then trying red0=1, red1=0 as well as red0=0, red1=1. The former gives the result 0 (interpolation between red0 and red1); the latter gives 0xfc (MAXRED). The same works for the input values 0x80 and 0x7f.

I tested the interpolation codes (e.g. red0=0x2, red1=0xa2, code 2 for each texel, then reducing red0 or red1 by 1), and it seems that the input to the interpolation is OK, but either the interpolation happens at a lower precision or the output is clamped afterwards.
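
For reference, unsigned RGTC1 decoding as specified by ARB_texture_compression_rgtc, at full 8-bit precision (a self-contained sketch, not driver code):

static unsigned char rgtc1_decode(unsigned char red0, unsigned char red1,
                                  unsigned code /* 3-bit texel code, 0..7 */)
{
    if (code == 0)
        return red0;
    if (code == 1)
        return red1;
    if (red0 > red1)   /* six interpolated values */
        return ((8 - code) * red0 + (code - 1) * red1) / 7;
    if (code == 6)     /* red0 <= red1: four interpolated values plus min/max */
        return 0x00;
    if (code == 7)
        return 0xff;
    return ((6 - code) * red0 + (code - 1) * red1) / 5;
}

This matches the tests above: red0=1, red1=0, code 7 gives (1*1 + 6*0)/7 = 0, while red0=0, red1=1, code 7 should give 0xff, so the observed 0xfc is exactly where the precision gets lost.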
Comment 9 Stefan Dösinger 2015-03-02 13:50:20 UTC
The reason why I am suspicious about the 3 bits precision and ATI1N is that according to the GPU register docs TX_FMT_ATI1N is separated from TX_FMT_3_3_2 only by TX_FORMAT2.TXFORMAT_MSB. Is it possible that some code doesn't look at TXFORMAT_MSB, thinks it sees TX_FMT_3_3_2 and sets up some other part of the GPU to expect 3 bits of red data?

I'm skimming the code for something like this; so far I haven't found anything.
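
Schematically, the kind of code I'm looking for (purely hypothetical; the field layout, mask, and bit values are made up for illustration, not taken from r300g):

#define TX_FMT_LOW_MASK 0x1f         /* hypothetical low format field */
#define TXFORMAT_MSB    (1u << 14)   /* hypothetical TX_FORMAT2 bit */

static unsigned red_bits(unsigned tx_format, unsigned tx_format2)
{
    unsigned low = tx_format & TX_FMT_LOW_MASK;

    if (low != 0x0e)   /* hypothetical encoding shared by ATI1N and 3_3_2 */
        return 8;
    /* Only TXFORMAT_MSB tells the two apart; code that forgets to
     * check it would set up 3-bit red for ATI1N, which is exactly
     * the symptom seen here. */
    return (tx_format2 & TXFORMAT_MSB) ? 8 : 3;
}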
Comment 10 Stefan Dösinger 2015-03-02 14:12:51 UTC
Created attachment 113921
Precision fix

The attached patch seems to fix the precision problem for me. It seems to make sense given the surrounding code, but I have no real clue what's special about swizzles for these formats.
Comment 11 Marek Olšák 2015-03-09 20:05:35 UTC
Fixed by f710b99071fe4e3c2ee88cdcb6bb5c10298e014. Closing.

