Summary: | [OGL] Get incorrect texture image if texture's internalformat is COMPRESSED_RGB_FXT1_3DFX | | |
---|---|---|---|
Product: | Mesa | Reporter: | WuNian <nian.wu> |
Component: | Mesa core | Assignee: | mesa-dev |
Status: | RESOLVED NOTABUG | QA Contact: | |
Severity: | normal | | |
Priority: | medium | CC: | dri-devel, idr |
Version: | git | | |
Hardware: | x86 (IA32) | | |
OS: | Linux (All) | | |
Whiteboard: | | | |
i915 platform: | | i915 features: | |
Attachments: | test case | | |
Description
WuNian
2007-08-21 20:10:25 UTC
Created attachment 11201 [details]
test case
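The attached program is not reproduced in this report; the following is only a guess at what it presumably does, based on the output quoted in the comments below (the 2x2 size and all names are assumptions): upload a small RGB image with GL_COMPRESSED_RGB_FXT1_3DFX as the internal format, read it back with glGetTexImage, and print each returned texel next to the original.

```c
#include <stdio.h>
#include <GL/gl.h>
#include <GL/glext.h>

#define W 2
#define H 2

/* Assumes a current OpenGL context and that GL_3DFX_texture_compression_FXT1
 * is advertised in the extension string. */
static void run_test(const GLfloat *orig /* W*H*3 floats, row major */)
{
    GLfloat readback[W * H * 3];
    GLuint tex;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    /* Let the driver compress the image on upload. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_FXT1_3DFX,
                 W, H, 0, GL_RGB, GL_FLOAT, orig);

    /* Read the image back (decompressed) and compare with the source. */
    glGetTexImage(GL_TEXTURE_2D, 0, GL_RGB, GL_FLOAT, readback);

    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            const GLfloat *o = &orig[(y * W + x) * 3];
            const GLfloat *r = &readback[(y * W + x) * 3];
            printf("(%d, %d) get img: %f %f %f  origin img: %f %f %f\n",
                   x, y, r[0], r[1], r[2], o[0], o[1], o[2]);
        }
    }
    glDeleteTextures(1, &tex);
}
```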
I think this is just an artifact of the compression algorithm. IIRC, this algorithm works best when all the colors in a tile are colinear in RGB space. I tried other texel colors and got better results. For example, if you use just shades of red or green or blue, the results are about what you'd expect. I'm going to mark this as not-a-bug. Reopen if you feel otherwise.

I tried some other texel colors, but the result is also incorrect. For example:

    0, 0 origin img: 0.096774 1.000000 0.193548
    0, 1 origin img: 0.838710 0.612903 0.290323
    1, 0 origin img: 1.000000 1.000000 0.000000
    1, 1 origin img: 0.806452 0.129032 0.387097

    image format is : GL_COMPRESSED_RGB_FXT1_3DFX
    (0, 0) get img: 0.098039 1.000000 0.192157  origin img: 0.096774 1.000000 0.193548
    (0, 1) get img: 0.572549 0.415686 0.321569  origin img: 0.838710 0.612903 0.290323
    (1, 0) get img: 0.333333 0.709804 0.258824  origin img: 1.000000 1.000000 0.000000
    (1, 1) get img: 0.807843 0.125490 0.388235  origin img: 0.806452 0.129032 0.387097
    ---------------------------------------------------------------
    0, 0 origin img: 0.096774 1.000000 1.000000
    0, 1 origin img: 1.000000 0.000000 0.000000
    1, 0 origin img: 1.000000 1.000000 0.000000
    1, 1 origin img: 0.000000 1.000000 0.000000

    image format is : GL_COMPRESSED_RGB_FXT1_3DFX
    (0, 0) get img: 0.000000 1.000000 0.000000  origin img: 0.096774 1.000000 1.000000
    (0, 1) get img: 1.000000 1.000000 0.000000  origin img: 1.000000 0.000000 0.000000
    (1, 0) get img: 1.000000 1.000000 0.000000  origin img: 1.000000 1.000000 0.000000
    (1, 1) get img: 0.000000 1.000000 0.000000  origin img: 0.000000 1.000000 0.000000

And if the texture size is 5x5, 9x9, etc., an incorrect texture image is also returned; if the texture size is 4x4 or 8x8, the returned image is correct. Anyway, reopening this bug.

COMPRESSED_RGB_FXT1_3DFX is a compression scheme based on 8x4-texel blocks. I don't think it's expected to work on textures that aren't a multiple of that size.

If the texture is not a multiple of the 8x4 block size, COMPRESSED_RGB_FXT1_3DFX can pad it with additional texels to meet the 8x4 requirement (see the padding sketch below), so it should compress a texture of any size correctly, in spite of some precision loss.

(In reply to comment #5)
> If the texture is not a multiple of the 8x4 block size, COMPRESSED_RGB_FXT1_3DFX
> can pad it with additional texels to meet the 8x4 requirement, so it should
> compress a texture of any size correctly, in spite of some precision loss.

That's true; I think the requirement to be able to compress non-8x4-sized textures comes from the fact that this needs to work with small mipmaps even for power-of-two-sized textures. However, in your example the colors are not even close to colinear. Remember, you have only 2 "true" colors per 4x4 block. It looks like the algorithm picks 2 of the original colors and interpolates the other 14 texels in a block between them (that is, it does not try to find more optimal base colors outside the original 4x4 block). This is probably suboptimal for quality, but it is not really a bug.

FXT1 comprises four different compressed block formats, and the 3DFX_texture_compression_FXT1 spec says: "During the compression phase, the encoder selects one of the four formats for each block based on which encoding scheme results in best overall visual quality." Mesa has implemented all four compressed formats in main/texcompress_fxt1.c, but only uses two of them - MIXED and ALPHA - in fxt1_encode()->fxt1_quantize(). So we can improve Mesa's implementation to use all four formats.
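A minimal sketch of the padding idea described in the comments above, assuming nothing about Mesa's actual encoder: FXT1 stores 8x4-texel blocks of 128 bits each, so a non-block-aligned image has to be rounded up to the next block multiple before encoding; here the fill texels are produced by clamping to the image edge. Both function names are invented for illustration.

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical helper: size in bytes of the FXT1 data for a WxH image,
 * rounding each dimension up to the 8x4 block grid (128 bits per block). */
static size_t fxt1_compressed_size(int width, int height)
{
    int wblocks = (width + 7) / 8;
    int hblocks = (height + 3) / 4;
    return (size_t)wblocks * hblocks * 16;
}

/* Hypothetical helper: copy an RGB float image into a block-aligned
 * buffer, repeating the last row/column to fill the padding. */
static float *pad_to_block_multiple(const float *src, int w, int h,
                                    int *padded_w, int *padded_h)
{
    int W = ((w + 7) / 8) * 8;
    int H = ((h + 3) / 4) * 4;
    float *dst = malloc((size_t)W * H * 3 * sizeof *dst);

    if (!dst)
        return NULL;
    for (int y = 0; y < H; y++) {
        int sy = (y < h) ? y : h - 1;        /* clamp to last source row */
        for (int x = 0; x < W; x++) {
            int sx = (x < w) ? x : w - 1;    /* clamp to last source column */
            memcpy(&dst[(y * W + x) * 3], &src[(sy * w + sx) * 3],
                   3 * sizeof(float));
        }
    }
    *padded_w = W;
    *padded_h = H;
    return dst;
}
```

For the 2x2 and 5x5 cases in this report, this would produce 8x4 and 8x8 buffers respectively; only the texels inside the original image matter for the comparison afterwards.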
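To make the quality argument above concrete (two base colors per block, every other texel snapped to a value interpolated between them), here is a toy quantizer under the same assumption; it is not Mesa's fxt1_quantize(), and all names are hypothetical.

```c
/* Squared distance between two RGB colors. */
static float dist2(const float *a, const float *b)
{
    float dr = a[0] - b[0], dg = a[1] - b[1], db = a[2] - b[2];
    return dr * dr + dg * dg + db * db;
}

/* Toy block quantizer: build a 4-entry palette by interpolating between
 * two base colors, map each of the 16 texels to its nearest palette
 * entry, and return the accumulated squared error.  Colors far from the
 * base0-base1 line in RGB space necessarily come back wrong, which is
 * the effect seen in the readback values above. */
static float quantize_block(const float texels[16][3],
                            const float base0[3], const float base1[3],
                            int indices[16])
{
    float palette[4][3];
    float err = 0.0f;

    for (int i = 0; i < 4; i++) {
        float t = (float)i / 3.0f;
        for (int c = 0; c < 3; c++)
            palette[i][c] = base0[c] + t * (base1[c] - base0[c]);
    }

    for (int i = 0; i < 16; i++) {
        int best = 0;
        float best_d = dist2(texels[i], palette[0]);
        for (int j = 1; j < 4; j++) {
            float d = dist2(texels[i], palette[j]);
            if (d < best_d) {
                best_d = d;
                best = j;
            }
        }
        indices[i] = best;
        err += best_d;
    }
    return err;
}
```

An encoder that used all four FXT1 block modes, as the spec quote above suggests, could run a candidate encoding like this once per mode and keep the one with the smallest error.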
Of course, the algorithm for selecting the best format for a texture block will be complicated.

Please change the component to Mesa core; this is not a driver-specific issue.

There is a way to determine whether this is an artifact of the compression algorithm or a bug in Mesa's implementation: compare against the image generated by 3dfx's FXT1 compression tool. If the texture generated by that tool passes the error metric (a sketch of one possible metric follows at the end of this report), then the bug is certainly in Mesa's implementation. The old 3dfx compression tool can be found at: http://www.falconfly.de/downloads/fxt1.tar.gz

I don't know this issue well; I'm just posting the latest result for the attached case:

    imgWidth is 2, imgHeight is 2
    0, 0 origin img: 0.096774 1.000000 0.193548
    0, 1 origin img: 0.838710 0.612903 0.290323
    1, 0 origin img: 0.806452 0.129032 0.387097
    1, 1 origin img: 0.806452 0.129032 0.387097

    image format is : GL_COMPRESSED_RGB_FXT1_3DFX
    (0, 0) get img: 0.098039 1.000000 0.192157  origin img: 0.096774 1.000000 0.193548
    (0, 1) get img: 0.572549 0.415686 0.321569  origin img: 0.838710 0.612903 0.290323
    (1, 0) get img: 0.807843 0.125490 0.388235  origin img: 0.806452 0.129032 0.387097
    (1, 1) get img: 0.807843 0.125490 0.388235  origin img: 0.806452 0.129032 0.387097

This result is against mesa master commit 0e8a5a84742adf6e99236f246c77325fad174204 ("Mass version move, cvs -> git").

No activity on this bug for 2 years, so I think it is no longer active. Feel free to reopen if it is still relevant.
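As a rough illustration of the error-metric comparison suggested in the comments above (the metric itself is an assumption; neither Mesa nor the 3dfx tool defines this exact one), a per-channel RMS difference between the original image and the image read back after compression could be computed for both encoders and compared:

```c
#include <math.h>

/* Root-mean-square difference over all RGB channels of a WxH float image.
 * A much larger value for Mesa's readback than for the 3dfx tool's output
 * would point at Mesa's encoder rather than at FXT1 itself. */
static double rms_error(const float *orig, const float *readback,
                        int width, int height)
{
    double sum = 0.0;
    int n = width * height * 3;

    for (int i = 0; i < n; i++) {
        double d = (double)orig[i] - (double)readback[i];
        sum += d * d;
    }
    return sqrt(sum / n);
}
```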