I am wondering if there is a way to fix the following problem with the Intel HD 3000 video driver on Oneiric. As can be seen from the linked images below, the problem is that in *some* games (namely Aquaria and the native Myth II Soulblighter client) all 2D textures inside an OpenGL (3D) context have visible edges around them, typically making a bunch of rectangles that should not be visible (and are not visible on other video cards). This apparently affects all textures, both those that should be transparent (e.g. have content only in the middle and nothing around the edges) and opaque ones. I am wondering if this is a bug, or a setting issue that I can somehow fix.

Since only some apps are affected, mainly an indie game (Aquaria) and an older game (Myth II), I suspect they use some older and possibly less common OpenGL approach for displaying 2D textures inside a 3D (OpenGL) context, for which the current Intel driver simply does not filter the texture edges properly, resulting in the said lines... It is almost as if the texture filtering attempts to take into account non-existent pixels outside the texture, and thus the edge pixels of the texture change in color...

http://img59.imageshack.us/img59/6900/intel3000oneirictexture.jpg
http://img341.imageshack.us/img341/457/mythintel3000rectangles.jpg

Forgot to mention, this is on a new HP dm1-4050, which has a Sandy Bridge chipset with an integrated Intel HD 3000. The system is running Oneiric with the latest updates (as of today), including the latest PPA Intel driver (but the vanilla Oneiric Xorg). The problem is apparent in the following 3D games: Aquaria and Myth II Soulblighter. There might be more...
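To make the hypothesis concrete, here is a rough sketch (my own illustration, not code from either game) of the kind of legacy texture state that would sample "pixels outside the texture" in exactly this way:

    /* Hypothetical sprite setup (not taken from Aquaria or Myth II).
     * With the legacy GL_CLAMP wrap mode, GL_LINEAR filtering near the
     * texture's edge may blend in the sampler's border color, i.e.
     * "pixels outside the texture". If the driver programs that border
     * color incorrectly, every textured quad gets a visible outline.
     * GL_CLAMP_TO_EDGE clamps to the outermost texels instead and
     * never touches the border color. */
    #include <GL/gl.h>

    void setup_sprite_texture(GLuint tex)
    {
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
    }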
Just got a reply from the Aquaria devs; the following file includes all the relevant OpenGL drawing code, which will hopefully help isolate the drawing function that may be causing the problem:

http://hg.icculus.org/icculus/aquaria/file/tip/BBGE/OpenGLStubs.h
An additional important issue has surfaced in discussing this matter with the Aquaria dev. Both Aquaria and Myth II (the two games that have problems) are 32-bit binaries, while I have a 64-bit system (Ubuntu Oneiric), and as far as I know none of the native 64-bit apps have this issue. This leads me to believe that perhaps the 32-bit libs have not been properly built in Oneiric (the libGL* stuff, I think)... Any ideas as to where to start researching this angle?
I've finally gotten around to building a 64-bit (native) version of Aquaria, and it does not exhibit the said artifacts, so this definitely has to do with a 32-bit app running on a 64-bit system and its reliance on the ia32-libs package (in Ubuntu 11.10). I tried both ia32-libs and ia32-libs-multiarch:i386, and both builds exhibit the same problem, while the native 64-bit version doesn't. The only libraries that appear to have anything to do with GL are libGL.so.1 and libGLU.so.1. Below is the ldd output for the Myth2 binary:

    $ ldd Myth2
        linux-gate.so.1 =>  (0xf772a000)
        libGLU.so.1 => /usr/lib32/libGLU.so.1 (0xf7693000)
        libstdc++.so.6 => /usr/lib32/libstdc++.so.6 (0xf75a8000)
        libm.so.6 => /lib32/libm.so.6 (0xf757d000)
        libgcc_s.so.1 => /usr/lib32/libgcc_s.so.1 (0xf755f000)
        libc.so.6 => /lib32/libc.so.6 (0xf73e5000)
        libGL.so.1 => /usr/lib/libGL.so.1 (0xf7391000)
        libpthread.so.0 => /lib32/libpthread.so.0 (0xf7376000)
        libdl.so.2 => /lib32/libdl.so.2 (0xf7371000)
        /lib/ld-linux.so.2 (0xf772b000)
        libglapi.so.0 => /usr/lib32/libglapi.so.0 (0xf735a000)
        libX11.so.6 => /usr/lib32/libX11.so.6 (0xf7224000)
        libXext.so.6 => /usr/lib32/libXext.so.6 (0xf7211000)
        libXdamage.so.1 => /usr/lib32/libXdamage.so.1 (0xf720d000)
        libXfixes.so.3 => /usr/lib32/libXfixes.so.3 (0xf7207000)
        libXxf86vm.so.1 => /usr/lib32/libXxf86vm.so.1 (0xf7200000)
        libdrm.so.2 => /usr/lib32/libdrm.so.2 (0xf71f4000)
        libxcb.so.1 => /usr/lib32/libxcb.so.1 (0xf71d5000)
        librt.so.1 => /lib32/librt.so.1 (0xf71cc000)
        libXau.so.6 => /usr/lib32/libXau.so.6 (0xf71c8000)
        libXdmcp.so.6 => /usr/lib32/libXdmcp.so.6 (0xf71c0000)
Here are also relevant links to correspondence on other forums regarding this issue:

http://askubuntu.com/questions/95508/3d-textures-on-intel-hd-3000-have-rectangles-that-should-not-be-visible-on-64-bi
http://www.bit-blot.com/forum/index.php?topic=4306.msg32740#msg32740
http://tain.totalcodex.net/forum/viewtopic.php?f=2&t=5800&p=57041#p57041
Here's another update. I tried running the 32-bit version of Ubuntu on the same hardware, and the same problem persists. So at this point the question is whether the same problem occurs on other distributions (Intel HD 3000 + the Myth II demo, obtainable here: http://tain.totalcodex.net/items/show/myth-ii-demo-linux). If so, then it is definitely something in the current state of Mesa + the Intel 3D driver; otherwise it is a packaging issue in Ubuntu... Any pointers would be most appreciated.
Hi! I think these rectangle artifacts were due to the Gen5+ border color bug. I've committed a fix, which you can get either from Git master or from the 8.0 branch (to be released next month). If you could test this with the 8.0 branch, that would be great. Carl Worth reported that this patch fixed Aquaria on his Sandybridge system; I haven't tried it myself yet. For reference, here's the commit:

    commit 8e42dadf70ea2d359ef2c6d07a9a4da6d0b8e2da
    Author: Kenneth Graunke <kenneth@whitecape.org>
    Date:   Fri Jan 20 03:33:40 2012 -0800

        i965: Fix border color on Sandybridge and Ivybridge.

        While reading through the simulator, I found some interesting
        code that looks like it checks the sampler default color pointer
        against the bound set in STATE_BASE_ADDRESS. On failure, it
        appears to program it to the base address itself.

        So I decided to try programming a legitimate bound, and lo and
        behold, border color worked.

        +92 piglits on Sandybridge. Also fixes Lightsmark on Ivybridge.

        NOTE: This is a candidate for stable release branches.

        Bugzilla: https://bugs.freedesktop.org/show_bug.cgi?id=28924
        Bugzilla: https://bugs.freedesktop.org/show_bug.cgi?id=38868

        Signed-off-by: Kenneth Graunke <kenneth@whitecape.org>
        Reviewed-by: Yuanhan Liu <yuanhan.liu@linux.intel.com>
        Reviewed-by: Eric Anholt <eric@anholt.net>
        (cherry picked from commit c25e5300cba7628b58df93ead14ebc3cc32f338c)

If this does fix your issue, feel free to close this bug. Thanks for the report!
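For anyone who wants a minimal application-side way to exercise the path this patch fixes, here is a hedged sketch (my own illustration, not one of the piglit tests): any filtered texture sampled outside its edges with a border-clamping wrap mode goes through the sampler default (border) color that the commit above repairs.

    /* Hypothetical snippet: sampling outside [0,1] texture coordinates
     * with GL_CLAMP_TO_BORDER should return the border color set here
     * (red). On a driver with the border color bug, garbage comes back
     * instead. */
    #include <GL/gl.h>

    void exercise_border_color(GLuint tex)
    {
        static const GLfloat red[4] = { 1.0f, 0.0f, 0.0f, 1.0f };
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_BORDER);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_BORDER);
        glTexParameterfv(GL_TEXTURE_2D, GL_TEXTURE_BORDER_COLOR, red);
    }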
Is there an easier way to try the new driver? E.g., could I simply compile just the Intel driver and run it on whatever I currently have (Oneiric), or do I need the entire 8.0 branch? Either way, where can I get the 8.0 branch from Git? Please advise.

BTW, regarding Aquaria: with the latest builds the artifacts are gone for me, so I am not sure if this patch fixed it or if it is something they fixed in their builds (I haven't tried this build yet). However, with Myth2 I've tried everything; the devs even provided a native Oneiric build, and the artifacts are still there. Yet under Wine there are no problems. So, for the time being, I would say Aquaria is not a good benchmark for the fix; Myth2 is.

http://www.bit-blot.com/forum/index.php?topic=4306.msg32740#msg32740
http://tain.totalcodex.net/forum/viewtopic.php?f=2&t=5800&p=57041#p57041
To further complicate matters, I am running a 64-bit distribution. If I am compiling these drivers, do I need to compile both 32-bit and 64-bit versions? I ask because the app in question (Myth2) is a 32-bit-only app... Any assistance in this matter would be most appreciated, since I have very little experience with Xorg and 32-bit vs. 64-bit matters...
Ok, I managed to recompile libdrm_intel (the drm folder) and the libGL* stuff (the mesa folder), and the problem is indeed fixed. Many thanks for the quick turnaround. One small issue is that I get a terribly low framerate in Myth even though the artifacts are gone. I suspect this is because I did not recompile the rest of Xorg. Does this make sense, or could it be a regression in the new driver set?
Hmm. Maybe you picked up swrast. You'll need to build a 32-bit version of Mesa for Myth II. In your Mesa source dir:

    make clean
    CFLAGS='-m32' CXXFLAGS='-m32' ./configure --disable-glu \
        --with-gallium-drivers="" --with-dri-drivers=swrast,i965 \
        --enable-debug --enable-32-bit
    make -j5

Then:

    export LIBGL_DRIVERS_PATH=/path-to-your-mesa-source/lib
    export LD_LIBRARY_PATH=$LIBGL_DRIVERS_PATH:$LD_LIBRARY_PATH

(obviously putting in your own path). That should make Myth2 pick up your newly built Mesa rather than your system-installed one; by doing this, you don't need to run 'make install'. Then run Myth2.

I downloaded the Myth2 demo, and the patch does indeed fix the problem.
Many thanks for the reply as well as the prompt bugfix! (Gotta love open source ;-) Just to confirm, though: are you saying I don't need to compile anything other than Mesa from the Git branch? Also, if my framerate was bad, are you suggesting that swrast was at fault? If so, why does your suggested build include it, when it is causing problems? (Sorry, just not that knowledgeable about the Xorg/Mesa architecture.) Once again, many thanks!
Forgot to mention, I did build mine as 32-bit (did "make linux-x86-32").
Yeah, Mesa is enough; there's no need to recompile all of X. Mesa might require a newer libdrm, but that's it.

Sometimes, if your hardware driver (i965_dri.so) doesn't work for whatever reason (build failure, 32-vs-64-bit mismatch, bad paths, etc.), Mesa falls back to software rendering via swrast, which is really slow. It might also fall back to indirect rendering, which sends all OpenGL calls through the X server and can likewise cause trouble in mixed 32/64-bit situations. You can check which renderer a process actually got with the snippet below.

You could also wait for the xorg-edgers PPA to pick this up, if that makes life easier. It looks like their Precise Pangolin packages already include the fix, but Oneiric is a few days behind. Also, the release should happen in a week or two.
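For completeness, here is a quick hedged sketch (my own, assuming X11/GLX development headers and that the chosen visual matches the root window's, which is typical) that reports the active renderer; "Software Rasterizer" in GL_RENDERER means the swrast fallback kicked in:

    /* check_renderer.c: create a minimal GLX context and report which
     * OpenGL implementation the process actually got. Build the 32-bit
     * variant with: gcc -m32 check_renderer.c -o check_renderer -lGL -lX11 */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <GL/gl.h>
    #include <GL/glx.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
        XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
        if (!vi) { fprintf(stderr, "no suitable visual\n"); return 1; }

        /* The final True requests a direct-rendering context; indirect
         * contexts route all GL calls through the X server. */
        GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
        Window win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                         0, 0, 64, 64, 0, 0, 0);
        glXMakeCurrent(dpy, win, ctx);

        printf("direct rendering: %s\n",
               glXIsDirect(dpy, ctx) ? "yes" : "no");
        printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));

        glXMakeCurrent(dpy, None, NULL);
        glXDestroyContext(dpy, ctx);
        XCloseDisplay(dpy);
        return 0;
    }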
Cool! Many thanks for all your help everyone! You guys rock! :-)