GTK+ apps render quite slowly in my setup (it may take up to 5 seconds to draw a 1920x1200 Transmission window). The X server nearly always consumes 100% of CPU (thanks $deity, only one core). Oprofile suggests the time is spent in pixman:

samples  %        image name
------------------------------------------
239796   77.5906  libpixman-1.so.0.14.0
 27629    8.9399  no-vmlinux
  5421    1.7541  libc-2.10.1.so
  4136    1.3383  libcairo.so.2.10800.8

In pixman_rasterize_edges, in particular:

samples  %        symbol name
------------------------------------------
232546   96.9798  pixman_rasterize_edges

Environment:
* 01:00.0 VGA compatible controller: ATI Technologies Inc Mobility Radeon HD 3400 Series
* X.Org X Server 1.6.5
* Linux vertex 2.6.32-5-generic-pae #6-Ubuntu SMP Mon Nov 23 13:10:32 UTC 2009 i686 GNU/Linux (actually rc-something)
* (II) Module radeon: vendor="X.Org Foundation", compiled for 1.6.5, module version = 6.12.99, Module class: X.Org Video Driver, ABI class: X.Org Video Driver, version 5.0
* Mesa 7.7.0~git20091129
* libdrm-radeon 2.4.15+git20091125.6f66de98
* libdrm 2.4.15+git20091125.6f66de98
* Xmonad 0.9, no compositing in use
* libpixman 0.14.0

X.org log and dmesg are attached.
Created attachment 31623 [details] Xorg.0.log
Created attachment 31625 [details] dmesg
This should be better with xserver >= 1.7.x, thanks to a new EXA feature called 'mixed pixmaps'.
Yeah, it's now lightning-fast. Thanks.