I hate the standard options for colour on an 8-bit display (pseudo, static, 884 true, etc.) but sometimes it cannot be avoided, especially with the vesa driver. The core applications tend to work, but there are normally some other applications that will fail.

I believe the solution to this is for the X server to have the option to present a 24-bit frame buffer to the applications and do its own conversion to the physical 8-bit staticcolor display memory, to produce screens like these:

http://homepage.ntlworld.com/robert.debath/ex1.png
http://homepage.ntlworld.com/robert.debath/ex2.png

NB: the original for ex2.png is this nasty image:

http://homepage.ntlworld.com/robert.debath/ex3.png

To this end I dug out and revisited a very fast 24-bit to 8-bit converter routine from the heyday of 1MByte cards. It converts the buffer, using simple array lookups, to a chessboard-halftoned display using one of many constructed palettes. These palettes are a regular colour cube (eg 6x10x4) plus a set of grey levels (eg 16). To convert a pixel you do this:

  ps8byte = clut_conv[ idx_red[pixptr[P_RED]] +
                       idx_grn[pixptr[P_GRN]] +
                       idx_blu[pixptr[P_BLU]] +
                       ((rowno ^ columnno) & 1) ];

This conversion is probably fast enough for video. The trade-off is about half a megabyte of extra memory for the tables, plus the 3MB 1024x768x24-bit frame buffer.

The problem I've run into is simply that the X server is very complex, and I can't see where a working place to do this conversion would be, let alone the best place! With my graphics card a segfault seems to need a power off.

My code is at:

http://homepage.ntlworld.com/robert.debath/pnmhalftone.zip

Compile with:

  cc -o pnmhalftone pnmhalftone.c -lpnm
or
  cc -c -DMKLIB pnmhalftone.c

If you want to go with this I'm happy to help in any way that I can, but for the moment I've given up trying to integrate it into X myself.
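For illustration, a hedged sketch of the table-driven chessboard halftone described above might look like the following. It assumes a plain 6x6x6 colour cube (216 palette entries) with no extra grey ramp, and all names are mine, not the actual identifiers from pnmhalftone.c:

```c
#include <stdint.h>

#define NLEVELS 6                       /* cube levels per component */

/* level_tab[phase][v]: quantised cube level for 8-bit component v on
 * a checkerboard phase of 0 or 1.  Phase 0 rounds up from a
 * fractional part of 1/4, phase 1 from 3/4, so neighbouring pixels
 * alternate between the two nearest cube levels in proportion to the
 * fractional part. */
static uint8_t level_tab[2][256];

void halftone_init(void)
{
    for (int v = 0; v < 256; v++) {
        int scaled = v * (NLEVELS - 1) * 4 / 255;  /* 2 fractional bits */
        int base = scaled >> 2;
        int frac = scaled & 3;
        level_tab[0][v] = (uint8_t)(base + (frac >= 1));
        level_tab[1][v] = (uint8_t)(base + (frac >= 3));
    }
}

/* Convert one 24-bit pixel at screen position (x, y) to a palette
 * index in the 6x6x6 cube, using only table lookups plus the
 * chessboard bit, as in the expression above. */
uint8_t halftone_pixel(uint8_t r, uint8_t g, uint8_t b, int x, int y)
{
    int phase = (x ^ y) & 1;            /* the chessboard bit */
    return (uint8_t)(level_tab[phase][r] * NLEVELS * NLEVELS
                   + level_tab[phase][g] * NLEVELS
                   + level_tab[phase][b]);
}
```

The original's grey ramp and 6x10x4 cube would change the table contents but not the per-pixel work, which stays at three lookups and three adds.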
Sorry about the phenomenal bug spam, guys. Adding xorg-team@ to the QA contact so bugs don't get lost in future.
We do this already. Composite exposes a depth-32 visual, even on pseudocolor displays.
Sure, it exposes a 32-bit visual, but no normal programs use it. Xterm doesn't, gdk doesn't, firefox doesn't, and most others don't. It looks like xloadimage might, but that's the only one I found. Normal programs expect the X server to default to the best visual.

The POINT is that the X server should react to clients as if it actually has a 24+ bit display; it should not present the visual as an afterthought that the application has to jump through hoops to use (it's not an XCreateSimpleWindow any more, after all).

In fact, I've been learning Xlib since I raised this issue and I now think that this IS an xserver issue, simply because it's so much more complicated for clients to navigate through all the options to do with visuals than it is to just use whatever the server thinks is best. With this level of complexity it becomes so difficult to verify the client code that it will probably be wrong.

So what I would like to see is a configuration where the X server provides a DEFAULT 24-bit visual whatever the hardware provides. For 16/15-bit hardware the X server does a quartertone ordered dither (2x2 cell, 4 levels) to raise the average depth to 7:8:7 (or 7:7:7), and for 8-bit it dithers to a nice custom palette, ALL WITHOUT EVEN TELLING a dumb client what's happening. At that point 99.9% of applications will correctly ignore Visuals, whereas now they just assume you'll be on a 16/24-bit "TrueColor" display.
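For the 16-bit case, a hedged sketch of the 2x2-cell, 4-level ("quartertone") ordered dither proposed above could look like this; the Bayer matrix, the bias arithmetic and the names are my illustration, not existing server code:

```c
#include <stdint.h>

static const int bayer2[2][2] = { {0, 2},
                                  {3, 1} };  /* 2x2 Bayer thresholds */

static int clamp255(int v) { return v > 255 ? 255 : v; }

/* Dither one 8-bit-per-component pixel down to RGB565.  Each
 * component is biased by d quarters of its quantisation step (8 for
 * the 5-bit channels, 4 for the 6-bit green) and then truncated;
 * over a 2x2 cell the truncated bits average out, giving roughly two
 * extra bits of apparent depth per channel. */
uint16_t dither_rgb565(int r, int g, int b, int x, int y)
{
    int d = bayer2[y & 1][x & 1];           /* 0..3, quarter steps */
    int r5 = clamp255(r + 2 * d) >> 3;
    int g6 = clamp255(g + d)     >> 2;
    int b5 = clamp255(b + 2 * d) >> 3;
    return (uint16_t)((r5 << 11) | (g6 << 5) | b5);
}
```

The same structure, with a 256-entry threshold table per component feeding a palette lookup, would cover the 8-bit case.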
Then use a toolkit instead of Xlib, and complain to them if they don't provide the hooks you want.