Bug 702 - Radeon only supports a maximum point size of 1.0.
Summary: Radeon only supports a maximum point size of 1.0.
Status: RESOLVED WONTFIX
Alias: None
Product: Mesa
Classification: Unclassified
Component: Drivers/DRI/R100
Version: git
Hardware: x86 (IA32) Linux (All)
Importance: high normal
Assignee: Default DRI bug account
 
Reported: 2004-06-01 07:52 UTC by Adam Jackson
Modified: 2010-11-10 16:16 UTC
CC: 2 users



Attachments
Screenshot of modified "point" program from mesa-demos on r200 (3.29 KB, image/png)
2010-11-05 10:50 UTC, Daniel Richard G.
Screenshot of modified "point" program from mesa-demos on i915 (2.40 KB, image/png)
2010-11-05 10:51 UTC, Daniel Richard G.
trivial possible fix (475 bytes, patch)
2010-11-05 21:54 UTC, Roland Scheidegger

Description Adam Jackson 2004-06-01 07:52:44 UTC
originally
http://sourceforge.net/tracker/index.php?func=detail&aid=735997&group_id=387&atid=100387

control vertices in Maya are hardly visible
In Maya 4.5 (and 4.0), control vertices appear as 1-pixel
dots in the viewport, making it impossible to see them
when the scene contains complex geometry. I have a
Radeon Mobility M6 16MB. I know this is a bug with the
hardware OpenGL renderer, because when I view the
identical scene using software rendering, the control
vertices are painted properly. Please see the attached
screenshots (software vs. hardware) and look at points
surrounding the sphere. In software, they appear as
little magenta squares, but in hardware, they are tiny
dots.

Followups:

Comments

Date: 2003-11-04 11:48
Sender: bassaminator

update:
I've been using the patch since the last post (10-23) and
have had no problems whatsoever; control vertices are
displaying fine in blender, I can play with their sizes, and
it feels responsive with no stability issues. 
Any chance this could make it into the main tree? What are
the issues that could prevent that?

Date: 2003-10-23 07:20
Sender: bassaminator

Thanks idr! Your patch worked (I had to add some #defines in
radeon_tcl.h manually), and the result works so far for me in
unpatched blender. I'll play with it some more over the next
couple of weeks and post here if I have any problems (or not).
BTW, I'm using XFree86 4.3.0 at the moment.

Date: 2003-10-20 15:33
Sender: idr

Have you tried the following patch?  It's fairly old, so it
may not apply cleanly.

http://marc.theaimsgroup.com/?l=dri-devel&m=105862837814769&w=2

Whether or not this is an issue that the driver should
handle (and I do agree with you on that), the
GL_ARB_point_parameters spec is explicit (outside the "Errors"
section) that the driver doesn't have to: the maximum point
size is the maximum size, and anything beyond that is an error.

http://oss.sgi.com/projects/ogl-sample/registry/ARB/point_parameters.txt
 

Date: 2003-08-29 21:19
Sender: bassaminator

Workaround:

Hi, I managed to hack the blender source code to work around
this. I removed the glBegin() and glEnd() calls, created a
bitmap array, and replaced each glVertex*() call in the block
with glRasterPos*() (same argument) followed by glBitmap(),
as follows:

	
        GLubyte Squaredot[9] = { 0xff, 0xff, 0xff,
                                 0xff, 0xff, 0xff,
                                 0xff, 0xff, 0xff };

        glPointSize(3.0);

        /*glBegin(GL_POINTS);*/
        cpack(0);
        /*glVertex3fv(sta);*/
        glRasterPos3fv(sta);
        glBitmap(3, 3, 0.0, 0.0, 0.0, 0.0, Squaredot);

        /*glVertex3fv(end);*/
        glRasterPos3fv(end);
        glBitmap(3, 3, 0.0, 0.0, 0.0, 0.0, Squaredot);

        /*glEnd();*/
        glPointSize(1.0);

now my control points are nice and fat again. Thanks for the
suggestion, idr, though I still think this should be handled
by the driver.
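For reference, the Squaredot array in the workaround above is just an all-ones bitmap. A small helper (my own illustration, not code from this bug report) can build such a square-dot bitmap for any size n, assuming glPixelStorei(GL_UNPACK_ALIGNMENT, 1) has been called so each bitmap row occupies exactly (n + 7) / 8 bytes:

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical helper: fill `buf` with an n-by-n all-ones bitmap
 * suitable for glBitmap(n, n, ...). Assumes GL_UNPACK_ALIGNMENT is 1,
 * so each row is exactly (n + 7) / 8 bytes. Returns the total byte
 * count written, so the caller can size `buf` appropriately. */
size_t square_dot_bitmap(unsigned char *buf, int n)
{
    size_t row_bytes = (size_t)(n + 7) / 8;
    size_t total = row_bytes * (size_t)n;
    memset(buf, 0xff, total);   /* every bit set: a solid n-by-n square */
    return total;
}
```

For n = 3 this yields 3 bytes, so the 9-byte array in the workaround is over-allocated but harmless.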

Date: 2003-06-27 13:21
Sender: pahohoho

So is there any status on this? Is anyone trying to implement 
option 2?

I actually did get the Maya source code and tried fooling 
around with it. Unfortunately it took me weeks to finally get it
to build properly under my Linux distribution (the Maya 
developers decided to only officially support Red Hat, and I 
use Gentoo). Then I tried modifying the Maya source code so 
that it would render bitmap icons instead of GL_POINTS when 
the point size was greater than 1.0. But I ran into myriad 
problems. Most were due to the fact that the glVertex 
commands can't be trivially converted to glBitmap commands, 
as they are contained between glBegin and glEnd, and the 
way Maya interfaces with OpenGL makes this particularly 
difficult. I spent about three weeks on this and have finally 
given up.

I talked to my friend about this problem and he thinks it's 
entirely the driver's responsibility to support software fallback
of larger point sizes, not the application. And as I mentioned 
in my last followup, the Windows driver does take care of it. 
So I think option 1 is out of the question.

Date: 2003-05-30 06:17
Sender: pahohoho

I also found that when I ran Maya for Windows under 
Windows XP on the same machine (same ATI Radeon card 
using hardware rendering), the control vertices are rendered 
properly. If Windows can do it perfectly, I really don't see why
it should be a problem to do it under Linux. And as 
bassaminator just posted, it was working fine before XFree86 
4.3. So what's going on?

Date: 2003-05-29 20:06
Sender: bassaminator

This happens also in blender and wings3d. In XFree86 versions
prior to 4.3, they were rendering normally. Does this mean
the older version of the driver did software rendering for
the control points?

Date: 2003-05-15 13:32
Sender: pahohoho

About your options:

1. It looks as though the Maya team has decided not to 
support the Radeons. (The only ATI cards they support are 
Fire GL ones.) So I doubt they'll try to accommodate it. Too 
bad I'm part of the StudioTools development team rather than 
the Maya one. :)

2. Why is this unpleasant? I'd like to try it out.

3. Well, I'm probably not clever enough for this. But I'd still 
like to attempt #2.

I've been looking through the code already, but so far I'm not 
sure where it is that points are rendered. Any tips?

Date: 2003-05-12 11:11
Sender: idr

The Radeon only supports a maximum point size of 1.0.  One
of three things needs to happen to resolve this.  

1. Maya needs to be modified to render the control points
differently when EXT_point_parameters is not supported or
the maximum point size is only 1.0.

2. The Radeon driver needs to be modified to support larger
point sizes and use a software fallback.

3. Someone needs to find a clever way to have the driver do
larger point sizes.

Options 1 and 3 are fairly unlikely and option 2 is fairly
unpleasant.
Comment 1 Adam Jackson 2004-06-08 10:01:36 UTC
mass reassign to dri-devel@, i'm no longer the default component owner.  sorry
for the spam.
Comment 2 Adam Jackson 2005-10-17 15:13:38 UTC
close, we have aliased points in mesa now, and aa points are probably not far
behind.
Comment 3 Roland Scheidegger 2005-10-17 18:09:53 UTC
(In reply to comment #2)
> close, we have aliased points in mesa now, and aa points are probably not far
> behind.
That's not quite true, support for large points was added to r200, but not
radeon. The radeon driver definitely would need to emulate large points with
tris in contrast to the r200 which can do it natively with its point sprite
primitive. (That said, it may not be that important nowadays. Things like
blender have been fixed to not expect large point size support.)
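The triangle-based emulation mentioned here is straightforward to sketch. The helper below is my own illustration (not actual driver code): it expands a point center and diameter into the six vertices of two screen-aligned triangles, which is roughly what a driver-side fallback for large points would have to emit.

```c
/* Illustrative sketch: expand a point at (cx, cy) with diameter `size`
 * into the six 2-D vertices of two triangles forming a screen-aligned
 * quad. `out` must have room for 12 floats (6 vertices * x,y). */
void point_to_tris(float cx, float cy, float size, float *out)
{
    float h  = size * 0.5f;
    float x0 = cx - h, y0 = cy - h;   /* lower-left corner  */
    float x1 = cx + h, y1 = cy + h;   /* upper-right corner */

    /* triangle 1: lower-left, lower-right, upper-right */
    out[0] = x0; out[1] = y0;
    out[2] = x1; out[3] = y0;
    out[4] = x1; out[5] = y1;

    /* triangle 2: lower-left, upper-right, upper-left */
    out[6]  = x0; out[7]  = y0;
    out[8]  = x1; out[9]  = y1;
    out[10] = x0; out[11] = y1;
}
```

A real fallback would also have to carry color and fog attributes per vertex, but the geometry expansion is the core of it.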
Comment 4 Daniel Richard G. 2010-11-05 10:50:52 UTC
Created attachment 40069 [details]
Screenshot of modified "point" program from mesa-demos on r200
Comment 5 Daniel Richard G. 2010-11-05 10:51:49 UTC
Created attachment 40070 [details]
Screenshot of modified "point" program from mesa-demos on i915
Comment 6 Daniel Richard G. 2010-11-05 11:06:09 UTC
This bug still exists on r200, and possibly Radeon generally.

I've modified mesa-demos-8.0.1/src/trivial/point.c with a glPointSize(16) call right before glBegin(GL_POINTS), and attached screenshots of the window it produces on both r200 and i915. The latter is correct, the former illustrates the bug. I get proper output using Nvidia's driver as well.

This bug is affecting Wings3D for me similarly to how it affects Maya for the original reporter; the program cannot display visible vertex points on mesh wireframes, and isolated vertices are all but impossible to see.

This behavior is inconsistent with other DRI drivers, and is particularly egregious on hardware (ATI FireGL 8800) that was purpose-built for CAD applications. This is not a request for enhancement, but a straight-up bug.

I am running Ubuntu Maverick on amd64 with kernel 2.6.36 plus Alex Deucher's patch from bug #25544, and Mesa packages from Ubuntu's xorg-edgers/radeon PPA built from git 20101103.
Comment 7 Roland Scheidegger 2010-11-05 21:10:13 UTC
(In reply to comment #6)
> This bug still exists on r200, and possibly Radeon generally.
I am quite sure this used to work on r200 (not r100 which can't do it natively). Aliased points should be supported up to size 2047 (though antialiased points are still limited to size 1). What does glxinfo -l say wrt point size?
Comment 8 Roland Scheidegger 2010-11-05 21:38:01 UTC
Hmm, actually I see the problem. The driver relies on ctx->_TriangleCaps and tests for DD_POINT_SIZE, but nothing ever sets it: update_tricaps would, but it's ifdef'd out since it was basically reverted; it looks like the revert wasn't quite complete.
Note that it should still work if point attenuation is used (e.g. the pointblast demo).
Comment 9 Roland Scheidegger 2010-11-05 21:54:43 UTC
Created attachment 40078 [details] [review]
trivial possible fix

Here's a simple patch which would restore the previous (2006...) behaviour.
Though I'm thinking the driver should probably just do this on its own depending on ctx->Point.Size.
Note the problem is that the hardware has two hw primitives for points: point sprite and point. Only the former can have a size larger than 1. I'm not quite sure (or can't remember) why we're not always using point sprites, but points when the size is 1 and it's not attenuated. Maybe it's faster.
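The primitive choice described above can be sketched as a pure function. This is my reading of the comment, not actual r200 driver code: only the point-sprite primitive can be wider than one pixel, so it must be selected whenever the point size exceeds 1.0 or distance attenuation may scale it up.

```c
/* Hypothetical names for the two hardware point primitives. */
enum hw_prim { HW_PRIM_POINT, HW_PRIM_POINT_SPRITE };

/* Sketch of the selection logic: the plain point primitive is only
 * safe when the size is exactly 1 and cannot grow via attenuation. */
enum hw_prim choose_point_prim(float point_size, int attenuated)
{
    if (point_size > 1.0f || attenuated)
        return HW_PRIM_POINT_SPRITE;
    return HW_PRIM_POINT;   /* plain 1-pixel points */
}
```

The bug discussed here is, in effect, that the broken DD_POINT_SIZE flag made the driver take the HW_PRIM_POINT branch even for large sizes.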
Comment 10 Daniel Richard G. 2010-11-06 00:01:19 UTC
Hi Roland, I see you've been here before!

I no longer have the Ubuntu Maverick install I was testing with earlier (I don't have the disk it was on anymore), but I do have here an Ubuntu Lucid install (same hardware) with Mesa 7.7.1, which gives me this:

$ glxinfo -l | grep -i point
    GL_ARB_occlusion_query, GL_ARB_point_parameters, GL_ARB_point_sprite, 
    GL_EXT_packed_pixels, GL_EXT_point_parameters, GL_EXT_polygon_offset, 
    GL_ALIASED_POINT_SIZE_RANGE = 1, 2047
    GL_SMOOTH_POINT_SIZE_RANGE = 1, 1

That seems to agree with what you said.

Your patch applies cleanly to 7.7.1, and indeed, it makes the points big! The modified "point" demo, "pointblast" (only w/"Point smooth off"), and Wings3D all now do as they should. Should I test your patch with bleeding-edge code?

I don't know what to make of the patch tweaking generic code instead of driver code, but it surely solves the problem for me. Could this, or some form of this, be committed into the tree?
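For context on those glxinfo ranges: GL clamps glPointSize requests to the implementation's supported range, so on the hardware above an aliased request of 16.0 stays 16.0 (range 1..2047) while a smooth, antialiased point collapses to 1.0 (range 1..1). A minimal sketch of that clamping (illustrative, not Mesa code):

```c
/* Clamp a requested point size to the [range_min, range_max] interval
 * the implementation reports, mirroring how GL treats out-of-range
 * glPointSize requests (sizes <= 0 are an error and not handled here). */
float clamp_point_size(float requested, float range_min, float range_max)
{
    if (requested < range_min) return range_min;
    if (requested > range_max) return range_max;
    return requested;
}
```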
Comment 11 Roland Scheidegger 2010-11-06 07:05:57 UTC
(In reply to comment #10)
> Your patch applies cleanly to 7.7.1, and indeed, it makes the points big! The
> modified "point" demo, "pointblast" (only w/"Point smooth off"), and Wings3D
> all now do as they should. Should I test your patch with bleeding-edge code?
That should still work the same.

> 
> I don't know what to make of the patch tweaking generic code instead of driver
> code, but it surely solves the problem for me. Could this, or some form of
> this, be committed into the tree?
I think either this should be committed, or instead it should be handled by the driver and the DD_POINT_SIZE flag completely removed, as it's currently just broken. Since the plan initially was to remove the whole _TriangleCaps stuff, and r200 is the only driver which makes use of this particular flag, I'm leaning towards the latter.

In fact, the driver is quite broken anyway wrt point size, since with vertex programs you could output the point size per vertex, but the driver might still use the 1-sized point primitive, since _TriangleCaps DD_POINT_SIZE only looks at the global point size. But I really have no idea why the driver tries to use the point primitive instead of point sprite for 1-pixel points, otherwise it would be easiest to just always use point sprite prim (at least for aa points) which would get rid of both bugs.
(The other DD_POINT flags are also problematic. Well, DD_POINT_SMOOTH is not, but that one just directly mirrors ctx->Point.Smooth. DD_POINT_ATTEN, however, is also meaningless with vertex programs. i915 uses this, and at a quick glance I don't think it gets it right either.)
Comment 12 Daniel Richard G. 2010-11-06 11:17:39 UTC
> I think either this should be commited or instead it should be handled
> by the driver and the DD_POINT_SIZE flag completely removed as it's
> currently just broken. Since the plan initially was to remove the
> whole _TriangleCaps stuff and r200 is the only driver which makes use
> of this particular flag I'm leaning towards the latter.

Sounds like a good way to go. Fewer idiosyncrasies in older drivers should be a win.

> In fact, the driver is quite broken anyway wrt point size, since with
> vertex programs you could output the point size per vertex, but the
> driver might still use the 1-sized point primitive, since
> _TriangleCaps DD_POINT_SIZE only looks at the global point size.

Still a better bug to have, at least, since vertex programs are a newer construct anyway (and less likely to be used in CAD-type scenarios).

> But I really have no idea why the driver tries to use the point
> primitive instead of point sprite for 1-pixel points, otherwise it
> would be easiest to just always use point sprite prim (at least for aa
> points) which would get rid of both bugs.

FWIW, I modified the "point" demo into a poor man's benchmark (by putting the glBegin(GL_POINTS) and glVertex*() calls inside large loops, and adding gettimeofday() calls), and I'm seeing basically no differences between 1.0 and 1.001 for glPointSize(). You'd think the 1-pixel-point primitives would be faster on some hardware, but r200 doesn't seem to special-case those.
Comment 13 Roland Scheidegger 2010-11-10 08:58:04 UTC
I've pushed a fix for r200 (c7192ab11f7e34fdfe17d36d089260c6703ddfa8).
Not sure what to do with the bug though, as it actually is against radeon (r100); I guess that's WONTFIX unless someone is still interested enough in this old chip.
Comment 14 Daniel Richard G. 2010-11-10 11:50:58 UTC
Many thanks for getting that in! I'll give the new code a try once the PPA is updated.

So r100, at the hardware level, doesn't support point sizes larger than 1.0. "Fixing" this bug, then, would require implementing a software emulation of point sprites (as was suggested in the original circa-2003 bug discussion). I suppose this might be worthwhile if it were a cross-driver facility, such that it would benefit any hardware that lacks large-point powers. But for r100 alone, I think it's reasonable to point out that the problem is not in the software. WONTFIX sounds fine to me.

For reference, the aforementioned commit, plus the one that drops DD_POINT_SIZE as well as DD_LINE_WIDTH, can be viewed here:

http://cgit.freedesktop.org/mesa/mesa/commit/?id=c7192ab11f7e34fdfe17d36d089260c6703ddfa8

http://cgit.freedesktop.org/mesa/mesa/commit/?id=aad65fa112754074d24d0b5a8397db2663dc9454
Comment 15 Alex Deucher 2010-11-10 12:10:10 UTC
Please remember to push the r200 fix to the 7.9 branch as well.
Comment 16 Roland Scheidegger 2010-11-10 16:09:34 UTC
(In reply to comment #14)
> Many thanks for getting that in! I'll give the new code a try once the PPA is
> updated.
> 
> So r100, at the hardware level, doesn't support point sizes larger than 1.0.
> "Fixing" this bug, then, would require implementing a software emulation of
> point sprites (as was suggested in the original circa-2003 bug discussion). I
> suppose this might be worthwhile if it were a cross-driver facility, such that
> it would benefit any hardware that lacks large-point powers.
Yes, this is (easily) possible in gallium. But r100 can't really be ported to gallium even if you wanted to... So all you can do easily is drop to swrast, which isn't terribly useful, so we just rely on apps that want large points converting them to tris themselves. Seems fair enough; apps shouldn't expect large point size support (well, not on this old hardware at least...), and the driver reports this correctly for r100.
Comment 17 Roland Scheidegger 2010-11-10 16:16:09 UTC
(In reply to comment #15)
> Please remember to push the r200 fix to the 7.9 branch as well.

Ok done: 78ccca5a69f285aea282384050d30f1a82cd15e1

And marking this as WONTFIX finally after 6 years :-).

