I don't think that would work for us.
glReadPixels (0, 0, w, h, GL_RGB, GL_UNSIGNED_BYTE, (void*)c_tmp);
is the exact line we're using. I could accept that it would be a little slower
than reading back a native format, but what state could Quartz be bashing to
cause that not to work? RGB is the format the data needs to end up in, so
someone (either OS X or us) has to swap the bytes around.
Does this seem like a bug somewhere? If so, is it in Quartz or AGL?
Mike Paquette wrote:
glReadPixels takes a type parameter to read the data back in a number of
formats. The GL_UNSIGNED_INT_8_8_8_8_REV type will read back a 32-bit source
in the 'native' format, ARGB or XRGB.
For regression testing, we are creating a CGGL context on top of an
AGL context, then taking snapshots of the rendered output from Quartz
using glReadPixels. It works, but the captured images have their colors
reversed (B and R channels swapped) relative to when we capture the
rendered output from OpenGL. I can correct it by looping through the
data and swizzling the bytes back, but it would be nice to know why
it's happening. Anyone have any ideas?
Quartz-dev mailing list (email@hidden)