I have the following problem with CIImageAccumulator: when I set the
accumulator's image to an image in any colour space other than
CoreImage's internal colour space (say, Generic RGB), and then
extract it again, the pixel data differ slightly from those of the
original image.
I've written a test program, reproduced below, to illustrate this
problem. The program creates a CIImage containing 3 pixels, whose
individual values are (in RGB, 8 bits, with alpha=255 in all cases):
(1,2,3), (126,127,128) and (253,254,255). This image is then rendered
into a bitmap context, and the pixel values are extracted from there.
At this stage, they are equal to the input values. However, if I then
store the CIImage inside an accumulator, extract the output image
from that accumulator, and extract the pixel values from there, I now
get (0,0,0), (125,126,128), (254,255,255). These values are close,
but not equal, to the original data.
Here is what I think is happening: the accumulator stores its data
internally in the linear variant of the Generic RGB colour space that
is used by CoreImage. The differences I observe would then be due to
rounding and quantisation errors introduced by the two-way conversion
(non-linear -> linear -> non-linear).
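To check that this hypothesis is at least plausible, here is a small simulation of such a round trip. The gamma exponent (2.2) and the intermediate storage precision (8 bits of linear precision) are assumptions chosen for illustration only; they are almost certainly not CoreImage's actual internals, but they show how small gamma-encoded values can collapse to zero and mid-range values can shift by one step:

```python
# Simulate: 8-bit gamma-encoded value -> linear light ->
# quantised storage -> back to 8-bit gamma-encoded value.
# GAMMA and the storage precision are illustrative assumptions.

GAMMA = 2.2

def to_linear(v8):
    """Decode an 8-bit gamma-encoded value to linear light in [0, 1]."""
    return (v8 / 255.0) ** GAMMA

def to_gamma(lin):
    """Re-encode linear light in [0, 1] to an 8-bit gamma-encoded value."""
    return round((lin ** (1.0 / GAMMA)) * 255.0)

def round_trip(v8, storage_bits=8):
    """Quantise the intermediate linear value to `storage_bits` of precision."""
    scale = (1 << storage_bits) - 1
    lin = round(to_linear(v8) * scale) / scale
    return to_gamma(lin)

for v in (1, 2, 3, 126, 127, 128, 253, 254, 255):
    print(v, "->", round_trip(v))
```

With these parameters the values 1, 2 and 3 all come back as 0, because their linear-light equivalents are smaller than one quantisation step; that is consistent in spirit with the (1,2,3) -> (0,0,0) result above, even though the exact off-by-one pattern depends on the real storage format (possibly half-float).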
Is that analysis right? If so, does anybody know of a work-around?
Ideally, I'd like to be able to specify the colour space the
accumulator uses, but there doesn't seem to be an API for that.
Quartz-dev mailing list (email@hidden)