Re: Making an image NSCalibratedWhiteColorSpace
- Subject: Re: Making an image NSCalibratedWhiteColorSpace
- From: Gideon King <email@hidden>
- Date: Thu, 15 Nov 2001 22:56:19 +0800
Just in case anyone else is doing this, or can suggest a better way to
do it, here is what I did (in my case I always know that I am starting
with an RGBA image, and want a grayscale alpha image out):
// Note: this is only set up to work with an RGBA image, and to produce a grayscale alpha rep!
- (NSBitmapImageRep *)convertImageToGrayscaleRep:(NSImage *)image
{
    unsigned char *p, *q;
    register int x, y;
    int pixelsHigh, pixelsWide;
    int r, g, b, a, gr;
    NSBitmapImageRep *bitmapImageRep, *rep2;
    NSSize imageSize;

    // Lock focus on the image so that we can grab the data from it
    [image lockFocus];
    imageSize = [image size];
    pixelsWide = imageSize.width;
    pixelsHigh = imageSize.height;

    // Grab the RGBA data
    bitmapImageRep = [[NSBitmapImageRep alloc]
        initWithFocusedViewRect:NSMakeRect(0, 0, pixelsWide, pixelsHigh)];
    [image unlockFocus];

    // Create an image rep to hold the transformed image data
    rep2 = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL
                                                   pixelsWide:pixelsWide
                                                   pixelsHigh:pixelsHigh
                                                bitsPerSample:8
                                              samplesPerPixel:2
                                                     hasAlpha:YES
                                                     isPlanar:NO
                                               colorSpaceName:NSCalibratedWhiteColorSpace
                                                  bytesPerRow:2 * pixelsWide
                                                 bitsPerPixel:16];

    // Now put the grayscale image data into the new image
    for (y = 0; y < pixelsHigh; y++) {
        p = [bitmapImageRep bitmapData] + (y * 4 * pixelsWide);
        q = [rep2 bitmapData] + (y * 2 * pixelsWide);
        for (x = 0; x < pixelsWide; x++) {
            r = *p++;
            g = *p++;
            b = *p++;
            a = *p++;
            gr = (r + g + b) / 3;   // simple average of the RGB channels
            *q++ = gr;
            *q++ = a;
        }
    }

    // Done with the intermediate RGBA rep
    [bitmapImageRep release];

    // Let the receiver manage the memory...
    return [rep2 autorelease];
}
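A side note on the math: the (r + g + b) / 3 average above weights all three channels equally, which tends to render greens darker and blues brighter than the eye perceives them. If that matters for your images, the Rec. 601 luma weights (0.299, 0.587, 0.114) are the usual alternative. Here is a minimal C sketch of the same inner loop using those weights in integer-only arithmetic; the function name and buffer layout are illustrative, not from the original post:

```c
#include <stddef.h>

/* Convert an 8-bit RGBA buffer to an interleaved grayscale+alpha buffer.
 * Uses the Rec. 601 luma weights (0.299, 0.587, 0.114) scaled by 1024
 * (306 + 601 + 117 = 1024) so the loop stays in integer arithmetic
 * and the result stays in 0..255. */
static void rgba_to_gray_alpha(const unsigned char *src, unsigned char *dst,
                               size_t pixelCount)
{
    for (size_t i = 0; i < pixelCount; i++) {
        unsigned r = src[4 * i + 0];
        unsigned g = src[4 * i + 1];
        unsigned b = src[4 * i + 2];
        unsigned a = src[4 * i + 3];
        dst[2 * i + 0] = (unsigned char)((306 * r + 601 * g + 117 * b) >> 10);
        dst[2 * i + 1] = (unsigned char)a;
    }
}
```

Dropping this into the method above would just mean replacing the gr = (r + g + b) / 3 line with the weighted sum.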
On Thursday, November 15, 2001, at 07:56 AM, Marcel Weiher wrote:
On Thursday, November 15, 2001, at 12:25 AM, Nat! wrote:
On Wednesday, November 14, 2001, at 11:58, Andrew Platzer wrote:
On Tuesday, November 13, 2001, at 06:51 , Gideon King wrote:
To create the image, I need to do some drawing, so I use an image
with an NSCustomImageRep to do the drawing. I tell that image
rep to be NSCalibratedWhiteColorSpace. I then lock focus on the
image and use initWithFocusedViewRect: to get the bitmap data.
This always returns an image rep with RGBA color in
NSCalibratedRGBColorSpace. If I then set it to
NSCalibratedWhiteColorSpace, it doesn't do any conversion for
me - just leaves it as 4 samples per pixel (RGBA).
Will I need to do the conversion manually, or is there some way
that will allow me to get a grayscale alpha image directly?
initWithFocusedViewRect: reads from the window's backing store
buffer, and that is always RGB or RGBA. This is true even for images
you lock focus on, since AppKit creates an offscreen window to
record the bits.
You will have to do the conversion manually.
Andrew
Are there any plans for AppKit to do this for us? It seems that at
least part of the machinery (-> RGB, RGBA) must already be in
place.
In theory, it should be possible to limit the depth of the window's
backing store, and there are even some NSWindow methods for this.
However, these methods don't seem to have any effect.
Marcel
--
Marcel Weiher Metaobject Software Technologies
email@hidden www.metaobject.com
Metaprogramming for the Graphic Arts. HOM, IDEAs, MetaAd etc.
_______________________________________________
cocoa-dev mailing list
email@hidden
http://www.lists.apple.com/mailman/listinfo/cocoa-dev