Creating Gray Scale Image give big leak...
- Subject: Creating Gray Scale Image give big leak...
- From: Jerry LeVan <email@hidden>
- Date: Sun, 16 Dec 2007 13:19:34 -0500
Hi,
I have an image viewing app that displays images by pushing
them into an NSImageView.
I want the user to be able to view a grayscale version of the currently
displayed image.
When the user selects the "Show Gray Scale" menu item the
action routine does
NSImage* tmpImage = [self monochromeImage:[bigImage image] ];
//NSLog(@" mkgs: %f,%f",tmpImage.size.width,tmpImage.size.height);
[bigImage setImage:tmpImage ];
[bigView setImage:tmpImage];
(bigImage is the NSImageView ).
The monochromeImage: method does:
- (NSImage *)monochromeImage:(NSImage *)theImage
{
    // convert the NSImage to a bitmap rep
    NSBitmapImageRep *bitmap =
        (NSBitmapImageRep *)[theImage bestRepresentationForDevice:nil];

    // create a CIImage from the bitmap
    CIImage *ciImage = [[CIImage alloc] initWithBitmapImageRep:bitmap];

    // create the CIMaximumComponent filter
    CIFilter *transform = [CIFilter filterWithName:@"CIMaximumComponent"
                                      keysAndValues:@"inputImage", ciImage, nil];

    // get the new CIImage
    CIImage *result = [transform valueForKey:@"outputImage"];
    //NSLog(@"GS width = %f, height = %f",
    //      [result extent].size.width, [result extent].size.height);

    // convert back to an NSImage
    NSImage *image = [[[NSImage alloc]
        initWithSize:NSMakeSize(theImage.size.width, theImage.size.height)]
        autorelease];
    [image addRepresentation:[NSCIImageRep imageRepWithCIImage:result]];
    //NSLog(@"Reps: %@", [image representations]);

    [ciImage release];
    return image;
}
The above works fine, except the rascal leaks whenever I create a
grayscale image.
I realize that no "bits" are carried back to the main program, i.e. the
filter fires every time the image is redrawn.
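By "carried back" I mean something like the sketch below, where the filtered
CIImage is drawn once into an NSImage so it ends up as real pixels (this is
just an illustration of the idea, not code from my app):

// Sketch only (not the code in my app): draw the filtered CIImage once
// while an NSImage has focus locked, so the result becomes concrete
// pixels instead of a filter that re-fires on every redraw.
NSRect extent = NSMakeRect([result extent].origin.x,
                           [result extent].origin.y,
                           [result extent].size.width,
                           [result extent].size.height);
NSImage *flattened = [[[NSImage alloc] initWithSize:extent.size] autorelease];
[flattened lockFocus];
[result drawInRect:NSMakeRect(0, 0, extent.size.width, extent.size.height)
          fromRect:extent
         operation:NSCompositeCopy
          fraction:1.0];
[flattened unlockFocus];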
I tried cutting the following code into the above to try to get some bits
back (I found it on the web):
- (NSBitmapImageRep *)RGBABitmapImageRepWithCImage:(CIImage *)ciImage
{
    int width = [ciImage extent].size.width;
    int rows = [ciImage extent].size.height;
    int rowBytes = width * 4;

    NSBitmapImageRep *rep = [[NSBitmapImageRep alloc]
        initWithBitmapDataPlanes:nil
                      pixelsWide:width
                      pixelsHigh:rows
                   bitsPerSample:8
                 samplesPerPixel:4
                        hasAlpha:YES
                        isPlanar:NO
                  colorSpaceName:NSCalibratedRGBColorSpace
                    bitmapFormat:0
                     bytesPerRow:rowBytes
                    bitsPerPixel:0];

    CGColorSpaceRef colorSpace =
        CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
    CGContextRef context = CGBitmapContextCreate([rep bitmapData],
        width, rows, 8, rowBytes, colorSpace, kCGImageAlphaPremultipliedLast);
    CIContext *ciContext = [CIContext contextWithCGContext:context
                                                   options:nil];
    [ciContext drawImage:ciImage
                 atPoint:CGPointZero
                fromRect:[ciImage extent]];

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    return [rep autorelease];
}
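I wired it in roughly like this, replacing the NSCIImageRep line in
monochromeImage: (a sketch of the obvious wiring, not my exact code):

// Sketch: the helper replaces the NSCIImageRep step in monochromeImage:
NSBitmapImageRep *bits = [self RGBABitmapImageRepWithCImage:result];
NSImage *image = [[[NSImage alloc]
    initWithSize:NSMakeSize(theImage.size.width, theImage.size.height)]
    autorelease];
[image addRepresentation:bits];
return image;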
It worked, but the leak was worse!
So the question is:
Is there a way to pass an image to a proc that transforms the image via
Core Image filters and returns a new NSImage, without any leaks?
Thanks
Jerry