Re: convert NSBitmapImageRep to black/white depending on pixel hsb

  • Subject: Re: convert NSBitmapImageRep to black/white depending on pixel hsb
  • From: Michael Watson <email@hidden>
  • Date: Sun, 12 Aug 2007 08:41:22 -0400

Absolutely use Shark. It knows lots and loves to tell you. :-)

You have a very time-consuming thing going on in your code:

    for (int x = 0; x < [image pixelsWide]; x++) {
        for (int y = 0; y < [image pixelsHigh]; y++) {

Every time through these loops, you're asking the image how wide and tall it is, when that information isn't going to change. You should cache this. Imagine that you have an image that is 1,000 pixels x 1,000 pixels. That's ONE MILLION messages to obtain the same 2 numbers.


int width = [image pixelsWide];
int height = [image pixelsHigh];

for (int x = 0; x < width; ++x) {
    for (int y = 0; y < height; ++y) {
        NSColor *color = [image colorAtX:x y:y];
        if ([color brightnessComponent] > brightness) {
            [newImage setColor:above atX:x y:y];
        } else {
            [newImage setColor:below atX:x y:y];
        }
    }
}
return newImage;

All of the stuff above is where you're really getting burned, I bet. You're sending a lot of messages to manipulate one pixel at a time, when you could instead ask the bitmap for a pointer to its bytes, examine the pixels yourself, and apply transformations as necessary. If you're just going to flip them to a greyscale or black/white value, that's going to be a pretty quick operation. I wrote a small method on NSBitmapImageRep to see how fast it could be, and while I'm sure I'm doing something that is slow, here are some tests performed on a 1440x900 image:


[Session started at 2007-08-12 08:38:38 -0400.]
2007-08-12 08:38:38.925 grey[3030] start
2007-08-12 08:38:38.979 grey[3030] finished

[Session started at 2007-08-12 08:38:41 -0400.]
2007-08-12 08:38:41.584 grey[3032] start
2007-08-12 08:38:41.634 grey[3032] finished

[Session started at 2007-08-12 08:38:44 -0400.]
2007-08-12 08:38:45.003 grey[3037] start
2007-08-12 08:38:45.055 grey[3037] finished

An average of 52 milliseconds. Not bad, right? Of course, Core Image is probably faster somehow and way easier to use. (And admittedly, I'm probably doing something that could be greatly improved and I don't realize it.) Here's the code I wrote:


//==================================================
// First, an example of using the code you see below.
//==================================================

NSImage *image = [[NSImage alloc] initWithContentsOfFile:@"/Users/mikey/Desktop/desktop.png"];

NSBitmapImageRep *bitmap = [NSBitmapImageRep imageRepWithData:[image TIFFRepresentation]];

NSLog(@"start");

[bitmap monochromeImageRepWithThreshold:0.5 offColour:[NSColor redColor] onColour:[NSColor blackColor]];

NSLog(@"finished");


//==================================================
// Returns a monochrome (not duotone) version of the receiver. A pixel whose
// brightness lies above the specified threshold will be recoloured with the
// colour specified with onColour; otherwise offColour will be used.
//
// The provided brightness value must be between 0.0 and 1.0; values higher than
// 1.0 will be treated as 1.0, and values lower than 0.0 will be treated as 0.0.
//
// This method does not compensate for bitmaps whose pixels contain premultiplied
// alpha data.
//==================================================


@implementation NSBitmapImageRep (MSGraphicsAdditions)

- (NSBitmapImageRep *)monochromeImageRepWithThreshold:(float)brightness offColour:(NSColor *)offColour onColour:(NSColor *)onColour
{
    NSBitmapImageRep *imageRep = [self copy];

    // in order to extract RGB components from a non-RGB colour passed to us,
    // we must first convert the colour to an RGB colour space
    NSColor *offColourRGB = [offColour colorUsingColorSpaceName:NSCalibratedRGBColorSpace];
    NSColor *onColourRGB = [onColour colorUsingColorSpaceName:NSCalibratedRGBColorSpace];

    // ensure that we fall within the expected range
    if (brightness > 1.0)
        brightness = 1.0;
    else if (brightness < 0.0)
        brightness = 0.0;

    // RGB values converted from 0.0-1.0 to 0-255
    unsigned int offR, offG, offB;
    unsigned int onR, onG, onB;

    offR = [offColourRGB redComponent] * 255.0;
    offG = [offColourRGB greenComponent] * 255.0;
    offB = [offColourRGB blueComponent] * 255.0;
    onR = [onColourRGB redComponent] * 255.0;
    onG = [onColourRGB greenComponent] * 255.0;
    onB = [onColourRGB blueComponent] * 255.0;

    // convert the float brightness value to an integer between 0 and 255
    unsigned int brightnessScaled = rint(brightness * 255.0);

    // multiply the width of the bitmap by its height, then by the number of
    // samples per pixel, to determine the total number of channel components
    // we need to examine
    unsigned int samplesPerPixel = [imageRep samplesPerPixel];
    unsigned int sampleCount = [imageRep pixelsWide] * [imageRep pixelsHigh] * samplesPerPixel;

    // the actual bytes of the image data; the bytes of each pixel are stored
    // in RGB order unless your bitmap data is planar
    unsigned char *bitmapData = [imageRep bitmapData];

    // we're going to be counting across the pixels in the bitmap, so this'll
    // help us remember what's what. we don't care about the alpha sample,
    // because it has nothing to do with brightness in RGB space.
    unsigned int r = 0; // index of the first red component
    unsigned int g = 1; // index of the first green component
    unsigned int b = 2; // index of the first blue component

    // index of the brightest component during comparison; in RGB space, the
    // brightest of the three values determines the pixel's brightness level
    // in HSB/V space. therefore, this is your "value" (brightness) level.
    unsigned int v;

    for ( ; b < sampleCount ; r += samplesPerPixel, g += samplesPerPixel, b += samplesPerPixel)
    {
        // if red is brighter than blue, mark red as the brighter of the two;
        // otherwise, blue is
        v = (bitmapData[r] > bitmapData[b]) ? r : b;

        // if green is brighter than the brighter of red/blue, green determines
        // the brightness of the pixel; otherwise the brighter of red/blue does,
        // and we don't need to do anything
        if (bitmapData[g] > bitmapData[v]) { v = g; }

        // use the brightness value to determine how to colour the pixel
        if (bitmapData[v] > brightnessScaled)
        {
            bitmapData[r] = onR;
            bitmapData[g] = onG;
            bitmapData[b] = onB;
        }
        else
        {
            bitmapData[r] = offR;
            bitmapData[g] = offG;
            bitmapData[b] = offB;
        }
    }

    return [imageRep autorelease];
}

@end





On 11 Aug, 2007, at 11:19, John Stiles wrote:

If you're unsure about a performance issue, just use Shark and then you'll know why :) It can even trace inside system functions like colorAtX:y:.

On Aug 11, 2007, at 7:28 AM, email@hidden wrote:

I have figured out that the super costly code is

NSColor* color = [image colorAtX:x y:y];

Why would colorAtX be so heavy?
When using [image getPixel:rgb atX:x y:y]; instead it runs like a charm.


Thanks
_______________________________________________

Cocoa-dev mailing list (email@hidden)

Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com

Help/Unsubscribe/Update your Subscription:


This email sent to email@hidden



References:
  • Re: convert NSBitmapImageRep to black/white depending on pixel hsb (From: email@hidden)
  • Re: convert NSBitmapImageRep to black/white depending on pixel hsb (From: John Stiles <email@hidden>)
