Re: Rasterizing an NSString to a bitmap
- Subject: Re: Rasterizing an NSString to a bitmap
- From: Marcel Weiher <email@hidden>
- Date: Thu, 10 Jun 2004 20:51:35 +0100
Actually, in NeXTstep days, we did *not* have access to the backing
store of an NSCachedImageRep. These were windows that lived inside
the WindowServer.
I'm not talking about NSImages (or NSCachedImageReps)! I'm talking
about *windows*!
So? As I said, it was because the windows lived inside the
WindowServer that the NSCachedImageReps built on top of them didn't
have direct access to the bitmap.
Hold it. The -initWithFocusedViewRect: method is a documented and
reasonably efficient way to get rendered bitmap data, even if it
isn't named brilliantly in relation to NSImage.
[image lockFocus];                       // draw into the image's offscreen context
[myObject draw];
NSBitmapImageRep *bitmap =
    [[[NSBitmapImageRep alloc]
        initWithFocusedViewRect:myRect] autorelease];  // capture the focused area
[image unlockFocus];
That's the canonical method of getting a bitmap of a rendered graphic
in Cocoa, as far as I am aware, and it is independent of NSImage
implementation details. Another method is to get the
TIFFRepresentation and then re-read it, but that involves an
extra encoding/decoding step.
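For completeness, the TIFF route is roughly this (just a sketch, reusing
the same image/myObject placeholders as above):

[image lockFocus];
[myObject draw];
[image unlockFocus];

// encode the finished image to TIFF, then decode it back into a bitmap rep
NSData *tiff = [image TIFFRepresentation];
NSBitmapImageRep *bitmap = [NSBitmapImageRep imageRepWithData:tiff];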
OK, this is true. That is a faster method of getting a bitmap out of an
NSImage. Unfortunately, we end up with a bitmap with parameters that
we cannot specify,
Well, you cannot really specify the parameters completely with other
methods either, because only some formats are actually supported by
CoreGraphics, and you *do* actually have some control over the format;
see NSWindowDepth and the related functions in NSGraphics.h.
Also, the formats returned have been quite stable over the last decade
or so, with 32-bit RGB(A) always being available (and internal formats
converted if necessary).
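For instance (a sketch only, assuming you have the window in question at
hand), you can ask for the depth you'd like and then see what the
NSGraphics.h accessors report:

// ask for the closest available depth to 8 bits/sample, 32 bits/pixel RGB,
// and pin the window's backing store to it
BOOL exact = NO;
NSWindowDepth depth = NSBestDepth(NSCalibratedRGBColorSpace, 8, 32, NO, &exact);
[window setDepthLimit:depth];

NSLog(@"%@ at %d bps, %d bpp", NSColorSpaceFromDepth(depth),
      (int)NSBitsPerSampleFromDepth(depth), (int)NSBitsPerPixelFromDepth(depth));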
so it could be one of hundreds of different formats (if you consider
all the various permutations that could theoretically arise).
Not really; you are isolated from the actual buffer by
-initWithFocusedViewRect:
But fortunately, we can assume that the system won't give us weird
stuff like CMYK buffers back, since there are no window backing stores
in CMYK. Even then, there are a lot of potential bitmap buffers that
might come back to us, so it's still a pretty poor way to get bits
out. But you're right; it's much less awful than NSReadPixel.
Actually, it's even less awful than that. However, one fairly big
limitation it used to have (not sure if that's still the case) is that
it wouldn't give you 3x5+1-bit RGB for 16-bit depths, but rather the
lower-fidelity 3x4+4.
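Either way, it's cheap to sanity-check what actually came back before
touching the bits; something along these lines (a rough sketch, using the
bitmap from the earlier snippet):

// only proceed if the rep is in a layout we know how to walk
if ([bitmap bitsPerSample] == 8 && ![bitmap isPlanar]
    && [bitmap samplesPerPixel] >= 3)
{
    unsigned char *pixels = [bitmap bitmapData];
    int rowBytes = [bitmap bytesPerRow];      // may be padded beyond width * spp
    int spp      = [bitmap samplesPerPixel];  // 3 = RGB, 4 = RGB + alpha
    // ... walk the buffer row by row ...
}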
I suppose if you could find a way to convert all the various
NSBitmapImageRep types into compatible QuickDraw formats, you could
use QTNewGWorldFromPtr->CopyBits to convert these pixels into the
format of your choosing. We would have to rely on Apple's good graces
not to change -initWithFocusedViewRect: to return more complicated
buffer types. Or maybe we could convince Apple to document all the
buffer types that -initWithFocusedViewRect: might return. That would be
even better.
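Failing that, a plain-Cocoa fallback (not the QuickDraw route above, just
a sketch of what I'd try) is to allocate an NSBitmapImageRep with exactly
the parameters you want, then convert into it by hand once you've checked
what -initWithFocusedViewRect: handed back (again reusing the captured
bitmap from before):

// a destination rep in a known format: 8-bit meshed RGBA, AppKit-owned buffer
NSBitmapImageRep *dest =
    [[[NSBitmapImageRep alloc]
        initWithBitmapDataPlanes:NULL
                      pixelsWide:[bitmap pixelsWide]
                      pixelsHigh:[bitmap pixelsHigh]
                   bitsPerSample:8
                 samplesPerPixel:4
                        hasAlpha:YES
                        isPlanar:NO
                  colorSpaceName:NSCalibratedRGBColorSpace
                     bytesPerRow:0
                    bitsPerPixel:0] autorelease];

// then loop over [bitmap bitmapData] and write converted pixels into
// [dest bitmapData], branching on whichever source format showed up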
_______________________________________________
cocoa-dev mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/cocoa-dev
Do not post admin requests to the list. They will be ignored.