Re: Rasterizing an NSString to a bitmap
- Subject: Re: Rasterizing an NSString to a bitmap
- From: Dietmar Planitzer <email@hidden>
- Date: Wed, 9 Jun 2004 22:25:28 +0200
On Jun 9, 2004, at 3:28 PM, John Stiles wrote:
Well, I filed a DTS incident, so if there's a way, soon we'll know.
Maybe you'll know even sooner :)
There is a way to access the pixels of a window's back buffer with the
help of a few QuickDraw APIs. All you need is a QD GrafPort. You can
get this GrafPort from a Carbon window by calling GetWindowPort() on
the WindowRef. In the case of a Cocoa window, you simply call the
-windowRef method on the Cocoa window in order to get a Carbon
WindowRef, then you call GetWindowPort() on that WindowRef.
Once you have the GrafPort, all that's left to do is to get the PixMap
which represents the port bitmap, then lock its pixels and pull the
base address from it.
So, given a Cocoa window, myWindow, the following code example shows
how to retrieve a pointer to the top-left pixel in the window's back
buffer:
CGrafPtr grafPort = GetWindowPort([myWindow windowRef]);
PixMapHandle pixMapHnd = GetGWorldPixMap(grafPort);
long pixelDepth = GetPixDepth(pixMapHnd);
Rect pixMapBounds;
int width, height;
long bytesPerRow;
void * pixels;
GetPixBounds(pixMapHnd, &pixMapBounds);
width = pixMapBounds.right - pixMapBounds.left;
height = pixMapBounds.bottom - pixMapBounds.top;
LockPixels(pixMapHnd);
pixels = GetPixBaseAddr(pixMapHnd);
bytesPerRow = GetPixRowBytes(pixMapHnd);
// Do something with pixels
UnlockPixels(pixMapHnd);
This technique should work with the window of an NSCachedImageRep,
which you can get from such an object by calling its -window method. I'm at
least using this technique successfully with on-screen windows in order
to let QuickTime play back a movie directly into a window so that I
don't have to go through an additional off-screen bitmap which would
require an extra copying step.
You must, however, make sure that the window is not deferred, i.e. that
its window device has already been allocated, before you lock its
pixels. Otherwise you end up accessing the framebuffer of your screen
rather than the window's back buffer.
If this doesn't work for you for some reason, then there is also the
possibility of creating a QuickDraw GWorld and drawing the strings via
ATSUI into it. It's likely more work than using the NSString drawing
convenience methods and NSImage, but it's far more efficient than
reading pixels out via NSReadPixel().
On Jun 9, 2004, at 2:46 AM, p3consulting wrote:
yes it's a hack and I will be happy the day I can avoid it
because Apple will give us a way
to access the bytes of an NSCachedImageRep (setCachedSeparately:YES of
course), or at least give us a way to copy from it,
and make direct drawing to an NSBitmapImageRep possible ...
(even if we will lose GPU acceleration in the process...)
Given that Quartz doesn't use the GPU for its rendering, there wouldn't
be anything to lose.
However, the real problem is that the NSCachedImageRep class still
exists at all. This class made sense back in the NeXTStep / OpenStep /
Rhapsody days because back then all drawing was done by the window
server. Further the window server could only draw into windows and
nothing else.
This is the simple reason why NSImage created an NSCachedImageRep
object when you called -lockFocus on it. The NSCachedImageRep object
then created a window on the window server and all drawing ended up in
that window. That way it wasn't necessary to move the pixels of the
image from the client (maybe even across the network) to the server -
ergo faster drawing.
However, on MacOS X all drawing is done on the client side. Further,
Quartz has the concept of a context object which can encapsulate a
simple bitmap or a PS / PDF data stream. Consequently, the
NSCachedImageRep class is actually no longer necessary and in fact
makes working with NSImage more complicated than necessary.
Things would be much easier if NSImage created an NSBitmapImageRep
rather than an NSCachedImageRep object when lock focusing it. This
would have the added benefit of making it possible to specify the
pixel format of the bitmap, and accessing the pixels would be a trivial
exercise.
Anyway, it's my opinion that NSCachedImageRep should have died back
when we moved from Rhapsody to MacOS X. Thus, I've filed an enhancement
request asking for the ability to tell NSImage that it should use an
NSBitmapImageRep for its image cache. Everyone interested in such a
feature, please do the same !
While we're talking about NSBitmapImageRep...
This class has one fundamental problem: it uses the RGBA pixel encoding
which is neither supported by the typical Macintosh graphics hardware,
nor by QuickDraw, nor by QuickTime. Thus, Cocoa (Quartz) must always
convert such bitmaps into the supported ARGB format before you can
draw them on-screen. This naturally makes integrating QT and Cocoa, or
QD and Cocoa, harder and far more inefficient than necessary.
This problem would be relatively easy to fix by Apple if a new
-initWithBitmapData:... method would be introduced which would ideally
take an additional alphaMode: parameter similar to the same parameter
of the CGImage "class" in Quartz.
Now here we have two real design bugs in Cocoa, if you ask me :)
Regards,
Dietmar Planitzer
_______________________________________________
cocoa-dev mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/cocoa-dev
Do not post admin requests to the list. They will be ignored.