NSAffineTransform & bezier path line widths
- Subject: NSAffineTransform & bezier path line widths
- From: Shamyl Zakariya <email@hidden>
- Date: Wed, 12 Mar 2008 12:22:39 -0400
I'm using an NSAffineTransform to scale and offset a bezier path drawn
into a view. The path's *vertices* are correctly transformed. The
oddity is that the line width sometimes is transformed, and sometimes
isn't.
Specifically, if the view is redrawn as a result of [self
setNeedsDisplay: YES], the line width is scaled by the affine
transform. I.e., if the scale factor is 2 and the line width is 2, the
line on screen appears to be 4 pixels thick, which I read as correct
behavior. However, if the view is redrawn as a result of resizing the
window, the line width on screen is 2! And more bafflingly, the lines
are still correct with regard to position; they're just not the right
width.
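(For reference, the behavior I *expect* can be sketched as follows -- a
minimal illustration, not from my actual code, of the two ways line width
interacts with a transform: concatenating the transform onto the context
scales the stroked width along with the geometry, while transforming only
the path's geometry with -transformBezierPath: leaves the width in view
coordinates. This assumes an active drawing context, e.g. inside drawRect:.)

// Sketch: two ways to draw the same scaled path.
NSAffineTransform *transform = [NSAffineTransform transform];
[transform scaleBy: 2.0];

NSBezierPath *path = [NSBezierPath bezierPath];
[path setLineWidth: 2.0];
[path moveToPoint: NSZeroPoint];
[path lineToPoint: NSMakePoint( 100, 100 )];

// Option 1: transform the context -- the 2-point width strokes 4 pixels wide.
[NSGraphicsContext saveGraphicsState];
[transform concat];
[path stroke];
[NSGraphicsContext restoreGraphicsState];

// Option 2: transform the geometry only -- the stroke stays 2 pixels wide.
[[transform transformBezierPath: path] stroke];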
I presume that I'm not grokking something, and perhaps somebody here
can point me to some documentation that clarifies this. I *have* read
the documentation, but there's so much of it that I may have
overlooked something.
Here's my testing code. It draws an image (a processed slice of voxel
data, where color marks visitation information, etc.) scaled to fit the
view and centered, and over the image it draws a test bezier path. I'm
not rendering the actual path data because I'm still developing the
algorithm that analyses the voxel slice.
- (void)drawRect:(NSRect)rect
{
	if ( voxelSlice )
	{
		[[NSGraphicsContext currentContext] saveGraphicsState];
		[[NSGraphicsContext currentContext] setImageInterpolation: NSImageInterpolationNone];

		NSRect bounds = [self bounds];
		NSSize imageSize = [voxelSlice size];

		CGFloat scale = bounds.size.width / imageSize.width;
		if ( imageSize.height * scale > bounds.size.height )
		{
			scale *= bounds.size.height / (imageSize.height * scale);
		}

		NSSize scaledImageSize = NSMakeSize( imageSize.width * scale,
		                                     imageSize.height * scale );
		NSPoint origin = NSMakePoint( bounds.size.width/2 - scaledImageSize.width/2,
		                              bounds.size.height/2 - scaledImageSize.height/2 );

		[voxelSlice setSize: scaledImageSize];
		[voxelSlice compositeToPoint: origin operation: NSCompositeSourceOver];

		/*
			Set up an affine transform for line rendering.
			After this, we're in the coordinate system of the image.
		*/
		NSAffineTransform *transform = [NSAffineTransform transform];
		[transform translateXBy: origin.x yBy: origin.y];
		[transform scaleBy: scale];
		[transform concat];

		// dummy code for testing -- just draw an X across the image
		[[NSColor whiteColor] set];
		NSBezierPath *test = [NSBezierPath bezierPath];
		[test setLineWidth: 2];
		[test moveToPoint: NSZeroPoint];
		[test lineToPoint: NSMakePoint( imageSize.width, imageSize.height )];
		[test moveToPoint: NSMakePoint( imageSize.width, 0 )];
		[test lineToPoint: NSMakePoint( 0, imageSize.height )];
		[test stroke];

		[[NSGraphicsContext currentContext] restoreGraphicsState];
	}
}
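(As an aside, the scale-to-fit computation at the top of drawRect: reduces
to taking the smaller of the two axis ratios: the correction scale *=
bh / (ih * scale) collapses algebraically to scale = bh / ih. Here's the
same math as a standalone plain-C sketch -- fitScale is a hypothetical
name, not part of my code:)

```c
#include <assert.h>

/* Mirrors the drawRect: math above: start from the width ratio, then
   shrink if the scaled height would overflow the bounds. The result is
   the smaller of bw/iw and bh/ih, i.e. an aspect-fit scale. */
static double fitScale( double iw, double ih, double bw, double bh )
{
	double scale = bw / iw;
	if ( ih * scale > bh )
	{
		scale *= bh / (ih * scale);
	}
	return scale;
}
```

E.g. a 100x50 image in a 200x200 view fits at scale 2 (width-limited),
while a 100x200 image in the same view fits at scale 1 (height-limited).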
email@hidden
this message brought to you by
THE MATTEL AND MARS BARS
QUICK ENERGY CHOCOBOT HOUR
_______________________________________________
Cocoa-dev mailing list (email@hidden)