
Re: Problems with NSAffineTransform


  • Subject: Re: Problems with NSAffineTransform
  • From: Henry McGilton <email@hidden>
  • Date: Tue, 25 Jul 2006 08:46:10 -0700


On Jul 24, 2006, at 10:59 PM, Laurent Daudelin wrote:

Hello.

I just recently started playing with Quartz, Core Image, and the like, so be gentle!

I'm setting up a slide show in a window using a subclass of NSImageView, based partly on code from the documentation and partly on code from the FunHouse project.


I use an instance of NSAffineTransform with a Core Image 'CIAffineTransform' filter to scale the image to the size of the window and to center it in the window.


The scaling part works well, but the translations don't, for some reason. I tested my code with a bunch of pictures of various types. When I'm not translating the x and y coordinates, everything is fine. As soon as I start using translateXBy:yBy:, some pictures appear 10 times as large as they do when no translation is performed, and all pictures are generally off, relative to the center, by a varying number of pixels. If I use a set of pictures of the same type and the same size, they all appear at the same x,y coordinate, but they aren't centered either.


Try doing the translate *before* you do the scale.
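A minimal sketch of that reordering, reusing the scale, offsetX, offsetY, and scaleAndTransformFilter names from the transformImage: method quoted below. This assumes NSAffineTransform composes its convenience calls the way the Quartz CTM calls do, i.e. the most recently added operation is applied to a point first, so a translation added after scaleBy: is itself multiplied by the scale factor:

NSAffineTransform *t = [NSAffineTransform transform];
// Translate first: offsetX/offsetY then stay in window (destination) coordinates.
[t translateXBy:offsetX yBy:offsetY];
// Then apply the scale that fits the image into the window.
[t scaleBy:scale];
[scaleAndTransformFilter setValue:t forKey:@"inputTransform"];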




I'm thinking that there is something related to the resolution but I can't
find what it is.


Here is an excerpt of the code I use to read the pictures:

pictureBitmapData = [NSData dataWithContentsOfFile:
    [NSString stringWithFormat:@"%@/%@",
        slideShowDirectoryPath,
        [files objectAtIndex:pictureIndex]]];
pictureBitmap = [[[NSBitmapImageRep alloc] initWithData:pictureBitmapData] autorelease];
tempImage = [[[CIImage alloc] initWithData:pictureBitmapData] autorelease];


/* then I'm calling a private 'transformImage:' method: */

startingImage = [[self transformImage:tempImage] retain];

/*
Here is transformImage: The method also composites the image over
a white background. Basically, the resulting composited image should
be the same size as the window.
*/


- (CIImage *)transformImage:(CIImage *)imageToTransform
{
    CIFilter   *scaleAndTransformFilter;
    CGRect      outputExtent;
    NSRect      rect = [self bounds];

    outputExtent = [imageToTransform extent];

    float scale, xscale, yscale, offsetX, offsetY;
    // decide scale factor now: fit the image inside the view, preserving
    // aspect ratio, and center it along the axis that has leftover space
    xscale = rect.size.width / outputExtent.size.width;
    yscale = rect.size.height / outputExtent.size.height;
    if (yscale < xscale)
    {
        scale = yscale;
        offsetX = (rect.size.width - outputExtent.size.width * scale) * 0.5;
        offsetY = 0.0;
    }
    else
    {
        scale = xscale;
        offsetX = 0.0;
        offsetY = (rect.size.height - outputExtent.size.height * scale) * 0.5;
    }

    // scale and translate the image with a single CIAffineTransform filter
    scaleAndTransformFilter = [CIFilter filterWithName:@"CIAffineTransform"];
    NSAffineTransform *t = [NSAffineTransform transform];
    [t scaleBy:scale];
    [t translateXBy:offsetX yBy:offsetY];
    [scaleAndTransformFilter setValue:t forKey:@"inputTransform"];
    [scaleAndTransformFilter setValue:imageToTransform forKey:@"inputImage"];
    //startingImage = [[scaleAndTransformFilter valueForKey:@"outputImage"] retain];

    // generate a white background cropped to the view's bounds
    CIFilter *f = [CIFilter filterWithName:@"CIConstantColorGenerator"];
    [f setValue:[CIColor colorWithRed:1.0 green:1.0 blue:1.0 alpha:1.0]
         forKey:@"inputColor"];
    CIImage *white = [f valueForKey:@"outputImage"];
    CIFilter *crop = [CIFilter filterWithName:@"CICrop"
                                keysAndValues:@"inputImage", white,
                                              @"inputRectangle", [CIVector vectorWithX:0 Y:0
                                                                                     Z:[self bounds].size.width
                                                                                     W:[self bounds].size.height],
                                              nil];

    // composite the transformed image over the white background
    f = [CIFilter filterWithName:@"CISourceOverCompositing"];
    [f setValue:[crop valueForKey:@"outputImage"]
         forKey:@"inputBackgroundImage"];
    [f setValue:[scaleAndTransformFilter valueForKey:@"outputImage"]
         forKey:@"inputImage"];

    return [f valueForKey:@"outputImage"];
}
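For reference, an alternative sketch that keeps the scale-then-translate order used in the method above and instead expresses the offsets in pre-scale units (again assuming a translation added after scaleBy: ends up multiplied by the scale factor):

[t scaleBy:scale];
// Dividing by scale here is only needed because the translation is added
// after the scale and is therefore expressed in the not-yet-scaled space.
[t translateXBy:offsetX / scale yBy:offsetY / scale];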

Like I said, if I comment out the [t translateXBy:offsetX yBy:offsetY], then
the images all appear correctly, at their correct sizes, scaled down
properly.


Does anybody have any idea about what I'm doing wrong?

Thanks in advance!

-Laurent.




===============================+============================
Henry McGilton, Boulevardier   |   Trilithon Software
Objective-C/Java Composer      |   Seroia Research
-------------------------------+----------------------------
mailto:email@hidden            |   http://www.trilithon.com
                               |
===============================+============================


References:
  • Problems with NSAffineTransform (From: Laurent Daudelin <email@hidden>)
