NSImage, OpenGL, and color spaces
- Subject: NSImage, OpenGL, and color spaces
- From: Josh Anon <email@hidden>
- Date: Mon, 24 Jan 2005 22:34:35 -0800
Hi all,
I have some code that gets an NSBitmapImageRep, padded to a certain
size for use in a GL texture, by doing the following (a rough sketch
in code follows the list):

- Loading an NSImage.
- If the image doesn't need to be padded:
  - search its representations for a bitmap rep;
  - if we find one, grab it; otherwise, lock focus on the image and
    use NSBitmapImageRep's initWithFocusedViewRect: to get one.
- If the image does need to be padded:
  - make a new image of the padded size and composite the original
    image into it;
  - do an initWithFocusedViewRect: to get the bitmap rep.
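Concretely, the path looks roughly like this. This is a sketch, not my
exact code; the function name, the paddedSize parameter, and the exact
size check are stand-ins for illustration:

    #import <Cocoa/Cocoa.h>

    // Sketch: load an image and produce a bitmap rep, padding if
    // needed. "path" and "paddedSize" are placeholders.
    NSBitmapImageRep *BitmapRepForTexture(NSString *path, NSSize paddedSize)
    {
        NSImage *image = [[[NSImage alloc] initWithContentsOfFile:path]
                           autorelease];
        NSBitmapImageRep *rep = nil;

        if (NSEqualSizes([image size], paddedSize)) {
            // No padding needed: look for an existing bitmap rep.
            NSEnumerator *e = [[image representations] objectEnumerator];
            NSImageRep *r;
            while ((r = [e nextObject])) {
                if ([r isKindOfClass:[NSBitmapImageRep class]]) {
                    rep = (NSBitmapImageRep *)r;
                    break;
                }
            }
            if (rep == nil) {
                // None found: render the image and capture the pixels.
                [image lockFocus];
                rep = [[[NSBitmapImageRep alloc] initWithFocusedViewRect:
                        NSMakeRect(0, 0, [image size].width,
                                   [image size].height)] autorelease];
                [image unlockFocus];
            }
        } else {
            // Padding needed: composite into a larger image, capture.
            NSImage *padded = [[[NSImage alloc] initWithSize:paddedSize]
                                autorelease];
            [padded lockFocus];
            [image drawAtPoint:NSZeroPoint
                      fromRect:NSZeroRect   // NSZeroRect = whole image
                     operation:NSCompositeCopy
                      fraction:1.0];
            rep = [[[NSBitmapImageRep alloc] initWithFocusedViewRect:
                    NSMakeRect(0, 0, paddedSize.width, paddedSize.height)]
                    autorelease];
            [padded unlockFocus];
        }
        return rep;
    }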
Then, I use the bitmapData as the basis for a GL texture, apply it to a
polygon, and display it.
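The upload step is nothing exotic; a minimal sketch of what I mean,
assuming a current GL context and an 8-bit-per-sample, non-planar rep
(the filter parameters are incidental):

    #import <Cocoa/Cocoa.h>
    #import <OpenGL/gl.h>

    // Sketch: upload a bitmap rep's pixels as a GL_TEXTURE_2D.
    static GLuint TextureFromRep(NSBitmapImageRep *rep)
    {
        GLuint texName = 0;
        glGenTextures(1, &texName);
        glBindTexture(GL_TEXTURE_2D, texName);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        // Account for any row padding in the rep.
        glPixelStorei(GL_UNPACK_ROW_LENGTH,
                      (GLint)([rep bytesPerRow] / ([rep bitsPerPixel] >> 3)));
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                     (GLsizei)[rep pixelsWide], (GLsizei)[rep pixelsHigh], 0,
                     [rep hasAlpha] ? GL_RGBA : GL_RGB, GL_UNSIGNED_BYTE,
                     [rep bitmapData]);
        return texName;
    }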
This works beautifully except for one issue--it seems to lose the
correct color space. If I open the same image in my app and in
Preview or Photoshop, it looks more saturated in Preview and Photoshop
(basically, it looks like an sRGB profile is being applied in the
other apps, which makes sense, since the source JPEG carries a camera
profile--sRGB). The rep reports NSCalibratedRGBColorSpace, and for
kicks I tried calling setColorSpaceName: on the bitmap rep that I grab
from the image.
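For the record, that attempt was just along these lines (the target
name here is a guess at what one might try; as far as I can tell this
only relabels the rep rather than converting the pixels):

    // Stand-in sketch of the attempted fix. NSDeviceRGBColorSpace is
    // a guess at the target name; setColorSpaceName: appears to
    // change the rep's tag without touching the underlying pixel data.
    [rep setColorSpaceName:NSDeviceRGBColorSpace];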
Is there a clean way to get the color-adjusted pixels to pipe into GL,
or do I need to do some ColorSync wizardry first? Given how many
examples there are of texturing from an NSImage without any color
tricks, I'm surprised this doesn't just work...
Has anyone seen something similar?
Thanks!
Josh
---
Josh Anon
email@hidden
Pixar Animation Studios