faking bit depth
- Subject: faking bit depth
- From: Chris Jordan <email@hidden>
- Date: Thu, 28 Mar 2002 08:59:40 -0800
- Organization: Yarmuth Wilsdon Calfo
Dave,
Unfortunately the faking-bit-depth concept doesn't work. The reason is
that if you start with 8 bits per channel, then when you convert to 16
there is no way to "fill in" the rest of the levels that the 16-bit
colorspace has available. For example, imagine that your file has pixels
at the 0,0,0 level and the 1,1,1 level in 8-bit; in 16-bit these would
convert to 0,0,0 and 256,256,256 (out of a possible 65,536 levels per
channel), but there would be no way to fill in the spaces between 0,0,0
and 256,256,256, so those extra levels never get used. As a result, your
edits in 16-bit will have exactly the same effect as if you had done
them in 8-bit, and when you convert back to 8-bit you'll see the
histogram is just as broken up as if you had done the edits in 8-bit.
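Chris's point can be sketched numerically. The snippet below is a minimal illustration, not anything from the original message: it assumes a simple x*257 scaling for the 8-to-16-bit conversion (real applications may use a slightly different factor, but the gap structure is the same) and a made-up contrast stretch standing in for "your edits."

```python
# An 8-bit "image" containing every level from 0 to 255 once.
pixels_8 = list(range(256))

# Convert to 16-bit: each 8-bit level lands on one of only 256 widely
# spaced 16-bit levels (0, 257, 514, ...); the levels in between stay empty.
pixels_16 = [p * 257 for p in pixels_8]

def stretch(p, lo, hi, top):
    """A hypothetical levels-style contrast stretch, clipped to [0, top]."""
    return max(0, min(top, round((p - lo) * top / (hi - lo))))

# Apply the same edit once in 8-bit and once in 16-bit mode.
edited_8  = [stretch(p, 20, 235, 255) for p in pixels_8]
edited_16 = [stretch(p, 20 * 257, 235 * 257, 65535) for p in pixels_16]

# Convert the 16-bit result back down to 8-bit.
back_to_8 = [round(p / 257) for p in edited_16]

# The round trip produces exactly the same values, occupying the same
# reduced set of levels -- the same "combed" histogram either way.
print(edited_8 == back_to_8)          # the two edits are identical
print(len(set(back_to_8)))            # fewer than 256 occupied levels
```

Because the upconverted file only ever occupies 256 of the 65,536 available levels, the 16-bit edit has no finer gradations to work with, which is exactly why the histogram gaps come back unchanged.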
~chris jordan (Seattle)
www.chrisjordanphoto.com
_______________________________________________
colorsync-users mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/colorsync-users
Do not post admin requests to the list. They will be ignored.