Linear-light RAW 12bit vs R'G'B' 8bit: how much better is it really?
- Subject: Linear-light RAW 12bit vs R'G'B' 8bit: how much better is it really?
- From: Mark <email@hidden>
- Date: Mon, 23 Jul 2007 21:31:37 +0200
Hi all,
can someone please explain to me why 12-bit linear-light RAW images are
supposed to be much better than gamma-corrected 8-bit images?
Some sources state that 8-bit R'G'B' could only be coded linearly with
about 11 bits or so. So while a 12-bit RAW image does have finer coding
than an 8-bit R'G'B' image, the difference is not as large as one might
naively think at first.
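For example, here is a rough back-of-the-envelope check in Python of
where such a figure might come from. It assumes the sRGB transfer
curve (the sources I've seen don't all name the same curve, and other
gamma curves give slightly different numbers): it finds the smallest
linear-light step between adjacent 8-bit codes and asks how many
uniform linear bits would be needed to keep every code distinct.

import math

def srgb_to_linear(v):
    """Convert an sRGB-encoded value in [0, 1] to linear light."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

# Linear-light values for all 256 8-bit code values.
linear = [srgb_to_linear(code / 255.0) for code in range(256)]

# The smallest step between adjacent codes sits near black (the linear
# toe of the sRGB curve); the largest sits near white.
steps = [b - a for a, b in zip(linear, linear[1:])]
smallest, largest = min(steps), max(steps)

# To keep every 8-bit code distinct in a uniform linear encoding, the
# linear quantizer must resolve the smallest step.
bits_needed = math.ceil(math.log2(1.0 / smallest))

print(f"smallest linear step: {smallest:.3e}")   # ~3.0e-4
print(f"largest linear step:  {largest:.3e}")    # ~8.9e-3
print(f"linear bits to resolve the smallest step: {bits_needed}")  # 12

That comes out around 12 linear bits, which is in the same ballpark as
the ~11-bit figure I've seen quoted.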
Is that correct or have I gotten it all wrong?
If that is the case, what is the big advantage of shooting RAW?
Cheers
Mark