Re: Linear-light RAW 12bit vs R'G'B' 8bit: how much better is it really?
- Subject: Re: Linear-light RAW 12bit vs R'G'B' 8bit: how much better is it really?
- From: Klaus Karcher <email@hidden>
- Date: Mon, 23 Jul 2007 22:10:10 +0200
Mark wrote:
Hi all,
can someone please explain to me why 12-bit linear-light RAW images are
supposed to be so much better than gamma-corrected 8-bit images?
Some sources state that 8-bit R'G'B' could be coded linearly with about
11 bits. So while a 12-bit RAW image does have finer coding than an 8-bit
R'G'B' image, the difference is not as large as one might naively think
at first.
Is that correct or have I gotten it all wrong?
Every additional bit doubles the precision: 1 bit gives you two steps, 2
bits 4 steps, 3 bits 8 steps, ... 12 bits are twice as precise as 11
(4096 instead of 2048 steps per channel).
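
A couple of lines of Python, just to spell out the step counts:

    # Number of code values per channel is 2**bits,
    # so each extra bit doubles the count.
    for bits in (1, 2, 3, 11, 12):
        print(f"{bits:2d} bits -> {2**bits:4d} code values per channel")
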
Given that 11 bits linear are equal in precision to 8-bit gamma-corrected
(I have not verified this and don't want to), 12 bits are twice as
precise.
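
One way to sanity-check the "8-bit gamma-coded is roughly 11-12 bits
linear" figure is to find the smallest linear-light step between adjacent
8-bit codes and ask how many linear bits it takes to resolve a step that
small. The sketch below (Python) assumes the sRGB transfer function; the
exact number depends on which transfer function and comparison criterion
you pick, so treat it as an illustration rather than a definitive answer.

    import math

    def srgb_to_linear(v):
        """Decode an sRGB-encoded value v in [0, 1] to linear light."""
        if v <= 0.04045:
            return v / 12.92
        return ((v + 0.055) / 1.055) ** 2.4

    # Linear-light values of all 256 sRGB code values.
    linear = [srgb_to_linear(code / 255.0) for code in range(256)]

    # Smallest step between adjacent codes (it occurs near black,
    # where the encoding spends its codes most densely).
    smallest_step = min(b - a for a, b in zip(linear, linear[1:]))

    # Bits a linear encoding needs so its quantization step
    # 1 / (2**bits - 1) is no larger than that smallest step.
    bits_needed = math.ceil(math.log2(1.0 / smallest_step + 1.0))

    print(f"smallest linear step of 8-bit sRGB: {smallest_step:.6g}")
    print(f"linear bits needed to match it:     {bits_needed}")

With the sRGB curve the smallest step sits in the linear toe near black,
and matching it takes on the order of 12 linear bits, which is in the
same ballpark as the figure Mark quoted.
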
Klaus