Re: Bruce and Dan are both right: was: 16 bits = 15 bits in Photoshop?
- Subject: Re: Bruce and Dan are both right: was: 16 bits = 15 bits in Photoshop?
- From: Ray Maxwell <email@hidden>
- Date: Sun, 17 Apr 2005 18:52:59 -0700
Hi Jon,
Let me try to define two of my terms with more accuracy and see if this
helps.
Noise...This means a random error introduced into your measurement system. I will illustrate with a table. Imagine that you are scanning a perfect midtone gray card. If your file contained the following pixel values, you would say that you have five units of noise. Another way to say this is random error.
Actual value: 128

Pixel:   1    2    3    4    5    6    7    8    9
Value:  128  126  129  127  128  130  127  127  129
This signal has an error of plus and minus 2 units.
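The count can be checked with a few lines of code. This is my illustration, not part of the original post; the nine values are copied straight from the table above.

```python
# Illustration (mine, not from the original post): reading the noise off
# the gray-card table above. "Five units of noise" means five distinct
# code values appear where a noiseless system would report only one.
values = [128, 126, 129, 127, 128, 130, 127, 127, 129]
true_value = 128

errors = [v - true_value for v in values]
print(sorted(set(values)))       # [126, 127, 128, 129, 130] -> five units of noise
print(min(errors), max(errors))  # -2 2 -> error of plus and minus 2 units
```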
Dither...This can mean either to vary spatial placement randomly, as in stochastic screening, or to vary the amplitude of a signal randomly with a fixed amount of noise. My use of the term refers to the latter of these two definitions.
If the distribution of the noise is defined by a Gaussian curve, then it is Gaussian noise. Most errors follow this type of distribution. The table above does not; it is only an example of how the maximum amplitude of the noise might look.
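As a sketch of amplitude dither in that second sense, here is a minimal Python example of my own. The function names and the noise amplitude (sigma) are assumptions for illustration; this is not Photoshop's actual implementation.

```python
import random

def quantize(x, levels=256):
    """Clamp and round an amplitude to the nearest of `levels` integer steps."""
    return max(0, min(levels - 1, round(x)))

def dithered_quantize(x, sigma=0.5):
    """Quantize after randomly varying the amplitude with Gaussian noise.

    The sigma of half a quantization step is an assumed value, chosen
    only to make the effect visible.
    """
    return quantize(x + random.gauss(0.0, sigma))

# Without dither, an amplitude of 127.3 quantizes to 127 every time.
# With dither it lands on 128 part of the time, so the average of many
# samples approaches the true value.
samples = [dithered_quantize(127.3) for _ in range(10000)]
print(sum(samples) / len(samples))  # close to 127.3
```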
Eight bit scans do not automatically produce 8 bits of integrity. It depends on many of the components in the scanner. There are many scanners that have noise, or errors, in the values they output to their files.
The amount of noise produced by a scanner or camera depends on dark calibration, the temperature stability of the CCD chip, thermionic noise in the amplifiers, and, in high end camera backs, active cooling.
Switching from 8 bits to 16 bits does not reduce noise. If you have the
same amount of analog noise in your signal, digitizing it to 16 bits
just increases the number of bits that contain noise. It does not make
the information more accurate.
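This point can be demonstrated with the same nine gray-card readings digitized at both depths. Again this sketch is mine, not Ray's; it simply treats the table values as fractions of full scale.

```python
# Illustration (mine): the same noisy analog signal digitized at 8 bits
# and at 16 bits. The readings are the gray-card table values rescaled
# to fractions of full scale.
readings = [v / 255 for v in (128, 126, 129, 127, 128, 130, 127, 127, 129)]

def digitize(x, bits):
    """Map a 0.0-1.0 analog level to an integer code at the given bit depth."""
    return round(x * (2 ** bits - 1))

for bits in (8, 16):
    codes = [digitize(r, bits) for r in readings]
    spread = max(codes) - min(codes)
    # The spread in code values grows (4 at 8 bits, 1028 at 16 bits), but
    # as a fraction of full scale the noise is unchanged: the extra bits
    # just digitize the noise more finely.
    print(bits, spread, spread / (2 ** bits - 1))
```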
When you turn on "dither" in the "Gradient" tool you are introducing
noise of the type that I have described.
Hope this helps,
Ray
email@hidden wrote:
I might be picky, but noise usually means an amplified electronic signal that carries no image data. We often see this manifested as blue artifacts in the shadow end of an image.
Dither is an algorithm which blends color data between two or more
points to interpolate data into a new intermediate point.
I don't quite see how defining a scan or digital camera original fits into the second scenario, except in discrete edits.
As described in a previous post, 8 bit scans offer a full 8 bits of
integrity (provided that the end points are properly selected),
whereas (as pointed out by Bruce in one of his posts) digital camera
files are grayscale images with a Bayer or other transform,
synthesizing the "look" of a scan. The 8 bit color data in single-shot digital camera files is never equal to that of a correctly produced 8 bit scan. Therefore 16 bits of digital camera data is significantly more accurate than an 8 bit file processed within the dcam and saved to TIFF or JPEG.
Ray, can you please describe again how clean vs dirty image data plays
out in the 8 vs 16 bit discussion?
- Jon
Colorsync-users mailing list (email@hidden)