Re: RPP raw photo processor 64
- Subject: Re: RPP raw photo processor 64
- From: Ben Goren <email@hidden>
- Date: Sun, 02 Jun 2013 10:44:10 -0700
On Jun 2, 2013, at 9:29 AM, Andrew Rodney <email@hidden> wrote:
> On Jun 2, 2013, at 9:30 AM, Ben Goren <email@hidden> wrote:
>
>> Rather, the point I'm trying to make is that digital, unlike film, is very well suited for (reasonable degrees of) colorimetric accuracy
>
> What we need is a strong and well-presented case that this is possible in all cases where one points a camera at some scene.
All cases? Don't be silly. Point the camera at the Sun and no print or monitor or what-not is going to reproduce the experience of looking at the Sun or even just of having it in your field of vision.
And there're similar (but less dramatic) limitations with any light source or even specular highlight. Even beyond that, if one allows a few seconds for the eye to adapt to the light, there are again extremes of dynamic range we simply lack the technology to even come close to reproducing.
But, in most typical non-problematic lighting conditions, yes, it's generally possible within the limits of the tools being used.
> In fact, I'd like to see a definition of what 'colorimetric accuracy' really is and looks like and what metric is used to say it's been achieved. I'd assume you mean one measures the spectral data at the scene (how many?) and the camera and processing spit out the exact color numbers in a rendered image right?
That is, indeed, exactly what Dr. Berns does when imaging artwork. He has various techniques; I believe the current gold standard is a spectroradiometer for a low-spatial-resolution high-spectral-resolution image of the work coupled with a standard camera / scanner / scanning back / whatever for a high-spatial-resolution low-spectral-resolution image, and then some sort of high-powered math to merge the two.
He's also developed techniques that get almost-as-good results just by making multiple exposures with an off-the-shelf camera with different Wratten filters, and then doing the same sort of high-powered mathematical manipulation.
> And that appears how to the end user (it shows a visual match on a display or output? It looks as we expect? It's pleasing, meaning it doesn't look scene referred and rather ugly?).
In the case of Dr. Berns's work, yes. The copy is practically indistinguishable from the original.
Now, to be sure, what I've described is an extreme case using specialized equipment and / or techniques. And neither is practical outside of that sort of a laboratory environment.
But similar principles *can* be used in a more photographer-friendly workflow.
You need a good ICC profile. RPP ships with high-quality profiles for most cameras that, though not perfect for all situations, are actually damned good for general-purpose use. But, if you've got a high patch count target with lots of colorants and you know how to photograph it and build a profile from it, you can do even better with your own particular cameras and lenses.
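At its core, profiling means characterizing the camera's response against known reference values. Here's a toy sketch of the simplest possible version: fitting a 3x3 matrix from camera RGB to reference XYZ by least squares. This is only an illustration of the idea -- real profiling packages (Argyll and friends) fit far richer models (LUTs, higher-order terms), and the simulated patch data below is made up for the demonstration.

```python
import numpy as np

# Toy illustration of camera characterization: fit a 3x3 matrix that maps
# measured camera RGB to reference XYZ by least squares. Real profilers
# fit much richer models; this only shows the basic principle.

rng = np.random.default_rng(0)

# A made-up "true" camera-to-XYZ matrix for the simulation.
true_M = np.array([[0.6, 0.3, 0.1],
                   [0.2, 0.7, 0.1],
                   [0.1, 0.1, 0.8]])

# Simulated target shot: 24 patch RGB readings and the XYZ values a
# spectrophotometer would report for those patches (noiseless here).
cam_rgb = rng.uniform(0.05, 0.95, size=(24, 3))
ref_xyz = cam_rgb @ true_M.T

# Least-squares fit: solve cam_rgb @ M.T ~= ref_xyz for M.
M_fit_T, *_ = np.linalg.lstsq(cam_rgb, ref_xyz, rcond=None)
M_fit = M_fit_T.T

max_err = np.abs(M_fit - true_M).max()
```

With noiseless data the fit recovers the matrix essentially exactly; with real measurements, the residuals are where the ΔE errors discussed below come from.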
You also need to nail exposure and white balance perfectly, which is something you can't do by eye or by clicking around. You *can*, however, use a target in the field (the ColorChecker Passport is ideal for this), and then, when you get back to the studio, build a simple profile from it that you use to determine the proper white point (and that's all you use that profile for). In practice, it's barely more involved than shooting a gray card and using it for a click white balance, except that it's far, far more precise.
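For what it's worth, the arithmetic behind a measured white balance is simple. Here's a minimal sketch -- not RPP's actual algorithm, just the basic idea: average the linear raw RGB of a known-neutral patch, then scale each channel so the patch comes out equal in all three.

```python
# Illustrative sketch: deriving white-balance multipliers from a
# known-neutral patch (e.g. a gray patch on a ColorChecker Passport).
# Not RPP's algorithm -- just the underlying arithmetic.

def wb_multipliers(patch_rgb, anchor="g"):
    """Per-channel gains that neutralize the given patch.

    patch_rgb: averaged linear raw (R, G, B) of the neutral patch.
    anchor: channel whose gain stays at 1.0 (green, by convention).
    """
    r, g, b = patch_rgb
    ref = {"r": r, "g": g, "b": b}[anchor]
    return (ref / r, ref / g, ref / b)

def apply_wb(rgb, mults):
    return tuple(c * m for c, m in zip(rgb, mults))

# A slightly warm capture: the gray patch reads high in red.
mults = wb_multipliers((0.52, 0.46, 0.40))
balanced = apply_wb((0.52, 0.46, 0.40), mults)
# After balancing, all three channels of the neutral patch match.
```

The precision comes from measuring an actual known-neutral reference rather than eyeballing a slider.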
Combine the two and you're basically done.
Here's the proverbial picture worth a thousand words. It's from this past spring, in the Superstition Mountains on the east side of the Valley of the Sun. The foreground is basically a colorimetric rendition. The sky was about a stop brighter than the foreground; I developed it separately and composited the two together (with a soft mask, not unlike a custom-shaped graduated neutral density filter). On reflection, I might have over-darkened the sky, but not excessively.
http://trumpetpower.com/images/trumpetpower.com/photos/Superstitious_Boquet.jpg
You'll have to take my word for it, of course, but, within the limits of the sRGB gamut (and many of the flowers lie outside of said gamut), this is a very, very close match to what I actually saw.
In particular, pay attention to the fine detail in the shadows and the highlights. There's no S-curve applied to this image; as a result, the midtones aren't as ``punchy'' as many photographers gravitate towards, but, at the same time, you can actually easily see all the fine textures present in the scene.
> Because dating back to the work of Bruce Fraser, he was able to use the Adobe engine to render a Macbeth that provided RGB values that matched what they are said to be in Lab in ProPhoto RGB of all 24 patches. Is that colorimetric accuracy and if not, why?
That constitutes colorimetric accuracy...for a ColorChecker. If all you're shooting are ColorCheckers, that's perfect.
It's been not only my experience but that of many others that a 24-patch chart, even one as well-designed as the ColorChecker, simply isn't capable of producing satisfactory results in typical real-world scenes.
I've fought similar problems, myself, in the past. I've hand-built DNG profiles such that the ColorChecker was quite accurately rendered. But nothing else was, as the contortions necessary to bring the ColorChecker into line dramatically distorted everything that wasn't close to one of those 24 patches.
In practice, there are always going to be minor errors throughout the spectrum. The goal is not to perfectly match the colors on your chart, but to characterize the response of the camera such that the entire spectrum is within some margin of error without significant outliers. Typically, I wind up with average ΔE values (as reported by Argyll) in my camera profiles in the range of 1-2, with extremes in the 6-8 range -- and those are generally the most-saturated colors in the chart. Since those colors lie at the boundaries of, if not actually beyond, most working spaces, let alone monitor and printer spaces, I'm not too worried about them. I am, however, constantly refining my technique and my tools...and I suspect and hope that the next iteration will be significantly better than what I'm getting now.
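For anyone unfamiliar with the figure of merit: ΔE is just the distance between the measured and the profiled color in Lab space. Here's a quick sketch of the simplest form (CIE76); Argyll can also report the more perceptually uniform CIE94/CIEDE2000 variants. The Lab pairs below are made-up illustrative numbers, not measurements from my profiles.

```python
import math

# Delta-E (CIE76): Euclidean distance between two Lab colors.
# Later formulas (CIE94, CIEDE2000) weight the terms perceptually;
# this is the simplest version, shown for illustration only.

def delta_e_76(lab1, lab2):
    return math.dist(lab1, lab2)

# Hypothetical (measured, predicted) Lab pairs for two patches.
pairs = [
    ((52.0, 10.0, -14.0), (52.5, 10.8, -13.4)),  # near-neutral: small error
    ((40.0, 55.0, 30.0),  (41.5, 60.0, 26.0)),   # saturated red: larger error
]
errors = [delta_e_76(a, b) for a, b in pairs]
avg_de = sum(errors) / len(errors)
peak_de = max(errors)
```

The pattern above mirrors what I described: near-neutrals land close, and the worst offenders are the most-saturated patches.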
>> Actually, it's been my experience that virtually all the complaints I see from photographers are results of colorimetric failures.
>
> I would suggest most photographers don't even know what colorimetric accuracy or colorimetric failure is. They do easily understand non pleasing color.
I would agree with you on both counts. But I'd also argue that it's analogous to a situation in which musicians didn't understand basic acoustics (including the relationship between frequency and pitch), but did understand when something was out of tune. You can tune an instrument if you don't know a thing about sound waves, but that doesn't mean that some basic knowledge of the physics (and perceptual psychology) of what's going on wouldn't make your life a lot easier.
>> Improper white balance and exposure are colorimetric failures
>
> Well it certainly isn't a recommended photographic workflow to improperly white balance and improperly expose!
Except that it is.
Not that the recommended *outcome* is improper white balance and exposure; rather, that the recommended tools and techniques are incapable of achieving better than a close approximation at best -- and then, generally, only after much practice and skill.
> If I white balance a scene of a model on the beach at sunset, is that colorimetric failures? It would certainly produce an image that doesn't look anything like the original scene.
You would be surprised at how natural a truly proper white balance and exposure of your model would look during the Golden Hour. The photo I linked to above was taken at the heart of the Golden Hour. Do you think a model sitting on one of those small granite boulders would look unnatural with that white balance?
The real problem is that the tools you're currently using to achieve white balance are woefully inadequate, resulting in improper white balance -- so of *course* the results look bad.
>> That same enhanced contrast, in turn, is also responsible for many of the unnatural-looking skin tones.
>
> So we're back to pleasing color? Or in all cases, what some are calling colorimetric accuracy always produces natural looking skin tones?
Practically by definition, a colorimetric rendition of a model will result in the skin tones of the photograph matching the skin tones of the subject -- and I'd certainly call that a ``natural-looking'' result.
Now, of course, some people have ugly skin, and it may be desired to retouch the photograph to give the person a digital makeover. And that may well involve hue shifts and what-not. But I think we'd agree that, at that point, the result isn't ``natural'' at all, even if it's more ``pleasing.''
I think we'd both agree that the ideal solution in such cases is to have a good makeup artist perform the magic that they do so that the subject looks good in the studio, thereby minimizing any post-production work.
> If the vast majority of photographers treat their images as they did with film, then I'd disagree.
The problem with that vast majority of photographers is that they want their Kate and Edith, too.
They want the ``pop'' of the Velveeta-like landscapes but with natural-looking skin tones and they don't want blocked shadows or blown highlights.
As with everything else, all photography is a compromise. I'm just advocating that we should have the full range of options for which compromises to make open to us. I'm advocating for transparency in the signal chain, in other words. Let us increase contrast if we want, but don't pre-bake an S-curve into the output that irretrievably destroys all that shadow and highlight detail, for example -- as Adobe's raw converters do.
> And there was a lot of secret sauce in film, the processing of the film and the filter packs needed to produce a desired result. Nothing new here in terms of image capture.
But that's just it.
Lots of wonderful art was made, and is still being made, with all sorts of films and their secret sauces.
And lots of wonderful art has been and is still being made with digital tools that are being used to mimic various film stocks as well as virtual film stocks that never existed in the first place.
But why should digital be forced to mimic film? Why should it be burdened with all of film's baggage?
Yes, of course. When you want something to look like film, either shoot film or use a digital process that simulates it.
But when you're *not* after recreating the look of film, why should you have to start with that sort of a recreation and then be forced to clumsily attempt to undo all that processing just to get back to a neutral starting point?
>> I'll also note that many of the complaints I've heard about colorimetric workflows stem from people who've never actually successfully implemented one.
>
> Or those that have and do not do copy work or scientific capture and wish to impart their artistry into the process.
Respectfully, your complaints indicate to me that you've never actually worked with a quality colorimetric workflow -- only some of the deeply-flawed early attempts at such a workflow.
>> Often, they'll start with an Adobe raw development and then build an ICC profile from a badly-exposed snapshot of a classic 24-patch ColorChecker held by the photographer at arm's length...and then wonder why the results are so ugly.
>
> Well that's not possible (Adobe raw engines don't support camera ICC profiles). Badly exposed images are to be avoided. Just look at the confusion over what is called ETTR (which I'd submit should be called ER or expose right, expose for raw data to produce ideal data if possible).
Yes, we agree on this. And, yet, what I described -- or variations on that same theme but without the errors being so gross or obvious -- is what most consider to be a colorimetric workflow. If I may be so bold, you yourself would seem to fit into this category, especially if your last attempts at implementing a colorimetric workflow were some time ago and with a 24-patch ColorChecker.
Is it any wonder that it has such a bad reputation?
Cheers,
b&
Colorsync-users mailing list (email@hidden)