Ben Goren wrote:
> I would naively expect everything to be normalized at the same wavelength. Is there a reason why illuminants are normalized at 560 nm and the Standard Observer at 555 nm? Does it make a practical difference, let alone a real-world one?
I'm not 100% sure, but in practical terms the illuminant standards convey a spectral shape, not any particular absolute level - after all, you can change an illuminant's intensity simply by changing your distance from it. You are therefore at liberty to normalize them in any way you want to suit your given purpose.
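For illustration, here is a minimal Python sketch (purely illustrative, not code from any standard) of rescaling a relative SPD to the common CIE convention of 100 at 560 nm. The wavelengths and SPD values are invented and only a few samples are shown:

    # Hypothetical relative SPD samples at 10 nm spacing (real tables span
    # roughly 300-830 nm); the numbers below are illustrative only.
    wavelengths = [540, 550, 560, 570, 580]
    spd = [0.872, 0.915, 0.934, 0.951, 0.949]   # arbitrary units

    # Scale so the value at 560 nm becomes 100, as the CIE illuminant tables
    # do. Only the overall level changes; the spectral shape is untouched.
    scale = 100.0 / spd[wavelengths.index(560)]
    spd_normalized = [s * scale for s in spd]

    print(spd_normalized)   # value at 560 nm is now 100 (up to rounding)

The same data could just as well be normalized to a peak of 1, or to any other level, without changing any chromaticity computed from it.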
> But the 1931 2° Standard Observer table is normalized so that y(lambda) = 1.000000 at 555 nm. And for the 1964 10° Standard Observer, y(lambda) = 0.999110 at 555 nm, with nothing exactly equal to 1 (and 555 nm still the peak for y(lambda)).
This <https://en.wikipedia.org/wiki/Luminosity_function> explains why 555 nm is the peak of the Y curve. If you look at the 1 nm data, the 10 degree curve reaches 1.0 at 557 nm, perhaps indicating that the 10 degree luminosity function is slightly shifted (or at least appears to be, from the experimental data that the curves were derived from).

Graeme Gill.
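To make the relationship between the two normalizations concrete, here is a second purely-illustrative Python sketch of how the observer's y-bar weighting and an illuminant combine into the tristimulus value Y for a reflective sample. The y-bar values are the familiar CIE 1931 2° ones around the 555 nm peak; the illuminant and reflectance numbers are invented, and only a narrow band around the peak is shown (the real sums run over the whole visible range). The usual k = 100 / sum(S * ybar) normalization means any overall scaling of the illuminant cancels out, which is why only its spectral shape matters:

    # CIE 1931 2-degree y-bar near its 555 nm peak, at 5 nm spacing; the SPD
    # and reflectance samples are invented for the example.
    wl   = [545, 550, 555, 560, 565]
    ybar = [0.9786, 0.9950, 1.0000, 0.9950, 0.9786]   # peak 1.0 at 555 nm
    spd  = [97.0, 98.0, 100.0, 100.0, 102.0]          # illustrative illuminant
    refl = [0.40, 0.42, 0.43, 0.44, 0.44]             # illustrative sample

    def lum_Y(spd, refl, ybar):
        # k puts a perfect reflector (R = 1 everywhere) at Y = 100, so any
        # overall scaling of the illuminant cancels out of the result.
        k = 100.0 / sum(s * y for s, y in zip(spd, ybar))
        return k * sum(s * y * r for s, y, r in zip(spd, ybar, refl))

    print(lum_Y(spd, refl, ybar))
    print(lum_Y([s * 7.5 for s in spd], refl, ybar))  # same Y: the level cancels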