Following the common approach, as presented in Raytracing of Dispersion Effects in Transparent Materials (by Wilkie, Tobler, and Purgathofer), I randomly sample a wavelength in the visible spectrum (360-780 nm) and convert it to the CIE XYZ 1931 Standard Observer tristimulus values (https://www.rit.edu/cos/colorscience/rc_useful_data.php).
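For concreteness, here is a minimal C++ sketch of this step. The cieXYZ() helper stands in for the tabulated RIT data; it uses the piecewise-Gaussian fit to the 1931 standard observer from Wyman, Sloan, and Shirley (JCGT 2013). All names here are my own illustration, not code from the paper.
Code: Select all

// Sketch only: uniform wavelength sampling plus an analytic stand-in for the
// CIE 1931 color matching functions. A real renderer would look these up in
// the tabulated CIE/RIT data instead of using the approximation below.
#include <cmath>
#include <cstdio>
#include <random>

struct XYZ { double x, y, z; };

// Piecewise Gaussian: sigma1 applies left of the peak, sigma2 to the right.
static double gauss(double wl, double mu, double s1, double s2) {
    double t = (wl - mu) / (wl < mu ? s1 : s2);
    return std::exp(-0.5 * t * t);
}

// Analytic approximation to the CIE 1931 2-degree standard observer
// (Wyman, Sloan, Shirley, JCGT 2013).
static XYZ cieXYZ(double wl) {
    XYZ c;
    c.x = 1.056 * gauss(wl, 599.8, 37.9, 31.0)
        + 0.362 * gauss(wl, 442.0, 16.0, 26.7)
        - 0.065 * gauss(wl, 501.1, 20.4, 26.2);
    c.y = 0.821 * gauss(wl, 568.8, 46.9, 40.5)
        + 0.286 * gauss(wl, 530.9, 16.3, 31.1);
    c.z = 1.217 * gauss(wl, 437.0, 11.8, 36.0)
        + 0.681 * gauss(wl, 459.0, 26.0, 13.8);
    return c;
}

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> wavelength(360.0, 780.0);
    for (int i = 0; i < 5; ++i) {
        double wl = wavelength(rng);   // one spectral sample
        XYZ c = cieXYZ(wl);            // its XYZ contribution
        std::printf("%6.1f nm -> XYZ(%f, %f, %f)\n", wl, c.x, c.y, c.z);
    }
}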
Next, I convert the CIE XYZ 1931 values to linear sRGB (D65) by multiplying by the XYZ-to-sRGB transform matrix (https://en.wikipedia.org/wiki/SRGB). This transformation can produce linear sRGB values outside the [0.0, 1.0] range. For example:
Code: Select all
|  3.2406  −1.5372  −0.4986 |   | 0.2511 |   | −0.06179508 |
| −0.9689   1.8758   0.0415 | x | 0.0739 | = | −0.04125302 |
|  0.0557  −0.2040   1.0570 |   | 1.5281 |   |  1.61411237 |
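Written out as code, the multiply looks like this; the matrix rows are the same values as in the example, and feeding in the XYZ triple from above reproduces the out-of-range result:
Code: Select all

// The XYZ (D65) -> linear sRGB conversion, with the standard matrix values.
#include <cstdio>

struct RGB { double r, g, b; };

static RGB xyzToLinearSRGB(double x, double y, double z) {
    RGB c;
    c.r =  3.2406 * x - 1.5372 * y - 0.4986 * z;
    c.g = -0.9689 * x + 1.8758 * y + 0.0415 * z;
    c.b =  0.0557 * x - 0.2040 * y + 1.0570 * z;
    return c;
}

int main() {
    RGB c = xyzToLinearSRGB(0.2511, 0.0739, 1.5281);
    // Prints roughly (-0.061795, -0.041253, 1.614112): two negative
    // components and one component above 1.0.
    std::printf("linear sRGB = (%f, %f, %f)\n", c.r, c.g, c.b);
}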
If I want to render dispersion, I think clipping here would be problematic: it appears that I would be clipping energy out of individual samples, resulting in darker renders.
One possible solution that comes to mind is to restrict my sampling to wavelengths that produce RGB values in [0.0, 1.0] without clipping (a sketch of this idea is below). I will try this next.
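Here is a minimal sketch of that restriction, assuming the cieXYZ() and xyzToLinearSRGB() helpers from the snippets above. One caveat worth flagging: rejecting wavelengths changes the wavelength sampling PDF, so the Monte Carlo estimator would have to account for the acceptance probability to stay unbiased.
Code: Select all

// Rejection-sample wavelengths until one maps into [0, 1]^3 in linear sRGB
// without clipping. Reuses XYZ, RGB, cieXYZ(), and xyzToLinearSRGB() from
// the snippets above.
#include <optional>
#include <random>

static bool inGamut(const RGB &c) {
    return c.r >= 0.0 && c.r <= 1.0 &&
           c.g >= 0.0 && c.g <= 1.0 &&
           c.b >= 0.0 && c.b <= 1.0;
}

// Bounded so the sketch cannot spin forever if acceptances are rare;
// returns nothing when no candidate passes within maxTries attempts.
static std::optional<double> sampleUnclippedWavelength(std::mt19937 &rng,
                                                       int maxTries = 1000) {
    std::uniform_real_distribution<double> wavelength(360.0, 780.0);
    for (int i = 0; i < maxTries; ++i) {
        double wl = wavelength(rng);
        XYZ c = cieXYZ(wl);
        if (inGamut(xyzToLinearSRGB(c.x, c.y, c.z)))
            return wl;   // accepted: needs no clipping
    }
    return std::nullopt;
}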
If clipping is the usual approach, are there other common ones, perhaps normalizing, that would work better here? I suspect normalizing would result in an undesirable color shift.