There's just one thing that I absolutely couldn't work out, which is CGContextDrawImage's handling of out-of-gamut colors. Here's the basic processing pipeline (a minimal compilable sketch follows the list):
- I'm creating a CGImage in a wide gamut space, in floating point format.
- I then create a floating point bitmap CGContext, and in turn create a CIContext from that.
- I then render a new CGImage, using CGContextDrawImage followed by CGBitmapContextCreateImage.
- If the CGContext that I rendered the new image into is in floating point format, the colors are correctly converted to, e.g., the sRGB space, but the components are left out of gamut, e.g., <0.913567>. "Correctly" here means the values exactly match what ColorSync Utility's calculator gives.
- If the CGContext that I rendered the new image into is in int format, the components are clipped (obviously), but the values are otherwise identical to the floating point case. This happens regardless of what the rendering intent is set to: the values don't change by even one decimal place when the rendering intent changes.
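For concreteness, here's a minimal sketch of the pipeline I'm describing. The 1x1 image, the Display P3 source space, and the pure-red pixel are stand-ins I chose for illustration, not my actual data; the point is comparing a floating point destination context against an 8-bit one.

```c
/* Build on macOS with: clang demo.c -framework CoreGraphics */
#include <CoreGraphics/CoreGraphics.h>
#include <stdio.h>

int main(void) {
    /* 1. Source: a 1x1 floating point CGImage in a wide gamut space. */
    float srcPixel[4] = {1.0f, 0.0f, 0.0f, 1.0f}; /* P3 red, outside sRGB */
    CGColorSpaceRef p3 = CGColorSpaceCreateWithName(kCGColorSpaceDisplayP3);
    CGDataProviderRef provider =
        CGDataProviderCreateWithData(NULL, srcPixel, sizeof(srcPixel), NULL);
    CGBitmapInfo floatInfo = kCGImageAlphaPremultipliedLast |
                             kCGBitmapFloatComponents |
                             kCGBitmapByteOrder32Host;
    CGImageRef src = CGImageCreate(1, 1, 32, 128, sizeof(srcPixel), p3,
                                   floatInfo, provider, NULL, false,
                                   kCGRenderingIntentDefault);

    /* 2. Float destination context in sRGB: drawing color-matches the
       pixel, but the converted components are not clipped to [0, 1]. */
    float dstF[4] = {0};
    CGColorSpaceRef srgb = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
    CGContextRef ctxF = CGBitmapContextCreate(dstF, 1, 1, 32, sizeof(dstF),
                                              srgb, floatInfo);
    /* Changing the intent here made no difference in my tests. */
    CGContextSetRenderingIntent(ctxF, kCGRenderingIntentPerceptual);
    CGContextDrawImage(ctxF, CGRectMake(0, 0, 1, 1), src);
    CGImageRef outF = CGBitmapContextCreateImage(ctxF);
    printf("float dst: %f %f %f %f\n", dstF[0], dstF[1], dstF[2], dstF[3]);

    /* 3. Same draw into an 8-bit sRGB context: components are clipped. */
    uint8_t dstI[4] = {0};
    CGContextRef ctxI = CGBitmapContextCreate(dstI, 1, 1, 8, sizeof(dstI),
                                              srgb,
                                              kCGImageAlphaPremultipliedLast);
    CGContextSetRenderingIntent(ctxI, kCGRenderingIntentPerceptual);
    CGContextDrawImage(ctxI, CGRectMake(0, 0, 1, 1), src);
    printf("8-bit dst: %u %u %u %u\n", (unsigned)dstI[0], (unsigned)dstI[1],
           (unsigned)dstI[2], (unsigned)dstI[3]);

    CGImageRelease(outF);
    CGImageRelease(src);
    CGContextRelease(ctxF);
    CGContextRelease(ctxI);
    CGColorSpaceRelease(p3);
    CGColorSpaceRelease(srgb);
    CGDataProviderRelease(provider);
    return 0;
}
```

With the float context the printed components can land outside [0, 1], while the 8-bit context clamps them to 0-255, which is exactly the behavior described above.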