Re: Avoiding color transformations in PNG/UIImage/CGImage ops?
- Subject: Re: Avoiding color transformations in PNG/UIImage/CGImage ops?
- From: Alex Zavatone <email@hidden>
- Date: Thu, 16 Nov 2017 22:04:01 -0600
Nah. I checked the results on two of my devices and the simulator to make sure
I had a grip on what was going wrong.
When I took all the saved images and opened them up, I’d expect each set of
them to have the same error if it’s not a colorspace profile issue.
The reason this mattered was that I was saving out the image within the app
just to test and compare. We were doing drivers’ license photos, so if things
were going wrong, I had to know by how much.
All I can remember from that was that it was something related to the
colorspace profile that was added to the image based on which device the image
was saved onto.
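
If it helps, a quick way to see which profile actually ended up embedded in a
saved PNG, without opening it in Preview, is something like the rough sketch
below (the path is just a placeholder):

import Foundation
import ImageIO

let url = URL(fileURLWithPath: "/tmp/saved.png")   // placeholder path
if let source = CGImageSourceCreateWithURL(url as CFURL, nil),
   let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] {
    // Typically "sRGB IEC61966-2.1" on a device, or absent if no profile was embedded.
    print(props[kCGImagePropertyProfileName] ?? "no embedded profile")
    print(props[kCGImagePropertyColorModel] ?? "no color model")
}
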
> On Nov 16, 2017, at 9:58 PM, Rick Mann <email@hidden> wrote:
>
> Do you literally mean moving the iOS simulator window to a different monitor,
> even though there’s never a display context in any of this code?
>
> --
> Rick Mann
> email@hidden
>
>> On Nov 16, 2017, at 19:55, Alex Zavatone <email@hidden> wrote:
>>
>> I was basing my assumption on this line of your email:
>>
>>>>> In the unit test, I load a saved version of one of those images from a
>>>>> PNG file to UIImage, get at the buffer, pass it to my code, and then
>>>>> compare the result to a saved version of that same output.
>>
>> From that, I was assuming that the image and the CGImage you describe as
>> being similar should actually be the same.
>>
>> My guess here is that the PNG that you saved has a colorspace applied to it.
>> But it seems odd that the bytes are different unless the image is run
>> through a transform with CI. I wouldn’t expect one either.
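>>
>> If you want to check the decode side as well, something like this rough
>> sketch shows what colorspace the loaded CGImage is actually carrying (the
>> path is just a placeholder):
>>
>> import UIKit
>>
>> if let image = UIImage(contentsOfFile: "/tmp/saved.png"),   // placeholder path
>>    let cgImage = image.cgImage {
>>     if let name = cgImage.colorSpace?.name {
>>         print("colorspace:", name)   // e.g. kCGColorSpaceSRGB
>>     } else {
>>         print("no colorspace attached to the decoded image")
>>     }
>>     print(cgImage.bitsPerComponent, cgImage.bitsPerPixel, cgImage.alphaInfo.rawValue)
>> }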
>>
>> About 3 years ago, when I was looking into similar things, I came across a
>> nice explanation of the CI pipeline in the Apple docs that did a good job of
>> explaining what happens where and when, even for the display of images that
>> are saved to and loaded from disk.
>>
>> Everything came down to not accommodating the color profile or something
>> related to that.
>>
>> I even got different results when testing on 2 monitors.
>>
>> An easy way to rule out my hypothesis would be to check the results on two
>> different displays with different color calibration and see if that has some
>> effect or no effect at all.
>>
>>
>> A quick walk through the CI pipeline docs will probably trigger something
>> that will answer your question.
>>
>> I’ll see if I can find the ones that were useful to me.
>>
>>> On Nov 16, 2017, at 9:29 PM, Rick Mann <email@hidden> wrote:
>>>
>>> OpenCV is not relevant in this case, that was just background that I
>>> probably didn't need to include.
>>>
>>> I'm generating an image buffer, saving it as PNG through UIImage, then
>>> generating it again, and comparing it to the UIImage bytes I get by opening
>>> the previously-saved image. I'm not specifying anything other than device
>>> RGB color space when I create the CGBitmapContext I use to generate the
>>> image buffer.
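>>>
>>> Roughly, the context creation looks like this (a simplified sketch rather
>>> than my exact code; the dimensions and alpha/byte-order choices here are
>>> just placeholders):
>>>
>>> import CoreGraphics
>>>
>>> let width = 640, height = 480   // placeholder dimensions
>>> let context = CGContext(data: nil,
>>>                         width: width,
>>>                         height: height,
>>>                         bitsPerComponent: 8,
>>>                         bytesPerRow: width * 4,
>>>                         space: CGColorSpaceCreateDeviceRGB(),
>>>                         bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)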
>>>
>>>> On Nov 16, 2017, at 19:27 , Alex Zavatone <email@hidden> wrote:
>>>>
>>>> Before looking into OCV, my wild guess is that it may have to do with the
>>>> colorspace of the image. What are you setting it to? Profile Name: sRGB
>>>> IEC61966-2.1?
>>>>
>>>>
>>>>> On Nov 16, 2017, at 8:31 PM, Rick Mann <email@hidden> wrote:
>>>>>
>>>>> I'm trying to write a unit test for some code I wrote that generates one
>>>>> image from another. In the main app, the source data comes from Open CV
>>>>> as a buffer of 3 byte-per-pixel elements. My code generates a CGImage. In
>>>>> the unit test, I load a saved version of one of those images from a PNG
>>>>> file to UIImage, get at the buffer, pass it to my code, and then compare
>>>>> the result to a saved version of that same output.
>>>>>
>>>>> The saved version is a PNG. I load that, and then get the data buffer
>>>>> using
>>>>>
>>>>> let fiData = fi.dataProvider?.data as Data?
>>>>>
>>>>> I do a similar thing with the generated CGImage. Then I compare the two
>>>>> buffers, byte by byte. They are similar, but contain differences
>>>>> (sometimes more than I would expect). But if I save both as PNG and look
>>>>> at them in Preview they look identical.
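>>>>>
>>>>> The comparison itself is roughly this shape (a sketch, not the actual
>>>>> test code; the function name is made up):
>>>>>
>>>>> import Foundation
>>>>>
>>>>> func maxByteDelta(_ a: Data, _ b: Data) -> Int {
>>>>>     guard a.count == b.count else { return Int.max }
>>>>>     var maxDelta = 0
>>>>>     // Walk both buffers in lockstep and track the largest per-byte difference.
>>>>>     for (x, y) in zip(a, b) {
>>>>>         maxDelta = max(maxDelta, abs(Int(x) - Int(y)))
>>>>>     }
>>>>>     return maxDelta
>>>>> }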
>>>>>
>>>>> My guess is something's happening somewhere with color correction. In my
>>>>> code, I make a CG(bitmap)Context, specifying device RGB color space
>>>>> (should that be generic?). I don't really know what happens to the PNGs I
>>>>> save and load.
>>>>>
>>>>> Is there a way to ensure the bytes in the buffer are compressed and
>>>>> decompressed exactly as written?
>>>>>
>>>>> --
>>>>> Rick Mann
>>>>> email@hidden
>>>>>
>>>>>
>>>>
>>>
>>>
>>> --
>>> Rick Mann
>>> email@hidden
>>>
>>>
>>
>
_______________________________________________
Cocoa-dev mailing list (email@hidden)
Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden