Re: What format is Best Render Quality?
Delivered-To: pro-apps-dev@lists.apple.com

On Feb 27, 2008, at 12:28 AM, Brian wrote:

> In what format are the FxBitmap pixels when viewing in Motion 3 with
> View > Render Quality > Best?

It should be identical to when you have Render Quality set to "Normal". In Motion you should never receive any pixel format that isn't some form of RGB.

> I'm mostly just doing a memcpy per scanline from inData to outData. Is
> there something special I need to do to handle "Best"?

There shouldn't be.

> PS: In my Project Settings, depth is 8-bit. (The source video is
> uncompressed 10-bit YUV. My FxPlug-in doesn't handle YUV, so it should
> get 8-bit ARGB.) Stepping through the debugger, it does: the FxBitmap
> is 8-bit ARGB ([inMap pixelFormat] == kFxPixelFormat_ARGB).
>
> My FxPlug-in renders properly with "Normal" Render Quality at depths of
> both 8-bit int and 32-bit float. But with "Best", the image is dimmed,
> and pure black goes bright green. So I'm stumped.

Pure black going green is a sign that Y'CbCr data is being interpreted as RGB data rather than being converted to RGB. That sounds like a bug in Motion. I just ran a quick test with some 10-bit Y'CbCr footage, and I see that Broadcast Safe - one of our built-in filters that runs in software - also exhibits the problem. It looks like a bug on our end. I have filed a bug about this.

FWIW, if you can get your filter running in hardware, it won't have this problem with 10-bit Y'CbCr footage.

Darrin
--
Darrin Cardani
dcardani@apple.com
_______________________________________________
Pro-apps-dev mailing list (Pro-apps-dev@lists.apple.com)