Hardware-accelerated scaling on iOS without OpenGL
- Subject: Hardware-accelerated scaling on iOS without OpenGL
- From: Andreas Falkenhahn <email@hidden>
- Date: Fri, 25 Nov 2016 16:22:15 +0100
I'm currently writing an iOS backend for a cross-platform program whose
platform-independent engine writes all of its graphics into 32-bit pixel
buffers, in RGBA order. The alpha byte isn't used; the graphics are always
opaque, so I don't need alpha blending.
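For concreteness, here is roughly how one of those buffers can be wrapped in
a CGImage (a Swift sketch; makeImage is just an illustrative name, and
copying the bytes into a Data is only one way to hand them over):

import CoreGraphics
import Foundation

// Wrap a raw RGBA buffer (alpha byte unused) in a CGImage.
// "pixels" is assumed to point at the engine's output buffer.
func makeImage(from pixels: UnsafeRawPointer, width: Int, height: Int) -> CGImage? {
    let bytesPerRow = width * 4
    let data = Data(bytes: pixels, count: bytesPerRow * height)
    guard let provider = CGDataProvider(data: data as CFData) else { return nil }
    // .noneSkipLast means RGBX: the trailing alpha byte is ignored,
    // which matches an opaque RGBA buffer.
    return CGImage(width: width, height: height,
                   bitsPerComponent: 8, bitsPerPixel: 32,
                   bytesPerRow: bytesPerRow,
                   space: CGColorSpaceCreateDeviceRGB(),
                   bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.noneSkipLast.rawValue),
                   provider: provider, decode: nil,
                   shouldInterpolate: false, intent: .defaultIntent)
}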
What is the most efficient way to draw and scale these pixel buffers into
the CGContextRef inside my drawRect method? The pixel buffers are usually
only 320x240 pixels and need to be scaled up to fill my view completely,
e.g. to 1024x768 on non-Retina iPads and 2048x1536 on Retina iPads.
That is a whole lot of work, so it's best done on the GPU. But how can I
get iOS to draw and scale using the GPU without resorting to OpenGL?
I've tried CGContextDrawImage(), but it is really slow, probably because
everything is done on the CPU.
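For reference, my drawRect boils down to something like this (Swift sketch;
EngineView and engineImage are placeholder names, and engineImage is the
CGImage wrapped around the pixel buffer as above):

import UIKit

class EngineView: UIView {
    // CGImage wrapping the engine's 320x240 pixel buffer.
    var engineImage: CGImage?

    override func draw(_ rect: CGRect) {
        guard let ctx = UIGraphicsGetCurrentContext(),
              let image = engineImage else { return }
        // Core Graphics puts the origin at the bottom left, UIKit at
        // the top left, so flip the context before drawing.
        ctx.translateBy(x: 0, y: bounds.height)
        ctx.scaleBy(x: 1, y: -1)
        // Scales 320x240 up to the full view bounds; as far as I can
        // tell, this scaling runs entirely on the CPU.
        ctx.draw(image, in: bounds)
    }
}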
I've also had a look at the CIImage APIs because these are apparently
GPU-optimized, but the problem is that CIImage objects are immutable, so
I'd have to create a new CIImage for every frame I draw, which will
probably kill performance as well.
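As far as I can see, the CIImage route would have to look something like
this, with a brand-new CIImage per frame (Swift sketch; render and frameData
are illustrative names, and at least the CIContext is reusable):

import CoreImage
import CoreGraphics
import Foundation

// Reusable context; creating a CIContext per frame would be even worse.
let ciContext = CIContext()

// "frameData" is assumed to hold the current 320x240 RGBA frame.
func render(frameData: Data, to target: CGRect) -> CGImage? {
    // CIImage is immutable, so every frame needs a fresh object.
    let image = CIImage(bitmapData: frameData,
                        bytesPerRow: 320 * 4,
                        size: CGSize(width: 320, height: 240),
                        format: kCIFormatRGBA8,
                        colorSpace: CGColorSpaceCreateDeviceRGB())
    // Scale up to the view size; Core Image can run this on the GPU.
    let scaled = image.applying(CGAffineTransform(scaleX: target.width / 320,
                                                  y: target.height / 240))
    return ciContext.createCGImage(scaled, from: scaled.extent)
}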
I could go with OpenGL, of course, but I'd first like some feedback on
whether there is an easier way to get what I want here.
Thanks for any ideas!
--
Best regards,
Andreas Falkenhahn mailto:email@hidden