What's the most sensible way to grab a frame from CoreVideo, apply a Core Image filter, and then use the filtered image as a texture? All the developer examples draw into an OpenGL context. I could of course use the image buffer associated with my CGBitmapContext directly, but I was wondering whether there is a simpler/accepted way to handle this case.

Thanks,
David
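Not part of the original post, but one commonly accepted approach can be sketched as follows (in Swift, assuming macOS with Core Image and CoreVideo available): `CIImage` can wrap a `CVPixelBuffer` directly, so no `CGBitmapContext` round-trip is needed, and `CIContext.render(_:to:)` writes the filtered result back into a pixel buffer whose bytes can then be handed to whatever texture-upload path you use. The helper name `filteredPixels` and the choice of sepia filter are illustrative, not from the original thread.

```swift
import CoreImage
import CoreVideo

// Hypothetical helper: wrap a CoreVideo frame in a CIImage, apply a
// Core Image filter, and render the result into a fresh pixel buffer
// that can be uploaded as a texture (e.g. via glTexImage2D or a
// CVOpenGLTextureCache).
func filteredPixels(from pixelBuffer: CVPixelBuffer,
                    using context: CIContext) -> CVPixelBuffer? {
    // CIImage wraps the pixel buffer directly -- no CGBitmapContext needed.
    let input = CIImage(cvPixelBuffer: pixelBuffer)

    // Apply a Core Image filter; sepia tone stands in for any filter here.
    guard let filter = CIFilter(name: "CISepiaTone") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(0.8, forKey: kCIInputIntensityKey)
    guard let output = filter.outputImage else { return nil }

    // Render into a new BGRA pixel buffer; its base address (after
    // locking) is what you hand to your texture-upload code.
    var result: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        Int(output.extent.width),
                        Int(output.extent.height),
                        kCVPixelFormatType_32BGRA,
                        nil,
                        &result)
    guard let target = result else { return nil }
    context.render(output, to: target)
    return target
}
```

If the textures are ultimately destined for OpenGL anyway, rendering straight into a GL context (as the developer examples do) avoids this extra buffer copy; the sketch above trades that efficiency for keeping the filtered pixels in a plain `CVPixelBuffer`.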