iOS4: glReadPixels() to CMSampleBufferRef for video?
- Subject: iOS4: glReadPixels() to CMSampleBufferRef for video?
- From: John Michael Zorko <email@hidden>
- Date: Mon, 25 Jul 2011 11:35:17 -0700
Hello, all ...
While all of this discussion on Xcode 4 is interesting (please, Apple, make multiple-window development work again), i've an issue that i'm hoping someone could help with. I'm recording audio and video using AVFoundation, and i'm applying a GPU shader to the incoming video frames for effects. The problem is that after I get the pixels from the GPU via glReadPixels(), i'm kinda stuck as to how to make a CMSampleBuffer out of them so I can write it with an AVAssetWriter. The examples i've seen are confusing to me as they only seem to be concerned with video, when i'm recording video _and_ audio (though i'm not doing any processing to the audio).
Could someone post some example code that illustrates how to get pixels from the GPU (I'm assuming glReadPixels() is the best way to do this) and create a CMSampleBuffer with those pixels? I'd really appreciate it :-)
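[For what it's worth, one untested sketch of the pipeline being asked about: read the framebuffer with glReadPixels(), wrap the bytes in a CVPixelBuffer, then build a CMSampleBuffer from that image buffer. The function name frameToSampleBuffer and the width/height/timestamp parameters are my own; this assumes an active EAGL context whose framebuffer matches the given dimensions, and it glosses over memory ownership and pixel-format conversion.]

```c
#include <OpenGLES/ES2/gl.h>
#include <CoreVideo/CoreVideo.h>
#include <CoreMedia/CoreMedia.h>
#include <stdlib.h>

// Hypothetical helper, not tested: capture the current GL framebuffer as a
// CMSampleBuffer stamped with the given presentation time.
CMSampleBufferRef frameToSampleBuffer(size_t width, size_t height, CMTime pts)
{
    size_t bytesPerRow = width * 4;
    void *pixels = malloc(bytesPerRow * height);

    // Read back the framebuffer. Caveats: GL returns rows bottom-up, and
    // returns RGBA while AVAssetWriter generally wants BGRA, so a real
    // implementation would flip/swizzle here (or use a BGRA read path such
    // as the GL_EXT_read_format_bgra extension where available).
    glReadPixels(0, 0, (GLsizei)width, (GLsizei)height,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    // Wrap the raw bytes in a CVPixelBuffer. A NULL release callback leaks
    // the malloc'd buffer in this sketch; pass a real callback in practice.
    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferCreateWithBytes(kCFAllocatorDefault, width, height,
                                 kCVPixelFormatType_32BGRA, pixels,
                                 bytesPerRow, NULL, NULL, NULL, &pixelBuffer);

    // A CMSampleBuffer needs a format description derived from the image.
    CMVideoFormatDescriptionRef formatDesc = NULL;
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault,
                                                 pixelBuffer, &formatDesc);

    // Timing: only the presentation timestamp matters for writing video.
    CMSampleTimingInfo timing = { kCMTimeInvalid, pts, kCMTimeInvalid };
    CMSampleBufferRef sampleBuffer = NULL;
    CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer,
                                       true, NULL, NULL, formatDesc,
                                       &timing, &sampleBuffer);

    CFRelease(formatDesc);
    CVPixelBufferRelease(pixelBuffer);
    return sampleBuffer; // caller appends this to the writer input, then releases
}
```

[Note that for the video track specifically you may be able to skip CMSampleBuffer entirely: AVAssetWriterInputPixelBufferAdaptor's appendPixelBuffer:withPresentationTime: accepts the CVPixelBuffer directly, while the audio CMSampleBuffers from the capture output can be appended unmodified to a second AVAssetWriterInput.]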
Regards,
John
_______________________________________________
Cocoa-dev mailing list (email@hidden)