iOS: AVFoundation, creating sample buffers for an AVAssetWriter from OpenGL ... so confused :-/
- Subject: iOS: AVFoundation, creating sample buffers for an AVAssetWriter from OpenGL ... so confused :-/
- From: John Michael Zorko <email@hidden>
- Date: Thu, 14 Jul 2011 22:13:57 -0700
Hello, all ...
I'm trying to create a sample buffer from an OpenGL view using glReadPixels(), so that I can write it with an AVAssetWriter I've set up. I'm recording both audio and video, and I'm quite confused about how to do this.

So far, I've been recording audio and video straight from the sample buffers passed to the capture delegate's captureOutput:didOutputSampleBuffer:fromConnection: method, using the AVAssetWriter I already configured. However, since I have a shader modifying the video, I actually want to write the shader-modified frame instead of the one handed to the delegate. The examples I've seen so far use AVAssetWriterInputPixelBufferAdaptor, but they only handle video, not video and audio together. How do I take what glReadPixels() returns and put it into a CMSampleBuffer, so I can write it with my AVAssetWriter?
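To make the question concrete, here is a rough, untested sketch of the direction I've pieced together from the adaptor examples. The names (writer, audioInput, videoInput, adaptor, width, height, time) are just placeholders from my own setup, so please correct me where I've gone wrong. The idea, as I understand it, is that audio keeps going to its own AVAssetWriterInput unchanged, while each rendered GL frame is pulled into a CVPixelBuffer and appended through the adaptor (so perhaps no CMSampleBuffer is needed on the video side at all?):

#import <AVFoundation/AVFoundation.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

// One-time setup, next to the audio AVAssetWriterInput I already have:
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    AVVideoCodecH264, AVVideoCodecKey,
    [NSNumber numberWithInt:width], AVVideoWidthKey,
    [NSNumber numberWithInt:height], AVVideoHeightKey,
    nil];
AVAssetWriterInput *videoInput = [AVAssetWriterInput
    assetWriterInputWithMediaType:AVMediaTypeVideo
                   outputSettings:videoSettings];
videoInput.expectsMediaDataInRealTime = YES;

NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
    (id)kCVPixelBufferPixelFormatTypeKey,
    nil];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
    assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoInput
                               sourcePixelBufferAttributes:bufferAttributes];
[writer addInput:videoInput];

// Per frame, after the shader has drawn (GL context current). 'time' is the
// CMSampleBufferGetPresentationTimeStamp() of the frame the delegate received.
// Note: adaptor.pixelBufferPool is NULL until the writer has started writing.
CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                   adaptor.pixelBufferPool, &pixelBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
// I'm assuming CVPixelBufferGetBytesPerRow() == width * 4 here; if the pool
// pads rows, this would need a row-by-row copy instead. GL_BGRA_EXT needs the
// EXT_read_format_bgra extension; with plain GL_RGBA the red/blue channels
// land swapped in a 32BGRA buffer. glReadPixels also reads bottom-up, so the
// written movie may come out vertically flipped without a flip somewhere.
glReadPixels(0, 0, width, height, GL_BGRA_EXT, GL_UNSIGNED_BYTE,
             CVPixelBufferGetBaseAddress(pixelBuffer));
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

if (videoInput.readyForMoreMediaData)
    [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
CVPixelBufferRelease(pixelBuffer);

// Audio would be unchanged: for audio connections in
// captureOutput:didOutputSampleBuffer:fromConnection:, I still call
// [audioInput appendSampleBuffer:sampleBuffer].

If that's roughly right, my remaining confusion is whether the video path ever needs a CMSampleBuffer at all, or whether the adaptor's CVPixelBuffer route is the whole answer when mixing in audio.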
Any help would be quite appreciated :-)
Regards,
John