AVFoundation and kYUVSPixelFormat from AVPlayer
- Subject: AVFoundation and kYUVSPixelFormat from AVPlayer
- From: "Mr. Gecko" <email@hidden>
- Date: Tue, 11 Oct 2011 21:22:06 -0500
Hello, I need AVPlayer to output video as a CVImageBufferRef in real time instead of rendering through AVPlayerLayer. I am able to do this with QTKit and QTPixelBufferContextCreate; however, QTKit is said to be dead, AVFoundation is the future, and AVFoundation is 64-bit.
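For what it's worth, kYUVSPixelFormat appears to be the old QuickDraw/QuickTime name for the FourCC 'yuvs'; CoreVideo exposes the same layout as kCVPixelFormatType_422YpCbCr8_yuvs, so the output settings could probably be written without touching the QuickTime headers at all. A minimal, untested sketch (the key and value are the same ones used in my code below):

#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CVPixelBuffer.h>

// 'yuvs' under its CoreVideo name; same FourCC as the QuickTime-era kYUVSPixelFormat.
NSDictionary *bufferOptions = [NSDictionary dictionaryWithObject:
    [NSNumber numberWithUnsignedInt:kCVPixelFormatType_422YpCbCr8_yuvs]
    forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];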
So far, what I've come up with works, but it isn't really feasible: I cannot skip around in time without also telling the AVAssetReader to change its position (one idea for handling that is sketched after the code below), and it uses more CPU than QTKit does.
Another thought I had when moving to AVFoundation is that it should have GPU-accelerated H.264 decoding, but that does not seem to be true in my current testing: QuickTime X still uses 33% CPU with a 1080p video (I'm not sure whether that's good or bad).
Here is what I currently have; if anyone can improve it, please tell me how.
avMovie = [[AVPlayer alloc] initWithURL:[NSURL fileURLWithPath:[theArguments objectAtIndex:0]]];
AVPlayerItem *item = [avMovie currentItem];
AVAsset *asset = [item asset];
AVAssetTrack *videoTrack = nil;
NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
if ([tracks count] == 1) {
    videoTrack = [tracks objectAtIndex:0];
    NSError *error = nil;
    assetReader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
    if (assetReader == nil) {
        NSLog(@"Unable to create asset reader %@", [error localizedDescription]);
    } else {
        // Ask the track output for 'yuvs' pixel buffers.
        NSMutableDictionary *bufferOptions = [NSMutableDictionary dictionary];
        [bufferOptions setObject:[NSNumber numberWithInt:kYUVSPixelFormat]
                          forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
        [assetReader addOutput:[AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                                                          outputSettings:bufferOptions]];
        [assetReader startReading];
    }
}
if (videoTrack != nil) {
    // Frame duration in seconds, for reference.
    NSLog(@"%f", (Float64)1 / [videoTrack nominalFrameRate]);
    // Fire roughly once per frame: 1001/(nominalFrameRate*1001) seconds is 1/nominalFrameRate.
    [avMovie addPeriodicTimeObserverForInterval:CMTimeMake(1001, [videoTrack nominalFrameRate] * 1001)
                                          queue:dispatch_queue_create("eventQueue", NULL)
                                     usingBlock:^(CMTime time) {
        dispatch_sync(dispatch_get_main_queue(), ^{
            AVAssetReaderTrackOutput *output = [[assetReader outputs] objectAtIndex:0];
            CMSampleBufferRef sampleBuffer = [output copyNextSampleBuffer];
            if (sampleBuffer != NULL) {
                CVImageBufferRef texture = CMSampleBufferGetImageBuffer(sampleBuffer);
                // Do code with the CoreVideo image.
                // The image buffer is owned by the sample buffer, so release the sample buffer
                // (returned by a copy* method) rather than the image buffer itself.
                CFRelease(sampleBuffer);
            }
        });
    }];
}
[avMovie setRate:1.0];
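For the seeking problem mentioned near the top, one idea I am considering (a rough, untested sketch): AVAssetReader has a timeRange property that can be set before startReading, so on a seek the reader could be torn down and rebuilt starting at the new time. recreateReaderAtTime:asset:track: is a made-up helper name here, and asset/videoTrack would have to be kept around for it:

- (void)recreateReaderAtTime:(CMTime)time asset:(AVAsset *)asset track:(AVAssetTrack *)videoTrack {
    // A reader cannot be repositioned once reading has started, so throw the old one away.
    [assetReader cancelReading];
    [assetReader release];

    NSError *error = nil;
    assetReader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
    if (assetReader == nil) {
        NSLog(@"Unable to create asset reader %@", [error localizedDescription]);
        return;
    }

    NSDictionary *bufferOptions = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithInt:kYUVSPixelFormat]
        forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
    [assetReader addOutput:[AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                                                      outputSettings:bufferOptions]];

    // Start decoding from the seek position instead of the beginning of the movie.
    [assetReader setTimeRange:CMTimeRangeMake(time, kCMTimePositiveInfinity)];
    [assetReader startReading];
}

Since the periodic time observer block fetches the output from the assetReader ivar on every callback, it would pick up the new reader automatically after the seek.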
P.S. I know I have leaks in there somewhere; I plan to clean them up once the overall concept is working.