Re: How to capture a video stream
- Subject: Re: How to capture a video stream
- From: email@hidden
- Date: Mon, 10 Dec 2012 22:33:27 -0700
- Importance: Normal
>> On Dec 5, 2012, at 19:53, email@hidden wrote:
>>
>>> Do I need to use CoreMedia to actually get an image from the
>>> sampleBuffer?
>>> Using Xcode, it appears that UIImage is iOS-only (this is just a
>>> supposition), so Xcode changes the code to CIImage.
>>>
>>> If this is the wrong direction, please point me in the right direction.
>>>
>>> The AVFoundation examples all seem to be based on iOS rather than the
>>> Mac for media streaming, so those examples don't work for what I am
>>> trying to do.
>>
>> You're correct that the Mac version of the docs hasn't been updated
>> from the iOS version, but luckily there is (I think) very little
>> difference you need to take into account.
>>
>> Assuming you have by now got your data capture delegate set up properly,
>> you need to have it do the equivalent of this:
>>
>> https://developer.apple.com/library/mac/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/05_MediaRepresentations.html
>>
>> under the final heading "Converting a CMSampleBuffer to a UIImage". In
>> that sample code, you should only have to change one line from its iOS
>> version:
>>
>> UIImage *image = [UIImage imageWithCGImage:cgImage];
>>
>> to the corresponding Mac version:
>>
>> NSImage *image = [NSImage imageWithCGImage:cgImage];
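>>
>> For reference, here's roughly what the sample's conversion method looks
>> like (from memory, trimmed to return the CGImageRef so you can wrap it
>> in an NSImage yourself; it assumes you configured the video data output
>> for kCVPixelFormatType_32BGRA, as the sample does):
>>
>> - (CGImageRef)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
>> {
>>     // Lock the pixel buffer so its base address stays valid while we read.
>>     CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
>>     CVPixelBufferLockBaseAddress(imageBuffer, 0);
>>
>>     void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
>>     size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
>>     size_t width = CVPixelBufferGetWidth(imageBuffer);
>>     size_t height = CVPixelBufferGetHeight(imageBuffer);
>>
>>     // Wrap the BGRA pixels in a bitmap context and snapshot it.
>>     CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
>>     CGContextRef context = CGBitmapContextCreate(baseAddress, width,
>>         height, 8, bytesPerRow, colorSpace,
>>         kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
>>     CGImageRef quartzImage = CGBitmapContextCreateImage(context);
>>
>>     CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
>>     CGContextRelease(context);
>>     CGColorSpaceRelease(colorSpace);
>>
>>     return quartzImage;  // the caller must CGImageRelease() this
>> }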
>>
>> What happens next depends on what you want to do with the image. If you
>> want to display it, you can simply draw this NSImage object. If you
>> want to write each frame to a separate file, you could use an NSImage
>> method to get the TIFF representation as an NSData object and write
>> that out. If you need to convert the image to a specific form, you will
>> likely have to start creating NSImageRep objects (or a subclass like
>> NSBitmapImageRep), but the details will depend on where you're trying
>> to get to.
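>>
>> For example, dumping one frame to disk as TIFF is just (a sketch; the
>> path is made up):
>>
>> NSData *tiffData = [image TIFFRepresentation];
>> [tiffData writeToFile:@"/tmp/frame.tiff" atomically:YES];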
>>
>> Note that in some scenarios, you could choose to work with the CGImage
>> directly, rather than creating an NSImage (which is, loosely, just
>> going to function as a wrapper around the CGImage, at least to begin
>> with). The advantage of using NSImage is that it often takes fewer
>> lines of code than CGImage (and it's an Objective-C rather than a C
>> API).
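>>
>> If you do work with the CGImage directly, ImageIO can write it out
>> without any NSImage at all, e.g. (a sketch; the path is made up):
>>
>> NSURL *url = [NSURL fileURLWithPath:@"/tmp/frame.png"];
>> CGImageDestinationRef dest = CGImageDestinationCreateWithURL(
>>     (CFURLRef)url, kUTTypePNG, 1, NULL);
>> CGImageDestinationAddImage(dest, cgImage, NULL);
>> CGImageDestinationFinalize(dest);
>> CFRelease(dest);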
>>
>> P.S. The intermediary code uses CVPixelBuffer and CGImage objects, not
>> CIImage objects. CIImage objects are something else entirely -- they're
>> image containers to which image filters can be applied -- that is,
>> they're
>> essentially image transformations. If, for example, you wanted to apply
>> a
>> Gaussian blur to every frame, you could use CIImage objects to do this.
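>>
>> That per-frame blur would look something like this (a sketch; the
>> radius is arbitrary):
>>
>> CIImage *ciImage = [CIImage imageWithCGImage:cgImage];
>> CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
>> [blur setValue:ciImage forKey:kCIInputImageKey];
>> [blur setValue:[NSNumber numberWithDouble:5.0]
>>          forKey:kCIInputRadiusKey];
>> CIImage *blurred = [blur valueForKey:kCIOutputImageKey];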
>
>
> The changes I had to make are as follows:
>
> Using CIImage:
> CIImage *image = [CIImage imageWithCGImage:cgImage];
>
> Using NSImage:
> NSImage *image = [[NSImage alloc] initWithCGImage:cgImage
>                                              size:NSZeroSize];
>
> With NSImage, I was getting an additional error message that I had to
> trap for, but not with CIImage.
>
I need further assistance. I select the capture device, but when I run
the app, I get the following:

Invalid memory access of location 0x0 rip=0x7fff933e3598
942 Segmentation fault: 11

I am following the AVVideoWall example code, except that I am not
outputting to layers on the desktop the way that sample does.
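
For reference, the general shape of my capture setup (a simplified
sketch, not my literal code; the nil checks are ones I added while
hunting the crash):

AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *device =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;
AVCaptureDeviceInput *input =
    [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    // A nil device or a failed input would be one way to end up
    // dereferencing 0x0 later on.
    NSLog(@"Could not create device input: %@", error);
    return;
}
if ([session canAddInput:input])
    [session addInput:input];

AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[output setSampleBufferDelegate:self
                          queue:dispatch_queue_create("sample buffer", NULL)];
if ([session canAddOutput:output])
    [session addOutput:output];

[session startRunning];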
Can someone point me in the right direction on solving this issue?
Thanks.
_______________________________________________
Cocoa-dev mailing list (email@hidden)
Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden