Re: How to capture a video stream
- Subject: Re: How to capture a video stream
- From: email@hidden
- Date: Wed, 05 Dec 2012 20:53:22 -0700
- Importance: Normal
> On Dec 4, 2012, at 16:50 , email@hidden wrote:
>
>> The setSampleBufferDelegate:self queue:queue gives a warning in XCode
>> that
>> says Sending '<my object name here>' to parameter of incompatible type
>> 'id'<AVCaptureVideoDataOutputSampleBufferDelegate>'
>
> You need to declare the class of 'self' as conforming to the
> AVCaptureVideoDataOutputSampleBufferDelegate protocol. So, for example, if
> 'self' is your app delegate, of class MyAppDelegate, then in the header
> file instead of this:
>
> @interface MyAppDelegate : NSObject
>
> you'd put this:
>
> @interface MyAppDelegate : NSObject
> <AVCaptureVideoDataOutputSampleBufferDelegate>
>
>
>
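Putting the conformance together with the delegate callback it enables, the result might look like this (a minimal sketch; MyAppDelegate and the comments are illustrative, not from the original code):

```objc
// MyAppDelegate.h -- declare conformance to the sample-buffer protocol
#import <Cocoa/Cocoa.h>
#import <AVFoundation/AVFoundation.h>

@interface MyAppDelegate : NSObject
    <NSApplicationDelegate, AVCaptureVideoDataOutputSampleBufferDelegate>
@end

// MyAppDelegate.m -- the method AVFoundation calls for each captured frame
@implementation MyAppDelegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Each frame arrives here, on the dispatch queue you passed to
    // -setSampleBufferDelegate:queue:. Process it and return quickly;
    // the buffer is reused once this method returns.
}

@end
```

With the protocol listed in the @interface, the incompatible-type warning on setSampleBufferDelegate:queue: should go away.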
Do I need to use Core Media to actually get an image from the sample
buffer? Using Xcode, it appears that UIImage is iOS-only (this is just a
supposition), so Xcode's fix-it changes the code to CIImage.
If this is the wrong direction, please point me in the right one.
The AVFoundation examples for media streaming all seem to be based on iOS
rather than the Mac, so those examples don't work for what I am trying
to do.
I don't want to capture to a file; I just want to get the images from the
cam as they are streamed.
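For what it's worth, no file capture is needed: inside the delegate callback you can pull the pixel buffer out of the CMSampleBuffer with Core Media and wrap it in a CIImage, or further in an NSImage (the OS X counterpart of UIImage). A hedged sketch, assuming the callback above is being called for each frame:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>
#import <QuartzCore/QuartzCore.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Core Media: extract the raw pixel buffer from the sample buffer.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (imageBuffer == NULL) return;

    // Core Image can wrap the pixel buffer directly, without copying.
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:imageBuffer];

    // If an NSImage is needed, bridge through NSCIImageRep (OS X only).
    NSCIImageRep *rep = [NSCIImageRep imageRepWithCIImage:ciImage];
    NSImage *image = [[NSImage alloc] initWithSize:rep.size];
    [image addRepresentation:rep];

    // Use 'image' here; don't keep references to sampleBuffer or
    // imageBuffer past the end of this method.
}
```

The buffers belong to the capture system, so anything you want to keep should be copied (or rendered) before the method returns.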
_______________________________________________
Cocoa-dev mailing list (email@hidden)