Re: How to capture a video stream
On 3 Dec 2012, at 06:02, email@hidden wrote:
> I have been digging for a couple of weeks now and I have progressed to the
> point where I have the AVCaptureDevice and the session.
>
> Here's what I need help/guidance/assistance on:
>
> I want to capture video images from whichever camera device I select
> and, in real time, pass the image bytes to a Java program.
>
> What I can't seem to figure out is how to get the images in real time, the
> way the preview does. I am trying to emulate how the QuickTime API worked,
> using AVFoundation to capture the images and pass them to my Java program.
It is relatively easy to move from QTKit to AVFoundation.
>
> The JNI portion I have already figured out.
>
> I have initialized the AVCaptureSession. I have the list of
> AVCaptureDevices. I can select one of the capture devices. I can start
> the AVCaptureSession using [session startRunning]. What I can't figure out
> is the next step: getting the images from the device.
>
> Can anyone tell me what the next steps are? Any snippets of code that I
> can look at? I'm sure I'm not the first to try to do this.
Look at the Apple-supplied AVRecorder sample.
You need to define inputs and outputs for the session.
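For the input side, something along these lines should do it (a sketch, assuming device is the AVCaptureDevice you already selected from the device list):

    NSError *error = nil;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (deviceInput && [session canAddInput:deviceInput])
        [session addInput:deviceInput];   // the selected camera now feeds the session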
Then you need to configure the delegate of the AVCaptureOutput subclass you add, so that you receive delegate callbacks such as those defined by AVCaptureFileOutputRecordingDelegate:

    movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];   // records captured media to a movie file
    [movieFileOutput setDelegate:self];                          // self adopts the file output delegate protocol
    [session addOutput:movieFileOutput];                         // attach the output to the session
Since you want the frames themselves rather than a movie file, you might want to check out the AVCaptureVideoDataOutput subclass of AVCaptureOutput and its AVCaptureVideoDataOutputSampleBufferDelegate protocol; a sketch follows.
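A minimal sketch of that route, assuming session is your configured AVCaptureSession, self adopts AVCaptureVideoDataOutputSampleBufferDelegate, and passFrameToJava is a hypothetical stand-in for your JNI bridge:

    // Setup, e.g. right after you add the device input:
    AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoDataOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    dispatch_queue_t frameQueue = dispatch_queue_create("videoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
    [videoDataOutput setSampleBufferDelegate:self queue:frameQueue];
    if ([session canAddOutput:videoDataOutput])
        [session addOutput:videoDataOutput];

    // Delegate callback, invoked once per captured frame:
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        void *baseAddress  = CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t height      = CVPixelBufferGetHeight(imageBuffer);
        // baseAddress points at bytesPerRow * height bytes of BGRA pixels;
        // this is where you would hand them across JNI.
        // passFrameToJava(baseAddress, bytesPerRow, height);
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    }

The callback runs on the dispatch queue you supply, so keep the work you do there short (copy the bytes out or make your JNI call and return), otherwise the capture pipeline will start dropping frames.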
Jonathan