Re: Real time video analysis app under Cocoa
- Subject: Re: Real time video analysis app under Cocoa
- From: Jean-Daniel Dupas <email@hidden>
- Date: Sat, 3 May 2008 22:41:09 +0200
On 3 May 2008, at 19:49, Yreaction JP wrote:
Hi there
I have a few questions about which approach I should choose for developing a
real-time video analysis app. I already have solid knowledge of image
analysis and of Objective-C as well. The basic idea to start with is an app
that takes a real-time video signal, passes it through a set of filters
(thresholding, segmentation) that count or discount an object (e.g. beans),
and shows the current count.
I'm not really sure what Core Video offers, but as far as I understand from
reading the guide, do I need to think of a workflow like the following?
iSight signal - Core Video - buffer - vImage (convolutions on each
frame?) - Core Video (compose an output video with an elliptical
color around the bean?) - Results?
Thanks in advance.
This sample may give you a start:
http://developer.apple.com/samplecode/CIColorTracking/index.html
It processes a movie from a file rather than from an iSight, but it shows you
how to analyse and update frames from a running movie.
There are also a bunch of samples on the ADC site that show how to
capture and analyse an iSight signal.
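As a rough sketch of that capture side (assuming QTKit on 10.5; the class and
delegate names are QTKit's, but see the ADC samples for the exact setup, and
the BeanCapture name is just made up here):

// Sketch: capture decompressed frames from the default video device (iSight).
// Assumes QTKit (Mac OS X 10.5); error handling is kept minimal.
#import <QTKit/QTKit.h>

@interface BeanCapture : NSObject {
    QTCaptureSession *session;
    QTCaptureDecompressedVideoOutput *output;
}
- (BOOL)start;
@end

@implementation BeanCapture

- (BOOL)start
{
    NSError *error = nil;
    QTCaptureDevice *device =
        [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
    if (![device open:&error])
        return NO;

    session = [[QTCaptureSession alloc] init];
    QTCaptureDeviceInput *input =
        [[[QTCaptureDeviceInput alloc] initWithDevice:device] autorelease];
    if (![session addInput:input error:&error])
        return NO;

    output = [[QTCaptureDecompressedVideoOutput alloc] init];
    [output setDelegate:self];
    if (![session addOutput:output error:&error])
        return NO;

    [session startRunning];
    return YES;
}

// Called for each captured frame; videoFrame is a CVImageBufferRef that you
// can hand to Core Image (see the processing path below).
- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection
{
    // Analyse videoFrame here (thresholding, counting beans, ...).
}

@end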
The processing path may be
iSight (or any other CVImageBuffer source) -> Core Image (-[CIImage
initWithCVImageBuffer:]) -> Image processing -> -[CIContext
drawImage:] (using an OpenGL View for example).
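A minimal sketch of that path, assuming you already have the CVImageBufferRef
(for example from the capture delegate above) and a CIContext backed by an
OpenGL view; the processFrame name and the exposure filter are only
placeholders for your own processing:

// Sketch: wrap a captured CVImageBuffer in a CIImage, run a filter over it,
// and draw the result through a CIContext. CIExposureAdjust stands in for
// your own thresholding/segmentation steps.
#import <QuartzCore/QuartzCore.h>

static void processFrame(CVImageBufferRef videoFrame, CIContext *context,
                         CGRect destRect)
{
    CIImage *image = [CIImage imageWithCVImageBuffer:videoFrame];

    CIFilter *filter = [CIFilter filterWithName:@"CIExposureAdjust"];
    [filter setDefaults];
    [filter setValue:image forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:1.0f] forKey:@"inputEV"];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];

    // Draw the filtered frame; with an OpenGL-backed CIContext this stays
    // on the GPU.
    [context drawImage:result inRect:destRect fromRect:[result extent]];
}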