Re: Blob Detection with Core Image
- Subject: Re: Blob Detection with Core Image
- From: "Bridger Maxwell" <email@hidden>
- Date: Tue, 6 May 2008 08:42:48 -0600
I think I was unclear about where I was stuck. I didn't think that I would be
able to use the OpenTouch blob detection framework, because I couldn't pass
it a CIImage, and converting the CIImage to an NSBitmapImageRep was too
slow. The only way to pass the image data to the blob detection library was
through the function:
void computeBlobs(int *pixels);
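
Maybe I am missing something, but couldn't the NSBitmapImageRep step be
skipped? CIContext can render straight into a preallocated buffer with
render:toBitmap:rowBytes:bounds:format:colorSpace:, which should be much
cheaper than building an image rep every frame. A rough sketch of what I
mean (I am only guessing that computeBlobs() wants packed 32-bit ARGB
pixels, and in real code the context and buffer would be created once and
reused per frame):

#import <QuartzCore/QuartzCore.h>
#include <stdlib.h>

void computeBlobs(int *pixels);   /* the OpenTouch entry point */

/* Render a CIImage into a raw pixel buffer and hand it to OpenTouch. */
static void detectBlobsInImage(CIContext *context, CIImage *image)
{
    CGRect extent   = [image extent];
    size_t width    = (size_t)CGRectGetWidth(extent);
    size_t height   = (size_t)CGRectGetHeight(extent);
    size_t rowBytes = width * 4;              /* 4 bytes per ARGB pixel */

    int *pixels = malloc(rowBytes * height);  /* reuse this between frames */

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    [context render:image
           toBitmap:pixels
           rowBytes:rowBytes
             bounds:extent
             format:kCIFormatARGB8
         colorSpace:colorSpace];
    CGColorSpaceRelease(colorSpace);

    computeBlobs(pixels);
    free(pixels);
}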
If that still turns out to be too slow, I thought I would have to work with
the CIImage only, perhaps by making a CIFilter. How would the ObjC wrapper
work? Oh, and I think the Cocoa app you are seeing in the OpenTouch source
is actually the one I am working on right now; I have access to the SVN. :)
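
As for the CIFilter idea, I imagine a custom kernel could at least do the
thresholding on the GPU, reducing each frame to a black-and-white mask of
the bright spots; pulling the blob positions out of that mask would then be
a cheap CPU pass (I have put a rough labelling sketch at the very bottom of
this message, below the quoted text). An untested sketch of such a filter,
where ThresholdFilter and inputThreshold are names I made up:

#import <QuartzCore/QuartzCore.h>

/* Turns every pixel brighter than inputThreshold white, the rest black. */
@interface ThresholdFilter : CIFilter
{
    CIImage  *inputImage;
    NSNumber *inputThreshold;
}
@end

@implementation ThresholdFilter

static CIKernel *thresholdKernel = nil;

+ (void)initialize
{
    if (thresholdKernel == nil) {
        NSString *code =
            @"kernel vec4 threshold(sampler src, float t)\n"
             "{\n"
             "    vec4  p = sample(src, samplerCoord(src));\n"
             "    float luma = dot(p.rgb, vec3(0.30, 0.59, 0.11));\n"
             "    float m = step(t, luma);\n"     /* 1.0 where bright enough */
             "    return vec4(m, m, m, 1.0);\n"
             "}";
        thresholdKernel = [[[CIKernel kernelsWithString:code]
                               objectAtIndex:0] retain];
    }
}

- (CIImage *)outputImage
{
    CISampler *src = [CISampler samplerWithImage:inputImage];
    return [self apply:thresholdKernel, src, inputThreshold,
                 kCIApplyOptionDefinition, [src definition], nil];
}

@end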
Thank You,
Bridger Maxwell
On Tue, May 6, 2008 at 3:50 AM, Mike Abdullah <email@hidden> wrote:
> This seems like an awful lot of work to me for little gain. If you check
> out the OpenTouch source, they have an example Cocoa app which really
> requires very little extra work. I think you'd be far better off writing
> an ObjC wrapper than creating your own entirely separate framework. Paweł
> would quite likely even be happy to incorporate it into the framework.
>
> Mike.
>
>
> On 6 May 2008, at 08:33, Bridger Maxwell wrote:
>
> > Hello, I am trying to write a program that will detect bright "blobs" of
> > light in an image and then track those blobs. It would be a Cocoa version
> > of OpenTouch at http://code.google.com/p/opentouch/. I am wondering about
> > the best way to do this sort of image processing with the Cocoa frameworks.
> >
> > I have started this app and use QTKit Capture to grab video from the
> > webcam. I get my images through QTCaptureDecompressedVideoOutput as
> > CIImages. I can apply some filters to the images and display them in an
> > OpenGLView, but I don't know how I should implement the blob tracking.
> > From experience, making an NSBitmapImageRep from the CIImage so I can
> > work with the image data is far too slow, so I can't use the blob
> > detection library from OpenTouch. Is it possible, or recommended, to
> > implement the blob tracking as a CIFilter?
> >
> > I read through the CIColorTracking sample code, which is very close to
> > what I want to do. However, CIColorTracking simplifies the areas of
> > interest down to one location (where to place the duck). I am having
> > trouble seeing how it could be adapted to track more than one blob of
> > light. Is it possible to make a CIFilter whose output is an NSArray
> > containing the points where the blobs were found? I can see how it would
> > be possible to simplify the image down to an alpha mask of the blobs,
> > but I don't know how I would extract the number of blobs and the
> > location of each from that image. Getting the size of each blob would
> > also be desirable.
> >
> > I have done a lot of reading and don't seem to be getting anywhere. Some
> > advice on how to proceed would be greatly appreciated.
> >
> > Thank You,
> > Bridger Maxwell
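
Coming back to the alpha-mask question I asked above: the piece I was
missing may just be a plain-C labelling pass over the mask once it has been
rendered into a small CPU-side buffer; one scan gives the count, centroid,
and size of every blob. A rough sketch, where Blob, floodFill, and
findBlobs are names I made up (4-connectivity, nonzero byte = lit pixel):

#include <stdlib.h>

typedef struct { double x, y; int size; } Blob;

/* Flood-fill one blob, using an explicit stack so a large blob cannot
   blow the call stack. Visited pixels are zeroed in the mask. */
static void floodFill(unsigned char *mask, int w, int h,
                      int startX, int startY, Blob *blob)
{
    int *stack = malloc(sizeof(int) * w * h);  /* per-call only for brevity */
    int top = 0;
    int idx = startY * w + startX;

    mask[idx] = 0;                             /* mark the seed visited */
    stack[top++] = idx;
    blob->x = blob->y = 0.0;
    blob->size = 0;

    while (top > 0) {
        idx = stack[--top];
        int x = idx % w, y = idx / w;
        blob->x += x;  blob->y += y;  blob->size++;

        /* push unvisited lit neighbours (4-connected) */
        if (x > 0     && mask[idx - 1]) { mask[idx - 1] = 0; stack[top++] = idx - 1; }
        if (x < w - 1 && mask[idx + 1]) { mask[idx + 1] = 0; stack[top++] = idx + 1; }
        if (y > 0     && mask[idx - w]) { mask[idx - w] = 0; stack[top++] = idx - w; }
        if (y < h - 1 && mask[idx + w]) { mask[idx + w] = 0; stack[top++] = idx + w; }
    }

    blob->x /= blob->size;                     /* average -> centroid */
    blob->y /= blob->size;
    free(stack);
}

/* Scan the mask top-to-bottom; every lit pixel that survives earlier
   fills seeds a new blob. Returns the number of blobs found. */
int findBlobs(unsigned char *mask, int w, int h, Blob *blobs, int maxBlobs)
{
    int count = 0;
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++)
            if (mask[y * w + x] && count < maxBlobs)
                floodFill(mask, w, h, x, y, &blobs[count++]);
    return count;
}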