

Re: QTKit Capture Filtering


  • Subject: Re: QTKit Capture Filtering
  • From: "douglas a. welton" <email@hidden>
  • Date: Fri, 4 Apr 2008 09:33:49 -0400

Bridger,

Check out the documentation for QTCaptureDecompressedVideoOutput. If I read it correctly, you should be able to use the CVImageBufferRef passed to the delegate method -captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection: as the source for your Core Image-based processing. Subsequently, you can push the results through an image compression session and save them to a file...
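A rough sketch of what that delegate method might look like (macOS-only, QTKit; the class name FrameProcessor and the particular filter chosen are mine, not anything from the docs):

```objc
#import <QTKit/QTKit.h>
#import <QuartzCore/CoreImage.h>

@interface FrameProcessor : NSObject
@end

@implementation FrameProcessor

// Delegate callback from QTCaptureDecompressedVideoOutput: each decompressed
// frame arrives as a CVImageBufferRef, which wraps directly into a CIImage.
- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection
{
    CIImage *frame = [CIImage imageWithCVImageBuffer:videoFrame];

    // Run your Core Image chain here (background subtraction, contrast,
    // high pass). CIColorControls stands in as one example filter.
    CIFilter *contrast = [CIFilter filterWithName:@"CIColorControls"];
    [contrast setValue:frame forKey:kCIInputImageKey];
    [contrast setValue:[NSNumber numberWithFloat:1.5f]
                forKey:@"inputContrast"];
    CIImage *result = [contrast valueForKey:kCIOutputImageKey];

    // ... hand `result` to blob detection, or to a compression
    // session for writing to disk ...
}

@end
```

You'd set an instance of this as the delegate of your QTCaptureDecompressedVideoOutput before starting the session.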

Check out the QuickTime-API list archives. This question has been asked there several times... you may find more/better details in the responses.

Also, you should check out the CIVideoDemoGL sample code.

later,

douglas


On Apr 4, 2008, at 3:53 AM, Bridger Maxwell wrote:

Hello,
I would like a little help in designing my application, before I
veer off into the wrong direction. I am making an application that
captures input from a webcam using QTKit Capture, runs the image
through a few filters (background subtraction, contrast, high pass)
then runs that image through a blob detection algorithm. I will then
show the final image (with some additional drawing on it to represent
the blobs that were detected) in a QTCaptureView. I am wondering where
I should apply the filters to the video. In the QTRecorder example the
filters are applied right before being displayed to the view in the
method:
- (CIImage *)view:(QTCaptureView *)view willDisplayImage:(CIImage *)image
Is this what I should use too? I am still reading about Core Image and
Core Video, but in the QTRecorder example Core Video is not used.
Should I not use it either? How do Core Video and QTKit Capture
relate? Your advice will help me to study in the right direction. I am
trying to rewrite the application OpenTouch in Cocoa by May in time
for science fair, so I am fairly pressed for time.


Thank You,
  Bridger Maxwell
_______________________________________________

Cocoa-dev mailing list (email@hidden)

Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com

Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden



References:
  • QTKit Capture Filtering (From: "Bridger Maxwell" <email@hidden>)
