Re: Deinterlacing QTCaptureDecompressedVideoOutput

  • Subject: Re: Deinterlacing QTCaptureDecompressedVideoOutput
  • From: "Bram Loogman" <email@hidden>
  • Date: Thu, 6 Mar 2008 11:02:50 +0100

Hi Robert,

I just tried saving the captured image to a .mov file one frame long,
then opening it as a QTMovie and exporting it again as a JPEG. When I
look at the QTMovie the image is OK, but the exported image looks the
same as the one captured with an image buffer. The strange thing is
that when I use QuickTime to capture and export a frame it looks fine,
so maybe it has something to do with the settings of the DV component?
I'm very new to Cocoa programming, so this is just guessing. Could you
show me an example using a CVPixelBuffer, so I know the other
possibilities before I continue with this approach?

Thanks,
Bram
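
For context, the image-buffer path being compared against can be
sketched as follows. This is only a sketch, assuming 'frame' is the
CVImageBufferRef handed to the QTCaptureDecompressedVideoOutput
delegate callback described in the QTKit Capture Programming Guide;
the function name and the pre-ARC memory handling are illustrative,
not the tutorial's actual code:

#import <Cocoa/Cocoa.h>
#import <QuartzCore/QuartzCore.h>
#import <CoreVideo/CoreVideo.h>

// Write one captured frame out as a JPEG file.  'frame' is the buffer
// delivered by captureOutput:didOutputVideoFrame:withSampleBuffer:fromConnection:.
static void WriteFrameAsJPEG(CVImageBufferRef frame, NSString *path)
{
    // Wrap the pixel buffer in a CIImage, render it into a bitmap rep,
    // then encode that rep as JPEG data.
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:frame];
    NSBitmapImageRep *rep =
        [[[NSBitmapImageRep alloc] initWithCIImage:ciImage] autorelease];
    NSData *jpegData = [rep representationUsingType:NSJPEGFileType
                                         properties:nil];
    [jpegData writeToFile:path atomically:YES];
}

Both fields of the interlaced DV frame are still in that single
buffer, which is presumably why the exported JPEG shows combing on
moving objects.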


2008/3/5, Robert Douglas <email@hidden>:
> I ran into a similar problem while analyzing incoming HDV images and
> I didn't find any simple solution. My approach now is to create a
> second CVPixelBuffer with half the number of lines and copy every
> second line into that buffer, or two buffers to get better temporal
> resolution. I haven't yet gotten around to writing the field images
> to a file, so there may be issues with rectangular pixels, but output
> to a CIImage-based view works well.
> I'd be interested in hearing of other solutions,
> Rob
>
> On 5-Mar-08, at 9:11 AM, Bram Loogman wrote:
>
> > Hi,
> >
> > When I use QTCaptureDecompressedVideoOutput to capture a still image
> > from a QTCaptureSession with a DV camera as the input device, the
> > image is interlaced, which looks very ugly for moving objects. I
> > basically use the code from the 'Creating a QTKit Stop or Still
> > Motion Application' tutorial in the QTKit Capture Programming Guide
> > to get an image buffer and then store it as a JPEG image.
> > I looked at the camera settings, but it's not possible to change the
> > shutter speed or anything like that. Is there a way to deinterlace
> > the output from the QTCaptureSession?
> >
> > Thanks,
> > Bram
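
A sketch of the delegate callback referred to above, along the lines
of the still-motion example in the QTKit Capture Programming Guide;
mCurrentImageBuffer is assumed to be an instance variable of the
capture controller:

- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)videoFrame
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection
{
    // Hold on to the most recent decompressed frame; the "grab" action
    // reads mCurrentImageBuffer later and writes it out as a JPEG.
    CVImageBufferRef imageBufferToRelease;
    CVBufferRetain(videoFrame);
    @synchronized (self) {
        imageBufferToRelease = mCurrentImageBuffer;
        mCurrentImageBuffer = videoFrame;
    }
    CVBufferRelease(imageBufferToRelease);
}

The frame that arrives here is a full interlaced DV frame, so anything
written out from mCurrentImageBuffer inherits the interlacing.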
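
A minimal sketch of the field-copy approach Rob describes, assuming a
packed pixel format (such as 2vuy) where whole rows can be copied with
memcpy; the function name is illustrative, and scaling the half-height
result back up (plus the rectangular-pixel question) is left to the
caller:

#import <CoreVideo/CoreVideo.h>
#import <string.h>

// Build a half-height buffer containing a single field of an interlaced
// frame: field 0 takes the even source rows, field 1 the odd rows.
// The caller releases the returned buffer.
static CVPixelBufferRef CreateSingleFieldBuffer(CVPixelBufferRef src, int field)
{
    size_t width  = CVPixelBufferGetWidth(src);
    size_t height = CVPixelBufferGetHeight(src) / 2;
    OSType format = CVPixelBufferGetPixelFormatType(src);

    CVPixelBufferRef dst = NULL;
    if (CVPixelBufferCreate(kCFAllocatorDefault, width, height, format,
                            NULL, &dst) != kCVReturnSuccess)
        return NULL;

    CVPixelBufferLockBaseAddress(src, 0);
    CVPixelBufferLockBaseAddress(dst, 0);

    uint8_t *srcBase   = CVPixelBufferGetBaseAddress(src);
    uint8_t *dstBase   = CVPixelBufferGetBaseAddress(dst);
    size_t   srcStride = CVPixelBufferGetBytesPerRow(src);
    size_t   dstStride = CVPixelBufferGetBytesPerRow(dst);
    size_t   rowBytes  = (srcStride < dstStride) ? srcStride : dstStride;

    // Copy every second source line (starting at 'field') into the
    // half-height destination buffer.
    for (size_t row = 0; row < height; row++)
        memcpy(dstBase + row * dstStride,
               srcBase + (2 * row + field) * srcStride,
               rowBytes);

    CVPixelBufferUnlockBaseAddress(dst, 0);
    CVPixelBufferUnlockBaseAddress(src, 0);
    return dst;
}

Calling this twice, with field 0 and then field 1, gives the pair of
half-height images Rob mentions for better temporal resolution; either
one can be line-doubled or scaled back to full height before display
or JPEG export.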
