Re: [iPhone] Sample code for live camera stream?


  • Subject: Re: [iPhone] Sample code for live camera stream?
  • From: Conal Elliott <email@hidden>
  • Date: Thu, 4 Feb 2010 22:13:00 -0800

Thanks Luke. I'm stumped about how affine transforms could account for
these effects, particularly bubble, pinch & swirl. Also, if I understand
correctly, UIImagePickerController doesn't give access to the camera input
stream -- just an image when the user actually takes a picture. There's a
new takePicture method, but I don't think it can be called frequently
enough for live processing.

  - Conal
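
A minimal sketch of the takePicture route mentioned above, assuming iPhone
OS 3.1's programmatic still capture on UIImagePickerController (the class and
method names here are illustrative, not from the thread):

    #import <UIKit/UIKit.h>

    @interface StillGrabber : NSObject <UINavigationControllerDelegate,
                                        UIImagePickerControllerDelegate>
    @end

    @implementation StillGrabber

    - (UIImagePickerController *)newCameraPicker {
        UIImagePickerController *picker = [[UIImagePickerController alloc] init];
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        picker.showsCameraControls = NO;   // hide the default shutter UI
        picker.delegate = self;
        return picker;                     // caller releases (manual retain/release)
    }

    - (void)captureStill:(UIImagePickerController *)picker {
        // Each call produces at most one still, delivered asynchronously to the
        // delegate callback below; there is no continuous frame delivery here.
        [picker takePicture];
    }

    - (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info {
        UIImage *still = [info objectForKey:UIImagePickerControllerOriginalImage];
        NSLog(@"captured still: %@", still);
        // ... process the single captured frame here ...
    }

    @end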

On Thu, Feb 4, 2010 at 8:51 PM, Luke the Hiesterman <email@hidden> wrote:

> I haven't seen the app, but the simplest way to transform the camera input
> is via the cameraViewTransform property on UIImagePickerController, available
> in 3.1.
>
> Luke
>
> On Feb 4, 2010, at 8:48 PM, Kyle Sluder wrote:
>
> > You can get the video data now using UIGetScreenImage, though it's not
> > the lightest on the battery.
> >
> > --Kyle Sluder
> >
> > On Thu, Feb 4, 2010 at 5:53 PM, Conal Elliott <email@hidden> wrote:
> >> Any ideas on how Live Effects (http://www.omaxmedia.com/) manages to
> >> spatially transform the live camera input?  It doesn't just overlay like
> >> the augmented reality apps.
> >>
> >> Thanks,  - Conal
> >>
> >> On Tue, Dec 15, 2009 at 1:23 PM, Mark Woollard <email@hidden> wrote:
> >>
> >>> They place UI elements over the video feed provided on screen by the
> >>> built-in video capture window, but they don't have access to the actual
> >>> video data.
> >>>
> >>> Mark
> >>>
> >>> On 15 Dec 2009, at 16:04, Gabriel Zachmann wrote:
> >>>
> >>>>> There is not currently API for this. The API allows you to place
> >>>>> overlays on the screen, but video data is not delivered to your app
> >>>>> until the user is finished recording.
> >>>>
> >>>> So, how do all the so-called augmented reality apps do it?
> >>>>
> >>>> Best regards,
> >>>> Gabriel.
> >>>
> >>>
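
A minimal sketch of the cameraViewTransform suggestion from Luke's message
above, assuming a UIViewController subclass that presents the picker (the zoom
and rotation values are illustrative). CGAffineTransform covers only scale,
rotation, translation, and shear, which is consistent with Conal's point that
it cannot produce bubble, pinch, or swirl warps:

    #import <UIKit/UIKit.h>
    #include <math.h>

    @interface CameraDemoViewController : UIViewController
    - (void)presentTransformedCamera;
    @end

    @implementation CameraDemoViewController

    - (void)presentTransformedCamera {
        UIImagePickerController *picker = [[UIImagePickerController alloc] init];
        picker.sourceType = UIImagePickerControllerSourceTypeCamera;
        picker.showsCameraControls = NO;

        // Scale the live preview 1.5x and rotate it 10 degrees. An affine
        // transform can scale, rotate, translate, or shear the feed, but
        // cannot bend it.
        CGAffineTransform t = CGAffineTransformMakeScale(1.5f, 1.5f);
        t = CGAffineTransformRotate(t, 10.0f * (CGFloat)M_PI / 180.0f);
        picker.cameraViewTransform = t;    // available in 3.1

        // Pre-iOS-6 modal presentation, as used at the time of this thread.
        [self presentModalViewController:picker animated:YES];
        [picker release];
    }

    @end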
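
And a rough sketch of the UIGetScreenImage polling route from Kyle's message,
for completeness. UIGetScreenImage is not declared in the public SDK headers,
so the extern declaration and the memory-management convention below are
assumptions based on how the call was commonly used at the time, and the timer
interval is illustrative; as Kyle notes, this approach is heavy on CPU and
battery:

    #import <UIKit/UIKit.h>

    // Not in the public headers; this declaration is an assumption.
    CGImageRef UIGetScreenImage(void);

    @interface ScreenFrameGrabber : NSObject
    - (void)startPolling;
    @end

    @implementation ScreenFrameGrabber

    - (void)startPolling {
        // Grab roughly 10 screenshots per second; the interval is illustrative.
        [NSTimer scheduledTimerWithTimeInterval:0.1
                                         target:self
                                       selector:@selector(grabFrame:)
                                       userInfo:nil
                                        repeats:YES];
    }

    - (void)grabFrame:(NSTimer *)timer {
        // Captures whatever is currently on screen, including the camera preview.
        CGImageRef screen = UIGetScreenImage();
        if (screen == NULL) return;
        UIImage *frame = [UIImage imageWithCGImage:screen];
        CGImageRelease(screen);   // ownership is undocumented; releasing is the usual convention
        NSLog(@"grabbed frame of size %@", NSStringFromCGSize(frame.size));
        // ... warp/process the frame, then draw the result into your own view ...
    }

    @end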
