
Re: iChat perspective view


  • Subject: Re: iChat perspective view
  • From: Lorenzo <email@hidden>
  • Date: Sun, 18 Jul 2004 11:25:19 +0200

Yes, TextureRange is exactly the sample code I have been learning from.
Thanks for replying.

Best Regards
--
Lorenzo
email: email@hidden

> From: John Stauffer <email@hidden>
> Date: Sat, 17 Jul 2004 17:21:20 -0700
> To: Lorenzo <email@hidden>
> Cc: Cocoa Development <email@hidden>, John Stauffer
> <email@hidden>, Shaun Wexler <email@hidden>
> Subject: Re: iChat perspective view
>
> This is the example you'll want to look at:
>
> http://developer.apple.com/samplecode/TextureRange/TextureRange.html
>
> John
>
> On Jul 17, 2004, at 4:09 PM, Lorenzo wrote:
>
>> Hi,
>> thank you for replying. Yes, I supposed that they used OpenGL,
>> and I have seen some OpenGL samples about rendering video to textures.
>> Personally I use glTexSubImage2D (which I learned from the Apple sample
>> code) to upload each video frame to a texture. It works well with frames
>> of about 360 x 240, but if the frame is larger, everything runs very
>> slowly. So I can't understand how they can run 4 videos (I presume about
>> 360 x 240 each as well) simultaneously. Can you tell me exactly which
>> Apple sample code you mean? Maybe I missed something about the latest
>> technologies.
>>
>> Best Regards
>> --
>> Lorenzo
>> email: email@hidden
>>
>>> From: Shaun Wexler <email@hidden>
>>> Date: Sat, 17 Jul 2004 14:21:47 -0700
>>> To: Cocoa Development <email@hidden>
>>> Cc: Lorenzo <email@hidden>
>>> Subject: Re: iChat perspective view
>>>
>>> On Jul 17, 2004, at 12:52 PM, Lorenzo wrote:
>>>
>>>> Hi,
>>>> I have just seen the Tiger QuickTime keynote movie at Apple.com and
>>>> asked myself: which technology does Apple use to display the iChat
>>>> video conference image in a perspective view? OpenGL? In my experience,
>>>> OpenGL doesn't handle large amounts of movie data well. So when there
>>>> are 10 users taking up the whole screen, how can they use OpenGL?
>>>> Maybe "Core Image"? Do you have any idea?
>>>
>>> It's quite simple to do in OpenGL, using texture-mapped quadrilaterals
>>> and efficient streaming of video frames, with hardware-accelerated
>>> rendering directly to textures. There are a few examples in Apple
>>> Sample Code that show you how to do this. I believe that the new
>>> iChat AV in Tiger will allow 3D video conferencing between up to 4
>>> simultaneous users (including yourself), while audio conferencing will
>>> be limited to 10 users. The maximum frame rate of [iSight] video is
>>> 30 fps... but in my own experience I don't think it's often even close
>>> to that, more like 10 fps. CoreImage/CoreVideo are still under NDA.
>>> --
>>> Shaun Wexler
>>> MacFOH
>>> http://www.macfoh.com
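
A minimal sketch of the per-frame glTexSubImage2D upload Lorenzo describes
above, in plain C against the 2004-era fixed-function OpenGL API. The names
(videoTexture, framePixels, frameWidth, frameHeight, createVideoTexture,
uploadAndDrawFrame) are placeholders for whatever the capture code provides;
it assumes BGRA pixels, an already-current OpenGL context, and the
EXT_texture_rectangle extension so non-power-of-two frames such as 360 x 240
can be used directly.

/* Sketch: upload each captured video frame into an existing texture with
 * glTexSubImage2D, then draw it on a quad. framePixels is assumed to be
 * BGRA, 8 bits per channel. */
#include <OpenGL/gl.h>
#include <OpenGL/glext.h>

static GLuint videoTexture;

void createVideoTexture(GLsizei frameWidth, GLsizei frameHeight,
                        const void *framePixels)
{
    glGenTextures(1, &videoTexture);
    glBindTexture(GL_TEXTURE_RECTANGLE_EXT, videoTexture);
    glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    /* Allocate the texture once; BGRA with 8_8_8_8_REV is the layout the
     * Mac drivers can transfer without swizzling. */
    glTexImage2D(GL_TEXTURE_RECTANGLE_EXT, 0, GL_RGBA, frameWidth, frameHeight,
                 0, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, framePixels);
}

void uploadAndDrawFrame(GLsizei frameWidth, GLsizei frameHeight,
                        const void *framePixels)
{
    glBindTexture(GL_TEXTURE_RECTANGLE_EXT, videoTexture);
    /* Replace the texture contents each frame instead of recreating it. */
    glTexSubImage2D(GL_TEXTURE_RECTANGLE_EXT, 0, 0, 0, frameWidth, frameHeight,
                    GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, framePixels);

    glEnable(GL_TEXTURE_RECTANGLE_EXT);
    glBegin(GL_QUADS);  /* rectangle textures use pixel coordinates */
    glTexCoord2f(0.0f, 0.0f);
    glVertex3f(-1.0f,  1.0f, 0.0f);
    glTexCoord2f((GLfloat)frameWidth, 0.0f);
    glVertex3f( 1.0f,  1.0f, 0.0f);
    glTexCoord2f((GLfloat)frameWidth, (GLfloat)frameHeight);
    glVertex3f( 1.0f, -1.0f, 0.0f);
    glTexCoord2f(0.0f, (GLfloat)frameHeight);
    glVertex3f(-1.0f, -1.0f, 0.0f);
    glEnd();
    glDisable(GL_TEXTURE_RECTANGLE_EXT);
}

Rotating the quad (for example with glRotatef about the Y axis) under a
perspective projection gives the tilted view being discussed; the expensive
part for large frames is the glTexSubImage2D copy itself, which is what the
hints in the next sketch address.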
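
The TextureRange sample John points to shows how to reduce that copy cost by
combining the APPLE_client_storage and APPLE_texture_range extensions, so the
driver can DMA the application's own frame buffer instead of duplicating it.
A rough sketch of that setup follows, with hypothetical helper names
(createStreamingTexture, updateStreamingTexture, frameBuffer, bufferBytes);
the buffer must stay allocated, and unchanged by the CPU while the GPU reads
it, for as long as the texture uses it.

/* Rough sketch of the client-storage / texture-range hints demonstrated by
 * Apple's TextureRange sample. */
#include <OpenGL/gl.h>
#include <OpenGL/glext.h>

void createStreamingTexture(GLuint *tex, GLsizei width, GLsizei height,
                            void *frameBuffer, GLsizei bufferBytes)
{
    glGenTextures(1, tex);
    glBindTexture(GL_TEXTURE_RECTANGLE_EXT, *tex);

    /* APPLE_texture_range: tell GL which memory range backs this texture. */
    glTextureRangeAPPLE(GL_TEXTURE_RECTANGLE_EXT, bufferBytes, frameBuffer);

    /* GL_STORAGE_SHARED_APPLE favors DMA from system memory (good for data
     * that changes every frame); GL_STORAGE_CACHED_APPLE favors VRAM caching. */
    glTexParameteri(GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_STORAGE_HINT_APPLE,
                    GL_STORAGE_SHARED_APPLE);

    /* APPLE_client_storage: let GL read our buffer instead of copying it. */
    glPixelStorei(GL_UNPACK_CLIENT_STORAGE_APPLE, GL_TRUE);

    glTexImage2D(GL_TEXTURE_RECTANGLE_EXT, 0, GL_RGBA, width, height, 0,
                 GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, frameBuffer);
}

/* Each new frame: write the new pixels into frameBuffer, then notify GL. */
void updateStreamingTexture(GLuint tex, GLsizei width, GLsizei height,
                            const void *frameBuffer)
{
    glBindTexture(GL_TEXTURE_RECTANGLE_EXT, tex);
    glTexSubImage2D(GL_TEXTURE_RECTANGLE_EXT, 0, 0, 0, width, height,
                    GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, frameBuffer);
}

Whether this alone is enough for four simultaneous streams depends on the
hardware; the extensions simply remove the extra CPU-side copy from the
per-frame path.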

