VideoConference.framework question


  • Subject: VideoConference.framework question
  • From: James Stanton <email@hidden>
  • Date: Fri, 28 Jan 2005 10:29:07 -0600

Hi,


I'm working on a video conferencing application, and I'd like to use Apple's private "VideoConference.framework", since it has quite a bit of the functionality I need.  I understand that these frameworks are subject to change and that development against them is not recommended.

The framework seems straightforward enough to initialize and to access the camera; the problem comes when I try to display the camera output on screen.  To start a preview, the framework wants to write into a (char *) buffer, and I can't figure out which graphics library would accept a char * for display on the screen.  It doesn't appear to be a QuickTime "Movie", nor will it draw if I pass it a QuickDraw pointer.  The (char *) buffer appears to be some kind of two-dimensional buffer, since its maximum size is reported as a struct with a height and a width.  There are two methods, get_seqg and get_ch_video, which seemed promising, but I was unable to cast their results to a SeqGrabComponent and an SGChannel, respectively.

    If anyone has any ideas about how I can display the preview in a Cocoa view of any kind, I'd greatly appreciate it.  I've included some code snippets and the relevant method declarations below.


Jamie


jstanton(at)clairvista(dot)com


Code Snippet:


vc = [[VideoConferenceController alloc] init];
NSArray *cList = [VideoConferenceController newcameraList];
NSArray *cList2 = [VideoConferenceController cameraList];
NSObject *camTest = [cList2 objectAtIndex:0];
BOOL selectCamBOOL = [vc selectCamera:camTest];

struct _NSSize nSize = [vc previewBufferSize];  //_NSSize has 2 floats, width and height

VCCamera *selCam = [vc selectedCamera];  //just to look at the camera data
BOOL isOpen = [vc openCamera];
BOOL canStart = [vc canStartPreview];  //returns true

// I'm not sure I should be passing localBuffer here, but I don't know
// what graphics-library object to pass instead.
int test2 = [vc startPreviewWithBuffer:[vc localBuffer] size:nSize]; //sets test2 to 0
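If the preview buffer turns out to be packed pixel data at the size reported by previewBufferSize (an assumption on my part; the framework is undocumented, and the real pixel format is unknown), one way to get it on screen might be to wrap it in an NSBitmapImageRep and hand the resulting NSImage to an NSImageView on a timer.  A rough, untested sketch:

```objc
// Untested sketch: wrap the raw preview buffer in an NSBitmapImageRep.
// Assumes 24-bit packed RGB with no row padding -- the actual layout of
// VideoConference.framework's buffer is a guess.
#import <Cocoa/Cocoa.h>

NSImage *ImageFromPreviewBuffer(unsigned char *buffer, NSSize size)
{
    unsigned char *planes[1] = { buffer };
    NSBitmapImageRep *rep =
        [[[NSBitmapImageRep alloc]
            initWithBitmapDataPlanes:planes
                          pixelsWide:(int)size.width
                          pixelsHigh:(int)size.height
                       bitsPerSample:8
                     samplesPerPixel:3
                            hasAlpha:NO
                            isPlanar:NO
                      colorSpaceName:NSCalibratedRGBColorSpace
                         bytesPerRow:(int)size.width * 3
                        bitsPerPixel:24] autorelease];

    NSImage *image = [[[NSImage alloc] initWithSize:size] autorelease];
    [image addRepresentation:rep];
    return image;  // e.g. [imageView setImage:image] on each timer tick
}
```

If the buffer is really ARGB or some YUV variant, the samplesPerPixel / bitsPerPixel / bytesPerRow arguments would have to change accordingly.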



localBuffer is declared in VideoConferenceController as 

char *_pLocalBuffer;


VideoConferenceController methods:

- (id)get_seqg;
- (id)get_ch_video;

- (int)startPreviewWithBuffer:(char *)fp8 size:(struct _NSSize)fp12;

- (char *)localBuffer;    
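As a fallback, the same kind of preview can be had from the public QuickTime Sequence Grabber API directly, bypassing the private framework entirely; the get_seqg / get_ch_video names suggest that may be what the framework wraps internally anyway.  A sketch using only standard Sequence Grabber calls (error handling trimmed, untested):

```objc
// Untested sketch: a plain QuickTime Sequence Grabber preview drawn into
// a window's GrafPort, using only public API.
#include <QuickTime/QuickTime.h>

static OSErr StartVideoPreview(WindowRef window)
{
    SeqGrabComponent grabber;
    SGChannel        videoChannel;
    OSErr            err;

    // Open and initialize the default sequence grabber component.
    grabber = OpenDefaultComponent(SeqGrabComponentType, 0);
    err = SGInitialize(grabber);
    if (err != noErr) return err;

    // Tell it to draw into the window's port.
    err = SGSetGWorld(grabber, GetWindowPort(window), NULL);
    if (err != noErr) return err;

    // Create a video channel and mark it for previewing.
    err = SGNewChannel(grabber, VideoMediaType, &videoChannel);
    if (err != noErr) return err;

    err = SGSetChannelUsage(videoChannel, seqGrabPreview);
    if (err != noErr) return err;

    return SGStartPreview(grabber);
}
```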



 _______________________________________________
Do not post admin requests to the list. They will be ignored.
Cocoa-dev mailing list      (email@hidden)
Help/Unsubscribe/Update your Subscription:

This email sent to email@hidden

  • Follow-Ups:
    • Re: VideoConference.framework question
      • From: Nicko van Someren <email@hidden>
    • Re: VideoConference.framework question
      • From: Guy English <email@hidden>