Re: AudioDeviceRead vs captureInputIOProc (in sample app Daisy)
- Subject: Re: AudioDeviceRead vs captureInputIOProc (in sample app Daisy)
- From: email@hidden
- Date: Sun, 18 May 2003 14:12:14 -0700
email@hidden writes:
> I'm trying to write an app that captures input from the mic and then
> does some real-time analysis on it. AudioDeviceRead() in
> AudioHardware.h seems like the right function to use, but I can't
> find any examples of its use. When I look at Daisy I see that they
> have written a client-side AudioDeviceIOProc, captureInputIOProc.
> When would you use AudioDeviceRead?

I was just working with this yesterday...
You can use AudioDeviceRead if your callback doesn't supply you with
the input data you want to process. One way this can happen, if you
are talking to the HAL directly, is when you use different devices for
input and output. (If you are using the same device, you get a buffer
of input as part of your callback.) Another case where AudioDeviceRead
is useful is when using an OutputAudioUnit to interact with a device,
since the AudioUnit callback doesn't provide any input data.
To use AudioDeviceRead, you need to do some setup (sketched in code
after this list):
- Create an AudioBufferList matching the device's stream configuration
- Register the buffer list with the device
- Get the input safety offset of the device
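
Here's a rough sketch of that setup using the old-style HAL property
calls (AudioDeviceGetProperty and friends). Error handling is minimal,
and device and numFrames are assumptions: the input AudioDeviceID you
already have, and however many frames you want to read per callback.

#include <CoreAudio/CoreAudio.h>
#include <stdlib.h>

static AudioBufferList *gInputList = NULL;
static UInt32 gSafetyOffset = 0;   /* in sample frames */

static OSStatus SetUpDeviceRead(AudioDeviceID device, UInt32 numFrames)
{
    OSStatus err;
    UInt32 size;

    /* 1. Get the device's input stream configuration and allocate a
       matching AudioBufferList. */
    err = AudioDeviceGetPropertyInfo(device, 0, true,
              kAudioDevicePropertyStreamConfiguration, &size, NULL);
    if (err) return err;
    gInputList = (AudioBufferList *)malloc(size);
    err = AudioDeviceGetProperty(device, 0, true,
              kAudioDevicePropertyStreamConfiguration, &size,
              gInputList);
    if (err) return err;

    /* Back each buffer with storage for numFrames of Float32 samples
       (assuming the device is running in the canonical Float32
       format). */
    for (UInt32 i = 0; i < gInputList->mNumberBuffers; i++) {
        AudioBuffer *buf = &gInputList->mBuffers[i];
        buf->mDataByteSize = numFrames * buf->mNumberChannels
                             * sizeof(Float32);
        buf->mData = malloc(buf->mDataByteSize);
    }

    /* 2. Register the buffer list with the device so the HAL keeps
       input data available for AudioDeviceRead. */
    err = AudioDeviceSetProperty(device, NULL, 0, true,
              kAudioDevicePropertyRegisterBufferList, size, gInputList);
    if (err) return err;

    /* 3. Get the device's input safety offset. */
    size = sizeof(gSafetyOffset);
    return AudioDeviceGetProperty(device, 0, true,
              kAudioDevicePropertySafetyOffset, &size, &gSafetyOffset);
}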
Then, in your callback, you call AudioDeviceRead. To read numFrames of
input, the timestamp you provide should be set to
callbackTime.mSampleTime - safetyOffset - numFrames.
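
Continuing the sketch above, an IOProc that does this might look
something like the following (kNumFrames is a hypothetical chunk size;
it should match the numFrames used during setup):

enum { kNumFrames = 512 };   /* hypothetical; match your setup */

static OSStatus MyIOProc(AudioDeviceID device,
                         const AudioTimeStamp *inNow,
                         const AudioBufferList *inInputData,
                         const AudioTimeStamp *inInputTime,
                         AudioBufferList *outOutputData,
                         const AudioTimeStamp *inOutputTime,
                         void *clientData)
{
    /* Ask for frames far enough in the past to be safely readable:
       the current time, minus the safety offset, minus the number of
       frames we want. */
    AudioTimeStamp readTime = { 0 };
    readTime.mFlags = kAudioTimeStampSampleTimeValid;
    readTime.mSampleTime = inNow->mSampleTime - gSafetyOffset
                           - kNumFrames;

    if (AudioDeviceRead(device, &readTime, gInputList) == noErr) {
        /* gInputList->mBuffers[] now holds kNumFrames of input;
           do your analysis here. */
    }
    return noErr;
}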
- sekhar
--
C. Ramakrishnan email@hidden