Re: Reduce Audio Time Latency
- Subject: Re: Reduce Audio Time Latency
- From: Jeff Moore <email@hidden>
- Date: Tue, 23 Jun 2009 12:55:29 -0700
OK. I'm totally not following your current question at all. The code
snippets and the description of the problem don't hang together.
At any rate, blocking the main thread of a program is usually not a
good idea because it is used to dispatch UI events and what not.
Blocking it can make your app seem unresponsive or bring about the
spinning beach ball of doom.
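One way to honor that, sketched here with invented names for illustration (startPlaybackAndWait stands in for whatever blocking start-and-wait logic is needed), is to push the blocking work onto a worker thread and let the main thread return immediately:

#include <pthread.h>
#include <stdlib.h>
#include <string.h>

// Assumed helper that does the blocking start-and-wait work.
extern void startPlaybackAndWait(const char *fileName);

static void *playbackEntry(void *arg)
{
    char *name = (char *)arg;
    startPlaybackAndWait(name);   // may block, but not on the main thread
    free(name);
    return NULL;
}

static void playWithoutBlockingMainThread(const char *fileName)
{
    pthread_t tid;
    // Copy the string so it outlives the caller's buffer.
    pthread_create(&tid, NULL, playbackEntry, strdup(fileName));
    pthread_detach(tid);
}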
--
Jeff Moore
Core Audio
Apple
On Jun 23, 2009, at 11:22 AM, Ravneet Kaur wrote:
Thanks Jeff.
You are right and I followed your point to solve the issue.
OSStatus mySimpleIOProc (AudioDeviceID inDevice,
                         const AudioTimeStamp* inNow,
                         const AudioBufferList* inInputData,
                         const AudioTimeStamp* inInputTime,
                         AudioBufferList* outOutputData,
                         const AudioTimeStamp* inOutputTime,
                         void* inClientData)
{
    if (rendererFlag == 1)
    {
        rendererFlag++;
        // Remember when the first buffer we hand over will reach the hardware.
        soundStartTime = inOutputTime->mHostTime;
        gettimeofday(&audioTimeOfDay, NULL);
        // Offsets from the host time taken just before AudioOutputUnitStart().
        double timeOffsetInMilliseconds  = ((double)(inOutputTime->mHostTime - nowTimeRenderProc)) * 1E-6;
        double timeOffsetInMilliseconds2 = ((double)(inNow->mHostTime - nowTimeRenderProc)) * 1E-6;
    }
    return noErr;
}
It seems inNow gave me a value with the correct latency. Also, inOutputTime tells me when the next scheduled packet of data will be delivered.
There is one thing I could not understand, though. I invoke my program from Python code and call the API written in C, which is shown below:
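One caveat: host time ticks (mHostTime, mach_absolute_time()) are not guaranteed to be nanoseconds on every machine, so a fixed 1E-6 scale may be off. A small sketch of doing the conversion through CoreAudio's host-time utilities instead (hostTimeDeltaToMilliseconds is a name made up for this sketch):

#include <CoreAudio/HostTime.h>

// Convert a host-time interval into milliseconds; AudioTimeStamp.mHostTime and
// mach_absolute_time() share the same time base.
static double hostTimeDeltaToMilliseconds(UInt64 laterHostTime, UInt64 earlierHostTime)
{
    UInt64 deltaNanos = AudioConvertHostTimeToNanos(laterHostTime - earlierHostTime);
    return (double)deltaNanos / 1.0e6;  // nanoseconds -> milliseconds
}

Inside the proc above this would be, for example, hostTimeDeltaToMilliseconds(inNow->mHostTime, nowTimeRenderProc).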
- (void) play:(char *)fileName :(int)volumeValue
{
    volume = volumeValue;

    pthread_attr_t theThreadAttributes;
    int ret;
    ret = pthread_attr_init(&theThreadAttributes);
    // Start the playback setup on its own thread.
    ret = pthread_create(&tid, &theThreadAttributes, playThread, fileName);

    nowTime = mach_absolute_time();
    // Wait (by polling) until the IOProc has run once and captured the start time.
    while (rendererFlag < 2)
    {
        usleep(2000);
    }
}
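As an aside, the usleep() polling loop could also be replaced by a semaphore that the IOProc signals once it has captured its first timestamp; a sketch under that assumption (renderStartedSem, setupRenderSignal and waitForFirstRenderCycle are invented names):

#include <dispatch/dispatch.h>

static dispatch_semaphore_t renderStartedSem;

// Call once before starting playback.
static void setupRenderSignal(void)
{
    renderStartedSem = dispatch_semaphore_create(0);
}

// In the IOProc, right after recording soundStartTime:
//     dispatch_semaphore_signal(renderStartedSem);
// (signaling a dispatch semaphore does not allocate, which is why it is a
//  common way to wake a waiting thread from a real-time callback)

// In place of the while (rendererFlag < 2) { usleep(2000); } loop:
static void waitForFirstRenderCycle(void)
{
    dispatch_semaphore_wait(renderStartedSem, DISPATCH_TIME_FOREVER);
}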
The playThread that is created does all the work, as shown below:
void *playThread(void *arg)
{
    char *fileName = (char *)arg;
    AURenderCallbackStruct renderCallback;
    OSStatus err = noErr;
    AudioStreamBasicDescription fileASBD, outputASBD;
    FSRef fileRef;

    err = setupAudioUnit(&theOutputUnit);
    err = MatchAUFormats(&theOutputUnit, &outputASBD, 0); // "0" is the output bus, use "1" for the input bus
    err = getFileInfo(&fileRef, &musicFileID, &fileASBD, fileName);
    err = MakeAUConverter(&musicFileID, &converter, &fileASBD, &outputASBD);
    err = setupCallbacks(&theOutputUnit, &renderCallback);
    outputUnit = theOutputUnit;

    uint64_t nowTime = mach_absolute_time();
    // printf("calling audio unit start -------------------------->\n");
    err = AudioOutputUnitStart(theOutputUnit);
    // printf("after call audio unit start -------------------------->%lld\n", (mach_absolute_time() - nowTime));
    checkStatus(err);
    gIsPlaying = TRUE;
    return NULL;
}
Now, after my call from Python, if I put my Python thread to sleep (using time.sleep(secs) and time.time() from the time module) then the time is accurate to within about 1 ms. But if my Python thread keeps doing any work, I don't get the correct offset. It seems the proc runs on the main thread of the application (which is Python), and that thread has to sleep to get correct results.
Please clarify my doubt.
On Tue, Jun 23, 2009 at 3:15 AM, Jeff Moore <email@hidden> wrote:
First off, latency doesn't change over time. The hardware only reports the one figure. You don't need to re-fetch it all the time.
Second, it looks like you are measuring time intervals based on when
your IOProc is called. As I said, this is not correct. The time
when your IOProc is invoked has nothing really to do with the
latency. The time stamp that matters is the output time stamp your
IOProc is provided. That is the time stamp that tells you when the
data you are providing is going to hit the hardware.
Finally, a word on style. Your IOProc is making blocking calls to
the HAL and it is calling NSLog which allocates memory and blocks in
fun and unexpected ways. You absolutely cannot be making these calls
from inside your IOProc. You also cannot be making calls to any ObjC
or CF objects from inside your IOProc. Doing any of these will
eventually cause glitching.
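A pattern that follows this advice is to store only plain values on the real-time thread and do the property queries and logging from a normal thread afterwards; a rough sketch of the IOProc restructured that way (gFirstOutputHostTime, gCaptured and reportCapturedTime are names invented for the sketch):

#include <CoreAudio/CoreAudio.h>
#include <stdatomic.h>
#include <stdbool.h>
#include <stdio.h>

// Values captured on the real-time thread; read and printed elsewhere.
static _Atomic UInt64 gFirstOutputHostTime = 0;
static atomic_bool    gCaptured = false;

static OSStatus mySimpleIOProc(AudioDeviceID inDevice,
                               const AudioTimeStamp *inNow,
                               const AudioBufferList *inInputData,
                               const AudioTimeStamp *inInputTime,
                               AudioBufferList *outOutputData,
                               const AudioTimeStamp *inOutputTime,
                               void *inClientData)
{
    // No property calls, no NSLog, no allocation here: just record the numbers.
    if (!atomic_load(&gCaptured)) {
        atomic_store(&gFirstOutputHostTime, inOutputTime->mHostTime);
        atomic_store(&gCaptured, true);
    }
    return noErr;
}

// On a normal (non-real-time) thread, after playback has started:
static void reportCapturedTime(void)
{
    if (atomic_load(&gCaptured))
        printf("first output host time: %llu\n",
               (unsigned long long)atomic_load(&gFirstOutputHostTime));
}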
On Jun 22, 2009, at 1:24 PM, Ravneet Kaur wrote:
Hello Jeff
Thanks for the reply. I added the following code to my callback proc, as shown below:
OSStatus mySimpleIOProc (AudioDeviceID inDevice,
                         const AudioTimeStamp* inNow,
                         const AudioBufferList* inInputData,
                         const AudioTimeStamp* inInputTime,
                         AudioBufferList* outOutputData,
                         const AudioTimeStamp* inOutputTime,
                         void* inClientData)
{
    if (rendererFlag == 1)
    {
        rendererFlag++;

        UInt32 propSize = sizeof(UInt32);
        AudioDeviceGetProperty(inDevice, 0, 0, kAudioDevicePropertyLatency,
                               &propSize, &frameLatency);

        UInt32 i_param_size = sizeof(UInt32);
        AudioStreamGetProperty(inDevice, 0, kAudioStreamPropertyLatency,
                               &i_param_size, &audioStreamLatency);

        // struct timeval timeOfDay;
        // gettimeofday(&timeOfDay, NULL);
        // uint64_t nowTimeOfMachine = (timeOfDay.tv_sec) * 1E+6 + timeOfDay.tv_usec;

        NSLog(@"<======================= frameLatency 111 ===================== %ld \n", frameLatency);
        NSLog(@"<======================= Stream 111 ===================== %ld \n", audioStreamLatency);
    }
    return 0;
}
The frame latency is always constant and the output is 30, while the audio stream latency is always zero. I was expecting these numbers to correspond to the hardware drift reported by an external testing device (1 ms - 200 ms).
Do you think I am doing something wrong here? I can send you the whole source code if needed. Do you recommend Audio Queues for better time synchronization?
Best Regards
Ravneet
On Tue, Jun 23, 2009 at 1:32 AM, Jeff Moore <email@hidden> wrote:
You should get no differences in terms of IO latency when using
AUHAL versus when using the HAL directly. AUHAL does not induce any
additional latency at all. So, I'm not sure I follow what you mean
when you say that "the Callback method was having an offset which
varied between 1ms-200 ms".
At any rate, the output time stamp that you get in your Render
callback or your IOProc is the time at which the audio stack will be
finished getting the first sample frame of the data to the hardware.
So to get the time at which the first sample frame will hit the
speaker, all you need to do is add the device's latency. The latency
figure can be gotten from the HAL by adding together the values of the properties kAudioDevicePropertyLatency and kAudioStreamPropertyLatency for the device and stream combination you are interested in.
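A sketch of fetching and summing those two figures with the same pre-10.6 HAL calls used elsewhere in this thread (totalOutputLatencyFrames is an invented helper name, error handling is omitted, and note that the stream latency is asked of an AudioStreamID obtained from the device, not of the device itself):

#include <CoreAudio/CoreAudio.h>

static UInt32 totalOutputLatencyFrames(AudioDeviceID device)
{
    // Device latency, in sample frames, for the output section.
    UInt32 deviceLatency = 0;
    UInt32 size = sizeof(deviceLatency);
    AudioDeviceGetProperty(device, 0, false,
                           kAudioDevicePropertyLatency, &size, &deviceLatency);

    // Grab the first output stream of the device...
    AudioStreamID stream = 0;
    size = sizeof(stream);
    AudioDeviceGetProperty(device, 0, false,
                           kAudioDevicePropertyStreams, &size, &stream);

    // ...and ask that stream for its own latency.
    UInt32 streamLatency = 0;
    size = sizeof(streamLatency);
    AudioStreamGetProperty(stream, 0,
                           kAudioStreamPropertyLatency, &size, &streamLatency);

    return deviceLatency + streamLatency;
}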
--
Jeff Moore
Core Audio
Apple
On Jun 22, 2009, at 11:17 AM, Ravneet Kaur wrote:
Hello All,
I am working with the Core Audio API to play an audio file and get the exact time of audio playback. I have used the following structure:
A) Get Components

ComponentDescription desc;
err = OpenAComponent(comp, theOutputUnit); // gains access to the services provided by the component

B) Get The Audio Unit

verify_noerr(AudioUnitInitialize(*theOutputUnit));

C) Set Output Unit Properties

// Set the stream format of the output to match the input
result = AudioUnitSetProperty(*theUnit, kAudioUnitProperty_StreamFormat,
                              kAudioUnitScope_Input, theInputBus, theDesc, size);
result = AudioUnitSetParameter(*theUnit, kAudioUnitParameterUnit_LinearGain,
                               kAudioUnitScope_Output, 0, (Float32)volume, 0);

D) Read File Information

FSPathMakeRef((const UInt8 *)fileName, fileRef, 0); // obtain a filesystem reference to the file
err = AudioFileOpen(fileRef, fsRdPerm, 0, fileID);  // obtain an AudioFileID

E) Make Converter

err = AudioConverterNew(inASBD, outASBD, conv);

F) Setup Callback

// Sets the render callback for the Audio Unit
err = AudioUnitSetProperty(*theOutputUnit, kAudioUnitProperty_SetRenderCallback,
                           kAudioUnitScope_Input, 0, renderCallback,
                           sizeof(AURenderCallbackStruct));

G) Start the Audio Unit
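The render callback that the AURenderCallbackStruct in step F points at has the standard AURenderCallback shape; a skeletal sketch (fileRenderProc is a placeholder, and the real callback would fill ioData with inNumberFrames frames of converted file data):

#include <AudioUnit/AudioUnit.h>

static OSStatus fileRenderProc(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    // inTimeStamp->mHostTime is in the same time base as mach_absolute_time().
    return noErr;
}

// Step G then boils down to:
//     err = AudioOutputUnitStart(*theOutputUnit);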
Now, the time I get in the callback method was having an offset which varied between 1ms-200 ms in the worst cases. So I decided to go more low-level and bind the callback method to the HAL layer. I bound a simple proc to my default device:
OSStatus mySimpleIOProc (AudioDeviceID inDevice,
                         const AudioTimeStamp* inNow,
                         const AudioBufferList* inInputData,
                         const AudioTimeStamp* inInputTime,
                         AudioBufferList* outOutputData,
                         const AudioTimeStamp* inOutputTime,
                         void* inClientData)
{
    if (rendererFlag == 1)
    {
        rendererFlag++;
        UInt32 propSize = sizeof(UInt32);
        AudioDeviceGetProperty(inDevice, 0, 0, kAudioDevicePropertyLatency,
                               &propSize, &frameLatency);

        struct timeval timeOfDay;
        gettimeofday(&timeOfDay, NULL);
        uint64_t nowTimeOfMachine = (timeOfDay.tv_sec) * 1E+6 + timeOfDay.tv_usec;

        NSLog(@"<======================= frameLatency 111 ===================== %ld \n", frameLatency);
    }
    return 0;
}
I used MTCoreAudio as I have limited familiarity with the HAL layer and its C libraries.
myDevice = [MTCoreAudioDevice defaultOutputDevice];
[myDevice setIOProc:mySimpleIOProc withClientData:nil];
[myDevice deviceStart];
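For reference, the same binding can be done without MTCoreAudio using the plain HAL calls available since 10.5; a minimal sketch (error checking omitted, reusing the mySimpleIOProc above):

#include <CoreAudio/CoreAudio.h>

static void startIOProcOnDefaultOutput(void)
{
    // Find the default output device.
    AudioDeviceID device = 0;
    UInt32 size = sizeof(device);
    AudioHardwareGetProperty(kAudioHardwarePropertyDefaultOutputDevice,
                             &size, &device);

    // Register the IOProc and start the device.
    AudioDeviceIOProcID procID = NULL;
    AudioDeviceCreateIOProcID(device, mySimpleIOProc, NULL, &procID);
    AudioDeviceStart(device, procID);
}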
Now as soon as I start my Audio Unit, my callback method gets called
and then my HAL device Proc "mySimpleIOProc" (as shown above) gets
called. I was assuming that mySimpleIOProc would have better timing accuracy, either in terms of inNow or if I take the current time inside this proc.
Can someone please help me find the exact playback time? I read that the HAL does provide synchronization capability but could not find any samples for it. The sample in the Core Audio SDK is too difficult for a beginner like me.
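Putting the advice earlier in this thread together, the presentation time of the first sample frame works out to the IOProc's output time stamp plus the device and stream latency; a sketch under the assumption that the total latency in frames and the device's nominal sample rate have already been fetched:

#include <CoreAudio/CoreAudio.h>
#include <CoreAudio/HostTime.h>

static UInt64 estimatedPresentationTimeNanos(const AudioTimeStamp *inOutputTime,
                                             UInt32 totalLatencyFrames,
                                             Float64 sampleRate)
{
    // When the audio stack hands the first frame to the hardware...
    UInt64 outputNanos  = AudioConvertHostTimeToNanos(inOutputTime->mHostTime);
    // ...plus how long the hardware then takes to present it.
    UInt64 latencyNanos = (UInt64)((totalLatencyFrames / sampleRate) * 1.0e9);
    return outputNanos + latencyNanos;
}

The latency frames can be summed as in the latency sketch above, and the sample rate read from kAudioDevicePropertyNominalSampleRate.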