I've got an AUGraph set up with multiple AudioFilePlayer audio units feeding into a MultiChannelMixer. The files being played are isolated tracks from a music recording session: for example, a bass part, a drum part, and a guitar part. I need to schedule all of the AudioFilePlayer instances to start playing a region at exactly the same time. The issue I'm having is that when I schedule a new region, one or two tracks will occasionally be off (the guitar, the drums, or some other track will be behind the rest of the instruments by a small but perceptible amount). I've tried stopping the graph and resetting all of the AudioFilePlayer units, but the issue persists.
I'm seeking to a particular point in a recording like this (much of this is adapted from Apple's documentation and sample code):
- (void)seekTo:(UInt32)sampleNum {
    regionSampleStart = sampleNum;
    BOOL wasPlaying = self.isPlaying;
    if (wasPlaying) {
        AUGraphStop(processingGraph);
    }
    for (int i = 0; i < sourceURLArray.count; i++) {
        AudioFileID audioFile = audioFiles[i];
        AudioStreamBasicDescription fileFormat = fileFormats[i];
        AudioUnit fileAU = fileAUs[i];
        AudioUnitReset(fileAU, kAudioUnitScope_Global, 0);

        // Get the file's length in packets to calculate the duration.
        UInt64 nPackets;
        UInt32 propsize = sizeof(nPackets);
        AudioFileGetProperty(audioFile, kAudioFilePropertyAudioDataPacketCount,
                             &propsize, &nPackets);

        // Tell the file player AU to play from sampleNum to the end of the file.
        ScheduledAudioFileRegion rgn;
        memset(&rgn.mTimeStamp, 0, sizeof(rgn.mTimeStamp));
        rgn.mTimeStamp.mFlags = kAudioTimeStampSampleTimeValid;
        rgn.mTimeStamp.mSampleTime = 0;
        rgn.mCompletionProc = NULL;
        rgn.mCompletionProcUserData = NULL;
        rgn.mAudioFile = audioFile;
        rgn.mLoopCount = 0;
        rgn.mStartFrame = sampleNum;
        // Subtract before casting so the total frame count isn't truncated first.
        rgn.mFramesToPlay = (UInt32)(nPackets * fileFormat.mFramesPerPacket - sampleNum);
        AudioUnitSetProperty(fileAU, kAudioUnitProperty_ScheduledFileRegion,
                             kAudioUnitScope_Global, 0, &rgn, sizeof(rgn));

        // Prime the file player AU with default values.
        UInt32 defaultVal = 0;
        AudioUnitSetProperty(fileAU, kAudioUnitProperty_ScheduledFilePrime,
                             kAudioUnitScope_Global, 0, &defaultVal, sizeof(defaultVal));

        // Tell the file player AU when to start playing. This timestamp is in the
        // AU's render timeline; a sample time of -1 means "next render cycle".
        AudioTimeStamp startTime;
        memset(&startTime, 0, sizeof(startTime));
        startTime.mFlags = kAudioTimeStampSampleTimeValid;
        startTime.mSampleTime = -1;
        AudioUnitSetProperty(fileAU, kAudioUnitProperty_ScheduleStartTimeStamp,
                             kAudioUnitScope_Global, 0, &startTime, sizeof(startTime));
    }
    if (wasPlaying) {
        AUGraphStart(processingGraph);
    }
}
I can think of two solutions. The first is setting the start timestamp to some appropriate point in the future for all of the audio units, so that all of the files start playing perfectly in sync. Is there a way to schedule future regions other than the startTime.mSampleTime = -1 method?
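To make the first idea concrete, here is roughly what I have in mind (untested sketch; HostTimeForFutureMs and ScheduleSynchronizedStart are helper names I made up, and fileAUs is the array from the code above). Instead of mSampleTime = -1, which each player might interpret on a different render cycle, every player would get the same host-time start stamp a little in the future:

```c
#include <AudioToolbox/AudioToolbox.h>
#include <mach/mach_time.h>

// Convert a millisecond offset from "now" into a mach host time.
static UInt64 HostTimeForFutureMs(double ms) {
    mach_timebase_info_data_t timebase;
    mach_timebase_info(&timebase);
    UInt64 nanos = (UInt64)(ms * 1.0e6);
    return mach_absolute_time() + nanos * timebase.denom / timebase.numer;
}

// Give every file player AU the identical future start timestamp.
static void ScheduleSynchronizedStart(AudioUnit *fileAUs, int count) {
    AudioTimeStamp startTime;
    memset(&startTime, 0, sizeof(startTime));
    startTime.mFlags = kAudioTimeStampHostTimeValid;
    startTime.mHostTime = HostTimeForFutureMs(100.0); // ~100 ms of headroom
    for (int i = 0; i < count; i++) {
        AudioUnitSetProperty(fileAUs[i], kAudioUnitProperty_ScheduleStartTimeStamp,
                             kAudioUnitScope_Global, 0, &startTime, sizeof(startTime));
    }
}
```

I'm not sure whether 100 ms is enough headroom, or whether a host-time stamp is even the right flag to use here instead of a sample-time stamp.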
The other solution is to flush the AUGraph so that all of the AudioFilePlayers have to render again, with no staggering of scheduled regions. Is there a way to do this, akin to the AudioQueueFlush method?
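For the second idea, the closest thing I can find is resetting every node in the graph one by one (untested sketch; I haven't found a single "flush" call for an AUGraph, so this just walks the nodes and calls AudioUnitReset on each unit):

```c
#include <AudioToolbox/AudioToolbox.h>

// Reset every audio unit in the graph, in the spirit of AudioQueueFlush.
static void ResetAllGraphNodes(AUGraph graph) {
    UInt32 nodeCount = 0;
    AUGraphGetNodeCount(graph, &nodeCount);
    for (UInt32 i = 0; i < nodeCount; i++) {
        AUNode node;
        AUGraphGetIndNode(graph, i, &node);
        AudioUnit unit = NULL;
        AUGraphNodeInfo(graph, node, NULL, &unit);
        AudioUnitReset(unit, kAudioUnitScope_Global, 0);
    }
}
```

I don't know whether this actually discards buffered output the way AudioQueueFlush does, or only resets each unit's internal state.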
Thanks,
Adam Bellard