Re: ExtAudioFileWrite writes zeroes?
- Subject: Re: ExtAudioFileWrite writes zeroes?
- From: patrick machielse <email@hidden>
- Date: Tue, 3 Jul 2007 17:02:44 +0200
On 3-Jul-2007, at 04:38, William Stewart wrote:
If you are getting zeroes in the file, that is because you are
feeding in zeroes, so it really depends on what (and where) exactly
you are getting the data from to write.
For instance, if you get the data from a render notification
callback (AudioUnitAddRenderNotify), then make sure you are only
grabbing the data when the *ioAction flag signals PostRender...
But, I don't know if this is the problem or not (you need to
provide more info)
I'm using the following AUGraph:

AUAudioFilePlayer -> 3rd-party Audio Unit -> GenericOutput -> render callback for file writing

The render callback calls a recorder object, which is either a wav recorder (ExtAudioFile) or an mp3 recorder (LAME).
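The callback is installed as a render notification on the generic output unit; the hookup looks roughly like the sketch below (simplified, error handling trimmed, with the outputUnit / recorder names matching the code elsewhere in this mail):

===
// sketch: attach the file-writing callback as a render notification,
// so it fires pre- and post-render around the generic output unit
AudioUnit   outputUnit;   // the GenericOutput instance from the graph
PNRecorder *recorder;     // wav or mp3 recorder object

OSStatus err = AudioUnitAddRenderNotify(outputUnit, recordCallback, recorder);
if ( err ) {
    NSLog(@"could not add render notify (%ld)", (long)err);
}
===

The callback and the recorder method themselves: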
===
// file writing callback
OSStatus recordCallback(void *                       recorder,
                        AudioUnitRenderActionFlags * ioActionFlags,
                        const AudioTimeStamp *       inTimeStamp,
                        UInt32                       inBusNumber,
                        UInt32                       inNumberFrames,
                        AudioBufferList *            ioData)
{
    // record only after rendering
    if ( (*ioActionFlags & kAudioUnitRenderAction_PostRender) != 0 ) {
        [(PNRecorder *)recorder recordData:ioData frames:inNumberFrames];
    }
    return noErr;
}

// in recorder
- (void)recordData:(AudioBufferList *)data frames:(UInt32)numberOfFrames
{
    if ( !wavfile ) {
        NSLog(@"no audio file to record to! %s", __PRETTY_FUNCTION__);
        return;
    }

    OSStatus error;
    if ( (error = ExtAudioFileWriteAsync(wavfile, numberOfFrames, data)) ) {
        NSLog(@"could not record wav data (%ld)\n", error);
    }
}
===
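In the wav case, the recorder's file is created with the ExtAudioFile API before the loop runs. A sketch of such a setup, assuming 10.5's ExtAudioFileCreateWithURL (on 10.4 the FSRef-based ExtAudioFileCreateNew would be used instead); the fileURL, sample rate and format values are illustrative:

===
// sketch: create a 16-bit wav file and tell ExtAudioFile that the
// callback will hand it interleaved 32-bit floats (the graph's format)
AudioStreamBasicDescription clientFormat = {0};   // what recordData: delivers
clientFormat.mSampleRate       = 44100.0;         // illustrative
clientFormat.mFormatID         = kAudioFormatLinearPCM;
clientFormat.mFormatFlags      = kAudioFormatFlagsNativeFloatPacked;
clientFormat.mBitsPerChannel   = 32;
clientFormat.mChannelsPerFrame = nChannels;
clientFormat.mFramesPerPacket  = 1;
clientFormat.mBytesPerFrame    = nChannels * sizeof(float);
clientFormat.mBytesPerPacket   = nChannels * sizeof(float);

AudioStreamBasicDescription fileFormat = {0};     // what lands in the file
fileFormat.mSampleRate       = 44100.0;
fileFormat.mFormatID         = kAudioFormatLinearPCM;
fileFormat.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked;
fileFormat.mBitsPerChannel   = 16;
fileFormat.mChannelsPerFrame = nChannels;
fileFormat.mFramesPerPacket  = 1;
fileFormat.mBytesPerFrame    = 2 * nChannels;
fileFormat.mBytesPerPacket   = 2 * nChannels;

ExtAudioFileRef wavfile = NULL;
OSStatus err = ExtAudioFileCreateWithURL((CFURLRef)fileURL, kAudioFileWAVEType,
                                         &fileFormat, NULL,
                                         kAudioFileFlags_EraseFile, &wavfile);
if ( !err )
    err = ExtAudioFileSetProperty(wavfile, kExtAudioFileProperty_ClientDataFormat,
                                  sizeof(clientFormat), &clientFormat);
if ( !err )   // prime the async write mechanism before the first real write
    err = ExtAudioFileWriteAsync(wavfile, 0, NULL);
===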
I'm not starting or stopping the AUGraph; I don't know whether that is actually required.
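For completeness, the graph does get opened and initialized before the loop runs, roughly along the lines of this minimal sketch (graph construction and AUGraphConnectNodeInput calls elided, variable names illustrative):

===
// sketch: minimum bring-up for pulling the graph offline
AUGraph   graph;        // built elsewhere with NewAUGraph / node setup
AUNode    outputNode;   // the GenericOutput node
AudioUnit outputUnit;

OSStatus err = AUGraphOpen(graph);            // instantiates the AudioUnits
if ( !err ) err = AUGraphInitialize(graph);   // initializes them
if ( !err ) err = AUGraphGetNodeInfo(graph, outputNode,
                                     NULL, NULL, NULL, &outputUnit);
// no AUGraphStart: AudioUnitRender() on outputUnit pulls the chain directly
===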
I'm running the AUGraph manually using this code:
===
int FRAMES = 512;

// render settings
AudioTimeStamp timestamp;
timestamp.mSampleTime = 0;
timestamp.mFlags      = kAudioTimeStampSampleTimeValid;

AudioBufferList bufferList;
bufferList.mNumberBuffers              = 1;
bufferList.mBuffers[0].mNumberChannels = nChannels;
bufferList.mBuffers[0].mDataByteSize   = FRAMES * nChannels * sizeof(float);

// processing loop
do {
    // output unit provides its own buffer
    bufferList.mBuffers[0].mData = NULL;

    // pull in some audio data from the generic output
    AudioUnitRenderActionFlags flags = 0;
    AudioUnitRender(outputUnit,
                    &flags,
                    &timestamp,
                    0,
                    FRAMES,
                    &bufferList);

    timestamp.mSampleTime += FRAMES;
} while ( timestamp.mSampleTime < totalFrames );
===
I've noticed that the signal never goes to 0 when I'm recording mp3 files, but when I'm recording wav files it happens regularly. Once the signal goes to 0 it stays there for the rest of the track, and the 0s are coming out of the AudioFilePlayerAU.
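(For reference, an all-zero buffer can be spotted inside the render loop itself; a sketch of such a check, using the same loop variables as above:)

===
// sketch: check the render status and look for an all-zero buffer
// coming back from the generic output
OSStatus err = AudioUnitRender(outputUnit, &flags, &timestamp,
                               0, FRAMES, &bufferList);
if ( err ) {
    NSLog(@"AudioUnitRender failed (%ld)", (long)err);
}

BOOL silent = YES;
float *samples = (float *)bufferList.mBuffers[0].mData;
for ( UInt32 i = 0; i < FRAMES * nChannels; ++i ) {
    if ( samples[i] != 0.0f ) { silent = NO; break; }
}
if ( silent ) {
    NSLog(@"all-zero buffer at sample time %.0f", timestamp.mSampleTime);
}
===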
One theory I have is that things go wrong when I'm processing uncompressed -> uncompressed, in my case caf -> wav. I've measured the typical throughput of the processing loop:

caf -> wav : 12.0 MB/s
caf -> mp3 : 2.5 MB/s
I've seen some buffer properties for the file player in the header
files, but I've also read on this list that these are not in fact
implemented?
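(The properties I mean are, if I'm reading AudioUnitProperties.h correctly, kAudioUnitProperty_ScheduledFileBufferSizeFrames and kAudioUnitProperty_ScheduledFileNumberBuffers. Setting them would look roughly like the sketch below, with filePlayerUnit and the values purely illustrative:)

===
// sketch: the header-declared buffering properties for the file player;
// whether they are actually honored is exactly what I'm asking about
UInt32 bufferFrames = 32768;
AudioUnitSetProperty(filePlayerUnit,
                     kAudioUnitProperty_ScheduledFileBufferSizeFrames,
                     kAudioUnitScope_Global, 0,
                     &bufferFrames, sizeof(bufferFrames));

UInt32 numBuffers = 4;
AudioUnitSetProperty(filePlayerUnit,
                     kAudioUnitProperty_ScheduledFileNumberBuffers,
                     kAudioUnitScope_Global, 0,
                     &numBuffers, sizeof(numBuffers));
===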
Grateful for any insights,
patrick