Re: reading audiofile into floatbuffer
- Subject: Re: reading audiofile into floatbuffer
- From: Robert Grant <email@hidden>
- Date: Thu, 15 Jan 2004 08:28:06 -0500
Hi Stiwi,
I'll have a crack...
On Jan 15, 2004, at 8:05 AM, stiwi kirch wrote:
Thanks Philippe,
but I'm still stuck with AudioConverterFillComplexBuffer. How do I have
to set up an AudioBufferList for the AudioConverterFillComplexBuffer
call if my target AudioStreamBasicDescription looks like this:
The AudioBufferList has to be able to hold the data coming out of the
converter, which (as you say) is deinterleaved, so you'll need two
buffers in the list, as you correctly surmise.
// WaveTable::printStreamDescription
Sample Rate: 44100.000000
Format ID: lpcm
Format Flags: 2B
Bytes per Packet: 4
Frames per Packet: 1
Bytes per Frame: 4
Channels per Frame: 2
Bits per Channel: 32
Here's my description of the canonical internal format:
AudioStreamBasicDescription floatFormat;
floatFormat.mSampleRate = 44100;
floatFormat.mFormatID = kAudioFormatLinearPCM;
floatFormat.mFormatFlags = kLinearPCMFormatFlagIsBigEndian |
                           kLinearPCMFormatFlagIsNonInterleaved |
                           kLinearPCMFormatFlagIsPacked |
                           kLinearPCMFormatFlagIsFloat;
floatFormat.mBytesPerPacket = 4;
floatFormat.mFramesPerPacket = 1;
floatFormat.mBytesPerFrame = 4;
floatFormat.mChannelsPerFrame = 2;
floatFormat.mBitsPerChannel = 32;
Which is, I think, exactly what you have - if I'm decoding it right, your
Format Flags value of 2B is just those four flags ORed together
(IsFloat 0x1 | IsBigEndian 0x2 | IsPacked 0x8 | IsNonInterleaved 0x20 = 0x2B).
// WaveTable::readFileIntoMemory
BytesReturned: 233732
Packets: 58433
Do I need something like this?
AudioBufferList floatBuffer;
floatBuffer.mNumberBuffers = 2; // 2 because I need 2 mono float buffers?
Correct - one for the left channel, one for the right. The trouble is
you need to allocate space for the second buffer - the AudioBufferList
only gives you one AudioBuffer for free. The best way to do this is not
to fire one up on the stack but to malloc one:
AudioBufferList* floatBuffer = (AudioBufferList*)
    malloc(sizeof(AudioBufferList) + sizeof(AudioBuffer));
floatBuffer->mNumberBuffers = 2;
etc...
It's not pretty but it's what you need to do.
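If it helps, here's the same idea written out as a rough sketch for any
number of deinterleaved channels. The helper name AllocateABL is just
something made up for illustration, not an API call:

#include <stdlib.h>
#include <AudioToolbox/AudioToolbox.h>

// Allocate an AudioBufferList with room for numBuffers AudioBuffers.
// sizeof(AudioBufferList) already includes one AudioBuffer, so we only
// need extra space for the remaining (numBuffers - 1) entries.
static AudioBufferList* AllocateABL(UInt32 numBuffers, UInt32 bytesPerBuffer)
{
    AudioBufferList* abl = (AudioBufferList*)
        malloc(sizeof(AudioBufferList) + (numBuffers - 1) * sizeof(AudioBuffer));
    abl->mNumberBuffers = numBuffers;
    for (UInt32 i = 0; i < numBuffers; ++i) {
        abl->mBuffers[i].mNumberChannels = 1;              // one mono channel per buffer
        abl->mBuffers[i].mDataByteSize   = bytesPerBuffer; // how much room mData has
        abl->mBuffers[i].mData           = malloc(bytesPerBuffer);
    }
    return abl;
}

For your case that would be AllocateABL(2, 4096), which also covers the
next few questions below.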
floatBuffer.mBuffers[0].mData = malloc(?); // how can I calculate the memSize of one buffer?
This can be as large as you want. A 4K buffer would probably be good.
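Purely as an illustration (4096 is just that 4K suggestion, nothing
magic about it):

floatBuffer->mBuffers[0].mData = malloc(4096); // room for 1024 Float32 samples per channel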
floatBuffer.mBuffers[0].mNumberChannels = 1; // because it's one mono channel?
Yep
floatBuffer.mBuffers[0].mDataByteSize = ?; // some size I don't know how to calculate
This is where you let the converter know how big the buffer you
allocated just above is, so it won't overflow it.
floatBuffer.mBuffers[1].mData = malloc(?);
floatBuffer.mBuffers[1].mNumberChannels = 1;
floatBuffer.mBuffers[1].mDataByteSize = ?;
That's right - do the same again for the right channel.
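Once the list is set up, the converter call itself looks roughly like
the sketch below. This is untested, and the names FileReadState,
InputDataProc, converter and readState are stand-ins for whatever you
have in WaveTable - the only real API here is the
AudioConverterComplexInputDataProc callback shape and
AudioConverterFillComplexBuffer itself:

// State handed to the input callback: the raw (source-format) packets
// that readFileIntoMemory pulled in, plus a read cursor.
typedef struct {
    char*  srcData;            // raw bytes from the file
    UInt32 srcBytesPerPacket;  // bytes per packet of the *source* format
    UInt32 totalPackets;       // 58433 in your dump
    UInt32 nextPacket;         // read cursor
} FileReadState;

// Called by the converter whenever it wants more source packets.
static OSStatus InputDataProc(AudioConverterRef inConverter,
                              UInt32* ioNumberDataPackets,
                              AudioBufferList* ioData,
                              AudioStreamPacketDescription** outPacketDesc,
                              void* inUserData)
{
    FileReadState* state = (FileReadState*)inUserData;
    UInt32 packetsLeft = state->totalPackets - state->nextPacket;
    if (*ioNumberDataPackets > packetsLeft)
        *ioNumberDataPackets = packetsLeft; // 0 packets here tells the converter we're done

    ioData->mBuffers[0].mData = state->srcData
                              + state->nextPacket * state->srcBytesPerPacket;
    ioData->mBuffers[0].mDataByteSize = *ioNumberDataPackets * state->srcBytesPerPacket;
    state->nextPacket += *ioNumberDataPackets;
    return noErr;
}

// Then, with a converter created from the file's format to floatFormat,
// and floatBuffer allocated with two 4K buffers as above:
UInt32 outFrames = 4096 / sizeof(Float32); // 1024 frames fit in each channel buffer
OSStatus err = AudioConverterFillComplexBuffer(converter,
                                               InputDataProc,
                                               &readState,
                                               &outFrames,
                                               floatBuffer,
                                               NULL);
// On return outFrames is how many frames were actually produced, and
// floatBuffer->mBuffers[0] / [1] hold the left / right float samples.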
Robert.