Re: reading audiofile into floatbuffer
- Subject: Re: reading audiofile into floatbuffer
- From: stiwi kirch <email@hidden>
- Date: Fri, 16 Jan 2004 21:07:30 +0100
On 15.01.2004, at 14:28, Robert Grant wrote:
Hi Stiwi,
I'll have a crack...
On Jan 15, 2004, at 8:05 AM, stiwi kirch wrote:
Thanks Philippe,
but I'm still stuck with AudioConverterFillComplexBuffer. How do I have
to set up an AudioBufferList for the AudioConverterFillComplexBuffer
call if my target AudioStreamBasicDescription looks like this:
The AudioBufferList has to be able to hold the data that's coming out
of the converter, which, as you say, is deinterleaved channels. That
means you'll need two buffers in the list, as you correctly surmise.
// WaveTable::printStreamDescription
Sample Rate: 44100.000000
Format ID: lpcm
Format Flags: 2B
Bytes per Packet: 4
Frames per Packet: 1
Bytes per Frame: 4
Channels per Frame: 2
Bits per Channel: 32
Here's my description of the canonical internal format:
AudioStreamBasicDescription floatFormat;
floatFormat.mSampleRate = 44100;
floatFormat.mFormatID = kAudioFormatLinearPCM;
floatFormat.mFormatFlags = kLinearPCMFormatFlagIsBigEndian |
kLinearPCMFormatFlagIsNonInterleaved |
kLinearPCMFormatFlagIsPacked |
kLinearPCMFormatFlagIsFloat;
floatFormat.mBytesPerPacket = 4;
floatFormat.mFramesPerPacket = 1;
floatFormat.mBytesPerFrame = 4;
floatFormat.mChannelsPerFrame = 2;
floatFormat.mBitsPerChannel = 32;
Which is I think what you have.
// WaveTable::readFileIntoMemory
BytesReturned: 233732
Packets: 58433
Do I need something like this?
AudioBufferList floatBuffer;
floatBuffer.mNumberBuffers = 2; // 2 because I need 2 mono float buffers?
Correct - one for the left channel, one for the right. The trouble is
you need to allocate space for the second buffer - the AudioBufferList
only gives you one AudioBuffer for free. The best way to do this is
not to fire one up on the stack but to malloc one:
AudioBufferList* floatBuffer = (AudioBufferList*)
    malloc(sizeof(AudioBufferList) + sizeof(AudioBuffer));
floatBuffer->mNumberBuffers = 2;
etc...
It's not pretty but it's what you need to do.
floatBuffer.mBuffers[0].mData = malloc(?); // how can I calculate the
memory size of one buffer?
This can be as large as you want. A 4K buffer would probably be good.
That's exactly the part I don't understand! :( If I allocate a 4K
buffer, then it can only be used as a temp buffer for the
AudioConverter to fill with 4K chunks of converted data. So I have to
call AudioConverterFillComplexBuffer repeatedly until all the data is
converted, and AudioConverterFillComplexBuffer is using the same buffer
over and over again. Right? But I need the converted data to be in one
or two (depending on the number of channels) float buffers to be useful
as a wavetable. Does that mean that I have to copy the 4K chunks after
each AudioConverterFillComplexBuffer call from my 4K buffer to my
wavetable buffer? If so, how do I calculate the final size of all the
converted data, so I can allocate enough space for my wavetable buffer?
floatBuffer.mBuffers[0].mNumberChannels = 1; // because it's one mono
channel?
Yep
floatBuffer.mBuffers[0].mDataByteSize = ?; // some size I don't know
how to calculate
This is where you let the converter know how big the buffer you
allocated just above is - so it won't overflow.
floatBuffer.mBuffers[1].mData = malloc(?);
floatBuffer.mBuffers[1].mNumberChannels = 1;
floatBuffer.mBuffers[1].mDataByteSize = ?;
That's right - do the same again for the right channel.
Robert.
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.