Re: reading audiofile into floatbuffer


  • Subject: Re: reading audiofile into floatbuffer
  • From: stiwi <email@hidden>
  • Date: Tue, 20 Jan 2004 14:19:19 +0100

Hi again.

There is one (hopefully) last question about the AudioConverter: the
buffer I am allocating for the AudioConverter output ends up exactly
four times the size of the actual audio data.

For example:

AudioFileReadPackets(fileID, false, &bytes, NULL, 0, &packets,
    entireFileBuffer)

returns with bytes = 657530 and packets = 328765. So I allocate a
buffer sized for that many packets, right?

UInt32 packetSize = packets;
AUBufferList auBuffer;
AudioBufferList *table;   // PrepareBuffer returns a reference, so keep a pointer

auBuffer.Allocate(floatASBD, packets);
table = &auBuffer.PrepareBuffer(floatASBD, packets);

AudioConverterFillComplexBuffer(converter, ACInputProc, this,
    &packetSize, &auBuffer.GetBufferList(), NULL);

UInt32 numBuffers = table->mNumberBuffers;
UInt32 waveSize = table->mBuffers[0].mDataByteSize;

After this, packetSize = 328765, numBuffers = 2, and waveSize = 1315060.
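
If I am reading floatASBD correctly (one 4-byte Float32 per frame in each
of two non-interleaved channel buffers, which is my guess at what the 2B
flags mean), those numbers would relate like this:

waveSize   = packetSize * bytes per frame   // 328765 * 4 = 1315060 bytes in each buffer
numBuffers = 2                              // one buffer per channel when non-interleaved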

Now if I try to do something with my table data like this:

int waveSize = table->mBuffers[0].mDataByteSize;

float* outputL = (float*)(table->mBuffers[0].mData);
float* outputR = (float*)(table->mBuffers[1].mData);

for (int sample = 0; sample < waveSize; sample++)
{
    doSomethingWith(outputL[sample]);
    doSomethingWith(outputR[sample]);
}

the data I get only sounds right for the first waveSize / 4 samples
(1315060 / 4 = 328765, which equals packetSize). Hmm?
I tried about 20 different sound files of different types and sizes
(all stereo), and it's always the same.
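
Just to make the /4 concrete: if mDataByteSize is a byte count rather than
a sample count, the per-channel sample count would be waveSize / sizeof(float),
and the loop would look something like this (a guess on my part, not a
confirmed fix):

UInt32 sampleCount = table->mBuffers[0].mDataByteSize / sizeof(float);

for (UInt32 sample = 0; sample < sampleCount; sample++)
{
    doSomethingWith(outputL[sample]);
    doSomethingWith(outputR[sample]);
}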

Doesn't auBuffer.Allocate(floatASBD, packets) allocate exactly the
memory it needs to hold the data described by floatASBD?

floatASBD looks like this:

Sample Rate: 44100.000000
Format ID: lpcm
Format Flags: 2B
Bytes per Packet: 4
Frames per Packet: 1
Bytes per Frame: 4
Channels per Frame: 2
Bits per Channel: 32
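
For completeness, here is a sketch of what an ASBD with those values looks
like in code (the flag names are my reading of the 2B value: float,
big-endian, packed, non-interleaved):

AudioStreamBasicDescription floatASBD;
floatASBD.mSampleRate       = 44100.0;
floatASBD.mFormatID         = kAudioFormatLinearPCM;
floatASBD.mFormatFlags      = kAudioFormatFlagIsFloat | kAudioFormatFlagIsBigEndian
                            | kAudioFormatFlagIsPacked | kAudioFormatFlagIsNonInterleaved;
floatASBD.mBytesPerPacket   = 4;   // one Float32 per packet per channel
floatASBD.mFramesPerPacket  = 1;
floatASBD.mBytesPerFrame    = 4;
floatASBD.mChannelsPerFrame = 2;
floatASBD.mBitsPerChannel   = 32;
floatASBD.mReserved         = 0;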

What am I doing wrong?

Greetings,

stiwi
