Hello,
I am trying to convert PCM audio to iLBC audio using AudioConverterFillComplexBuffer.
For this I have a sourceFormat and a destinationFormat to create my AudioConverterRef:
AudioStreamBasicDescription sourceFormat = {0};
sourceFormat.mSampleRate       = 44100;
sourceFormat.mFormatID         = kAudioFormatLinearPCM;
sourceFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
sourceFormat.mFramesPerPacket  = 1;
sourceFormat.mChannelsPerFrame = 1;
sourceFormat.mBitsPerChannel   = 16;
sourceFormat.mBytesPerPacket   = 2;
sourceFormat.mBytesPerFrame    = 2;
AudioStreamBasicDescription destinationFormat = {0};
destinationFormat.mFormatID         = kAudioFormatiLBC;
destinationFormat.mFormatFlags      = 0;
destinationFormat.mSampleRate       = 8000;
destinationFormat.mChannelsPerFrame = 1;
// 160 frames per packet = 20 ms mode
destinationFormat.mFramesPerPacket  = 160;
destinationFormat.mBytesPerPacket   = 38;
OSStatus status = AudioConverterNew(&sourceFormat, &destinationFormat, &audioConverter);
CAShow() logs the following:
PCMConverter2 0x13fe73590
  Input:  1 ch, 44100 Hz, 'lpcm' (0x0000000C) 16-bit little-endian signed integer
  Output: 1 ch, 44100 Hz, 'lpcm' (0x00000009) 32-bit little-endian float
SampleRateConverter 0x13fe60710
  Input:  1 ch, 44100 Hz, 'lpcm' (0x00000009) 32-bit little-endian float
  Output: 1 ch, 8000 Hz, 'lpcm' (0x00000009) 32-bit little-endian float
  Algorithm 'norm', quality 96, Resampler2Wrapper @ 0x13fe511b0
PCMConverter2 0x13fe3b4a0
  Input:  1 ch, 8000 Hz, 'lpcm' (0x00000009) 32-bit little-endian float
  Output: 1 ch, 8000 Hz, 'lpcm' (0x0000000C) 16-bit little-endian signed integer
CodecConverter 0x13fe33180
  Input:  1 ch, 8000 Hz, 'lpcm' (0x0000000C) 16-bit little-endian signed integer
  Output: 1 ch, 8000 Hz, 'ilbc' (0x00000000) 0 bits/channel, 38 bytes/packet, 160 frames/packet, 0 bytes/frame
  codec: 'aenc'/'ilbc'/'appl'
  Input layout tag: 0x640001
  Output layout tag: 0x640001
Here is my first question: why are the bits per channel and the bytes per frame of my resulting output both 0?
The OSStatus returned when creating the AudioConverterRef is noErr, so everything seems to be fine so far.
Now I want to call the AudioConverterFillComplexBuffer. For this I need an outOutputData which I create like this:
AudioBufferList outOutputData;
outOutputData.mNumberBuffers = 1;
outOutputData.mBuffers[0].mNumberChannels = 1;
outOutputData.mBuffers[0].mDataByteSize = 38;
outOutputData.mBuffers[0].mData = malloc(38);
The mDataByteSize is 38 because I assume it should hold the 38 bytes/packet of the 20 ms mode reported by the AudioConverter. Is this correct? And is my UInt32 ioOutputDataPacketSize = 38; correct as well?
My user data is a circular buffer void *userData = &circularbuffer;
This is what my dataProc looks like:
static OSStatus myAudioConverterComplexInputDataProc(AudioConverterRef inAudioConverter,
                                                     UInt32 *ioNumberDataPackets,
                                                     AudioBufferList *ioData,
                                                     AudioStreamPacketDescription **outDataPacketDescription,
                                                     void *inUserData)
{
    TPCircularBuffer *sendBuffer = (TPCircularBuffer *)inUserData;

    int32_t availableBytes;
    SInt16 *audiobuffer = TPCircularBufferTail(sendBuffer, &availableBytes);

    availableBytes = 1024; // min(availableBytes, 1024);
    ioData->mBuffers[0].mData = malloc(availableBytes);
    memcpy(ioData->mBuffers[0].mData, audiobuffer, availableBytes);
    ioData->mBuffers[0].mDataByteSize = availableBytes;

    TPCircularBufferConsume(sendBuffer, availableBytes);
    return noErr;
}
So how many bytes should I provide here? 1024? How is this defined or calculated?
So, all in all, I call AudioConverterFillComplexBuffer with the parameters above:
OSStatus status = AudioConverterFillComplexBuffer(audioConverter, myAudioConverterComplexInputDataProc, userData, &ioOutputDataPacketSize, &outOutputData, NULL);
This results in the error 'insz', which is kAudioConverterErr_InvalidInputSize.
Do you have any idea what is wrong with this code?
Thanks a lot and have a nice day! Frederik