Playing ADPCM IMA4 data stream on iOS
- Subject: Playing ADPCM IMA4 data stream on iOS
- From: John McKerrell <email@hidden>
- Date: Thu, 24 Feb 2011 13:47:32 +0000
Hi all
I've recently been implementing an ASF stream parser to use in an iPhone app. The app works with a specific brand of webcam, so the stream is quite consistent. The ASF parsing is pretty much complete and I am able to view the video stream (MJPG, so pretty simple), but I'm having issues with the audio.
From my ASF parser I get an NSData object every time it receives a complete audio data packet. I was hoping to "simply" pass this to iOS and have it play, but that part has so far confounded me.
Mplayer reports the following information about the audio when I use it to view the stream:
==========================================================================
Opening audio decoder: [ffmpeg] FFmpeg/libavcodec audio decoders
AUDIO: 8000 Hz, 1 ch, s16le, 32.0 kbit/25.00% (ratio: 4000->16000)
Selected audio codec: [ffadpcmimawav] afm: ffmpeg (FFmpeg WAV IMA ADPCM audio)
==========================================================================
AO: [coreaudio] 8000Hz 1ch s16le (2 bytes per sample)
The ASF "Audio Media Type" object I'm getting back describing the audio stream contains:
$1 = {
codecID = 17, // this appears to be WAVE_FORMAT_DVI_ADPCM also known as WAVE_FORMAT_IMA_ADPCM
numberOfChannels = 1,
samplesPerSecond = 8000,
averageBytesPerSecond = 4000,
blockAlignment = 164,
bitsPerSample = 4,
codecSpecificDataSize = 2
}
FWIW, each of my chunks of data is 656 bytes long, i.e. exactly four 164-byte blocks.
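As a sanity check on those numbers: for WAVE_FORMAT_IMA_ADPCM each 164-byte mono block should be a 4-byte header (initial predictor plus step index) followed by 160 bytes of packed 4-bit samples, and the arithmetic works out (my own working, not from any SDK):

/* Sanity-check the IMA/DVI ADPCM figures from the ASF Audio Media Type. */
#include <stdio.h>

int main(void)
{
    const int blockAlign = 164;   /* blockAlignment from the stream header */
    const int channels = 1;
    const int sampleRate = 8000;
    const int bitsPerSample = 4;

    /* Standard IMA ADPCM block layout: 4-byte header per channel,
       then packed 4-bit samples; the header carries the first sample. */
    int samplesPerBlock = ((blockAlign - 4 * channels) * 8) / (bitsPerSample * channels) + 1;
    int compressedBytesPerSec = sampleRate * bitsPerSample / 8;

    printf("samples per block: %d\n", samplesPerBlock);          /* 321 */
    printf("compressed bytes/sec: %d\n", compressedBytesPerSec); /* 4000 */
    printf("blocks per 656-byte chunk: %d\n", 656 / blockAlign); /* 4 */
    return 0;
}

That gives 4000 compressed bytes per second, which matches both averageBytesPerSecond above and mplayer's 32.0 kbit figure.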
I was hoping to use AudioQueue to play this and have set up an AudioStreamBasicDescription as follows:
AudioStreamBasicDescription asbd;
asbd.mSampleRate = 8000;
asbd.mFormatID = kAudioFormatAppleIMA4;
asbd.mFormatFlags = 0;
asbd.mBytesPerPacket = 34;  // Apple IMA4: fixed 34-byte packet per channel
                            // (2 bytes of predictor/step state + 32 bytes of 4-bit samples)
asbd.mFramesPerPacket = 64; // each packet decodes to 64 sample frames
asbd.mBytesPerFrame = 0;    // 0 for compressed formats
asbd.mChannelsPerFrame = 1;
asbd.mBitsPerChannel = 0;   // 0 for compressed formats
asbd.mReserved = 0;
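As a cross-check, Core Audio can fill in the canonical packet sizes itself given just the format ID, sample rate and channel count, via kAudioFormatProperty_FormatInfo; a minimal sketch (the function name is mine):

#include <stdio.h>
#include <AudioToolbox/AudioToolbox.h>

/* Fill in an ASBD for Apple IMA4 by letting Core Audio supply the
   packet framing: set the fields you know, zero the rest, then ask
   for kAudioFormatProperty_FormatInfo. */
AudioStreamBasicDescription fillOutIMA4Description(void)
{
    AudioStreamBasicDescription asbd = {0};
    asbd.mSampleRate = 8000;
    asbd.mFormatID = kAudioFormatAppleIMA4;
    asbd.mChannelsPerFrame = 1;

    UInt32 size = sizeof(asbd);
    OSStatus err = AudioFormatGetProperty(kAudioFormatProperty_FormatInfo,
                                          0, NULL, &size, &asbd);
    if (err != noErr)
        fprintf(stderr, "kAudioFormatProperty_FormatInfo failed: %d\n", (int)err);
    /* For mono Apple IMA4 this should come back with
       mFramesPerPacket == 64 and mBytesPerPacket == 34. */
    return asbd;
}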
I'm now getting no errors, but I'm also not getting any sound, so I'd appreciate any pointers you can offer. Am I doing this the correct way? Should I be playing the audio using something else? Have I posted this to the wrong mailing list?
Hope you can help. I've pasted all the audio-playing code in the gist below; FYI, self.audioBuffer is simply an NSMutableArray containing a buffer of up to 5 data packets, and a simplified sketch of the playback callback follows the link.
https://gist.github.com/842162
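In outline, the callback does something like this (a simplified sketch, not the verbatim gist; the class and method names here are placeholders):

#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>
#include <string.h>

@interface MyAudioPlayer : NSObject // placeholder for my actual class
- (NSData *)nextPacket;             // pops the oldest NSData off self.audioBuffer
@end

// Pop the oldest parsed packet, copy it into the queue buffer, enqueue.
static void AudioQueueOutputCB(void *inUserData,
                               AudioQueueRef inQueue,
                               AudioQueueBufferRef inBuffer)
{
    MyAudioPlayer *player = (MyAudioPlayer *)inUserData;
    NSData *packet = [player nextPacket];
    if (packet == nil || [packet length] > inBuffer->mAudioDataBytesCapacity) {
        // Underrun (or an oversized packet): the real code re-enqueues
        // this buffer once more data has been parsed.
        return;
    }
    memcpy(inBuffer->mAudioData, [packet bytes], [packet length]);
    inBuffer->mAudioDataByteSize = (UInt32)[packet length];
    AudioQueueEnqueueBuffer(inQueue, inBuffer, 0, NULL);
}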
Thanks
John
P.S. I posted this yesterday and it seemed to get held for moderation due to size, but it then didn't show up in the archives and I didn't receive anything telling me it had been rejected. Apologies if you've received it twice.