Re: How can I use a WAV based DSP library with AudioQueue?



  • Subject: Re: How can I use a WAV based DSP library with AudioQueue?
  • From: William Stewart <email@hidden>
  • Date: Tue, 9 Dec 2008 17:51:43 -0800

There are two basic concepts to sort out here:

(1) File formats
Files are containers of data, and they have metadata describing the format of the data they contain. Examples of file formats are WAV, CAF, and AIFF.


(2) Audio data formats
This is the "actual audio data" - in Core Audio it is always described by an AudioStreamBasicDescription.


Linear PCM, AAC, and MP3 are data formats. There are also .mp3 and .aac files; these are very "simple" file formats that are essentially just the raw MP3 or AAC data laid out in the file. Linear PCM can come in various flavours - different bit depths, different endian orders, etc. Some files restrict which flavours of linear PCM they can contain. CAF files can contain any known audio data format, including MP3, AAC, and the various flavours of linear PCM.

AudioFile is an API that reads and writes audio data to/from an audio file. It does NOT transform the audio data in any way. So, if you are dealing with an .mp3 file, your usage of the AudioFile API will be to read/write MP3 data.

So, to do what you described, you would need to:

1. Use AudioFile to read the MP3 packets.
2. Use AudioQueue (offline render) to decode those packets (convert them into linear PCM).
3. Process the linear PCM using your DSP library.
4. Use (probably) AudioQueue again to output your processed linear PCM data.



Now, this is a lot of work, but what your application is for will determine which parts of this chain you can modify or adapt.


Hopefully, that gives you a basic orientation to help you get started.

Bill

On Dec 9, 2008, at 1:06 PM, Andrew E. Davidson wrote:

Hi Tahome

Sorry, I am really new to audio programming. I think I have some
misunderstanding about linear PCM, WAV, CAF, and AIFF.

The library I want to use expects the buffer to be a linear PCM
representation. It can work with 6- or 16-bit int or floating point, big-endian
or little-endian. It comes with a sample program that reads a WAV file. The
sample reads the header section and then just loads the bytes of the file
into a buffer that is then processed by the library. It calls fread().


CoreAudioOverview.pdf says there are several different linear PCM variants.

So my question is: I start with an audio file and call AudioFileReadPackets()
to load the buffer into memory. What format will the buffer be in? For
example, if I started with a file in MP3 format, I assume the buffer will be
in linear PCM - but which variant?


CoreAudioOverview.pdf mentions using Audio Converter Services to go
between the variants but does not provide a pointer to the documentation! Any
idea where I can look for more info?



Eventually I will get back a buffer from the audio library. I will want to
call AudioQueueEnqueueBuffer() to cause it to be played. Do I need to do
some sort of conversion on the buffer first? If so, what format do I need
to convert to?



Thanks

Andy

________________________________

-----Original Message-----
From: tahome izwah [mailto:email@hidden]
Sent: Tuesday, December 09, 2008 12:01 PM
To: Andrew E. Davidson
Cc: email@hidden
Subject: Re: How can I use a WAV based DSP library with AudioQueue?

I think you are confusing different terms here: "WAV" (MS-WAVE) is a
file format for storing PCM (and compressed formats like IMA ADPCM) in
a file that has a specific format.

Is your "WAV based DSP library" really using MS-WAVE files to apply
some kind of DSP processing to them?

Or are you in fact talking about using DSP algorithms on a linear PCM
representation of an audio signal? If so, what data format does that
library expect? IEEE754 float? 16bit signed ints? You can convert MP3
content to these data types using the CoreAudio API.

--th



2008/12/9 Andrew E. Davidson <email@hidden>:
Hi



I want to modify the iPhone "Speak Hear" sample application to use a WAV
based DSP library.




I figured out how to play audio files in just about any format using
AudioFileReadPackets() and AudioQueueEnqueueBuffer(). However, I am not sure
what format the buffer returned by AudioFileReadPackets() will be in, or what
format the buffer needs to be in before I can call
AudioQueueEnqueueBuffer().




At a high level I want to do something like:



1) read buffer from mp3

2) convert to wav format

3) process buffer using WAV based DSP library

4) play buffer



I think WAV is a linear PCM format. Will AudioFileReadPackets() return the
buffer in a format I can use with my WAV library, or do I need to run some
sort of conversion / codec? Can someone point me at the library for
conversions?




I assume that if I need to convert the format between steps 2) and 3), I will
need to perform another conversion from WAV to ??? between steps 3) and 4).
Is this correct?



I expect my application will get more complicated in the future. Should I be
using AudioQueues or something else? The audio documentation talks about
audio graphs; however, it's not clear to me whether they are available on iPhone
or whether they will make my job easier.




Thanks



Andy





_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden

References: 
 >How can I use a WAV based DSP library with AudioQueue? (From: "Andrew E. Davidson" <email@hidden>)
 >Re: How can I use a WAV based DSP library with AudioQueue? (From: "tahome izwah" <email@hidden>)
 >RE: How can I use a WAV based DSP library with AudioQueue? (From: "Andrew E. Davidson" <email@hidden>)
