RE: How can I use a WAV based DSP library with AudioQueue?
- Subject: RE: How can I use a WAV based DSP library with AudioQueue?
- From: "Andrew E. Davidson" <email@hidden>
- Date: Tue, 9 Dec 2008 13:06:16 -0800
Hi Tahome
Sorry, I am really new to audio programming. I think I have some
misunderstanding about linear PCM, WAV, CAF, and AIFF.
The library I want to use expects the buffer to be a linear PCM
representation. It can work with 6- or 16-bit int, or floating point, big- or
little-endian. It comes with a sample program that reads a WAV file. The
sample reads the header section and then just loads the bytes of the file
into a buffer that is then processed by the library. It calls fread().
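Since the sample program just fread()s the bytes that follow the header, the header itself is what tells you which linear PCM variant the data is in. Below is a minimal, portable sketch of parsing the canonical 44-byte PCM WAV header; the helper names are hypothetical, and real WAV files can carry extra chunks (LIST, fact, ...) before "data" that this does not handle:

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Fields recovered from a canonical 44-byte PCM WAV header. */
typedef struct {
    uint16_t format;         /* 1 = integer PCM, 3 = IEEE float */
    uint16_t channels;
    uint32_t sample_rate;
    uint16_t bits_per_sample;
    uint32_t data_bytes;     /* size of the sample data that follows */
} WavInfo;

/* WAV headers are little-endian regardless of host byte order. */
static uint16_t rd16(const uint8_t *p) { return (uint16_t)(p[0] | p[1] << 8); }
static uint32_t rd32(const uint8_t *p) {
    return (uint32_t)p[0] | (uint32_t)p[1] << 8 |
           (uint32_t)p[2] << 16 | (uint32_t)p[3] << 24;
}

/* Returns 0 on success, -1 if buf is not a canonical PCM WAV header. */
int parse_wav_header(const uint8_t *buf, size_t len, WavInfo *out) {
    if (len < 44 || memcmp(buf, "RIFF", 4) || memcmp(buf + 8, "WAVE", 4) ||
        memcmp(buf + 12, "fmt ", 4) || memcmp(buf + 36, "data", 4))
        return -1;
    out->format          = rd16(buf + 20);
    out->channels        = rd16(buf + 22);
    out->sample_rate     = rd32(buf + 24);
    out->bits_per_sample = rd16(buf + 34);
    out->data_bytes      = rd32(buf + 40);
    return 0;
}
```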
CoreAudioOverview.pdf says there are several different linear PCM variants.
So my question is: I start with an audio file and call AudioFileReadPackets()
to load the buffer into memory. What format will the buffer be in? For
example, if I started with a file in MP3 format, I assume the buffer will be
in linear PCM, but which variant?
CoreAudioOverview.pdf mentions using 'Audio Converter Services' to go
between the variants but does not provide a pointer to the documentation! Any
idea where I can look for more info?
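[Editor's note: Audio Converter Services is declared in <AudioToolbox/AudioConverter.h> (AudioConverterNew(), AudioConverterFillComplexBuffer()). To make the "variants" concrete, here is a hand-rolled sketch of one such conversion, 16-bit signed int to IEEE 754 float and back; on the device the converter API would do this (and far more, e.g. sample-rate conversion) for you:]

```c
#include <stdint.h>
#include <stddef.h>

/* Convert 16-bit signed PCM samples to IEEE 754 float in [-1.0, 1.0).
   Scaling by 2^15 = 32768 maps the full int16 range onto unity. */
void pcm16_to_float(const int16_t *in, float *out, size_t n) {
    for (size_t i = 0; i < n; i++)
        out[i] = (float)in[i] / 32768.0f;
}

/* Convert float samples back to 16-bit signed PCM, clamping any
   out-of-range values instead of letting the cast overflow. */
void float_to_pcm16(const float *in, int16_t *out, size_t n) {
    for (size_t i = 0; i < n; i++) {
        float s = in[i] * 32768.0f;
        if (s > 32767.0f)  s = 32767.0f;
        if (s < -32768.0f) s = -32768.0f;
        out[i] = (int16_t)s;
    }
}
```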
Eventually I will get back a buffer from the audio library. I will want to
call AudioQueueEnqueueBuffer() to cause it to be played. Do I need to do
some sort of conversion on the buffer first? If so, what format do I need to
convert to?
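[Editor's note: whatever you enqueue has to match the AudioStreamBasicDescription the queue was created with. Below is a sketch of describing plain interleaved 16-bit linear PCM; the struct is redeclared locally so the sketch is self-contained, whereas real code would include <CoreAudio/CoreAudioTypes.h> and also set mFormatFlags, e.g. kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked:]

```c
#include <stdint.h>

/* Local mirror of CoreAudio's AudioStreamBasicDescription so this
   sketch compiles anywhere; on the device use the real struct from
   <CoreAudio/CoreAudioTypes.h>. */
typedef struct {
    double   mSampleRate;
    uint32_t mFormatID;
    uint32_t mFormatFlags;
    uint32_t mBytesPerPacket;
    uint32_t mFramesPerPacket;
    uint32_t mBytesPerFrame;
    uint32_t mChannelsPerFrame;
    uint32_t mBitsPerChannel;
    uint32_t mReserved;
} ASBD;

/* Describe interleaved 16-bit signed linear PCM, the kind of format
   an output AudioQueue can play directly. */
ASBD make_lpcm16(double rate, uint32_t channels) {
    ASBD d = {0};
    d.mSampleRate       = rate;
    /* FourCC 'lpcm' = kAudioFormatLinearPCM */
    d.mFormatID         = ((uint32_t)'l' << 24) | ((uint32_t)'p' << 16) |
                          ((uint32_t)'c' << 8)  |  (uint32_t)'m';
    d.mBitsPerChannel   = 16;
    d.mChannelsPerFrame = channels;
    d.mBytesPerFrame    = channels * 2;   /* 2 bytes per 16-bit sample */
    d.mFramesPerPacket  = 1;              /* always 1 for linear PCM */
    d.mBytesPerPacket   = d.mBytesPerFrame;
    return d;
}
```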
Thanks
Andy
________________________________
-----Original Message-----
From: tahome izwah [mailto:email@hidden]
Sent: Tuesday, December 09, 2008 12:01 PM
To: Andrew E. Davidson
Cc: email@hidden
Subject: Re: How can I use a WAV based DSP library with AudioQueue?
I think you are confusing different terms here: "WAV" (MS-WAVE) is a
container file format for storing PCM (and compressed formats like IMA
ADPCM) audio data.
Is your "WAV based DSP library" really using MS-WAVE files to apply
some kind of DSP processing to them?
Or are you in fact talking about using DSP algorithms on a linear PCM
representation of an audio signal? If so, what data format does that
library expect? IEEE 754 float? 16-bit signed ints? You can convert MP3
content to these data types using the CoreAudio API.
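[Editor's note: one more variant to watch is byte order. WAV sample data is little-endian, as is the iPhone's ARM CPU, so if the library wants big-endian 16-bit samples a byte swap may be needed; a minimal sketch:]

```c
#include <stdint.h>
#include <stddef.h>

/* Byte-swap 16-bit samples in place, converting between big- and
   little-endian representations of the same linear PCM data. */
void swap16_buffer(uint16_t *samples, size_t n) {
    for (size_t i = 0; i < n; i++)
        samples[i] = (uint16_t)((samples[i] << 8) | (samples[i] >> 8));
}
```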
--th
2008/12/9 Andrew E. Davidson <email@hidden>:
> Hi
>
>
>
> I want to modify the iPhone "Speak Hear" sample application to use a WAV
> based DSP library.
>
>
>
> I figured out how to play audio files in just about any format using
> AudioFileReadPackets() and AudioQueueEnqueueBuffer(). However, I am not
> sure what format the buffer returned by AudioFileReadPackets() will be in,
> or what format the buffer needs to be in before I can call
> AudioQueueEnqueueBuffer().
>
>
>
> At a high level I want to do something like:
>
>
>
> 1) read buffer from mp3
>
> 2) convert to wav format
>
> 3) process buffer using WAV based DSP library
>
> 4) play buffer
>
>
>
> I think WAV is a linear PCM format. Will AudioFileReadPackets() return
> the buffer in a format I can use with my WAV library, or do I need to run
> some sort of conversion / codec? Can someone point me at the library for
> conversions?
>
>
>
> I assume that if I need to convert the format between steps 2) and 3), I
> will need to perform another conversion from WAV to ??? between steps 3)
> and 4). Is this correct?
>
>
>
> I expect my application will get more complicated in the future. Should I
> be using AudioQueues or something else? The audio documentation talks
> about audio graphs; however, it's not clear to me whether they are
> available on iPhone or whether they will make my job easier.
>
>
>
> Thanks
>
>
>
> Andy
>
>
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)