
Re: vDSP and iOS render callback


  • Subject: Re: vDSP and iOS render callback
  • From: Gregory Wieber <email@hidden>
  • Date: Wed, 25 Apr 2012 14:33:54 -0700

Worth noting that many of the newer audio unit processors on iOS expect float. I switched all of my projects over to floating point more than a year ago and have seen tremendous speed-ups using vDSP. The ARM chips have had hardware float support for a while now, but most of the tutorials and docs around the web were written when that wasn't the case.

Be aware that converting back and forth yourself (by dividing, typecasting, or whatever) can be slow, and that it's better to let Apple's code do it for you where possible (ExtAudioFile, converter audio units, etc.).

On Tue, Apr 24, 2012 at 8:23 PM, Brian Willoughby <email@hidden> wrote:

On Apr 24, 2012, at 18:25, Paul Davis wrote:
On Mon, Apr 23, 2012 at 4:16 PM, Kevin Dixon <email@hidden> wrote:
Maybe I'm missing something. I was looking at this question on Stack
Overflow (http://stackoverflow.com/questions/3398753/using-the-apple-fft-and-accelerate-framework/3534926#3534926).
I was under the impression that the render callback supplied sample
data in 8.24 fixed format, yet I don't see any sort of type conversion
going on in the example autocorrelation implementation when they call
the FFT.
Can anyone shed some light on this?

This is wrong. CoreAudio (like most audio SDKs) uses 32-bit floating
point as its only "native" data type. That absolutely includes the
render callback for an audio unit. Unless iOS is totally different, in
which case forgive my emphasis.


Hey, Paul, the second thing you said is true: iOS is totally different.

CoreAudio for OS X does default to Float32 unless the code specifically requests otherwise.

CoreAudio for iOS defaults to Q8.24, a fixed-point format that is more efficient on the older ARM processors without fast float support. In that world, I/O devices are 16-bit integer, and DSP runs at 8.24 for speed.

The confusing thing is that the ARM chips are now getting to where they support float as well as int, and so CoreAudio/iOS is moving towards Float32.

Brian Willoughby
Sound Consulting


_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list      (email@hidden)
Help/Unsubscribe/Update your Subscription:

This email sent to email@hidden


References: 
 >vDSP and iOS render callback (From: Kevin Dixon <email@hidden>)
 >Re: vDSP and iOS render callback (From: Paul Davis <email@hidden>)
 >Re: vDSP and iOS render callback (From: Brian Willoughby <email@hidden>)
