
Re: The structure of an iOS AudioUnit app


  • Subject: Re: The structure of an iOS AudioUnit app
  • From: Schell Scivally <email@hidden>
  • Date: Tue, 1 Feb 2011 00:43:48 -0800

Okay - thanks! So I'll be creating a separate graph that pulls from
one main callback...
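That hand-rolled approach — one main render callback that runs a chain of DSP stages — can be sketched in plain C. All names here (`DSPStage`, `gain_stage`, `clip_stage`, `render`) are illustrative, not part of any Apple API; in a real app the body of `render` would live inside the AURenderCallback registered on the RemoteIO unit:

```c
#include <stddef.h>

/* One DSP stage: processes a buffer of mono float samples in place.
   This mirrors what you would do inside a RemoteIO render callback. */
typedef void (*DSPStage)(float *buf, size_t n, void *state);

static void gain_stage(float *buf, size_t n, void *state) {
    float g = *(float *)state;
    for (size_t i = 0; i < n; i++) buf[i] *= g;
}

static void clip_stage(float *buf, size_t n, void *state) {
    (void)state;  /* stateless stage */
    for (size_t i = 0; i < n; i++) {
        if (buf[i] > 1.0f)  buf[i] = 1.0f;
        if (buf[i] < -1.0f) buf[i] = -1.0f;
    }
}

/* The "main render callback": run every stage in order over the buffer,
   exactly as the thread describes plugging DSP into one callback. */
static void render(float *buf, size_t n,
                   DSPStage *stages, void **states, size_t nstages) {
    for (size_t s = 0; s < nstages; s++)
        stages[s](buf, n, states[s]);
}
```

Adding a new "unit" to the chain is then just appending another function pointer and its state, which is the flexibility the custom-.component route would otherwise provide.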

On Tue, Feb 1, 2011 at 12:32 AM, tahome izwah <email@hidden> wrote:
> As far as I know (and please correct me if I'm wrong) it is not
> possible to write system-wide custom audio units on iOS (yet?). You
> can use the ones that Apple provides in your app, but if you want to
> do custom DSP you will need to plug your DSP code into the render
> callback.
>
> I'm not sure whether you can use Audio Units that live in your own
> app sandbox or app resource structure, but the App Store's terms of
> service seem to preclude that possibility.
>
> --th
>
> 2011/2/1 Schell Scivally <email@hidden>:
>> Hi all, this is my first post to this list - thanks in advance for
>> taking the time to read this.
>>
>> There's a gap in my very novice understanding of the structure of an
>> iOS AudioUnit hosting application. I've seen numerous example
>> applications that create an AUGraph, usually consisting of a RemoteIO
>> unit and one more unit (Converter or Mixer). The custom DSP is done in
>> a render callback attached to one of the two component descriptions,
>> more often the RemoteIO unit. What I'm lacking is an explanation of
>> how to write arbitrary Audio Units to chain together. Should I write
>> full-fledged Audio Units (a .component) like the TremoloUnit example
>> project? Can I use those in an iOS app? Will I have to roll my own
>> 'graph' that hooks into a main render callback?
>>
>> I think I'm confused because instantiating an Audio Unit is done in a
>> black box. I see the AudioComponentDescription being used to retrieve
>> a reference to an Audio Unit that lives in the graph, but I don't see
>> how I can get a reference to a custom Audio Unit that I've written
>> separately.
>>
>> --
>> Schell Scivally
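The "roll my own 'graph'" idea from the question above can be sketched as a pull model, which is how AUGraph itself works: when output asks for audio, each node first pulls from its upstream connection, then applies its own processing. This is a minimal plain-C sketch; `Node`, `pull`, and the process functions are hypothetical names, not Apple API:

```c
#include <stddef.h>

typedef struct Node Node;

/* Each node pulls from its input, then processes the buffer in place —
   the same pull model AUGraph uses when RemoteIO asks for samples. */
struct Node {
    Node *input;                                  /* upstream node, or NULL for a source */
    void (*process)(Node *self, float *buf, size_t n);
    float param;                                  /* e.g. source value or gain amount */
};

static void pull(Node *node, float *buf, size_t n) {
    if (node->input)
        pull(node->input, buf, n);                /* render upstream first */
    node->process(node, buf, n);                  /* then apply this node's DSP */
}

/* A source node fills the buffer with a constant value. */
static void source_process(Node *self, float *buf, size_t n) {
    for (size_t i = 0; i < n; i++) buf[i] = self->param;
}

/* A gain node scales whatever its input produced. */
static void gain_process(Node *self, float *buf, size_t n) {
    for (size_t i = 0; i < n; i++) buf[i] *= self->param;
}
```

Connecting nodes is just setting the `input` pointer, analogous to AUGraphConnectNodeInput; the chain is driven by calling `pull` on the final node from the one real render callback.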



--
Schell Scivally
email@hidden
http://blog.efnx.com
http://github.com/efnx
 _______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list      (email@hidden)
Help/Unsubscribe/Update your Subscription:

This email sent to email@hidden

  • Follow-Ups:
    • Re: The structure of an iOS AudioUnit app
      • From: Schell Scivally <email@hidden>
  • References:
    • The structure of an iOS AudioUnit app (From: Schell Scivally <email@hidden>)
    • Re: The structure of an iOS AudioUnit app (From: tahome izwah <email@hidden>)
