
Re: iPhone remoteIO audio unit question


  • Subject: Re: iPhone remoteIO audio unit question
  • From: Doug Wyatt <email@hidden>
  • Date: Mon, 2 Nov 2009 08:17:35 -0800


On Oct 30, 2009, at 17:07, Bruce Meagher wrote:

Hi all,

I have a question about the RemoteIO audio unit on the iPhone, and I'm hoping someone on the list can comment on whether what I'm doing is OK (or is just plain dumb!).

I've created a little Objective-C class that plays my PCM-based sound effects through a RemoteIO audio unit. I pass a buffer to a play method and the audio starts playing with very little delay (my render callback starts playing the buffer within one CurrentHardwareIOBufferDuration). Exactly what I needed.
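
For context, a setup like the one described usually looks roughly like the sketch below. RenderPCM and CreateRemoteIO are placeholder names, not Bruce's actual class, and the callback renders silence where real code would copy the queued PCM:

    #include <AudioUnit/AudioUnit.h>
    #include <string.h>

    static OSStatus RenderPCM(void                       *inRefCon,
                              AudioUnitRenderActionFlags *ioActionFlags,
                              const AudioTimeStamp       *inTimeStamp,
                              UInt32                      inBusNumber,
                              UInt32                      inNumberFrames,
                              AudioBufferList            *ioData)
    {
        // Real code would copy up to inNumberFrames of the buffer passed
        // to the play method; this sketch just fills the output with silence.
        for (UInt32 i = 0; i < ioData->mNumberBuffers; ++i)
            memset(ioData->mBuffers[i].mData, 0,
                   ioData->mBuffers[i].mDataByteSize);
        return noErr;
    }

    static AudioUnit CreateRemoteIO(void)
    {
        // Find and instantiate the RemoteIO output unit.
        AudioComponentDescription desc = {
            .componentType         = kAudioUnitType_Output,
            .componentSubType      = kAudioUnitSubType_RemoteIO,
            .componentManufacturer = kAudioUnitManufacturer_Apple
        };
        AudioComponent comp = AudioComponentFindNext(NULL, &desc);

        AudioUnit unit;
        AudioComponentInstanceNew(comp, &unit);

        // Install the render callback on the output element's input scope.
        AURenderCallbackStruct cb = { RenderPCM, NULL /* per-instance state */ };
        AudioUnitSetProperty(unit, kAudioUnitProperty_SetRenderCallback,
                             kAudioUnitScope_Input, 0, &cb, sizeof(cb));

        AudioUnitInitialize(unit);
        AudioOutputUnitStart(unit);
        return unit;
    }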

Since I'm such an OO guy... when I wanted to play two sounds simultaneously, I just created another instance of my little Objective-C class. This seems to work fine on both the simulator and the device, and the sounds appear to be mixed just fine. However, I'm wondering if I'm breaking some kind of rule by creating two RemoteIO audio units inside my app (and this is working just by luck). Clearly there's some underlying system process that mixes audio from all the different sources (iPod, SMS, phone, calendar, etc.), so is having two RemoteIO audio units in one app OK?

I could create a graph with a mixer connected to the RemoteIO audio unit, but creating two instances just seemed so easy (although in hindsight maybe not the right thing).

Anyone know if this is OK?


You're not breaking a rule, but it is pretty inefficient to have two remote I/O instances. I've never measured it, so I can't quantify this, but I can tell you that you get a separate realtime thread in your process per instance, and that there is a second pair of Mach messages between your app and the media server process on each I/O cycle.

I would suggest that you change your ObjC class to represent one of any number of inputs to a single mixer connected to a single remote I/O instance -- unless your sources are at varying sample rates, in which case you have to create a more complex graph.
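
A sketch of that layout, assuming a MultiChannelMixer and sources at a common sample rate (BuildMixerGraph and the callback array are illustrative names, not a definitive implementation):

    #include <AudioToolbox/AudioToolbox.h>

    static AUGraph BuildMixerGraph(UInt32 numInputs,
                                   AURenderCallbackStruct *inputCallbacks)
    {
        AUGraph graph;
        NewAUGraph(&graph);

        AudioComponentDescription mixerDesc = {
            .componentType         = kAudioUnitType_Mixer,
            .componentSubType      = kAudioUnitSubType_MultiChannelMixer,
            .componentManufacturer = kAudioUnitManufacturer_Apple
        };
        AudioComponentDescription ioDesc = {
            .componentType         = kAudioUnitType_Output,
            .componentSubType      = kAudioUnitSubType_RemoteIO,
            .componentManufacturer = kAudioUnitManufacturer_Apple
        };

        // One mixer node feeding one remote I/O node.
        AUNode mixerNode, ioNode;
        AUGraphAddNode(graph, &mixerDesc, &mixerNode);
        AUGraphAddNode(graph, &ioDesc, &ioNode);
        AUGraphConnectNodeInput(graph, mixerNode, 0, ioNode, 0);
        AUGraphOpen(graph);

        // Size the mixer's input-bus count, then hang one render
        // callback (one "sound effect player") off each input bus.
        AudioUnit mixer;
        AUGraphNodeInfo(graph, mixerNode, NULL, &mixer);
        AudioUnitSetProperty(mixer, kAudioUnitProperty_ElementCount,
                             kAudioUnitScope_Input, 0,
                             &numInputs, sizeof(numInputs));
        for (UInt32 i = 0; i < numInputs; ++i)
            AUGraphSetNodeInputCallback(graph, mixerNode, i,
                                        &inputCallbacks[i]);

        AUGraphInitialize(graph);
        return graph;  // caller calls AUGraphStart(graph)
    }

Each instance of the ObjC class then owns a mixer input bus rather than its own I/O unit, so the app gets one realtime thread and one pair of Mach messages per cycle no matter how many sounds it plays.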

Now, if you're just playing PCM sound effects, you might consider using system sounds (if you really are going to be doing nothing more than sound effects) or OpenAL (which gives you that shared mixer/output unit and is pretty simple to use for this kind of thing).
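
For the System Sound Services route, the sketch is even shorter. "effect.caf" is an assumed file name for a short sound in the app bundle:

    #include <AudioToolbox/AudioToolbox.h>
    #include <CoreFoundation/CoreFoundation.h>

    static SystemSoundID LoadEffect(void)
    {
        // Locate the sound file in the app bundle and register it.
        CFURLRef url = CFBundleCopyResourceURL(CFBundleGetMainBundle(),
                                               CFSTR("effect"), CFSTR("caf"),
                                               NULL);
        SystemSoundID soundID = 0;
        AudioServicesCreateSystemSoundID(url, &soundID);
        CFRelease(url);
        return soundID;
    }

    // Later, wherever the effect should fire:
    //     AudioServicesPlaySystemSound(soundID);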

Doug
