
Re: Using AVAudioSession with remote I/O audio unit to read


  • Subject: Re: Using AVAudioSession with remote I/O audio unit to read
  • From: Edward Agabeg <email@hidden>
  • Date: Sat, 07 Dec 2013 17:04:56 -0500

Since the C API has been deprecated as of iOS 7, how do we use AVAudioSession to get live microphone data using the remote I/O unit, and process or read it on the fly for a mic-based application?

From what I understand, AVAudioRecorder only records the data to a local file.

I want to get access to the audio buffer on the fly and process it live in the app.

I haven't managed to find any sample code that uses the remote I/O audio unit with the AVAudioSession API.

Could anyone please point me towards the right resources?

Don't confuse the 'Audio Session' with the APIs that actually allow you to work with the audio data itself. To use our terminology, the Audio Session simply "sets the audio context for your app." It doesn't actually 'play' or 'record' anything.

The C Audio Session APIs are indeed deprecated, but all the functionality (and more) has been moved into AVAudioSession(.h), and the transition is quite trivial. AVAudioSession has been available for a long time, and it was over two years ago that we told folks to slowly transition away from the C API, using it only if something wasn't yet available in AVAudioSession. That transition period is now over with iOS 7.
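For what it's worth, here's a rough sketch of what that transition typically looks like (the category, sample rate, and buffer duration values below are purely illustrative, and the error handling is kept to a minimum):

#import <AVFoundation/AVFoundation.h>

// Roughly the AVAudioSession equivalents of the old AudioSessionInitialize /
// AudioSessionSetProperty / AudioSessionSetActive calls.
- (void)configureAudioSession
{
    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSError *error = nil;

    // Replaces kAudioSessionProperty_AudioCategory
    if (![session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error]) {
        NSLog(@"setCategory failed: %@", error);
    }

    // Replaces kAudioSessionProperty_PreferredHardwareSampleRate
    if (![session setPreferredSampleRate:44100.0 error:&error]) {
        NSLog(@"setPreferredSampleRate failed: %@", error);
    }

    // Replaces kAudioSessionProperty_PreferredHardwareIOBufferDuration
    if (![session setPreferredIOBufferDuration:0.005 error:&error]) {
        NSLog(@"setPreferredIOBufferDuration failed: %@", error);
    }

    // Replaces AudioSessionSetActive(true)
    if (![session setActive:YES error:&error]) {
        NSLog(@"setActive failed: %@", error);
    }
}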

Regarding the RemoteIO: as Daniel mentioned, the aurioTouch2 sample is a place to start. Unfortunately it has lagged behind in being updated, so yes, it's still using the C Audio Session APIs. But if you're trying to learn how to use the RemoteIO, the Audio Session isn't really directly related, so you can ignore the deprecation warnings, check out how the RIO is used, then take this knowledge and write your own app using AVAudioSession.

A few things to note about aurioTouch2:
1) It's way more complicated than it needs to be, due to some older FFT code, extra audio format conversions, and not being updated for iOS 7 yet - we know, and we'll fix this in the next completely new version.
2) It performs input by calling AudioUnitRender for the input on the output render proc; it doesn't actually use an input proc. So in terms of the question being asked here, for folks *just* wanting to do input only, you need to change the way the unit is set up from what is shown in the sample (there's a rough sketch of that below this list). Code for this also seems to be readily available in the community with a bit of searching.
3) By just commenting out a bit of code in the render proc, you can completely bypass all the extra conversion, FFT, and OGL stuff, turning the sample into a simple "Thru" box that lets even a beginner with the RIO experiment.
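If it helps, here's the kind of input-only setup I mean - a rough sketch only, not lifted from aurioTouch2, with illustrative names, format, and buffer sizes, and with OSStatus checking mostly omitted:

#import <AudioToolbox/AudioToolbox.h>

static AudioUnit sRIOUnit;

// Input callback: ioData is NULL here, so we supply our own AudioBufferList
// and pull the mic samples from bus 1 with AudioUnitRender.
static OSStatus MicInputCallback(void                       *inRefCon,
                                 AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp       *inTimeStamp,
                                 UInt32                      inBusNumber,
                                 UInt32                      inNumberFrames,
                                 AudioBufferList            *ioData)
{
    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mNumberChannels = 1;
    bufferList.mBuffers[0].mDataByteSize   = inNumberFrames * sizeof(SInt16);
    bufferList.mBuffers[0].mData           = NULL;  // NULL lets the unit supply the buffer

    OSStatus err = AudioUnitRender(sRIOUnit, ioActionFlags, inTimeStamp,
                                   1, inNumberFrames, &bufferList);
    if (err == noErr) {
        // bufferList.mBuffers[0].mData now points at inNumberFrames of live
        // 16-bit mono mic data - process it on the fly here.
    }
    return err;
}

static void SetUpInputOnlyRemoteIO(void)
{
    AudioComponentDescription desc = { 0 };
    desc.componentType         = kAudioUnitType_Output;
    desc.componentSubType      = kAudioUnitSubType_RemoteIO;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    AudioComponentInstanceNew(comp, &sRIOUnit);

    // Enable input on bus 1 and disable output on bus 0 - input only.
    UInt32 one = 1, zero = 0;
    AudioUnitSetProperty(sRIOUnit, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Input, 1, &one, sizeof(one));
    AudioUnitSetProperty(sRIOUnit, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Output, 0, &zero, sizeof(zero));

    // Ask for 16-bit mono PCM on the output scope of the input bus so the
    // callback above knows exactly what format it's getting.
    AudioStreamBasicDescription fmt = { 0 };
    fmt.mSampleRate       = 44100.0;
    fmt.mFormatID         = kAudioFormatLinearPCM;
    fmt.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    fmt.mChannelsPerFrame = 1;
    fmt.mBitsPerChannel   = 16;
    fmt.mBytesPerFrame    = 2;
    fmt.mBytesPerPacket   = 2;
    fmt.mFramesPerPacket  = 1;
    AudioUnitSetProperty(sRIOUnit, kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Output, 1, &fmt, sizeof(fmt));

    // Install an input callback, rather than pulling input from the output
    // render proc the way aurioTouch2 does.
    AURenderCallbackStruct cb = { MicInputCallback, NULL };
    AudioUnitSetProperty(sRIOUnit, kAudioOutputUnitProperty_SetInputCallback,
                         kAudioUnitScope_Global, 1, &cb, sizeof(cb));

    AudioUnitInitialize(sRIOUnit);
    AudioOutputUnitStart(sRIOUnit);
}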

Back to AVAudioSession - the API Reference lists some other samples that have been updated to use AVAudioSession, if you want some example usage of this object:
  https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVAudioSession_ClassReference/Reference/Reference.html

And I've pretty much updated all our Q&As that used to mention the old C API to now discuss AVAudioSession, with some expanded information added. Just search for AVAudioSession in the iOS reference library:

https://developer.apple.com/library/ios/qa/qa1799/_index.html
https://developer.apple.com/library/ios/qa/qa1631/_index.html
https://developer.apple.com/library/ios/qa/qa1715/_index.html
https://developer.apple.com/library/ios/qa/qa1749/_index.html
https://developer.apple.com/library/ios/qa/qa1803/_index.html
https://developer.apple.com/library/ios/qa/qa1754/_index.html

Finally, don't forget the WWDC videos, where the Audio Session has been covered pretty much in depth over the last few years (even if some of those older sessions talk about the C APIs, the concepts are important and still useful). And any suggestions for content that would be helpful to add or update in the Audio Session Programming Guide should be filed as bugs <bugreport.apple.com> for the Documentation folks to sort out.

Hope some of this is helpful,
edward

