
Echo removal vs latency: Switching between VPIO and Remote IO?


  • Subject: Echo removal vs latency: Switching between VPIO and Remote IO?
  • From: Michael Tyson <email@hidden>
  • Date: Mon, 18 Jun 2012 19:12:31 +0200

Hi!

I'm updating my live-looper app, which plays back loops while simultaneously recording. When used without headphones, it's necessary to use the Voice Processing IO unit to kill the echo/feedback. The downside of VPIO is the increased latency, on the order of tens of milliseconds. Without VPIO, I'm able to achieve a nice 6 ms playthrough latency.
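In case it's relevant, this is roughly how I choose the IO unit when I build the graph (just a sketch, with error handling stripped out):

#include <AudioToolbox/AudioToolbox.h>

// Create either the echo-cancelling VPIO unit or the plain low-latency
// Remote IO unit, depending on whether headphones are connected.
static AudioUnit CreateIOUnit(Boolean wantEchoCancellation)
{
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = wantEchoCancellation
                                     ? kAudioUnitSubType_VoiceProcessingIO
                                     : kAudioUnitSubType_RemoteIO,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };

    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    AudioUnit unit = NULL;
    AudioComponentInstanceNew(comp, &unit);

    // Enable recording on the input bus (bus 1); Remote IO has input
    // disabled by default.
    UInt32 one = 1;
    AudioUnitSetProperty(unit, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Input, 1, &one, sizeof(one));
    return unit;
}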

So, my technique to date has been to tear down and re-create the audio graph when headphones are plugged in or unplugged, switching between the VPIO unit and the lower-latency, smaller-footprint Remote IO unit.
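The route-change handling looks roughly like this (again just a sketch: RebuildGraph is a stand-in for my real setup/teardown routine, and I'm still on the old AudioSession C API):

#include <AudioToolbox/AudioToolbox.h>
#include <CoreFoundation/CoreFoundation.h>

void RebuildGraph(Boolean useVPIO);  // stand-in for my actual graph setup

// Check the current route string for a headphone entry.
static Boolean HeadphonesArePluggedIn(void)
{
    CFStringRef route = NULL;
    UInt32 size = sizeof(route);
    if (AudioSessionGetProperty(kAudioSessionProperty_AudioRoute,
                                &size, &route) != noErr || !route)
        return false;
    Boolean headphones =
        (CFStringFind(route, CFSTR("Headphone"), 0).location != kCFNotFound);
    CFRelease(route);
    return headphones;
}

// Route-change listener: rebuild the graph with the appropriate IO unit.
static void RouteChangeListener(void *inClientData, AudioSessionPropertyID inID,
                                UInt32 inDataSize, const void *inData)
{
    RebuildGraph(!HeadphonesArePluggedIn());
}

// Registered once at startup:
//   AudioSessionAddPropertyListener(kAudioSessionProperty_AudioRouteChange,
//                                   RouteChangeListener, NULL);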

This has been fine so far (although it introduces an uncomfortably long delay in the teardown/setup process), but recently I've noticed that after switching, the input level of the Remote IO unit is extremely (and unaccountably) low. I can reproduce this - I suspect it may be an issue of some kind with a recent update.

So, it's made me want to explore alternative options. Does anyone know of a nicer way to accomplish lightweight, low-latency audio when echo removal isn't required, and echo removal when it is?

I experimented with the kAUVoiceIOProperty_BypassVoiceProcessing flag: setting it to 1 and setting kAUVoiceIOProperty_VoiceProcessingQuality to 0 seems to enable lower-latency playback without the stuttering, but it still messes up low-latency playthrough - it seems to introduce a large amount of jitter, which means greater latency is needed to overcome it.
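For reference, this is roughly how I'm setting those two properties on the VPIO unit (sketch only; vpioUnit is an already-created kAudioUnitSubType_VoiceProcessingIO instance):

#include <AudioToolbox/AudioToolbox.h>

static OSStatus ConfigureVPIOBypass(AudioUnit vpioUnit, Boolean bypass)
{
    // Bypass the voice-processing stage entirely (global scope, element 0).
    UInt32 bypassFlag = bypass ? 1 : 0;
    OSStatus err = AudioUnitSetProperty(vpioUnit,
                                        kAUVoiceIOProperty_BypassVoiceProcessing,
                                        kAudioUnitScope_Global, 0,
                                        &bypassFlag, sizeof(bypassFlag));
    if (err != noErr) return err;

    // Lowest "quality" setting; in my tests this reduces latency but
    // still leaves a lot of jitter on playthrough.
    UInt32 quality = 0;
    return AudioUnitSetProperty(vpioUnit,
                                kAUVoiceIOProperty_VoiceProcessingQuality,
                                kAudioUnitScope_Global, 0,
                                &quality, sizeof(quality));
}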

Anyone know about this?

Or failing that, anyone have any theories as to why switching to Remote IO after using VPIO messes up the input signal?

Many thanks,
Michael
