
16ms of play through latency on iPod Touch? Reasonable?


  • Subject: 16ms of play through latency on iPod Touch? Reasonable?
  • From: Nick Porcaro <email@hidden>
  • Date: Thu, 26 Apr 2012 03:59:49 -0700

Hi folks,

I compiled Chris Adamson's play-through example "CH10_iOSPlayThrough"
and I see about 16 ms of latency on a recent iPod touch:

http://www.porcaro.org/MacToChrisAdamsonPlayThruApp.wav

The recording was made by feeding the output of my MacBook Pro (playing a click track) into the input of the iPod touch
using an iRig.

The output of the iRig and the MacBook Pro was then fed to a mixer; what you hear in the file above is that mix.

If you zoom in you can see there is about 16 ms of delay (about 716 samples at a 44100 Hz sampling rate).

I set the preferred hardware I/O buffer duration to 0.005 seconds:

   // Set preferred latency - Added by Nick 4/22/2012
   Float32 aBufferLength = 0.005; // In seconds
   OSStatus err = AudioSessionSetProperty(kAudioSessionProperty_PreferredHardwareIOBufferDuration,
                                          sizeof(aBufferLength), &aBufferLength);
   if (noErr != err) {
       NSLog(@"Cannot set preferred audio buffer duration");
   }

So I am wondering: where do the other 11 ms come from?
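One thing I plan to check (just a sketch, using the same AudioSession C API as the snippet above): read back the I/O buffer duration the hardware actually granted, plus the hardware input and output latencies it reports, and see how close their sum gets to the 16 ms I'm measuring.

   // Sketch: ask the audio session what the hardware actually reports.
   // These are read-only Float32 properties, in seconds.
   // (Error checks omitted for brevity.)
   Float32 ioBufferDuration = 0, inputLatency = 0, outputLatency = 0;
   UInt32 propSize = sizeof(Float32);
   AudioSessionGetProperty(kAudioSessionProperty_CurrentHardwareIOBufferDuration,
                           &propSize, &ioBufferDuration);
   propSize = sizeof(Float32);
   AudioSessionGetProperty(kAudioSessionProperty_CurrentHardwareInputLatency,
                           &propSize, &inputLatency);
   propSize = sizeof(Float32);
   AudioSessionGetProperty(kAudioSessionProperty_CurrentHardwareOutputLatency,
                           &propSize, &outputLatency);
   NSLog(@"buffer %.2f ms, input latency %.2f ms, output latency %.2f ms, total %.2f ms",
         ioBufferDuration * 1000.0f, inputLatency * 1000.0f,
         outputLatency * 1000.0f,
         (ioBufferDuration + inputLatency + outputLatency) * 1000.0f);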

I only copy the samples to the output in the input render callback:

   // just copy samples
   UInt32 bus1 = 1;
   CheckError(AudioUnitRender(effectState->rioUnit,
                              ioActionFlags,
                              inTimeStamp,
                              bus1,
                              inNumberFrames,
                              ioData),
              "Couldn't render from RemoteIO unit");


(As a side note: is there any way to render the samples without them actually going to the output?
All I really want to do is send them over the network using CocoaAsyncSocket: https://github.com/robbiehanson/CocoaAsyncSocket)
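One approach I haven't tried yet, just a sketch based on the same render callback as above: still call AudioUnitRender on the input bus so the samples land in ioData, copy them off for the socket, then zero the buffers and mark the output silent. (SendSamplesToNetwork here is a hypothetical helper that would hand the data to CocoaAsyncSocket outside the render thread.)

   // Sketch: keep the rendered input samples but silence the output.
   CheckError(AudioUnitRender(effectState->rioUnit,
                              ioActionFlags,
                              inTimeStamp,
                              1,                // input bus
                              inNumberFrames,
                              ioData),
              "Couldn't render from RemoteIO unit");

   for (UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
       // Hand a copy of the samples to the networking code (hypothetical helper,
       // must not block the real-time render thread).
       SendSamplesToNetwork(ioData->mBuffers[i].mData,
                            ioData->mBuffers[i].mDataByteSize);
       // Zero the buffer so nothing audible reaches the hardware output.
       memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
   }
   *ioActionFlags |= kAudioUnitRenderAction_OutputIsSilence;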

Any ideas?

I was going to try to run the Profiler next to see where the time is going.

- Nick


