Re: Minimizing input/output latency ?

  • Subject: Re: Minimizing input/output latency ?
  • From: Stéphane Letz <email@hidden>
  • Date: Tue, 19 Apr 2005 20:45:09 +0200


On 19 Apr 2005, at 20:10, William Stewart wrote:

In addition to David's suggestion, there is also a new property available in Tiger called I/O Usage - kAudioDevicePropertyIOCycleUsage (this is also available on Panther with QT 7)

What this enables you to do is the following:

When an I/O Proc fires, it asks you for output data at a time of: Now + SafetyOffset + OneBuffer. So you are producing audio for an output time (OT) that is one buffer (BT) plus the safety offset (ST) in the future.

With the I/O Usage property, you can adjust this output time by specifying the fraction of the I/O cycle you want to use. So the above formula becomes:

OT = BT * I/OUsage + ST

where the default value of I/OUsage is 1. This does not change the number of frames of audio in an I/O Proc, just the time at which the device will be given that data.
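As a sketch of that arithmetic (in C, since the HAL API is C; the function name and the safety-offset value below are illustrative, not part of the API, and only the formula itself comes from the mail):

```c
#include <assert.h>

/* Illustrative sketch of the relationship described above:
 * OT = BT * IOUsage + ST, all expressed in sample frames. */
static double output_time_offset(double buffer_frames,
                                 double io_usage,
                                 double safety_offset_frames)
{
    return buffer_frames * io_usage + safety_offset_frames;
}
```

With a 512-frame buffer and a hypothetical 64-frame safety offset, the default I/OUsage of 1.0 gives an offset of 576 frames, while 0.5 gives 320 frames, i.e. 256 frames less output latency.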

For example, if I/O Usage is set to 0.5 (and your buffer size is 512 sample frames), that means the audio data you are producing in a given I/O proc is for:

OT = 512 * 0.5 + ST

in other words, you are producing 512 frames of audio to be given to the device at an OT of the safety offset plus 256 (512 * 0.5) samples in the future.

Thus, your output latency has now been reduced by 256 samples in this example.

The trade-off you make, of course, is that you have also halved the time you have to produce those samples, so the maximum CPU usage for that I/O proc is now 50%.

The reason the HAL's default is what it is today is that it gives a client essentially 100% of the CPU to produce the output data: we call you at a time where you have the full duration of the I/O buffer before the data *has* to be given to the driver for output. This property lets you reduce the actual output time, and thus the latency.

There is no lower limit on the I/O usage, though of course if you set it so low that you can't get the output data to the device in time, you will generate an overload.
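The shrinking time budget can be sketched the same way (helper names here are illustrative; only the proportional relationship comes from the mail):

```c
#include <stdbool.h>

/* With the I/O usage scaled down, the wall-clock time available to
 * produce one buffer shrinks proportionally; exceeding that budget
 * is what produces an overload. */
static double cycle_budget_seconds(int buffer_frames,
                                   double sample_rate,
                                   double io_usage)
{
    return ((double)buffer_frames / sample_rate) * io_usage;
}

static bool would_overload(double render_cost_seconds,
                           int buffer_frames,
                           double sample_rate,
                           double io_usage)
{
    return render_cost_seconds >
           cycle_budget_seconds(buffer_frames, sample_rate, io_usage);
}
```

For example, 512 frames at 44.1 kHz is about 11.6 ms of budget at the default usage of 1.0, but only about 5.8 ms at 0.5; a render pass costing 8 ms fits the first budget and overloads the second.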

We've implemented this property in AU Lab (the app we're shipping in Tiger at /Developer/Applications/Audio), so you can set up a latency test and see how this can work for you.

Just as a side note - the input data is unchanged by this setting - it only affects the time for the output data. Input data is always the most recent data we can give you.
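For reference, setting the property from code would look roughly like this with the Panther/Tiger-era HAL API (a sketch only: kAudioDevicePropertyIOCycleUsage and AudioDeviceSetProperty are real, but obtaining the device ID and error handling are elided, and this requires macOS to build and run):

```c
#include <CoreAudio/CoreAudio.h>

/* Sketch: set kAudioDevicePropertyIOCycleUsage on a device.
 * Assumes you already have an AudioDeviceID (e.g. queried via
 * kAudioHardwarePropertyDefaultOutputDevice). */
static OSStatus set_io_cycle_usage(AudioDeviceID device, Float32 usage)
{
    return AudioDeviceSetProperty(device,
                                  NULL,   /* apply as soon as possible */
                                  0,      /* master channel */
                                  false,  /* output section */
                                  kAudioDevicePropertyIOCycleUsage,
                                  sizeof(usage),
                                  &usage);
}
```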


Bill


Thanks a lot Bill, I think this is exactly what I needed...

In my case, where an I/O callback uses the output buffer computed in the *previous* cycle, I have an almost constant and low I/O processing cost, something like:

- read input buffers
- do some minimal processing (which I can guarantee to be time-bounded in my application, because the actual processing is done in another thread)
- write output buffers from the previous cycle
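The steps above can be sketched as a double-buffered callback (all names and the buffer layout here are hypothetical; only the read-input / minimal-work / write-previous-output structure comes from the description):

```c
#include <assert.h>
#include <string.h>

#define FRAMES 512

/* Hypothetical double-buffered engine: each cycle outputs the buffer
 * computed during the *previous* cycle, so the work done inside the
 * callback itself stays small and time-bounded. */
typedef struct {
    float pending[FRAMES];  /* filled by a separate processing thread */
    float history[FRAMES];  /* most recent input, handed to that thread */
} Engine;

static void io_callback(Engine *e, const float *input, float *output)
{
    /* 1. read input buffers (copy out for the processing thread) */
    memcpy(e->history, input, sizeof e->history);
    /* 2. minimal, time-bounded work only; real DSP happens elsewhere */
    /* 3. write output computed in the previous cycle */
    memcpy(output, e->pending, sizeof e->pending);
}
```

Because steps 1-3 are just copies, the callback's cost is nearly constant, which is what makes a low I/OUsage value plausible for this design.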


Thus I should be able to use an I/OUsage value that almost exactly represents the fraction of the I/O cycle my callback actually needs, in this case a *low* I/OUsage, and it should gain almost one entire buffer of latency...

Is this correct?

Are we allowed to ask specific questions on Tiger now  ((-:  ?

Thanks

Stephane Letz
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list      (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden


  • Follow-Ups:
    • Re: Minimizing input/output latency ?
      • From: William Stewart <email@hidden>
  • References:
    • Minimizing input/output latency ? (From: Stéphane Letz <email@hidden>)
    • Re: Minimizing input/output latency ? (From: David Duncan <email@hidden>)
    • Re: Minimizing input/output latency ? (From: William Stewart <email@hidden>)
