Re: Changing the latency from within the render callback.
- Subject: Re: Changing the latency from within the render callback.
- From: Jim Wintermyre <email@hidden>
- Date: Thu, 26 Feb 2004 13:55:13 -0800
At 10:35 PM +0100 2/26/04, Philippe Wicker wrote:
On Feb 26, 2004, at 9:50 AM, Jim Wintermyre wrote:
On Mon, 23 Feb 2004, William Stewart wrote:
On 23/02/2004, at 1:51 PM, Philippe Wicker wrote:
> Hi all,
>
> I've read in the documentation (the HTML AudioUnits doc coming
> with the SDK) that an AU can publish a Latency property and
> that the AU is free to change the value of this property. I
> may need to use this feature and have some questions about it.
Not typically - the property is generally described as a read-only
property and in most cases will be read only (though of course,
some AUs can/will want to make this configurable).
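For context, the property in question is kAudioUnitProperty_Latency,
which a host reads through the ordinary property API. A minimal sketch
follows, with error handling trimmed; only the names from AudioUnit.h
are real, GetAULatencySeconds is made up:

#include <AudioUnit/AudioUnit.h>

// Minimal sketch: how a host reads an AU's published latency.
// kAudioUnitProperty_Latency is global scope and yields a Float64
// value expressed in seconds.
static Float64 GetAULatencySeconds(AudioUnit unit)
{
    Float64 latency = 0.0;
    UInt32  size    = sizeof(latency);
    OSStatus err = AudioUnitGetProperty(unit,
                                        kAudioUnitProperty_Latency,
                                        kAudioUnitScope_Global,
                                        0,          // element
                                        &latency,
                                        &size);
    return (err == noErr) ? latency : 0.0;
}

A host doing delay compensation would typically read this once after
initializing the AU, and again whenever it is notified that the value
has changed.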
Just to clarify, Bill, you mean that a host would not change the
property, but it is expected that a plugin may still change it at any
time, right? Because Philippe was talking about the plugin itself
changing it, and I think you were replying about hosts changing it.
Maybe I'm misreading, though...
My read on that is that Bill was saying that typical AUs will not
change that property. But some (like ours) will want to set it
depending on other things (buffer size, for example). However, it
seems to me that it might be unreasonable to expect hosts to read
this at *any* time, i.e. on every render call. If you were changing
this on render calls, I would imagine that would cause mayhem (or
just not work) in hosts that do automatic delay compensation. I
think plugs would typically want to set this in Initialize(), and
possibly change it when the stream format or the max frames per
slice changes.
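In AU SDK terms (AUBase/AUEffectBase), that pattern would look roughly
like the sketch below. MyEffect, UpdateLatency, mLatencySeconds, and
kProcessingDelayFrames are made-up names; Initialize, GetLatency,
GetSampleRate, and PropertyChanged come from the SDK base classes.

class MyEffect : public AUEffectBase {
public:
    MyEffect(AudioUnit au) : AUEffectBase(au), mLatencySeconds(0.0) {}

    virtual ComponentResult Initialize()
    {
        ComponentResult result = AUEffectBase::Initialize();
        if (result == noErr) {
            // The fixed processing delay is known here, once the
            // stream format (and thus the sample rate) is settled.
            mLatencySeconds = (Float64)kProcessingDelayFrames / GetSampleRate();
        }
        return result;
    }

    // The SDK calls this to answer kAudioUnitProperty_Latency queries.
    virtual Float64 GetLatency() { return mLatencySeconds; }

protected:
    void UpdateLatency(Float64 seconds)
    {
        if (seconds != mLatencySeconds) {
            mLatencySeconds = seconds;
            // Tell any listening host the published latency changed.
            PropertyChanged(kAudioUnitProperty_Latency,
                            kAudioUnitScope_Global, 0);
        }
    }

private:
    enum { kProcessingDelayFrames = 256 };  // made-up fixed delay
    Float64 mLatencySeconds;
};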
I myself cannot figure out how a host could manage delay compensation
if an AU's latency changed on *each* render call, or even just very
often. Notice, however, that "changing this latency at *any* time" -
including while in the render callback - does not mean changing it
on *each* render call. In my particular case, the latency will be
exactly the size of one render buffer, i.e. the size passed in the
render callback parameter inNumberFrames. As far as I understand,
this value is under the control of the host. Running under Logic,
which is the host I use for my testing, this size is chosen in the
"Audio Drivers" panel. I cannot query this information from the host
before the Initialize method runs (there is no API I know of for
this). Nor do I have this information when ChangeStreamFormat is
called, because the stream format says nothing about the buffer size
at render time. SetMaxFramesPerSlice does not provide it either: the
value I receive there is 1156 (a default value set somewhere in the
SDK code), while the size I've chosen in the Audio panel is 256.
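For what it's worth, the value an AU sees in SetMaxFramesPerSlice is
simply whatever the host pushed before initializing it (or the SDK
default, if the host pushed nothing). A host that wanted plugins to
see the real render size would do something along these lines - a
sketch only, not Logic's actual code:

// Sketch: a host publishes its true render slice size to the AU
// before initializing it. hwBufferFrames stands for the I/O buffer
// size the user picked in the host's audio preferences (e.g. 256).
static OSStatus PrepareUnit(AudioUnit unit, UInt32 hwBufferFrames)
{
    OSStatus err = AudioUnitSetProperty(unit,
                                        kAudioUnitProperty_MaximumFramesPerSlice,
                                        kAudioUnitScope_Global,
                                        0,
                                        &hwBufferFrames,
                                        sizeof(hwBufferFrames));
    if (err != noErr)
        return err;
    return AudioUnitInitialize(unit);
}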
GetLatency is called after Initialize if the AU is actually used in
a track, and before any call to the render callback. The earliest
moment I have the information I need (the size of the render buffer,
and therefore my true latency) is the first call to my render
callback, so I cannot return a valid value from GetLatency. That is
why I was considering using the notification mechanism to tell the
host what my latency is. If some special constant were defined, such
as kAULatencyIsOneBuffer, that is the value I'd have returned to the
host in GetLatency. If anyone has a better idea, I'll take it :))
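In code, the scheme described above would amount to something like
this in an AUEffectBase-style processing override. mLatencyKnown and
mLatencySeconds are hypothetical members, and - as the reply below
points out - notifying from the render thread is questionable:

// Sketch: derive the latency from the first observed render buffer
// size and notify the host of the change. Note that PropertyChanged
// fires here on the render thread.
virtual OSStatus ProcessBufferLists(AudioUnitRenderActionFlags& ioActionFlags,
                                    const AudioBufferList& inBuffer,
                                    AudioBufferList& outBuffer,
                                    UInt32 inFramesToProcess)
{
    if (!mLatencyKnown) {
        mLatencyKnown = true;
        // The latency is exactly one render buffer, in seconds.
        mLatencySeconds = (Float64)inFramesToProcess / GetSampleRate();
        PropertyChanged(kAudioUnitProperty_Latency,
                        kAudioUnitScope_Global, 0);
    }
    // ... actual processing elided ...
    return noErr;
}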
There are some problems with this:
- You cannot depend on the render frame count being any particular
number. All you are guaranteed is that it will be less than or equal
to MaxFramesPerSlice. So normally you could make your latency
dependent on that value, which you know before Render and which you
are notified of when it changes (see the sketch after this list),
but...
- Logic currently doesn't tell plugins the correct MaxFramesPerSlice
value. This is a known issue and I believe they are working on it.
This value should actually be the *larger* of the process buffer
range setting and the HW I/O buffer size. This is the same as it
used to be on OS 9 with the VST version.
- In Logic, there are two types of tracks, which I call "live mode"
and "non-live mode". Live-mode tracks include those that are
record-armed, are receiving MIDI, or are sending data to/from
external equipment via the I/O helper, plus any busses those tracks
are sent to. Basically, they are any tracks working with "live"
audio. If a track is in live mode, the render frame size is equal to
the HW I/O buffer size. If a track is not in live mode, the render
frame size is equal to the larger of the process buffer range
setting and the HW I/O buffer size, i.e. MaxFramesPerSlice (what it
*should* be, anyway, but currently is not).
- The current process buffer range setting sizes are:
Small = 512 samples
Medium = 1024 samples
Large = 2048 samples
This means that the smallest MaxFramesPerSlice you would ever see is
512. Ideally I would like to see a "None" setting, in which case
MaxFramesPerSlice (which determines the latency in the case of our
plugs) would actually be determined by the HW I/O buffer size. This
is how it works in most other hosts.
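Put concretely, the approach from the first point above (latency
derived from MaxFramesPerSlice, with a change notification) could
look like this sketch, assuming the SDK's SetMaxFramesPerSlice hook
is overridable in the version at hand; GetMaxFramesPerSlice,
GetSampleRate, and PropertyChanged are AUBase/AUEffectBase methods:

// Sketch: derive the published latency from the current
// MaxFramesPerSlice, and re-announce it whenever the host changes
// that value.
virtual Float64 GetLatency()
{
    // One full slice of latency, expressed in seconds.
    return (Float64)GetMaxFramesPerSlice() / GetSampleRate();
}

virtual void SetMaxFramesPerSlice(UInt32 nFrames)
{
    AUEffectBase::SetMaxFramesPerSlice(nFrames);
    // The latency just changed along with the slice size; tell hosts.
    PropertyChanged(kAudioUnitProperty_Latency,
                    kAudioUnitScope_Global, 0);
}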
Jim
BTW, would asking for the definition of such special constants be a
stupid suggestion?
Cheers.
Philippe Wicker
email@hidden
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.