Re: Changing the latency from within the render callback.
- Subject: Re: Changing the latency from within the render callback.
- From: Philippe Wicker <email@hidden>
- Date: Thu, 26 Feb 2004 22:35:23 +0100
On Feb 26, 2004, at 9:50 AM, Jim Wintermyre wrote:
On Mon, 23 Feb 2004, William Stewart wrote:
On 23/02/2004, at 1:51 PM, Philippe Wicker wrote:
> Hi all,
>
> I've read in the documentation (the HTML AudioUnits doc coming with
> the SDK) that an AU can publish a Latency property and that the AU is
> free to change the value of this property. I may have to use this
> feature and have some questions about it.
Not typically - the property is described generally as a read-only
property and in most cases will be read-only (though of course, some
AUs can/will want to make this configurable).
Just to clarify, Bill, you mean that a host would not change the
property, but it is expected that a plugin may still change it at any
time, right? Because Philippe was talking about the plugin itself
changing it, and I think you were then replying about hosts changing
it. Maybe I'm misreading, though...
My read on that is that Bill was saying that typical AUs will not
change that property. But some (like ours) will want to set it
depending on other things (buffer size, for example). However, it
seems to me that it might be unreasonable to expect hosts to read this
at *any* time, i.e. on every render call. If you were changing this on
render calls, I would imagine that would cause mayhem (or just not
work) in hosts that are doing automatic delay compensation. I think
typically plug-ins would want to set this in Initialize(), and possibly
change it when the stream format or the max frames per slice changes.
I myself cannot figure out how a host could manage delay compensation
if an AU's latency is changed on *each* render call, or even just very
often. Notice however that "changing this latency at *any* time" -
including from within the render callback - does not mean changing it
on *each* render call. In my particular case, the latency will be
exactly the size of one render buffer, i.e. the size passed in the
render callback parameter inNumberFrames. As far as I understand, this
value is under the control of the host. Running with Logic, which is
the host I use for my testing, this size is chosen in the "Audio
Drivers" panel. I cannot query this information from the host before
the Initialize method runs (there is no API I know of for this). Nor do
I have this information when ChangeStreamFormat is called, because the
stream format says nothing about the buffer size at render time.
SetMaxFramesPerSlice does not provide it either: the value I receive
there is 1156 (which is a default value set somewhere in the SDK code),
while the size I've chosen in the Audio panel is 256.
GetLatency is called after Initialize if the AU is actually used in a
track, and before any call to the render callback. The earliest I get
the information I need (the size of the render buffer, and therefore my
true latency) is the first call to my render callback, so I cannot
return a valid value before then. That's the reason I was considering
using the notification mechanism to tell the host what my latency is.
If some special constant were defined, such as kAULatencyIsOneBuffer,
that is the value I'd have returned to the host in GetLatency. If
anyone has a better idea, I'll take it :))
BTW, would asking for such special constants to be defined be a stupid
suggestion?
Cheers.
Jim
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.
Philippe Wicker
email@hidden