Re: Changing the latency from within the render callback.
- Subject: Re: Changing the latency from within the render callback.
- From: Philippe Wicker <email@hidden>
- Date: Fri, 27 Feb 2004 08:04:16 +0100
On Feb 27, 2004, at 7:08 AM, Marc Poirier wrote:
After reading this, I wonder: why is your latency dependent on each
render slice size?
Some time ago I published on my web site the sources of a set of tools
enabling audio communication between processes. I share my spare time
between this project and another more "usual" one for which there is no
such latency problem. Recently there was a thread on this list called
"multithreaded mixer" which showed me that the synchronization method
I'm currently using between a source audio thread in one process and a
destination audio thread in another process is not compliant with
real-time thread scheduling constraints (the destination thread is
blocked waiting for the source thread to complete its rendering, which
is the "number one bad thing to do" according to Jeff Moore's answers).
So I have to change my synchronization technique. In some
configurations (e.g. an AU subgraph belonging to process A is inserted
into an AU chain in a host running in process B, the audio being sent
through a "send" plug and received back via a "return" plug), I need
the audio to be rendered by the part of AU chain B located ahead of the
"send" port before I can send it to the external AU subgraph A and get
it back via the "return" port to finish the render. Because I cannot
block audio thread B, one solution is to use a ping-pong shared buffer
between the two threads, as sketched below. Hence the latency of one
buffer.
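
To make this concrete, here is a rough sketch of the kind of ping-pong
(double) buffer I have in mind, using an atomic index for the hand-off.
The names and sizes (PingPongBuffer, kMaxFrames, publish, consume) are
invented for the example; this is not my actual code:

#include <atomic>
#include <cstring>

// Ping-pong buffer shared between a source and a destination audio
// thread. The destination always reads the buffer the source finished
// filling on the previous render cycle, so neither thread blocks - at
// the cost of exactly one buffer of latency.
struct PingPongBuffer {
    static const int kMaxFrames = 4096;   // illustrative capacity
    float mBuffers[2][kMaxFrames];
    std::atomic<int> mReady;              // last completed slot, -1 = none

    PingPongBuffer() : mReady(-1) {}

    // Source thread: publish a completed render slice into slot
    // writeIndex (the source alternates 0/1 on successive cycles).
    void publish(const float *src, int numFrames, int writeIndex) {
        std::memcpy(mBuffers[writeIndex], src, numFrames * sizeof(float));
        mReady.store(writeIndex, std::memory_order_release);
    }

    // Destination thread: copy out the most recently completed buffer,
    // or render silence until the source has produced its first one.
    // Assumes both threads advance in lockstep, one render cycle apart,
    // so the source never overwrites the slot being read.
    bool consume(float *dst, int numFrames) {
        const int idx = mReady.load(std::memory_order_acquire);
        if (idx < 0) {
            std::memset(dst, 0, numFrames * sizeof(float));
            return false;
        }
        std::memcpy(dst, mBuffers[idx], numFrames * sizeof(float));
        return true;
    }
};

The destination thread never waits on the source; it simply plays what
was produced one render cycle earlier, which is exactly where the
one-buffer latency comes from.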
Marc
On Thu, 26 Feb 2004, Philippe Wicker wrote:
I myself cannot figure out how a host could manage delay compensation
if an AU's latency changed at *each* render call, or even just very
often. Notice however that "changing this latency at *any* time" -
including while in the render callback - does not mean changing it at
*each* render call. In my particular case, the latency will be exactly
the size of one render buffer, i.e. the size which is passed in the
render callback parameter inNumberFrames. As far as I have understood,
this value is under the control of the host. Running with Logic, which
is the host I use for my testing, this size is chosen in the "Audio
Drivers" panel. I cannot query this information from the host before
the Initialize method runs (there is no API I know of for this). Nor do
I have this information when ChangeStreamFormat is called, because the
stream format does not tell anything about the buffer size at render
time. SetMaxFramesPerSlice does not give it either: the value I receive
there is 1156 (a default set somewhere in the SDK code), while the size
I've chosen in the Audio panel is 256. GetLatency is called after
Initialize - if the AU is actually used in a track - and before any
call to the render callback. The earliest moment I get the information
I need (the size of the render buffer, and therefore my true latency)
is the first call to my render callback, so I cannot return a valid
value from GetLatency at that point. That's the reason why I was
considering using the notification mechanism to tell the host what my
latency is (a sketch follows below). If some special constant were
defined, such as kAULatencyIsOneBuffer, that is the value I'd have
returned to the host in GetLatency. If anyone has a better idea, I'll
take it :))
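
To illustrate what I mean by the notification mechanism, here is a
rough sketch for an AU built on the CoreAudio SDK's AUBase class, whose
PropertyChanged() method invokes any listeners the host has registered
for a property. The class and members (MyAU, mLatencyKnown,
mLatencyFrames) are invented for the example, and whether a given host
actually listens for kAudioUnitProperty_Latency and re-runs its delay
compensation afterwards is exactly the open question:

#include "AUBase.h"   // CoreAudio SDK

class MyAU : public AUBase {
public:
    // Constructor arguments may vary with the SDK version in use.
    MyAU(ComponentInstance inInstance)
        : AUBase(inInstance, 1, 1), mLatencyKnown(false), mLatencyFrames(0) {}

    virtual ComponentResult Render(AudioUnitRenderActionFlags &ioActionFlags,
                                   const AudioTimeStamp &inTimeStamp,
                                   UInt32 inNumberFrames)
    {
        if (!mLatencyKnown) {
            mLatencyFrames = inNumberFrames;   // latency == one render buffer
            mLatencyKnown = true;
            // AUBase::PropertyChanged fires the listeners the host may have
            // registered for this property, telling it to call GetLatency
            // again.
            PropertyChanged(kAudioUnitProperty_Latency,
                            kAudioUnitScope_Global, 0);
        }
        // ... the actual rendering work for this slice would go here ...
        return noErr;
    }

    // GetLatency reports seconds, so convert frames with the output rate.
    virtual Float64 GetLatency()
    {
        if (!mLatencyKnown) return 0.0;
        const Float64 rate = GetOutput(0)->GetStreamFormat().mSampleRate;
        return mLatencyFrames / rate;
    }

private:
    bool   mLatencyKnown;
    UInt32 mLatencyFrames;
};

The first render call is the earliest point where inNumberFrames - and
therefore the true latency - is known, so that is where the
notification would be fired.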
Philippe Wicker
email@hidden