Re: Preroll semantics & clarification of kAudioUnitProperty_OfflineRender
- Subject: Re: Preroll semantics & clarification of kAudioUnitProperty_OfflineRender
- From: Brian Willoughby <email@hidden>
- Date: Tue, 22 Nov 2011 21:10:38 -0800
On Nov 22, 2011, at 10:45, Stefan Gretscher wrote:
> On 22.11.2011, at 15:19, Heinrich Fink wrote:
>>> I have a hunch that, rather than needing some means to determine
>>> whether an AU supports non-real-time rendering, what you really
>>> need is some means to determine whether it is safe to assume that
>>> you can mix offline and online rendering from one call to the
>>> next without producing an invalid state.
>> Switching between offline and online mode flawlessly is a premise
>> that I just assumed to be possible, at least when ensuring that
>> the "context switch" happens only in between render calls. Of
>> course you are right that this might not always be the case.
> I would argue that, given the above reasoning, all plug-ins that
> choose to support kAudioUnitProperty_OfflineRender must also handle
> toggling between real-time and offline rendering properly without
> getting stuck in invalid states, and that it is correct and
> required for a host to toggle this accordingly when implementing
> pre-rolling as discussed earlier in this thread.
I did not mean to imply that the AU would get "stuck" in an invalid
state, but I do think that there might be a glitch in the audio if
you change render quality on the fly.
I think there are basically two reasons for OfflineRender. One, as
you stated, is that the real-time constraint is removed, and thus
blocking calls can be made even though their response time may be
unbounded. Two would be that the render quality can be increased by
using more CPU time than is available in real-time mode.
In the latter case, using more CPU for rendering could easily mean
that the basic algorithm itself changes, and thus the state variables
of one algorithm may not even be valid for the other. Sure, in the
simplest cases the state might be nothing more than a buffer holding
some window of recent samples. However, other cases might involve
digital filters whose state variables are not literal input samples
at all, but values specific to that particular filter. If the
OfflineRender quality uses a totally different filter, then the state
variables might not be interchangeable between offline and online
rendering.
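
To put that in code terms, purely as an illustration and not anything
from a real plug-in: imagine the real-time path uses a cheap one-pole
filter while the offline path uses a pair of higher-order sections.
Their state variables differ in size, meaning and even precision, so
there is nothing sensible to carry across a mode switch:

/* Purely illustrative -- not any actual AU's internals. */
typedef struct {             /* real-time path: one-pole smoother    */
    float z1;                /* single feedback sample               */
} RTFilterState;

typedef struct {             /* offline path: two 2nd-order sections */
    double w[2][2];          /* per-section delay elements; nothing  */
                             /* here corresponds to z1 above         */
} OfflineFilterState;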
Thus, it makes sense that there might be some side effect to changing
between offline and online, such that the only way to guarantee no
audio glitch is to execute a Reset() call between the two. You would
not necessarily need a Reset(), but there might be an audio glitch
without it.
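
To make that concrete, here is a rough host-side sketch of the kind
of sequence we are talking about: toggle
kAudioUnitProperty_OfflineRender only between render calls, with the
spot marked where a Reset() is the one thing that guarantees no
glitch. This is untested and only one possible approach; the helper
and the pre-roll loop are made up for illustration, and only
AudioUnitSetProperty(), AudioUnitRender() and AudioUnitReset() are
the actual API calls.

#include <AudioUnit/AudioUnit.h>

/* Hypothetical helper: set kAudioUnitProperty_OfflineRender on an AU. */
static OSStatus SetOfflineRender(AudioUnit au, UInt32 offline)
{
    return AudioUnitSetProperty(au, kAudioUnitProperty_OfflineRender,
                                kAudioUnitScope_Global, 0,
                                &offline, sizeof(offline));
}

/* Sketch of a pre-roll: render a few buffers in offline mode, then
   switch back to real-time rendering.  Error handling omitted. */
static void PrerollThenGoLive(AudioUnit au, AudioBufferList *scratch,
                              UInt32 frames, UInt32 prerollBuffers)
{
    AudioTimeStamp ts = { .mSampleTime = 0.0,
                          .mFlags = kAudioTimeStampSampleTimeValid };

    SetOfflineRender(au, 1);        /* non-real-time for the pre-roll */
    for (UInt32 i = 0; i < prerollBuffers; i++) {
        AudioUnitRenderActionFlags flags = 0;
        AudioUnitRender(au, &flags, &ts, 0, frames, scratch);
        ts.mSampleTime += frames;
    }

    /* If the offline and real-time algorithms differ internally, only
       a Reset() here guarantees no glitch -- at the cost of discarding
       the state the pre-roll just built up. */
    /* AudioUnitReset(au, kAudioUnitScope_Global, 0); */

    SetOfflineRender(au, 0);        /* back to real-time mode */
    /* ...normal real-time render calls continue from here... */
}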
> I would also argue that the host would be responsible for making
> sure it doesn't toggle this property while a render call is in
> progress on another thread.
Is there any requirement in the CoreAudio specification that an AU
object be re-entrant? In other words, can you safely call any AU
from more than one thread at the same time?
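
Until that is clarified, a host probably shouldn't depend on the
answer either way. One way to sidestep the question, sketched below
with made-up names (only the AudioUnitSetProperty() call and its
constants are the actual API), is to have the control thread merely
request the mode change and let the thread that calls AudioUnitRender
apply it between render calls, so the property set and a render can
never be in flight on the AU at the same time:

#include <AudioUnit/AudioUnit.h>
#include <stdatomic.h>
#include <stdbool.h>

/* Hypothetical host-side bookkeeping: the control thread never
   touches the AU directly, it only posts a request. */
typedef struct {
    AudioUnit  au;
    atomic_int requestedOffline;  /* init to -1; 0 or 1 = desired mode */
    bool       currentOffline;
} HostAUState;

/* Called from a UI/control thread. */
static void RequestOfflineMode(HostAUState *s, bool offline)
{
    atomic_store(&s->requestedOffline, offline ? 1 : 0);
}

/* Called on the rendering thread, strictly between AudioUnitRender
   calls, so the property change cannot overlap a render in progress. */
static void ApplyPendingModeChange(HostAUState *s)
{
    int req = atomic_exchange(&s->requestedOffline, -1);
    if (req < 0 || (bool)req == s->currentOffline)
        return;

    UInt32 flag = (UInt32)req;
    AudioUnitSetProperty(s->au, kAudioUnitProperty_OfflineRender,
                         kAudioUnitScope_Global, 0, &flag, sizeof(flag));
    s->currentOffline = (bool)req;
}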
> That all said, just like Brian I am not aware that any of this is
> explicitly stated in the documentation, so current plug-in
> implementations may behave differently.
Not only is there the opportunity for legacy and current
implementations to differ, but certain classes of plug-ins that are
not bound by any obvious constraints can, quite logically, differ in
how they interpret these vaguely documented features. While it's
reasonable to infer that a given class of plug-in might always
perform in a certain way, that's no guarantee that all valid classes
of plug-ins must necessarily perform in the same way.
Brian Willoughby
Sound Consulting