Re: Preroll semantics & clarification of kAudioUnitProperty_OfflineRender
- Subject: Re: Preroll semantics & clarification of kAudioUnitProperty_OfflineRender
- From: Heinrich Fink <email@hidden>
- Date: Tue, 22 Nov 2011 15:19:34 +0100
Hi Brian,
thanks for your detailed and very helpful comments!
> I have a hunch that, rather than needing some means to determine whether an AU supports non-real-time rendering, what you really need is some means to determine whether it is safe to assume that you can mix offline and online rendering from one call to the next without producing an invalid state.
I had simply assumed that switching between offline and online mode works flawlessly, at least as long as the "context switch" happens only between render calls. You are right, of course, that this might not always be the case.
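For reference, the kind of "context switch" I mean is a sketch like the following, assuming an already-initialized AudioUnit `au` and assuming (as discussed, not guaranteed) that the AU tolerates the mode change:

```c
#include <AudioToolbox/AudioToolbox.h>

// Toggle an AU between offline and online rendering. This must happen
// strictly *between* AudioUnitRender calls, never from inside a render
// callback. Whether the AU's internal state survives the switch is
// opaque to the host.
static OSStatus SetOfflineMode(AudioUnit au, Boolean offline) {
    UInt32 flag = offline ? 1 : 0;
    return AudioUnitSetProperty(au,
                                kAudioUnitProperty_OfflineRender,
                                kAudioUnitScope_Global,
                                0,              // element
                                &flag,
                                sizeof(flag));
}
```

In our engine this would be called once before the offline preroll and once again before handing the chain over to the realtime feed.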
> The only safe assumption would seem to be that you can either use OfflineRender exclusively, or Online Render. In either case, your safest bet would be to issue a Reset() call to clear any possible invalid state in the AU, and then use a single mode only.
Unfortunately, calling AudioUnitReset in between is not a viable option. It would, for example, cut off a delay effect's tail while switching from the offline preroll to the subsequent realtime feed, which is obviously undesirable.
> I'm sure that most AUs have state that is not formatted any different in each mode (if the AU even has an offline mode), in which case my caveat is moot. But this sort of thing is completely opaque to the AU host.
In that case we will probably just test a couple of audio unit effects popular in broadcasting (the ones relevant to our application scenario) for their ability to render faster than realtime, and then recommend those. Of course I would rather have our audio engine load any audio effect available on the system and open the effect chain to the user.
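As a coarse first filter before such manual testing, one could at least probe whether an AU publishes the offline-render property at all. A minimal sketch (note that this only tells us the property exists and is writable, not that mixing online and offline rendering is actually safe, which remains opaque to the host):

```c
#include <AudioToolbox/AudioToolbox.h>

// Returns true if the AU exposes kAudioUnitProperty_OfflineRender as a
// writable property in the global scope. A false result means offline
// mode is definitely unavailable; a true result is only a hint, not a
// guarantee of correct faster-than-realtime behavior.
static Boolean SupportsOfflineRender(AudioUnit au) {
    UInt32  size     = 0;
    Boolean writable = false;
    OSStatus err = AudioUnitGetPropertyInfo(au,
                                            kAudioUnitProperty_OfflineRender,
                                            kAudioUnitScope_Global,
                                            0,          // element
                                            &size,
                                            &writable);
    return (err == noErr) && writable;
}
```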
Ideally, we would not have to distinguish between preroll and realtime feeds at all, since audio units render only relative to sample time, not CPU time (as you mentioned).
AUFilePlayer is the exception, of course. I would still love to hear some comments about AUFilePlayer and its offline (in)capabilities. It seems strange to me that streaming audio from a file cannot keep up with faster-than-realtime render requests.
best regards,
Heinrich Fink
_______________________________________________
Coreaudio-api mailing list