Hi Paul,
thanks for your answer. This is quite a relief! I have read over some related messages on this list again, and I can now follow your rationale.
Just for the record, this related statement from Bill Stewart about kAudioUnitProperty_OfflineRender is very clear:
Bill Stewart, March 2006 (http://lists.apple.com/archives/coreaudio-api/2006/Mar/msg00237.html):
"I should also add that we don't expect all AUs to implement this property - for those AUs that don't do anything different, there is no need for them to implement this property."
For some reason I had missed that statement while doing my research.
So the correct behavior as a host implementor would be to first check whether kAudioUnitProperty_OfflineRender is supported. If it is, and you are about to switch to an offline context (as a host), you should set it to 'true'. If an audio unit does NOT support this property, i.e. it returns kAudioUnitErr_InvalidProperty, I can assume (at least for effect units) that rendering in offline mode (possibly faster than real time) should just work. Of course, this is not the case with AUFilePlayer: according to previous discussions, AUFilePlayer does not play well with offline rendering. Just to be sure: is this still the case?
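For what it's worth, here is a minimal sketch of that host-side check as I understand it. The helper name is my own invention; it assumes an initialized AudioUnit and treats kAudioUnitErr_InvalidProperty as "the unit doesn't distinguish offline rendering", per Bill's statement above:

```c
#include <AudioUnit/AudioUnit.h>

// Hypothetical helper (name is an assumption, not an API): prepare one
// audio unit for an offline/preroll render pass.
// Returns true if we may render this unit faster than real time.
static Boolean PrepareForOfflineRender(AudioUnit unit)
{
    UInt32  size = 0;
    Boolean writable = false;
    OSStatus err = AudioUnitGetPropertyInfo(unit,
                                            kAudioUnitProperty_OfflineRender,
                                            kAudioUnitScope_Global, 0,
                                            &size, &writable);
    if (err == kAudioUnitErr_InvalidProperty) {
        // Property not implemented: per the quoted statement, the unit
        // does nothing different offline, so (for effect units at least)
        // we assume faster-than-realtime rendering just works.
        return true;
    }
    if (err != noErr)
        return false;

    // Property is implemented: tell the unit we are switching to an
    // offline context before the preroll pass begins.
    UInt32 offline = 1;
    err = AudioUnitSetProperty(unit, kAudioUnitProperty_OfflineRender,
                               kAudioUnitScope_Global, 0,
                               &offline, sizeof(offline));
    return (err == noErr);
}
```

A host would call this on every unit in the graph before the preroll phase, and set the property back to 0 when returning to realtime semantics.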
I will therefore assume that our "preroll" scenario is a valid use case. To be on the safe side, I will also assume that using AUFilePlayer as the primary file-playback generator unit is probably not an option, and that we will probably have to roll our own (e.g. use ExtAudioFileRead directly).
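Rolling our own would presumably look something like the sketch below: a render-style read routine built on ExtAudioFileRead instead of AUFilePlayer. The function name is hypothetical, and file opening plus client-format setup (kExtAudioFileProperty_ClientDataFormat) are omitted:

```c
#include <AudioToolbox/AudioToolbox.h>

// Hypothetical preroll reader (name is an assumption): pull up to 'frames'
// frames of client-format audio from an already-opened ExtAudioFileRef
// into 'bufferList'. Unlike AUFilePlayer, ExtAudioFileRead has no realtime
// scheduling semantics, so it can be called as fast as the disk allows.
static OSStatus ReadPrerollAudio(ExtAudioFileRef file,
                                 UInt32 frames,
                                 AudioBufferList *bufferList)
{
    UInt32 framesRead = frames;
    OSStatus err = ExtAudioFileRead(file, &framesRead, bufferList);
    // framesRead < frames indicates end of file; the caller should
    // zero-pad or stop pulling at that point.
    return err;
}
```

Feeding this from a generator unit's render callback (or directly from the graph's input callback) would give us full control over both the preroll and realtime phases.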
I can now see that kAudioUnitProperty_OfflineRender does not provide the means to determine whether an audio unit supports a non-realtime rendering context. So I ask myself: What would be the correct way to ask an audio unit if it supports offline rendering? For example, a property for which AUFilePlayer would return false.
best regards,
Heinrich Fink

On Nov 21, 2011, at 17:55, Paul Davis wrote:

On Mon, Nov 21, 2011 at 11:49 AM, Heinrich Fink <email@hidden> wrote:

Hi,
I am currently designing an audio engine for a broadcast application based on the AudioUnit and AUGraph APIs. It's basically file-based audio input, a bit of routing, plus some filtering effects. We might have to use a different output path than the usual (preferred) way of using the AUHal. This has been discussed previously here: http://lists.apple.com/archives/coreaudio-api/2011/Nov/msg00077.html - and is not the subject of my question. To quickly sum up the previous discussion: we get a callback from a broadcasting card's SDK to fill buffers with audio data (e.g. Blackmagic UltraStudio 3D). In order to make this work, we will not use the AUHal approach, but rather drive the audio graph ourselves.
The crux of the matter is the following:
Before playback starts, we have to fill hardware buffers as quickly as possible during a "preroll" phase. This should be faster than realtime; it is basically offline rendering of the AUGraph for about 2 seconds. After the desired watermark level of the hardware buffers has been reached, the callback semantics change back to realtime behavior (similar to being called by the AUHal).
According to the documentation and related discussions on this mailing list, I understand that a render context like "preroll phase" would require the property kAudioUnitProperty_OfflineRender to be supported by each audio unit in the graph, and to be set to "true". This works under the assumption that if kAudioUnitProperty_OfflineRender is not supported by an audio unit and is not set to "true", then I must not call AudioUnitRender faster than real time, i.e. correct behavior of the audio unit would not be guaranteed anymore.
As usual, it's not (well) documented, but as a host implementor I can tell you that this is absolutely not my interpretation of that flag. My reading is that it exists to tell the plugin "we are not rendering in realtime mode, so if you want to take extra time for any reason, you can". Nothing more, nothing less. If anyone has any evidence that this interpretation is wrong, I'd love to hear about it.

Do host implementations just ignore setting this property and assume that AU effects would be fine being called faster than realtime? It is understandable that units
I certainly assume that I can call any AU as rapidly as I want.