Re: Offline processing
- Subject: Re: Offline processing
- From: James Chandler Jr <email@hidden>
- Date: Wed, 05 Feb 2003 18:49:50 -0500
on 2/5/03 4:08 AM, Alberto Ricci at email@hidden wrote:

> In fact, it makes sense being able to define real time even if the
> input material is not live input, but something that was previously
> recorded or generated on the fly.
Here's a wacky idea...
In the case of a "real time" plugin operating on previously recorded or
on-the-fly synthesized material, PERHAPS one could abuse the current AU spec
sufficiently to write a normalizer or other off-line effect?
Though the approach might drive a host completely insane, perhaps one could
set the AU Latency property of a normalizer plugin to a crazy-big value like
ten minutes, an hour, or whatever?
Assuming that the song track lengths are shorter than the outrageous Latency
we specify, perhaps this would force the host to pre-spool the entire track
to the AU at the start of playback?
The AU could cache the huge input to disk or memory, process the entire
stream, then return the normalized or reversed data to the host?
Or maybe there are reasons this wouldn't work even if the Host could
tolerate the confusion... Perhaps the host would feed the big stream as
small sequential buffers, and expect return of each buffer before sending
the next? In that case the plugin would still never get "random i/o access"
to the entire track data.
> If, on the other hand, the input is readily-available,
> pre-recorded or pre-generated material, and the host provides
> random-access callbacks, then you can of course run them in order to
> get real time output, with no particular efforts on your side,
> provided you have a fast enough CPU of course, but that's a different
> issue.
It may be that an offline plugin API is too application-specific for Apple
to concern itself with.
As others have said, an offline plugin API might be simplest if the host
provides random-access and other utility callbacks.
Perhaps it would work most predictably if the Host is expected to pre-set
relevant properties like the audio region to modify, markers, whatever. Then
the Host would transfer control to the plugin and wait for the plugin to
complete.
IOW, a very simple and not-over-engineered modal synchronous model. I'm not
trying to be spitefully sarcastic, but Apple seems to find it difficult to
avoid over-engineering an API (GRIN).
James Chandler Jr.
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.