Re: Offline processing
- Subject: Re: Offline processing
- From: Kurt Bigler <email@hidden>
- Date: Mon, 03 Feb 2003 23:08:32 -0800
on 2/3/03 8:13 PM, Steve Hoek <email@hidden> wrote:
> Offline - I'd rather another thread was started on this, restating the
> desired needs, usage scenarios and so forth.
>
> We've never seen the distinction between off-line and real time as a
> necessary distinction to make at the AU level - typically that is a host
> level decision.
Okay, let's kick off the offline discussion.
I'm taking this to be a brainstorm, so here goes, with no apologies, and my
full ignorance exposed as usual...
Another thought that wasn't on Steve's list is the simple need to do
processing that is "nothing special", i.e. might well still be 1-to-1 in
terms of buffering, but just requires too much processing to be completed in
real-time. This is probably obvious, but I thought it should be made
explicit.
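To make that concrete, here is a rough sketch (plain C, with an assumed
already-initialized effect AudioUnit, a made-up slice size, and no error
handling) of what "nothing special" offline processing might look like: just
pull the unit from an ordinary loop instead of a time-constraint IOProc
thread, and let each slice take as long as it takes.

#include <AudioUnit/AudioUnit.h>
#include <stdlib.h>

static void RenderOffline(AudioUnit effectUnit, UInt32 totalFrames)
{
    const UInt32 framesPerSlice = 512;       /* assumed slice size */
    UInt32 done;
    AudioTimeStamp ts = { 0 };
    ts.mSampleTime = 0;
    ts.mFlags = kAudioTimeStampSampleTimeValid;

    /* One stereo float buffer; real code would match the unit's
       output stream format. */
    AudioBufferList bufList;
    bufList.mNumberBuffers = 1;
    bufList.mBuffers[0].mNumberChannels = 2;
    bufList.mBuffers[0].mDataByteSize = framesPerSlice * 2 * sizeof(Float32);
    bufList.mBuffers[0].mData = malloc(bufList.mBuffers[0].mDataByteSize);

    for (done = 0; done < totalFrames; done += framesPerSlice) {
        AudioUnitRenderActionFlags flags = 0;
        /* The unit's input callback (not shown) pulls source samples,
           e.g. from a file; the output goes wherever the "host" wants. */
        AudioUnitRender(effectUnit, &flags, &ts, 0, framesPerSlice, &bufList);
        ts.mSampleTime += framesPerSlice;
        /* ...spool bufList.mBuffers[0].mData to disk here... */
    }
    free(bufList.mBuffers[0].mData);
}

The same loop runs faster or slower than real time as the CPU allows;
nothing in it cares either way.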
Part of the picture for support of this kind of processing, maybe, is the
ability to set up virtual (?) audio drivers that are simply disk spoolers.
I don't know whether this consideration vanishes in the context of a typical
host application, but it occurred to me there might be some advantage in
making this actually _transparent_ to host applications by virtue of being
hidden inside of a non-real-time driver that supports disk i/o but which
otherwise appears to provide an audio engine to the host app, except that
the IOProc threads would be neither high-priority nor time-constraint.
Perhaps the current design tries to keep this out of driver-land by solving
these problems on the AudioUnit level, and as I recall the issue of disk i/o
is at least half solved there.
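For what it's worth, the spooler half of that idea doesn't seem to need
anything exotic. A guess at the shape of it, with a hypothetical
PullNextBuffer() standing in for whatever the fake driver would call where a
real driver calls its IOProcs:

#include <pthread.h>
#include <stdio.h>

/* Hypothetical: stands in for the fake driver's pull of the next slice;
   returns the number of frames produced, 0 when the source is exhausted. */
extern size_t PullNextBuffer(float *dst, size_t maxFrames);

static void *SpoolerThread(void *arg)
{
    FILE  *out = (FILE *)arg;
    float  buf[512 * 2];                 /* 512 stereo frames */
    size_t frames;

    /* Runs as fast as the producer allows -- faster or slower than
       real time -- at ordinary scheduling priority. */
    while ((frames = PullNextBuffer(buf, 512)) > 0)
        fwrite(buf, 2 * sizeof(float), frames, out);
    return NULL;
}

static pthread_t StartSpooler(FILE *out)
{
    pthread_t t;
    /* Default attributes: no time-constraint policy, no high priority. */
    pthread_create(&t, NULL, SpoolerThread, out);
    return t;
}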
Jumping subjects slightly: since OS X can so easily be a web server, it
seems likely that the possibility of audio streaming functionality based on
CoreAudio services (i.e. rather than on QuickTime) should be part of the
overall vision. In that case you are talking about processing that will be
non-real-time at one level, but buffered into real-time for use on the
client side. If so, the need for special support for predictive/adaptive
control over streaming might suggest some enhancements
to the current APIs. The server side might want to help assure that it can
keep its end of the bargain and not introduce any glitches into the
client-side expectations of what will be coming down the pipe. Thus the
piece that is like the HAL but in a network server context might try to
adjust callback rates to cover rough spots in the server-side computations,
i.e. places where cpu usage is not meeting previously projected
expectations. ... These are just guesses - I really mean to imply all the
other things about streaming over the web that I know nothing about.
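To give one of those guesses some shape: the server could watch how long each
slice takes to compute relative to the audio time it covers, and grow its
read-ahead whenever it starts falling behind. All names and thresholds below
are invented:

typedef struct {
    double readAheadSeconds;   /* how far beyond the client's position to render */
} StreamPacer;

/* Called after each slice is computed: sliceSeconds is the audio duration
   of the slice, computeSeconds is the wall-clock time it took to render. */
static void UpdateReadAhead(StreamPacer *p, double sliceSeconds, double computeSeconds)
{
    double load = computeSeconds / sliceSeconds;   /* > 1.0 means falling behind */
    if (load > 0.9)
        p->readAheadSeconds *= 1.25;               /* build more cushion */
    else if (load < 0.5 && p->readAheadSeconds > 0.5)
        p->readAheadSeconds *= 0.9;                /* relax toward lower latency */
}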
And maybe there is something special needed to support MIDI-over-IP
streaming technology, and/or remote access to AudioUnit parameter changes?
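I have no idea what the right transport would be, but the receiving end could
be as dumb as a UDP listener that applies (parameterID, value) pairs with
AudioUnitSetParameter. The wire format here is invented and byte-order
handling of the float is hand-waved:

#include <AudioUnit/AudioUnit.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>
#include <string.h>

typedef struct {
    UInt32  parameterID;   /* sent in network byte order */
    Float32 value;         /* byte-order handling hand-waved here */
} RemoteParamMsg;

static void ListenForParams(AudioUnit unit, int udpPort)
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    struct sockaddr_in addr;
    RemoteParamMsg msg;

    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(udpPort);
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    bind(sock, (struct sockaddr *)&addr, sizeof(addr));

    /* Apply each received parameter change to the unit's global scope. */
    while (recv(sock, &msg, sizeof(msg), 0) == sizeof(msg))
        AudioUnitSetParameter(unit, ntohl(msg.parameterID),
                              kAudioUnitScope_Global, 0, msg.value, 0);
}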
I would love to see a single user-level modality that expresses all the
following as equals:
streaming to/from audio devices
streaming between applications (real-time or not)
streaming across a network to/from a published device or application
(real-time or not)
Ideally this implies (I think) that the distinction between real-time and
non-real-time is determined _entirely_ by the final output endpoint, i.e.
the end that does the pulling of the data. But this does not rule out the
possibility of a non-real-time source being used with a real-time
destination, and brings up the buffered-media-stream issues I mentioned
above in the web server context. Automatic insertion of spooling buffers
into a distributed AUGraph driven by a transport control?
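By "spooling buffers" I mean nothing fancier than a ring buffer that the
non-real-time source fills ahead of time and the real-time destination drains
from its render thread. A sketch of just the buffering (single producer,
single consumer, capacity assumed to be a power of two; real code would need
proper memory barriers):

#include <string.h>

typedef struct {
    float           *samples;
    size_t           capacity;   /* in samples, assumed power of two */
    volatile size_t  readPos;    /* only advanced by the consumer */
    volatile size_t  writePos;   /* only advanced by the producer */
} SpoolBuffer;

/* Producer (non-real-time) side: returns how many samples were accepted. */
static size_t SpoolWrite(SpoolBuffer *b, const float *src, size_t n)
{
    size_t space = b->capacity - (b->writePos - b->readPos);
    size_t i;
    if (n > space) n = space;
    for (i = 0; i < n; i++)
        b->samples[(b->writePos + i) % b->capacity] = src[i];
    b->writePos += n;
    return n;
}

/* Consumer (real-time) side: pads with silence if the spool runs dry. */
static void SpoolRead(SpoolBuffer *b, float *dst, size_t n)
{
    size_t avail = b->writePos - b->readPos;
    size_t take = (n < avail) ? n : avail;
    size_t i;
    for (i = 0; i < take; i++)
        dst[i] = b->samples[(b->readPos + i) % b->capacity];
    if (take < n)
        memset(dst + take, 0, (n - take) * sizeof(float));
    b->readPos += take;
}

How a host, or an AUGraph, would decide to insert such a thing automatically
is exactly the part I'm hand-waving.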
-Kurt Bigler
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.