
Re: Offline processing


  • Subject: Re: Offline processing
  • From: Alberto Ricci <email@hidden>
  • Date: Tue, 4 Feb 2003 10:29:00 +0100

At 5:13 PM +1300 2/4/03, Steve Hoek wrote:
> Another desired consequence of time stretching some audio is that the host's
> corresponding "automation" is remapped in time. I fear this functionality
> may be a pipe dream.

Yes, non-1:1 time relationships may make some remapping necessary. And it should be the AU doing the remapping, since the host can't know in advance how time will be remapped.

I'm also thinking about markers in a sound editing application. A time reversal operation should reverse the marker positions as well, whereas a time stretch should move them, and so on.

I can think of two solutions:

1 - the host provides callbacks for getting and setting marker positions as well as automation times, and the AU is responsible for modifying those times so that they match its transformation (see the sketch after point 2 below).

2 - the AU implements a MapInputTimeToOutputTime function, which the host calls as necessary. For instance, a time reversal would implement it as:
t' = L - t (where L is the duration of the selection)
whereas a linear time stretch would implement it as:
t' = f*t (where f is the constant time stretching factor)
A time-varying time stretch would instead have to integrate the stretching factor over time, t' = integral of f(tau) dtau from 0 to t. Well, that's no big deal after all.
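To make the two options concrete, here is a rough sketch in plain C. None of these names (HostTimeDataCallbacks, GetMarkerTime, SetMarkerTime, the MapInputTimeToOutputTime_* functions) exist in the current AudioUnit API; they are only meant to illustrate the proposal.

#include <stddef.h>

/* Solution 1 (hypothetical): the host hands the AU callbacks through which
   the AU itself reads and rewrites marker/automation times. */
typedef struct {
    void   *hostData;
    size_t (*GetMarkerCount)(void *hostData);
    double (*GetMarkerTime) (void *hostData, size_t index);           /* seconds */
    void   (*SetMarkerTime) (void *hostData, size_t index, double t); /* seconds */
} HostTimeDataCallbacks;

/* Solution 2 (hypothetical): the AU publishes one mapping function and the
   host remaps whatever time-stamped data it owns. */

/* Time reversal of a selection of duration L:  t' = L - t */
double MapInputTimeToOutputTime_Reverse(double t, double L) {
    return L - t;
}

/* Constant time stretch by factor f:  t' = f * t */
double MapInputTimeToOutputTime_Stretch(double t, double f) {
    return f * t;
}

/* Time-varying stretch:  t' = integral of f(tau) dtau over [0, t],
   approximated here by a simple step sum with step dt. */
double MapInputTimeToOutputTime_Varying(double t, double (*f)(double), double dt) {
    double out = 0.0;
    for (double tau = 0.0; tau < t; tau += dt)
        out += f(tau) * dt;
    return out;
}

With option 2 the host would, for example, move each marker with newTime = MapInputTimeToOutputTime_Reverse(oldTime, selectionDuration); with option 1 the AU would loop over GetMarkerCount() entries and call SetMarkerTime() with the remapped values.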

Again, I think that all this can be done through AU properties.

The problem is that all this introduces some complications for both AUs and hosts. We must make sure that an AU can still be written without supporting time remapping, in which case parameter scheduling and automation are simply not supported, and markers are not preserved.

In some cases, such remapping does not need to be implemented at all, even though the times are not preserved 1:1.
Think of a vibrato effect. This can currently be implemented as a simple AU doing the following:

y[t] = x[t - a*sin(w*t)], where a is the vibrato depth (expressed as a time) and w is the vibrato frequency.

It is non-causal, but since a is usually pretty small, you can make the process causal by introducing a small latency:

y[t] = x[t - a - a*sin(w*t)]
     = x[t - a*(1 + sin(w*t))]

We have essentially reduced the vibrato to a causal, time-varying time stretch. Since a is pretty small, we usually don't care whether markers or parameter automation times are remapped to their exact new positions, since they would be offset by a few milliseconds at most.
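In code, such a vibrato is just a delay line whose delay is modulated by a*(1 + sin(w*t)); the constant part a is the latency that makes it causal, and the peak delay is 2a. The function below is only a sketch (the name, the buffer handling and the linear interpolation are my own choices, not AU SDK code):

#include <math.h>
#include <stdlib.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Vibrato as a causal, time-varying delay:
     y[t] = x[t - a*(1 + sin(w*t))]
   depthSec = a (seconds), rateHz = w / (2*pi), sr = sample rate.
   The constant offset a is the latency; the peak delay is 2*a. */
void ProcessVibrato(const float *in, float *out, long numFrames,
                    double depthSec, double rateHz, double sr)
{
    long   maxDelay  = (long)ceil(2.0 * depthSec * sr) + 2;
    float *delayLine = calloc(maxDelay, sizeof(float));
    long   writePos  = 0;

    for (long n = 0; n < numFrames; n++) {
        delayLine[writePos] = in[n];

        /* current delay in samples: a*(1 + sin(w*t)), with t in seconds */
        double t     = n / sr;
        double delay = depthSec * (1.0 + sin(2.0 * M_PI * rateHz * t)) * sr;

        /* read behind the write position, with linear interpolation */
        double readPos = writePos - delay;
        while (readPos < 0.0) readPos += maxDelay;
        long   i0   = (long)readPos;
        double frac = readPos - i0;
        long   i1   = (i0 + 1) % maxDelay;
        out[n] = (float)((1.0 - frac) * delayLine[i0] + frac * delayLine[i1]);

        writePos = (writePos + 1) % maxDelay;
    }
    free(delayLine);
}

(A real AU would of course keep the delay line and the modulation phase as state across render calls instead of allocating per buffer; this is just to show the shape of the processing.)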

This was meant to show two things: that time remapping occurs even in simple and common effects such as a vibrato, and that it is nonetheless not always important to remap all time-related data based on the new mapping; sometimes you can just ignore it.

Alberto.
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives: http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.

References: 
Offline processing (From: Steve Hoek <email@hidden>)
