Re: Sample clock, gui synch issues
- Subject: Re: Sample clock, gui synch issues
- From: Paul Davis <email@hidden>
- Date: Wed, 25 Aug 2010 17:05:55 -0400
On Wed, Aug 25, 2010 at 4:49 PM, Andrew Coad <email@hidden> wrote:
> The problem that I found with using mSampleTime as the fundamental timing
> reference is that it does not have sufficient granularity when you have high
> BPM rates with subdivisions within each beat. For example, if you define
> your smallest unit of time (maybe a 1/32nd of 1/64th of a measure) in terms
> of integral numbers of samples, your actual playback BPM can differ from
> your desired BPM by a small but noticeable amount (professional musicians
> are amazing - they can pick up on very small deviations).
at even just 44kHz, consider a piece at 220bpm (with 1 beat = 1/4
note). that's 3 2/3 beats per second, so each beat is represented by
more than 12,000 samples. let's suppose you want to use 1/128th notes
in a piece. each 1/128th is represented by nearly 376 samples. if you
are off by 1 sample, that corresponds to about 83 millionths of a
beat. nobody can detect this. ever.
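the arithmetic above is easy to check for yourself (assuming 44.1kHz and
a 1/128th note = 1/32 of a quarter-note beat; this is just an
illustration, not anything CoreAudio-specific):

```python
# back-of-the-envelope check of the numbers in the text:
# 44.1 kHz, 220 bpm, quarter-note beats, 1/128th-note subdivisions.
SAMPLE_RATE = 44100.0
BPM = 220.0

beats_per_second = BPM / 60.0                  # 3.666... beats per second
samples_per_beat = SAMPLE_RATE / beats_per_second
samples_per_128th = samples_per_beat / 32.0    # 1/128 note = 1/32 of a beat

one_sample_in_beats = 1.0 / samples_per_beat   # fraction of a beat per sample

print(round(samples_per_beat))    # ~12027 samples per beat
print(round(samples_per_128th))   # ~376 samples per 1/128th note
print(one_sample_in_beats)        # ~8.3e-05, i.e. ~83 millionths of a beat
```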
now, there *is* a problem, but it's not caused by a lack of
resolution in the sample clock. the problem is that you will
constantly be rounding inter-sample positions to whole samples, and
done across a moderately long timeline this can lead to accumulated
rounding errors that *are* audible. i suspect that your problem was
not that you were using the sample clock as a time reference, but
that you kept running into the effects of rounding. if you make the 1
sample error 400 times and in the same direction, you end up with a
position that is off by about a 1/128th of a measure, and although
that's hard to detect, it's certainly not impossible.
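the accumulation effect is easy to demonstrate: rounding each 1/128th
step to whole samples and summing the rounded steps drifts, while
computing each position from the beat index and rounding once does not
(numbers below reuse the 220bpm / 44.1kHz example; a toy illustration
only):

```python
# cumulative rounding (bad) vs. rounding once at the end (good).
samples_per_128th = 44100.0 * 60.0 / 220.0 / 32.0   # ~375.85 samples

cumulative = 0
for step in range(400):
    cumulative += round(samples_per_128th)   # each step's error compounds

absolute = round(400 * samples_per_128th)    # a single rounding at the end

drift = cumulative - absolute
print(drift)   # tens of samples of drift after only 400 steps
```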
> What I am now doing is using machine "ticks" as the fundamental timing
> reference and convert between machine ticks and samples when I want to
> calculate where in a render buffer I need to start the audio data. A
this is a fundamental mistake when done simplistically, as i believe
you have done. if you put a DLL (or even a PLL) between them to
provide sync between the two, then it can work, but as it stands
you've got two independent clocks running on their own and you're
pretending that they run in sync. they do not run in sync, and in the
long run (or on the wrong machine) this will cause errors similar to
rounding errors, but accumulating much faster than rounding would.
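the shape of the DLL idea can be sketched like this: a tiny feedback
loop estimating seconds-per-sample of the audio clock from buffer
arrival times, so host-time events can be mapped onto the sample
clock. everything here is an assumption for illustration (the gains,
the 200ppm-fast simulated crystal, the buffer size); it is not any
Apple API, just the technique in miniature:

```python
# toy DLL: track the *actual* rate of the audio clock instead of
# trusting its nominal rate. simulated numbers throughout.
nominal_rate = 44100.0           # what the interface claims
true_rate = 44100.0 * 1.0002     # simulated real crystal, 200 ppm fast

buffer_frames = 512
period = 1.0 / nominal_rate      # running estimate: seconds per sample
a = 0.1                          # phase correction gain (assumed)
c = a * a / 4.0                  # frequency correction gain (critical damping)

t = 0.0          # "wall clock" time at which each buffer actually arrives
predicted = 0.0  # our prediction of that time from the sample count
for _ in range(2000):
    t += buffer_frames / true_rate
    predicted += buffer_frames * period
    err = t - predicted
    predicted += a * err                   # nudge the phase
    period += (c / buffer_frames) * err    # nudge the frequency estimate

estimated_rate = 1.0 / period
print(estimated_rate)   # converges toward the true rate, not the nominal one
```

with no loop at all (period fixed at the nominal value) the prediction
error just grows without bound, which is exactly the "two independent
clocks" failure described above.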
your timing should use the sample clock of the audio interface (via
its presentation within CoreAudio) as its timing reference, and for
musical time you need to use either fixed point or (simpler) floating
point positions (e.g. 1189.2292 beats). this is incredibly accurate,
has no sync issues and almost no rounding issues. there's nothing new
or clever to invent here - it's been done this way for more than a
decade by many different programs. and as either Bill or Jeff noted,
NSTimer is absolutely not appropriate as a timing source for music
sequencing.
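one way this scheme can look in practice (a sketch only - the helper
name, the buffer parameters, and the 220bpm/44.1kHz tempo are all
illustrative, not any real API): keep the musical position as a
float in beats, convert to a sample position in floating point, and
round exactly once when placing the event inside a render buffer
identified by its starting sample time:

```python
# map a floating-point beat position onto the sample clock, rounding once.
SAMPLE_RATE = 44100.0
BPM = 220.0
samples_per_beat = SAMPLE_RATE * 60.0 / BPM

def frame_offset_for_beat(beat_pos, buffer_start_sample, buffer_frames):
    """Frame within this render buffer at which an event at beat_pos
    should start, or None if it falls outside the buffer."""
    event_sample = beat_pos * samples_per_beat            # stay in float
    offset = round(event_sample - buffer_start_sample)    # round at the end
    if 0 <= offset < buffer_frames:
        return offset
    return None

# example: beat 1189.2292 lands 100 frames into a 512-frame buffer
start = round(1189.2292 * samples_per_beat) - 100
print(frame_offset_for_beat(1189.2292, start, 512))
```

because every event's sample position is derived fresh from its beat
position, the per-event rounding error never accumulates.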
--p
_______________________________________________
Coreaudio-api mailing list (email@hidden)