inOffsetSampleFrame?
- Subject: inOffsetSampleFrame?
- From: Takeshi Yokemura <email@hidden>
- Date: Sun, 30 Apr 2006 19:52:59 +0900
Hi, everyone.
I'm now developing a software synthesizer,
and it's more or less finished for now.
But it doesn't seem to be accurate about timing:
the timing of the notes shifts randomly,
by up to about 10 ms.
As a result, when I play a series of 32nd notes, for example,
they don't come out at even intervals.
I noticed that note start/stop occurs
only at the beginning of Render() in my program,
because I didn't consider where in the slice an event occurred.
To fix this, I have to refer to inOffsetSampleFrame, right?
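To make my question concrete, here is a rough sketch (untested) of the fix I have in mind: the MIDI entry point would store each event together with its inOffsetSampleFrame, and Render() would split the slice at those offsets so a note starts at its exact sample position instead of at frame 0. RenderVoices() and ApplyMIDIEvent() are just placeholders for my own synth code, not AU API calls.

#include <algorithm>
#include <cstdint>
#include <vector>

struct PendingEvent {
    uint32_t offsetFrames;          // inOffsetSampleFrame delivered with the event
    uint8_t  status, data1, data2;  // raw MIDI bytes
};

static std::vector<PendingEvent> sQueue;   // filled by the MIDI entry point

// Placeholders for my own DSP and voice management.
static void RenderVoices(float *out, uint32_t n) { std::fill(out, out + n, 0.0f); }
static void ApplyMIDIEvent(uint8_t status, uint8_t d1, uint8_t d2) { /* start/stop a voice */ }

static bool ByOffset(const PendingEvent &a, const PendingEvent &b)
{
    return a.offsetFrames < b.offsetFrames;
}

// Called once per render slice of numFrames samples.
static void RenderSlice(float *out, uint32_t numFrames)
{
    std::sort(sQueue.begin(), sQueue.end(), ByOffset);

    uint32_t frame = 0;
    for (size_t i = 0; i < sQueue.size(); ++i) {
        const PendingEvent &e = sQueue[i];
        uint32_t upTo = std::min(e.offsetFrames, numFrames);
        RenderVoices(out + frame, upTo - frame);     // audio up to the event
        ApplyMIDIEvent(e.status, e.data1, e.data2);  // note-on/off takes effect here
        frame = upTo;
    }
    RenderVoices(out + frame, numFrames - frame);    // rest of the slice
    sQueue.clear();
}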
But I cannot figure out the time relationship between
- the timing at which Render() is called, and
- the time origin of the value of inOffsetSampleFrame.
My understanding so far is:
       slice n              slice n+1             slice n+2
|--------------------|--------------------|--------------------|
        ^Note-on event       ^Render() call
|<----->|                                 |<------------------>|
 inOffsetSampleFrame                              output
                                          ^Note starts playing
- if an event occurs in slice n, it's put into a queue
- somewhere in the next slice (slice n+1), Render() is called,
and the output waveform is written to the buffer
- in slice n+2 the buffer is read out and the sound actually comes out
Is this correct?
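For reference, this is how my test host currently schedules a note-on into the synth, in case I'm misusing the offset argument (the function and variable names are just from my own test code):

#include <AudioToolbox/AudioToolbox.h>

void SendNoteOn(MusicDeviceComponent synthUnit,
                UInt32 channel, UInt32 noteNumber, UInt32 velocity,
                UInt32 offsetSampleFrame)
{
    // 0x90 | channel is the MIDI note-on status byte; the last argument
    // is the inOffsetSampleFrame value I'm asking about.
    MusicDeviceMIDIEvent(synthUnit,
                         0x90 | channel, noteNumber, velocity,
                         offsetSampleFrame);
}

My assumption is that offsetSampleFrame here is counted from the first sample of the buffer produced by the next Render() call, but that is exactly the part I'd like someone to confirm.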