and got a recording rate about 2 times too fast, with loud static
Everyone is probably off enjoying festivities :) Mike suggested the
initializer above. So, it seems like the right one. Could you post
an example of the actual values that are getting plugged into those
variables and perhaps an example of how you're using the converters?
The symptoms you describe sound like the outConverter is expecting a
sampling rate twice that of the inConverter's output (assuming
you're passing the data from one to the other).
... but these are just observations from the Peanut Gallery.
Hope that helps,
I appreciate your reply, and it is a good suggestion.
Comparing values against what I see in my working Tiger version,
these are the issues I see:
> minimumBufferSeconds in the recent version is taken as a multiplier to
get a max value in numBufferFramesForSourceSampleRate, which causes a
totalBufferFrames of 44100 instead of 196. So I changed
minimumBufferSeconds to 0, as it was in the older version of the code.
> The recent version of MTConversionBuffer.m, in initWithCapacityFrames,
uses the minimum of srcChans and dstChans, which is 1; the older code just
used srcChans, which is 2 for the outConverter, so I put it back to srcChans.
This gave me almost the same values as the older version, except the
older code has, in MTConversionBuffer.m's 'initWithSourceDescription':

    srcFrames = ((srcFrames * srcDescription.mBytesPerFrame) / srcChans) / ...

and likewise for dstFrames.
Well, I don't have a srcDescription or dstDescription in the new
version at this point in the code, but I guess I could get them the
same way the old version of the code did, using MTConversionBuffer.
Am I going about this correctly? I am fudging it all up to produce the
same values as the older version of MTConversionBuffer. This isn't
feeling good; I am having doubts that it will produce good results.
Perhaps I should be doing more of what the recent version of the code does.
So I take a step back and ask: 'initWithSourceSampleRate:' takes a set
of arguments that I pass in, but I get a bad recording. Why? Is there an
issue with the number of bytes per frame or per packet? Was Tiger using
16- or 32-bit samples while Leopard is using 32- or 64-bit ones?
It is hard to track down what is going wrong.
I think I will pull the descriptions out of the inputDevice and the
outputDevice and compare them with what I see in the older, working
version of the code.
I guess no one has code that uses MTConversionBuffer for simple
recording and playback.
Here are the values I see now:
Coreaudio-api mailing list (email@hidden)