Re: Issues for loading an array of data into a stream buffer
- Subject: Re: Issues for loading an array of data into a stream buffer
- From: Jeff Moore <email@hidden>
- Date: Tue, 04 Aug 2009 09:24:03 -0700
So, we can state with some certainty that the sample code works
outside of Java. My guess, then, would be that something about how the
Java side of things is hooked in is what's messed up. For example, you
aren't trying to call back into Java from your render function, are
you? That definitely wouldn't work.
A couple of other things that come to mind (in no particular order):
- Are all the ASBDs you are filling out correct? We fixed one
already, but there may be others...
- Is Java rendering into the buffer in an asynchronous way?
- 470 is a very odd buffer frame size. The default is normally 512.
Are you setting it differently? Why?
- Are you trying to copy mono data into an interleaved stereo buffer
perhaps? Or trying to copy interleaved data into one side of a
de-interleaved stream? You might want to check the ASBD of the output
format to be sure it matches what you expect, especially since most
audio devices have at least two channels and your ASBD is talking
about just one channel of data... (a rough sketch of the
mono-to-interleaved-stereo copy follows this list)
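To illustrate that last point, here is a rough sketch of copying a mono
Float32 buffer into an interleaved stereo output buffer by duplicating each
sample into both channels. This is not code from this thread; the helper
name and buffer names are made up:

#include <CoreAudio/CoreAudioTypes.h>

// Hypothetical helper: duplicate each mono sample into the left and right
// slots of an interleaved stereo Float32 buffer.
static void CopyMonoToInterleavedStereo(const Float32 *inMono,
                                        Float32 *outStereo,
                                        UInt32 inNumberFrames)
{
    for (UInt32 frame = 0; frame < inNumberFrames; ++frame) {
        outStereo[2 * frame]     = inMono[frame];   // left channel
        outStereo[2 * frame + 1] = inMono[frame];   // right channel
    }
}

If the output format is de-interleaved instead, each channel lives in its
own ioData->mBuffers[n].mData, so the copy would target mBuffers[0] and
mBuffers[1] separately.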
On Aug 4, 2009, at 7:52 AM, David Lecoutre wrote:
I see. If I set the fields to the right values, I'm getting some
sound, but it is pretty bad: I hear only noise, as if I were playing
the buffer very fast.
I did double-check, and the buffer is fully loaded one time. The
callback function "RenderSin" is called 90-100 times per second, and
the "inFrame" variable is equal to 470 (this is the number of values
that will be loaded into "mBuffers[0].mData" each time the callback
function is called). The size of my buffer from Java is equal to 44000
(around 1 sec of sound). It takes 100 callbacks to fully load my
buffer into "mBuffers[0].mData".
This is my callback function:
OSStatus MyRenderer(void *inRefCon,
                    AudioUnitRenderActionFlags *ioActionFlags,
                    const AudioTimeStamp *inTimeStamp,
                    UInt32 inBusNumber,
                    UInt32 inNumberFrames,
                    AudioBufferList *ioData)
{
    // load 1/90 of the buffer into mBuffers[0].mData
    RenderSin(sSinWaveFrameCount,
              inNumberFrames,
              ioData->mBuffers[0].mData,
              sSampleRate,
              sAmplitude,
              sToneFrequency,
              sWhichFormat,
              MySoundBuffer,
              SizeBuffer);
    sSinWaveFrameCount += inNumberFrames;
    return noErr;
}
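RenderSin basically just copies the next inNumberFrames values out of
MySoundBuffer into the output. A simplified sketch of that copy (the helper
name, parameters, and the wrap-around at the end of the buffer are
assumptions, not the real RenderSin) would look like:

// Hypothetical sketch: copy inNumberFrames mono Float32 samples out of a
// preloaded buffer, starting at the running frame count and wrapping at
// the end of the buffer so playback loops instead of reading past the end.
static void CopyFromSoundBuffer(UInt32 inStartFrame,
                                UInt32 inNumberFrames,
                                Float32 *outData,
                                const Float32 *inSoundBuffer,
                                UInt32 inSoundBufferSize)
{
    for (UInt32 i = 0; i < inNumberFrames; ++i) {
        UInt32 index = (inStartFrame + i) % inSoundBufferSize;
        outData[i] = inSoundBuffer[index];
    }
}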
This is how the callback function is called:
// call the callback function for 1 second...
CFRunLoopRunInMode(kCFRunLoopDefaultMode, 1, false);
This is how I initialize the callback function:
AURenderCallbackStruct input;
input.inputProc = MyRenderer;
input.inputProcRefCon = NULL;
err = AudioUnitSetProperty(gOutputUnit,
                           kAudioUnitProperty_SetRenderCallback,
                           kAudioUnitScope_Input,
                           0,
                           &input,
                           sizeof(input));
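The rest of the setup follows the usual default-output-unit pattern. A
minimal sketch of it is below; it assumes gOutputUnit is the default output
unit, uses the AudioComponent API, and the function name and error handling
are placeholders rather than the actual code:

#include <AudioUnit/AudioUnit.h>

static AudioUnit gOutputUnit;

static OSStatus SetUpDefaultOutputUnit(void)
{
    // describe the default output AudioUnit
    AudioComponentDescription desc = { 0 };
    desc.componentType         = kAudioUnitType_Output;
    desc.componentSubType      = kAudioUnitSubType_DefaultOutput;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    if (comp == NULL) return -1;

    OSStatus err = AudioComponentInstanceNew(comp, &gOutputUnit);
    if (err != noErr) return err;

    // ... set the render callback and the input stream format here ...

    err = AudioUnitInitialize(gOutputUnit);
    if (err != noErr) return err;

    // once started, the HAL drives MyRenderer on its own I/O thread;
    // CFRunLoopRunInMode just keeps the main thread alive while it plays
    return AudioOutputUnitStart(gOutputUnit);
}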
I think that the callback function is called at 90-100 Hz and the
buffer "mBuffers[0].mData" is sent to the output unit in real time
(each time that the callback function is called). Maybe I didn't
fully understand how the callback function works.
Thanks,
David
On Mon, Aug 3, 2009 at 8:05 PM, Jeff Moore <email@hidden> wrote:
Your problem is probably that you are not filling out the ASBD
correctly. Specifically, you are not setting the fields
mBytesPerPacket, mBytesPerFrame, and mBitsPerChannel to the proper
values to describe a 32-bit floating point number, which is four
bytes in size. You have filled out the ASBD as if the 32-bit float
were only two bytes long.
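For reference, a minimal sketch of an ASBD for 32-bit float linear PCM,
assuming 44100 Hz and one channel as described in the thread (the helper
function name is made up), would look like this; for interleaved stereo you
would set mChannelsPerFrame to 2 and the per-frame sizes double accordingly:

#include <CoreAudio/CoreAudioTypes.h>

static AudioStreamBasicDescription MakeFloat32MonoASBD(Float64 inSampleRate)
{
    AudioStreamBasicDescription asbd = { 0 };
    asbd.mSampleRate       = inSampleRate;   // e.g. 44100.0
    asbd.mFormatID         = kAudioFormatLinearPCM;
    asbd.mFormatFlags      = kAudioFormatFlagsNativeFloatPacked;
    asbd.mFramesPerPacket  = 1;              // always 1 for linear PCM
    asbd.mChannelsPerFrame = 1;              // mono, as in the thread
    asbd.mBitsPerChannel   = 32;             // a Float32 is 32 bits...
    asbd.mBytesPerFrame    = sizeof(Float32); // ...which is 4 bytes, not 2
    asbd.mBytesPerPacket   = asbd.mBytesPerFrame * asbd.mFramesPerPacket;
    return asbd;
}

You would then pass the result to AudioUnitSetProperty with
kAudioUnitProperty_StreamFormat on the input scope of the output unit.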
--
Jeff Moore
Core Audio
Apple